Category: storage

  • Job Losses Expected At SanDisk


    SanDisk is about to announce job cuts of around 15 per cent of its staff, or between 450 and 500 employees.

    Quoting unnamed industry sources, Engadget reported today that the cuts are intended to reduce costs following a Q3 loss of USD 155 million and sales that have sunk by 21 per cent year-on-year to USD 281 million.

    SanDisk has also just announced a new technology that will allow solid state drives (SSDs) to perform up to 100 times faster than they do now.

    Called ExtremeFFS (Flash File System), it will accelerate random write speeds by up to 100 times over existing systems.

    The technology will appear in SanDisk SSDs from next year.

  • Interxion To Expand London Data Center


    Carrier-neutral data center specialist Interxion is to expand the capacity of its London City data center for the second time this year.

    Growing customer demand for high power density infrastructure is behind the expansion.

    The new 400 m² of equipped space, scheduled for completion in Q1 2009, will have access to the data center's 13 megawatt power supply, allowing it to deliver exceptionally high-density power configurations of up to 17.5 kW per cabinet position.

    In April, Interxion completed a 1,250 m² build-out. The Interxion London data center also hosts a Point of Presence for the London Internet Exchange (LINX).

    Greg McCulloch, MD of Interxion UK, said the mix of City of London location, power and connectivity was proving highly attractive to customers.

    “The site is served by some 28 carriers and network partners in total, giving a wide range of connectivity options, and customers can reduce latency while enhancing network resilience and availability by peering directly to LINX.”

  • IT Decision Makers Unclear About Unified Storage

    Unified storage has yet to make an impact on IT decision makers, with few able to define the term and even fewer aware of the business benefits of implementing it, according to a survey.

    The study, conducted by Gartner and ONStor, surveyed 1,600 IT and business decision makers from more than 37 countries across four continents who attended the recent Gartner Data Center Summit and SNW Europe.

    It found that only 58 per cent of those questioned were familiar with the term unified storage.

    Unified storage has been defined as a single integrated storage infrastructure that functions as a unification engine to simultaneously support Fibre Channel, IP storage area network (SAN) and network attached storage (NAS) protocols.

    Despite this, of the 50 per cent who said they could define it, 43 per cent thought it referred to virtualised storage and 56 per cent believed it was a combination of backup and storage.

    Narayan Venkat, vice president of corporate marketing at ONStor, said more had to be done to ensure IT professionals were better briefed about unified storage.

    “What is absolutely clear from these top line survey findings is that the market needs more education on the benefits of unified storage, and that is where the vendor community needs to join forces with analysts to drive this message home,” he said.

    Other key findings highlighted:

    • 21 per cent of the sample who had heard of unified storage believed it would deliver a lower total cost of ownership

    • A further 21 per cent believed it would provide a more flexible network moving forward

    • The ability to protect current investment in infrastructure was cited by only 8 per cent, and only 6 per cent felt it would reduce operating costs and capital expenditure

  • Future SIM Cards Capable of Mass Audio and Video Storage


    Infineon Technologies and Micron Technology have announced a joint venture to develop high-density subscriber identity module (HD-SIM) cards with a capacity greater than 128 MB.

    HD-SIMs combine high density with improved security functionality, which the firms say enables operators to offer graphically-rich, value-added services such as mobile banking and contactless mobile ticketing.

    Operators are also able to securely update or delete applications through their wireless network while new applications, services and settings can be downloaded or pushed to the HD-SIM at any time.

    Working in close technical collaboration, the two companies are leveraging their respective expertise to architect modular chip solutions that combine an Infineon security microcontroller with Micron NAND flash memory featuring functions designed specifically for HD-SIM applications.

    Micron will manufacture the NAND on 50-nanometer (nm) and 34-nm process technology.

    Dr. Helmut Gassel, vice president and general manager of the Chip Card and Security division at Infineon Technologies, said: "Infineon envisions a new role of future SIM cards that will be capable of audio and video mass content storage and even Flash card replacement."

    Prototypes are expected to be available in the autumn of 2009 and will be sold in die form or in a chip card IC package.

  • Service Offers Camcorder Storage Solution


    The Photo Archival Company has launched a new archiving service that stores digital camcorder footage to Blu-ray Disc or DVD.

    Charles Laughlin, president and founder of The Photo Archival Company, said it would unlock millions of hours of video trapped inside today’s generation of tapeless camcorders, which record to internal hard drives, external USB hard drives or flash memory.

    He said video footage could be preserved on long-lasting DVDs or Blu-ray Discs.

    "The recurring theme from a typical customer is that it is impractical for them to archive their digital footage first hand," he said.

    "For the average household, it can be a daunting task to spend the necessary time to tend to the successful creation of several DVDs or Blu-ray Discs just to continue filming."

  • Researchers Opt For COPAN's Fast Access and Security


    One of the world’s leading life science research institutes announced today that it has chosen a COPAN Systems-based storage solution to meet its demanding data storage needs.

    The Friedrich Miescher Institute for Biomedical Research (FMI) has a strong record of innovation in the molecular biology of disease.

    Researching, developing, testing and delivering these medical breakthroughs require generating, analyzing and retaining huge quantities of data.

    This critical data is not necessarily accessed regularly but must be kept instantly available at all times for crucial analysis.

    Dean Flanders, head of informatics at FMI, said no other company on the market could match the COPAN Systems solution in such a demanding environment.

    “Our competitive review found that no other system could provide this kind of high performance, scalability and cost-effectiveness to meet our persistent data needs,” he said.

    "This means we can provide fast access to a vast amount of data in a very small footprint.”

    FMI, located in Basel, Switzerland, generates a large proportion of its life science data from microscopy, but new projects led the institute to seek a storage solution to support a wider range of data.

    A single new piece of laboratory equipment can radically alter the organization’s storage needs.

    For example, two new Illumina Genome Analyzers are each capable of producing up to two terabytes of data per week.

    In this environment, FMI sought a cost-effective solution for backing up and restoring multi-terabyte file systems while coping with limited power, cooling and space resources.

    The institute, part of the Novartis Research Foundation, needed to innovate beyond a traditional file server system, which leaves too much persistent data on expensive tier-one storage and strains existing infrastructure.

    A traditional hierarchical storage management (HSM) system was out of the question, since the high volume and file size of FMI’s life science data meant a tape system would present slow access rates, data integrity issues and no online access.

    FMI was also concerned about pressure on space, power and cooling resources as its data production grows and its systems scale.

    To meet these challenges, FMI selected COPAN Systems’ disk-based Virtual Tape Library because it combines fast access times with the security and reliability of disk.

    The new, highly scalable tiered file system has almost no impact on the existing cooling and power infrastructure.

    By migrating persistent data to the new COPAN Systems solution, FMI frees up more expensive tier-one storage for its original purpose: modifying and storing rapidly changing transactional data.

    Some of the key benefits of the new system for FMI are:

    • Scalability: The system currently holds 40 terabytes of data but can scale within a single rack to 896 terabytes without redesign or changes to cooling requirements.
    • Simplicity: When a file is written to FMI’s HSM file system, it is automatically copied to the COPAN Systems MAID platform within a defined period of time, and a further copy can be created on tape at a remote location as required.
    • Efficiency: COPAN Systems’ ultra-dense disk configurations are enhanced using Enterprise MAID technology, which powers off disks that have no outstanding I/O requests, reducing power consumption by around 85 percent (a simplified sketch of this idle-disk idea follows this list).
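
    To make the MAID (massive array of idle disks) idea concrete, here is a minimal sketch in Python of the power-down policy described above: disks with no recent I/O are switched off and woken again on demand. It is an illustration only, not COPAN Systems’ implementation; the class names and the five-minute idle window are assumptions.

        import time

        IDLE_TIMEOUT_SECS = 300  # assumed idle window before a disk is powered down

        class Disk:
            def __init__(self, disk_id):
                self.disk_id = disk_id
                self.powered_on = True
                self.last_io = time.monotonic()

            def handle_io(self):
                # Any read or write wakes the disk and refreshes its idle timer.
                if not self.powered_on:
                    self.powered_on = True  # in a real array, spin-up adds latency here
                self.last_io = time.monotonic()

            def maybe_power_down(self, now):
                # Power off only if there has been no I/O within the idle window.
                if self.powered_on and now - self.last_io > IDLE_TIMEOUT_SECS:
                    self.powered_on = False

        def power_management_pass(disks):
            # Periodic sweep: in an archive, most disks stay idle, so most end up off.
            now = time.monotonic()
            for disk in disks:
                disk.maybe_power_down(now)

        disks = [Disk(i) for i in range(8)]
        disks[0].handle_io()          # only disk 0 sees activity
        power_management_pass(disks)  # after the idle window, the other disks power off
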
  • Digital-data Explosion Requires New Tools


    By 2011, the digital universe will be ten times the size it was in 2006, according to research from IDC.

    This digital-data explosion will require IT organizations to adopt new tools and standards to ensure an efficient information infrastructure.

    The study points out that existing relationships between IT and business units will have to be transformed, and that the problem is not just a technical one: it will take organization-wide policies, and all competent hands in an organization, to deal with information creation, storage, management, security, retention and disposal.

    Titled The Diverse and Exploding Digital Universe: An Updated Forecast of Worldwide Information Growth Through 2011, the report highlights several findings that will affect individuals and businesses around the world in the years to come, including:

    • At 281 billion gigabytes (281 exabytes), the digital universe in 2007 was 10 percent bigger than originally estimated.

    • With a compound annual growth rate of almost 60 percent, the digital universe is growing faster than previously forecast and is projected to reach nearly 1.8 zettabytes (1,800 exabytes) in 2011, a 10-fold increase over five years (a quick arithmetic check of these figures appears after this list).

    • Your "digital shadow" — that is, all the digital information generated about the average person on a daily basis — now surpasses the amount of digital information individuals actively create themselves.

    • The digital universe in 2007 was equal to almost 45 gigabytes (GB) of digital information for every person on earth — or the equivalent of more than 17 billion 8 GB iPhones.

    • About 70 percent of the digital universe is created by individuals, yet enterprises are responsible for the security, privacy, reliability, and compliance of 85 percent of it.
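
    As a quick sanity check, the implied growth rate can be recomputed from the figures quoted above. The short Python sketch below uses only the reported 2007 size, the 2011 projection and the stated 10-fold rise from 2006; the rounding is approximate.

        size_2007_eb = 281        # exabytes in 2007, as reported
        size_2011_eb = 1_800      # exabytes projected for 2011 (~1.8 zettabytes)
        years = 2011 - 2007

        # Compound annual growth rate implied by the 2007 and 2011 figures.
        cagr = (size_2011_eb / size_2007_eb) ** (1 / years) - 1
        print(f"Implied CAGR 2007-2011: {cagr:.0%}")  # ~59%, i.e. "almost 60 percent"

        # The 10-fold increase is measured from 2006, so the implied 2006 baseline is:
        print(f"Implied 2006 size: about {size_2011_eb / 10:.0f} exabytes")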

    To deal with this explosion, IDC says IT organizations must:

    • Transform existing relationships with business units. It will take all competent hands in an organization to deal with information creation, storage, management, security, retention, and disposal. It’s not a technical problem alone.

    • Spearhead the development of organization-wide policies for information governance: information security, information retention, data access, and compliance.

    • Rush new tools and standards into the organization, from storage optimization, unstructured data search, and database analytics to resource pooling (virtualization) and management and security tools. All will be required to build an efficient information infrastructure.

  • Will Microsoft's Cloud-Computing Initiative Be Good For The Storage Industry?


    Microsoft this week finally laid out its cloud-computing strategy during a keynote speech at the Microsoft Professional Developers Conference 2008.

    Ray Ozzie, Microsoft’s chief software architect, announced Windows Azure, a cloud-based service foundation underlying its Azure Services Platform.

    He explained Windows Azure’s role in delivering a software-plus-services approach to computing.

    The Azure Services Platform is intended to help developers build the next generation of applications that will span from the cloud to the enterprise data center and deliver compelling new experiences across the PC, web and phone.

    Azure gives Microsoft’s customers the choice of deploying applications via cloud-based Internet services, running them on on-premises servers, or combining the two in whatever way makes the most sense for their business.

    While the much-awaited news makes clear Microsoft’s intentions, how will it affect the storage industry generally?

    The Register’s Chris Mellor has no doubt that the move towards a few large providers of cloud computing services will spell trouble for many storage vendors.

    Noting that Microsoft has now joined Amazon and Google in offering cloud computing services, he cited IDC research, which says cloud computing will grow 16 per cent a year through to 2012.

    He points out that by 2012 there could be six major cloud computing suppliers (Amazon, Google, Microsoft, Dell, HP and IBM) with half a million customers each, meaning three million fewer customers directly buying servers and storage for their apps because they have been transferred to the cloud.

    While Mellor concedes that the storage industry is seen by some as being "ridiculously over-supplied", he concludes that the news that cloud computing is set to grow is very bad news for the storage industry.

    What do you think? Please send us your comment.

  • Enterprises Failing To Properly Encrypt Backup Data


    Backup tapes are being neglected by administrators, according to a study conducted jointly by security vendor Thales Group and Trust Catalyst.

    The results of the survey of 330 large enterprises worldwide showed that 35 per cent don’t know if they will encrypt their backup tapes.

    Failure to have a backup tape encryption plan could place an organization’s data at risk, leading it into a breach of compliance – and possible heavy financial losses.

    Kevin Bocek, director of product marketing at Thales, said storage departments were often more concerned with the cost and speed of data recovery than with encryption.

    Enterprises also felt they lacked access to technology adequate for enterprise-grade tape encryption.

    "Traditionally, storage has been a domain in and of itself, and IT security has been focusing on front-facing business applications, so they don’t pay that much attention to security," he said.

    "Previously, tape encryption technology used to be bolted on or would be an application used for general backup, and some didn’t trust those to encrypt their tapes for backup."

    The situation is changing as more and more applications come with built-in encryption. However, a new problem then emerges: managing the encryption keys.

    If these are lost, then so is the data.
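
    To illustrate why the keys matter as much as the encryption itself, here is a minimal sketch in Python using the third-party cryptography library; it is not drawn from the Thales study, and the backup contents are made up. Whoever holds the key can restore the backup; if the key is lost, the ciphertext is unrecoverable.

        from cryptography.fernet import Fernet

        key = Fernet.generate_key()                # this key must be stored and managed safely
        backup = b"contents of backup-tape-042"    # stand-in for real backup data

        ciphertext = Fernet(key).encrypt(backup)   # what actually goes to tape

        # Restoring requires the same key; losing the key means losing the data.
        restored = Fernet(key).decrypt(ciphertext)
        assert restored == backup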

    The Thales study found that most people do not know where to store their encryption keys. More than 40 per cent of respondents said they did not know where to store keys for seven of the 13 encryption applications covered by the survey.

    Most of the remainder stored their encryption keys in software or on a disk, while very few stored the keys in a dedicated appliance.

    Key management would continue to be an issue for backup media, according to Bocek.

  • CEOs Must Take Responsibility For Data Breaches


    A rapid rise in losses from giant databases highlights the need for tougher sanctions to deter such security breaches, according to a privacy watchdog.

    The UK’s Information Commissioner’s Office (ICO) is also calling on chief executives to take responsibility for the personal information their organisations hold.

    The number of data breaches reported to the ICO has soared to 277 in the past year.

    New figures, released today by the ICO, include 80 reported breaches by the private sector, 75 within the National Health Service and other health bodies, 28 reported by central government, 26 by local authorities and 47 by the rest of the public sector.

    The ICO is investigating 30 of the most serious cases.

    Richard Thomas, the Information Commissioner, said information can be a toxic liability and that accountability rests at the top.

    He said CEOs must make sure their organisations have the right policies and procedures in place.

    "It is alarming that despite high profile data losses, the threat of enforcement action, a plethora of reports on data handling and clear ICO guidance, the flow of data breaches and sloppy information handling continues," he said.

    "We have already seen examples where data loss or abuse has led to fake credit card transactions, witnesses at risk of physical harm or intimidation, offenders at risk from vigilantes, fake applications for tax credits, falsified Land Registry records and mortgage fraud.

    "Addresses of service personnel, police and prison officers and battered women have also been exposed. Sometimes lives may be at risk."

    Describing these breaches as "serious and worrying", Thomas said this was especially so because personal information is now the lifeblood of government and business.

    He said that as a result data protection has never been more important.

    "It is time for the penny to drop. The more databases that are set up and the more information exchanged from one place to another, the greater the risk of things going wrong," he said.

    "The more you centralise data collection, the greater the risk of multiple records going missing or wrong decisions about real people being made.

    "The more you lose the trust and confidence of customers and the public, the more your prosperity and standing will suffer.

    "Put simply, holding huge collections of personal data brings significant risks."

    Earlier this year, the UK Parliament decided that the ICO should have the power to impose substantial penalties for deliberate or reckless breaches.

    The ICO is working with the government to ensure this measure is implemented as soon as possible.

    It hopes that the threat and reality of substantial penalties will concentrate minds and act as a real deterrent.