Tag: voices-of-the-industry

  • There's No Such Thing As Too Much Storage

    As demand for mobile data storage keeps rising, the hard drive industry needs to work harder at adapting its technology and products to keep pace.


    Storage.biz-news.com spoke to Daniel Mauerhofer, of storage giant Western Digital, to find out more about this evolving market.

    Desktop computing remains the largest market for hard drives but the young upstart – consumer electronics – is the fastest growing.

    Demand for data storage is soaring in everything from PDAs, navigation systems and automotive applications to handheld devices that store music, books, news content, movies and television programs.

    In parallel with this is the need for portable data collection devices, something storage giant Western Digital (WD) has been quick to pick up on.

    It recently launched My Passport, a 500 GB capacity portable USB drive that is small enough to fit in the palm of your hand.

    Not so long ago it would have been inconceivable that most consumers could use that amount of storage capacity – let alone in a mobile format.

    Yet Daniel Mauerhofer, senior PR manager EMEA for WD, said that since storage space was now quickly eaten up by even modest amounts of photo, video and music files, finding a use for half a Terabyte of storage wasn’t that difficult.

    He said the advent of compact cameras with ever-higher resolutions meant that even just storing photographs required a great deal of storage space.

    “There’s no such thing as too much storage these days,” he said.

    WD was founded in Lake Forest, California in 1970 and has been manufacturing internal hard drives since 1990. It moved into the external drive market four years ago.

    While its principal markets – desktop and notebook computing – are expected to continue growing strongly, the launch of the My Passport portable series positions it strongly in the consumer electronics sector.

    This hard drive market, which today accounts for sales of 81 million units worth more than USD 6 billion, is expected to grow to 220 million units in 2010 – a compound annual growth rate of 29 per cent.
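
    As a rough back-of-the-envelope check of those figures (a sketch only – the article does not state the base year, so a roughly four-year horizon to 2010 is assumed here), the growth rate implied by the unit numbers is close to the quoted figure:

    ```python
    # Hypothetical sanity check of the quoted market figures.
    # Assumption: ~4 years between the 81m-unit base and the 220m-unit forecast for 2010.
    units_now = 81_000_000
    units_2010 = 220_000_000
    years = 4

    cagr = (units_2010 / units_now) ** (1 / years) - 1
    print(f"Implied compound annual growth rate: {cagr:.1%}")  # roughly 28-29 per cent
    ```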

    Mauerhofer said external drives generated very little revenue for WD three years ago.

    “Now they represent a fifth of our turnover. It’s a billion dollar business now,” he said. “People are spending considerable time on the internet and its penetration is getting better, so people are downloading more and more. We do not see that stopping.”

    For this reason, the consumer rather than corporate user is seen as being the principal buyer of My Passport portable drives.

    This is borne out by the sleek design and color choices for the drives – a far cry from the customary image of external drives as functional “blocks”.

    Technology is evolving to cope with the ever-increasing demand for portable storage

    Mauerhofer said the industry currently used Perpendicular Magnetic Recording (PMR), which still had potential for greater capacity.
    So much so that he predicted that within the next 18 months a 1 Terabyte storage drive would become available.

    “There is a big need in the B2B enterprise space for huge capacity coupled with small form factor and it’s a safe bet to say you will find them in our portable products as well,” he said.

    However, Mauerhofer said there would come a point when even the PMR technology reached a capacity limit. This would open up the market to replacement technology such as Heat Assisted Magnetic Recording.

    The consumers’ appetite for storage appears insatiable – but technology has managed to keep ahead of the game. Can it continue to do so?

  • Mobile TV To Become Standard Feature of Smartphones

    Mobile TV has so far achieved widespread popularity only in nations such as Japan and Korea.

    But the market is expected to expand rapidly over the next few years, spurred on by the smartphone which is driving improvements in screen quality, microchips and antennas.

    Smartphone.biz-news.com spoke to David Srodzinski, chief executive of fledgling semiconductor firm Elonics, about his expectations for the future of mobile TV.

    Mobile TV will soon become as accepted a feature of mobile handsets as the camera.

    That is the prediction of David Srodzinski, founder and chief executive of Elonics, a semiconductor company that has designed a silicon radio frequency (RF) tuner used to convert signals into sound and pictures.

    “We do see mobile TV as going to take off just like the camera phone has taken off,” he said.

    “It’s not something you will use all the time, but it’s a part of the phone that will be such a ‘nice to have’ feature that all phones will simply have to have them.”

    David Srodzinski, CEO of Elonics

    Based in Livingston, Scotland, Elonics recently announced that David Milne, the founder and former chief executive of chip maker Wolfson Microelectronics, was joining its board as non-executive chairman.

    Milne was credited with taking Wolfson from a university spinout to the FTSE 250 and the company made its name as a key supplier of microchips to the iPod.

    Founded in 2003, Elonics has developed RF architecture called DigitalTune that is the foundation for a family of re-configurable CMOS RF front end products.

    Its E4000 device is designed for reception of all major world-wide fixed and handheld terrestrial digital multi-media broadcast standards within UHF to L-Band ranges (76MHz to 1.70GHz).

    It allows designers to implement front ends capable of cost-effectively supporting multiple TV and radio broadcast standards, enabling smaller, lighter, cheaper and lower-power consumer electronics.

    Elonics has finished market sampling its products and is about to begin mass-production.

    Srodzinski said the immediate focus for the broadcast receiver technology was the traditional TV market, ranging from digital TVs, set-top boxes and PC TVs to multi-media devices.

    But he believed the biggest opportunities lay in the mobile TV market, with analysts forecasting sales of mobile TV-enabled handsets rising to 100 million in 2010.

    “All future potential growth is coming from the cell phone side of the market,” he said. “Smartphones are increasingly a sizeable part of that market.”

    Screen size and quality are key factors influencing the adoption of mobile TV on cell phones

    Srodzinski said that with QVGA screens appearing on increasing numbers of handsets, a barrier to mobile TV was being removed.

    He said that prior to the introduction of QVGA screens, adding mobile TV to a cell phone meant additional costs for the screen, the graphic processors and mobile TV chip set.

    “With the advent of QVGA offerings, such as on the new HTC phones and the iPhone, which have them as standard, the cost add of mobile TV is minimal now,” he said.

    For the screens alone, Srodzinski estimated that the cost add had fallen to a tenth of its previous level, from USD 50 to USD 5.
    “All that has to be added now is the mobile TV chip set,” he said.

    But if cost and technological issues were no longer an impediment to widespread uptake of mobile TV, what about users’ appetite for the service?

    Srodzinski expected mobile TV to be something people would use once or twice a week for five to 10 minutes, most probably as a free-to-view service.

    “That user experience will be such a good feature and such a compelling reason, that people will want mobile TV on their cell phones in a similar way to how they want to have camera phones too,” he said.

    “We believe that if mobile TV works and takes off in that way, it will be a major opportunity that will grow out of the smartphone and into middle layer cell phones.”

    The great success of mobile TV in Japan and Korea, where penetration rates now reach 40 per cent, owes a great deal to government intervention promoting the services, according to Srodzinski.

    He said this had created revenue opportunities and lifted technological barriers to entry.

    “What’s holding back other parts of the world has more to do with the infrastructure roll-out and the cost of doing that,” he said. “That and the lack of clear government support.”

    However, Srodzinski insisted that the growth of mobile TV in territories outwith Japan and Korea would accelerate as more people experienced it and saw the quality of the services and content.

    “I think other regions will catch on,” he said. “This is not a technological push situation – it has to be a consumer-led requirement, especially if it’s free-to-air that takes off.”

    While content may be free, any explosion in mobile TV will also have to offer opportunities for revenue to the industry.
    As Srodzinski said: “The question has to be: who makes any money out of it? There’s no particular economic benefit to operators.”

    Undoubtedly an answer to that conundrum will be found, but will mobile TV really take off?
    Please let us know your thoughts on the matter.

  • Vyke Says Mobile Operators Risk Being Leap-frogged in Evolving Market


    VoIP provider, Vyke, has warned that mobile operators are poorly positioned to cope with latest industry developments.

    Aaron Powers, head of business development at Vyke, said the operators are failing to spearhead new innovations – leaving them open to more competition than ever from a new breed of rivals.

    "Metro WiFi networks are springing up everywhere; there’s one in Singapore – a whole country covered by WiFi," he said.

    "At the same time handsets are advancing to a stage where they are becoming the access point for services installed by the consumer, meaning they don’t need network-provided services anymore, and this goes straight to the core of mobile operators’ revenues.

    "That leaves them poorly positioned in a rapidly evolving market. They need to be the ones directing these changes in a way that benefits them or they’ll get leap-frogged."

    Powers also criticised telecoms operators for opposing the European Commission’s proposal to cap mobile data roaming rates.

    European telecoms commissioner Viviane Reding is set to recommend restrictions on data roaming fees this autumn.

    She has also made clear she intends to impose caps on SMS roaming charges and mobile termination rates, proposals which have drawn widespread criticism from a number of European telcos.

    Citing a recent GSMA report, Powers claimed that the rising uptake of mobile data services was boosted by a 25 per cent fall in roaming rates in the year to April 2008.

    "Actually when you look at it, a small reduction in roaming rates has led to operators making a lot more money off data by volume of usage," he said.

    "Yet all of a sudden there’s uproar when the EU tries to set a cap – mobile providers have taken a head in the sand point of view."

  • VoIP Providers Must Allow Emergency Calls and Give Caller Location


    The UK communications industry regulator, Ofcom, has told internet telephony providers that they must now allow emergency 999 calls over their networks or face the risk of enforcement action.

    Caller location information must also be provided where technically feasible.

    Effective immediately, the ruling for Voice over Internet Protocol (VoIP) providers affects businesses such as BT, Vonage and Skype that offer services that connect VoIP calls to the public telephone network.

    Operators must now provide the ability to make calls to 999, the emergency number used in the UK, and 112, the number most used in other EU countries.

    Ofcom had previously told operators to place stickers on equipment or on-screen labels indicating whether or not emergency calls were possible over a service.

    The rule, known as General Condition 4 of the General Conditions of Entitlement, also provides that the network operator must provide Caller Location Information for calls to the emergency call numbers "to the extent that is technically feasible".

    Ofcom said that ‘technically feasible’ should be taken to mean that location information must be provided where the VoIP service is being used at a predominantly fixed location.

    In May, a child died in Calgary, Canada after an ambulance was dispatched to the wrong address in response to an emergency call placed by his parents using a VoIP phone. The ambulance had been dispatched to an address in Ontario, 2,500 miles away.

    The requirements already apply to fixed line and mobile communications providers but the VoIP industry had resisted their extension.

    In December last year, the Voice on the Net (VON) Coalition Europe was set up as a lobby group to influence the regulation of internet telephony.

    The group, which includes Google, Microsoft and Skype among its founding members, warned against the “premature application” of emergency call rules to VoIP services that are not a replacement for traditional home or business phone services.

    The VON Coalition said the move "could actually harm public safety, stifle innovations critical to people with disabilities, stall competition, and limit access to innovative and evolving communication options where there is no expectation of placing a 112 call".

  • Skype Questions Carriers' Commitment to "Open" Networks


    Christopher Libertelli, Skype’s senior director of government and regulatory affairs for North America, has written a strongly-worded letter complaining that the major US wireless carriers are all talk when it comes to "open" networks.

    Writing to the Federal Communications Commission (FCC) chairman, Kevin Martin, he said that if the Commission wanted to live up to its stated goal of making open networks more accessible, it would affirm that this policy covered wireless networks.

    Libertelli said that last week at the CTIA Wireless IT and Entertainment conference in San Francisco, the major US carriers paid lip service to the idea of open networks, but strongly cautioned that too much choice would lead to chaos and damage the viability of their business model.

    "The attitude of the wireless carriers was perhaps best summed up in Sprint Nextel Corp,” he wrote.

    He quoted Sprint CEO Dan Hesse’s recent comment: ‘The big Internet can be daunting… There can be too much choice.’

    Libertelli continued: “This stands in stark contrast to the Commission’s wise policies designed to promote as much consumer choice as possible."

    He said Skype was mindful of the challenges wireless carriers faced in moving to an open network. But he also said it was not enough to simply talk about open networks.

    "Consumer choice, competition and free markets, not carriers acting to block competition, should win the day in wireless–now, not later," he said.

    "If the Commission believed that the transition to more open networks was going to proceed quickly, statements out of CTIA’s convention suggest just the opposite.”

  • Coming Year Important for New Wireless HDTV Products

    Wireless High Definition Special: Over the coming weeks hdtv.biz-news.com will be interviewing representatives from the competing wireless high definition TV systems to assess their current state of readiness and future viability.

    To kick things off, Steve Wilson, principal analyst at ABI Research, which recently produced a report, Wireless Video Cable Replacement Market and Technologies, gives his opinion on wireless HDTV developments.

    The end-of-year shopping season, followed by the annual CES trade show in January, will give the next indications of the likely short-term prospects for wireless high-definition television systems in the consumer space.

    Holiday sales of existing products and new product announcements at CES will help paint a picture as to which of several competing systems – if any – is likely to lead the charge towards wide consumer acceptance of wireless HDTV.

    There are three contending technologies, loosely characterized as: 5 GHz, 60 GHz, and ultra-wideband (UWB).

    Small numbers of 5 GHz and UWB devices are currently shipping; demo products of 60 GHz systems are expected early next year.

    “Over the next two to three years, we’re going to see one or two of these wireless HDTV approaches emerge as the primary ones,” said Wilson.

    Two industry groups have emerged to promote 5 GHz and 60 GHz solutions.

    Israeli company Amimon, around whose technology the 5 GHz platforms are based, took an initiative in July, forming the WHDI Special Interest Group, which has been joined by Hitachi, Motorola, Sharp, Samsung and Sony.
    Hedging their bets, the latter two vendors are also members of the competing industry body, WirelessHD, which is intended to promote the 60 GHz approach designed by SiBEAM, Inc.

    Other members of WirelessHD include Intel, LG Electronics, Matsushita Electric, NEC and Toshiba.

    Samsung is said to believe that WHDI should be seen as a stopgap technology until WirelessHD becomes “the ultimate solution in the long run”.

    But until then, Wilson believes “the WHDI group has the early momentum”.

    He continued: “Announcements at CES of systems using the 60 GHz band will give some indication of whether consumer products will actually make it to market in 2009.

    “The coming year will be a very important period for the introduction of all types of new wireless high-definition TV products.”

  • Creativity the Key to Secure Data Backup

    Guus Leeuw jr, president & CEO of ITPassion Ltd, urges creativity in the way data is stored.

    Any piece of electronic information needs to be stored somewhere and somehow, in a way that guarantees access to that information over the years.

    You want that information backed up, in case disaster strikes, so that you can restore and access it again. Some information also needs to be kept for a long period of time – three or seven years, say.

    Let’s focus on backup and restore for a moment. Often, a system or its data is backed up for disaster recovery purposes.

    Tapes are then eventually sent off-site for safe storage. Such tapes must be re-introduced to a restore environment. What happens with the tape while it is in secure storage is often unknown to the Enterprise.

    A tape that is sent for off-site storage contains some form of catalogue to identify the tape and its contents.
    This catalogue, in extreme cases, must hold enough information to retrieve the stored data, even if one had to re-install a new backup environment due to disaster.

    Backup solutions conforming to the NDMP standard could use a prescribed recipe to store the data on the tape, in the form of well-defined storage records. Anybody with a conforming reader application could then retrieve the data from the tape and try to inspect it.

    This is a potential security risk, especially in light of recent incidents of lost data and the public concern they caused. It would be good if backups were properly encrypted so that even a skilled hacker could not read the contents of the tape – which matters all the more given that many Government Agencies deal with private data.
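
    A minimal sketch of that idea, assuming a Python environment with the widely used cryptography package (the file names here are purely illustrative), would encrypt the backup before it ever leaves the building:

    ```python
    # Sketch: encrypt a backup archive before it is written to tape or shipped off-site.
    # Assumes the third-party 'cryptography' package; file names are illustrative only.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # keep this key in a key-management system, never on the tape itself
    cipher = Fernet(key)

    with open("backup_archive.tar", "rb") as f:
        plaintext = f.read()

    with open("backup_archive.tar.enc", "wb") as f:
        f.write(cipher.encrypt(plaintext))  # only holders of the key can recover the contents
    ```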

    Equally important is the fraud we hear about so often in the news lately: thrown-away computers shipped to some far-away location, where the hard disks are inspected for private data such as credit card numbers and other “useful” information. It would be good if a PC had a little program that wiped all data securely off the disk before people turned it off one last time.

    Governments have done what it takes to support this kind of security: Air Force System Security Instruction 5020, CESG and German VSITR, just to name a few. Tools are not hard to find; however, they are generally not free, and in my opinion governments could do more to publicise the availability of this type of product.
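
    As a rough illustration of the idea (a sketch only – real wiping tools implement the pass patterns those standards prescribe, and simple overwriting is unreliable on SSDs and other wear-levelled media), a multi-pass overwrite of a single file might look like this:

    ```python
    # Sketch: multi-pass overwrite of a file before deletion. Illustrative only -
    # standards such as AFSSI 5020 or VSITR define their own pass patterns, and
    # overwriting cannot be trusted on SSDs or wear-levelled media.
    import os

    def wipe_file(path: str, passes: int = 3, chunk: int = 1 << 20) -> None:
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                remaining = size
                while remaining > 0:
                    block = os.urandom(min(chunk, remaining))  # fresh random data each pass
                    f.write(block)
                    remaining -= len(block)
                f.flush()
                os.fsync(f.fileno())  # force the overwrite onto the physical medium
        os.remove(path)
    ```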

    Talking of storage, let’s focus on the part of the storage infrastructure that is most often “forgotten” but very critical: the fibre-optic network between the server equipment and the actual storage equipment.

    With the current trend to reduce carbon footprint and hence save the planet, there is another aspect of virtualisation that is actually more critical to business than the reduction of carbon footprint alone: cost savings. Did you know that you can slash your annual IT costs by at least 40 per cent by opting for virtualised server environments alone? You need less hardware, which is the biggest cost, and overall you spend less on power and cooling.

    As these virtualised environments support more and more guest environments, simply because the underlying physical layer gets more powerful, a faster and better access to the back-end storage systems is required.

    Speeds of up to 8Gbps are not unheard of in the industry for storage networks, and storage devices are starting to support 8Gbps connection speeds. Do you need them? Not always. But if you’re supporting several I/O-intensive guest servers, you might be surprised how much more throughput you can achieve over 8Gbps bandwidth than over 4Gbps bandwidth.

    Implementing Microsoft Exchange environments on virtualised hardware becomes entirely feasible, especially if you can achieve guaranteed end-to-end data paths, from virtual server to storage, as if your virtual environment were a physical one.

    Hosting for multiple Government Agencies also starts to wander into the realm of the possible. If all the agencies in a county were to pool their IT, great things could happen to the overall cost of running government IT.

    Sharing knowledge and space wherever possible would seem a good strategy to follow, especially now that the public is intent on reducing government expenditure, increasing the success of government IT projects and, last but not least, enforcing the reduction of carbon footprint – something the Government itself also supports.

    Overall a good many ways exist to increase the capabilities of storage, backup and restore, and archiving. It is time that the IT industry becomes creative in this area.

  • DLM Technology to Achieve ILM

    Alec Bruce, solutions manager, Hitachi Data Systems UK, explains what is currently possible with ILM and what resellers need to tell their customers about achieving true ILM.

    Information Lifecycle Management (ILM) has been hyped in the last few years and is often seen as a panacea for all business and IT challenges that can be implemented immediately.

    The reality is different, as true ILM is still many years away.

    A SNIA survey found that one of the most common ways of losing information is not being able to interpret it properly – a problem ILM is intended to overcome.

    The key lies in the difference between information and data. Data is defined as the raw codes that make up any document or application.

    This data becomes information when it is put into context – its value and meaning can change depending on that context.

    An IT system works with data. Information is a much more subjective concept – something that is simple for humans to understand but not easy for machines. Establishing rules and processes that govern business and IT operations based on the value of information is correspondingly complex.

    Data Lifecycle Management (DLM) is the combination of solutions that helps CIOs and IT managers deliver data management services to any given application environment. This includes protecting data, moving it around, and presenting it to that environment – activities that are tightly connected with managing the different storage resource profiles.

    Information cannot exist without the data that underpins it, so ILM relies on DLM processes to effectively fit in with the IT infrastructure while also addressing changing business priorities.

    General management practices put in place around storage mean that many IT departments have deployed DLM at least partially. It has become widespread because it enables better alignment of data storage and management practices with key enterprise applications, helping to drive IT towards business process management objectives – an important aim for all CIOs and part of the eventual ILM vision.

    ILM has generated hype because it enables IT to drive better efficiency and business performance but it may be five to ten years before we are able to realise true ILM. What most of the industry sees as ILM at the moment is in fact DLM – controlling the movement of data across the storage hierarchy depending on its value to the business.

    Traditionally content is moved down the storage hierarchy as it ages, but in fact the most important piece of information in any organisation is the one needed for the next business evolution. DLM ensures that wherever the answer is, it is easily accessible when required.

    By introducing rules to relate the movement of data to application demands, companies are incorporating a link with business process management as well, but this is still not equivalent to ILM. While DLM can be related to the business at the level of application requirements, ILM will do so at the level of business information.
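
    A minimal sketch of such a rule, assuming a simple age- and value-based policy (the tier names and thresholds are invented for illustration and are not any vendor's actual policy engine), might look like this:

    ```python
    # Sketch of an age/value-based DLM tiering rule.
    # Tier names and thresholds are illustrative assumptions only.
    from datetime import datetime, timedelta

    def choose_tier(last_accessed: datetime, business_critical: bool) -> str:
        age = datetime.now() - last_accessed
        if business_critical or age < timedelta(days=30):
            return "tier 1 (fast primary disk)"   # hot or high-value data stays on primary storage
        if age < timedelta(days=365):
            return "tier 2 (nearline disk)"       # cooler data moves to cheaper storage
        return "tier 3 (archive/tape)"            # aged data moves to the archive layer

    print(choose_tier(datetime.now() - timedelta(days=400), business_critical=False))
    ```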

    In summary, managing information is much more complex than managing data. While the industry should be looking towards ILM as a future goal, the technology available today means that DLM is currently more achievable and should be approached as the first step in the process.

  • Tips for Email Management and Archiving

    With only 20 per cent of companies demonstrating good control on email management, Dave Hunt, CEO of C2C, comments on the state of email management and archiving and notes what resellers can do to position themselves as protectors of companies’ most used and valuable communication method.

    Although around 30 per cent of organisations have some form of archiving in place, most consider that this would not constitute adequate control.

    A recent survey by C2C found that 65 per cent of respondents had set mailbox capacity limits, meaning, in effect, that end users were responsible for managing their own mailboxes.

    Just how bad does it get?

    In practice, this self regulation probably results in significant lost productivity and constitutes a poor strategy for managing and discovering data.

    We consider the top five questions being asked by resellers interested in recommending email management:

    1. Is email control a management or an archiving issue?

    It is a management issue and archiving is part of the solution. Resellers should identify a solution that identifies unnecessary emails, handles attachments and provides automated quota management which should be part of a strategic ‘cradle to grave’ management of email. It isn’t a case of archiving email merely to reduce the live storage footprint, but part of a well thought-out strategy, designed hand-in-hand with the customer that aids productivity and time management and that can be implemented by an IT department simply and economically.

    2. What is the biggest problem for email management – storage costs, ‘loss’ of information or compliance issues?

    All of these are problems. Some will cost your customers on a daily basis; others could result in huge fines or liabilities. Failure to preserve email properly could have many consequences, including brand damage, high third-party costs to review or search for data, court sanctions, or even instructions to a jury that it may view a defendant’s failure to produce data as evidence of culpability.

    3. What guidelines should be in place for mailbox quotas – and how can these be made more business friendly?

    Most specialists in email management agree that mailbox quotas are a bad idea. The only sensible use is a quota for automatic archiving, whereby, on reaching an upper mailbox threshold, email is archived automatically (and invisibly to the user) until a lower threshold is reached – a minimal sketch of this approach appears after this list. Our C2C survey also found that those who self-manage email to stay within quotas frequently delete messages, delete attachments and/or create a PST file. The over-reliance on PST files as a means to offload email creates several challenges when companies must meet legal requirements, since PST files do not have a uniform location and cannot be searched centrally for content with traditional technologies. Resellers can explain that reliance on PST files is poor practice.

    4. Once retention schedules and compliance requirements have been met, does the email need to be destroyed – and if so, how should resellers recommend companies go about this?

    In some instances it is necessary to delete emails once the retention period has passed, in others it is only an option. Deletion also depends on the industry type, for instance, does it have to be guaranteed destruction, such as to US DoD standards, or is a simple removal of the email sufficient?

    5. What would your top tips be for email management?

    Resellers that wish to add true value should consider the whole picture of email data management, from the instant an email is sent to the time it is finally destroyed.
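
    As mentioned under question 3, quota-driven automatic archiving can be thought of as a simple high/low watermark loop. A minimal sketch, assuming a mailbox is represented as a list of (message id, size, received date) records – the representation and thresholds are illustrative only, not any particular product's behaviour:

    ```python
    # Sketch of the high/low watermark auto-archiving described in question 3.
    # The mailbox representation and thresholds are illustrative assumptions only.
    from typing import List, Tuple

    Message = Tuple[str, int, str]  # (message_id, size_in_bytes, received_date as ISO string)

    def auto_archive(mailbox: List[Message], high: int, low: int) -> List[Message]:
        """When the mailbox exceeds `high` bytes, move the oldest messages out
        until it falls below `low` bytes. Returns the archived messages."""
        archived: List[Message] = []
        total = sum(size for _, size, _ in mailbox)
        if total <= high:
            return archived
        mailbox.sort(key=lambda m: m[2])  # oldest first (ISO dates sort lexicographically)
        while total > low and mailbox:
            msg = mailbox.pop(0)
            archived.append(msg)          # a real system would move the item to the archive store
            total -= msg[1]
        return archived
    ```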

  • How Do You Turn a PS3 Owner Onto Blu-ray? With a Remote

    Hollywood studios recognise the importance of the PlayStation as a driver for Blu-ray Disc (BD) sales – and remote control ownership is an indicator of household demand

    Reports on Blu-ray’s progress – and difficulties – on the road to becoming the mass-market video format are legion.

    Monica Juniel, vice president of international marketing for Warner Home Video, added an interesting statistic into the mix during her presentation at IFA 2008 in Berlin last week.

    According to the former commercial banker, Sony PlayStation owners who possess remote controls for their games consoles buy more than twice as many BDs as those who don’t.

    Perhaps not rocket science, since if you are going to be watching movies on the PS3 it’s fairly fiddly doing it with a game controller.

    But with millions of PS3s sold around the globe, it’s understandable why those with an interest in the Blu-ray industry pay particular attention to how they’re used.

    Games and Movies

    In July, a report from the Entertainment Merchants Association (EMA) showed that 87 per cent of PS3 owners watched Blu-ray movies on their console.

    While this is an impressive headline figure, Warner aren’t getting carried away with it.

    According to Juniel, owners of stand-alone Blu-ray players buy twice as many BDs as PlayStation households.
    She said this undoubtedly meant there were “other opportunities” for the format.

    “There are a few things that are slowing us down,” she said. “PS3 comprises the majority of the installed base, but with software buy rates significantly lagging behind those of Blu-ray set-top box owners.”

    Control by Remote

    Juniel said one way to drive BD movie sales was to “convert PS3 households via remote control usage” – the logic presumably being that if it’s easier to play the disc, you’re more likely to buy more of them.

    No figure was given for the percentage of PS3 owners who have remote controls, but HDTV.biz-news.com has asked for the data and will post an update as soon as it is received.

    She also detailed a few other barriers to purchasing Blu-ray Discs, such as hardware prices and consumer indifference.

    Remove issues such as these and there might be a lot more people happily zapping their PS3s.

    Do you agree? Please let us know your comments on what the real barriers are to consumers adopting Blu-ray technology.