Tag: biztalk

  • There's No Such Thing As Too Much Storage

    As demand for mobile data storage keeps rising, the hard drive industry needs to work harder at adapting its technology and products to keep pace.


    Storage.biz-news.com spoke to Daniel Mauerhofer, of storage giant Western Digital, to find out more about this evolving market.

    Desktop computing remains the largest market for hard drives but the young upstart – consumer electronics – is the fastest growing.

    Demand for data storage is soaring in everything from PDAs, navigation systems and automotive applications to handheld devices that store music, books, news content, movies and television programs.

    In parallel with this is the need for portable data collection devices, something storage giant Western Digital (WD) has been quick to pick up on.

    It recently launched My Passport, a 500 GB capacity portable USB drive that is small enough to fit in the palm of your hand.

    Not so long ago it would have been inconceivable that most consumers could use that amount of storage capacity – let alone in a mobile format.

    Yet Daniel Mauerhofer, senior PR manager EMEA for WD, said that since storage space was now quickly eaten up by even modest amounts of photo, video and music files, finding a use for half a Terabyte of storage wasn’t that difficult.

    He said the advent of compact cameras capable of ever-higher resolution meant even just storing photographs required a great deal of storage space.

    “There’s no such thing as too much storage these days,” he said.

    WD was founded in Lake Forest, California in 1970 and has been manufacturing internal hard drives since 1990. It moved into the external drive market four years ago.

    While its principal markets – desktop and notebook computing – are expected to continue growing strongly, the launch of the My Passport portable series positions it strongly in the consumer electronics sector.

    This hard drive market, which today accounts for sales of 81 million units worth more than USD $6 billion, is expected to grow to 220 million units in 2010 – a compound annual growth rate of 29 per cent.
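
    For anyone checking the arithmetic, that figure assumes a span of roughly four years (say, 2006 to 2010 – the baseline year is implied rather than stated):

        CAGR = (220 / 81)^(1/4) − 1 ≈ 0.284, or roughly the 29 per cent quoted.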

    Mauerhofer said external drives generated very little revenue for WD three years ago.

    “Now they represent a fifth of our turnover. It’s a billion dollar business now,” he said. “People are spending considerable time on the internet and its penetration is getting better, so people are downloading more and more. We do not see that stopping.”

    For this reason, the consumer rather than corporate user is seen as being the principal buyer of My Passport portable drives.

    This is borne out by the sleek design and color choices for the drives – a far cry from the customary image of external drives as functional “blocks”.

    Technology is evolving to cope with the ever-increasing demand for portable storage

    Mauerhofer said the industry currently used Perpendicular Magnetic Recording (PMR), which still had potential for greater capacity.
    So much so that he predicted that within the next 18 months a 1 Terabyte storage drive would become available.

    “There is a big need in the B2B enterprise space for huge capacity coupled with small form factor and it’s a safe bet to say you will find them in our portable products as well,” he said.

    However, Mauerhofer said there would come a point when even the PMR technology reached a capacity limit. This would open up the market to replacement technology such as Heat Assisted Magnetic Recording.

    The consumers’ appetite for storage appears insatiable – but technology has managed to keep ahead of the game. Can it continue to do so?

  • Mobile TV To Become Standard Feature of Smartphones

    Mobile TV has so far achieved great popularity only in nations such as Japan and Korea.

    But the market is expected to expand rapidly over the next few years, spurred on by the smartphone which is driving improvements in screen quality, microchips and antennas.

    Smartphone.biz-news.com spoke to David Srodzinski, chief executive of fledgling semiconductor firm Elonics, about his expectations for the future of mobile TV.

    Mobile TV will soon become as accepted a feature of mobile handsets as the camera.

    That is the prediction of David Srodzinski, founder and chief executive of Elonics, a semiconductor company that has designed a silicon radio frequency (RF) tuner used to convert signals into sound and pictures.

    “We do see mobile TV as going to take off just like the camera phone has taken off,” he said.

    “It’s not something you will use all the time, but it’s a part of the phone that will be such a ‘nice to have’ feature that all phones will simply have to have them.”

    David Srodzinski, CEO of Elonics

    Based in Livingston, Scotland, Elonics recently announced that David Milne, the founder and former chief executive of chip maker Wolfson Microelectronics, was joining its board as non-executive chairman.

    Milne was credited with taking Wolfson from a university spinout to the FTSE 250 and the company made its name as a key supplier of microchips to the iPod.

    Founded in 2003, Elonics has developed an RF architecture called DigitalTune that is the foundation for a family of re-configurable CMOS RF front end products.

    Its E4000 device is designed for reception of all major worldwide fixed and handheld terrestrial digital multimedia broadcast standards within the UHF to L-Band ranges (76 MHz to 1.70 GHz).

    It allows designers to implement front ends capable of cost-effectively supporting multiple TV and radio broadcast standards, enabling smaller, lighter, cheaper and lower-power consumer electronics.

    Elonics has finished market sampling its products and is about to begin mass-production.

    Srodzinski said the immediate focus for the broadcast receiver technology was the traditional TV market, ranging from digital TVs, set-top boxes and PC TVs to multi-media devices.

    But he believed the biggest opportunities lay in the mobile TV market, with analysts forecasting sales of mobile TV enabled handsets rising to 100 million in 2010.

    “All future potential growth is coming from the cell phone side of the market,” he said. “Smartphones are increasingly a sizeable part of that market.”

    Screen size and quality are key factors influencing the adoption of mobile TV on cell phones

    Srodzinski said that with QVGA screens appearing on increasing numbers of handsets, a barrier to mobile TV was being removed.

    He said that prior to the introduction of QVGA screens, adding mobile TV to a cell phone meant additional costs for the screen, the graphic processors and mobile TV chip set.

    “With the advent of QVGA offerings, such as on the new HTC phones and the iPhone, which have them as standard, the cost add of mobile TV is minimal now,” he said.

    For the screens alone, Srodzinski estimated that the cost add had dropped to a tenth of what it was, from USD $50 to $5.
    “All that has to be added now is the mobile TV chip set,” he said.

    But if cost and technological issues were no longer an impediment to widespread uptake of mobile TV, what about users’ appetite for the service?

    Srodzinski expected mobile TV to be something people would use once or twice a week for five to 10 minutes, most probably as a free-to-view service.

    “That user experience will be such a good feature and such a compelling reason, that people will want mobile TV on their cell phones in a similar way to how they want to have camera phones too,” he said.

    “We believe that if mobile TV works and takes off in that way, it will be a major opportunity that will grow out of the smartphone and into middle layer cell phones.”

    The great success of mobile TV in Japan and Korea, where penetration rates now reach 40 per cent, owes a great deal to government intervention promoting the services, according to Srodzinski.

    He said this had created revenue opportunities and lifted technological barriers to entry.

    “What’s holding back other parts of the world has more to do with the infrastructure roll-out and the cost of doing that,” he said. “That and the lack of clear government support.”

    However, Srodzinski insisted that the growth of mobile TV in territories outwith Japan and Korea would accelerate as more people experienced it and saw the quality of the services and content.

    “I think other regions will catch on,” he said. “This is not a technological push situation – it has to be a consumer-led requirement, especially if it’s free-to-air that takes off.”

    While content may be free, any explosion in mobile TV will also have to offer opportunities for revenue to the industry.
    As Srodzinski said: “The question has to be: who makes any money out of it? There’s no particular economic benefit to operators.”

    Undoubtedly an answer to that conundrum will be found, but will mobile TV really take off?
    Please let us know your thoughts on the matter.

  • Vyke Says Mobile Operators Risk Being Leap-frogged in Evolving Market


    VoIP provider Vyke has warned that mobile operators are poorly positioned to cope with the latest industry developments.

    Aaron Powers, head of business development at Vyke, said the operators are failing to spearhead new innovations – leaving them open to greater than ever competition from a new breed of rivals.

    "Metro WiFi networks are springing up everywhere; there’s one in Singapore – a whole country covered by WiFi," he said.

    "At the same time handsets are advancing to a stage where they are becoming the access point for services installed by the consumer, meaning they don’t need network-provided services anymore, and this goes straight to the core of mobile operators’ revenues.

    "That leaves them poorly positioned in a rapidly evolving market. They need to be the ones directing these changes in a way that benefits them or they’ll get leap-frogged."

    Powers also criticised telecoms operators for opposing the European Commission’s proposal to cap mobile data roaming rates.

    European telecoms commissioner Viviane Reding is set to recommend restrictions on data roaming fees this autumn.

    She has also made clear she intends to impose caps on SMS roaming charges and mobile termination rates, proposals which have drawn widespread criticism from a number of European telcos.

    Citing a recent GSMA report, Powers claimed that the rising uptake of mobile data services was boosted by a 25 per cent fall in roaming rates in the year to April 2008.

    "Actually when you look at it, a small reduction in roaming rates has led to operators making a lot more money off data by volume of usage," he said.

    "Yet all of a sudden there’s uproar when the EU tries to set a cap – mobile providers have taken a head in the sand point of view."

  • VoIP Providers Must Allow Emergency Calls and Give Caller Location


    The UK communications industry regulator, Ofcom, has told internet telephony providers that they must now allow emergency 999 calls over their networks or face the risk of enforcement action.

    Caller location information must also be provided where technically feasible.

    Effective immediately, the ruling for Voice over Internet Protocol (VoIP) providers affects businesses such as BT, Vonage and Skype that offer services that connect VoIP calls to the public telephone network.

    Operators must now provide the ability to make calls to 999, the emergency number used in the UK, and 112, the number most used in other EU countries.

    Ofcom had previously told operators to place stickers on equipment or on-screen labels indicating whether or not emergency calls were possible over a service.

    The rule, known as General Condition 4 of the General Conditions of Entitlement, also provides that the network operator must provide Caller Location Information for calls to the emergency call numbers "to the extent that is technically feasible".

    Ofcom said that ‘technically feasible’ should be taken to mean that location information must be provided where the VoIP service is being used at a predominantly fixed location.

    In May, a child died in Calgary, Canada after an ambulance was dispatched to the wrong address in response to an emergency call placed by his parents using a VoIP phone. The ambulance had been dispatched to an address in Ontario, 2,500 miles away.

    The requirements already apply to fixed line and mobile communications providers but the VoIP industry had resisted their extension.

    In December last year, the Voice on the Net (VON) Coalition Europe was set up as a lobby group to influence the regulation of internet telephony.

    The group, which includes Google, Microsoft and Skype among its founding members, warned against the “premature application” of emergency call rules to VoIP services that are not a replacement for traditional home or business phone services.

    The VON Coalition said the move "could actually harm public safety, stifle innovations critical to people with disabilities, stall competition, and limit access to innovative and evolving communication options where there is no expectation of placing a 112 call".

  • Skype Questions Carriers’ Commitment to "Open" Networks


    Christopher Libertelli, Skype’s senior director of government and regulatory affairs for North America, has written a strongly-worded letter complaining that the major US wireless carriers are all talk when it comes to "open" networks.

    Writing to the Federal Communications Commission (FCC) chairman, Kevin Martin, he said that if the Commission wanted to live up to its stated goal of making open networks more accessible, it would affirm that this policy covered wireless networks.

    Libertelli said that last week at the CTIA Wireless IT and Entertainment conference in San Francisco, the major US carriers paid lip service to the idea of open networks, but strongly cautioned that too much choice would lead to chaos and damage the viability of their business model.

    "The attitude of the wireless carriers was perhaps best summed up in Sprint Nextel Corp,” he wrote.

    He quoted Sprint CEO Dan Hesse’s recent comment: “The big Internet can be daunting… There can be too much choice.”

    Libertelli continued: “This stands in stark contrast to the Commission’s wise policies designed to promote as much consumer choice as possible."

    He said Skype was mindful of the challenges wireless carriers faced in moving to an open network. But he also said it was not enough to simply talk about open networks.

    "Consumer choice, competition and free markets, not carriers acting to block competition, should win the day in wireless – now, not later," he said.

    "If the Commission believed that the transition to more open networks was going to proceed quickly, statements out of CTIA’s convention suggest just the opposite.”

  • Coming Year Important for New Wireless HDTV Products

    Wireless High Definition Special: Over the coming weeks hdtv.biz-news.com will be interviewing representatives from the competing wireless high definition TV systems to assess their current state of readiness and future viability.

    To kick things off, Steve Wilson, principal analyst at ABI Research, which recently produced the report Wireless Video Cable Replacement Market and Technologies, gives his opinion on wireless HDTV developments.

    The end-of-year shopping season, followed by the annual CES trade show in January, will give the next indications of the likely short-term prospects for wireless high-definition television systems in the consumer space.

    Holiday sales of existing products and new product announcements at CES will help paint a picture as to which of several competing systems – if any – is likely to lead the charge towards wide consumer acceptance of wireless HDTV.

    There are three contending technologies, loosely characterized as: 5 GHz, 60 GHz, and ultra-wideband (UWB).

    Small numbers of 5 GHz and UWB devices are currently shipping; demo products of 60 GHz systems are expected early next year.

    “Over the next two to three years, we’re going to see one or two of these wireless HDTV approaches emerge as the primary ones,” said Wilson.

    Two industry groups have emerged to promote 5 GHz and 60 GHz solutions.

    Israeli company Amimon, on whose technology the 5 GHz platforms are based, took the initiative in July, forming the WHDI Special Interest Group, which has been joined by Hitachi, Motorola, Sharp, Samsung and Sony.
    Hedging their bets, the latter two vendors are also members of the competing industry body, WirelessHD, which promotes the 60 GHz approach designed by SiBEAM, Inc.

    Other members of WirelessHD include Intel, LG Electronics, Matsushita Electric, NEC and Toshiba.

    Samsung is said to believe that WHDI should be seen as a stopgap technology until WirelessHD becomes “the ultimate solution in the long run”.

    But until then, Wilson believes “the WHDI group has the early momentum”.

    He continued: “Announcements at CES of systems using the 60 GHz band will give some indication of whether consumer products will actually make it to market in 2009.

    “The coming year will be a very important period for the introduction of all types of new wireless high-definition TV products.”

  • Creativity the Key to Secure Data Backup

    Guus Leeuw jr, president & CEO of ITPassion Ltd, urges creativity in the way data is stored.

    Any piece of electronic information needs to be stored somewhere, and in a way that guarantees access to it over the years.

    You want that information backed up, in case disaster strikes, so that you can restore and access it again. Some information also needs to be kept for a long period, typically three or seven years.

    Let’s focus on backup and restore for a moment. Often, a system or its data is backed up for disaster recovery purposes.

    Tapes are then eventually sent off-site for safe storage, and must later be re-introduced to a restore environment. What happens to a tape while it is in secure storage is often unknown to the enterprise.

    A tape that is sent for off-site storage contains some form of catalogue to identify the tape and its contents.
    This catalogue, in extreme cases, must hold enough information to retrieve the stored data even if a completely new backup environment has to be installed after a disaster.

    Backup solutions conforming to the NDMP standard could use a prescribed recipe to store the data on tape, in the form of well-defined storage records. Anybody with a conforming reader application could then retrieve the data from the tape and inspect it.

    This is a potential security risk, especially in light of recent incidents of lost data and the concern they caused among the general public. It would be good if backups were duly encrypted, so that even a skilled hacker could not read the contents of the tape – particularly important considering that many government agencies deal with private data.
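
    A minimal sketch of that idea in Python, assuming the third-party cryptography package and invented file names (an illustration of the principle, not anything the NDMP standard or any vendor prescribes): the data is encrypted before it ever reaches the tape image, so a conforming reader without the key recovers only ciphertext.

        # Hypothetical sketch: encrypt backup data before it is written to tape.
        # Requires the third-party 'cryptography' package; paths are invented.
        from cryptography.fernet import Fernet

        def encrypt_to_tape(source_path: str, tape_image: str, key: bytes) -> None:
            cipher = Fernet(key)
            with open(source_path, "rb") as src, open(tape_image, "wb") as tape:
                # Fernet encrypts whole byte strings; stream in chunks for
                # datasets too large to hold in memory.
                tape.write(cipher.encrypt(src.read()))

        key = Fernet.generate_key()  # store the key well away from the tape
        encrypt_to_tape("payroll.db", "tape0001.img", key)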

    Equally important is the fraud we hear about so often in the news lately: thrown-away computers shipped to some far-away location, where the hard disks are inspected for credit card details and other “useful” private data. It would be good if a PC had a little program that wipes all data securely off the disk before it is switched off for the last time.

    Governments have defined standards that support this kind of security: Air Force System Security Instruction 5020, CESG guidance and the German VSITR standard, to name a few. Tools are not hard to find; however, they are generally not free, and in my opinion governments could do more to publicise the availability of this type of product.
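
    To make the idea concrete, here is a toy multi-pass overwrite in Python. The pass pattern follows one common description of the German VSITR scheme (six alternating 0x00/0xFF passes plus a final 0xAA pass) – an assumption for illustration, and wiping a single file is a simplification: real tools operate on whole devices and handle bad blocks and verification.

        import os

        # Pass pattern per one common description of VSITR; illustrative only.
        PATTERNS = [b"\x00", b"\xff"] * 3 + [b"\xaa"]

        def wipe_file(path: str) -> None:
            size = os.path.getsize(path)
            with open(path, "r+b") as f:
                for pattern in PATTERNS:
                    f.seek(0)
                    f.write(pattern * size)
                    f.flush()
                    os.fsync(f.fileno())  # force each pass onto the disk
            os.remove(path)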

    Talking of storage, let’s focus on the part of the storage infrastructure that is mostly “forgotten”, but very critical: the fibre optical network between the server equipment and the actual storage equipment.

    With the current trend to reduce carbon footprint and hence save the planet, there is another aspect of virtualisation that is actually more critical to business than the reduction of carbon footprint alone: cost savings. Did you know that you can slash your annual IT cost by at least 40 per cent by opting for virtualised server environments alone? You need less hardware, which is the biggest cost, and overall you spend less on power and cooling.

    As these virtualised environments support more and more guest environments, simply because the underlying physical layer gets more powerful, a faster and better access to the back-end storage systems is required.

    Speeds of up to 8Gbps are not unheard of in the industry for storage networks, and even storage devices are starting to support 8Gbps connection speeds. Do you need it? Not always. But if you are supporting several I/O-intensive guest servers, you might be surprised how much more throughput you can achieve over 8Gbps bandwidth than over 4Gbps.

    Implementing Microsoft Exchange environments on virtualised hardware becomes very possible, especially if you can achieve guaranteed end-to-end data paths, from virtual server to storage, as if your virtual environment were a physical one.

    Hosting for multiple government agencies also starts to wander into the realm of the possible. If all the agencies in a county were to put their IT together, great things could happen to the overall cost of running government IT.

    Sharing knowledge and space wherever possible would seem a good strategy to follow, especially now that the public is intent on reducing government expenditure, increasing the success rate of government IT projects and, last but not least, enforcing the reduction of carbon footprint – a goal the Government itself supports.

    Overall, a good many ways exist to increase the capabilities of storage, backup and restore, and archiving. It is time the IT industry became creative in this area.

  • Tips for email management and archiving

    With only 20 per cent of companies demonstrating good control on email management, Dave Hunt, CEO of C2C, comments on the state of email management and archiving and notes what resellers can do to position themselves as protectors of companies’ most used and valuable communication method.

    Although around 30 per cent of organisations have some form of archiving in place, most consider that this would not constitute adequate control.

    A recent survey by C2C found that 65 per cent of respondents had set mailbox capacity limits, meaning, in effect, that end users were responsible for managing their own mailboxes.

    Just how bad does it get?

    In practice, this self regulation probably results in significant lost productivity and constitutes a poor strategy for managing and discovering data.

    Here we consider the top five questions being asked by resellers interested in recommending email management:

    1. Is email control a management or archive issue?

    It is a management issue and archiving is part of the solution. Resellers should identify a solution that identifies unnecessary emails, handles attachments and provides automated quota management which should be part of a strategic ‘cradle to grave’ management of email. It isn’t a case of archiving email merely to reduce the live storage footprint, but part of a well thought-out strategy, designed hand-in-hand with the customer that aids productivity and time management and that can be implemented by an IT department simply and economically.

    2. What is the biggest problem for email management – storage costs, ‘loss’ of information or compliance issues?

    All of these are problems. Some will cost your customers on a daily basis; others could result in huge fines in liability. Failure to preserve email properly could have many consequences including brand damage, high third-party costs to review or search for data, court sanctions, or even instructions to a jury that it may view a defendant’s failure to produce data as evidence of culpability.

    3. What guidelines should be in place for mailbox quotas – and how can these be made more business friendly?

    Most specialists in email management agree that mailbox quotas are a bad idea. The only use would be a quota for automatic archiving, whereby, on reaching a specific mailbox threshold, email is archived automatically (and invisibly to the user) until a lower threshold is reached. Our C2C survey also found that those who self-manage email to stay within quotas frequently delete messages, delete attachments, and/or create a PST file. The over-reliance on PST files as a means to offload email creates several challenges when companies must meet legal requirements, since PST files do not have a uniform location and cannot be searched centrally for content with traditional technologies. Resellers can explain that reliance on PST files is poor practice.
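
    A hedged sketch of that watermark scheme in Python (the Message type, thresholds and in-memory lists are invented for illustration; a real product would hook into the mail server): once usage crosses the upper threshold, the oldest messages move to the archive until usage falls below the lower one.

        from dataclasses import dataclass

        @dataclass
        class Message:
            received: int  # epoch seconds; oldest messages are archived first
            size: int      # bytes

        def auto_archive(mailbox: list, archive: list, high: int, low: int) -> None:
            """Silently move the oldest messages to the archive once the
            mailbox exceeds `high` bytes, until it drops below `low` bytes."""
            usage = sum(m.size for m in mailbox)
            if usage < high:
                return
            for msg in sorted(mailbox, key=lambda m: m.received):
                if usage < low:
                    break
                mailbox.remove(msg)
                archive.append(msg)
                usage -= msg.size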

    4. Once retention schedules and compliance requirements have been met, does the email need to be destroyed – and if so, how should resellers recommend companies go about this?

    In some instances it is necessary to delete emails once the retention period has passed, in others it is only an option. Deletion also depends on the industry type, for instance, does it have to be guaranteed destruction, such as to US DoD standards, or is a simple removal of the email sufficient?

    5. What would your top tips be for email management?

    Resellers that wish to add true value should consider the whole picture of email data management, from the instant an email is sent to the time it is finally destroyed.

  • Ten Criteria For Enterprise Business Continuity Software

    Jerome Wendt, president and lead analyst of DCIG Inc, an independent storage analyst and consulting firm, outlines 10 criteria for selecting the right enterprise business continuity software.

    The pressures to implement business continuity software that can span the enterprise and recover application servers grow with each passing day.

    Disasters come in every form and shape, from regional disasters (earthquakes, floods, lightning strikes) to terrorist attacks to brown-outs to someone accidentally unplugging the wrong server.

    Adding to the complexity, the number of application servers and virtual machines is on the rise while IT headcounts are flat or shrinking.

    Despite these real-world situations, companies often still buy business continuity software that is based on centralized or stand-alone computing models that everyone started abandoning over a decade ago.

    Distributed computing is now almost universally used for hosting mission critical applications in all companies.

    However, business continuity software that can easily recover and restore data in distributed environments is still based on 10-year-old models.

    This puts businesses in a situation where they end up purchasing business continuity software that can only recover a subset of their application data.

    Organizations now need a new set of criteria that accounts for the complexities of distributed systems environments.

    Today’s business continuity software must be truly enterprise and distributed in its design.

    Here are 10 features that companies should look for when selecting business continuity software to meet the needs of their enterprise distributed environment (a sketch of how a few of these might surface as configuration follows the list):

    • Heterogeneous server and storage support.
    • Accounts for differences in performance.
    • Manages replication over WAN links.
    • Multiple ways to replicate data.
    • Application integration.
    • Provides multiple recovery points.
    • Introduces little or no overhead on the host server.
    • Replicates data at different points in the network (host, network or storage system).
    • Centrally managed.
    • Scales to manage replication for tens, hundreds or even thousands of servers.
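
    As a sketch of how a few of these criteria might surface in practice (the class and field names below are hypothetical, not any vendor's actual API), a policy object could expose the replication mode, the tier at which replication runs, WAN throttling and the number of recovery points:

        from dataclasses import dataclass

        @dataclass
        class ReplicationPolicy:
            mode: str              # "synchronous", "asynchronous" or "snapshot"
            tier: str              # replicate at the "host", "network" or "array"
            wan_limit_kbps: int    # throttle replication traffic over WAN links
            recovery_points: int   # point-in-time copies to retain
            app_consistent: bool   # quiesce the application before each copy

        # Example: an asynchronous, array-based policy for an Exchange server.
        exchange = ReplicationPolicy(mode="asynchronous", tier="array",
                                     wan_limit_kbps=4096, recovery_points=24,
                                     app_consistent=True)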

    The requirements for higher, faster and easier enterprise business continuity have escalated dramatically in the last decade, while the criteria for selecting the software remain rooted in yesterday’s premises and assumptions.

    Today’s corporations need to re-evaluate not only the software they use to perform these tasks but also the criteria on which they base these decisions.

    The 10 criteria listed here should provide a solid starting point for picking business continuity software that meets the requirements of today’s enterprise distributed environments while still providing the central control and enterprise-wide recoverability that companies need to recover their business.

    To read the full criteria please go to DCIG Inc.

  • Blu-ray is an unstoppable train

    Europeans told to learn from US retailers in order to convert consumers to Blu-ray and drive it into the mainstream

    Why upgrade to Blu-ray when the old DVD player still manages to churn out a pretty good picture?

    That appears to be a question many people have been asking themselves, especially when prices for Blu-ray players and discs remain high.

    Not for much longer, however, according to various speakers at the Blu-ray Disc Association’s (BDA) press shindig at the IFA electronics trade show in Berlin.

    They were keen to dispel any concerns that the format will never quite make it into the mainstream – though it was conceded that more work is necessary before Blu-ray finally puts DVD to the sword.

    Jim Bottoms, managing director of Futuresource Consulting, told the IFA conference that DVD’s market penetration had reached a point in the 1990s when it could be described as an “unstoppable train”.

    He said that was now the case for Blu-ray in the US and within 6-12 months it would also be true for Europe.
    “At this stage it’s too early to make that call for Europe but we are only six months away from it,” he said.

    “In the US, that call can be made now. It will be pretty much impossible to stop Blu-ray becoming a mass market product in the US.”

    He added: “We are moving forward to a situation where Blu-ray really is growing with its own momentum to become a train that is unstoppable.”

    Work remains to be done in Europe

    Things aren’t hurtling along quite so forcefully in Europe, though, where BD sales will reach 12 million discs this year, according to Bottoms.

    This only accounts for around 2 per cent of total video sales, although he expects the share to climb to 5-6 per cent next year – and keep rising swiftly.

    However, by 2012 DVD will still lead in the UK, 56 per cent to 44 per cent. BD will do better in Germany – it’ll take 46 per cent of the market – but less well in Spain and Italy – 43 per cent and 39 per cent, respectively.

    To encourage the market along, Bottoms said Europe had to learn from the US, particularly from retailers there who have got behind Blu-ray by promoting it in stores and demonstrating the format’s superior quality.

    He said there was evidence that some consumers had been “turned off” HD based on only viewing broadcast HD programmes.

    They hadn’t found the quality sufficiently superior to judge it worthwhile paying to upgrade from their existing DVD players.

    Demonstrating Blu-ray at point-of-sale areas had been shown to be very effective in persuading people of the format’s quality.

    Initiatives such as improved retail support would ultimately help close what Bottoms described as the HD content gap in Europe.

    He said this situation had arisen because currently around a third of households had HD screens but only 2 per cent could get high def content.

    This was compared to the US, where 50-60 per cent of households had HD screens and around a third could access high def content.

    Frank Simonis, chairman of the Blu-ray Disc Association’s European Promotions committee, not surprisingly agreed that Blu-ray had reached the point of going mass market.

    He said the European market would start to accelerate in the autumn, adding: “You will see a lot of good things this fall. European consumers are hungry for high def.”

    Simonis defended the lag in the release of European movies compared to the US and the higher price of European Blu-ray discs – a huge sore point with many consumers.

    He said Europe, despite being a similar sized market to the US, had 15 different languages and individual markets in each country – making it a very different proposition to the US.

    “We have to work on an individual country basis for each launch plan,” he said. “So it’s one year behind the US. It’s not something we like but something that’s due to the nature of the European continent.

    “So we are not doing that badly – in fact, if you put Europe on the same timeline as the US, Europe is faster.”

    How would you describe the Blu-ray Express – hurtling unstoppably or trundling along? Please let us know your comments.