Tag: infrastructure

  • West & Central African Com: MTN Nigeria Targets Customer Satisfaction to Expand Market Reach


    VIDEO INTERVIEW: Ahmad Farroukh, CEO of MTN Nigeria, was interviewed at the recent West & Central African Com conference held in Abuja, Nigeria.
    He talks about plans for increasing market share by targeting customer satisfaction. Farroukh also discusses infrastructure sharing and MTN’s Community Phone Service.

  • Demand for VoIP Solutions Likely to Rise with Spread of Satellite Broadband Technology


    VoIP solutions are likely to benefit from more people around the world seeking to access the internet using satellite technology, according to research from Global Industry Analysts.

    The technology has been tipped as a possible way to provide broadband services in more remote, rural communities where it will be much more difficult to deploy conventional broadband infrastructure.

    The analysts said this could mean web users make better use of VoIP solutions and are able to access high-speed downloads many miles from the nearest telephone exchange.

    The study said: "The satellite broadband’s capability to extend unique services such as rural telephony, e-distance learning and telemedicine services is enticing the prospective market participants in a major way."

    It added that the value of the satellite broadband market could reach nearly £4 billion within six years if interest and subsequent take-up continue to grow.

  • INSIGHT: External IT's Joseph Stedler on the Advantages of Storage Virtualization in Private Clouds


    DataCore Software has announced that hosted IT-as-a-service company External IT has standardized on its SANsymphony storage virtualization software to serve as its storage area network (SAN).

    The combination of VMware virtual servers, Citrix XenApp and DataCore storage virtualization allows External IT to deliver a complete virtualization infrastructure.

    Joseph Stedler, senior engineer and Dallas data center manager at External IT, said this takes the form of private computing "clouds", tailored individually to each client's needs.

    He said he has worked with traditional SANs for eight years and has firsthand experience with every major hardware SAN – including EMC, HP and NetApp.

    "There are various, major drawbacks to hardware SANs. One is the fact that there is a single point of failure at the disk level," he said.

    "This is particularly the case when doing, for example, firmware upgrades – on the controllers, on the disks, on the shelves – whereby you have to take the SAN down to perform that task.

    "The second most irksome characteristic of hardware SANs is their cost. These EMC SANs, these HP EVAs are inherently expensive, particularly during upgrade time."

    Stedler said there are capabilities that DataCore brings to the table that he "absolutely loves".

    "The concept of having two SANs as your one SAN environment is just elegantly simple," he said. "You have an ‘A’ side and a ‘B’ side."

    Stedler said the beauty of this is that when hardware maintenance or firmware upgrades are needed, an administrator can take down half of the SAN and still have the other half serving production traffic, completely uninterrupted.
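
    As a rough illustration of the 'A' side/'B' side concept, the sketch below models a mirrored pair in which every write lands on both online sides, so either side can be taken offline for maintenance while the other keeps serving I/O. It is a minimal Python sketch with hypothetical names, not DataCore's SANsymphony implementation.

```python
# Illustrative sketch (not DataCore's implementation): a mirrored "A/B" SAN pair
# in which every write lands on both online sides, so either side can be taken
# offline for firmware or hardware maintenance while the other keeps serving I/O.

class MirroredSAN:
    def __init__(self):
        # Each side holds a full copy of every block; names are hypothetical.
        self.sides = {"A": {"online": True, "blocks": {}},
                      "B": {"online": True, "blocks": {}}}

    def write(self, lba, data):
        # Writes are mirrored to every online side; resynchronising a side
        # that returns from maintenance is omitted for brevity.
        for side in self.sides.values():
            if side["online"]:
                side["blocks"][lba] = data

    def read(self, lba):
        # Reads are served by whichever side is currently online.
        for side in self.sides.values():
            if side["online"]:
                return side["blocks"].get(lba)
        raise RuntimeError("no SAN side available")

    def maintenance(self, name):
        # Take one side down, e.g. for a controller firmware upgrade;
        # production I/O continues uninterrupted on the other side.
        self.sides[name]["online"] = False

    def restore(self, name):
        self.sides[name]["online"] = True


san = MirroredSAN()
san.write(0, b"payload")
san.maintenance("A")              # half the SAN is down for upgrades
assert san.read(0) == b"payload"  # reads still succeed from side B
```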

    "The second, major benefit of DataCore for External IT has to do with performance," he said.

    "With DataCore, you will experience enormous performance gains. The performance that DataCore delivers is nothing short of awesome."

    Other benefits that make up the "DataCore Difference" for External IT include Seamless Maintenance, Disaster Recovery (through asynchronous replication) and the Flexibility to create your own SANs.

  • China 3G Build Props Up Global Mobile Gear Market


    Huawei Technologies doubled its market share in the mobile network infrastructure market in the first quarter of 2009.

    The Chinese company's success comes as domestic mobile operators prepare to spend over USD 20 billion this year on rolling out the initial phases of China's 3G deployments.

    This has led to a record number of 3G base station shipments.

    However, while Huawei now occupies third position in the market, there appears to be little sign of cheer aside from the activity in China.

    A quarterly market report from Dell’Oro said the global mobile infrastructure market contracted by nine per cent in January-March compared to the same period a year ago.

    It said the GSM market experienced its largest year-over-year decline, and that without China's 3G tenders driving WCDMA and CDMA base station deployments, the overall market would have fallen further.

    Scott Siegler, senior analyst at Dell’Oro Group, said China Unicom’s WCDMA deployment is shaping up to be the single largest 3G deployment in history.

    He said it was the primary contributor to a record 100,000 Node B shipments in the quarter.

    "With the CDMA market declining elsewhere around the world, China Telecom’s spending resulted in the most CDMA base station shipments in over four years," he said.

    "As the two GSM operators, China Mobile and China Telecom focused their spending on the rapid deployment of their 3G networks, spending on their GSM networks significantly declined.

    "We expect this spending to accelerate in the second half of the year."

    During the quarter, Huawei experienced the greatest rate of growth, almost doubling its share of the total infrastructure market to 15 per cent compared to the same quarter last year.

    Meanwhile, market leader Ericsson increased its share slightly to 33 per cent of the market in January-March, while Nokia Siemens Networks dropped to 21 per cent from 24 per cent a year ago, according to Dell’Oro.

    Alcatel-Lucent saw its share fall to 14 per cent from 16 per cent.

    Even less fortunate was North America's largest maker of telecommunications gear, Nortel, which saw its market share halve to 4 per cent from a year ago.

    The company filed for Chapter 11 bankruptcy protection in US federal court and for creditor protection in Canada’s Ontario Superior Court of Justice in January.

    With the market expected to remain tight and extremely competitive, others could well go down the same path.

  • Virtualisation – Back-Up And Recovery Strategies

    With the wave of virtualisation sweeping across the business IT infrastructure, Mark Galpin, product marketing manager of Quantum, encourages IT managers to embrace the advantages of virtualisation after fully considering the impact on the back-up and recovery infrastructure.

    There can be no doubt that virtualisation is the technology trend of the moment.

    Google the term and more than 30 million links offering expertise in the area will appear in milliseconds – and this is not just more technology hype.

    The virtualisation trend is having an impact on the business IT landscape.

    Drivers for virtualisation range from hardware, power and space savings through to increased manageability and data protection.
    Analyst group Forrester reports that 23 per cent of European firms are today using server virtualisation, and an additional 12 per cent are piloting the process as a means of reducing costs.

    IDC also predicts that the proportion of servers shipped that are virtualised will rise to 15 per cent in 2010, compared to 5 per cent in 2005.

    And with the recent flotation of virtualisation leader VMware at a market value of £9 billion, many investors as well as IT experts are betting their businesses on this trend becoming accepted everyday best practice.

    Virtualisation brings benefits

    Virtualisation has brought us new ways of doing things from managing desktop operating systems to consolidating servers.
    What’s also interesting is that virtualisation has become a conceptual issue – a way to deconstruct fixed and relatively inflexible architectures and reassemble them into dynamic, flexible and scalable infrastructures.

    Today’s powerful x86 computer hardware was originally designed to run only a single operating system and a single application, but virtualisation breaks that bond, making it possible to run multiple operating systems and multiple applications on the same computer at the same time, increasing the utilisation and flexibility of hardware.

    In essence, virtualisation lets you transform hardware into software to create a fully functional virtual machine that can run its own operating system and applications just like a “real” computer.

    Multiple virtual machines share hardware resources without interfering with each other so that you can safely run several operating systems and applications at the same time on a single computer.

    The VMware approach to virtualisation inserts a thin layer of software directly on the computer hardware or on a host operating system. This software layer creates virtual machines and contains a virtual machine monitor or “hypervisor” that allocates hardware resources dynamically and transparently so that multiple operating systems can run concurrently on a single physical computer without even knowing it.

    However, virtualising a single physical computer is just the beginning. A robust virtualisation platform can scale across hundreds of interconnected physical computers and storage devices to form an entire virtual infrastructure.

    By decoupling the entire software environment from its underlying hardware infrastructure, virtualisation enables the aggregation of multiple servers, storage infrastructure and networks into shared pools of resources that can be delivered dynamically, securely and reliably to applications as needed. This pioneering approach enables organisations to build a computing infrastructure with high levels of utilisation, availability, automation and flexibility using building blocks of inexpensive industry-standard servers.
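
    To make the resource-pooling idea concrete, the minimal sketch below models several physical hosts aggregated into one pool, with virtual machines placed wherever capacity remains. The host names, sizes and first-fit placement policy are hypothetical illustrations, not VMware's scheduler or API.

```python
# Purely illustrative model of the resource pooling described above: several
# physical hosts are aggregated into one pool and virtual machines are placed
# wherever capacity is available. Host names, sizes and the first-fit policy
# are hypothetical; this is not VMware's scheduler or API.

from dataclasses import dataclass, field

@dataclass
class VM:
    name: str
    cpu_ghz: float
    ram_gb: float

@dataclass
class Host:
    name: str
    cpu_ghz: float
    ram_gb: float
    vms: list = field(default_factory=list)

class ResourcePool:
    """Aggregates hosts into a shared pool and places VMs dynamically."""

    def __init__(self, hosts):
        self.hosts = hosts

    def place(self, vm):
        # First-fit placement: pick any host with enough spare CPU and RAM.
        for host in self.hosts:
            used_cpu = sum(v.cpu_ghz for v in host.vms)
            used_ram = sum(v.ram_gb for v in host.vms)
            if used_cpu + vm.cpu_ghz <= host.cpu_ghz and used_ram + vm.ram_gb <= host.ram_gb:
                host.vms.append(vm)
                return host.name
        raise RuntimeError("pool exhausted")

pool = ResourcePool([Host("host01", 16.0, 64.0), Host("host02", 16.0, 64.0)])
print(pool.place(VM("exchange-vm", 4.0, 16.0)))  # lands on host01
print(pool.place(VM("sql-vm", 14.0, 32.0)))      # spills over to host02
```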

    Benefits can come with initial increased complexity

    One of the great strengths of virtualisation is its apparent simplicity and its ability to simplify the IT infrastructure and make it more flexible. However, some important lessons are emerging from early adopters' experience that are worth considering.

    IT managers looking to unleash virtualisation technology in their production networks should anticipate a major overhaul to their management strategies as well. That’s because as virtualisation adds flexibility and mobility to server resources, it also increases the complexity of the environment in which the technology lives. Virtualisation requires new thinking and new ways of being managed, particularly in the back-up and recovery areas of storage in a virtualised environment.

    Virtual servers have different management needs and have capabilities that many traditional tools cannot cope with. They can disappear by being suspended or be deleted entirely, and they can move around and assume new physical addresses.

    As a result, some existing infrastructures need to become more compatible with virtual machines in areas such as back-up and recovery.

    Many of the virtualisation deployments to date have been implemented on application or file servers where unstructured data is the key information. In these environments, VMware tools for back-up and recovery work well. Copies of the virtual machine images can be taken once a week, moved out to a proxy server and then saved onto tape in a traditional manner.
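
    As a rough sketch of that workflow, the example below copies virtual machine images to a proxy staging area and then writes the staged copies to a tape target in one stream. The paths are hypothetical and the tape device is simulated with an archive file; a production deployment would use VMware's own backup tooling rather than raw file copies.

```python
# A minimal sketch of the image-level backup flow described above: copy virtual
# machine images to a proxy staging area, then stream the staged copies to a
# tape target. All paths are hypothetical and the tape device is simulated with
# an archive file; this is not VMware's or Quantum's backup tooling.

import shutil
import tarfile
from pathlib import Path

VM_IMAGES = Path("/vmfs/volumes/datastore1")   # hypothetical datastore path
PROXY_STAGING = Path("/backup/staging")        # staging area on the proxy server
TAPE_TARGET = Path("/backup/tape/weekly.tar")  # stand-in for a tape device

def weekly_image_backup():
    PROXY_STAGING.mkdir(parents=True, exist_ok=True)
    TAPE_TARGET.parent.mkdir(parents=True, exist_ok=True)

    # Step 1: move copies of the virtual machine images off the production datastore.
    for image in VM_IMAGES.glob("*.vmdk"):
        shutil.copy2(image, PROXY_STAGING / image.name)

    # Step 2: write the staged copies out to the tape target in a single stream.
    with tarfile.open(TAPE_TARGET, "w") as tape:
        for staged in sorted(PROXY_STAGING.glob("*.vmdk")):
            tape.add(staged, arcname=staged.name)

if __name__ == "__main__":
    weekly_image_backup()
```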

    Real returns available through virtualising structured data

    But the real returns on investment for business from virtualisation will come in its ability to virtualise the structured data of its key applications such as Oracle, SQL or Exchange. Many of these areas have been avoided to date because of the complexity of protecting these critical business applications in a virtualised environment.

    The standard VMware replication tools take a point-in-time snapshot image and do not provide a consistent state for recovery and rebuild of structured data.

    The answer for critical applications where recovery times need to be seconds rather than hours is to build expensive, highly available configurations. This addresses the risk of system or site loss, but protection is still required against data corruption, accidentally deleted data and virus attacks.

    Less critical systems also need to be protected and data sets retained for compliance and regulatory purposes. In most data centres, traditional backup and recovery will be performing these functions today using familiar software tools that integrate with the database and tape or disk targets for the data.

    So the obvious solution is to continue to back up everything as before, but in a virtualised environment the increased load on the network infrastructure would quickly become unbearable, with machines grinding to a halt and applications groaning.

    Tape systems, with their demand for high-bandwidth data streams and intolerance of small ones, are also unsuitable as targets, as more flexibility is needed to schedule back-ups to multiple devices.

    The answer is disk-based back-up appliances

    With structured data, the answer is to use new disk-based back-up appliances to protect data. Using a Quantum DXi solution, for example, businesses can combine enterprise disk back-up features with data de-duplication and replication technology to provide data centre protection and anchor a comprehensive data protection strategy for virtualised environments.

    DXi solutions bring a number of additional benefits. As well as being useful for storing structured data, they are also effective at storing virtual machine disk format (VMDK) images and unstructured data, meaning users can benefit from a single point of data management. A benefit of storing VMDK images on de-duplicated disk is that all VMDK images are very much alike and so achieve an exceptionally high de-duplication ratio. This means much larger volumes of data can be stored on limited disk space.
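
    The toy example below illustrates why near-identical images de-duplicate so well, using a simple content-addressed chunk store: a second image that differs from the first in only one chunk adds only that chunk to the pool. It is an illustrative Python sketch, not Quantum's patented de-duplication technology.

```python
# Toy content-addressed chunk store (not Quantum's patented algorithm) showing
# why near-identical VMDK images de-duplicate so well: a second image that
# differs in only one chunk adds only that single chunk to the pool.

import hashlib
import os

CHUNK_SIZE = 4096
store = {}  # chunk hash -> chunk bytes (the de-duplicated pool)

def ingest(image: bytes) -> int:
    """Store an image chunk by chunk; return how many new chunks it added."""
    new_chunks = 0
    for offset in range(0, len(image), CHUNK_SIZE):
        chunk = image[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk
            new_chunks += 1
    return new_chunks

base_image = os.urandom(1024 * CHUNK_SIZE)  # a 4 MB synthetic "image"
clone = bytearray(base_image)
clone[:8] = b"patched!"                     # the clone differs in one chunk only

print(ingest(base_image))    # 1024: every chunk of the first image is new
print(ingest(bytes(clone)))  # 1: only the changed chunk needs extra space
```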

    The DXi range leverages Quantum’s patented data de-duplication technology to dramatically increase the role that disk can play in the protection of critical data. With the DXi solutions, users can retain 10 to 50 times more back-up data on fast recovery disk than with conventional arrays.

    With remote replication of back-up data also providing automated disaster recovery protection, DXi users can transmit back-up data from one or more remote sites equipped with any other DXi-Series model to a central, secure location, reducing or eliminating media handling. DXi-Series replication is asynchronous and automated, and it operates as a background process.
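
    As a hedged sketch of what asynchronous, background replication looks like in principle, the example below has remote sites queue finished backup sets for a background worker that ships them to a central vault without holding up the local backup job. The paths and queue-based design are hypothetical, not the DXi-Series implementation.

```python
# A sketch of asynchronous, background replication under stated assumptions:
# remote sites enqueue finished backup sets and a background worker copies them
# to a central vault, so local backup jobs never wait on the WAN transfer.
# Paths, names and the queue-based design are hypothetical, not the DXi-Series
# implementation.

import queue
import shutil
import threading
from pathlib import Path

CENTRAL_VAULT = Path("/central/vault")  # hypothetical central, secure location
replication_queue: "queue.Queue[Path]" = queue.Queue()

def replication_worker():
    # Runs in the background and ships each queued backup set to the vault.
    while True:
        backup_set = replication_queue.get()
        CENTRAL_VAULT.mkdir(parents=True, exist_ok=True)
        shutil.copy2(backup_set, CENTRAL_VAULT / backup_set.name)
        replication_queue.task_done()

threading.Thread(target=replication_worker, daemon=True).start()

def finish_local_backup(backup_set: Path):
    # The remote site completes its backup, then simply queues it for shipping.
    replication_queue.put(backup_set)
```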

    Whether businesses are looking to increase return on investment from their virtualisation implementations or planning a virtualised environment, the lessons are clear. To make the most of this latest technological innovation, IT managers must plan their back-up and recovery strategies to cater for the new virtual world.