Tag: hardware-and-technology

  • Intel PCs to wake up for VoIP phone calls

    A wake-up call for the PC: Intel-powered computers to snap out of sleep when you phone them

    Intel is unveiling new technology that will let computers wake up from their power-saving sleep state when they receive a phone call over the Internet.

    Current computers have to be fully “on” to receive a call, making them impractical, energy-wasting replacements for the telephone.

    The new Intel component will let computers automatically return to a normal, full-powered state when a call comes in. The computer can activate its microphone and loudspeaker to alert the user, then connect the call.
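
    Intel hasn’t published the low-level details of Remote Wake here, but the long-established Wake-on-LAN mechanism illustrates the general idea of rousing a sleeping machine over the network. A minimal Python sketch (the MAC address is, of course, hypothetical):

        import socket

        def send_magic_packet(mac, broadcast_ip="255.255.255.255"):
            # A Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by the
            # target's MAC address repeated 16 times, sent as a UDP broadcast.
            mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
            payload = b"\xff" * 6 + mac_bytes * 16
            with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
                sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
                sock.sendto(payload, (broadcast_ip, 9))  # port 9 by convention

        send_magic_packet("00:1b:21:aa:bb:cc")  # hypothetical sleeping desktop

    Remote Wake goes a step further than this local-network trick: the wake request arrives over the Internet from a calling service the user has subscribed to, but the wake-the-hardware principle is the same.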

    Trevor Healy, chief executive of Jajah, which will be the first Internet telephone company to utilize the feature, said: “This certainly helps the PC become a much better center of communications in the home.”

    Joe Van De Water, director of consumer product marketing for Intel, said the first Intel motherboards with the Remote Wake capability will be shipping in the next month.

    These components, which are at the heart of every computer, will most likely be used by smaller computer manufacturers. Bigger names like Dell Inc. and Hewlett-Packard Co. use their own motherboard solutions, but Intel is working to supply them with the technology as well.

    The four initial Remote Wake motherboards will be for desktop computers and will need an Internet connection via Ethernet cable, as Wi-Fi doesn’t work in sleep mode.

    Van De Water said the computer will know to wake up only for calls from services to which the user has subscribed, so computer-waking prank calls should be impossible.

  • Ten Criteria For Enterprise Business Continuity Software

    Jerome Wendt, president and lead analyst of DCIG Inc., an independent storage analyst and consulting firm, outlines 10 criteria for selecting the right enterprise business continuity software

    The pressures to implement business continuity software that can span the enterprise and recover application servers grow with each passing day.

    Disasters come in all shapes and sizes, from regional events (earthquakes, floods, lightning strikes) to terrorist attacks to brown-outs to someone accidentally unplugging the wrong server.

    Adding to the complexity, the number of application servers and virtual machines is on the rise while IT headcounts are flat or shrinking.

    Despite these real-world situations, companies often still buy business continuity software that is based on centralized or stand-alone computing models that everyone started abandoning over a decade ago.

    Distributed computing is now almost universally used for hosting mission-critical applications.

    However, business continuity software that can easily recover and restore data in distributed environments is still based on decade-old models.

    This leaves businesses purchasing business continuity software that can recover only a subset of their application data.

    Organizations now need a new set of criteria that accounts for the complexities of distributed systems environments.

    Today’s business continuity software must be truly enterprise-class and distributed in its design.

    Here are 10 features that companies should look for when selecting business continuity software for an enterprise distributed environment (a simple scorecard sketch follows the list):

    • Supports heterogeneous servers and storage.
    • Accounts for differences in performance.
    • Manages replication over WAN links.
    • Offers multiple ways to replicate data.
    • Integrates with applications.
    • Provides multiple recovery points.
    • Introduces little or no overhead on the host server.
    • Replicates data at different points in the network (host, network or storage system).
    • Can be managed centrally.
    • Scales to manage replication for tens, hundreds or even thousands of servers.
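
    One way to put these criteria to work is as a simple weighted scorecard. The sketch below is purely illustrative: the weights and vendor ratings are hypothetical placeholders, not DCIG’s methodology.

        # Illustrative scorecard for the 10 criteria above; all numbers are hypothetical.
        CRITERIA = [
            "heterogeneous support", "performance awareness", "WAN replication",
            "multiple replication methods", "application integration",
            "multiple recovery points", "low host overhead",
            "flexible replication points", "central management", "scalability",
        ]

        def score(vendor_scores, weights):
            # Weighted sum of 0-5 ratings across the 10 criteria.
            return sum(weights[c] * vendor_scores.get(c, 0) for c in CRITERIA)

        weights = {c: 1 for c in CRITERIA}
        weights["scalability"] = 2            # weight what matters most to you
        vendor_a = {c: 3 for c in CRITERIA}   # placeholder evaluation of one product
        print(score(vendor_a, weights))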

    The requirements for faster, easier and more comprehensive enterprise business continuity have escalated dramatically in the last decade, while the criteria for selecting the software remain rooted in yesterday’s premises and assumptions.

    Today’s corporations need to re-evaluate not only what software they are using to perform these tasks but also the criteria on which they base these decisions.

    The 10 criteria listed here should provide a solid starting point for picking business continuity software that meets the requirements of today’s enterprise distributed environments while still providing the central control and enterprise-wide recoverability that companies need.

    To read the full criteria, please go to DCIG Inc.

  • New High-Speed Camera Memory Stick

    Sony model ideal upgrade for high-performance digital cameras and HD camcorders

    As files get bigger, so the pressure on flash memory grows.

    The latest offering from Sony Recording Media & Energy is one solution for users needing high capacity and high-speed data transfer.
    The Memory Stick PRO-HG Duo HX comes with 4GB or 8GB capacity and a read speed of 20MB/second (15MB/second write).

    This makes it more than capable of coping even with the strain of HD video.

    When used with the supplied USB adaptor for maximum speed, it can shorten data transfer time by one-third compared to Sony’s Memory Stick PRO Duo (Mark 2).
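
    As a rough back-of-the-envelope check (our arithmetic, not Sony’s figures):

        # Transfer times implied by the quoted 20MB/s read speed.
        capacity_mb = 8 * 1024            # the 8GB card, in MB
        hx_read_mb_s = 20                 # Memory Stick PRO-HG Duo HX read speed

        hx_time = capacity_mb / hx_read_mb_s
        print(f"Full 8GB offload: {hx_time:.0f}s (~{hx_time / 60:.1f} min)")

        # Cutting transfer time by one-third means the Mark 2 takes 1.5x as
        # long, i.e. an effective read speed of about 20 / 1.5 = 13.3MB/s.
        mark2_time = hx_time * 1.5
        print(f"Implied Mark 2 time: {mark2_time:.0f}s")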

    The provision of a USB adaptor as a standard accessory also makes it very simple to transfer data onto a PC or notebook.

    Also useful is the free, downloadable Memory Stick Data Rescue Service which can quickly recover deleted photographs and files.

    The Memory Stick PRO-HG Duo HX uses an 8-bit parallel interface to achieve this level of performance and comes with a 10 year warranty.

    It will be available from October 2008.

  • No Black Hole for CERN Data

    The largest scientific instrument on the planet will produce roughly 15 Petabytes (15 million Gigabytes) of data annually when it begins operations

    System crashes and the ensuing data loss may be most IT managers’ idea of the end of the world.

    Yet spare a thought for the folk running the LHC Computing Grid (LCG) designed by CERN to handle the massive amounts of data produced by the Large Hadron Collider (LHC).

    Many people believe the US$4bn high-energy particle accelerator, which crisscrosses the border between France and Switzerland, is a Doomsday Machine that will create micro black holes and strangelets when it is switched on tomorrow.

    While that is, hopefully, pure fantasy, the real nightmare is how to deal with the colossal amounts of data that the 27km-long LHC is going to produce.

    The project is expected to generate 27 TB of raw data per day, plus 10 TB of "event summary data", which represents the output of calculations done by the CPU farm at the CERN data center.
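
    Those daily and annual figures are broadly consistent, as a quick check shows:

        # Sanity-check the quoted rates: 27TB raw + 10TB event summary per day.
        daily_tb = 27 + 10
        annual_pb = daily_tb * 365 / 1000    # TB -> PB, decimal units
        print(f"{daily_tb}TB/day is about {annual_pb:.1f}PB/year")   # ~13.5PB/year
        # Close to the quoted "roughly 15 Petabytes" once additional derived
        # datasets and overheads are included.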

    The LHC is CERN’s new flagship research facility, which is expected to provide new insights into the mysteries of the universe.

    It will produce beams seven times more energetic than any previous machine, and around 30 times more intense when it reaches design performance, probably by 2010.

    Once stable circulating beams have been established, they will be brought into collision, and the final step will be to commission the LHC’s acceleration system to boost the energy to 5 TeV, taking particle physics research to a new frontier.

    CERN director general, Robert Aymar, said: “The LHC will enable us to study in detail what nature is doing all around us.
    “The LHC is safe, and any suggestion that it might present a risk is pure fiction.”

    Originally standing for Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research), CERN is where the World Wide Web began: Sir Tim Berners-Lee proposed the project in 1989, building on his earlier ENQUIRE system, and was soon joined by Robert Cailliau.

    Berners-Lee and Cailliau were jointly honored by the ACM in 1995 for their contributions to the development of the World Wide Web.

    Appropriately, sharing data around the world is the goal of the LCG project.

    As the world’s largest physics laboratory, CERN maintains at its main Meyrin site a large computer center containing very powerful data processing facilities, primarily for experimental data analysis.

    The LCG’s mission has been to build and maintain a data storage and analysis infrastructure for the entire high-energy physics community that will use the LHC.

    And because that data must be accessible to researchers around the world for analysis, the site is also a major wide-area networking hub.

    The data from the LHC experiments will be distributed according to a four-tiered model. A primary backup will be recorded on tape at CERN, the “Tier-0” center of LCG.

    After initial processing, this data will be distributed to a series of Tier-1 centers, large computer centers with sufficient storage capacity and with round-the-clock support for the Grid.

    The Tier-1 centers will make data available to Tier-2 centers, each consisting of one or several collaborating computing facilities, which can store sufficient data and provide adequate computing power for specific analysis tasks.

    Individual scientists will access these facilities through Tier-3 computing resources, which can consist of local clusters in a University Department or even individual PCs, and which may be allocated to LCG on a regular basis.
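
    A minimal sketch of that four-tier fan-out (the site names below are hypothetical, purely to illustrate the shape of the hierarchy):

        from dataclasses import dataclass, field

        @dataclass
        class Site:
            name: str
            tier: int                   # 0 = CERN tape archive ... 3 = local PCs
            children: list = field(default_factory=list)

        tier0 = Site("CERN Tier-0 (tape)", 0, [
            Site("Example Tier-1 centre", 1, [
                Site("Example Tier-2 facility", 2, [
                    Site("University cluster (Tier-3)", 3),
                ]),
            ]),
        ])

        def fan_out(site, data):
            # Data recorded at Tier-0 cascades down through each lower tier.
            print(f"Tier-{site.tier}: {site.name} receives {data}")
            for child in site.children:
                fan_out(child, data)

        fan_out(tier0, "processed event data")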

    A live webcast of the event will be broadcast tomorrow. What are your thoughts on the LHC – will it reveal the secrets of the universe or a gaping black hole?

  • Multi-Service Business Gateway Market Growing


    The appeal of the multi-service business gateway (MSBG) in the US market continues to increase, reports In-Stat.

    An MSBG is a device that combines voice and data communications services in a single box, integrating the functionality of a router, Ethernet switch, security firewall, VoIP gateway and other appliances into one platform that supports routing, VPN, security, IDS/IPS, service-aware QoS, voice and application processing.

    According to the high-tech market research firm, the MSBG will be the means by which small and medium-sized businesses and branch offices adopt new IP communication technologies and applications going forward.

    Keith Nissen, In-Stat analyst, said: “The installation of new data communication equipment or replacement of data devices are the most common reasons for purchasing an MSBG.
    “However, the migration to VoIP technology is increasingly driving MSBG sales.”

    Recent research by In-Stat found the following:
    * More than 50 per cent of all US businesses with more than 20 employees have installed MSBG devices, according to an In-Stat survey of US IT and business managers at corporations that operate small, medium or branch offices.
    * 60 per cent of businesses favor MSBGs with integrated Wi-Fi technology.
    * Nearly 66 per cent of businesses prefer office-in-a-box devices.

  • iPhone rivals beef up camera offerings


    The launch by Samsung in the UK this week of what it claims is Europe’s first 8 megapixel camera phone is being seen as an attempt to highlight shortcomings in the iPhone.

    A number of other handset vendors are preparing to launch similar high-end camera phones in time for the Christmas period.

    Sony Ericsson is expected to launch the 8.1 megapixel C905 in the fall, while Nokia and LG are reported to be planning similar moves.
    The fact Apple’s 3G iPhone packs only a 2 megapixel camera is regarded as one of its key weaknesses.

    Samsung’s i8510 will be available in the UK through Carphone Warehouse, free on a £35-a-month contract with Orange UK.
    The smartphone is being positioned as a genuine alternative to digital cameras.

    Mark Mitchinson, vice president for Samsung, said the cell phone industry was playing catch-up, selling only 4 and 5 megapixel camera phones.

    “But the 8 megapixel is a new milestone, I think the vast majority of consumers will see it as a credible alternative,” he said.
    “For the first time ever you will not need to carry a camera as well as a phone on your holidays.”

    The i8510 is based on Symbian’s Series 60 platform and includes HSPA connectivity, Wi-Fi, GPS and FM radio functionality.

  • HD test success spells bandwidth boost

    BBC succeeds with world’s first reception of HD pictures over DTT using DVB-T2

    Test transmissions in the UK have yielded the first successful reception of high-definition pictures compliant with the DVB-T2 standard using a real-time demodulator.

    The BBC, which performed the tests, says this is the first time anywhere in the world that a live end-to-end DVB-T2 chain has been demonstrated.

    DVB-T2 is a new version of the DVB-T standard currently used for digital terrestrial television in the UK.

    It offers an increase in efficiency over DVB-T, which means more bandwidth will be available on the multiplex when it is reconfigured in tandem with digital switchover to permit the carriage of high definition services.

    That, combined with a switch from MPEG-2 to MPEG-4, will allow for increased HD content.

    The UK’s analogue transmissions end in 2012, but some parts of the country will get the benefit of DVB-T2 earlier, with a few places going live next year.

    The current estimate is that in 2009 there will be three HD channels available in the UK, one of which will go to the BBC, with the other two going to ITV, Channel 4 or Five.

    The BBC started DVB-T2 test transmissions from the Guildford transmitter in June.

    Justin Mitchell, leader of the BBC’s DVB-T2 modem development team, said: “Following the approval of DVB-T2 in June and the launch of test transmissions from Guildford transmitter the next day, we are delighted that on Kingswood Warren’s 60th anniversary our team has been able to deliver a working demonstration of a DVB-T2 modulator and demodulator.”

    The modulation and demodulation devices will be made available for licensing.

    There will be a demonstration of the DVB-T2 modulator and demodulator on the DVB stand 1.D81 at IBC in Amsterdam.

  • OLED is coming – but at a price


    As a next-generation display technology, the first OLED (organic light emitting diode) screens were never going to come cheap.

    For the introduction of the first OLED to the European market, Sony is said to be putting a €3,500 (US$5,000) price tag on its XEL-1 when it becomes available before Christmas.

    The astronomical cost, reported by OLED-Display.net, dwarfs the US$1,850 paid in Japan and even makes the US$2,100 price stateside seem reasonable.

    When the XEL-1 launched in Japan it was unveiled as a kind of prototype for what could be. Sony was said to be making a loss on each set.

    While the XEL-1 has received a positive reception from consumers in Japan, expansion into other markets is sure to be slower at such elevated prices.

    Competition from Sony’s rival Samsung

    OLED TVs, which could potentially replace LCD and plasma TVs, are predicted to sell close to 3 million units in 2012.

    Samsung, which released the world’s largest OLED television at the IFA trade show in Berlin, has committed itself to commercial production of mid-to-large screens by 2010.

    “Samsung will begin commercial production of mid- and large-sized OLED televisions around 2010,” according to a statement from Samsung.

    At IFA, Samsung displayed two OLED screens – a 14.1-inch model and a 31-inch model.

    Sony had the XEL-1 and a 27-inch prototype, which was introduced at CES in Las Vegas earlier this year.

  • New Blaupunkt GPS Device Supports VoIP


    Blaupunkt’s new Travel Pilot 700 GPS navigation device can be used to make VoIP calls.

    The GPS device is powered by a 500MHz ARM9 processor and a 266MHz DSP. There is also 8GB of internal memory.

    For VoIP, the Travel Pilot 700 supports 802.11b/g Wi-Fi.

    Beyond ordinary GPS functions, it comes with a video camera on the back for capturing live footage of the road ahead.

    The Travel Pilot 700 also offers multimedia functions: an integrated DVB-T digital TV tuner and a media player supporting DivX, H.264, MPEG-2, QuickTime, WMV and XviD video, plus AAC, MP2, MP3, OGG and WMA audio files.

    It also has Bluetooth for hands-free calling.

  • Apple sued over iPhone's 3G issues


    Tech-Ex reports in his blog that Alabama resident Jessica Alena Smith has filed a complaint against Apple.

    He says that although the lawsuit hasn’t been granted class-action status yet, he believes it will be, eventually.

    According to Tech-Ex, Jonathan Kudulis, an attorney representing Smith for the Birmingham, Alabama-based Trimmier Law Firm, said:
    “Apple sold these devices on the promise that they were twice as fast as the pre-existing phones and that they would function suitably, or properly, on the 3G network. But, thus far, Apple and the phone have failed to deliver on this promise.”

    The blogger describes his own experience of what he calls an almost complete “3G outage”.

    “I work at a company that works on mobile phone software, and any of our other 3G phones work just fine, with full bars of coverage, at work, while the iPhone has one bar at best,” he said.
    “Additionally, while some try to pin the problem on AT&T, complaints from other carriers in different countries indicate it’s not a network issue.”

    Tech-Ex says he doesn’t believe Smith is looking for a rich payday. Instead, he suggests she is trying to get Apple to fix the issue, if necessary by recalling and repairing existing phones.

    He concludes by saying that after he called AT&T, the carrier refunded him an entire month on his data plan – which seems to have satisfied him.

    What have your experiences been with your iPhone? Please let us know.