Information highway

Imagine a world where the internet is so fast that personal data storage units are obsolete.

Tags: CERN (the European Organisation for Nuclear Research), Cloud computing, Switzerland, United Kingdom
British professor Peter Higgs in CERN's Large Hadron Collider, the world's largest particle accelerator, in Geneva. (Getty Images)
By Administrator | Published June 10, 2008

Imagine a world where the internet is so fast that personal data storage units are obsolete, and HD films can be downloaded in seconds. With the switching on of the CERN particle accelerator this year, technology has taken one step closer to that reality.

At a depth of some 50-175 metres underground, straddling the border between Switzerland and France, lies the European Organisation for Nuclear Research's (CERN) Large Hadron Collider (LHC), the biggest high-energy particle accelerator in the world.

Science institutes and governments around the world have pumped up to US$10 billion into the project, which is tasked with confirming the existence of the elusive Higgs boson, and which could go some way to validating scientific theories about the laws of the universe.

The problem for Ian Bird, project leader at the LHC Computing Grid Project (LCG), is that the amount of data produced at this mammoth installation is astronomical - about 15 petabytes a year (or 15 million gigabytes).
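To put that figure in perspective, a minimal back-of-envelope calculation (using the article's figures, with 1 PB taken as one million GB) shows what 15 petabytes a year means as a sustained rate:

```python
# Rough check of the LHC's annual data output, using the article's figures.
PETABYTE_GB = 1_000_000          # 1 PB = 1 million GB, as the article states
annual_gb = 15 * PETABYTE_GB     # ~15 petabytes produced per year

seconds_per_year = 365 * 24 * 3600
avg_rate_gb_s = annual_gb / seconds_per_year

print(f"Average sustained output: {avg_rate_gb_s:.2f} GB/s")
print(f"Per day: {annual_gb / 365 / 1000:.0f} TB")
```

That works out to roughly 41 terabytes every day, around the clock, which is why a single computer centre cannot absorb it alone.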

To add to this problem, CERN needs to distribute this data worldwide, so physicists around the world can access it for analysis. Grid computing was seen as the answer, and the technology's potential has now drawn attention to something that could revolutionise the telecoms market.

"I think the term ‘grid technology' encompasses a multitude of different things," says Bird. "Some people use it to connect desktop machines to make use of unused cycles on those machines.

"When we at LHC are talking about grid technology, we really mean setting up a big distributed system which allows us to connect hundreds of computer centres around the world together to make it look to our users like a single system."

The data is transferred through a number of tiers. Tier 0 is LHC itself, where the data is produced and the first level of processing is done. Eleven Tier 1 centres are located at major computer centres around the world, and then Tier 2 centres are where physicists can get hold of the data.
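The tiered model described above can be sketched in a few lines. The centre names, the Tier 2 count, and the round-robin assignment below are all illustrative, not CERN's actual topology; only the Tier 0 role and the eleven Tier 1 centres come from the article:

```python
# A minimal sketch of the tiered distribution model. Site names and the
# Tier 1 <-> Tier 2 pairing are invented for illustration.
TIERS = {
    "tier0": ["CERN"],                                 # data produced, first-pass processing
    "tier1": [f"T1-centre-{i}" for i in range(11)],    # 11 major computer centres
    "tier2": [f"T2-site-{i}" for i in range(50)],      # sites where physicists get the data
}

def distribution_path(tier2_site: str) -> list[str]:
    """Trace a dataset from Tier 0 down to a given Tier 2 site.

    For the sketch, each Tier 2 site is pinned to a Tier 1 centre
    round-robin; the real grid uses regional affiliations.
    """
    idx = TIERS["tier2"].index(tier2_site)
    tier1 = TIERS["tier1"][idx % len(TIERS["tier1"])]
    return ["CERN", tier1, tier2_site]

print(distribution_path("T2-site-12"))  # ['CERN', 'T1-centre-1', 'T2-site-12']
```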

Presently, about 55,000 servers are connected to the grid, and another 200,000 are expected to be installed to meet the exponential rise in data traffic.

Top transmission speeds of 10Gbps are possible between CERN and the Tier 1s, thanks to dedicated fibre links, while the Tier 1s and Tier 2s are interconnected using the standard national research networks in each country.

"Network connectivity is not something we really worry about out to the Tier 1s. There may be some bottlenecks on the national networks, depending on how they're connected geographically. Quite often it's just a last-mile issue."

This type of grid computing, where physical servers are interconnected through fibre, could be the Holy Grail for modern telecommunications services. "The target rates we've demonstrated we can achieve are something like 1.5-2GB/s of aggregate data being pumped out of CERN to the Tier 1s.

"There's probably an equal amount which is being thrown around between the Tier 1s and the Tier 2s," Bird explains. At this rate, the download of an HD movie would take just a few seconds.
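The movie claim checks out with simple arithmetic. The film size below is an assumption (a full HD film is taken as roughly 8 GB); the rates are the aggregate figures Bird quotes:

```python
# Worked example: HD movie download time at the grid's aggregate rates.
MOVIE_GB = 8.0                   # assumed size of an HD film

for rate_gb_s in (1.5, 2.0):     # aggregate GB/s figures quoted by Bird
    seconds = MOVIE_GB / rate_gb_s
    print(f"At {rate_gb_s} GB/s: {seconds:.1f} s")
```

At those rates an 8 GB film transfers in four to five seconds, consistent with the claim.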

Presently, LCG is a department specifically for LHC Computing, but another EU-funded project, Enabling Grids for e-Science (EGEE), aims to attract other sciences and industries to using grid technology.

Scientists around the world are now making use of the grid in 200-plus applications, from nanotechnology to geophysics, yet it has proved difficult to bring industry into the fold and enable mass-market usage.

"I think you will start to see Amazon and Google providing some of these services now, so-called ‘cloud computing', which is basically grid computing with a different hat on. The mass market will probably come from that direction, rather than what we're doing. I think we're trying to solve a different problem," Bird says.

In cloud computing, enterprises and consumers don't store their data on a personal server. Instead, a network of interconnected servers hosts their data, which is accessed over the internet.

This raises huge questions about the future of data storage and consumers' daily technology habits. Vendors are already beginning to provide some answers.

Up in the clouds

Cloud computing is a buzzword that's catching on fast. EMC, the leading enterprise storage provider, recently announced the set-up of a new cloud computing business. But it is already making huge strides in the virtualisation market, with its wholly-owned acquisition, VMware.

Not many companies have the means to play around with multitudes of expensive physical servers. So for other business customers, particularly small- to medium-sized enterprises (SMEs), virtualisation could be the answer to streamlining their management processes and reducing costs.

"Virtualisation is the way you can get a kind of grid computing into everybody's datacentre," says Martin Niemer, product manager at VMware, EMC's wholly-owned virtualisation specialist.

"Grid computing is usually high-performance computing; you have hundreds of users that are all doing the same. Google has tens of thousands running the same application, which is Google search.

"But virtualisation is really about running multiple applications. If you have 10 virtual servers with 100 virtual machines (VMs), probably each and every VM is running a different application."

Virtualisation is a layer between the physical hardware and the operating system, creating a number of virtual instances. This means that each machine can be used to create several virtual servers, all running different applications.
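The arrangement can be sketched as a toy model: one physical host carrying several virtual machines, each with its own application. The class and field names below are invented for illustration and do not reflect any vendor's API:

```python
# Toy model of the virtualisation layer: one physical host, many guests.
from dataclasses import dataclass, field

@dataclass
class PhysicalServer:
    name: str
    vms: dict[str, str] = field(default_factory=dict)  # VM name -> application

    def launch_vm(self, vm_name: str, application: str) -> None:
        # The hypervisor sits between this host's hardware and each guest
        # OS, so the applications never see one another's environment.
        self.vms[vm_name] = application

host = PhysicalServer("host-01")
host.launch_vm("vm-web", "web server")
host.launch_vm("vm-db", "database")
host.launch_vm("vm-mail", "mail server")
print(f"{host.name} runs {len(host.vms)} VMs: {sorted(host.vms.values())}")
```

One physical machine thus presents itself as several independent servers, each running a different workload.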

The CAPEX savings generated from this are immediate, and often the primary reason for enterprise customers to adopt virtualisation products.

"The number of virtual machines you can run on a server is what we call the consolidation ratio. Usually, customers see a consolidation ratio of around 1:10 in production environments, meaning where you had 10 servers before, you only have one now. You can imagine this has a high impact on costs. There are 70-80% CAPEX savings," he says.
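The arithmetic behind those numbers is worth spelling out. The cost figures below are invented for illustration: a virtualisation host is assumed to cost more than a commodity server, which is why the CAPEX saving lands around 70-80% rather than the naive 90% a 1:10 consolidation ratio might suggest:

```python
# Illustrating the consolidation arithmetic with assumed prices.
servers_before = 100
consolidation_ratio = 10
hosts_after = servers_before // consolidation_ratio   # 10 physical hosts remain

cost_per_server = 5_000    # assumed commodity server price (USD)
cost_per_host = 12_500     # assumed price of a beefier virtualisation host (USD)

capex_before = servers_before * cost_per_server
capex_after = hosts_after * cost_per_host
saving = 1 - capex_after / capex_before
print(f"CAPEX saving: {saving:.0%}")
```

Under these assumed prices the saving comes out at 75%, in the middle of the range Niemer cites.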

There are also other benefits including disaster recovery, where VMs can be easily transferred onto another physical server in the event of a power outage, and an elimination of downtime.

Each and every company stands to benefit from virtualisation, and this is why Aptec, a leading software distributor in the Middle East, is keen to add VMware's products to its portfolio.

"Every single SME customer would at some point consider going to virtualisation, because it saves time and money, in terms of optimisation and implementation," says Ilyas Mohammed, senior business manager for software, Aptec.

"We are very excited about it. We have been waiting and looking forward to a virtualisation product, and VMware is the leading virtualisation vendor at the moment.

"The market is widely open; it's got huge potential. According to IDC, the virtualisation market will grow to US$6 billion by 2011," he adds.

According to Niemer, virtualisation is increasingly being sold as a service to companies. "Customers traditionally want to run their application inside their own datacentre.

"But the smaller customers don't have the option of having a back-up datacentre, and so a service provider could step in and offer a back-up datacentre on demand. This would be a potential service.

"If you need five VMs, you can rent them on a per day basis, for example. This is a bit like the cloud computing approach, where you don't care on which physical machine your process is running: you just pay for the processing power," he says.

With IDC predicting data traffic will grow 60% per annum for the next 10 years, EMC is also betting substantially on the home storage market. It also anticipates that cloud computing will permeate the technological world of the consumer.
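Compounding that forecast shows why the bet is substantial: 60% annual growth sustained for a decade multiplies traffic by roughly two orders of magnitude.

```python
# Compounding IDC's forecast: 60% annual data-traffic growth for 10 years.
growth = 1.60
years = 10
multiple = growth ** years
print(f"Traffic after {years} years: ~{multiple:.0f}x today's level")
```

That is roughly a 110-fold increase over the decade.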

"Today most information is trapped in a device or an application that might be owned by a set of users. Tomorrow the world is going to demand the ability to use and manage information across all these silos," said CEO Joe Tucci at last month's EMC World summit.

"I do believe that there is going to be a fair sized number of homes throughout the world over the next couple of years that will have a terabyte of storage in their house, and that's a market that is worth playing in. I also totally believe that that storage will also be connected to the cloud," Tucci added.

VMware and virtualisation

"Virtualisation is one of the hottest spaces in technology. If you talk about virtualisation a few years ago, people would've scratched their heads. In today's environment, it's one of the top trends," explains Bill Teuber, vice president, EMC.

"VMware is going to grow about 50% this year. We've tracked how fast it has grown versus any other software company in history, and it's in a group of three, with Oracle and Microsoft. Its growth rate continues to accelerate and we're very pleased with the investment and their results," Teuber went on to say.

Martin Niemer, product manager, VMware, adds: "Gartner projects that virtualisation will continue to be the most impactful technology until 2010. Another interesting thing is that a Goldman Sachs survey that asks CEOs every quarter where they are going to spend their money found that VMware has topped that list for several quarters now.

"That really shows the importance of virtualisation technology in the marketplace right now."

3197 days ago

Nearly everything in this article is already being done in one way or another. These guys are just forging ahead, creating new internet backbones.

3199 days ago
Jim Davis

That would be one cool world. I imagine it will happen one day.
