Researchers building 120 petabyte storage system
The storage array is being built for an unnamed client and will hold the data needed for detailed simulations of real-world phenomena
An IBM research lab in Almaden, California, is working on a storage repository that offers a colossal 120 petabytes of capacity, the equivalent of 120 million gigabytes.
The data repository consists of some 200,000 conventional hard drives, is expected to store around one trillion files, and should provide the space needed for detailed simulations of real-world phenomena.
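A quick back-of-the-envelope calculation (using decimal units, and assuming the 200,000-drive figure is exact) shows what those numbers imply about the individual drives; the per-drive capacity is an inference, not a figure from IBM:

```python
# Implied average drive size: 120 PB spread across 200,000 drives.
# Decimal units assumed: 1 PB = 1,000,000 GB.
TOTAL_PB = 120
NUM_DRIVES = 200_000

total_gb = TOTAL_PB * 1_000_000       # total capacity in gigabytes
per_drive_gb = total_gb / NUM_DRIVES  # average capacity per drive

print(per_drive_gb)  # → 600.0
```

That works out to roughly 600 GB per drive, consistent with conventional enterprise disks of the period.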
The storage array is being developed for an unnamed client that needs a new supercomputer for detailed simulations of real-world phenomena. The new technologies developed to build this repository could enable similar systems for more conventional commercial computing, says Bruce Hillsberg, Director of Storage Research at IBM and leader of the project.
IBM says it would take 24 billion five-megabyte MP3s to fill the drives, and that the system could also store 60 copies of the largest backup of the web: the 150 billion web pages that make up the Internet Archive's Wayback Machine, a digital library of Internet sites and other cultural artifacts.
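The headline comparisons check out arithmetically; a minimal sketch, assuming decimal (SI) units throughout:

```python
# Sanity-check the article's capacity comparisons using decimal units.
PB = 10**15  # bytes in a petabyte
GB = 10**9   # bytes in a gigabyte
MB = 10**6   # bytes in a megabyte

capacity_bytes = 120 * PB

# Number of 5 MB MP3 files that would fit.
mp3_count = capacity_bytes // (5 * MB)
print(mp3_count)               # → 24_000_000_000 (24 billion)

# Capacity expressed in gigabytes.
print(capacity_bytes // GB)    # → 120_000_000 (120 million GB)
```

The 60-copies claim implies a Wayback Machine backup of about 2 PB, which is not stated directly in the article but follows from 120 PB divided by 60.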
Most supercomputer storage arrays currently top out at around 15 petabytes, making IBM's new system considerably larger. The system reportedly has a number of redundancy and recovery mechanisms at its disposal, so in the event of a drive failure the supercomputer can continue to work at almost full speed.