What's Packed in Variola's Suitcase?
Interesting. Gives a lower limit to certain storage questions. Guess it's no surprise IBM's SAN product handled things here; it's been field-tested, after all. -TD

GENEVA -- IBM and CERN, the European Organization for Nuclear Research, today announced that IBM's storage virtualization software has achieved breakthrough performance results in an internal data challenge at CERN. The data challenge was part of a test currently under way at CERN to simulate the computing needs of the Large Hadron Collider (LHC) Computing Grid, the largest scientific computing grid in the world. The LHC is expected to produce massive amounts of data, 15 million gigabytes per year, once it is operational in 2007.

The recent results represent a major milestone for CERN, which is testing cutting-edge data management solutions in the context of the CERN openlab, an industrial partnership.

Using IBM TotalStorage SAN File System storage virtualization software, the internal tests shattered performance records by reading and writing data to disk at rates in excess of 1GB/second, for a total I/O of over 1 petabyte (1 million gigabytes) in a 13-day period. This result shows that IBM's virtualization solution can manage the anticipated needs of what will be the most data-intensive experiment in the world. First tests of the integration of SAN File System with CERN's storage management system for the LHC experiments have already produced excellent results.

"CERN has a long-standing collaborative relationship with IBM, and we are delighted that IBM is pushing the frontiers of data management in the context of CERN openlab," said Wolfgang von Rüden, Information Technology Department Leader at CERN and Head of the CERN openlab. "What we learned from these data challenges will surely influence our technological choices in the coming years, as we continue to deploy the global LHC Computing Grid."
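A quick back-of-the-envelope check on those figures: 1 petabyte pushed in 13 days works out to just under 0.9 GB/second sustained, which squares with the "in excess of 1GB/second" peak claim, and the LHC's projected 15 million GB/year is roughly half that rate on average. A minimal sketch of the arithmetic, assuming decimal units (1 PB = 10^6 GB) as the press release uses them:

  # Sanity-check the CERN data-challenge figures.
  # Assumes decimal units (1 PB = 1e6 GB), per the press release.

  SECONDS_PER_DAY = 86_400

  total_io_gb = 1_000_000        # "over 1 petabyte" of total I/O
  duration_days = 13             # length of the data challenge

  sustained_gb_per_s = total_io_gb / (duration_days * SECONDS_PER_DAY)
  print(f"Sustained average: {sustained_gb_per_s:.2f} GB/s")   # ~0.89 GB/s

  # Projected LHC output: 15 million GB per year.
  lhc_gb_per_year = 15_000_000
  lhc_gb_per_s = lhc_gb_per_year / (365 * SECONDS_PER_DAY)
  print(f"LHC average data rate:  {lhc_gb_per_s:.2f} GB/s")    # ~0.48 GB/s

So the demonstrated sustained rate is already about double the LHC's average production rate, which is presumably the point of the test.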