
Data-centric computing systems tackle big data challenges
Published December 08, 2014
NEWS BRIEF--With the world generating more than 2.5 billion gigabytes of data every day, a new approach to supercomputing is on the horizon.
“Data is getting so big,” says Brad McCredie, VP of IBM Power Systems and Development. “Customers are even investing in storage right now more than they are in CPU, and saying, hey, it might be more cost effective to move compute closer to the data than it is to move the data back to the compute.”
The tools used for supercomputing in the past are no longer adequate for today’s data problems. IBM has taken a new “data-centric” approach to supercomputing and recently secured a $325 million contract with the U.S. Department of Energy to develop and deliver the first of these new systems at Lawrence Livermore and Oak Ridge National Laboratories to further advance innovation and discovery in science, engineering, and national security.
McCredie, an IBM Fellow and President of the newly founded 77-member OpenPOWER Foundation, has been in the industry for 25 years and says today’s systems have to work a lot harder. “Silicon technology isn’t delivering anywhere near the cost performance advantages it has over the three previous decades.”
Data movement and management have driven a shift in development away from a focus on faster microprocessors. For IBM, this has materialized into the new “data-centric” approach. IBM OpenPOWER-based systems “bring together an accelerator company, NVIDIA [GPU Technology]; a network company, Mellanox; a system company, IBM; and we innovated with all three of those technologies together to provide a solution that, if each of us had approached the National Labs alone, would nowhere near meet their needs,” explains McCredie.
The national laboratories offer researchers from academia, government, and industry access to time on their supercomputers to address grand challenges in science and engineering. According to Big Blue, working with IBM, NVIDIA developed the advanced NVIDIA NVLink interconnect technology, which will enable CPUs and GPUs to exchange data five to twelve times faster than they can today. NVIDIA NVLink will be integrated into IBM POWER CPUs and next-generation NVIDIA GPUs based on the NVIDIA Volta architecture, allowing Sierra and Summit, the systems destined for Lawrence Livermore and Oak Ridge respectively, to achieve unprecedented performance levels. With Mellanox, IBM is implementing a state-of-the-art interconnect that incorporates built-in intelligence to improve data handling.
“Today’s announcement marks a dramatic departure from traditional supercomputing approaches that are no longer viable as data grows at enormous rates. IBM’s Data Centric approach is a new paradigm in computing, marking the future of open computing platforms and capable of addressing the growing rates of data,” says Tom Rosamilia, Senior Vice President, IBM Systems and Technology Group. “The beauty of the systems being developed for Lawrence Livermore and Oak Ridge is that the core technologies are available today to organizations of many sizes across many industries.”
Visit IBM for more information.