by Natalie Miller • @natalieatWIS

IBM announces hardware-agnostic storage technology for universal processing

Published May 14, 2014


The challenge with storage today is less about hardware or software and more about data. With this in mind, IBM announced this week a new software-defined storage product portfolio designed to reduce customers’ data storage costs, improve their access to data, and significantly cut time to insight.

Vincent Hsu, IBM Fellow and CTO, Systems Storage, IBM STG, calls it “hardware-agnostic storage software” and explains that “the mission of software-defined storage is to eliminate data isolation, the data and the storage island.”

Software-defined storage is a set of capabilities that automatically manage data, both locally and globally, to allow organizations to access and process any type of data on any type of storage device, anywhere in the world.

These new storage innovations build on the same technology used in IBM Watson’s win on Jeopardy!, helping the supercomputer beat the game show’s champions by providing faster access to data. According to IBM, the technology is designed to move petabytes of data and billions of files in seconds for clients facing new business challenges in the era of big data and cloud.

Traditional storage systems require users to move data to separate designated systems for transaction processing and analytics. One of the technologies in the portfolio, codenamed “elastic storage,” automatically balances resources to support both types of application workloads, including Hadoop-based analytics, and is capable of reducing storage costs up to 90 percent by automatically moving data onto the most economical storage device.
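The automatic tiering described above can be illustrated with a short sketch. This is a hypothetical model, not IBM’s implementation: the tier names, the last-access-time policy, and the thresholds are all invented for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical illustration of policy-based tiering: files that have not
# been accessed recently are placed on cheaper, slower storage.
TIERS = ["flash", "sas", "archive"]                   # fastest/most expensive first
THRESHOLDS = [timedelta(days=7), timedelta(days=90)]  # demotion cutoffs (invented)

def place(last_access: datetime, now: datetime) -> str:
    """Choose the most economical tier that still matches access recency."""
    age = now - last_access
    for tier, cutoff in zip(TIERS, THRESHOLDS):
        if age < cutoff:
            return tier
    return TIERS[-1]  # coldest data lands on the cheapest tier

now = datetime(2014, 5, 14)
print(place(datetime(2014, 5, 13), now))   # accessed yesterday -> "flash"
print(place(datetime(2014, 4, 1), now))    # aging -> "sas"
print(place(datetime(2013, 1, 1), now))    # cold -> "archive"
```

The cost savings come from the same idea at scale: most data goes cold quickly, so a policy engine can keep only the hot fraction on expensive media.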

Elastic storage virtualizes storage, allowing multiple systems and applications to share common pools of storage. Through its support of OpenStack cloud management software, elastic storage also enables users to store, manage, access, and share data across private, public and hybrid clouds.
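The idea of a shared storage pool can be sketched in a few lines. This is a conceptual model only (class and application names are invented): rather than each application owning a fixed device, all applications draw from one common pool of capacity.

```python
# Hypothetical sketch of storage virtualization: several applications draw
# from one shared capacity pool instead of each owning a dedicated device.
class StoragePool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.allocations = {}  # application name -> GB allocated

    def free(self) -> int:
        return self.capacity_gb - sum(self.allocations.values())

    def allocate(self, app: str, gb: int) -> bool:
        """Grant space only if the shared pool can cover the request."""
        if gb > self.free():
            return False
        self.allocations[app] = self.allocations.get(app, 0) + gb
        return True

pool = StoragePool(1000)
pool.allocate("analytics", 400)
pool.allocate("transactions", 300)
print(pool.free())  # 300 GB remain, available to any workload
```

Because unused capacity is visible to every workload, pooling avoids the stranded space that results when each system is provisioned separately.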

“Digital information is growing at such a rapid rate and in such dramatic volumes that traditional storage systems used to house and manage it will eventually run out of runway,” says Tom Rosamilia, Senior Vice President, IBM Systems and Technology Group, in a statement.

The new storage software, according to IBM, is ideally suited for the most data-intensive applications, which require high-speed access to massive volumes of information—from seismic data processing, risk management and financial analysis, weather modeling, and scientific research, to determining the next best action in real-time retail situations.

“I think that one of the most important benefits is that you eliminate unnecessary data replication,” Hsu says. “How do people do analytics today? You store data in some kind of primary storage. In order to run analytics on those data, you need to ship those data to the analytics platform. [But] the data becomes so big that moving it to the analytics platform becomes very costly. And because data is so big, it takes a long time to move the data. You won’t get the real-time insight of your data.

“Existing infrastructure is running out of steam because the existing data center infrastructures are fragmented,” Hsu adds. “The way forward is a universal data platform, a data integration platform, one platform to put all your data there and run different processes on it.”
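Hsu’s point about the cost of shipping data to a separate analytics platform can be made concrete with a back-of-envelope calculation (the link speed here is an assumption chosen for illustration):

```python
# Back-of-envelope: how long does it take to copy a petabyte of primary
# data to a separate analytics platform over a 10 Gb/s link (assumed)?
PETABYTE_BITS = 8 * 10**15   # 1 PB in bits (decimal petabyte)
LINK_BPS = 10 * 10**9        # 10 gigabits per second

seconds = PETABYTE_BITS / LINK_BPS
days = seconds / 86400
print(f"{days:.1f} days")    # roughly 9.3 days at full line rate
```

Even at full line rate the copy takes more than a week, which is why processing data in place, rather than replicating it to a second platform, closes the gap to real-time insight.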

Born in IBM Research labs, elastic storage has demonstrated that it can scan 10 billion files on a single system in 43 minutes, according to IBM. It can exploit server-side flash for up to a sixfold performance increase over standard SAS disks, and it virtualizes storage so that multiple systems and applications can share common pools.

In addition, the software features native encryption and secure erase, which renders deleted data irretrievable to help comply with regulations such as HIPAA and Sarbanes-Oxley.

“Now you can get your insight much faster compared to your competition,” Hsu says. “You can use this data platform to inject the data and you can run your big data operations on the same platform. Your competition will have to inject the data into one primary storage [repository] and replicate the data to the big data platform. You have an advantage over them that is not a matter of seconds; it’s at least a matter of hours, days, because you are going to a one-step process, the others to a two-step process. They put the data in one storage [platform] first and then send it to the big data platform. In your environment, with elastic storage, you just put the data in one place and you can start processing it. The fact that you don’t have to move a large amount of data, the speed to get your insight, the delta is like hours or days.”

Elastic storage software will be available as a cloud service through IBM SoftLayer in the second half of this year.

For more information about IBM Software Defined Storage, visit IBM’s website.

Scott Etkin, a managing editor at Data Informed, a sister publication of Insights Magazine, contributed to this report.
