In-Memory Computing, hype or breakthrough?


Posted on: August 14, 2013 by Nicolas Roux

Gartner defines In-Memory Computing (IMC) as a computing style in which the primary data store for applications (the "datastore of records") is the central (or main) memory of the computing environment (on single or multiple networked computers) running these applications.

The basic idea behind IMC is that computer memory (RAM) can directly hold multi-terabyte datasets. IMC-enabled applications still use traditional hard-disk drives (HDDs), but these are no longer the primary data store. Instead, HDDs are used to persist in-memory data for recovery purposes, to manage overflow situations and to transport data to other locations.
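To make the pattern concrete, here is a deliberately minimal sketch (in Python, not modeled on any particular vendor's API) of a key-value store whose primary data lives entirely in RAM, with the disk relegated to recovery snapshots:

```python
import json
import os

class InMemoryStore:
    """Toy key-value store: all reads and writes are served from RAM;
    the disk is used only to snapshot data for recovery after a restart."""

    def __init__(self, snapshot_path="store.snapshot.json"):
        self.snapshot_path = snapshot_path
        self.data = {}                      # primary data store lives in RAM
        if os.path.exists(snapshot_path):   # recover state after a shutdown
            with open(snapshot_path) as f:
                self.data = json.load(f)

    def put(self, key, value):
        self.data[key] = value              # write served entirely from memory

    def get(self, key):
        return self.data.get(key)           # read served entirely from memory

    def snapshot(self):
        # Persistence sits outside the access path: the snapshot exists
        # for recovery and overflow, not for serving queries.
        with open(self.snapshot_path, "w") as f:
            json.dump(self.data, f)
```

A real IMC platform adds distribution, replication and logging on top of this idea, but the division of labour between memory and disk is the same.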

This design postulates that applications experience high performance and negligible data access latency, even when they need to scan large volumes of data, as in analytical or event-processing applications.

Is In-Memory Computing only hype spread by SAP?

SAP has certainly paved the way for mainstream In-Memory database and analytics solutions through its large-scale commercial push behind the HANA platform.

But data stored in-memory is not persistent (it is lost if the machine shuts down), and persistence of data is key for many business processes. On top of that, alternatives to 'spinning' hard drives, such as the new generation of solid state drives (SSDs), might bridge the performance gap while offering persistent storage.

Business processes are also usually supported by multiple applications, all of which need access to shared datasets but only very few of which can support in-memory computing.

So the technology might remain a niche vertical, with complex architectures, a lack of standards and security challenges. In other words, a niche for wealthy organizations that can afford it just to run analytics faster.

So why do we believe In-Memory Computing IS the future?

We think In-Memory Computing will have a long-term and disruptive impact, because it goes beyond merely boosting business performance: it can transform the business itself. Entire industries, such as High-Frequency Trading (HFT), would not exist without in-memory computing technologies.

The demand for processing ever-increasing volumes of data ever faster is not likely to slow down, and In-Memory Computing answers that need neatly. "Big Data" needs big memory. By boosting performance, In-Memory Computing enables many optimizations: low-latency application messaging, shorter batch processing, mixing transactions and analytics on the same data set, real-time context event correlation and processing...

But it also enables high-performance analytics, interactive data visualization and complex event processing, and this opens the door to completely new solutions:

  • Applications that would benefit from getting information (not only data) in seconds instead of hours or days: retail, e-commerce, e-advertising.
  • Applications that could leverage very intensive analytics: data mining, forecasting, security intelligence, customer relationship management, supply chain planning.
  • Applications that require complex event processing: predictive monitoring, intelligent metering, fraud and risk management.

The technology is also becoming affordable, with DRAM costs eroding by around 30% per year and the ability to use commodity hardware: a single standard 64-bit process can potentially address up to roughly 17 billion gigabytes of memory.
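For the curious, that figure follows from simple arithmetic on the theoretical 64-bit address space (actual processors expose fewer physical address bits, but the headroom remains enormous):

```python
addressable_bytes = 2 ** 64              # theoretical 64-bit address space
gibibytes = addressable_bytes / 2 ** 30  # convert bytes to gigabytes (GiB)
print(f"{gibibytes:,.0f} GiB")           # 17,179,869,184 GiB, i.e. about 17 billion
```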

As a CIO, what should I do tomorrow around In-Memory Computing?

Short term, launch projects to assess how In-Memory Computing can support your existing services and business applications. Process bottlenecks, long batches with heavy database load, performance issues: all these problems might be addressable with In-Memory Computing, but as with any new technology, the technical constraints and the TCO need to be carefully analyzed.

Medium term, investigate how In-Memory Computing could bring direct benefits to the business by enabling new high-value services. If you have not yet managed to transform your IT from a support function into a business-strategic function, this is a fantastic opportunity.

In-Memory Computing is seen as an important component of Fabric-Based Data Centers, a major topic the Atos Scientific Community is working on. Thanks to Tom Coenraad, Alex Caballero-Fernandez, Martin Pfeil, Boris Scharinger and other track members for their contribution to this article.



About Nicolas Roux

Procurement Director UK&I and member of the Scientific Community
Nicolas has been in the IT industry for 15 years, leading many consulting and integration projects. He held several positions within Atos Managed Services prior to joining the Procurement organization. He has also been an active member of the Atos Scientific Community since its creation in 2009.
