Madan Sheina, Lead Analyst, Software – Information Management
Interest in in-memory computing platforms is rising. Until now, the primary driver has been the “need for speed,” helping companies to process relatively modest data sets more quickly.
Both traditional and newer in-memory vendors are looking to scale up these capabilities with innovative platforms that promise to meet the increasing demand for more accelerated, operationally driven analytics against Big Data residing in Hadoop environments.
Their efforts promise to overcome the limitations of traditional analytic architectures and, if successful, to bolster Hadoop’s credentials as a mainstream option for operational enterprise computing needs.
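To make the idea of accelerated analytics against Hadoop-resident data more concrete, the minimal sketch below uses Apache Spark purely as a representative in-memory engine (the article does not name specific products); the HDFS path and column names are hypothetical. The point it illustrates is simple: pinning a working set in cluster memory lets repeated, interactive queries avoid disk I/O on each pass.

```python
# A minimal sketch, assuming Apache Spark as a stand-in for the
# in-memory engines discussed here; path and columns are hypothetical.
from pyspark.sql import SparkSession

# Start a Spark session; in a Hadoop deployment this typically runs on YARN.
spark = SparkSession.builder.appName("in-memory-analytics-sketch").getOrCreate()

# Read event data that already lives in HDFS.
events = spark.read.parquet("hdfs:///data/events")

# Pin the working set in cluster memory so repeated queries avoid disk I/O.
events.cache()

# Repeated, interactive aggregations now hit the in-memory copy.
events.groupBy("region").count().show()
events.filter(events.status == "error").groupBy("service").count().show()

spark.stop()
```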
Two magnitudes of in-memory play into Hadoop
Two “magnitudes” of in-memory stand out – processing speed and data scale. In-memory has more or less proved itself as a viable platform for processing data more quickly – at speeds that traditional, disk-based technologies simply cannot match. For companies that require real-time analytics, in-memory computing is certainly …