ISE Blog

In-Memory Computing Summit 2016

Last week, May 23-24, I presented at the In-Memory Computing Summit 2016 in San Francisco, CA. This is an industry-wide event focused on in-memory computing technologies and solutions. In-memory computing visionaries, decision makers, experts, and developers gathered to discuss current uses of in-memory computing as well as the future of this rapidly evolving market.

What is In-Memory Computing?

Put simply, it means storing your data (even your Big Data) in RAM so that it can be accessed, processed, and analyzed thousands of times faster than if it were on disk. This enables systems to achieve the high throughput and low latency required by many modern applications in financial, retail, social media, gaming, and other sectors. In addition, several keynotes delivered at the summit made it clear that the data processing speeds enabled by in-memory computing, along with simple, cost-effective cloud-based platforms, are adding significant momentum to the wave of IoT innovation.

It is frequently said that the value of data erodes over time; that is, data is most valuable when you can immediately derive insights and take action in response to it. In this sense, in-memory computing is one way to extract more value from your data. To be clear, this trend doesn't mean that disk is on the fast track to extinction (although some may say so). It just means that you put the data in the right place at the right time. Proper system architecture and design are still important to ensure the speed of RAM and the durability and cost of disk are each leveraged appropriately (after all, disk is still cheaper than RAM).
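To make that idea concrete, here is a minimal sketch of keeping data in RAM and reading it back, using Apache Ignite (the platform I discuss below). The cache name "readings" and the sensor key are purely illustrative, and the snippet assumes an Ignite node started with its default configuration; it shows the basic pattern, not a production setup.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;

public class InMemoryExample {
    public static void main(String[] args) {
        // Start an Ignite node; data placed in caches is held in RAM across the cluster.
        try (Ignite ignite = Ignition.start()) {
            // "readings" is a hypothetical cache name used for illustration.
            IgniteCache<String, Double> cache = ignite.getOrCreateCache("readings");

            // Writes and reads hit memory, avoiding disk I/O on the hot path.
            cache.put("sensor-42", 98.6);
            Double latest = cache.get("sensor-42");
            System.out.println("Latest reading: " + latest);
        }
    }
}
```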

There were many interesting breakout sessions to choose from, covering case studies, specific in-memory technologies, and general advances in the field. My presentation focused on one aspect of the open source in-memory data fabric Apache Ignite. Apache Ignite is a high-performance, integrated, and distributed in-memory platform for computing and transacting on large-scale data sets in real time, orders of magnitude faster than is possible with traditional disk-based or flash technologies. An often under-represented component of Apache Ignite (at least I like to think so) is Streaming and Complex Event Processing (CEP). In my presentation I provided a vendor-neutral assessment of Apache Ignite Streaming and CEP, with some discussion of how it supports unbounded data processing patterns such as those described in The World Beyond Batch: Streaming 101 and The World Beyond Batch: Streaming 102. I followed up that discussion with a live demonstration of a simple IoT use case using Apache Ignite streaming. The application demonstrated how sensors in a manufacturing plant could stream data into Apache Ignite, and how a client application could aggregate the data and present it on a dashboard; a rough sketch of that pattern follows below. Watch for a follow-up blog post with more details on the presentation, as well as links to the slides and video.
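The demo code itself isn't reproduced here, but a hypothetical sketch of that ingest-and-read pattern using Apache Ignite's data streamer might look like the following. The cache name, sensor IDs, and generated values are illustrative only and are not taken from the actual demo.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class SensorStreamingSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Cache holding the most recent reading per sensor (name is illustrative).
            IgniteCache<String, Double> readings = ignite.getOrCreateCache("sensorReadings");

            // IgniteDataStreamer batches entries and routes them to the nodes that own them,
            // which is what makes high-volume sensor ingest practical.
            try (IgniteDataStreamer<String, Double> streamer = ignite.dataStreamer("sensorReadings")) {
                streamer.allowOverwrite(true); // keep only the latest value per sensor
                for (int i = 0; i < 1_000; i++) {
                    String sensorId = "sensor-" + (i % 10);
                    streamer.addData(sensorId, Math.random() * 100); // simulated plant-floor reading
                }
            } // closing the streamer flushes any remaining buffered data

            // A client application could then aggregate over the cached readings;
            // here we simply read one entry back to show the data landed in memory.
            System.out.println("sensor-3 latest: " + readings.get("sensor-3"));
        }
    }
}
```

In the real application the aggregation and dashboard side would query the cache (for example with Ignite's SQL or continuous queries) rather than a single get(), but the ingest side follows the same shape as this sketch.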

Take a moment and use the comment section below to tell us how you are using in-memory computing or to pass along any questions you may have on this topic.  We'd love to hear from you!

Matt Coventon, Senior Software Engineer

Matt Coventon is a Senior Software Engineer and the Practice Lead for Big Data Services at Innovative Software Engineering (ISE), a professional services company with a strong interest in the intersection of vehicle telematics, mobile applications, and big data. In his 10 years at ISE his primary focus has been architecting and building web, middleware, and analytics applications that are high performance, fault tolerant, and easy to scale. He enjoys tackling new technologies and understanding how they can be best utilized within the enterprise. He’s a father of four, and in his free time, he’s a songwriter.