Banks are innovating to transform their services into meaningful, frictionless customer experiences. A key element in achieving that ambitious goal is providing well-tailored, reactive APIs as the building blocks for richer and smoother customer journeys. For these APIs to work, internal processes have to evolve as well, from batch processing to real-time event processing.
In this talk, after a brief introduction to the stream computing landscape, we describe “Coral”, a flexible and generic event processing platform that transforms streaming data into actionable events through a RESTful API. Users compose data flows for a range of streaming goals, such as on-the-fly clustering and classification, streaming analytics, per-event predictive analysis, and real-time recommenders. Once events are processed, Coral passes the resulting analysis on as actionable events for alerting, messaging, or further processing by other systems. Data flows are defined via the web API by connecting together basic stream processing elements called “coral actors”, which the Coral framework manages on a distributed, scalable architecture.
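As a flavor of what composing a data flow over the web API looks like, here is a minimal sketch in Python. The endpoint paths, actor type names, and JSON fields are illustrative assumptions for this abstract, not Coral’s documented API:

```python
# A minimal sketch of composing a Coral data flow over HTTP.
# Endpoint paths, actor types, and JSON fields are assumptions.
import requests

CORAL = "http://localhost:8000/api"  # assumed base URL of a Coral instance

# Create two coral actors: a source that receives raw events and a
# per-event statistics actor.
source = requests.post(f"{CORAL}/actors", json={
    "type": "httpbroadcast"          # assumed actor type: accepts posted events
}).json()

stats = requests.post(f"{CORAL}/actors", json={
    "type": "stats",                 # assumed actor type: running per-field statistics
    "params": {"field": "amount"}
}).json()

# Wire the actors into a flow: events entering `source` stream into `stats`.
requests.patch(f"{CORAL}/actors/{stats['id']}", json={
    "input": {"trigger": {"in": {"type": "actor", "source": source["id"]}}}
})

# Push a single event through the flow; it is processed as it arrives.
requests.post(f"{CORAL}/actors/{source['id']}/in", json={"amount": 42.0})
```

Because the flow is just a graph of actors defined over HTTP, it can be inspected, extended, or rewired at runtime without redeploying the platform.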
Streaming, real-time data processing and analytics are key to an improved customer experience: they enable processing precisely targeted to your domain (marketing customization, personalized recommenders, fraud detection, real-time security alerting, and so on). The streaming “data flow” model means processing customers’ events as soon as they enter via the web APIs, an approach that borrows heavily from the distributed data flow concepts developed for processor architectures in the 1980s. The Coral stream processing engine is generic, built on top of world-class libraries such as Akka and Spark, and fully exposed via a RESTful web API.
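To make the contrast with batch processing concrete, the sketch below (illustrative only, not Coral code; all stage names and thresholds are invented) shows the per-event data flow model: each event traverses a chain of processing stages the moment it arrives, rather than waiting for a nightly batch run.

```python
# Illustrative per-event "data flow": each event is enriched, scored,
# and acted upon immediately on arrival. All names here are hypothetical.

def enrich(event):
    # Stand-in for a lookup stage, e.g. attaching a customer profile.
    return {**event, "segment": "retail"}

def score(event):
    # Stand-in for a per-event predictive model (e.g. fraud detection).
    return {**event, "fraud_score": 0.97 if event["amount"] > 10_000 else 0.01}

def alert(event):
    # Stand-in for an actionable-event sink (alerting, messaging, ...).
    if event["fraud_score"] > 0.9:
        print(f"ALERT: suspicious transaction {event}")

def on_event(event):
    # The whole flow runs as soon as the event enters via the web API.
    alert(score(enrich(event)))

on_event({"customer": "c-123", "amount": 25_000})
```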