Event-driven Architecture is Suddenly Popular, Will Stream Analytics be Next?

by W. Roy Schulte, 23 March 2020

This article represents the author’s opinion, not necessarily that of Gartner Inc. or any other company.

I am reminded of the wry observation, made by someone I don’t remember but wish I did, that the Internet was an overnight sensation… after 25 years of existence.

Event-driven architecture (EDA) has suddenly become a hot topic among software architects, after almost three decades of sitting on the back burner. There was an initial wave of interest in EDA when message-oriented middleware (MOM), including publish-and-subscribe (pub/sub) messaging systems, emerged in the late 1980s and early 1990s. But most application architecture continued to center on batch processing and synchronous, client/server (request/reply) design patterns, such as service-oriented architecture and its recurring facelifts (microservices and APIs). Sure, some advanced organizations deployed message buses and implemented event-driven applications in the intervening decades, but EDA remained underutilized in most mainstream IT departments until now.

This seems to be changing, driven by business demands for more flexible, easier-to-change, loosely coupled applications, and enabled by a new generation of high-performance pub/sub messaging infrastructure products, including Kafka, Kinesis, Pulsar, NATS Streaming, Solace, Azure Event Hubs and numerous other subsystems. There has been a widespread awakening to the benefits of EDA for increasing the scalability and agility of business systems. We have recently seen some excellent articles that explain the mechanics and benefits of EDA from sources including Amazon, Confluent and Solace.
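The loose coupling described above comes from the pub/sub pattern itself: publishers and subscribers share only a topic name, never a direct reference to each other. The following is a minimal, hypothetical in-memory sketch of that idea (real brokers such as Kafka or Pulsar add persistence, partitioning and delivery guarantees; the class and topic names here are illustrative):

```python
from collections import defaultdict

class Broker:
    """A toy pub/sub broker: topics map to lists of subscriber callbacks."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic name -> callbacks

    def subscribe(self, topic, callback):
        """Register a callback to receive every event published on a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        """Deliver an event to all current subscribers of a topic."""
        for callback in self._subscribers[topic]:
            callback(event)

# Usage: two independent consumers react to the same order event,
# without the publisher knowing either of them exists.
broker = Broker()
broker.subscribe("orders", lambda e: print("billing saw", e))
broker.subscribe("orders", lambda e: print("shipping saw", e))
broker.publish("orders", {"order_id": 42})
```

Because the publisher addresses only the topic, new consumers can be added or removed without changing or redeploying the producing application, which is the agility benefit the articles cited above describe.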

But there is a second important aspect of event processing, something related to EDA but actually different in many respects. I am referring to the use of event data in stream analytics to provide real-time or near-real-time intelligence. Stream analytics is based on the mathematics of complex-event processing (CEP). CEP is a computing technique in which incoming data about what is happening (event data) is processed as it arrives (data in motion or recently in motion) to generate higher level, more useful, summary information (complex events). Complex events are computed through aggregation (e.g., a count of how many times “Coronavirus” has been tweeted in the past 10 minutes) or pattern detection (e.g., is this sequence of financial transactions likely to be fraudulent?). One complex event may be the result of calculations performed on a few, or on millions, of base (input) events from one or more event sources.
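The aggregation style of CEP described above can be sketched in a few lines: a sliding-window count, updated as each matching event arrives rather than in batch. This is a hypothetical illustration of the tweet-counting example (the class name and the 10-minute window are assumptions for the sketch, not any product's API):

```python
from collections import deque

class SlidingCount:
    """Counts matching events seen within a trailing time window."""

    def __init__(self, window_seconds=600):  # 600 s = the 10-minute window
        self.window = window_seconds
        self.timestamps = deque()  # arrival times of matching events

    def observe(self, timestamp):
        """Record one matching event; return the current count (a complex event)."""
        self.timestamps.append(timestamp)
        # Evict base events that have slid out of the window.
        while self.timestamps and self.timestamps[0] < timestamp - self.window:
            self.timestamps.popleft()
        return len(self.timestamps)

# Usage: call counter.observe(time.time()) each time an incoming
# tweet mentions "Coronavirus"; the return value is the rolling count.
counter = SlidingCount()
```

The point of the sketch is that the summary is maintained incrementally over data in motion; the pattern-detection style (e.g., fraud sequences) works similarly but matches ordered conditions across events instead of counting them.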

Early work on CEP began in the late 1990s, so the field is more than 20 years old, although there were precursors (as there always are) and there have been some significant recent advances. In a fast-changing world, awash in event streams such as clickstreams, sensor event streams and news feeds, stream analytics is essential for situation awareness. It helps companies understand what is going on so they can decide what to do.

There is an enormous amount of CEP already being executed in logistics, supply chain, manufacturing, finance, security information and event management (SIEM), and other applications. However, it is mostly hidden within off-the-shelf products and services, such as SaaS, packaged applications or devices. Stream analytics is still mostly out of sight of architects and application developers in IT departments.

This may be changing soon. The data is already available in the tsunami of event streams coursing through the networks of large enterprises, and the volume and variety of those streams are rapidly increasing. Some stream analytics applications cannot be bought off-the-shelf, so enterprises must build them themselves. These applications are too specialized to have a canned solution, especially those that require combining data from multiple sources such as heterogeneous SaaS or packaged applications, and/or sensor data from the real world.

Software architects and analytics experts, working together, can extract the information value of their event streams by leveraging the increasingly powerful, and increasingly easy-to-use event stream processing (ESP) platform products. Gartner Inc. is currently tracking 34 such products, so the CEP technology for stream analytics is available now (that’s a story for another time and place).

For those interested in looking into stream analytics and CEP in more depth, multiple sources are available. You can read about the difference between ESP and CEP, eight trends in ESP, or when you need an ESP platform on this web site. Or if you really want to dig down, read a book such as:

  • “The Power of Events: An Introduction to Complex Event Processing in Distributed Enterprise Systems,” David Luckham, Addison-Wesley Professional
  • “Event Processing: Designing IT Systems for Agile Companies,” K. Mani Chandy and W. Roy Schulte, McGraw-Hill
  • “Event Processing for Business,” David Luckham, John Wiley & Sons
