Is Complex Event Processing Part of Artificial Intelligence?
David Luckham, W. Roy Schulte
When two technologies such as CEP and AI become everyday business speak, and everybody has them in their commercial offerings, it seems appropriate to discuss how they are related. Read on!
Complex Event Processing (CEP) is a meta-framework of techniques (e.g., event filtering, event pattern matching, causal and timing analysis, hierarchical abstraction of events, construction of complex events, specification of event hierarchies) for processing flows of events in real time and abstracting understandable, actionable information from those event flows. CEP is generally intended for high-speed, real-time event processing in business or governmental situations where critical decisions must be made quickly, at a human management level, based upon large flows of low-level event data. CEP was developed and tested in the 1990s.
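As a minimal sketch (not a production CEP engine) of two of these techniques, event filtering and hierarchical abstraction, the following Python fragment filters low-level sensor events and aggregates a sliding window of significant readings into one higher-level "spike" complex event. The event fields, thresholds, and the "spike" pattern are hypothetical, for illustration only.

```python
from collections import deque

def filter_events(events, min_value):
    """Event filtering: discard low-level events below a significance threshold."""
    return [e for e in events if e["value"] >= min_value]

def detect_spike(events, window=3, threshold=250):
    """Pattern matching and abstraction: when the values in a sliding window
    sum past a threshold, emit one higher-level "spike" complex event that
    records its constituent (member) events."""
    complex_events = []
    buf = deque(maxlen=window)
    for e in events:
        buf.append(e)
        if len(buf) == window and sum(x["value"] for x in buf) > threshold:
            complex_events.append({
                "type": "spike",
                "members": [x["id"] for x in buf],  # causal history of the abstraction
            })
    return complex_events

# Low-level sensor readings (hypothetical data).
raw = [{"id": i, "value": v} for i, v in enumerate([10, 90, 95, 100, 20, 5])]
significant = filter_events(raw, min_value=50)   # keeps ids 1, 2, 3
spikes = detect_spike(significant)
print(spikes)  # one "spike" complex event aggregating events 1, 2, 3
```

Note how the complex event carries the identities of its member events, so the abstraction can be traced back to the low-level flow that caused it.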
Artificial Intelligence (AI), on the other hand, has undergone colossal growth in its possible meanings since its original conception. For example, contrary to some popular misunderstandings, AI does not necessarily mean “any system based on a neural network.” Neural networks are often used as implementations of AI, but many systems that exhibit AI don’t use neural networks.
In a few reports on AI, CEP is mentioned as one technique, among many, that is sometimes used in the construction of AI capabilities. While this is true, it is also true that AI can be employed in building CEP applications. Thus there is a symbiotic relationship between the two technologies, which we discuss below.
What is Artificial Intelligence? In the beginning, the mid-1950s, when the term AI was invented, it connoted a machine that behaved in a manner that would be considered humanly intelligent. This was the general meaning that John McCarthy, one of the founding fathers of AI, intended. One of the authors of this blog, David Luckham, worked on several projects for John McCarthy as a graduate student at MIT in the late 1950s, and later on he was invited by McCarthy to join the Stanford AI laboratory. McCarthy defined AI as “the science and engineering of making intelligent machines,” which helps, although it raises the question of how to define intelligence.
Earlier, Alan Turing in his famous 1950 paper, “Computing Machinery and Intelligence” (better entitled, “Can Machines Think”) had proposed an answer that became known later on as the Turing Test for intelligence. Turing held that if a person conversing with a machine by means of text input and output could not determine if they were dealing with a human person or the machine, then it would be reasonable to call the machine intelligent. This is compatible with other notions of AI that describe it as the behavior of a system that is surprisingly sophisticated, going beyond what one would expect of a machine (although of course, expectations vary between people and change over time so “surprise” is not a reliable test).
Today AI encompasses a huge variety of subfields, far beyond what McCarthy envisaged, ranging from the general (learning and perception) to the specific, such as playing chess, proving mathematical theorems, writing poetry, driving a car, and diagnosing diseases. AI is relevant to constructing a program to perform any intellectual task; it is truly a universal field. Russell & Norvig, in their comprehensive textbook on AI, take over 1,000 pages to describe the various applications that now fall under the umbrella of AI, and only in the last 100 pages do they get to what McCarthy really meant by the term. They label it HLAI (Human-Level AI)!
So it seems appropriate to ask if Complex Event Processing is indeed an aspect of, or part of, Artificial Intelligence as it is currently meant. First of all, McCarthy had no thoughts whatever in 1956 about event processing. The computing world at that time was extremely static. And any considerations about human thought that he may have had certainly did not encompass dynamic flows of events and their aggregation to higher levels of abstract events in real-time. So at first thought, the answer would seem to be no. But times and the meanings of AI have changed!
In many of today’s commercial applications CEP is used to build an AI application, and conversely AI can be used in a CEP application:
- An AI system can often employ CEP software (which may be stream processing frameworks or event stream processing platforms) to process incoming events and feed information to other software components that use AI techniques such as neural networks, statistical machine learning (ML, e.g., regression, random forests, and decision trees), rules engines, or control system logic. The CEP stream processing logic preprocesses the input data, running ahead of the other intelligent components and enabling them to accomplish tasks that they would otherwise not have been able to do. In this case, CEP is used to implement an AI application.
Self-driving cars are a good example of CEP as part of an overall AI system. Event streams from LiDAR, cameras, engine and transmission sensors, and numerous other sensors both inside and outside the car are preprocessed by event processing logic to calculate complex events (intermediate results) that signify a wide variety of situations. These complex events are input to the higher-level robotic AI systems that drive the car, controlling its steering, braking, and other subsystems.
- Conversely, CEP systems may employ AI agents at many levels. The processing in modern CEP systems is typically a multi-stage pipeline. One or more stages can consist of agents that implement AI algorithms – for example, to determine the reliability of incoming events, or to allocate probabilistic measures of importance to complex events that are aggregations of lower-level events.
Examples of CEP systems with AI agents can be found in systems currently in use worldwide to monitor large volumes of incoming events, often using global event processing networks of agents hosted on the Internet. Some examples are early warning systems used to detect various critical situations. These systems have never been described as “CEP systems,” but that is precisely what they are! They include early warning systems for weather threats such as tsunamis (e.g., DART), national infrastructure monitoring such as the electrical grid (so-called smart grids), and warning systems for possible emerging disease pandemics (e.g., GPHIN). In all of these event processing systems, early and correct detection of an emerging crisis is vital. The use of AI in these kinds of event processing systems will increase as their performance is assessed over time.
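A multi-stage CEP pipeline with an embedded AI agent, as described above, can be sketched as follows. Here a simple z-score heuristic stands in for the "AI agent" stage that assigns a probabilistic importance score to each event; in a real system that stage could wrap a trained neural network or statistical classifier. All stage names, event fields, and values are hypothetical.

```python
import statistics

def stage_filter(events):
    # Stage 1: drop malformed readings (missing values).
    return [e for e in events if e.get("value") is not None]

def stage_score(events):
    # Stage 2 (AI agent): score each event by how anomalous its value is
    # relative to the batch, a stand-in for a learned reliability model.
    values = [e["value"] for e in events]
    mu = statistics.mean(values)
    sigma = statistics.pstdev(values) or 1.0  # avoid division by zero
    for e in events:
        e["score"] = abs(e["value"] - mu) / sigma
    return events

def stage_aggregate(events, score_threshold=1.5):
    # Stage 3: abstract high-scoring events into one higher-level alert.
    flagged = [e for e in events if e["score"] >= score_threshold]
    if not flagged:
        return None
    return {"type": "alert", "members": [e["id"] for e in flagged]}

def pipeline(events):
    return stage_aggregate(stage_score(stage_filter(events)))

# Hypothetical readings: one malformed event (id 5) and one outlier (id 4).
readings = [{"id": i, "value": v}
            for i, v in enumerate([10, 11, 9, 10, 50, None, 10])]
alert = pipeline(readings)
print(alert)
```

The point of the sketch is the architecture: each stage consumes the previous stage's output, and the AI step sits inside the event flow rather than replacing it.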
There is an increasing number of examples of “CEP in an AI system” and “AI in a CEP system.” Ultimately, we must conclude that the difference between these two scenarios is not really important. The two technologies, CEP and AI, have a symbiotic relationship when used together to make intelligent systems that leverage the power of real-time event processing.