The Origins of Complex Event Processing
By David Luckham¹
Complex Event Processing (CEP) was developed on a research project at Stanford University during the 1990s. The project, called Rapide, aimed to develop an executable language for specifying distributed event-based systems and an accompanying simulator for such specifications.
The Rapide project was a research effort funded by DARPA to develop a system for simulating aspects of battlefield management systems that were to be coded in Ada. This research led to a system consisting of a language and simulator, together called Rapide.
Rapide: The starting point for Rapide was the VHSIC Hardware Description Language (VHDL). This was a government project started in 1981 to standardize technology for modelling the behavior and structure of digital systems at multiple levels of abstraction, ranging from the instruction set level down to that of logic gates, for design entry, documentation, and verification purposes.
Today VHDL is supported by a number of commercial simulators. VHDL led me to become interested in event-based simulation of multi-level system designs. Since DARPA was encouraging cooperative projects between universities and industry, a joint proposal was made to DARPA by Stanford and TRW in the late 1980s to build an event-based language and simulator for simulating system architectures at multiple levels of design.
Perhaps the most innovative concepts driving this project were (i) developing a capability to track the causal and timing relationships between events produced by the simulator, and (ii) an ability to define correspondences (called maps) between events at different levels in a hierarchical design, and to be able to execute those maps at runtime.
Sets of events together with their causal and timing attributes were called posets (partially ordered set of events). Correspondences between design levels could be defined in Rapide by maps between levels. A map was a Rapide construct that defined the posets of events at one level in a multi-level design that corresponded to a single event at the next higher level in the design.
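The idea of a poset with causal and timing attributes can be sketched in modern terms (a minimal Python sketch; Rapide had its own syntax, and the event names here are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    name: str
    timestamp: int
    causes: tuple = ()  # events that causally precede this one

def causally_precedes(a, b):
    """True if event a is an ancestor of event b in the causal DAG."""
    return a in b.causes or any(causally_precedes(a, c) for c in b.causes)

# A tiny poset: two independent register reads cause an ALU add,
# which in turn causes a write-back.
r1 = Event("read_reg_A", 1)
r2 = Event("read_reg_B", 1)
add = Event("alu_add", 2, causes=(r1, r2))
wb = Event("write_reg_C", 3, causes=(add,))

assert causally_precedes(r1, wb)      # causality is transitive
assert not causally_precedes(r1, r2)  # independent events: a partial, not total, order
```

The two reads are causally unrelated to each other, which is exactly why the set of events is only partially ordered.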
Using the simulator, the lowest level of a system design could be run, and the result of that simulation could be mapped up to the next level, yielding a simulation of that level. This process could be repeated up through multiple levels; thus different levels of a system design written in Rapide could be simulated at the same time.
System Architectures in Rapide: An architecture was a network of computational modules and communication lines between the modules. A module consisted of an interface and a body. A module interface defined its input/output ports (means of communication) with event-pattern constraints specifying how the input-output ports should react to events being sent to them. A module body contained either another architecture or executable code. Thus multiple levels of an architectural design were expressed by nesting architectures within modules. Modules with coded bodies were the lowest level in a multi-level architecture.
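The nesting of architectures within modules can be rendered schematically (Python dataclasses as a sketch; Rapide's actual syntax and semantics differed, and all names here are illustrative):

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

@dataclass
class Interface:
    in_ports: List[str]
    out_ports: List[str]
    constraints: List[str]  # event-pattern constraints, as plain text here

@dataclass
class Module:
    interface: Interface
    body: Union["Architecture", str]  # a nested architecture, or code (lowest level)

@dataclass
class Architecture:
    modules: List[Module]
    connections: List[Tuple[str, str]]  # (from_port, to_port) communication lines

# A two-level design: a top module whose body is itself an architecture.
leaf = Module(Interface(["in"], ["out"], []), body="executable code")
top = Module(Interface(["req"], ["resp"], ["req must precede resp"]),
             body=Architecture([leaf], [("req", "in"), ("out", "resp")]))
assert isinstance(top.body, Architecture)  # nesting expresses design levels
```

A module whose body is a string of code is a lowest-level module; a module whose body is an `Architecture` opens another design level beneath it.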
Executable Hierarchies were implemented using event-pattern mappings (i.e., Rapide maps) between events at different hierarchy levels. For example, in a chip design one could define how an instruction (an event at the instruction level) was implemented by a poset of events at the lower register transfer level, by a map. There were already a few industry standard hierarchies including the OSI 7 layer communication hierarchy, the TCP/IP 6 layer hierarchy in networking, and similar hierarchies in Hardware design. These served as models for the concept of hierarchy in Rapide. But the Rapide concept went far further by making the hierarchy an executable specification. Using event patterns to map from lower level posets to higher level events was called vertical abstraction.
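Vertical abstraction can be illustrated with a toy map (Python; Rapide used its own event-pattern syntax, and the pattern and event names here are hypothetical):

```python
from collections import Counter

# Hypothetical pattern: an ADD instruction at the instruction level is
# implemented by two register reads, one ALU add, and one write-back
# at the register-transfer level.
RT_ADD_PATTERN = Counter({"read_reg": 2, "alu_add": 1, "write_reg": 1})

def map_to_instruction(rt_events):
    """A toy 'map' in the Rapide sense: abstract a poset of
    register-transfer events into one instruction-level event."""
    if Counter(e["kind"] for e in rt_events) == RT_ADD_PATTERN:
        return {"kind": "ADD", "causes": rt_events}
    return None  # the poset matches no instruction-level pattern

low_level = [{"kind": "read_reg"}, {"kind": "read_reg"},
             {"kind": "alu_add"}, {"kind": "write_reg"}]
instruction = map_to_instruction(low_level)
assert instruction["kind"] == "ADD"
```

The higher-level event keeps the lower-level poset as its causes, so the hierarchy remains navigable after abstraction.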
The Rapide Toolset: An event-driven simulator for the Rapide language was built and applied to simulating the architectures of various systems. Examples of systems that were simulated included a computer chip design, an air traffic control system, the TIBCO Information Bus (TIB, a messaging system), and various e-commerce websites. The output of the Rapide simulator was a poset that was graphically represented as a DAG to show the causal and timing relationships between events. Various analysis tools were built, including a constraint checker, a browser, and animation tools.
Figure 1 shows an abstraction hierarchy for an e-commerce website, a typical example of which would be an Amazon website. Maps define the correspondences between posets of events on the website (the lowest level) and business analysis events (a higher level) that create a customer profile, run credit checks, record buying history, and track supplies of products.
Figure 1. Abstraction Hierarchy for an E-Commerce Website
Maps are again used to specify events at a third level where events abstract website performance and business issues from the second level events.
During the early 1990s Intel discussed with our project at Stanford the design flaws in their chips, and their failure to detect those flaws during design simulations. We demonstrated, using Rapide, how event abstraction made very small low-level design errors become obvious when abstracted to higher levels. For example, a wrong connection at the gate level would be magnified into a failed Add instruction at the instruction level of the design. This work received the best paper award at the 1992 Design Automation Conference in Anaheim, sponsored jointly by the ACM and IEEE.
CEP: In describing an event hierarchy, a term was needed for an event above the lowest level of the hierarchy. Since a higher level event was composed of many lower level events, the term complex event was coined. For example, in a hardware design hierarchy an Add instruction may involve about 20 register transfer operations, and in turn perhaps several hundred gate level operations. In an e-commerce website, a business event such as a “product sold out” event may result from several thousand website activity events. We called this hierarchical simulation methodology using Rapide Complex Event Processing, or CEP.
The CEP Toolset: The Rapide system consisted of the Rapide language, a simulator for executing architecture designs written in Rapide, and an analysis tool set for checking consistency between designs and specifications. Figure 2 below shows the whole design specification and simulation process.
- Requirements were expressed as a Rapide architecture, usually a multi-level architecture. This led to a Rapide model of the requirements.
- The model was compiled and fed to the simulator which was then run.
- The output was a poset (partially ordered set of events).
- The output was fed to various analysis tools. These consisted of a constraint checker, which checked the consistency of the output with design constraints (expressed as patterns of events); a graphical tool that represented the output in DAG format; and a set of animation tools that allowed the simulation to be replayed in slow motion, showing how the events were generated and their time order.
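A constraint check of this kind can be sketched as follows (a minimal Python sketch; the event representation and the particular constraint are illustrative, not Rapide's):

```python
def ancestors(event):
    """All events causally preceding the given event (its DAG ancestors)."""
    found = []
    for cause in event.get("causes", []):
        found.append(cause)
        found.extend(ancestors(cause))
    return found

def check_constraint(poset):
    """Return the events violating an example constraint: every 'write'
    event must be causally preceded by some 'alu' event."""
    return [e for e in poset
            if e["kind"] == "write"
            and not any(a["kind"] == "alu" for a in ancestors(e))]

alu = {"kind": "alu", "causes": []}
good_write = {"kind": "write", "causes": [alu]}
bad_write = {"kind": "write", "causes": []}

assert check_constraint([alu, good_write]) == []
assert check_constraint([bad_write]) == [bad_write]
```

Because the constraint is checked against the causal DAG rather than a flat event log, it distinguishes a write that merely follows an ALU event in time from one that was actually caused by it.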
At some point it was decided that the analysis toolset could be detached from the Rapide simulator and applied to any system that generates events. All that would be required to use the full power of the Rapide analysis tools was a model of causality between the events in that system. This was the step that led to the use of CEP for many kinds of event-based systems outside of simulation.
Figure 2. Rapide Design Specification and Simulation Process
This version of the CEP tools could be used in real time as the target system was running, or for post-mortem analysis. Real-time use could handle several hundred events per minute. So we began an effort to apply our analysis tools to other systems. The first choice was the TIBCO Information Bus, which allowed various applications to communicate with one another by means of messages, which were events. This led to a collaboration with TIBCO, whereby a copy of the TIB was given to our Stanford project for experiments.
Commercialization: In early 2000 the Stanford group began seeking funding to start a company to commercialize CEP. Initially we experienced pushback on the terminology “Complex Event Processing”. Some potential funders responded that software was already too “complex” and that we should be selling “Simple Event Processing”. However, we stuck with CEP, explaining the fundamental concept of hierarchies of events, with higher level events being composed of many lower level events and therefore being “complex”.
A start-up company called ePatterns was eventually founded. A very promising proof-of-concept application of CEP was proposed by Matson Shipping Lines, a container shipping company with offices in Oakland. Matson had a small research group that were trying to automate the Trip Plans for their container ships. We made a presentation of CEP to them, and they suggested funding ePatterns to build a proof-of-concept implementation of trip plans using CEP.
This was a very simple application for our implementation of CEP at the time, since trip plans usually dealt with fewer than a hundred events a day. Unfortunately, politics within ePatterns led to the company foundering before this effort could be initiated.
As a consequence of that failure, the sole outcome of the ePatterns company was the book I wrote as CTO describing CEP and its possible applications, “The Power of Events”, published in 2002.
Today “CEP” is in common usage with many companies selling or using CEP products.
¹ I am indebted to Roy Schulte for help with this article.