Complex Event Processing
The shortest distance between two points may be a straight line, but in business process management (BPM) it may not be the most effective route. Today's approach to BPM is linear: one step hands off to the next. In some cases, a rules engine may be involved to facilitate the handoffs. But what's ultimately lacking is context, the bigger-picture understanding needed to intelligently evaluate and deal with unique situations.
A new, emerging technological approach called complex event processing (CEP) brings BPM into the real world of business complexity, allowing each step in a process to be informed not simply by the previous step, but by any other step, data, or pattern of behavior deemed relevant to that step.
Traditional BPM technology has three primary functions:
1. Execute a step, providing any data necessary;
2. Receive the result of that step and any data returned; and,
3. Based on that data, evaluate where to go next.
However, these functions always happen in this order, and each one has visibility only to the previous step and any data collected along the way. Because of this, BPM is useful only for processes that are predictable, linear, and controlled based on data that is collected or derived during the process. There may be branches and loops, but fundamentally the process can be completely understood in all its permutations and enforced. The process begins and ends, with no need to evaluate outcomes for future iterations of the process.
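The three functions above can be sketched as a simple loop. This is a minimal, hypothetical illustration, not any vendor's actual engine; all names (`run_process`, the step and rule tables) are invented for the example. Note how each transition decision can see only the accumulated process data, never outside events:

```python
# Minimal sketch of a traditional, linear BPM engine (illustrative only).
def run_process(steps, next_step_rules, data):
    """Execute steps one at a time; each transition sees only the prior
    step's result plus the data accumulated so far."""
    current = "start"
    while current != "end":
        result = steps[current](data)              # 1. execute the step
        data.update(result)                        # 2. receive its returned data
        current = next_step_rules[current](data)   # 3. decide where to go next
    return data

# Usage: a tiny two-branch approval flow.
steps = {
    "start":   lambda d: {"amount_ok": d["amount"] < 1000},
    "approve": lambda d: {"approved": True},
    "review":  lambda d: {"approved": False},
}
rules = {
    "start":   lambda d: "approve" if d["amount_ok"] else "review",
    "approve": lambda d: "end",
    "review":  lambda d: "end",
}
result = run_process(steps, rules, {"amount": 250})
# result["approved"] is True for the small amount
```

The limitation the article describes is visible in the code: nothing outside `data` can influence the routing, so an external event (say, a fraud alert raised elsewhere) has no way to redirect the process mid-flight.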
CEP was developed at Stanford University in the mid-1990s. According to its creator, Professor David Luckham, the goal of CEP is to enable information contained in the events flowing through all of the layers of the enterprise IT infrastructure to be discovered, understood in terms of its impact on high-level management goals and business processes, and acted upon in real time. CEP implementations are built around a few core principles.
Events are technology-neutral occurrences of interest, such as a new purchase, a change of address, or an attempt to break into a network. Events can come from people, devices, applications, networks, or databases.
Events have context, that is, an instance of an event implies timing (when it happened, both in absolute terms and relative to other events), sequence (again relative to other events), and linking relationships to other events (patterns of events, either expected or implied).
Events can also carry information about themselves. For example, a "purchase" event might contain the product purchased and the purchaser.
Events can be evaluated, either based on their context, their data, or additional data that may not accompany the event but is relevant (for example, is the purchaser a gold customer?).
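The first three principles (events carry context, carry data, and can be evaluated against data that doesn't travel with them) can be sketched in a few lines. The class and function names, and the "gold customer" lookup table, are assumptions made for illustration:

```python
# Sketch of an event with context (timestamp) and payload, evaluated
# against external reference data. All names are illustrative.
from dataclasses import dataclass, field
import time

@dataclass
class Event:
    kind: str          # e.g. "purchase", "change_of_address"
    payload: dict      # data the event carries about itself
    timestamp: float = field(default_factory=time.time)  # context: when

# Reference data that is relevant but does not accompany the event.
GOLD_CUSTOMERS = {"cust-42"}

def is_gold_purchase(ev: Event) -> bool:
    """Evaluate an event using both its own data and external data."""
    return ev.kind == "purchase" and ev.payload.get("customer") in GOLD_CUSTOMERS

ev = Event("purchase", {"product": "SKU-7", "customer": "cust-42"})
# is_gold_purchase(ev) → True: the purchaser is a gold customer
```

The point of the separation is that the evaluation logic, not the event producer, decides what surrounding data matters.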
Events form patterns in time and/or sequence that may be interesting. For example, a change of address followed by the reporting of a lost ATM card may indicate an attempt to profit from identity theft.
Events can compose ad hoc processes: a sequence of related events can itself constitute the execution of a process. For example, a revenue recognition process may be composed of the events "contract signing," "purchase order received," and "product shipped." An explicitly defined process automated using events is called an Event Flow.
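The revenue recognition example above can be sketched as a check that the required events arrived in order, with unrelated events interleaved. This is a hedged illustration of the Event Flow idea, not AptSoft's implementation; the event names follow the article's example:

```python
# Sketch of an Event Flow: the process completes only once its
# constituent events have occurred in sequence (illustrative only).
REQUIRED = ["contract_signed", "purchase_order_received", "product_shipped"]

def flow_complete(event_log):
    """True if REQUIRED appears as an in-order subsequence of the log;
    other events may be interleaved."""
    it = iter(event_log)  # shared iterator preserves ordering across checks
    return all(any(e == needed for e in it) for needed in REQUIRED)

log = ["contract_signed", "invoice_sent",
       "purchase_order_received", "product_shipped"]
# flow_complete(log) → True; a log missing or reordering a step → False
```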
Events may be abstracted into other events. For example, a change of address event followed by the reporting of a lost ATM card event may be represented by the compound event "Attempted Fraud."
Events can optionally generate responses, or actions. For example, an "Attempted Fraud" event may trigger a "Put Account on Referral" action to make sure downstream account activity is legitimate.
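The last three principles, pattern detection, abstraction into a compound event, and an optional action, fit together in one pipeline. The sketch below uses the article's fraud scenario; the 72-hour window, function names, and tuple-based event shape are assumptions made for illustration:

```python
# Sketch: detect a pattern (change of address then lost-card report),
# abstract it into a compound "attempted_fraud" event, and respond.
# Window size and all names are illustrative assumptions.
WINDOW = 72 * 3600  # seconds; assumed correlation window

def detect_fraud(events):
    """events: list of (kind, account, timestamp). Returns compound
    events abstracted from matching patterns."""
    derived = []
    for kind, acct, ts in events:
        if kind == "lost_card_reported":
            for k2, a2, t2 in events:
                if (k2 == "change_of_address" and a2 == acct
                        and 0 <= ts - t2 <= WINDOW):
                    derived.append(("attempted_fraud", acct, ts))
    return derived

def respond(compound_events):
    """Map compound events to actions."""
    return [("put_account_on_referral", acct)
            for kind, acct, ts in compound_events
            if kind == "attempted_fraud"]

stream = [("change_of_address", "acct-9", 0),
          ("lost_card_reported", "acct-9", 3600)]
actions = respond(detect_fraud(stream))
# actions → [('put_account_on_referral', 'acct-9')]
```

Note that the raw events are individually unremarkable; only their correlation in time and account produces the higher-level event, which is the core of the paradigm the article describes.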
Taken together, these basic principles represent a paradigm shift in the approach to understanding and responding to business activity through IT infrastructure.
The scope of this shift is best illustrated by comparing CEP with traditional process automation approaches.
While BPM provided a necessary first-generation technology, its limitations are clear and must be overcome for process automation to reach its potential. With today's businesses focused on responding to both threats and opportunities in a hyper-competitive economy, operational efficiency becomes a tremendous advantage.
Complex event processing provides a new generation of capabilities to advance this objective.
About the Author
David Cameron is vice president of marketing and product integration at AptSoft Corporation (www.aptsoft.com) and can be emailed at firstname.lastname@example.org