Complex event processing

Event processing is a method of tracking and analyzing (processing) streams of information (data) about things that happen (events),[1] and deriving a conclusion from them. Complex event processing, or CEP, consists of a set of concepts and techniques developed in the early 1990s for processing real-time events and extracting information from event streams as they arrive. The goal of complex event processing is to identify meaningful events (such as opportunities or threats)[2] in real-time situations and respond to them as quickly as possible.

These events may be happening across the various layers of an organization as sales leads, orders or customer service calls. Or, they may be news items,[3] text messages, social media posts, stock market feeds, traffic reports, weather reports, or other kinds of data.[1] An event may also be defined as a "change of state," when a measurement exceeds a predefined threshold of time, temperature, or other value.

Analysts have suggested that CEP will give organizations a new way to analyze patterns in real time and help the business side communicate better with IT and service departments.[4] CEP has since become an enabling technology in many systems that are used to take immediate action in response to incoming streams of events. As of 2018, applications are found in many sectors of business, including stock market trading systems, mobile devices, internet operations, fraud detection, the transportation industry, and governmental intelligence gathering.

The vast amount of information available about events is sometimes referred to as the event cloud.[1]

Conceptual description

Among thousands of incoming events, a monitoring system may for instance receive the following three from the same source:

  1. church bells ringing.
  2. the appearance of a man in a tuxedo with a woman in a flowing white gown.
  3. rice flying through the air.

From these events the monitoring system may infer a complex event: a wedding. CEP as a technique helps discover complex events by analyzing and correlating other events:[5] the bells, the man and woman in wedding attire and the rice flying through the air.
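
As a minimal sketch of this kind of inference (independent of any particular CEP product), the following Java class treats the three observations as typed events and reports a composite "wedding" event once all three have been seen from the same source within a time window; the event type names, the Wedding/WeddingDetector naming and the 30-minute window are illustrative assumptions, not part of any standard API:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.*;

/** Minimal illustration of inferring a complex event ("wedding") from simpler ones. */
public class WeddingDetector {
    record Event(String type, String source, Instant time) {}

    private static final Set<String> PATTERN =
            Set.of("BELLS_RINGING", "COUPLE_IN_WEDDING_ATTIRE", "RICE_IN_AIR");
    private static final Duration WINDOW = Duration.ofMinutes(30);

    // Events seen so far, grouped by source.
    private final Map<String, List<Event>> bySource = new HashMap<>();

    /** Feed one event; returns true when the full pattern is observed for one source. */
    public boolean onEvent(Event e) {
        List<Event> seen = bySource.computeIfAbsent(e.source(), s -> new ArrayList<>());
        seen.add(e);
        // Keep only events inside the time window, then check whether all
        // three constituent event types are present.
        seen.removeIf(old -> Duration.between(old.time(), e.time()).compareTo(WINDOW) > 0);
        Set<String> types = new HashSet<>();
        seen.forEach(ev -> types.add(ev.type()));
        return types.containsAll(PATTERN);   // complex event "wedding" inferred
    }

    public static void main(String[] args) {
        WeddingDetector d = new WeddingDetector();
        Instant now = Instant.now();
        d.onEvent(new Event("BELLS_RINGING", "church-cam-1", now));
        d.onEvent(new Event("COUPLE_IN_WEDDING_ATTIRE", "church-cam-1", now.plusSeconds(60)));
        boolean wedding = d.onEvent(new Event("RICE_IN_AIR", "church-cam-1", now.plusSeconds(120)));
        System.out.println("Wedding inferred: " + wedding);
    }
}
```

The "wedding" is never observed directly; it exists only as a conclusion drawn from the correlated lower-level events, which is the defining characteristic of a complex event.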

CEP relies on a number of techniques,[6] including:

  • event-pattern detection
  • event abstraction
  • event filtering
  • event aggregation and transformation
  • modeling event hierarchies
  • detecting relationships (such as causality, membership or timing) between events
  • abstracting event-driven processes

Commercial applications of CEP exist in a variety of industries and include algorithmic stock trading,[7] the detection of credit-card fraud, business activity monitoring, and security monitoring.[8]

History

The CEP area has roots in discrete event simulation, the active database area and some programming languages. The activity in the industry was preceded by a wave of research projects in the 1990s. According to Leavitt,[9] the first project that paved the way to a generic CEP language and execution model was the Rapide project at Stanford University, directed by David Luckham. In parallel, two other research projects were underway: Infospheres at the California Institute of Technology, directed by K. Mani Chandy, and Apama at the University of Cambridge, directed by John Bates. The commercial products were descendants of the concepts developed in these and some later research projects. Community efforts started in a series of event processing symposia organized by the Event Processing Technical Society, and later by the ACM DEBS conference series. One of the community efforts was to produce the event processing manifesto.[10]

CEP is used in operational intelligence (OI) products to provide insight into business operations by running query analysis against live feeds and event data. OI collects real-time data and correlates it against historical data to provide insight and analysis. Multiple sources of data can be combined to provide a common operating picture that uses current information.

In network management, systems management, application management and service management, people usually refer instead to event correlation. Like CEP engines, event correlation engines (event correlators) analyze a mass of events, pinpoint the most significant ones, and trigger actions. However, most of them do not produce new inferred events; instead, they relate high-level events with low-level events.[11]

Inference engines (e.g., rule-based reasoning engines) in artificial intelligence typically produce inferred information. However, they do not usually produce new information in the form of complex (i.e., inferred) events.

Example

A more systemic example of CEP involves a car, some sensors and various events and reactions. Imagine that a car has several sensors—one that measures tire pressure, one that measures speed, and one that detects if someone sits on a seat or leaves a seat.

In the first situation, the car is moving and the pressure of one of the tires drops from 45 psi to 41 psi over 15 minutes. As the pressure in the tire decreases, a series of events containing the tire pressure is generated. In addition, a series of events containing the speed of the car is generated. The car's event processor may detect a situation whereby a loss of tire pressure over a relatively long period of time results in the creation of the "lossOfTirePressure" event. This new event may trigger a reaction process to record the pressure loss in the car's maintenance log, and to alert the driver via the car's portal that the tire pressure has decreased.

In the second situation, the car is moving and the pressure of one of the tires drops from 45 psi to 20 psi in 5 seconds. A different situation is detected, perhaps because the loss of pressure occurred over a shorter period of time, or perhaps because the difference in values between successive events was larger than a predefined limit. This results in a new event, "blowOutTire", being generated. The new event triggers a different reaction process to immediately alert the driver and to initiate onboard computer routines to assist the driver in bringing the car to a stop without losing control through skidding.
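
A minimal sketch of how the two tire situations might be distinguished by the rate of pressure loss follows; the thresholds, window lengths and event names ("lossOfTirePressure", "blowOutTire") follow the narrative above and are otherwise illustrative assumptions rather than parameters of any real vehicle system:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Deque;

/** Classifies tire-pressure readings into "lossOfTirePressure" or "blowOutTire" events. */
public class TirePressureMonitor {
    record Reading(double psi, Instant time) {}

    private final Deque<Reading> window = new ArrayDeque<>();

    /** Feed one reading; returns the name of the inferred event, or null if nothing is detected. */
    public String onReading(Reading r) {
        window.addLast(r);
        // Retain roughly the last 15 minutes of readings.
        while (Duration.between(window.peekFirst().time(), r.time()).toMinutes() > 15) {
            window.removeFirst();
        }
        // Rapid loss: a large drop relative to any reading from the last few seconds.
        for (Reading old : window) {
            long secs = Duration.between(old.time(), r.time()).toSeconds();
            double drop = old.psi() - r.psi();
            if (secs <= 10 && drop >= 15) {
                return "blowOutTire";          // sudden, large loss
            }
        }
        // Gradual loss: a moderate drop relative to the oldest retained reading.
        Reading oldest = window.peekFirst();
        double slowDrop = oldest.psi() - r.psi();
        long slowSecs = Duration.between(oldest.time(), r.time()).toSeconds();
        if (slowDrop >= 4 && slowSecs >= 600) {
            return "lossOfTirePressure";       // slow leak over a long period
        }
        return null;
    }
}
```

Both detections are derived from the same underlying stream of pressure readings; only the time window and the magnitude of the drop differ, which is what lets the event processor emit two different higher-level events.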

In addition, events that represent detected situations can also be combined with other events in order to detect more complex situations. For example, in the final situation the car is moving normally and suffers a blown tire which results in the car leaving the road and striking a tree, and the driver is thrown from the car. A series of different situations are rapidly detected. The combination of "blowOutTire", "zeroSpeed" and "driverLeftSeat" within a very short period of time results in a new situation being detected: "occupantThrownAccident". Even though there is no direct measurement that can determine conclusively that the driver was thrown, or that there was an accident, the combination of events allows the situation to be detected and a new event to be created to signify the detected situation. This is the essence of a complex (or composite) event. It is complex because one cannot directly detect the situation; one has to infer or deduce that the situation has occurred from a combination of other events.
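
The composite detection described above can be sketched as a rule over the most recent occurrence time of each constituent event; the 10-second window and the class name are illustrative assumptions:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

/** Infers "occupantThrownAccident" from a combination of lower-level events. */
public class AccidentDetector {
    private static final Set<String> REQUIRED =
            Set.of("blowOutTire", "zeroSpeed", "driverLeftSeat");
    private static final Duration WINDOW = Duration.ofSeconds(10);

    // Last time each event type was observed.
    private final Map<String, Instant> lastSeen = new HashMap<>();

    /** Feed one event; returns true when all required events have occurred within the window. */
    public boolean onEvent(String type, Instant time) {
        lastSeen.put(type, time);
        return REQUIRED.stream().allMatch(t -> {
            Instant seen = lastSeen.get(t);
            return seen != null && Duration.between(seen, time).compareTo(WINDOW) <= 0;
        });
    }
}
```

In a production CEP engine such a rule would typically be expressed declaratively (for example as a pattern or streaming-SQL statement) rather than hand-coded, but the inference is the same: the "occupantThrownAccident" event exists only as a deduction from the simpler events.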

Integration with business process management

A natural fit for CEP has been with business process management (BPM).[12] BPM focuses on end-to-end business processes in order to continuously optimize and align them with the operational environment.

However, the optimization of a business does not rely solely upon its individual, end-to-end processes. Seemingly disparate processes can affect each other significantly. Consider this scenario: in the aerospace industry, it is good practice to monitor breakdowns of vehicles to look for trends (for example, to determine potential weaknesses in manufacturing processes or materials). Another, separate process monitors the life cycles of vehicles currently in operation and decommissions them when appropriate. One use for CEP is to link these separate processes, so that when the first process (breakdown monitoring) discovers a malfunction caused by metal fatigue (a significant event), it can trigger an action in the second process (life-cycle management) to issue a recall of vehicles built with the same batch of metal found to be faulty.

The integration of CEP and BPM must exist at two levels: the business awareness level (users must understand the potential holistic benefits of their individual processes) and the technological level (there needs to be a method by which CEP can interact with the BPM implementation). For a recent state-of-the-art review of the integration of CEP with BPM, frequently labeled event-driven business process management, see Krumeich et al.[13]

The role of computation-oriented CEP arguably overlaps with business rules technology.

For example, customer service centers use CEP for click-stream analysis and customer experience management. CEP software can factor real-time information about millions of events (clicks or other interactions) per second into business intelligence and other decision-support applications. These "recommendation applications" help agents provide personalized service based on each customer's experience. The CEP application may collect data about what customers on the phone are currently doing, or how they have recently interacted with the company in various other channels, including in-branch, or on the Web via self-service features, instant messaging and email. The application then analyzes the total customer experience and recommends scripts or next steps that guide the agent on the phone and, ideally, keep the customer happy.[14]

In financial services

The financial services industry was an early adopter of CEP technology, using complex event processing to structure and contextualize available data so that it could inform trading behavior, specifically algorithmic trading, by identifying opportunities or threats that indicate traders (or automatic trading systems) should buy or sell.[15] For example, if a trader wants to track stocks that have five upward movements followed by four downward movements, CEP technology can detect such a pattern. CEP technology can also track drastic rises and falls in the number of trades. Algorithmic trading is already an established practice in stock trading: it is estimated that around 60% of equity trading in the United States is carried out by algorithmic trades. CEP is expected to continue to help financial institutions improve their algorithms and become more efficient.
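
The "five upward movements followed by four downward movements" pattern can be sketched as a check over the sequence of recent price changes; this standalone example does not use any particular trading platform's API, and the class name and demo prices are illustrative:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

/** Detects the pattern: five consecutive up moves followed by four consecutive down moves. */
public class UpDownPatternDetector {
    private final Deque<Integer> moves = new ArrayDeque<>(); // +1 for an up move, -1 for a down move
    private double lastPrice = Double.NaN;

    /** Feed one price; returns true when the 5-up/4-down pattern completes on this tick. */
    public boolean onPrice(double price) {
        boolean matched = false;
        if (!Double.isNaN(lastPrice) && price != lastPrice) {
            moves.addLast(price > lastPrice ? 1 : -1);
            while (moves.size() > 9) {
                moves.removeFirst();            // only the last nine moves matter
            }
            matched = matchesPattern();
        }
        lastPrice = price;
        return matched;
    }

    private boolean matchesPattern() {
        if (moves.size() < 9) return false;
        List<Integer> m = List.copyOf(moves);
        for (int i = 0; i < 5; i++) if (m.get(i) != 1) return false;   // five ups
        for (int i = 5; i < 9; i++) if (m.get(i) != -1) return false;  // four downs
        return true;
    }

    public static void main(String[] args) {
        UpDownPatternDetector d = new UpDownPatternDetector();
        double[] prices = {100, 101, 102, 103, 104, 105, 104, 103, 102, 101};
        for (double p : prices) {
            if (d.onPrice(p)) System.out.println("Pattern detected at price " + p);
        }
    }
}
```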

Recent improvements in CEP technologies have made it more affordable, helping smaller firms to create trading algorithms of their own and compete with larger firms.[2] CEP has evolved from an emerging technology to an essential platform of many capital markets. The technology's most consistent growth has been in banking, serving fraud detection, online banking, and multichannel marketing initiatives.[16]

Today, a wide variety of financial applications use CEP, including profit, loss, and risk management systems, order and liquidity analysis, quantitative trading and signal generation systems, and others.

Integration with time series databases

A time series database is a software system that is optimized for handling data organized by time. Time series are finite or infinite sequences of data items, where each item has an associated timestamp and the sequence of timestamps is non-decreasing. Elements of a time series are often called ticks. The timestamps are only required to be non-decreasing (rather than strictly ascending) because, even at the fine time resolutions of some systems such as financial data sources (milliseconds, microseconds or even nanoseconds), several consecutive events may fall within the same clock tick and therefore carry equal timestamps.
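
A minimal sketch of a tick container that enforces this non-decreasing-timestamp property (equal timestamps allowed, out-of-order appends rejected); the class and method names are illustrative and not drawn from any particular time series database:

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

/** Append-only tick store that enforces non-decreasing timestamps (equal timestamps allowed). */
public class TickSeries {
    record Tick(Instant time, double value) {}

    private final List<Tick> ticks = new ArrayList<>();

    public void append(Tick t) {
        if (!ticks.isEmpty() && t.time().isBefore(ticks.get(ticks.size() - 1).time())) {
            throw new IllegalArgumentException("timestamps must be non-decreasing");
        }
        ticks.add(t);   // equal timestamps are accepted: two events in the same clock tick
    }

    public List<Tick> asList() {
        return List.copyOf(ticks);
    }
}
```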

Time series data provides a historical context to the analysis typically associated with complex event processing. This applies in any vertical industry, such as finance,[17] and can be used cooperatively with other technologies such as BPM.

Consider the scenario in finance where there is a need to understand historic price volatility to determine statistical thresholds of future price movements. This is helpful for both trade models and transaction cost analysis.

The ideal case for CEP analysis is to view historical time series and real-time streaming data as a single time continuum. What happened yesterday, last week or last month is simply an extension of what is occurring today and what may occur in the future. An example may involve comparing current market volumes to historic volumes, prices and volatility for trade execution logic. Or the need to act upon live market prices may involve comparisons to benchmarks that include sector and index movements, whose intra-day and historic trends gauge volatility and smooth outliers.
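
As a sketch of treating history and the live stream as one continuum, the following derives a volatility threshold from historical returns and then flags live price moves that exceed it; the use of a standard-deviation multiple as the threshold, and the class name, are illustrative assumptions rather than a prescribed method:

```java
import java.util.List;

/** Flags live price moves that exceed a threshold derived from historical volatility. */
public class VolatilityThreshold {
    private final double threshold;   // e.g. two standard deviations of historical returns
    private double lastPrice;

    /** historicalPrices must contain at least two prices, oldest first. */
    public VolatilityThreshold(List<Double> historicalPrices, double sigmaMultiple) {
        double[] returns = new double[historicalPrices.size() - 1];
        for (int i = 1; i < historicalPrices.size(); i++) {
            returns[i - 1] = historicalPrices.get(i) / historicalPrices.get(i - 1) - 1.0;
        }
        double mean = 0;
        for (double r : returns) mean += r;
        mean /= returns.length;
        double var = 0;
        for (double r : returns) var += (r - mean) * (r - mean);
        double stdDev = Math.sqrt(var / returns.length);
        this.threshold = sigmaMultiple * stdDev;
        this.lastPrice = historicalPrices.get(historicalPrices.size() - 1);
    }

    /** Feed one live price; returns true if the move from the previous price is unusually large. */
    public boolean onLivePrice(double price) {
        double ret = price / lastPrice - 1.0;
        lastPrice = price;
        return Math.abs(ret) > threshold;
    }
}
```

Here the historical series supplies the statistical context (the threshold), while the live stream supplies the events to be judged against it, so past and present data participate in a single analysis.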

Internet of Things and Smart Cyber-physical systems

Complex event processing is a key enabler in Internet of Things (IoT) settings and smart cyber-physical systems (CPS) as well. Processing dense and heterogeneous streams from various sensors and matching patterns against those streams is a typical task in such cases.[18] The majority of these techniques rely on the fact that the IoT system's state and its changes are represented more efficiently as a data stream than as a static, materialized model. Reasoning over such stream-based models fundamentally differs from traditional reasoning techniques and typically requires the combination of model transformations and CEP.[19]


Vendors and products

  • Apama by Software AG - monitors rapidly moving event streams, detects and analyzes important patterns, and takes action according to rules.[20]
  • Azure Stream Analytics
  • BeepBeep 3 - an open source event stream processing library with multiple extensions for various use cases, including temporal logic, finite-state machines, statistics, and more.
  • Drools Fusion
  • EVAM Streaming Analytics
  • Esper - complex event processing for Java and C# (GPLv2).
  • Esri ArcGIS GeoEvent Server
  • Feedzai - Pulse
  • GigaSpaces XAP
  • RulePoint by Informatica
  • Microsoft StreamInsight - Microsoft's CEP engine implementation[21]
  • openPDC — A set of applications for processing streaming time-series data in real-time.
  • Oracle Event Processing - for building applications to filter, correlate, and process events in real time.
  • BRMS - A rules management engine by Red Hat based on Drools
  • SAP ESP - A low-latency, rapid development and deployment platform that allows processing multiple streams of data in real time[22]
  • SAS ESP - A platform that is built for speed to analyse (apply SAS' and third-party analytics, including machine learning algorithms) millions of data records in motion (events) with low-latency response time (milliseconds and sub-milliseconds). Deployable at the edge, on premises and to the Cloud. Flexible platform that is built with openness in mind to make Analytics pervasive everywhere.[23]
  • SQLstream s-Server - a relational stream computing platform for analyzing large volumes of service, sensor, machine and log file data in real time.
  • TIBCO BusinessEvents & Streambase - CEP platform and High Performance Low Latency Event Stream Processing
  • VIATRA-CEP[24] - A model-driven CEP engine, part of the 3rd generation of the VIATRA model transformation framework
  • WebSphere Business Events
  • Siddhi - a stream processing and complex event processing Java library released under Apache License v2 that listens to events from data streams, detects complex conditions described via a streaming SQL language, and triggers actions.
  • WSO2 Stream Processor a fully open source distributed and highly available Stream Processing Server released under Apache License v2 by WSO2.
  • Apache Flink Open-source distributed stream processing framework with a CEP API[25] for Java and Scala.
  • Apache Storm Free and open source distributed realtime computation system. Storm processes unbounded streams of data in realtime.

References

  1. Luckham, David C. (2012). Event Processing for Business: Organizing the Real-Time Enterprise. Hoboken, New Jersey: John Wiley & Sons, Inc. p. 3. ISBN 978-0-470-53485-4.
  2. Bates, John, John Bates of Progress explains how complex event processing works and how it can simplify the use of algorithms for finding and capturing trading opportunities, Fix Global Trading, retrieved May 14, 2012
  3. Crosman, Penny (May 18, 2009), Aleri, Ravenpack to Feed News into Trading Algos, Wall Street & Technology
  4. McKay, Lauren (August 13, 2009), Forrester Gives a Welcoming Wave to Complex Event Processing, Destination CRM
  5. D. Luckham, "The Power of Events: An Introduction to Complex Event Processing in Distributed Enterprise Systems", Addison-Wesley, 2002.
  6. O. Etzion and P. Niblett, "Event Processing in Action", Manning Publications, 2010.
  7. Complex Event Processing for Trading, FIXGlobal, June 2011
  8. Details of commercial products and use cases
  9. Leavitt, Neal (April 2009), "Complex-Event Processing Poised for Growth", Computer, vol. 42, no. 4, pp. 17–20
  10. Chandy, Mani K.; Etzion, Opher; Ammon, Rainer von (22 December 2017). Chandy, K. Mani; Etzion, Opher; Ammon, Rainer von (eds.). "10201 Executive Summary and Manifesto – Event Processing". Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik, Germany via Dagstuhl Research Online Publication Server.
  11. J.P. Martin-Flatin, G. Jakobson and L. Lewis, "Event Correlation in Integrated Management: Lessons Learned and Outlook", Journal of Network and Systems Management, Vol. 17, No. 4, December 2007.
  12. C. Janiesch, M. Matzner and O. Müller: "A Blueprint for Event-Driven Business Activity Management", Lecture Notes in Computer Science, 2011, Volume 6896/2011, 17-28, doi:10.1007/978-3-642-23059-2_4
  13. J. Krumeich, B. Weis, D. Werth and P. Loos: "Event-Driven Business Process Management: where are we now?: A comprehensive synthesis and analysis of literature", Business Process Management Journal, 2014, Volume 20, 615-633, doi:10.1108/BPMJ-07-2013-0092
  14. Kobielus, James (September 2008), Really Happy in Real Time, Destination CRM
  15. The Rise of Unstructured Data in Trading, Aite Group, October 29, 2008
  16. Complex Event Processing: Beyond Capital Markets, Aite Group, November 16, 2011
  17. "Time Series in Finance". cs.nyu.edu.
  18. "Balogh, Dávid, Ráth, Varró, Vörös: Distributed and Heterogeneous Event-based Monitoring in Smart Cyber-Physical Systems, In 1st Workshop on Monitoring and Testing of Cyber-Physical Systems, Vienna, Austria. 2016".
  19. I. Dávid, I. Ráth, D. Varró: Foundations for Streaming Model Transformations by Complex Event Processing, International Journal on Software and Systems Modeling, pp 1--28, 2016. doi:10.1007/s10270-016-0533-1
  20. Apama Real-Time Analytics Overview Archived 2015-10-25 at the Wayback Machine. Softwareag.com. Retrieved on 2013-09-18.
  21. "Microsoft StreamInsight". technet.microsoft.com.
  22. "SAP ESP - Developers community". Archived from the original on 2015-01-05. Retrieved 2014-07-17.
  23. "SAS Event Stream Processing".
  24. "VIATRA/CEP - Eclipsepedia". wiki.eclipse.org.
  25. "Apache Flink 1.2 Documentation: FlinkCEP - Complex event processing for Flink". ci.apache.org.