Wednesday, December 23, 2009

On common misconceptions about event processing - the complexity misconception

David Luckham coined the term "complex event processing", and it has caught on as the marketing term behind many of the vendors that provide event processing platforms (comment: IBM, and recently Progress/Apama, have moved to the term "business event processing"). While the term has succeeded in getting traction, it is also the source of one of the common misconceptions. Luckham talked about complex events and their processing: a complex event is defined as an "abstraction of one or more other events" (a definition that admits its own interpretations about the nature of abstraction, but is the easier one to understand). Some people, however, read the term as the complex processing of events, and some just view it as the intersection between event processing and "complex systems". The misconception arises because the second reading, "complex" processing of events, is the more intuitive one, and it brings us to the question -- what is complexity? There can be different dimensions of complexity.
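
To make the "abstraction" reading concrete, here is a minimal Python sketch of a complex event derived from simpler ones; the event names and the three-failures threshold are illustrative assumptions, not taken from any particular product:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Event:
    kind: str
    payload: dict

@dataclass
class ComplexEvent(Event):
    # The member events this complex event abstracts.
    members: List[Event] = field(default_factory=list)

def derive_suspicious_accounts(events: List[Event]) -> List[ComplexEvent]:
    """Abstract runs of LoginFailed events into one SuspiciousAccount event.

    Consumers can react to the complex event without ever seeing the
    individual failures it summarizes. (Hypothetical example.)
    """
    by_account: Dict[str, List[Event]] = {}
    for e in events:
        if e.kind == "LoginFailed":
            by_account.setdefault(e.payload["account"], []).append(e)
    return [
        ComplexEvent(kind="SuspiciousAccount",
                     payload={"account": acct, "failures": len(group)},
                     members=group)
        for acct, group in by_account.items()
        if len(group) >= 3  # assumed threshold, for illustration only
    ]
```

Note that the complex event is defined by what it abstracts, not by how hard it was to compute -- the derivation above is trivial.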

  1. The complexity may stem from the fact that we don't know exactly what we are looking for, and are generally looking for anomalies in the system (e.g., trying to find security violations); here some AI techniques have to be applied.
  2. Another case is that we know which patterns or aggregations we are using, but they require complex computation in terms of functional capabilities (see the sketch after this list).
  3. Another case is that the complexity lies in some non-functional requirement: scalability in several directions (scale-up or scale-down), strict real-time performance constraints, a highly distributed system, etc.
  4. Another case of complexity is interoperability: the need to obtain events from many producers and deliver events to many consumers, which requires instrumentation or modification of a lot of legacy systems.
  5. Yet another case of complexity is unreliable event sources, which requires handling false positives and false negatives.
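
As an example of case 2, here is a sketch of a temporally constrained pattern: detect a price spike followed by a large trade in the same instrument within a short window. The event shapes, the 30-second window, and the thresholds are all assumptions made up for illustration:

```python
from typing import Dict, List, Tuple

# Each event is (timestamp_seconds, kind, instrument); a made-up shape.
Event = Tuple[float, str, str]

WINDOW = 30.0  # assumed window length, in seconds

def spike_then_large_trade(events: List[Event]) -> List[Tuple[str, float]]:
    """Report (instrument, trade_time) whenever a LargeTrade follows a
    PriceSpike on the same instrument within WINDOW seconds.

    Unlike plain filtering, this needs per-key state and time-ordered
    matching -- the kind of functional complexity meant in case 2.
    """
    last_spike: Dict[str, float] = {}   # instrument -> time of latest spike
    matches: List[Tuple[str, float]] = []
    for ts, kind, instrument in sorted(events):  # process in time order
        if kind == "PriceSpike":
            last_spike[instrument] = ts
        elif kind == "LargeTrade":
            spike_ts = last_spike.get(instrument)
            if spike_ts is not None and ts - spike_ts <= WINDOW:
                matches.append((instrument, ts))
    return matches
```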

There are probably more complexity cases; the interesting question, however, is whether the main goal of an event processing system is to solve a "complex" problem.

With my scientist hat on, it is definitely more exciting to solve complex problems, better yet, problems of a type never seen before. From a pragmatic point of view, however, event processing applications are measured by their business value, and there can be a lot of business value in applying event processing to systems that have none of these complexity characteristics. From a complexity point of view such applications can be quite simple, and there may be nothing exciting about the implementation, since it is similar to implementations already done; yet measured by "business value" they bring a lot of value.

The value metric is thus orthogonal to any complexity metric, and indeed many of the applications in which event processing technology is very useful are quite simple (according to one analyst report, the "simple" applications are 80-90% of the potential market for event processing technology). While there is certainly a segment of applications for each type of complexity, and more work is required in these directions, the "simple" applications will be the bread and butter.
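
To illustrate what "simple" means here, a sketch of such a bread-and-butter rule: just filtering and routing, with no pattern detection or windows (the event fields, threshold, and queue names are hypothetical):

```python
from typing import Optional

def route_order_event(event: dict) -> Optional[str]:
    """Route large orders to an approvals queue, the rest to fulfillment.

    A single stateless rule like this scores zero on every complexity
    dimension above, yet may carry real business value. (Hypothetical names.)
    """
    if event.get("kind") != "OrderPlaced":
        return None                      # not our event: ignore it
    if event["amount"] > 10_000:
        return "queue://approvals"       # needs a human decision
    return "queue://fulfillment"         # straight-through processing
```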

More misconceptions - later.
