We've briefly discussed what we're doing with SAM and its relationship to BAM/BI, but it's worth going over some of this in a bit more detail.
For instance, how does SAM relate to Business Service Management (BSM)? Well, according to the common definitions of BSM, it is aimed at helping users (e.g., administrators, project managers) tell their management software which services, tasks etc. are the most important for the business. BSM then enables them to correlate the performance and availability of those systems with their business goals, identifying when an application, service etc. is not behaving as expected. Through some magic, the BSM system identifies the cause(s) of the breach in contract and how to fix it (them).
So is SAM a solution to BSM? Not quite. But it should be easy to see how SAM can be a critical component in the development of BSM.
What about infrastructure monitoring? Well, we did cover this in the very first posting about SAM. In essence the answer is the same as for BSM: SAM should be at the core of all infrastructure monitoring, receiving and collating information (data streams) from components, routers, processors, etc. Are these data streams real-time? Well, let's ignore the fundamental problems with the limitation of the speed of light and simultaneity. (Everyone should be made to read Lamport's classic paper on the subject as it relates to computing.) Let's also ignore the differences between hard real-time and soft real-time. The answer is yes AND no: of course most of the information streaming into the SAM implementation will be coming "as it happens", but we're allowed to provide archival data that may have been taken hours, days or weeks previously. In fact, in order to support the right kind of correlation, archival data, whether provided by the SAM user or captured by SAM itself (since it is based on CEP principles), is critically important. This is also where the Bayesian Inference Network aspect of SAM comes in. Strict binary triggers will no longer be sufficient for the kinds of networks we see today and in the future.
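To make that last point concrete, here's a minimal sketch of what a probabilistic trigger might look like; the class and method names are purely illustrative (they are not part of any SAM API), and the arithmetic is a stand-in for a real inference step. The idea is simply that the decision to fire is weighed against an archival baseline rather than a hard-coded binary threshold.

```java
import java.util.List;

// Illustrative sketch: a trigger that fires on a probability derived from
// archival data, rather than a strict binary test such as "reading > 80".
public class ProbabilisticTrigger {

    // Archival readings gathered hours, days or weeks earlier,
    // supplied by the SAM user or captured by SAM itself.
    private final List<Double> baseline;

    public ProbabilisticTrigger(List<Double> baseline) {
        this.baseline = baseline;
    }

    // Rough probability that the live reading is anomalous, based on how far
    // it sits from the archival mean (a placeholder for real inference).
    public double anomalyProbability(double liveReading) {
        double mean = baseline.stream()
                .mapToDouble(Double::doubleValue).average().orElse(liveReading);
        double variance = baseline.stream()
                .mapToDouble(v -> (v - mean) * (v - mean)).average().orElse(1.0);
        double stdDev = Math.sqrt(Math.max(variance, 1e-9));
        double z = Math.abs(liveReading - mean) / stdDev;
        // Map the z-score into (0, 1): the further from the baseline,
        // the more likely we are to trigger.
        return 1.0 - Math.exp(-z);
    }

    public boolean shouldTrigger(double liveReading, double threshold) {
        return anomalyProbability(liveReading) >= threshold;
    }
}
```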
So what constitutes the data of an event? An event message contains the multi-dimensional data we need in order to correlate and visualize it. Of course there has to be a way of distinguishing "when" something (an event) happened. That could be stated explicitly in the data stream or implied by the local time at which the message is received. Then we need to figure out "what" is being reported, i.e., which data is being analyzed. Which brings us to the actual data itself. Of course one message may contain multiple readings, e.g., the temperature at a sensor as recorded over 5 different intervals. Given all of this information (which is common to most other monitoring techniques), we can start to build up a map of what is going on in the system and trigger on any desired event, even if triggering requires correlating across a multitude of input streams. (In order to make the architecture symmetrical, we'll actually consider time as an input stream to SAM as well.)
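Here's a minimal sketch of those "when/what/data" dimensions as a data structure; the type and field names are assumptions for illustration, not a published SAM message format. Note how the timestamp may come from the producer or default to the local receive time, and how a single message can carry several readings.

```java
import java.time.Instant;
import java.util.List;

// Illustrative sketch of an event message: when it happened, what it refers to,
// and one or more readings carried in the same message.
public final class EventMessage {
    private final Instant when;        // explicit in the stream, or the local receive time
    private final String what;         // which component/sensor/service the data refers to
    private final List<Double> data;   // e.g. a sensor's temperature over 5 intervals

    public EventMessage(Instant when, String what, List<Double> data) {
        // If the producer supplied no timestamp, fall back to the time we received it.
        this.when = (when != null) ? when : Instant.now();
        this.what = what;
        this.data = List.copyOf(data);
    }

    public Instant when() { return when; }
    public String what()  { return what; }
    public List<Double> data() { return data; }
}
```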
But what about visualization? Well, we've already seen the start of what we want to do with SAM. The BPM console is just one graphical view onto the data that we are accumulating and correlating. But there will be the equivalent of a BAM console: the SAM console. How you would view this information to obtain the best and most intuitive representation is an ongoing effort in its own right. For example, customizable home pages displaying the most important graphs, data etc. for each user. The notion of mimic diagrams will also be interesting to explore. In all likelihood, because of the inherent flexibility of the SAM infrastructure it'll be impossible to cater for all of the different ways in which the information may be displayed, so there'll need to be a combination of common out-of-the-box views as well as a toolkit of components that allow for the easy construction of other views.
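One way that combination could hang together is for out-of-the-box views and toolkit-built views to share a single small contract, so a per-user home page is just a list of registered views over the same correlated data. The sketch below is purely illustrative (no such SAM console API exists yet) and reuses the hypothetical EventMessage type from the earlier sketch.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative contract shared by built-in and user-constructed views.
interface SamView {
    String title();
    void render(List<EventMessage> correlatedEvents); // a graph, mimic diagram, table, ...
}

// Illustrative console: custom views register the same way as the defaults.
class SamConsole {
    private final List<SamView> views = new ArrayList<>();

    void register(SamView view) { views.add(view); }

    void refresh(List<EventMessage> correlatedEvents) {
        for (SamView view : views) {
            view.render(correlatedEvents);
        }
    }
}
```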
In a later blog posting we'll look at how SAM can be important in the support of cloud computing.