Database security via event stream processing
Content Copyright © 2006 Bloor. All Rights Reserved.
I wrote recently about the potential for using event stream processing (or complex event processing) for database security. This is precisely what Symantec will be offering later this summer (the exact date has yet to be announced). Moreover, this database security product, which will also provide auditing capability, will be delivered as an appliance.
Let me first deal with the appliance part of this equation. What this means is that Symantec will deliver a combined hardware and software product, with the latter having been pre-installed and pre-configured so that there are no set-up issues: you simply plug it in and go.
You plug it into the network, where it monitors all traffic that goes into and out of the database. For the first few days after installation it will apparently not do much. This is because it is monitoring the behaviour of the various users that access the database and building up patterns of activity that it can recognise. Once it has done so it can report on any anomalies that occur, ranging from authorised users doing odd things (according to the US Secret Service, 78% of all fraud is conducted by authorised users) to SQL injection and application server hacks.
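To make the baselining idea concrete, here is a minimal sketch in Python of how such behavioural profiling might work. It is purely illustrative and not Symantec's implementation; the event fields (user, statement type, table, hour) and the learning threshold are my own assumptions.

```python
from collections import defaultdict

# Illustrative sketch of behavioural baselining: learn each user's normal
# activity during a training window, then flag departures from it.
# The event fields and the learning threshold are assumptions, not the
# product's actual schema.

class BehaviourBaseline:
    def __init__(self, learning_events=10000):
        self.learning_events = learning_events
        self.seen = 0
        self.profile = defaultdict(lambda: {"tables": set(),
                                            "statements": set(),
                                            "hours": set()})

    def observe(self, event):
        """Feed one monitored database event (a dict); return anomalies, if any."""
        user = event["user"]
        p = self.profile[user]
        if self.seen < self.learning_events:
            # Learning phase: just build up the per-user profile.
            p["tables"].add(event["table"])
            p["statements"].add(event["statement"])
            p["hours"].add(event["hour"])
            self.seen += 1
            return None
        # Detection phase: report anything outside the learned pattern.
        anomalies = []
        if event["table"] not in p["tables"]:
            anomalies.append("unusual table: %s" % event["table"])
        if event["statement"] not in p["statements"]:
            anomalies.append("unusual statement type: %s" % event["statement"])
        if event["hour"] not in p["hours"]:
            anomalies.append("unusual time of day: %02d:00" % event["hour"])
        return anomalies or None
```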
This last is particularly important because CRM, ERP and other such applications have global access to the database, so hacking the application server is an easy way to break into the system.
Another important capability is what Symantec refers to as extrusion detection: that is, detecting anomalous data leaving the database (as opposed to people getting into it). As an example, the company suggests setting up one or more dummy records in the database. There is no good reason why anyone should access these and, therefore, anyone who does so may be doing so for nefarious purposes.
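A minimal sketch of the dummy-record idea, again only illustrative: the planted customer IDs, record layout and alert text are assumptions, not Symantec's actual mechanism.

```python
# Plant records that no legitimate query should return, and alert whenever
# one of them appears in an outgoing result set.

DUMMY_CUSTOMER_IDS = {"CUST-99990001", "CUST-99990002"}  # planted rows

def check_outgoing_rows(user, rows):
    """Inspect rows leaving the database; return an alert if a dummy row is among them."""
    leaked = [r for r in rows if r.get("customer_id") in DUMMY_CUSTOMER_IDS]
    if leaked:
        return ("EXTRUSION ALERT: user %s retrieved %d dummy record(s) "
                "that no legitimate query should touch" % (user, len(leaked)))
    return None

# Example: an export that sweeps up a planted record trips the alert.
print(check_outgoing_rows("jsmith",
                          [{"customer_id": "CUST-12345"},
                           {"customer_id": "CUST-99990001"}]))
```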
All alerting and reporting is done in real time, as is the monitoring of traffic entering and leaving the database, which is where the event streaming aspect of the product comes in. By contrast, the traditional way to provide this sort of information (which includes conventional auditing: who did what and when) is to use database log files. However, full database logging is normally turned off because it usually kills performance and, in any case, database logs are not easy to read. At best, they are only useful for after-the-fact analysis.
As with other event streaming products, you can define filters to tell the appliance which traffic you are not interested in and, similarly, you can define policy rules to be applied to the incoming or outgoing information that you are interested in.
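For illustration, here is a rough sketch of what filters and policy rules might look like, assuming an invented event format and rule syntax rather than the product's own configuration language.

```python
# Filters drop traffic you are not interested in; policy rules raise alerts
# for incoming or outgoing activity that you are interested in. The rule
# definitions below are invented for the example.

FILTERS = [
    lambda e: e["application"] == "nightly_batch",   # ignore the batch job
]

POLICY_RULES = [
    ("bulk export of card numbers",
     lambda e: e["table"] == "payment_cards" and e["rows_returned"] > 100),
    ("schema change outside office hours",
     lambda e: e["statement"] == "DDL" and not 8 <= e["hour"] < 18),
]

def process_event(event):
    """Drop filtered traffic, then return the names of any matching policy rules."""
    if any(f(event) for f in FILTERS):
        return []                      # not interesting: discard
    return [name for name, rule in POLICY_RULES if rule(event)]

# A large card-number extract from an unfiltered application trips a rule.
print(process_event({"application": "crm", "table": "payment_cards",
                     "rows_returned": 5000, "statement": "SELECT", "hour": 3}))
```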
As far as I can see, and bearing in mind that this will be a first release of a product that will no doubt add capabilities as time goes by, the only drawback to the appliance is that it uses proprietary storage mechanisms. While efficient from a storage and administrative perspective, this has a downside when it comes to analysis of the collected data. There are built-in query and reporting capabilities, but if you wanted to apply a business intelligence or data mining tool to the collected data for more detailed analysis, then you would have to export the data to an external (ODBC-compliant) data mart. However, this seems a relatively small price to pay for the peace of mind that this product should bring.
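To give a feel for that export step, here is a hedged sketch that loads a hypothetical CSV export from the appliance into an external ODBC-compliant mart using Python's pyodbc; the DSN, file name and table layout are all assumptions for the example.

```python
import csv
import pyodbc  # ODBC bindings; the external mart just needs an ODBC driver

# Load a hypothetical CSV export from the appliance into an external,
# ODBC-compliant data mart so that BI or data mining tools can be applied.

conn = pyodbc.connect("DSN=security_mart;UID=loader;PWD=secret")
cursor = conn.cursor()

with open("appliance_export.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    cursor.executemany(
        "INSERT INTO db_activity (event_time, db_user, statement, alert_level) "
        "VALUES (?, ?, ?, ?)",
        [tuple(row) for row in reader])

conn.commit()
conn.close()
```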