EventScavenger 4.2

I recently had the ‘pleasure’ of having my EventScavenger tool used to track the security event logs of a couple of domain controllers for a ‘full’ enterprise company. It may only be a temporary arrangement to help with a specific issue, but in the process I had to take another look at how the collector service (the heart of EventScavenger) performs under really high stress. In case you don’t fully appreciate it: with full auditing for successful and failed events these event logs fill up within an hour or so – and they have been set to something like 100MB+! Events literally flood into the logs at (sometimes) 10000-plus events in a single minute.

I’ve never had any performance problems with the collectors until now. I’ve had a single collector running on a plain old PC, gathering around 50 logs or more, running 24×7 for 4-5 years. In all those cases each individual event log never had more than 20MB of data in it, and the data was almost never overwritten within an hour or so. The problem with the domain controller event logs is the sheer volume of events written ‘all the time’, plus the collector is running on a separate PC with the SQL database on a remote system shared with dozens of other systems. The end result is that events cannot (1) be read from the logs quickly enough and (2) be written to the database quickly enough. The first problem is not something I can do anything about right now – other than installing a collector on each of the domain controllers, which is not feasible as it won’t be allowed and they are Windows Core machines anyway.

To address the second issue I did some research into how to submit batches of records to the database, since doing it one by one just isn’t fast enough. I know about table-valued parameters in newer versions of SQL Server, but unfortunately the server that was provided is only SQL 2005, which does not support them. Fortunately it does support the xml data type, even as a parameter. I found an article, ‘Sending Multiple Rows to Database for Modification‘ on CodeProject, that deals specifically with SQL 2005 and describes two ways to accomplish this: using a delimited string or the xml data type. The delimited option won’t work for me as it is limited to about 4000 characters and the amount of data I deal with is orders of magnitude more than that. The xml data type allows for up to 2GB in a single shot!

I did quite a few performance tests and there is a real increase in the number of records that can be processed using batches – despite the overhead of first converting the rows to xml and then SQL Server converting it back to a table (in memory). Interestingly, increasing the batch size beyond a certain point (around 1000 in my tests) does not increase the overall throughput; it more or less stays the same. The problem with bigger batches is that you increase the chance that a failure caused by a single row makes the whole batch fail. So the best approach is a batch size that is big enough to get the benefit but not so big that a single failure becomes costly (a rough sketch of such a batching loop follows the example code below).

Example code

A basic example of the SQL xml handling:

DECLARE @xml XML
SET @xml = N'
<rs>
 <r cn="machineName" l="logName" />
 <!-- etc. -->
</rs>'
SELECT
 T.Row.value('@cn', 'VARCHAR(255)') MachineName,
 T.Row.value('@l', 'VARCHAR(255)') LogName
FROM @xml.nodes('/rs/r') AS T(Row)
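
The stored procedure called from the collector further down wraps this same shredding in an insert. A minimal sketch of what InsertEventLogEntries could look like – the EventLogEntries table and its two columns are assumptions here, the real schema carries more fields:

CREATE PROCEDURE InsertEventLogEntries
 @xml XML
AS
BEGIN
 SET NOCOUNT ON;
 -- Shred the xml batch back into rows and insert them in a single statement.
 INSERT INTO EventLogEntries (MachineName, LogName)
 SELECT
  T.Row.value('@cn', 'VARCHAR(255)'),
  T.Row.value('@l', 'VARCHAR(255)')
 FROM @xml.nodes('/rs/r') AS T(Row);
END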

To generate the xml on the client side (simplified code):

System.IO.MemoryStream outstream = new System.IO.MemoryStream();
using (System.Xml.XmlTextWriter wrt = new System.Xml.XmlTextWriter(outstream, Encoding.UTF8))
{
    wrt.Formatting = System.Xml.Formatting.Indented;
    wrt.Indentation = 1;
    wrt.IndentChar = ' ';
    wrt.WriteStartDocument();
    wrt.WriteStartElement("rs");

    // One <r> element per event log entry, with the fields written as attributes.
    foreach (EventLogEntry ele in eventEntries)
    {
        wrt.WriteStartElement("r");
        wrt.WriteAttributeString("cn", machine);
        wrt.WriteAttributeString("l", logName);
        // ... remaining attributes omitted for brevity
        wrt.WriteEndElement(); // r
    }

    wrt.WriteEndElement(); // rs
    wrt.WriteEndDocument();
    wrt.Flush();

    // Rewind the stream before handing it to the database call.
    outstream.Position = 0;
    InsertEventLogEntries(outstream);
}

public void InsertEventLogEntries(System.IO.Stream xml)
{
    string sql = "InsertEventLogEntries";
    System.Data.SqlTypes.SqlXml sxl = new System.Data.SqlTypes.SqlXml(xml);
    SqlParameter[] parms = new SqlParameter[] {
        new SqlParameter("@xml", SqlDbType.Xml) { Value = sxl }
    };
    using (SqlCommand cmnd = new SqlCommand(sql, insertConn))
    {
        cmnd.CommandType = CommandType.StoredProcedure;
        cmnd.CommandTimeout = CommandTimeout;
        cmnd.Parameters.AddRange(parms);
        try
        {
            // Pass the whole batch as a single xml parameter to the stored procedure.
            cmnd.ExecuteNonQuery();
        }
        catch { throw; }
    }
}

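Tying it together, keeping batches at roughly the 1000-entry mark mentioned earlier could look something like the sketch below. BuildBatchXml is an assumed helper standing in for the xml generation code above, not the actual EventScavenger code:

// Submit entries in chunks of roughly batchSize so that a single bad row
// can only fail one batch, not the whole run.
const int batchSize = 1000;
System.Collections.Generic.List<EventLogEntry> batch = new System.Collections.Generic.List<EventLogEntry>(batchSize);
foreach (EventLogEntry ele in eventEntries)
{
    batch.Add(ele);
    if (batch.Count >= batchSize)
    {
        using (System.IO.Stream xml = BuildBatchXml(batch))
            InsertEventLogEntries(xml);
        batch.Clear();
    }
}
if (batch.Count > 0) // submit whatever is left over
{
    using (System.IO.Stream xml = BuildBatchXml(batch))
        InsertEventLogEntries(xml);
}
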
So with this change it is possible to handle larger sets of inserts into the database. Now if only there were a way to make reading an event log faster…
