Tag Archives: Event log

Event Scavenger 5

Seems like life is just getting more hectic these days. I’ve been running version 5 of my event-gathering tool for a while now – I ‘upgraded’ the product last year already – but I never had the time to actually package it so that others can use it too.

Part of the problem was that I decided to use a new installer technology (thanks to the brilliant minds at MindLessoft, formerly known as Microsoft…) – WiX. Unfortunately WiX is not exactly a walk in the park to learn, and with limited time it was very hard to create anything that is even kind of useful.

Anyway, to make a short story long… I decided to just go ahead and publish the latest version with whatever installers I managed to create up to now (and hope for the best).

So, there is a new ‘stable’ release of Event Scavenger on the CodePlex site. Good luck and may the force be with you…

Creating custom BizTalk Event View

This is not exactly ‘new’ technology, but someone might find it useful. In recent Windows versions Microsoft enhanced the way the Event Log works and functions, plus the way you can access it. The default Event Viewer inside Windows supports creating custom ‘Views’ with which you can filter events based on your needs. If you know your XML syntax you can create some nice views for yourself. I created one that filters all BizTalk-related events and can simply be imported and used as is. The XML looks like this:

<ViewerConfig>
  <QueryConfig>
    <QueryParams>
      <UserQuery />
    </QueryParams>
    <QueryNode>
      <Name>BizTalk Server</Name>
      <QueryList>
        <Query Id="0" Path="Application">
          <Select Path="Application">*[System[Provider[@Name='BizTalk Server']]]</Select>
          <Select Path="Application">*[System[Provider[@Name='BizTalk Server Deployment']]]</Select>
          <Select Path="Application">*[System[Provider[@Name='ENTSSO']]]</Select>
        </Query>
      </QueryList>
    </QueryNode>
  </QueryConfig>
  <ResultsConfig>
    <Columns>
      <Column Name="Level" Type="System.String" Path="Event/System/Level" Visible="">120</Column>
      <Column Name="Keywords" Type="System.String" Path="Event/System/Keywords">70</Column>
      <Column Name="Date and Time" Type="System.DateTime" Path="Event/System/TimeCreated/@SystemTime" Visible="">170</Column>
      <Column Name="Source" Type="System.String" Path="Event/System/Provider/@Name" Visible="">99</Column>
      <Column Name="Event ID" Type="System.UInt32" Path="Event/System/EventID" Visible="">80</Column>
      <Column Name="Task Category" Type="System.String" Path="Event/System/Task" Visible="">80</Column>
      <Column Name="User" Type="System.String" Path="Event/System/Security/@UserID">50</Column>
      <Column Name="Operational Code" Type="System.String" Path="Event/System/Opcode">110</Column>
      <Column Name="Log" Type="System.String" Path="Event/System/Channel">80</Column>
      <Column Name="Computer" Type="System.String" Path="Event/System/Computer">170</Column>
      <Column Name="Process ID" Type="System.UInt32" Path="Event/System/Execution/@ProcessID">70</Column>
      <Column Name="Thread ID" Type="System.UInt32" Path="Event/System/Execution/@ThreadID">70</Column>
      <Column Name="Processor ID" Type="System.UInt32" Path="Event/System/Execution/@ProcessorID">90</Column>
      <Column Name="Session ID" Type="System.UInt32" Path="Event/System/Execution/@SessionID">70</Column>
      <Column Name="Kernel Time" Type="System.UInt32" Path="Event/System/Execution/@KernelTime">80</Column>
      <Column Name="User Time" Type="System.UInt32" Path="Event/System/Execution/@UserTime">70</Column>
      <Column Name="Processor Time" Type="System.UInt32" Path="Event/System/Execution/@ProcessorTime">100</Column>
      <Column Name="Correlation Id" Type="System.Guid" Path="Event/System/Correlation/@ActivityID">85</Column>
      <Column Name="Relative Correlation Id" Type="System.Guid" Path="Event/System/Correlation/@RelatedActivityID">140</Column>
      <Column Name="Event Source Name" Type="System.String" Path="Event/System/Provider/@EventSourceName">140</Column>
    </Columns>
  </ResultsConfig>
</ViewerConfig>

To use it, simply copy/paste the XML into a file and use the ‘Import Custom View…’ context menu option in the standard Windows Event Viewer.

Of course, if you want a proper event viewing solution over a whole group of servers, use my Event Scavenger tool 😉

Event Scavenger 4.4.1

I created a new release with only viewer application changes. *.elvw files are now associated with the viewer in Windows Explorer, so you can launch them from there (or from the Windows 7 jump list) and start the application with a predefined set of filters.


Event Scavenger 4.4

Another update to this ‘old workhorse‘ tool of mine. This time just a few UI changes to the viewer app. I changed the interface to be much more compact so that the essence of the app – the list of events – is the most prominent element by default. The detail filters can be accessed by pressing F6 or clicking the button above the event list.

Except for the admin module, where only the ‘About’ window was modified, nothing else was changed.

As luck would have it, just after creating the 4.4 release I started thinking of another improvement… again for the viewer… being able to open .elvw files from Windows Explorer with the viewer app. That way you can create shortcuts to different views (say, on the taskbar) or use the jump list in Windows 7.

EventScavenger updates

I’ve added new installers for the two Windows services in EventScavenger – one for x86 and one for x64. These were just added to the latest recommended download on the site. They include the ‘self-installing’ functionality, with which you can register the services with the Service Control Manager using the “-install” or “-uninstall” parameters.
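
The ‘self-installing’ switch handling amounts to a dispatch on the command-line arguments before normal service startup. A minimal sketch of the idea (in Python, with invented function names – not the actual service code, which would call into the Windows Service Control Manager):

```python
def install_service():
    # Hypothetical placeholder: a real service would register itself
    # with the Service Control Manager here.
    return "installed"

def uninstall_service():
    # Hypothetical placeholder for the matching unregister step.
    return "uninstalled"

def run_service():
    # Normal startup path when no switch was given.
    return "running"

def dispatch(args):
    """Pick an action based on the -install/-uninstall switches."""
    switches = {a.lower() for a in args}
    if "-install" in switches:
        return install_service()
    if "-uninstall" in switches:
        return uninstall_service()
    return run_service()
```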

Event collector

Ok, I’ve built a working service that utilizes the event-driven functionality built into .NET’s EventLog class – and not the newer System.Diagnostics.Eventing.Reader classes, because I’m having problems getting those to properly return all the data I require.

This new service (a Windows service) runs alongside the existing EventScavenger service. It does not replace it, since there is core functionality it lacks (and always will). For example, there is no guarantee that it will capture every event generated in the event log, and any entries logged while it is not running will be missed – including past entries. So I’m contemplating running both services side by side, first for testing but probably also in the future. The only real benefit of the new service is that it keeps the data stored in the database much more ‘up to date’, in case that is what you want. However, EventScavenger was originally created to do the opposite – to be able to look up events from the past and do some reporting on them. Perhaps the combination of the old and new services together can be of use to some.

To poll or not to poll, what am I polling?

This is a question many have asked, but the answer is never right for all scenarios. The question has come up again with my EventScavenger system, and the answer is not obvious. There are benefits to both polling and not polling (usually by having something that raises events). This is really an architectural question and I’m kind of just blamestorming by myself (blaming myself, among other things).

A quick overview of the definitions (in the context of my scenario):

What is polling?

Usually this is an action where something running locally periodically accesses something remote, gathers data, does some work (stores the data), and then goes back to a sleep or waiting state until the next scheduled time.
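
That definition can be sketched as a small loop. This is a simplified illustration (in Python, with made-up callback names – not EventScavenger’s actual collector code):

```python
import time

def poll_forever(fetch_new_entries, store, interval_seconds, stop):
    """Generic poll loop: wake, gather everything since the last poll, sleep.

    fetch_new_entries(last_position) -> (entries, new_position)
    store(entries) persists the batch; stop() signals shutdown.
    """
    position = None  # e.g. last record number read from the event log
    while not stop():
        entries, position = fetch_new_entries(position)
        if entries:
            store(entries)
        # Wait until the next scheduled poll; nothing is held open meanwhile.
        time.sleep(interval_seconds)
```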

What is the alternative (not polling)?

Usually this involves something at the source pushing the data you need to you (or to the data storage location where you can access it). You must also have something that ‘subscribes’ to this event in order to capture it.
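
The push alternative boils down to the observer pattern: subscribers register a callback and the source invokes it as each event happens. A toy sketch (illustrative Python, not a real Windows API):

```python
class EventSource:
    """Push model: the source calls subscribers as each event happens."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        # Register something that 'subscribes' to the event.
        self._subscribers.append(callback)

    def raise_event(self, entry):
        # If nobody is subscribed at this moment, the entry is simply
        # lost -- the 'missed events' drawback of the push model.
        for callback in self._subscribers:
            callback(entry)
```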

What are the benefits of each approach?

Polling:

  • Not active the whole time – i.e. no open network connections, resource handles etc. It spends its time ‘waiting’ until the next scheduled poll.
  • Doesn’t require remote software to be installed (something that subscribes to the events).
  • No ‘missed’ events to worry about. On each poll you gather all relevant data since the previous poll anyway.
  • Easier to administer, as it is usually a central component/service.

Not polling (Events):

  • Data gets transferred/stored ‘as it happens’.
  • No big batches of data sent over the network – each event’s data is sent on its own.
  • No network traffic while there is no data to report.
  • The number of remote locations is not (so much) limited by what a single ‘poller’ can handle.

Unfortunately there is no golden ‘middle way’. It’s either the one approach or the other (per resource you want to access). The current approach for EventScavenger is plain old polling (each event log is handled on a separate thread). This works well up to a point, until (1) the number of threads becomes too large, or (2) the number of events per ‘poll’ gets so large that the particular thread cannot process them quickly enough. To overcome these problems you can install an instance (a ‘collector’, as it is named in the EventScavenger context) on the source machine. That can work if you have access to the machine (are allowed to install software), but then you might as well ask: why not use the event-driven approach?
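
One common way to soften the thread-per-log limit is a bounded worker pool, where a fixed number of threads service many logs. A sketch of the idea (hypothetical names; not how the collector is actually structured):

```python
from concurrent.futures import ThreadPoolExecutor

def poll_once(log_name):
    # Hypothetical stand-in for reading one remote event log;
    # returns the log name and whatever new entries were found.
    return log_name, []

def poll_all(log_names, max_workers=8):
    """Poll many logs with a bounded pool instead of a thread per log.

    The pool size caps resource usage, at the cost of logs queueing
    behind each other when there are more logs than workers.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(poll_once, log_names))
```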

The flip side of subscribing to events is that you must have something installed ‘locally’ on the resource (the event log in this case) to access the data. That requires something extra installed and/or running there. Also, if for any reason the subscriber is down or busy, the events raised in the meantime are missed, with no way to get them back afterwards.

I’ve been looking at the EventLog.EntryWritten event and also the whole new System.Diagnostics.Eventing.Reader.EventLogWatcher class. Both only work for local event logs, and I’m having issues with the newer EventRecord class, which does not give me the proper full description message (it always returns null). Perhaps I have to look at an additional component that runs alongside the existing ‘collector’ services to gather ‘problem child’ event logs – but on the actual source machine and just for raised events. This could help other people, but I’m in a situation where I cannot install anything on the sources of some of the logs that must be imported, since they are ‘off-limits’ (domain controllers and Windows Core installs).

Why can’t life be easy? 😉

EventScavenger 4.2

I recently had the ‘pleasure’ of having my EventScavenger tool used to track the security event logs of a couple of domain controllers for a ‘full’ enterprise company. It may only be temporary, to help with a specific issue, but in the process I had to re-examine how the collector service (the heart of EventScavenger) performs under really high stress. In case you don’t fully appreciate it: with full auditing of successful and failed events, these event logs fill up within an hour or so – and they have been set to something like 100MB+! Events literally flood into the logs at (sometimes) more than 10,000 events in a single minute.

Now, I’ve never had any performance problems with the collectors until now. I’ve had a single collector running on a plain old PC gathering around 50 logs or more, running 24×7 for 4-5 years. In all those cases each individual event log never had more than 20MB of data in it, and the data almost never got overwritten within an hour or so. The problem with the domain controller event logs is the sheer volume of events being written ‘all the time’, plus the collector runs on a separate PC with the SQL database on a remote system that is shared with dozens of other systems. The end result is that events cannot (1) be read from the logs quickly enough and (2) be written to the database quickly enough. The first problem is not something I can do anything about right now – other than having a collector installed on each of the domain controllers, which is not feasible: it won’t be allowed, and they are Windows Core machines anyway.

To address the second issue I did some research into submitting batches of records to the database, since doing it one by one just isn’t fast enough. I know about table-valued parameters in newer versions of SQL Server, but unfortunately the server that was provided is only SQL 2005, which does not support them. Fortunately it does support the xml data type, even as a parameter. I found an article, ‘Sending Multiple Rows to Database for Modification‘, on CodeProject that deals specifically with SQL 2005 and describes two ways to accomplish this: using a delimited string or the xml data type. The delimited option won’t work for me as it is limited to about 4,000 characters, and the amount of data I deal with is orders of magnitude more than that. The xml data type allows up to 2GB in a single shot!

I did quite a few performance tests and there is a real increase in the number of records that can be processed using batches – despite the overhead of first converting the rows to XML and then having SQL Server convert it back to a table (in memory). Interestingly, increasing the batch size beyond a certain point (1,000 in my tests) does not increase overall throughput; it stays more or less linear. The downside of bigger batches is that you increase the chance that a single failure caused by one row makes the whole batch fail. So the best option is a batch size that is big enough to warrant the benefit, but not so big that a failure becomes costly.
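
That trade-off – batches big enough to gain throughput, small enough to limit the blast radius of one bad row – can be handled by falling back to row-by-row inserts when a batch fails. A language-agnostic sketch (in Python, with stand-in functions for the actual database calls):

```python
def insert_in_batches(rows, insert_batch, insert_row, batch_size=1000):
    """Send rows in fixed-size batches; fall back to row-by-row when a
    batch fails, so one bad row only loses itself, not the whole batch.

    insert_batch/insert_row are hypothetical stand-ins for the database
    calls. Returns the number of rows successfully stored.
    """
    stored = 0
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        try:
            insert_batch(batch)
            stored += len(batch)
        except Exception:
            # The whole batch was rejected; retry each row on its own.
            for row in batch:
                try:
                    insert_row(row)
                    stored += 1
                except Exception:
                    pass  # log and skip only the offending row
    return stored
```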

Example code

A basic example of the SQL XML handling:

DECLARE @xml XML
SET @xml = N'<rs>
  <r cn="machineName" l="logName" /> <!-- etc. -->
</rs>'

SELECT T.Row.value('@cn', 'VARCHAR(255)') MachineName,
       T.Row.value('@l', 'VARCHAR(255)') LogName
FROM   @xml.nodes('/rs/r') AS T(Row)

To generate the XML on the client side (simplified code):

System.IO.MemoryStream outstream = new System.IO.MemoryStream();
using (System.Xml.XmlTextWriter wrt = new System.Xml.XmlTextWriter(outstream, Encoding.UTF8))
{
    wrt.Formatting = System.Xml.Formatting.Indented;
    wrt.Indentation = 1;
    wrt.IndentChar = ' ';

    wrt.WriteStartElement("rs"); // root element for the row set
    foreach (EventLogEntry ele in eventEntries)
    {
        wrt.WriteStartElement("r"); // one row per event log entry
        wrt.WriteAttributeString("cn", machine);
        wrt.WriteAttributeString("l", logName);
        // ... write the remaining attributes of the entry here ...
        wrt.WriteEndElement(); //r
    }
    wrt.WriteEndElement(); //rs

    wrt.Flush();
    outstream.Position = 0;
}


public void InsertEventLogEntries(System.IO.Stream xml)
{
    string sql = "InsertEventLogEntries"; // stored procedure name
    System.Data.SqlTypes.SqlXml sxl = new System.Data.SqlTypes.SqlXml(xml);
    SqlParameter[] parms = new SqlParameter[] {
        new SqlParameter("@xml", SqlDbType.Xml) { Value = sxl }
    };
    using (SqlCommand cmnd = new SqlCommand(sql, insertConn))
    {
        cmnd.CommandType = CommandType.StoredProcedure;
        cmnd.CommandTimeout = CommandTimeout;
        cmnd.Parameters.AddRange(parms);
        cmnd.ExecuteNonQuery();
    }
}


So with this change it is possible to handle much larger sets of inserts into the database. Now if only there were a way to make reading an event log faster…