Tag Archives: Projects

QuickMon 3.7

It has been a while since I mentioned any updates of my QuickMon monitoring tool. It has now reached version 3.7 with many new and improved features. There are so many that I can’t remember all of them now but here are some highlights:

  • Context menu for Collectors (and Notifiers) improved
  • View statistics of collector polls
  • Copy and Paste of Collectors or whole branches in the tree view
  • General UI has been simplified (several times)
  • Remote Agents (query resources via another machine)
  • An OleDb query collector has been added (with limitations)
  • Restoration scripts (opposite or corrective scripts)
  • Alert suppression based on # of polls
  • Ability to register the Service (and remote host) plus add firewall exception rule from application itself
  • Some minor fixes to existing collectors


Get it from CodePlex here.

QuickMon 3

QuickMon 3 has been released (3.1 is already out)

Hope you like it and please give feedback. To see some screen shots look at this.

QuickMon 2.13

Just a quick heads-up for any QuickMon users. A new version was added recently with a couple of small but nice improvements.

Some of the changes:

  • The UI client now automatically refreshes the config of ‘Detail’ windows (if the particular Collector’s config was edited)
  • The WMI collector was basically rewritten to improve it and to add a little WMI query builder interface. This helps people who are not experts in WMI syntax or who don’t know all the classes/properties that are available.
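
As background, querying WMI from C# goes through the System.Management namespace, which is the kind of plumbing a collector like this sits on top of. A minimal sketch of running a WQL query (illustrative only, not QuickMon’s actual collector code):

```csharp
// Minimal WQL query sketch using System.Management (reference System.Management.dll).
// Illustrative only - not QuickMon's actual collector code.
using System;
using System.Management;

class WmiQuerySample
{
    static void Main()
    {
        var searcher = new ManagementObjectSearcher(
            @"root\cimv2",                                     // default local namespace
            "SELECT Name, FreeSpace FROM Win32_LogicalDisk");  // the WQL query
        foreach (ManagementObject disk in searcher.Get())
        {
            Console.WriteLine("{0}: {1} bytes free", disk["Name"], disk["FreeSpace"]);
        }
    }
}
```

A query builder UI essentially just helps you compose that WQL string and pick the class/property names.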


Event Scavenger 5

Seems like life is just getting more hectic these days. I’ve been running version 5 of my event gathering tool for a while now – I ‘upgraded’ the product last year already – but never had the time to actually package it so others can also use it.

Part of the problem was that I decided to use a new installer technology (thanks to the brilliant minds at MindLessoft, formerly known as Microsoft…) – WiX. Unfortunately WiX is not exactly a walk in the park to learn, and with limited time it was very hard to create anything that is kinda useful.

Anyway, to make a short story long… I decided to just go ahead and publish the latest version with whatever installers I managed to create up to now (and hope for the best).

So, there is a new ‘stable’ release of Event Scavenger on the CodePlex site. Good luck and may the force be with you…

Monitor a Web Service

Ever wanted a way to monitor whether your web service is actually working, or can’t monitor a resource directly and need to put a web service in between to access it? Well, I’ve added a SOAP Web service collector to QuickMon so it can monitor/alert on issues related to Web services.

See the documentation about this collector here.

A summary of the documentation:

This collector allows you to poll/query SOAP web services and interpret the results. It supports several simple ‘macros’ to make it easier to test returned values or even DataSets or Arrays.
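
For context, polling a SOAP service boils down to POSTing a SOAP envelope over HTTP and inspecting the response XML. A rough sketch of the idea (the endpoint, SOAPAction and envelope below are made-up placeholders; this is not the collector’s actual code):

```csharp
// Sketch of polling a SOAP endpoint - endpoint/action/envelope are hypothetical.
using System;
using System.IO;
using System.Net;
using System.Text;

class SoapPollSample
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://localhost/MyService.asmx");
        request.Method = "POST";
        request.ContentType = "text/xml; charset=utf-8";
        request.Headers.Add("SOAPAction", "\"http://tempuri.org/Ping\"");

        string envelope =
            "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body><Ping xmlns=\"http://tempuri.org/\" /></soap:Body>" +
            "</soap:Envelope>";
        byte[] body = Encoding.UTF8.GetBytes(envelope);
        request.ContentLength = body.Length;
        using (Stream s = request.GetRequestStream())
            s.Write(body, 0, body.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // A collector would parse this XML and test the returned value(s)
            // against configured warning/error conditions.
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```

The ‘macros’ mentioned above are QuickMon’s way of expressing those tests against the returned values without writing code like this yourself.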

Creating a Multi-instance self registering service

This is one of those things I’ve been wanting to do for a long time for my Event Scavenger pet project. Basically the requirement is this:

“Being able to have one executable (.exe) file that can be used multiple times for different service instances.”

It turns out it is not too hard to achieve this using C# and .Net 4.0 (it might even work with older frameworks but I haven’t tried). It is an extension of my already improved self registering Windows Service extensions which make it possible to simply run the Service exe with a ‘-install’ command line parameter. With the latest additions a ‘named’ instance can also be specified. This means that in the Windows Service Manager you can see the Service executable plus a command line parameter that makes it unique, and you can stop/start/manage it separately as an entity. This is exactly the same as how Microsoft BizTalk Server implements ‘Host Instances’.

Now let’s get dirty with some code samples. The first code snippet shows (an example of) the part of program.cs that handles the command line parameters.

string collectorName = "Default";
string serviceName = "Event Reaper";
string displayName = "Event Reaper";
string description = "";
string serviceParameters = "";

if (args.Length > 0)
{
    if (args[0].ToUpper() == "-INSTALL")
    {
        if (args.Length > 1)
        {
            collectorName = args[1];
            serviceName = "Event Reaper - " + collectorName;
            displayName = "Event Reaper - " + collectorName;
            serviceParameters = "\"-Collector:" + collectorName + "\"";
        }
        if (args.Length > 2)
            displayName = args[2];
        if (args.Length > 3)
            description = args[3];
        //ServiceRegister.InstallService(...) gets called here
    }
    else if (args[0].ToUpper() == "-UNINSTALL")
    {
        if (args.Length > 1)
        {
            collectorName = args[1];
            serviceName = "Event Reaper - " + collectorName;
        }
        //ServiceRegister.UnInstallService(...) gets called here
    }
}

collectorName = HenIT.CommandLineUtils.GetCommand(args, "", "-Collector:");
if (collectorName.Length == 0)
    collectorName = Properties.Settings.Default.CollectorName;

ServiceBase[] servicesToRun;
servicesToRun = new ServiceBase[]
{
    new EventReaperService() { CollectorName = collectorName }
};
ServiceBase.Run(servicesToRun);

All this section does is check for the “-INSTALL” or “-UNINSTALL” parameters and then call the helper class that does the registration/unregistration. The display name of the service instance is always prefixed with “Event Reaper – ” to group all instances visually together in Service Manager.


The ServiceRegister class only has two methods. InstallService is the first one and handles all the bits to gather login details, set properties and then register the service.

public static bool InstallService(string serviceExePath,
        string serviceName,
        string displayName,
        string description,
        string serviceParameters)
{
    bool success = false;
    try
    {
        string workingPath = System.IO.Path.GetDirectoryName(serviceExePath);
        string logPath = System.IO.Path.Combine(workingPath, "Install.log");
        ServiceStartMode startmode = ServiceStartMode.Automatic;
        ServiceAccount account = ServiceAccount.LocalService;
        string username = "";
        string password = "";
        bool delayedStart = true;

        InstallerForm installerForm = new InstallerForm();
        installerForm.StartType = ServiceStartMode.Automatic;
        installerForm.AccountType = ServiceAccount.User;
        installerForm.TopMost = true;
        if (installerForm.ShowDialog() == System.Windows.Forms.DialogResult.OK)
        {
            startmode = installerForm.StartType;
            account = installerForm.AccountType;
            delayedStart = installerForm.DelayedStart;
            if (installerForm.AccountType == ServiceAccount.User)
            {
                username = installerForm.UserName;
                password = installerForm.Password;
            }

            Hashtable savedState = new Hashtable();
            ProjectInstallerForHelper myProjectInstaller = new ProjectInstallerForHelper(delayedStart);
            InstallContext myInstallContext = new InstallContext(logPath, new string[] { });
            myProjectInstaller.Context = myInstallContext;
            myProjectInstaller.ServiceName = serviceName;
            myProjectInstaller.DisplayName = displayName;
            myProjectInstaller.Description = description;
            myProjectInstaller.StartType = startmode;
            myProjectInstaller.Account = account;
            if (account == ServiceAccount.User)
            {
                myProjectInstaller.ServiceUsername = username;
                myProjectInstaller.ServicePassword = password;
            }
            myProjectInstaller.Context.Parameters["AssemblyPath"] = serviceExePath + " " + serviceParameters;
            myProjectInstaller.Install(savedState); //performs the actual registration
            success = true;
        }
    }
    catch (Exception ex)
    {
        System.Windows.Forms.MessageBox.Show(ex.Message, "Install service", System.Windows.Forms.MessageBoxButtons.OK, System.Windows.Forms.MessageBoxIcon.Error);
    }
    return success;
}

The class InstallerForm is simply a plain Windows Form to capture service properties like the username/password etc. I’m not showing its code here. AssemblyPath is the actual property that Service Manager uses to launch the executable plus its parameters.

ProjectInstallerForHelper is a utility class that simply inherits from System.Configuration.Install.Installer. It looks like this:

internal class ProjectInstallerForHelper : System.Configuration.Install.Installer
{
    private ServiceProcessInstaller processInstaller;
    private System.ServiceProcess.ServiceInstaller serviceInstaller;

    public ProjectInstallerForHelper(bool delayedAutoStart = true)
    {
        processInstaller = new ServiceProcessInstaller();
        serviceInstaller = new System.ServiceProcess.ServiceInstaller();
        serviceInstaller.DelayedAutoStart = delayedAutoStart;

        Installers.AddRange(new Installer[] { processInstaller, serviceInstaller });
    }

    #region Added properties
    public string ServiceName
    {
        get { return serviceInstaller.ServiceName; }
        set { serviceInstaller.ServiceName = value; }
    }
    public string DisplayName
    {
        get { return serviceInstaller.DisplayName; }
        set { serviceInstaller.DisplayName = value; }
    }
    public string Description
    {
        get { return serviceInstaller.Description; }
        set { serviceInstaller.Description = value; }
    }
    public ServiceStartMode StartType
    {
        get { return serviceInstaller.StartType; }
        set { serviceInstaller.StartType = value; }
    }
    public ServiceAccount Account
    {
        get { return processInstaller.Account; }
        set { processInstaller.Account = value; }
    }
    public string ServiceUsername
    {
        get { return processInstaller.Username; }
        set { processInstaller.Username = value; }
    }
    public string ServicePassword
    {
        get { return processInstaller.Password; }
        set { processInstaller.Password = value; }
    }
    #endregion
}

The second method, UnInstallService, is essentially just the reverse of InstallService.

public static bool UnInstallService(string serviceExePath, string serviceName)
{
    bool success = false;
    try
    {
        ServiceController sc = new ServiceController(serviceName);
        if (sc == null)
        {
            System.Windows.Forms.MessageBox.Show("Service not installed or accessible!", "Stopping service", System.Windows.Forms.MessageBoxButtons.OK, System.Windows.Forms.MessageBoxIcon.Warning);
            return true;
        }
        if (sc.Status == ServiceControllerStatus.Running || sc.Status == ServiceControllerStatus.Paused)
        {
            sc.Stop(); //stop it first, otherwise unregistering fails
            sc.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromSeconds(30));
        }
    }
    catch (Exception ex)
    {
        if (!ex.Message.Contains("was not found on computer"))
        {
            System.Windows.Forms.MessageBox.Show(ex.Message, "Stopping service", System.Windows.Forms.MessageBoxButtons.OK, System.Windows.Forms.MessageBoxIcon.Error);
            return true;
        }
    }
    try
    {
        string workingPath = System.IO.Path.GetDirectoryName(serviceExePath);
        string logPath = System.IO.Path.Combine(workingPath, "Install.log");

        ServiceInstaller myServiceInstaller = new ServiceInstaller();
        InstallContext Context = new InstallContext(logPath, null);
        myServiceInstaller.Context = Context;
        myServiceInstaller.ServiceName = serviceName;
        myServiceInstaller.Uninstall(null); //performs the actual unregistration
        success = true;
    }
    catch (Exception ex)
    {
        System.Windows.Forms.MessageBox.Show(ex.Message, "Uninstall service", System.Windows.Forms.MessageBoxButtons.OK, System.Windows.Forms.MessageBoxIcon.Error);
    }
    return success;
}


And that is basically the whole thing that allows you to create a Windows Service in C# that can handle multiple instances. There are a few things to keep in mind, like the fact that all instances share the same config file unless you make a physical copy of the exe and config file in another directory and register it from there. Using this you can run multiple services with different configs all side by side, not affecting each other (hopefully, hehe).
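
For example, setting up a second instance might look like this from an (elevated) command prompt – the directory, exe name and instance name here are just placeholders for whatever your own service uses:

```
mkdir C:\Services\EventReaperSecurity
copy EventReaper.exe C:\Services\EventReaperSecurity\
copy EventReaper.exe.config C:\Services\EventReaperSecurity\
rem ...edit the copied config file, then register the new instance:
C:\Services\EventReaperSecurity\EventReaper.exe -install Security
```

After this, “Event Reaper - Security” shows up as its own entry in Service Manager.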

Event Scavenger 5

A quick heads-up about a new version of my Event Scavenger tool/system that is coming. There are a whole lot of changes that build on the existing functionality as far as the gathering service(s) and administration are concerned. The Viewer part has (so far) been left unchanged so that a previous version Viewer tool can still access the new version database and vice versa.

Why have things changed?

The reason for the update is simple: to cater for some limitations I experienced myself with the tool/system. These include the following:

– Newly added Event Logs required a service restart so the gathering service (Event Scavenger service) could become aware of them.

– The original (Event Scavenger) service kept ‘open’ threads for each machine/log that needed to be ‘polled’. Most of the time each thread was left in a ‘sleeping’ state doing no work at all and just taking up resources.

– Only one instance of the collecting (Event Scavenger) service could run on a single machine – thus limiting it to only one database it can store logs into, one service that could be managed etc.

– Logs polled/imported into another database were set up ‘sort of’ separately from actual (machine) logs already in the current database. This made them (sort of) hard to manage.

– Setting up permissions for users was mostly a manual process.


To solve these issues the following changes have been made:

– The gathering service now spawns threads only for enabled machine/logs each time it ‘runs’, based on a master frequency. This is set to 60 seconds by default (but as always it can be changed, and is now managed from the Admin tool itself). This solves both the 1st and 2nd points mentioned. A side effect is that the performance counters for each machine/log have now disappeared (which may be a bad thing…). This also means the whole concept of thread recycling has gone the way of the Dodo…

– The service is now ‘multi instance’ aware so that multiple copies of the same service (each with its own physical copy of the exe and config file) can run separately, ‘servicing’ different Event Logs/databases etc. on the same machine. Each instance can thus operate totally independently, doing its own thing. Multi-machine instances are still possible as before. To set up a new ‘instance’ you simply make a copy of the service exe & config file (plus editing) and then use the “-install ‘<serviceName>’ ‘<displayName>’ ‘<Description>’ ” parameters on the service exe. The result is a separate entry in Service Manager which you can start/stop on its own.

– A machine/log entry now has another (sorta) state other than just being enabled/disabled. The 3rd ‘state’ is the option to have it imported from another Event Scavenger database. The Admin tool now makes it a lot easier to manage this.

– The Admin tool now has a simplistic view/editor to manage users in the database.

– The Admin tool also now provides a way to ‘manage’ the collectors (services). A simplistic way to stop/start/pause/continue services is built right into the tool.

– Setting up the database is now a bit easier. I included a new (command-line) tool that simply runs the various sql scripts for you. Some editing of the scripts might still be required if you don’t like the default settings. You can always still do it manually.


– Something new: the Admin tool now has a simplistic way to manage ‘Reaper’ services so you can start/stop/pause services, manage a view of which physical services service which collector name, etc. At the moment the actual registration of a service is still a manual process but the Admin tool shows you a help screen with instructions on how to set it up (or even unregister it).
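
The first change above, the master-frequency approach, can be sketched like this: a loop that wakes up on a timer and spawns a short-lived thread per enabled machine/log. This is a simplified illustration (the helper method names are made up), not the actual Event Reaper service code:

```csharp
// Simplified sketch of a master-frequency polling loop.
// GetEnabledMachineLogs/PollAndStoreEvents are hypothetical placeholders.
using System;
using System.Collections.Generic;
using System.Threading;

class GatherLoopSample
{
    static void Main()
    {
        int masterFrequencySeconds = 60; // default; configurable from the Admin tool
        while (true)
        {
            // Re-read the enabled list every cycle, so newly added
            // Event Logs are picked up without a service restart.
            foreach (string machineLog in GetEnabledMachineLogs())
            {
                string current = machineLog; // capture for the closure
                new Thread(() => PollAndStoreEvents(current)) { IsBackground = true }.Start();
            }
            Thread.Sleep(TimeSpan.FromSeconds(masterFrequencySeconds));
        }
    }

    static IEnumerable<string> GetEnabledMachineLogs()
    {
        // Placeholder: in the real service this list comes from the database.
        yield return @"SERVER1\Application";
    }

    static void PollAndStoreEvents(string machineLog)
    {
        // Placeholder for reading new events and writing them to the database.
        Console.WriteLine("Polling " + machineLog);
    }
}
```

The point is that threads exist only while there is work to do, instead of one permanently ‘sleeping’ thread per machine/log.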

New name

Because the ‘main’ service for collecting Event Logs changed I decided to give it a new name as well – Event Reaper (it must be obvious by now that I’m a fan of the Mass Effect games 😉). The name ‘Event Scavenger’ is now just a global name for the whole project/tool set. With the new multi instance support this means you can have multiple Reapers (on the same box).

Source code

So far I’ve kept the changed source code ‘offline’ as it is now a totally new ‘branch’ of the original source code. The global version number has now been changed to 5 (and in fact the default database name has now been changed to EventScavenger5) as it is not compatible with the previous version (with the exception of the viewer tool that works on both versions – for now…). Thus, previous version collector services and the Admin tool will not work anymore with the new database version – per design.


I’m still testing, so no actual ‘release’ has been made yet. I’ll post an update here when there is something that others can start playing with.

Update: A preview (Beta) can be found here. It is (should be) fully functional with all the above mentioned features already working.

HTMLWriter 1.7

Just a quick refresh on my HTMLWriter utility library. Some minor changes/additions:

1. Added support for the NOBR tag

2. Improved the special character functions including ignoring the ‘&’ sign when needed.

Latest version here.


I’ve been involved for many years with helping, developing and working on an uncle of mine’s genealogical research – at least the ‘IT’ side of things. Over this time I developed more than one solution to house a database for storing all the relevant data gathered – first an Access database and later a stand-alone application using either the old Access database or the newer SQL Express. Lately I’ve been trying to find a way to get away from the clumsiness of having a separate piece of software installed just to get a database. It makes setting up and maintaining an application so much more difficult – since the type of person using these applications is usually very close to ‘computer-illiterate’, or just doesn’t care ‘how it works’ but rather just ‘that it works’.

So I’ve been working to create something that does not require a traditional database but is still easy to maintain and troubleshoot if need be. I chose to use a plain and simple xml file. You might start to think that this is a bad idea for various reasons, like losing all the features built into professional database systems. But you would be wrong for the ‘wrong’ reasons 🙂. Let me explain. For the purpose of this system the requirements are different from what you would expect when using a ‘traditional’ database. Consider this – for this little ‘system’ the following are applicable:

  • Single user, single machine use only
  • The volume of data is ‘low’. I can’t imagine someone wanting to put millions of people’s information in this little application. I have an actual working example with the information of 11000+ people and the performance is really good.
  • No need for fancy transactionality, relational integrity and other jargon users don’t understand anyway (at the database/file level). The app itself (dal/object level) takes care of it.
  • Backups are as easy as simply copying a single file to some other location
  • Although xml is well structured and meant for ‘computer’ use, even a normal user can open the file and view it as text to read it. Of course, they can also easily break it by editing it manually…
  • Fewer components to maintain. No security fixes/service packs of sub components of which most parts are not even used at all. That is the ‘price’ you pay for using a technology that has to cater for lots of different possible ways to use it.
So the little application I created is a plain and simple C# Windows application. All it requires is the ‘client-side’ .Net 4 framework (not even the full thing), which is relatively small these days. It is still really a ‘proof-of-concept’ application but is fully functional and usable.
Summary of the features:
  • Enter/view details of persons (like names, surnames, dates and places of birth/christening/death, parents and children, marriages, history, photos and even unlimited separate notes)
  • View a family tree for a person – both predecessors and descendants, plus a summary of each person (birth/death/marriages). Up to 7 generations (up and down) are allowed but for practical reasons it is better to specify fewer.
  • There are several built-in reports (too many to name now). Since they are all html based they are easy to export and share with others.
  • Searching for persons (of course)
  • Comparing details of multiple persons (even a graphical view of living-years so you can see who lived around the same time)
  • Mark selected persons as ‘favourites’ so you can more easily find them again.
  • You can have multiple ‘databases’ (files) – simply go to settings and select another file.
  • The .gfdb file extension is associated with the application, meaning you can simply double-click such a file in Windows Explorer/Desktop and it will be opened with the application.
  • Since we mostly use the ‘de Villiers/Pama System‘ genealogical numbering system in South Africa there is some support to automatically generate these numbers based on a person marked as the progenitor.
It also features bilingual ‘run-time’ support, which is something you don’t find often in applications. At the moment it supports both English and Afrikaans.
Data format
I did consider GEDCOM and even GEDXML at some point, but both have ‘issues’ of their own. GEDCOM is a flat text file format that is not very user friendly and easy to ‘break’. It is very widely used though. GEDXML is not really a standard format at all. They started working on it and abandoned it after a while – not sure why.
‘My’ format is not really a standard on its own – I’m not claiming it to be super or fantastic. It is plain and simple and works for my purposes.

Grab a copy of the installer here and let me know what you think. The zip file contains an example gfdb file containing some (fictional) Star Wars character information to illustrate how the application works.

Update: Version 1.1 available here.

Another Update: Version 1.2 available here

Revisited – XML File as database

I’ve again relooked at the idea of using a plain XML file as a source of a ‘small’ database. The previous attempt resulted in me going down the path of using datasets (typed and non-typed).

This time I started with a set of very simple classes (.Net/C# classes) that, when you serialize them to XML, look very much like a small data store/structure I also happened to use for another import/export genealogical database app I created.

Effectively, through the use of ‘Generics’, I am able to create a simple ‘base’ class with which any such small in-memory database can be created/used/persisted to and from disk. At the ‘lowest’ level I have a class that only does 3 things: create the data store, load from disk/file and save to disk/file. It makes use of a small utility class that handles serialization to (xml) files.

public interface ISmallDataStore
{
    void CreateNewDataStore();
    void LoadDataStore(string source);
    void SaveDataStore(string destination);
}

public abstract class SDSXMLFileBase<T> : ISmallDataStore where T : class
{
    private string sourceFilePath;
    internal T DataContainer;

    abstract internal void SetRelationObjects();
    abstract internal void RefreshCacheObjects();

    #region ISDS Members
    abstract public void CreateNewDataStore();

    public void LoadDataStore(string source)
    {
        sourceFilePath = source;
        DataContainer = SerializationUtils.DeserializeXMLFile<T>(source);
        SetRelationObjects(); //restore object references after deserialization
    }

    public void SaveDataStore(string destination)
    {
        sourceFilePath = destination;
        SerializationUtils.SerializeXMLToFile<T>(destination, DataContainer);
    }
    #endregion
}

Two abstract methods are added to facilitate functionality that will be explained later (SetRelationObjects and RefreshCacheObjects). They are optional, but I needed them for specific reasons.

The next ‘layer’ is a class that implements ‘DataContainer’ with the data structures you want to use as the data store. These data structures are what is going to be saved/serialized. The following is a small example of what it looks like:

public class SampleSDSImplementation : SDSXMLFileBase<SampleTDataContainer>
{
    public override void CreateNewDataStore()
    {
        DataContainer = new SampleTDataContainer();
    }
}

Through the use of serialization attributes you can define the way the resulting xml that gets stored will look. In my example the class SampleTDataContainer is effectively the data store and its fields are decorated with attributes like XmlElement, XmlAttribute etc.

The following is a small excerpt from a sample class:

[Serializable(), XmlType("data")]
public class SampleTDataContainer
{
    public List<Person> Persons = new List<Person>();
    public List<Marriage> Marriages = new List<Marriage>();
}

[Serializable(), XmlType("p")]
public class Person
{
    [XmlAttribute("id")]
    public int Id { get; set; }
    [XmlAttribute("fn")]
    public string FirstName { get; set; }
    [XmlAttribute("sn")]
    public string Surname { get; set; }
}

A basic output of the (xml) file will look something like this:

<?xml version="1.0" encoding="utf-16"?>
<data>
  <p id="1" fn="Anakin" sn="Skywalker" … />
  <p id="2" fn="Padmé" sn="Amidala" … />
  <m id="1" mno="1" hid="1" wid="2" … />
</data>

Now that is all good and well for very basic stuff, but what about a few more advanced requirements, like auto-generating the ‘id’s, having ‘references’ between the objects (the Marriage class will have two references to Person (husband and wife), Person will have two (father and mother)) and even reverse referencing (Person having a list of Marriage objects)?

Setting up those references in code is easy, but once it has been serialized and then deserialized you get all kinds of funnies. For example, an instance of a Person class might have a reference to a Marriage and that Marriage instance has a reference to the same Person instance (the first time, when you set it up in code). The next time the data is loaded from file (deserialized), the Marriage instance reference will not point to the same instance as the Person instance (and vice versa). Ok, this is a generic explanation of the issue but hopefully you get the drift… Well, this is where one of those 2 methods I mentioned in the original base class comes in – SetRelationObjects. To ensure all referencing objects (at run-time) are actually referencing the correct objects you can do something like this:

internal override void SetRelationObjects()
{
    foreach (Marriage m in DataContainer.Marriages)
    {
        if (m.HusbandId > 0 && m.Husband == null)
            m.Husband = DataContainer.Persons.FirstOrDefault(h => h.Id == m.HusbandId);
        if (m.WifeId > 0 && m.Wife == null)
            m.Wife = DataContainer.Persons.FirstOrDefault(w => w.Id == m.WifeId);
    }
    foreach (Person p in DataContainer.Persons)
    {
        if (p.FatherId > 0 && p.Father == null)
            p.Father = DataContainer.Persons.FirstOrDefault(f => f.Id == p.FatherId);
        if (p.MotherId > 0 && p.Mother == null)
            p.Mother = DataContainer.Persons.FirstOrDefault(f => f.Id == p.MotherId);

        //Reassign union objects since deserialization created new/separated instances.
        for (int i = 0; i < p.Marriages.Count; i++)
        {
            Marriage m = p.Marriages[i];
            p.Marriages[i] = DataContainer.Marriages.FirstOrDefault(u => u.Id == m.Id);
        }
    }
}

The fields for things like (Person) Father and (Person) Mother were not shown in the class listing, but you can probably guess what they should look like. These fields are NOT serialized per se but rather handled by adding a pair of (int) FatherId and (int) MotherId fields that ARE serialized. This both makes the xml file easier to read and smaller to store. When deserializing, only the ‘id’ fields are restored, letting the SetRelationObjects method correct the referencing just after load. There may be other ways to do these kinds of things but I chose this one as it suits me at the moment.

When designing classes for serialization it helps to know a couple of things about Xml serialization attributes and ‘hidden’ methods. Let’s say you have a public field in your class (like the FatherId/MotherId ones) that you only want serialized under certain conditions; you can add a public bool ShouldSerialize<FieldName>() method to the class. e.g.

public bool ShouldSerializeFatherId()
{
    return FatherId > 0;
}
public bool ShouldSerializeMotherId()
{
    return MotherId > 0;
}

public int FatherId { get; set; }
public int MotherId { get; set; }

private Person father = null;
[XmlIgnore] //the object reference itself is never serialized, only FatherId
public Person Father
{
    get { return father; }
    set
    {
        if (value != null)
        {
            father = value;
            FatherId = father.Id;
        }
        else
        {
            father = null;
            FatherId = 0;
        }
    }
}

private Person mother = null;
[XmlIgnore]
public Person Mother
{
    get { return mother; }
    set
    {
        if (value != null)
        {
            mother = value;
            MotherId = mother.Id;
        }
        else
        {
            mother = null;
            MotherId = 0;
        }
    }
}

Helper methods for interacting with the data

It is possible to use the data container as is and directly call methods on the Person objects, but that means you could be duplicating a lot of code or functionality each time you use it. To help with this I added a few simple helper methods to the class that implements the base class (SampleSDSImplementation). For example:

public Person AddPerson(string firstName, string surname)
{
    int nextPersonId = 1;
    if (DataContainer.Persons.Count > 0)
    {
        nextPersonId = (from p in DataContainer.Persons select p.Id).Max() + 1;
    }
    Person newPerson = new Person() { FirstName = firstName, Surname = surname };
    newPerson.Id = nextPersonId;
    DataContainer.Persons.Add(newPerson); //don't forget to actually add it to the store
    return newPerson;
}

public Person FindSinglePersonById(int id)
{
    return DataContainer.Persons.FirstOrDefault(p => p.Id == id);
}

As an example of how to use it look at the following:

string dataStoreLocation = System.IO.Path.Combine(System.Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "TestSDS.xml");
SampleSDSImplementation sdsSample = new SampleSDSImplementation();
sdsSample.CreateNewDataStore(); //initialize the (empty) DataContainer first

Person vader = sdsSample.AddPerson("Anakin", "Skywalker");
vader.IsMale = true;
vader.History = "The baddy of the story";
vader.NickName = "Darth Vader";
vader.PlaceOfBirth = "Tatooine";
vader.PlaceOfDeath = "Death star 2";

Person padme = sdsSample.AddPerson("Padmé", "Amidala");
padme.DateOfDeath.Year = 2005;
padme.PlaceOfDeath = "Polis Massa";
sdsSample.AddMarriage(vader, padme, 1, null, "Naboo", null, "Mustafar");

Person luke = sdsSample.AddPerson("Luke", "Skywalker");
luke.IsMale = true;
luke.ChildNo = 1;
luke.NickName = "Master Luke";
luke.PlaceOfBirth = "Polis Massa";
luke.Father = vader;
luke.Mother = padme;

sdsSample.SaveDataStore(dataStoreLocation); //persist everything to the xml file

This is a simple way to build a little database (don’t fool yourself that you can easily build huge databases with this, hehe) with which you can run a simple system that requires a small database that can fit into memory.

See example project here: SDSTest