An inordinate amount of time in forensics is spent analyzing log files and building timelines. I have written a group of scripts that make processing log files a little easier. I like to convert log files, both Windows and Linux, to CSV, which makes them easier to process.

The scripts can be used individually, but I have also collected them into a library, timeline-library.ps1, which I load when I am processing log files by dot-sourcing it:

ps> . .\timeline-library.ps1

It contains the following routines, some of which I have included here as standalone scripts.



When you load it as a library, you just call the routines without the .ps1 extension.
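For example, with the library loaded, a routine such as parse-utmp can be called directly; the argument shown here is an assumption, so check the routine itself for its actual parameters:

ps> parse-utmp .\wtmp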

All the scripts output PowerShell objects, so you can pipe them together. Each convert script takes the output of its corresponding parser and converts it to a common format so the results can be combined into one large file. The format of this timeline is:

The convert-browserhistory and convert-jumplist scripts take the output from NirSoft's utilities.
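Once the individual logs are converted to the common format, combining them is a one-liner; here win.csv and linux.csv are hypothetical converted outputs, and if your DateTime format does not sort correctly as text you can cast it as shown further below:

ps> import-csv win.csv, linux.csv | sort DateTime | export-csv -notype timeline.csv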

There are two utmp conversion scripts because utmp-parser is a helper script that parse-utmp uses to read the utmp/btmp files in 384-byte chunks.
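The 384-byte chunking can be sketched as follows; this is illustrative PowerShell, not the library's actual code:

ps> $bytes = [System.IO.File]::ReadAllBytes('.\wtmp')
ps> for ($i = 0; $i -lt $bytes.Length; $i += 384) { $record = $bytes[$i..($i + 383)] }

Each $record is one raw utmp entry, ready to have its fields extracted.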

The Windows scripts will parse either evt or evtx files and determine the type automatically.

The event log scripts can filter by time, eventid, userid, and logsource. I usually just convert the entire log file and then filter while I am analyzing it. If the CSV is too large and you need to filter by date later, it is easy with PowerShell:

ps> import-csv log.csv | where {[datetime]$_.DateTime -gt [datetime]'6-6-2008'} | export-csv -notype newlog.csv

(The [datetime] casts make the comparison a real date comparison; without them PowerShell compares the values as strings.)
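The same approach works for the other fields; assuming the converted CSV carries an EventId column (the column name here is an assumption), a logon filter would look like:

ps> import-csv log.csv | where {$_.EventId -eq 4624} | export-csv -notype logons.csv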