"I hope you're not going to base your corporate billing procedures on a "Value Added" scenario. I might owe you a million dollars if so. Great job you guys."
Michael A. Fidelholtz
Controller, Form Tech Concrete Forms
As part of Vestige's ongoing commitment to educating our clients, potential end-users and our peers in the industry, the Vestige Views blog reflects some of the industry's foremost thought leadership.
At one of our recent Tech Meetings (for some background: we hold bi-weekly, 30-60 minute Tech Meetings at Vestige where we train on a topic as part of our continuing-education program), I presented on LogParser (http://www.microsoft.com/technet/scriptcenter/tools/logparser/default.mspx). It is a free tool from Microsoft and is very handy for parsing event logs and web server logs, and for traversing file systems to get directory listings.
It is a command-line application that uses SQL syntax to pull various fields out of different objects and then output the data in another format. CSV is my favorite output format because I can then bring it into my database of choice. Sure, I could write a DTS package in SQL Server for a specific input file type, but why reinvent the wheel?
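As a minimal sketch of what an invocation looks like (the output filename here is just illustrative), this dumps a few fields from the System event log straight to CSV:

```
LogParser "SELECT TimeGenerated, EventID, SourceName INTO events.csv FROM System" -i:EVT -o:CSV
```

The -i switch names the input format, -o the output format, and everything in between is plain SQL.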
For those of you who are dismayed that GSI removed the reporting of the record number from their Event Log Parser script, LogParser will pull that information from the event logs. If you get the standard "event log is corrupt" message, I recommend FixEvt (http://murphey.org/fixevt.html).
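For example, something along these lines pulls the record number with each event; you can also point FROM at a saved event log file instead of a live log (the paths and filenames here are hypothetical):

```
LogParser "SELECT RecordNumber, TimeGenerated, EventID, SourceName INTO security.csv FROM 'C:\Case\SecEvent.evt'" -i:EVT -o:CSV
```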
For those of you who have to parse through gigs and gigs of IIS log files, you can use LogParser to pull out just those records you are interested in. Maybe you want only the 400s or 500s, or maybe you are looking for records from certain IP addresses. You can also output charts, which is great for presentations or for looking analytically at, say, the most-viewed pages on the site in question.
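As a sketch (the log and output names are made up), the first query below keeps only the 4xx/5xx responses, and the second renders the ten most-requested pages as a chart (the CHART output format relies on the Office Web Components being installed):

```
LogParser "SELECT date, time, c-ip, cs-uri-stem, sc-status INTO errors.csv FROM ex*.log WHERE sc-status BETWEEN 400 AND 599" -i:IISW3C -o:CSV

LogParser "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits INTO top10.gif FROM ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C -o:CHART -chartType:ColumnClustered
```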
What I really like using LogParser for is traversing file systems. You can recurse (or not) through a specific folder structure and grab data such as filenames, full paths, MAC dates, attributes and even some of the internal metadata that is displayed when you right-click a file in Explorer and click "Properties". You can also calculate MD5 hashes for each file. Neat, huh?
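A directory listing with MD5 hashes might look like the following; the target path and output file are placeholders, and -recurse:-1 means unlimited depth:

```
LogParser "SELECT Path, Name, Size, Attributes, CreationTime, LastAccessTime, LastWriteTime, HASHMD5_FILE(Path) AS MD5 INTO listing.csv FROM 'C:\Target\*.*'" -i:FS -recurse:-1 -o:CSV
```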
In testing, I found that even when you are calculating the MD5 hash on a file, the last access date is not modified. However, when traversing folders for information, the last access date of a folder is modified. Ah, but that is where someone got smart and provided a "preserveLastAccTime" parameter for LogParser. Set it to "ON" and none of that data is altered. There is also another parameter that allows you to capture the MAC dates in UTC. So here is another good option for collecting file information from a computer if you cannot image it (such as in the case of a server that the owner does not want shut down).
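Putting those forensic switches together might look like this (again, the paths are illustrative; -useLocalTime:OFF is, as I understand it, the switch that reports the timestamps in UTC rather than local time):

```
LogParser "SELECT Path, CreationTime, LastAccessTime, LastWriteTime INTO macdates.csv FROM 'C:\Target\*.*'" -i:FS -recurse:-1 -preserveLastAccTime:ON -useLocalTime:OFF -o:CSV
```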
There is an install package for the application, but all it does is copy files to a folder on your computer. You can copy that folder structure to a CD/DVD or USB drive, plug it into the computer you are collecting from, and not have to run any installation routine. There is also a DLL that you can access from VB, C++, C# or the programming tool of your choice. Some people have already done that, and there are GUI wrappers available on the internet.
The help file is great: just enough information but not too much (as some Microsoft help files are known to do, in my opinion). Did I mention that this application is free?