Saturday, January 7, 2012

A Shout-Out to a Great Piece of Software


I want to take this opportunity to write about a piece of software I use in every single investigation.  Please note that I do not work for the software company, nor do I receive any compensation from them.

SAW (SMART Acquisition Workshop) is a staple in my forensics lab.  The software is made by Andrew Rosen of ASR Data.  I'll take a moment to mention here that his software (an entire Forensics suite, including SAW) is cheaper than the stuff from Big G or Big A, and he doesn't charge annual "maintenance" fees.

This is how SAW works, as I understand it: when imaging a device, instead of writing out each cluster of zeroes, it simply makes a note of where those zeroes belong.  The resulting file is often significantly smaller than a dd image would be because it is a sparse file.
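SAW's internals aren't documented here, so take this as a minimal sketch of the general sparse-write idea rather than SAW's actual implementation (the 4 KiB cluster size and the function name are my own illustration):

```python
import os

CLUSTER = 4096  # treat the source as 4 KiB clusters (illustrative choice)
ZEROS = b"\x00" * CLUSTER

def sparse_image(src_path, dst_path):
    """Copy src to dst, but seek over all-zero clusters instead of
    writing them, so the target file ends up sparse."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(CLUSTER)
            if not chunk:
                break
            if chunk == ZEROS[:len(chunk)]:
                # "Make a note" of the zeroes by seeking past them;
                # the filesystem records a hole, allocating no blocks.
                dst.seek(len(chunk), os.SEEK_CUR)
            else:
                dst.write(chunk)
        # If the source ended in a hole, extend the logical size to match.
        dst.truncate()
```

Reading the resulting file back yields byte-for-byte identical data; only the on-disk allocation shrinks.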

SAW imaging a 160GB 2.5" SATA HDD.

In the example shown in the picture (sorry for the bad quality), the Source device is mostly empty.  Because SAW creates a sparse file, the acquisition process isn't slowed by having to write zeroes to the Target drive.  Note the current acquisition speed of 82.28 MB / second on a standard USB3 connection and the amount of data "optimized."

The resulting evidence file in this particular case (a 160 GB drive) was just over 6 GB - a whopping 96% reduction in file size.
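For reference, the arithmetic behind that figure, using the rounded 160 GB and 6 GB numbers from above:

```python
source_gb = 160  # capacity of the imaged drive
image_gb = 6     # approximate size of the resulting evidence file

reduction = (1 - image_gb / source_gb) * 100
print(f"{reduction:.1f}% reduction")  # ~96% reduction in file size
```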

The use of a sparse file is a clever idea in several ways:

First, the original data is not altered or compressed.  What is missing from the end file is what wasn't there in the first place (e.g. thousands of 4k chunks of \x00).  Andrew likes to say, "What's missing from the file is Nothing.  Nothing is missing."

(That play on words reminds me of a co-worker who called IT and stated his keyboard wasn't working. When the tech arrived, my co-worker stated, "Seems like somebody spilled coffee on it." The tech picked up the keyboard and out poured what seemed like a full mug's worth of coffee.  The tech said: "Be honest, you did this, didn't you?" to which my co-worker replied, "Well, I'm somebody!")

Second, the resulting sparse file can be read (e.g. keyword searched, carved, etc.) by any other program without that program supporting a specific decompression algorithm.  Additionally, the processor isn't bogged down with any decompression.  For the drive shown in the picture, it would mean no waiting for the hard drive to read out 154 gigabytes of zeroes on every job.  I've run tools like IEF against such files with great success.
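A small Python demonstration of why no special reader is needed (the file name is arbitrary): any program using a plain open()/read() gets the zeroes back from a sparse file without knowing anything about holes.

```python
import os

# Create a 1 MiB sparse file: truncate() sets the logical size
# without allocating data blocks for the hole.
with open("sparse_demo.dd", "wb") as f:
    f.truncate(1024 * 1024)

# Any ordinary reader sees the zeroes -- no decompression step involved.
with open("sparse_demo.dd", "rb") as f:
    assert f.read(4096) == b"\x00" * 4096

st = os.stat("sparse_demo.dd")
print(st.st_size)    # 1048576 -- the apparent size
print(st.st_blocks)  # often 0 blocks actually allocated (filesystem-dependent)

os.remove("sparse_demo.dd")
```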

Third, if you want to mount an actual image, zeroes and all, you simply instantiate one using a companion piece of software called Smart Mount.  Smart Mount presents a dd image (called 'image.dd') to the filesystem while simultaneously creating a mount point that can be browsed manually.  That's neat and all, but what is significant is how the zeroes are re-inserted.

When the OS reads the instantiated image.dd, or you browse through the mounted filesystem, the "missing" zeroes are read not from the hard drive (remember, they were never included in the forensic image), but instead from /dev/zero.  So the processor, not the hard drive, is slinging those zeroes about.
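On a Linux or Unix system you can see that mechanism directly: /dev/zero is a kernel device that synthesizes zeroes on demand, so reading it involves the CPU, not the disk platters. A quick illustration:

```python
# /dev/zero hands back zeroes generated by the kernel -- no disk I/O.
# This is the kind of source a mounting tool can draw on to fill the
# holes of an instantiated image on the fly.
with open("/dev/zero", "rb") as z:
    filler = z.read(4096)

assert filler == b"\x00" * 4096
print(len(filler))  # 4096 bytes of zeroes, fresh from the kernel
```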

I wrote this as some food for thought that might help some of you - especially those of you who find your storage arrays filling up with mostly-empty forensic images.  Again, I use this piece of software on every forensics or data recovery job I perform.