My own little unscientific compression test

Coastie
Posts: 18
Joined: Wed Dec 16, 2009 8:18 pm

My own little unscientific compression test

Post by Coastie » Sun Jun 20, 2010 3:36 am

Dell E1505 laptop
2 GHz Core 2
2 GB RAM
80 GB Intel G2 SSD

Backing up a fresh Win7 x32 install with updates and MS Security Essentials installed. No cleaning (update files etc.) was done; the used space on that NTFS partition is 11.44 GB.

Changing the restore partition from NTFS to reiser4 did not have much of an effect on time.

level _____ hh:mm:ss ___ size (GB)
-z3 -j2 ___ 00:08:28 ___ 2.615
-z7 -j2 ___ 00:14:28 ___ 2.403
-z8 -j2 ___ 01:06:15 ___ 2.247

For my system, -z7 seems to be the sweet spot: a great compression level for the time spent. Next I'll format the primary partition and restore (just the z3 and z7 archives). I won't be using z8; the additional compression time is not worth the savings for me.
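For reference, a minimal savefs invocation matching these runs might look like the sketch below. The device name and archive path are placeholders I've chosen for illustration, not details from the post; adjust them to your own layout.

```shell
# Archive the Windows partition (here assumed to be /dev/sda1) into a single
# .fsa file with LZMA level 7 and two compression threads, as in the -z7 -j2 run:
fsarchiver savefs -j2 -z7 /mnt/backup/win7.fsa /dev/sda1
```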

Coastie
Posts: 18
Joined: Wed Dec 16, 2009 8:18 pm

Re: My own little unscientific compression test

Post by Coastie » Sun Jun 20, 2010 5:51 am

restfs with -j2 gave the following times. Win 7 booted without error in both cases.

LZMA-1 __ -z7 __ 00:07:01
GZIP-6 __ -z3 __ 00:06:16

Unlike compression, neither decompression type taxed the CPU too much. GZIP hovered around 40% on each core; LZMA ranged from 25% to 85%, leaning toward the higher end, so if you have a CPU older than a Core 2, GZIP may be the better option. I think LZMA-1 (-z7) is the best option for my hardware/use.

All tests were run on the same SSD with 2 partitions. One for the OS and one for the recovery file. The OS partition was formatted between restores.
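The restore side of these timings would correspond to a restfs call along these lines; again the device and archive path are assumptions for illustration, not taken from the post.

```shell
# Restore the first filesystem stored in the archive (id=0) back onto the
# freshly formatted OS partition, using two decompression threads:
fsarchiver restfs -j2 /mnt/backup/win7.fsa id=0,dest=/dev/sda1
```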

I was a little hesitant to use fsarchiver over partimage for a few reasons:
1. No decent third-party user tutorials were found online (example of a great partimage tutorial: http://lifehacker.com/292972/partition- ... -rescue-cd)
2. No verify-archive command in fsarchiver
3. No GUI

In the end I found that using a simple one-line command was far easier than the partial partimage GUI.
The flexibility in compression and multiple threads saved me around 45 minutes. Last time I did this with partimage it took about an hour to complete and produced multiple files; now I can complete the task in under 15 minutes with just a single file.

The restore process worked great. Thanks, I'll be switching to fsarchiver for my system backup/recovery purposes.

admin
Site Admin
Posts: 550
Joined: Sat Feb 21, 2004 12:12 pm

Re: My own little unscientific compression test

Post by admin » Mon Jun 21, 2010 12:29 pm

Thanks for sharing your experience with fsarchiver. I am happy to see that it works very well, especially on NTFS filesystems, which had the most bugs. Can you just say which Linux flavor and ntfs-3g version were used during your tests?

Previous fsarchiver versions (< 0.6.10) had problems with NTFS filesystems: symbolic links (Vista & Win7) were broken after restfs if a recent ntfs-3g release was used (ntfs-3g >= 2010.3.6), so it's interesting to have feedback about fsarchiver and NTFS filesystems.

Coastie
Posts: 18
Joined: Wed Dec 16, 2009 8:18 pm

Re: My own little unscientific compression test

Post by Coastie » Mon Jun 21, 2010 7:35 pm

"I have been very busy on stabilizing SystemRescueCd-1.5.x"
Are you also the maintainer of the SRCD? If so, I used 1.5.5beta8 for all the above tests.
fsarchiver was v0.6.10, included in that release; not sure which version of ntfs-3g was included in 1.5.5beta8.

admin
Site Admin
Posts: 550
Joined: Sat Feb 21, 2004 12:12 pm

Re: My own little unscientific compression test

Post by admin » Mon Jun 21, 2010 7:50 pm

Yes, I also maintain SystemRescueCd. So you have been using sys-fs/ntfs3g-2010.5.16 + fsarchiver-0.6.10.
It's good to have positive feedback based on recent versions of fsarchiver and sys-fs/ntfs3g.

Coastie
Posts: 18
Joined: Wed Dec 16, 2009 8:18 pm

Re: My own little unscientific compression test

Post by Coastie » Mon Jun 21, 2010 8:16 pm

Yes, just booted to verify.

ntfs-3g 2010.5.16
fsarchiver 0.6.10

Used the standard SRCD kernel-32 to create the archive and restore

tuipveus
Posts: 44
Joined: Thu May 14, 2009 7:02 pm

Re: My own little unscientific compression test

Post by tuipveus » Sun Sep 12, 2010 3:53 pm

How about specifying -j3 or even -j4 with a dual core? Why was fsarchiver so much faster than partimage? Did you save the image to network or USB?

Coastie
Posts: 18
Joined: Wed Dec 16, 2009 8:18 pm

Re: My own little unscientific compression test

Post by Coastie » Tue Sep 28, 2010 3:51 am

as stated above:

All tests were run on the same SSD with 2 partitions. One for the OS and one for the recovery file. The OS partition was formatted between restores.

No hyperthreading on my CPU, so that is the reason for -j2 and not more.
The general consensus is to leave one core or thread available for other things, so if you have 4 threads available on your CPU, use -j3 to leave one thread open.
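That rule of thumb can be computed instead of hard-coded. This is a small sketch using coreutils' `nproc`; the "leave one thread free" policy is the one suggested in this thread, not anything fsarchiver requires.

```shell
# Use all hardware threads minus one for compression, so the system
# stays responsive; fall back to 1 on a single-core machine.
jobs=$(( $(nproc) - 1 ))
[ "$jobs" -lt 1 ] && jobs=1
echo "$jobs"
# The value can then be passed as: fsarchiver savefs -j"$jobs" ...
```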

fsarchiver is faster than partimage because it is multithreaded, unlike partimage. Read more here: http://www.fsarchiver.org/Fsarchiver_vs_partimage

tuipveus
Posts: 44
Joined: Thu May 14, 2009 7:02 pm

Re: My own little unscientific compression test

Post by tuipveus » Sat Nov 05, 2011 5:15 pm

I also did a small test with Windows 7 64-bit installed plus some of my own files (a work computer with JPGs, WAVs, installed programs and source code). In total about 41.5 GB according to Windows Explorer.

gzip (-j8 -z2 -v):
22 min 50 s
filesize about 22 GB (23 583 574 722 bytes)

lzo (-j8 -zz -v):
20 min 10 s
filesize about 25 GB (25 822 147 769 bytes)

lzma (-j16 -z7 -v):
49 min 29 s
filesize about 21 GB (21 851 688 816 bytes)

I don't need more tests; as you can clearly see, the data doesn't compress much, no matter which compression algorithm or how many cores I use. So I chose gzip for its performance and relatively good compression.

The test was made with an i7 processor with 4 real cores and 4 "virtual" ones. The SSD (NTFS) of my laptop was backed up to the hard disk (also internal, in the same laptop).

What is really strange is that even with lzo I get a reasonably good savings of 100% × (1 − 25/41.5) ≈ 40%, so the compressed size is about 0.6 × the uncompressed size.
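The arithmetic behind that 40% figure can be checked with a one-liner; the 25 and 41.5 GB values are the lzo result and the original size from the post.

```shell
# Space savings = 1 - compressed/original, printed as a rounded percentage.
awk 'BEGIN { printf "%.0f%%\n", (1 - 25/41.5) * 100 }'
# prints 40%
```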
