dynamic compression ?

Please ask questions here if you are not familiar with fsarchiver
dgerman
Posts: 6
Joined: Thu Jan 14, 2010 10:46 pm

dynamic compression ?

Post by dgerman » Fri Jan 15, 2010 9:30 pm

I thought I read a discussion regarding changing the compression algorithm on a block by block basis ( or n-block basis).
With a parameter for restricting cpu and network bandwidth utilization.
Does anyone know what I'm referring to?
Does anyone know of such a compressor?

admin
Site Admin
Posts: 550
Joined: Sat Feb 21, 2004 12:12 pm

Re: dynamic compression ?

Post by admin » Sat Jan 16, 2010 11:01 am

Hi Dennis,

It's technically possible to use a different compression algorithm and a different compression level for each data block that fsarchiver compresses. In fsarchiver-0.6.5, the management of "out of memory" errors changed. These errors happen with extreme compression levels (especially -z9), which require a huge amount of memory to run. That requirement is multiplied by the number of compression jobs (-j N), so such errors are very likely when you run "fsarchiver -z9 -j8", for instance.
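
To illustrate the idea (this is only a minimal sketch, not fsarchiver's actual code): try to compress the block at the requested xz preset, and fall back to cheaper presets when liblzma reports that it cannot allocate enough memory. The function name compress_block_with_fallback is hypothetical.

/* Sketch only: per-block fallback to cheaper presets on out-of-memory.
 * Returns 0 on success, -1 on a non-memory error or if even preset 0 fails. */
#include <lzma.h>
#include <stdint.h>
#include <stddef.h>

int compress_block_with_fallback(const uint8_t *in, size_t inlen,
                                 uint8_t *out, size_t outcap, size_t *outlen,
                                 uint32_t preset)
{
    for (;; preset--) {
        *outlen = 0;
        lzma_ret ret = lzma_easy_buffer_encode(preset, LZMA_CHECK_CRC32, NULL,
                                               in, inlen, out, outlen, outcap);
        if (ret == LZMA_OK)
            return 0;                   /* compressed at this preset */
        if (ret != LZMA_MEM_ERROR || preset == 0)
            return -1;                  /* real error, or nothing cheaper left */
    }
}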

Now the problem is to determine which compression algorithm and level to use for a particular block. In general, blocks are between 256 KB and 1 MB. The data of multiple very small files are merged so that we compress one large block (REGFILEM), because compression is very poor on small blocks.
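
As a rough illustration only (this is not how fsarchiver currently works), one could estimate how compressible a block is by running a fast compressor on a sample of it and map the observed ratio to a level. pick_level_for_block is a hypothetical helper and the thresholds are arbitrary:

/* Sketch only: guess a compression level from a 64 KB sample of the block. */
#include <stdlib.h>
#include <zlib.h>

int pick_level_for_block(const unsigned char *block, size_t blocksize)
{
    if (blocksize == 0)
        return 0;
    uLong samplelen = blocksize < 65536 ? (uLong)blocksize : 65536;
    uLongf outlen = compressBound(samplelen);
    unsigned char *out = malloc(outlen);
    if (!out)
        return 1;                      /* be conservative when low on memory */

    int ratio_pct = 100;
    if (compress2(out, &outlen, block, samplelen, Z_BEST_SPEED) == Z_OK)
        ratio_pct = (int)(outlen * 100 / samplelen);
    free(out);

    if (ratio_pct > 95) return 0;      /* nearly incompressible: store it */
    if (ratio_pct > 60) return 3;      /* modest gains: cheap level */
    return 7;                          /* compresses well: spend more CPU */
}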

I have never heard of such a compression technique. I know how to check CPU usage: I had a look at the htop sources, and it reads these statistics from a file in /proc. The problem is that we also have to take into account the CPU used by the other compression threads.
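
For reference, here is a minimal sketch of the /proc/stat approach htop uses: read the aggregate "cpu" line twice and compute the busy fraction from the difference. This measures the whole system, so it includes the CPU used by our own compression threads; error handling is kept minimal for brevity.

#include <stdio.h>
#include <unistd.h>

static void read_cpu(unsigned long long *busy, unsigned long long *total)
{
    unsigned long long user = 0, nice = 0, sys = 0, idle = 0, iowait = 0;
    FILE *f = fopen("/proc/stat", "r");
    if (f) {
        fscanf(f, "cpu %llu %llu %llu %llu %llu", &user, &nice, &sys, &idle, &iowait);
        fclose(f);
    }
    *busy = user + nice + sys;
    *total = user + nice + sys + idle + iowait;
}

int main(void)
{
    unsigned long long b1, t1, b2, t2;
    read_cpu(&b1, &t1);
    sleep(1);
    read_cpu(&b2, &t2);
    if (t2 > t1)
        printf("cpu usage: %llu%%\n", (b2 - b1) * 100 / (t2 - t1));
    return 0;
}

The per-process figures (utime/stime in /proc/self/stat) could be read in the same way if we only want to measure fsarchiver's own threads.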

Suggestions are welcome.

dgerman
Posts: 6
Joined: Thu Jan 14, 2010 10:46 pm

Re: dynamic compression ?

Post by dgerman » Mon Jan 18, 2010 1:53 am

The Dynamic Compression Technology in Broadband Access Networks.
http://www.computer.org/portal/web/csdl ... IE.2009.43

$19

Adaptive-Rate Techniques, Dynamic Network Resource Management.

(PS: is that you, F.D.?)

admin
Site Admin
Posts: 550
Joined: Sat Feb 21, 2004 12:12 pm

Re: dynamic compression ?

Post by admin » Mon Jan 18, 2010 7:26 am

How does that work in a nutshell?
Yes that's me :)

tuipveus
Posts: 44
Joined: Thu May 14, 2009 7:02 pm

Re: dynamic compression ?

Post by tuipveus » Sat Feb 20, 2010 9:29 pm

Actually, there are some quite good compression programs which use dynamic compression, or perhaps "adaptive compression" is the better term.

Since different compression algorithms perform differently on different kinds of data, it is smart to use the best algorithm for your data.

If you compare the programs at http://www.maximumcompression.com/data/summary_mf2.php , you will find that the most efficient ones usually choose the algorithm according to the data. Choosing the correct algorithm is an algorithm in itself!
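
To make the idea concrete, here is a minimal brute-force sketch in C (not fsarchiver code, and not what the programs above do internally): compress the same block with more than one backend and keep whichever output is smallest, storing the block as-is when nothing shrinks it. compress_best is a hypothetical function.

/* Sketch only: returns 0 if zlib won, 1 if bzip2 won, -1 if the caller should
 * store the raw block (nothing got smaller, or malloc failed).
 * out must be at least outcap bytes; *outlen receives the winning size. */
#include <stdlib.h>
#include <string.h>
#include <zlib.h>
#include <bzlib.h>

int compress_best(unsigned char *in, unsigned int inlen,
                  unsigned char *out, unsigned int *outlen, unsigned int outcap)
{
    int best = -1;
    unsigned int bestlen = inlen;      /* only keep results smaller than the input */
    unsigned char *tmp = malloc(outcap);
    if (!tmp)
        return -1;

    uLongf zlen = outcap;
    if (compress2(tmp, &zlen, in, inlen, 9) == Z_OK && zlen < bestlen) {
        best = 0; bestlen = (unsigned int)zlen; memcpy(out, tmp, zlen);
    }

    unsigned int blen = outcap;
    if (BZ2_bzBuffToBuffCompress((char *)tmp, &blen, (char *)in, inlen,
                                 9, 0, 0) == BZ_OK && blen < bestlen) {
        best = 1; bestlen = blen; memcpy(out, tmp, blen);
    }

    free(tmp);
    *outlen = bestlen;
    return best;
}

The obvious drawback is that every block is compressed several times, which multiplies the CPU cost; the efficient programs on those benchmark pages classify the data first instead of trying everything.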

See also:
http://freearc.org/Benchmarks.aspx

admin
Site Admin
Posts: 550
Joined: Sat Feb 21, 2004 12:12 pm

Re: dynamic compression ?

Post by admin » Wed Feb 24, 2010 2:46 pm

It could be a good idea to do that sort of thing in future versions, but it won't happen in the short term. I first want to finish the new features currently planned for 0.7.0.

tuipveus
Posts: 44
Joined: Thu May 14, 2009 7:02 pm

Re: dynamic compression ?

Post by tuipveus » Wed Feb 24, 2010 7:36 pm

Yes. I think all users of fsarchiver and sysresccd value reliability over a good compression ratio. It is important to keep the goals clear, even though a good compression ratio is a nice feature.

admin
Site Admin
Posts: 550
Joined: Sat Feb 21, 2004 12:12 pm

Re: dynamic compression ?

Post by admin » Thu Feb 25, 2010 12:23 pm

Anyway, I think the lzma/xz compression which is already available (-z7, -z8, -z9) is quite good, and we can always choose a faster compression level if we want. So such a feature is not urgent, but it may be desirable in the long term.
