Re: .deb format: let's use 0.939, zstd, drop bzip2




Adrian Bunk - 08.05.19, 21:45:
> On Wed, May 08, 2019 at 07:38:26PM +0200, Adam Borowski wrote:
> >...
> >
> > So let's pick compressors to enable.  For compression ratio, xz
> > still wins (at least among popular compressors).  But there's a
> > thing to say about zstd: firefox.deb zstd -19 takes to unpack:
> > * 2.644s .xz, stock dpkg
> > * 2.532s .xz, my tool (libarchive based)
> > * 0.290s .zst, my tool
> > * 0.738s .gz, stock dpkg
> > * 0.729s .gz 0.939, stock dpkg
> > File sizes being 60628216 gz, 47959544 zstd, 44506304 xz.
> > 
> > XFCE install total: 723M xz, 773M zstd, 963M gzip.
> > 
> > Thus, even though we'd want to stick with xz for the official
> > archive, speed gains from zstd are so massive that it's tempting to
> > add support for it, at least for non-official uses, possibly also
> > for common Build-Depends.
> >...
> 
> Is this single-threaded or parallel?
> 
> pbzip2 decompression speed scales nicely with the number of CPUs,
> and in general for anyone interested in massive speed gains the
> way forward would be towards parallel decompression.

Or lbzip2: in quite old tests with my packbench Ruby script [1], lbzip2 
scaled better than pbzip2 on an Intel hexacore system. The results were 
published in an issue of the German Linux User magazine. As I had no 
multicore laptop back then, I was not able to do the measurements myself.

Or pxz.
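For what it's worth, the reason pbzip2/lbzip2 can scale with core count is 
that bzip2 output may be written as several independent streams, each 
decompressible on its own. A minimal sketch of that idea using only the 
Python standard library (the helper names are mine, not from any of these 
tools, which split at the bzip2 stream/block level internally):

```python
import bz2
from concurrent.futures import ThreadPoolExecutor


def compress_in_streams(data: bytes, n: int) -> list[bytes]:
    """Split data into n chunks, each compressed as its own bz2 stream."""
    step = -(-len(data) // n)  # ceiling division
    return [bz2.compress(data[i:i + step]) for i in range(0, len(data), step)]


def parallel_decompress(streams: list[bytes]) -> bytes:
    # bz2 releases the GIL while (de)compressing, so threads use real cores.
    # pool.map preserves input order, so the chunks reassemble correctly.
    with ThreadPoolExecutor() as pool:
        return b"".join(pool.map(bz2.decompress, streams))


payload = b"some payload " * 100_000
streams = compress_in_streams(payload, 4)
assert parallel_decompress(streams) == payload
```

Files produced this way are just concatenated bzip2 streams, which is why a 
plain single-threaded bunzip2 can still read pbzip2 output sequentially.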

[1] https://martin-steigerwald.de/computer/programme/packbench/index.html 
(I have not yet re-uploaded the source repository to GitLab or similar, 
but tarballs are available. It is outdated as well; I did not test whether 
it works with the current Ruby version.)
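In the same spirit as packbench, a throwaway comparison can be scripted 
against the codecs that ship with Python's standard library (zstd omitted 
here, since it needs an external binding; the payload and timings are 
illustrative only, real benchmarks want representative data and repeated 
runs):

```python
import bz2
import gzip
import lzma
import time

codecs = {"gzip": gzip, "bz2": bz2, "xz": lzma}
payload = bytes(range(256)) * 20_000  # ~5 MiB of mildly compressible data

for name, mod in codecs.items():
    t0 = time.perf_counter()
    blob = mod.compress(payload)          # each module offers compress()
    t1 = time.perf_counter()
    out = mod.decompress(blob)            # ... and decompress()
    t2 = time.perf_counter()
    assert out == payload                 # round-trip sanity check
    print(f"{name:4s} size={len(blob):8d} "
          f"comp={t1 - t0:.3f}s decomp={t2 - t1:.3f}s")
```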

Thanks,
-- 
Martin