Use ‘jpegoptim’ to optimize/re-compress your jpg images

I use optipng and jpegoptim to help me compress/optimize my jpg/png images.

For png image re-compression, please refer to:

https://www.peterdavehello.org/2015/05/use-optipng-to-optimize-re-compress-your-png-images-losslessly/

jpegoptim supports many platforms, including Solaris, Mac OS X, and Windows, and of course FreeBSD and Linux. Here is its git repository on GitHub:
https://github.com/tjko/jpegoptim

How to install?

FreeBSD:
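$ pkg install jpegoptim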

For Debian/Ubuntu based GNU/Linux:
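$ sudo apt-get install jpegoptim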

For CentOS/RHEL/Fedora based GNU/Linux

(on CentOS/RHEL, please enable EPEL repo first)
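$ sudo yum install jpegoptim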

How to use?

By default, jpegoptim uses ‘lossless’ mode to re-compress JPEG images, which means the optimized images will have the same quality as before. To make the images even smaller, you can use -m<quality> or --max=<quality> to enable lossy optimization by setting a quality level; the valid range is 0 – 100. For example, set quality = 90:
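$ jpegoptim -m90 example.jpg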

Note that files that already have a lower quality setting than the given maximum will be compressed using the lossless optimization method.

You can also use find to help you compress all the jpeg images:
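$ find path -name "*.jpg" -exec jpegoptim {} \;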

Using this picture on Wikipedia as an example: https://wikipedia.org/wiki/File:JPEG_example_donkey_100.jpg

[Screenshot: jpegoptim-example]
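A guess at the command used here (quality 80 is an assumption inferred from the re-compressed file name below):

$ jpegoptim -m80 JPEG_example_donkey_100.jpg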

Before the compression, 36239 bytes (about 36 KB):

[Image: JPEG_example_donkey_100]
After the compression, 10145 bytes (about 10 KB):

[Image: JPEG_example_donkey_80]

Can you recognize which one has better/lower quality? :D

The easiest command to compress a PDF file

We use pdftk here; pdftk stands for PDF Toolkit and is a cross-platform tool for manipulating PDF files.

How to install pdftk?
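On Debian/Ubuntu based systems, for example:

$ sudo apt-get install pdftk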

How to compress PDF via pdftk?
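For example (input.pdf and output.pdf are placeholder filenames):

$ pdftk input.pdf output output.pdf compress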

You can also decompress by replacing the ‘compress’ keyword with ‘uncompress’:
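$ pdftk input.pdf output output.pdf uncompress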

That’s all. Note that the compression usually works on files containing images only; it may not help with files that contain only text and simple shapes.

Use ‘optipng’ to optimize/re-compress your png images losslessly!

OptiPNG is a useful tool to compress png images without losing quality. It really helps reduce bandwidth, disk space, and the loading/response time of a website. I used it to re-compress all the png images on cdnjs and successfully made 206857 images smaller (see cdnjs/cdnjs@e936c87e9044fd2b123).

It’s easy to use. You can install it via apt-get, or download and build it from source:
$ sudo apt-get install optipng

Usage:
$ optipng example.png

The default optimization level is 2, and the range is 0~7 (this may depend on the version you are using). I always use the highest and slowest level:
$ optipng -o7 example.png

Find all the png images to compress:
$ find path -name "*.png" -exec optipng -o7 {} \;

In fact, optipng can also convert BMP, GIF, PNM, and TIFF images to optimized png, and it performs PNG integrity checks and corrections as well, very nice.
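For example, feeding it a BMP file (example.bmp is a placeholder filename) will produce an optimized example.png:

$ optipng -o7 example.bmp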

Use multiple threads to compress file(s) when tar-ing something

On Unix systems, tar is a widely used tool to package and compress files. tar often spends a lot of time on compression, because the compression programs it calls by default don’t support multi-threading. However, tar lets us specify which program to use for compression, which means we can choose programs that do support multi-threading and compress files much faster!

From the manual:

-I, --use-compress-program PROG
filter through PROG (must accept -d)

3 Tools for parallel compression I will use today:

  • gz:   pigz
  • bz2: pbzip2
  • xz:   pxz

(Can be easily installed via apt-get in Debian/Ubuntu based linux)
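For example (package names here are assumed to match the tool names, and availability may vary between releases):

$ sudo apt-get install pigz pbzip2 pxz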

Original commands to tar with compression:

  • gz:   tar -czf tarball.tgz files
  • bz2: tar -cjf tarball.tbz files
  • xz:   tar -cJf tarball.txz files

Parallel version:

  • gz:   tar -I pigz -cf tarball.tgz files
  • bz2: tar -I pbzip2 -cf tarball.tbz files
  • xz:   tar -I pxz -cf tarball.txz files

I used the Linux kernel v3.18.6 source as a sample and threw the whole directory onto a ramdisk before compressing, to compare the difference!
(PS: CPU is Intel(R) Xeon(R) CPU E3-1220 V2 @ 3.10GHz, 4 cores, 4 threads, 16GB ram)

Result comparison:

[Screenshot: tarCompressComparison1]

Time spent:
                    gzip         bzip2        xz
Single-thread       17.466s      50.004s      3m54.735s
Multi-thread        4.623s       13.818s      1m10.181s
How much faster?    3.78x        3.62x        3.34x

Because I didn’t specify any parameters and just let each tool decide its default compression level, the space they used may be a little bit different, but we can still add parameter(s) like this:

[Screenshot: tarCompressComparison2]
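A sketch of such a command for the xz case (recent GNU tar accepts the compressor’s options inside -I; if yours doesn’t, a small wrapper script works too):

$ tar -I 'pxz -9' -cf tarball.txz files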

With the -9 parameter to increase the compression level, the result becomes 81020940 bytes instead of 84479960 bytes, so we save another 3.3 megabytes! (It also took about 40 more seconds.)

Very useful for me!