Zopfli Optimization: Literally Free Bandwidth

Btw, I’ve tested this image with optipng (available in the Debian repo and in OS X Homebrew) with no additional command-line options, and it produces a .png of 606,141 bytes, which is smaller than the pngout version.

Seconded for optipng; same result on win32.

I can confirm @mattduck’s and @marqin’s results on the image in the blog post; optipng has clearly upped their game!

OptiPNG version 0.7.5
Copyright (C) 2001-2014 Cosmin Truta and the Contributing Authors.

This program is open-source software. See LICENSE for more details.

Portions of this software are based in part on the work of:
  Jean-loup Gailly and Mark Adler (zlib)
  Glenn Randers-Pehrson and the PNG Development Group (libpng)
  Miyasaka Masaru (BMP support)
  David Koblas (GIF support)

Using libpng version 1.6.10-optipng and zlib version 1.2.8-optipng

I got 606,273 with defaults, so technically not exactly the same, but very close. Still not as good as zopfli at 585,117, even when using optipng -o7, but an impressive improvement.
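
For reference, here’s a sketch of the commands behind those numbers, assuming optipng and zopflipng are on your PATH (the filenames are hypothetical):

optipng image.png                      # defaults, optimizes the file in place
optipng -o7 image.png                  # maximum optimization level, much slower
zopflipng image.png image-zopfli.png   # zopfli recompression, slower still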

I enjoyed this post; some people in the comments above mentioned using ImageOptim. People at my work enjoy using it to compress images, and it’s very easy to use to losslessly minify whole folders of images. The source is on GitHub too; the worker files are interesting to compare: https://github.com/pornel/ImageOptim/tree/master/imageoptim/Workers

After reading your article I got zopfli-cryopng here:

http://css-ig.net/tools/public/

After many hours of tweaking, I was able to produce the ultimate optimized, small-size PNG files… and then I tried WebP - in seconds I had a lossless WebP file smaller than the best PNG I could produce with any of the best PNG optimization tools. So, I think I’ll be using WebP for all of my lossless needs from now on.

For lossy images, JPEG does a pretty good job. To beat JPEG, I’ve been testing BPG and WebP. So far, it appears BPG is capable of producing smaller sizes, but it takes a lot of tweaking to get it optimized to match or beat the quality of JPEG at an equal or smaller size. BPG reminds me of all the tweaking I had to do for PNG, but BPG does indeed produce better quality at smaller sizes compared to JPEG. Compared to WebP, BPG still has a slight edge, but once again, WebP is much easier to use. Hours of tweaking and detailed image comparisons with BPG are unnecessary with WebP. Still, BPG was able to beat WebP in my tests, but I think I liked the simplicity of WebP much more.

The area where BPG has no rival is in animated images. Making a BPG animation is almost as easy as making an animated GIF, but the small amount of extra effort in making a BPG gives AMAZING reductions in file sizes and UNBELIEVABLY HUGE increases in quality compared to GIF. Of course, that should be no surprise, since GIF is an obsolete dinosaur, and has been for decades. What I’m trying to do now is compare BPG animations with MP4 and WebM video. Producing frame-by-frame optimized video is a bit harder in MP4 and WebM (with or without audio) than in GIF and BPG (no audio), but my preliminary results show those thoroughly trounce GIF too. If I stopped here, I could replace low-resolution, low-framerate, silent animated GIFs with high-resolution, high-framerate MP4 and WebM at the same file size.

WebM probably beats MP4, but I will need to do more testing to see if WebM beats BPG. I’m guessing the browser support for WebM could make it my first choice, and of course it will pair nicely with WebP. With WebM and lossless WebP already appearing to serve most needs, I’m basically figuring out if BPG beats those, and/or if any of them beat JPEG. At high lossy quality levels, JPEG is still pretty competitive with lossy WebP and BPG. WebP’s edge over JPEG is small, and so far it hasn’t motivated the world to abandon JPEG in favor of WebP. BPG’s edge over JPEG is larger, but once again, it might not be enough at high quality levels to motivate abandoning JPEG. With low lossy quality levels, both WebP and BPG beat JPEG, but if we want better quality, we can still get it with JPEG without needing any new formats.

In the end, the only thing I’m confident of is that WebP beats PNG, and it does it with a lot less effort. I only did a few photographic tests, so maybe there are circumstances where PNG beats WebP, but then I think the convenience of WebP would probably make me say “meh, I don’t care, I’m using WebP anyway, because it’s easier”.

You can find out more about me here:

https://www.mediawiki.org/wiki/User:Badon


So I have this for future reference: the command to optimize all PNG files at the Windows command line:

for %i in (*.png) do zopflipng -y "%i" "%i"
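
A rough bash equivalent for Linux or OS X, assuming zopflipng is on your PATH, would be:

for f in *.png; do zopflipng -y "$f" "$f"; done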

Really great post. There are a lot of ASCII characters in that square image; I’d also like to see how other image formats stack up against PNG.

One compression format that is even better than Zopfli/Brotli/WebP is FLIF (Free Lossless Image Format, see http://www.flif.info).
E.g. the webcomic image is reduced (losslessly) to 320k. And what I find far more sophisticated: you can make the file as small as you want. If you “cut” the file and just use the “start” of it, you get a progressive lossy encoding.
It also performs much more homogeneously across different kinds of image corpora (e.g. medical, geographical, photos, sketches, …), as opposed to many other algorithms, which only shine with one kind of image and perform not so well on the others.
Only drawback: it has not yet arrived in any product, so it’s not usable out of the box.
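
For the curious, here’s a minimal sketch of that truncation trick with the reference flif tool, assuming it handles truncated input as described above (the 50,000-byte cutoff is arbitrary):

flif input.png image.flif                  # lossless encode
head -c 50000 image.flif > preview.flif    # keep only the start of the file
flif preview.flif preview.png              # decodes to a lossy preview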

Obviously GIF is a poorly suited format for most real-world images, but for the avatar example it is perfect! Converting the example avatar to a 4-color GIF yields a size reduction of 79%. These avatars are mainly solid backgrounds, which compress almost perfectly as GIFs.
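
A quick way to try that yourself, sketched with ImageMagick (the filename is hypothetical):

convert avatar.png -colors 4 avatar.gif

The -colors option quantizes the image down to a 4-entry palette, which is where nearly all of the savings come from on these flat-color avatars.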

Hi Jeff, I’m worried by this part:

Yes, it is technically possible to produce strange “lossy” PNG images, but I think that’s counter to the spirit of PNG which is designed for lossless images. If you want lossy images, go with JPG or another lossy format.

You’re getting hung up on categories that aren’t the best categories to get hung up on. To not use lossy optimizations on PNGs is a huge waste of bandwidth, since we know that lossy – but visually lossless – optimizations can drastically reduce the size of PNG images. To not do that because it’s “counter to the spirit of PNG” is to basically light your money on fire (and waste energy, increase page load time, etc.) Image formats have no spirit, just bytes.

PNG is designed for graphics. You wouldn’t want to use JPEG or JPEG 2000 for graphics just to adhere to the spirit of different formats. When you’re dealing with images like your cartoon example, PNG by default is going to waste a lot of space describing fairly simple patterns of shape and color. For that kind of image you can use a good lossy optimizer like pngquant, or what they do at TinyPNG.com, and reduce the file size by more than 50% with no visual difference.
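
As an example, a minimal pngquant invocation might look like this (the quality range and filenames are just illustrative):

pngquant --quality=65-80 --output cartoon-lossy.png cartoon.png

The lossy step is palette quantization: pngquant reduces the image to an optimized 8-bit palette, and the output is still a perfectly valid PNG that any browser can display.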

Here’s the proof, uploaded to Imgur.

Can you tell the difference between yours and this 163 KB version? (It was produced by TinyPNG.com)

Note that these kinds of savings are common in lossy PNG optimization of crude images – by crude I mean cartoons, graphics, and so forth, where the math of PNG is extremely inefficient in describing the visual manifestation of the image. WebP gets similar results, and the new “near-lossless” mode of WebP is pretty good too.
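
The near-lossless mode is a single flag in recent cwebp builds; as a sketch, with the level (0–100, lower means more loss) being something you’d tune per image:

cwebp -near_lossless 60 cartoon.png -o cartoon.webp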

WebP does look good; I just ran a quick test on this image (the book one) from “PNG vs. WebP Image Formats” by Andrew Munsell:

30,640 original-cover.png
16,309 original-cover-zopfli.png
8,838 original-cover.webp

I generated a lossless webp using

cwebp -lossless original-cover.png -o original-cover.webp

WebP support is still… largely not there though. Zero support for WebP in Firefox, Edge, or Safari! So errrr good luck with that?

Lossless images are pretty rare on the internet, so the lack of WebP support in many browsers is both unsurprising, and not much of a hindrance for the offline applications that most often need lossless images. That said, if a website needed to use lossless images - like perhaps a pixel-peeping photography site - the bandwidth savings are so huge with WebP, it’s worthwhile to revert to “Best viewed in Google Chrome”, and/or use a WebP JavaScript shim for people in other browsers.

Practically speaking, I think web browser support is much less important than image application support. I personally use XnView, FastStone Image Viewer, PicPick, Pixlr, and a few others, but NONE of them natively support WebP. I can use a plugin to get WebP support in XnView, which I use to manage my camera photos, but the only lossless files I normally create and use are the original RAW files from the camera. I haven’t considered - and won’t consider - replacing the original RAW files with WebP files, so even in my own offline use cases, I haven’t needed WebP.

The one case where I did want to use WebP files was for screenshots for the Coin Compendium (CC) project, but WebP has dimension limitations (it maxes out at 16,383 pixels per side) that prevent it from capturing an entire scrolling webpage. So, I use 256-color PNG, or 70% progressive JPEG, whichever is smaller. We also occasionally use MAFF archives to preserve a page, and sometimes we preserve the original images on the webpage in whatever format they’re in.

I think the #1 problem with WebP adoption is Google’s push to have it replace JPEG or other common formats, when in reality, the only place where it is clearly a top winner by a large margin is in lossless images compared to PNG. Evangelize that asset alone, and it might be possible to win mind-share. As it is, the first thing people do is compare WebP to JPEG, and then dismiss WebP as yet another over-hyped new technology that will eventually be forgotten.

For myself, I like to use the right tool for the job, and WebP is incredibly close to being the right tool for some jobs. No matter how small the niche, a niche is a niche, and excelling in that niche needs to happen FIRST before WebP can gain wider adoption. I would love to use WebP for the preservation of large-dimension screenshots, if only it could handle the large dimensions.


One of the techniques I used for my site seemed a bit counterintuitive but yielded satisfactory results, especially for more photographic images.

Let’s say my target resolution was 300 pixels wide. Given a source image, I would save it as 600 pixels wide, keeping the aspect ratio for the height, but save it at 0% quality in Photoshop or 50% quality with mozjpeg or cjpeg. Then I let the browser rescale it for me.
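
As a concrete sketch of that workflow using ImageMagick plus mozjpeg’s cjpeg (the filenames are hypothetical; 600px is 2x the 300px target above):

convert source.png -resize 600x source-600.ppm     # resize to 2x the display width
cjpeg -quality 50 source-600.ppm > photo-600.jpg   # heavy JPEG compression

Then serve it scaled down, e.g. <img src="photo-600.jpg" width="300">, so the browser’s downscaling hides the artifacts.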

The JPEG compression artifacts, I found, were not visible to the naked eye unless I really squinted hard. The bonus effect is that you get a nicer picture on 2x retina displays.

This technique significantly reduced the size of my portfolio site which had some large background images to demonstrate parallax.

Ironically, it also saves a few more bytes compared to PNG in some of the SVG cases.

I was surprised at how unimpressive zopfli was on those admittedly small, simple images, so I grabbed one of your images off of Discourse. (It looked similar to the default letter avatars you were testing and was from some account.)
https://avatars.discourse.org/v3/letter/j/f6c823/45.png (J)
After 5 single-threaded seconds of 500 iterations on my old i5 here, zopfli spat out:
Input size: 572 (0K)
Result size: 548 (0K). Percentage of original: 95.804%

And for this one:
https://discourse-cdn.codinghorror.com/letter_avatar_proxy/v2/letter/k/d26b3c/32.png (K)
2 seconds and 500 iterations later…
Input size: 691 (0K)
Result size: 674 (0K). Percentage of original: 97.540%

Now, it’s still just a few bytes, since the input images were not large, but it’s a bit closer to the percentages you mentioned in the article.
BTW, cranking up the number of iterations didn’t gain anything further. Usually for small images, 100 or 500 iterations is pretty harmless. If you have a ton of ’em, all the more reason to spread the work out over all the cores.
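
As a sketch, spreading it over the cores with GNU xargs might look like this (assumes GNU coreutils/findutils and zopflipng on your PATH):

find . -name '*.png' -print0 | xargs -0 -P "$(nproc)" -I{} zopflipng --iterations=500 -y {} {}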