[Discussion] Zopfli KrzYmod and PNGWolf/PNGWolfZopfli #21
I didn't write that part of the code; someone else provided it. Personally, I prefer running the genetic algorithm after optimizing the PNG file with other tools. From what I understand, it's mostly about that filter-per-scanline optimization. The problem may already arise from zopflipng and pngwolf most likely having slightly different zlib setups for compression during the initial optimization stage. Basically, if you don't use zopfli on EVERY try, you may still end up with a result that might have been smaller if a previous (or any other) genetic-algorithm iteration had been compressed as the final output. On the other hand, using zopfli during every optimization pass will most likely keep your PC working for months, if not years, on one bigger file. I also provided closed-source software named krzydefc that can extract the IDAT stream into a ZIP file for separate optimization with other software (like KZIP, 7-Zip, etc.), then pack it back into a PNG file (the original file is required for the header!).
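The IDAT extraction described above can be sketched roughly as follows. This is a toy illustration of the general idea, not krzydefc's actual implementation (which is closed source); it only relies on the chunk layout defined in the PNG specification:

```python
import struct
import zlib

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def iter_chunks(data):
    """Yield (chunk type, payload) pairs from a PNG byte string."""
    assert data[:8] == PNG_SIGNATURE, "not a PNG file"
    pos = 8
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        payload = data[pos + 8:pos + 8 + length]
        pos += 12 + length  # 4 (length) + 4 (type) + payload + 4 (CRC)
        yield ctype, payload

def extract_idat(data):
    """Concatenate every IDAT payload: the PNG's complete zlib stream."""
    return b"".join(p for t, p in iter_chunks(data) if t == b"IDAT")
```

Splicing the recompressed stream back in means rebuilding the IDAT chunks (length, type, payload, CRC-32 over type + payload) and reusing every other chunk from the original file — which is presumably why the original file is required.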
By "part of the code", do you mean pngwolf or the genetic algorithm in Zopfli KrzYmod? Sorry, I'm no expert in compression — I barely follow what's happening and am mostly an end user — but I did notice the genetic part in KrzYmod :P I was mostly referring to the part from here:
So the gains from combining pngwolf and zopfli that were mentioned on encode.su come from them using zopfli at different moments, or am I missing something? Is it even possible to use zopfli on every try, as you describe, disregarding the time? I'm aware that bigger files take very long; that's why I chose the smallest file I had for testing, though it still took a whole night to run pngwolf on it (the file size remained the same, though).
Sure (: The gains come from a variety of things. A few examples:
Basically, you would need to know the full mathematical formula that could predict what to use, when to use it, and which parameters to pass in order to get a result you deem optimal and satisfying. So, all in all, as much as automation is great, this fork is mostly about even more trial and error, until you realize you should just give up and settle for something smaller for now (until you have more patience or time to experiment), or use some "defaults" and later re-optimize in hopes of getting something smaller by re-running the genetic algorithm and/or re-deflating. :P And finally, this is just my outlook on things; I may be right or wrong, as I don't consider myself an expert, though I have done some experimenting with this in the past. So it's better to review with others too. :)
Thanks for the elaborate reply! I finally managed to get my head around this. So basically, it's really hard, if not impossible, to pick the right options, as there are too many variables, most of which are exposed here for use? Anyway, back to the core of my question, as I don't want to keep this issue open needlessly: does all of this have anything to do with piping the file through PNGWolf magically shaving a byte or so every now and then? Is there any reason to use pngwolf-zopfli in addition to this tool when it comes to purely bytes saved? If so, which part exactly? P.S. You got me curious to take a deeper look at optimizing files further — I'll try the --pass option with the files I already optimized to see if it changes anything; how small must a file be to fit in one block?
Since discussions aren't available here yet, I'm opening this as an issue — sorry about that!
Anyway, I've been reading discussions on https://encode.su/threads/2176-Zopfli-amp-ZopfliPNG-KrzYmod and saw PNGWolf being used in tandem with Zopfli. I'm currently using rather extreme optimization:
--iterations=1000000 --mui=10000 --all --lossy_transparent --lossy_8bit --alpha_cleaners=bhvapw
and skipping --mui for the smallest files. I only want the smallest file size, so I don't care about it taking extra time, but even after reading the help file and the discussion, I don't quite get how pngwolf works — can I use pngwolf on a file that's already been zopfli'ed with your tool?
I tried optimizing a really small (13 × 27) file from https://pzwiki.net/wiki/File:Lighting_indoor_01_0.png — I initially optimized it with
which got me to 474 bytes, and then used
from pngwolf-zopfli (https://aur.archlinux.org/packages/pngwolf-zopfli-git/), but it takes a really long time, especially since it's not multithreaded, and I'm wondering if there's any point to that.
I noticed the message about keeping filters, but I don't quite get it. I also noticed it's for bzip — does it work for .png files too?
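For what it's worth, the filter search discussed in this thread can be pictured as a genetic algorithm over per-scanline filter types. The sketch below is a toy illustration of that idea, not pngwolf's or KrzYmod's actual code: it supports only PNG filters 0–2 (None, Sub, Up) and scores candidates with zlib at level 9 rather than Zopfli:

```python
import random
import zlib

def filter_scanline(ftype, line, prev):
    """Apply PNG filter type 0 (None), 1 (Sub) or 2 (Up) to one scanline."""
    if ftype == 0:
        return bytes(line)
    if ftype == 1:  # Sub: each byte minus its left neighbour
        return bytes((line[i] - (line[i - 1] if i else 0)) & 0xFF
                     for i in range(len(line)))
    if ftype == 2:  # Up: each byte minus the byte directly above it
        return bytes((line[i] - prev[i]) & 0xFF for i in range(len(line)))
    raise ValueError("unsupported filter type: %d" % ftype)

def deflated_size(image, strategy):
    """Compressed size of the filtered image for one per-scanline strategy."""
    out = bytearray()
    prev = bytes(len(image[0]))
    for ftype, line in zip(strategy, image):
        out.append(ftype)  # the filter byte precedes each scanline
        out += filter_scanline(ftype, line, prev)
        prev = line
    return len(zlib.compress(bytes(out), 9))

def evolve(image, generations=50, pop=8, seed=0):
    """Evolve per-scanline filter choices toward a smaller deflate stream."""
    rng = random.Random(seed)
    height = len(image)
    population = [[rng.randrange(3) for _ in range(height)] for _ in range(pop)]
    for _ in range(generations):
        population.sort(key=lambda s: deflated_size(image, s))
        parents = population[:pop // 2]  # elitism: keep the best half
        children = []
        for p in parents:
            child = p[:]
            child[rng.randrange(height)] = rng.randrange(3)  # point mutation
            children.append(child)
        population = parents + children
    best = min(population, key=lambda s: deflated_size(image, s))
    return best, deflated_size(image, best)
```

The scoring step is exactly the trade-off raised earlier in the thread: scoring every candidate with Zopfli instead of plain zlib would rank candidates by their true final size, but would take vastly longer.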