basketball hololive natsuiro_matsuri seifuku sweater thomas_8000 uncompressed_file

How does the image differ so much in size between the PNG and the JPG? :X
Marona762 said:
How does the image differ so much in size between the PNG and the JPG? :X
This PNG has extremely poor compression. Properly compressed, the PNG is only 6.5 MB.
I think this site would benefit by saving bandwidth and storage space if it automatically recompressed PNG images to their maximum possible compression.
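Roughly something like this, if the backend had Python's Pillow available (just a sketch; the file names are placeholders):

```python
from PIL import Image

def recompress_png(src: str, dst: str) -> None:
    """Re-save a PNG as small as Pillow's zlib settings allow."""
    with Image.open(src) as img:
        # optimize=True makes the encoder search for smaller settings
        # and implies the maximum zlib compression level (9)
        img.save(dst, format="PNG", optimize=True)

recompress_png("original.png", "recompressed.png")
```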
Better to ask the poster to compress them properly before uploading. Or, even better, ask artists to stop uploading uncompressed PNGs in the hope of getting better quality out of lossless compression (which PNG already does pretty well), and to stop creating 10K-sized images to pack in details nobody will notice or care about for more than 4 seconds. Pixiv is at fault too, of course.
>I think this site would benefit by saving bandwidth and storage space
sounds like more work for mods, better not.
sounds like more work for mods, better not.
I'm no expert, but I think that can be automated server-side without any manual intervention
>That can be automated server-side without any manual intervention
Image files usually don't store their compression level, so you have to recompress a file to know whether it can be compressed further. I can recompress the 41 tagged images with GraphicsMagick and send them to somebody to repost, no problem, but there are some untagged ones.
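The batch job is nothing fancy; roughly this, driven from Python (a sketch, assuming the gm binary is on PATH; the folder name is a placeholder):

```python
import subprocess
from pathlib import Path

def recompress_all(folder: str) -> None:
    """Recompress every PNG in a folder with GraphicsMagick."""
    for src in Path(folder).glob("*.png"):
        dst = src.with_suffix(".recompressed.png")
        # For PNG output, gm's -quality flag encodes the zlib level
        # (tens digit) and the filter heuristic (ones digit),
        # so 95 = level 9 with adaptive filtering.
        subprocess.run(
            ["gm", "convert", str(src), "-quality", "95", str(dst)],
            check=True,
        )

recompress_all("tagged_uploads")
```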
fatmangoth said:
Image files usually don't store their compression level, so you have to recompress a file to know whether it can be compressed further.
That's a fair point, but let's say you set up a rule so that only pictures whose file size relative to their resolution exceeds a certain threshold get reprocessed.

For example, this picture is 1980x3078, which means it has 6,094,440 pixels, and it weighs 33,093,090 bytes; that works out to 5.43 bytes per pixel. If I recompress this image as a PNG using maximum compression (not sure if it's the actual maximum, but whatever, it's just an example), it goes down to 6,817,141 bytes, which is about 1.12 bytes per pixel. So, as a quick reference formula, you could set it so that any image exceeding 2 bytes per pixel gets reprocessed. (Obviously this isn't a fail-safe method: depending on the noise, one PNG can be much bigger than another PNG of the same resolution and compression, but it's better than nothing.)
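As a sketch of that rule in Python (Pillow is used only to read the dimensions; the 2 bytes-per-pixel cutoff is the one proposed above):

```python
import os
from PIL import Image

BYTES_PER_PIXEL_LIMIT = 2.0  # the threshold proposed above

def needs_reprocessing(path: str) -> bool:
    """Flag images whose file size per pixel exceeds the threshold."""
    with Image.open(path) as img:
        width, height = img.size
    return os.path.getsize(path) / (width * height) > BYTES_PER_PIXEL_LIMIT

# This post's PNG: 33,093,090 bytes / (1980 * 3078) pixels ~= 5.43,
# well above the limit, so it would be queued for recompression.
```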
No, if something like that worked it would be implemented. A simpler solution would be to discard alpha and recompress all incoming images to PNG, but that would take even more space.
With the amount of garbage coming in, and no strict rules about what to store and what not to store without somebody getting butthurt, it's quite pointless.
If you want to clear the "uncompressed file" tag, I can send you a zip with properly compressed images this weekend, and you can repost them all, if the mods don't mind. But there are more uncompressed PNGs, and even more image files with an alpha channel, like post #918807.
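Finding those alpha-channel files is the easy part, by the way; a Pillow sketch ("discarding alpha" here just means flattening to plain RGB, as mentioned above):

```python
from PIL import Image

def has_alpha(path: str) -> bool:
    """True if the image has an alpha channel or palette transparency."""
    with Image.open(path) as img:
        return img.mode in ("RGBA", "LA") or "transparency" in img.info

def drop_alpha(src: str, dst: str) -> None:
    """Discard the alpha channel by converting to plain RGB."""
    with Image.open(src) as img:
        img.convert("RGB").save(dst, format="PNG", optimize=True)
```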
fatmangoth said:
No, if something like that worked it would be implemented. A simpler solution would be to discard alpha and recompress all incoming images to PNG, but that would take even more space.
If the image doesn't have a PNG version, there's no point converting it to PNG; it's so obvious I didn't feel it was necessary to say...

fatmangoth said:
No, if something like that worked it would be implemented.
So your argument is that it doesn't work because it's not implemented... okay, so I guess nothing that gets proposed can work, since things only get proposed when they haven't been implemented yet.
My argument is that you're proposing more computation work, and probably fault-prone work at that. So, are you up for reposting or not?
fatmangoth said:
My argument is that you're proposing more computation work, and probably fault-prone work at that. So, are you up for reposting or not?
- I am not gonna work for free
- The whole "Reposting" can be fully automated with a proper script
- Saying that this method will require more computation work is an interesting argument, because it completely ignores the savings in terms of bandwidth and storage space.

What do you prefer:

1) Doing no computation work but serving a 30 MB file to hundreds if not thousands of users (= bandwidth)
2) Doing one computation pass per picture and serving a 6 MB file to hundreds if not thousands of users (= bandwidth)
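To put rough numbers on the difference (the download count is made up; the sizes are this post's actual before/after figures from earlier):

```python
downloads = 1_000         # hypothetical download count
original_mb = 33.1        # this post's PNG (33,093,090 bytes)
recompressed_mb = 6.8     # after maximum compression (6,817,141 bytes)

saved_gb = (original_mb - recompressed_mb) * downloads / 1000
print(f"One recompression pass saves ~{saved_gb:.1f} GB of bandwidth")
# -> One recompression pass saves ~26.3 GB of bandwidth
```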

Your argument is like saying it's not worth travelling 100 miles by car instead of walking because getting into the car and driving takes effort.
>post #612553
>automated with a proper script
>I am not gonna work for free
smh
fatmangoth said:
>post #612553
>automated with a proper script
>I am not gonna work for free
smh
I downloaded that image and saved it as a PNG with 0 compression, and now it's 169 MB.

Imagine if I had uploaded it in its uncompressed form: this site would have spent 80 MB of extra bandwidth, over the last two years, on every user who downloaded that image. And that's because this site doesn't have a reconversion script. You're welcome, by the way.
smh
Or instead of using legacy formats like PNG, you could use WebP or whatever...
This could certainly be done for all PNGs server-side, if wanted, but that would obviously cost processing power.
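For example, with Pillow's libwebp bindings (a sketch; whether this site's stack can run it is another question):

```python
from PIL import Image

def png_to_lossless_webp(src: str, dst: str) -> None:
    """Convert a PNG to lossless WebP; pixels stay bit-exact."""
    with Image.open(src) as img:
        # method=6 is libwebp's slowest but most thorough setting
        img.save(dst, format="WEBP", lossless=True, method=6)
```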
vendiu said:
Or instead of using legacy formats like PNG, you could use WebP or whatever...
This could certainly be done for all PNGs server-side, if wanted, but that would obviously cost processing power.
That's pretty much what I said in the post the other user linked:
post #612553

In the last comment on that post (almost two years ago), I did suggest switching to WebP, since it's both 100% free and supported by ALL modern browsers.