Multithreaded PNG save tool?
I just made my first serious scan attempt (a DIN A3 image at 1200 dpi native scan resolution).

The images were huge.
PS was easily pushed beyond 7 GB of RAM usage while trying Photomerge.
PS even gave an error (PSD file beyond 2 GB, but it still saved it at ~2.8 GB ;))

Well, as expected, PS CS4 with 8 GB of RAM is fun to work with. HOWEVER, when it comes to importing or EXPORTING, it's just painfully slow, since it doesn't use more than 25% of the CPU, i.e. one core...

So are there any other programs out there that can process one image (be it BMP, PNG or JPG) and compress it to PNG using all 4 of my cores?
Most apps are written for 32-bit architectures, so I doubt that any software will be using more than 1 core unless somebody has moved over to 64-bit / has 64-bit (i.e. dual-core) support.

And since most apps that come on CD/DVD usually ship with only one build (i.e. 32-bit), I doubt they'll add 64-bit support (meaning support for more than 1 core) unless they can ship both in the same package. They may be backward compatible, but that's it; it won't use the other core(s). Unless I'm wrong here?
What does 32- vs 64-bit have to do with multicore support?

I've been running multithreaded programs on 32-bit XP for ages...
(64-bit has nothing to do with multithreading.)

That's not why PS's PNG export is slow. Anyhow, just export to TIFF and use something else to convert to PNG; XnView is about 6x faster, and most other tools should be comparable.
Guess I'm just stupid then :/
MDGeist... something must be wrong, because I photomerged 12x 600 dpi scans with PS and that didn't use more than 1.8 GB of RAM (my PC has 2 GB). Maybe you have some weird setting on? Besides, 1200 dpi is just a waste of space IMO; 600 is more than enough to filter with a good result.
Will try XnView; hope it works on Vista 64.

On another note: lol, the PNG came out 44x MB... (while the LZW-compressed TIFF was around 550.)
syaoran-kun said:
MDGeist... something must be wrong, because I photomerged 12x 600 dpi scans with PS and that didn't use more than 1.8 GB of RAM (my PC has 2 GB). Maybe you have some weird setting on? Besides, 1200 dpi is just a waste of space IMO; 600 is more than enough to filter with a good result.
Size is what matters here... it was 4 fully scanned DIN A4 pages @ 1200 dpi.
The end image had something like 500 megapixels... ;)
PS does eat a lot of RAM if you have enough of it!
It's also a bad idea to use PNGs as merge input, since each image is blown up to roughly 2x its size in RAM.

Also, how can you give PS more than 2 GB if you only have 2? ;p
I have 8 and can give it 8... (well, actually no, since ~1.5 GB is used by the system and other stuff; giving it more than 7 can crash my system)

edit:
Tried Paint.NET, the "multithreaded" image tool... it sucks at exporting too...
Looks as if PNG for unprocessed 1200 dpi scans is a big NO-GO.
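For scale, the multi-GB RAM figures in this post are about what you'd expect. A back-of-the-envelope estimate (assuming plain 8-bit RGB with no alpha and a single layer, which is an assumption on my part):

```python
# Rough memory estimate for one uncompressed 500-megapixel RGB image.
# Assumes 8 bits per channel, 3 channels (no alpha), a single layer.
megapixels = 500
bytes_per_pixel = 3                    # 8-bit R, G, B
raw_bytes = megapixels * 1_000_000 * bytes_per_pixel
raw_gib = raw_bytes / 2**30
print(f"~{raw_gib:.1f} GiB for one uncompressed copy")
# An editor typically holds several working copies at once (undo
# history, compositing buffers), so multi-GiB usage is normal here.
```

At 16 bits per channel, or with extra layers, that figure multiplies accordingly, which is consistent with PS climbing past 7 GB during a merge.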

Speed- and size-wise it's just a nightmare with 500+ megapixel images.
It's way faster to save as BMP and RAR it, or use TIFF with light compression.
It's better to use BMP/TIFF/PSD for raws; PNG is terribly slow not only at saving but also at loading -_-
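The "light compression is much faster" point is easy to demonstrate with zlib, the same DEFLATE algorithm PNG uses internally. A minimal sketch with synthetic data (the data itself is made up; real scans will compress differently):

```python
import random
import time
import zlib

# ~2 MB of semi-compressible synthetic data: a 16-symbol alphabet,
# loosely imitating the low-entropy flat areas of a scan.
random.seed(0)
data = bytes(random.choices(range(16), k=2_000_000))

t0 = time.perf_counter()
fast = zlib.compress(data, level=1)    # "light" compression
t_fast = time.perf_counter() - t0

t0 = time.perf_counter()
best = zlib.compress(data, level=9)    # maximum compression
t_best = time.perf_counter() - t0

print(f"level 1: {len(fast):>9} bytes in {t_fast:.3f}s")
print(f"level 9: {len(best):>9} bytes in {t_best:.3f}s")
```

The higher level buys some extra ratio at a disproportionate time cost, which is exactly the tradeoff between TIFF+light compression and a maxed-out PNG save.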
I only use the PNG format for files I'm going to upload.

BTW, XnView is a good tool for batch conversion:
resampling, Gaussian blur, auto/manual levels, normalize, saturation, gamma, removing the alpha channel, etc.
And of course PNG compression ;)
I use IrfanView for that.

I wonder if one could make the PNG algorithm use 4+ cores, though.
PNG itself doesn't parallelize. The pngcrush operation might. It's much easier and more effective to just process more than one file at a time. Any serious batch processor should do this.
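A sketch of that "more than one file at a time" idea, using zlib as a hypothetical stand-in for a real PNG encoder (zlib releases the GIL while compressing, so plain threads genuinely use multiple cores here; the "scans" are fake in-memory buffers rather than real files):

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

def compress_one(payload: bytes) -> bytes:
    # Stand-in for "encode one scan to PNG". zlib releases the GIL
    # during compression, so 4 workers can occupy 4 cores.
    return zlib.compress(payload, level=6)

# Fake "scans" -- in practice you'd read each file from disk.
scans = [bytes([i]) * 1_000_000 for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(compress_one, scans))

print([len(r) for r in results])
```

Each file is still encoded single-threaded; the parallelism comes entirely from running several encodes at once, which is why a good batch converter saturates all cores without the PNG format itself changing.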
petopeto said:
PNG itself doesn't parallelize.
Even though PNG naturally doesn't parallelize very well, it probably could be made to in some fashion. One idea would be to just do the compression on much smaller blocks of the image.

Whether doing this is a good idea is another story. I'd assume that since nobody has done it, the speed improvement may not be that great, or maybe there is some other downside to working with small blocks.
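The small-blocks idea can be sketched with plain zlib: split the raw pixel rows into horizontal strips, deflate each strip on its own core, then concatenate. To be clear, this is only the idea, not a PNG encoder; the output is NOT a valid PNG stream, the image dimensions are made up, and a real encoder would lose some compression ratio because filters and match windows can't span strip boundaries (roughly the approach pigz takes for gzip):

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

ROW_BYTES = 3 * 1024          # one RGB row of a hypothetical 1024-px-wide image
ROWS = 512
raw = bytes(ROWS * ROW_BYTES) # dummy pixel data (all zeros)

STRIP = 128                   # rows per independently compressed strip
strips = [raw[i * ROW_BYTES * STRIP:(i + 1) * ROW_BYTES * STRIP]
          for i in range(ROWS // STRIP)]

# Each strip deflates independently; zlib drops the GIL, so the
# four strips really do compress in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    compressed = list(pool.map(lambda s: zlib.compress(s, 9), strips))

# Decompressing and rejoining the strips recovers the original data.
restored = b"".join(zlib.decompress(c) for c in compressed)
```

The downside shows up exactly where this post guesses: each strip starts with an empty compression dictionary, so very small blocks cost ratio, which may be why nobody bothered for PNG.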