I need to both create and split multipage TIFF images, ranging from 2 to almost 100 pages (A4, 300 dpi, 2500×3500 px). The job is performed periodically by a script on an x64 Linux server. Currently I'm using ImageMagick. The smaller cases pose no problems, but the larger ones do.
I need to radically reduce the amount of memory used during the operation.
For example, this:
convert *.jpg -compress lzw output.tif
(70 JPEG files) consumes about 4.6 GB of RAM, even though each input is under 2 MB and the resulting file is under 250 MB.
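One partial mitigation, short of switching tools, is ImageMagick's documented resource limits (the `-limit` option, or the corresponding `MAGICK_*` environment variables), which cap the pixel cache's RAM use and spill the rest to disk. A sketch, assuming ImageMagick 6+; the specific limit values here are illustrative, not recommendations:

```shell
# Cap the pixel cache at 1 GiB of heap plus 2 GiB of memory-mapped files;
# anything beyond that spills to a disk cache (slower, but bounded RAM).
convert -limit memory 1GiB -limit map 2GiB *.jpg -compress lzw output.tif

# The same limits as environment variables, convenient in scripts:
export MAGICK_MEMORY_LIMIT=1GiB
export MAGICK_MAP_LIMIT=2GiB
convert *.jpg -compress lzw output.tif
```

This trades speed for memory: the whole image stack is still materialized, just partly on disk instead of in RAM.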
The reverse operation:
convert input.tif output-%04d.png
has similar issues.
From what I have read, this happens because ImageMagick first loads and decodes all the input images, and only then starts encoding them into the output file.
How can I create and split multipage TIFF images without such a huge memory footprint? I don't necessarily have to use ImageMagick; any other free tool would be fine.
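For comparison, libtiff's command-line tools (`tiffcp` and `tiffsplit`, packaged as libtiff-tools on Debian/Ubuntu) handle multipage TIFFs page by page rather than decoding the whole stack at once. A hedged sketch of a per-page pipeline; the `page-` prefix and loop structure are illustrative:

```shell
# Create: convert each JPEG to a single-page TIFF on its own, so only one
# decoded page is ever in memory, then join the pages with tiffcp.
for f in *.jpg; do
    convert "$f" -compress lzw "page-${f%.jpg}.tif"
done
tiffcp -c lzw page-*.tif output.tif

# Split: extract one single-page TIFF per page (tiffsplit names them
# page-aaa.tif, page-aab.tif, ...), then convert each to PNG individually.
tiffsplit input.tif page-
for f in page-*.tif; do
    convert "$f" "${f%.tif}.png"
done
```

Whether `tiffcp` preserves your exact compression and tag layout would need checking against your files, but peak memory stays near one page rather than the whole document.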
To put some perspective on it: EACH 2500×3500 pixel image will take up at least 2500×3500×3 bytes while it resides in memory. That is 26250000 bytes per image, 1837500000 bytes total for 70 images. Then you create a DUPLICATE of that in the TIFF, for a total of 3675000000. Then you ask for it to be saved with LZW compression; some buffer space is probably needed for that. Maybe add buffers for writing... Handling 70-100 page files isn't easy, especially if the pages are nothing but bitmaps. – Hannu – 2014-08-06T12:59:25.070
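The back-of-envelope numbers in the comment above can be checked directly (assuming 3 bytes per pixel for uncompressed RGB):

```shell
# Rough memory estimate for 70 decoded 2500x3500 RGB pages held at once.
per_image=$((2500 * 3500 * 3))   # bytes for one decoded page
total=$((per_image * 70))        # all 70 pages resident simultaneously
echo "per image: $per_image bytes, total: $total bytes"
```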
@Hannu Not easy for whom? The real world says that there's a concept of streaming transforms, and unpacking a huge image stack in memory simultaneously is lame and fugly. – polkovnikov.ph – 2016-03-25T00:13:12.757
The first example (convert) above creates a single PAGED tiff. Depending on how ImageMagick works internally, you MIGHT indeed have a "huge image stack in memory". – Hannu – 2016-03-28T08:52:20.370