
I have a command-line process that is run by my Ruby on Rails application, using the ImageMagick 'convert' command to convert a PDF into multiple PNGs. The problem is that when I run the command it takes all the memory on my VPS (512MB) and renders many other functions nearly useless (web server, SSH, etc.).

My convert command is this:

convert -density 288 ./document.pdf -resize 25% ./pages/page_%03d.png

I've tried using ulimit to limit the memory this process can consume to roughly 15% of total memory, which on my 512MB VPS is about 76,800 KB. When I run the command with ulimit as shown below, the process initially spikes as if there were no limit, taking up to 80% of memory and rendering the server's other functions useless due to lag. After a while the process drops to under 10% (usually 4-8% of memory), but all of the server's other functionality is still slow.

Note: my figures for how much memory the process is consuming come from top.

bash -c 'ulimit -m 76800; convert -density 288 ./document.pdf -resize 25% ./pages/page_%03d.png'
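
As an aside, modern Linux kernels do not enforce ulimit -m (RLIMIT_RSS) at all, which would explain why the spike above runs unchecked. Two hedged variants of the same command; the 150 MB and 64 MiB caps here are illustrative, not tuned:

# Cap total address space instead; this limit is enforced, but convert
# aborts with an allocation error rather than throttling itself.
bash -c 'ulimit -v 153600; convert -density 288 ./document.pdf -resize 25% ./pages/page_%03d.png'

# Or use ImageMagick's own resource limits, which make it page pixel
# data out to disk once the cap is reached instead of failing.
convert -limit memory 64MiB -limit map 128MiB -density 288 ./document.pdf -resize 25% ./pages/page_%03d.png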

Does anyone have any ideas on how I can execute this command without it consuming all my memory?

Thanks!

bwizzy
  • You could experiment with different swap settings. Probably somewhere around 100-200 MB would be optimal. I believe ulimit only kills the process and doesn't affect memory usage. – Antti Rytsölä Nov 11 '11 at 15:35

1 Answer


Check out cgroups

http://en.wikipedia.org/wiki/Cgroups

http://www.youtube.com/watch?v=KX5QV4LId_c
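
A minimal sketch of the cgroups approach, assuming the cgroup v1 memory controller and the libcgroup tools (cgcreate, cgset, cgexec) are available; the group name imagemagick and the 100M cap are illustrative:

# Create a control group with a hard memory cap, then run convert inside it
sudo cgcreate -g memory:/imagemagick
sudo cgset -r memory.limit_in_bytes=100M imagemagick
sudo cgexec -g memory:imagemagick convert -density 288 ./document.pdf -resize 25% ./pages/page_%03d.png

Unlike ulimit -m, the kernel actually enforces this cap: once the group hits it, convert's pages are reclaimed or swapped rather than starving the rest of the VPS.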

ckliborn