
We have a Rails app that dynamically generates thumbnails on request and saves the result locally so the same thumbnail can be served faster the next time. We use the following URL structure:

/thumbnails/99999/large.jpg

Where 99999 is the database record the thumbnail is linked to. The problem is that the number of records with thumbnails is hitting the filesystem's per-directory entry limit.

Any ideas on how to get around this while keeping the built-in automatic cache retrieval that Rails gives you with the public folder?

The end goal is to be able to dynamically generate arbitrary thumbnails on demand and cache the result locally.

– chrishomer

1 Answer


Use subdirectories, like:

/thumbnails/001/001/large.jpg
/thumbnails/999/999/large.jpg

and so on. Create as many levels of subdirectories as you need to cover the unique identifier of your image, e.g.:

/thumbnails/999/999/999/999/large.jpg

You may need to be aware of the inode limits on your filesystem.
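A minimal sketch of the sharding in Ruby, assuming ids fit in nine digits and a hypothetical helper name — zero-pad the id, then split it into fixed-width directory levels:

```ruby
# Shard a numeric record id into nested subdirectories so no single
# directory holds more than 1000 entries per level.
# Illustrative helper name; adjust padding width to your id range.
def sharded_thumbnail_path(id, filename = "large.jpg")
  levels = format("%09d", id).scan(/\d{3}/)  # 99999 -> ["000", "099", "999"]
  File.join("thumbnails", *levels, filename)
end

puts sharded_thumbnail_path(99999)
# => thumbnails/000/099/999/large.jpg
```

If the generator writes files with this helper and requests are rewritten to the same layout, cached thumbnails are served straight from disk as before.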

– cjc
  • and how do I then tell nginx to map /thumbnails/99999/large.jpg to /thumbnails/099/999/large.jpg ? Or would you suggest we change it on the web request side too? More tedious than I was hoping, but I guess doable. – chrishomer Aug 13 '12 at 18:11
  • Well, certainly changing the request so that it hits the physical file would be best, as it removes a complication. It's possible to do a redirect through some regex, but that will be a lot more trouble than it's worth. – cjc Aug 13 '12 at 18:19
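For the regex-mapping option discussed above, a hedged nginx sketch — this only works if the URLs already carry zero-padded, fixed-width ids (plain regex rewrites cannot pad), and `@generate_thumbnail` is a hypothetical named location that falls back to the Rails app:

```nginx
# Assumption: ids in the URL are already padded to 9 digits,
# e.g. /thumbnails/000099999/large.jpg
location ~ ^/thumbnails/(\d{3})(\d{3})(\d{3})/(.+)$ {
    # Serve the sharded file from disk if it exists,
    # otherwise hand off to the app to generate it.
    try_files /thumbnails/$1/$2/$3/$4 @generate_thumbnail;
}
```

As the answer's comment notes, changing the app to emit the sharded URLs directly is simpler than maintaining a rewrite layer like this.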