
I've got an ECS service that implements a REST API. It does some fairly heavy computing, but it can benefit massively from caching at the request level, since it receives many identical requests.

I currently have caching implemented inside the ECS container itself, but that's not optimal: the cache is neither shared across ECS tasks nor persistent if a task stops.

So, what's the best way to solve this in AWS? Can I use a Redis cluster or something similar and put it in front of the ECS cluster?

joko

1 Answer


You don't put Redis in front of ECS; instead, you make your ECS task consult the Redis cache to see if the required result is already there. If it is, return it to the caller; if not, compute it, return it to the caller, and also store it in Redis for next time. That's a pretty standard pattern (cache-aside), but it requires support from the application running on your ECS tasks -- the app needs to know how to talk to Redis, or to Memcached (simpler and may be enough for basic caching).
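Since you mention below that the API is a Flask app, here is a minimal sketch of that cache-aside pattern in a Flask handler. It assumes a Redis (e.g. ElastiCache) endpoint reachable from the ECS tasks via a `REDIS_HOST` environment variable; the `/compute` route, the key scheme, and `compute_result()` are placeholders for your own logic.

```python
import hashlib
import json
import os

import redis
from flask import Flask, jsonify, request

app = Flask(__name__)
# Shared cache, reachable by all ECS tasks (e.g. an ElastiCache endpoint).
cache = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"), port=6379)


def compute_result(payload):
    # Placeholder for the heavy computation.
    return {"echo": payload}


@app.route("/compute", methods=["POST"])
def compute():
    payload = request.get_json(force=True)
    # Derive a cache key from the request body so identical requests map to the same entry.
    key = "compute:" + hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()

    cached = cache.get(key)
    if cached is not None:
        # Cache hit: return the stored result without recomputing.
        return jsonify(json.loads(cached))

    result = compute_result(payload)
    # Cache miss: store the result with a TTL so stale entries eventually expire.
    cache.setex(key, 3600, json.dumps(result))
    return jsonify(result)
```

Because the cache lives outside the tasks, all tasks share it and it survives task restarts, which addresses both problems in the question.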

Alternatively, you can put your REST API behind a CloudFront CDN and enable caching at that level. If configured right, CloudFront will recognise identical requests and return the cached responses right away without ever reaching ECS. This can be transparent to the application and doesn't need any app-side support.
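One caveat: CloudFront only caches GET/HEAD responses, so this works if the identical requests can be expressed as GETs with query parameters, and the distribution's cache policy must include those query strings in the cache key. On the app side, the main requirement is sending cacheable headers; a rough sketch (the `/compute` route and the dummy calculation are placeholders):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/compute", methods=["GET"])
def compute():
    n = int(request.args.get("n", "0"))
    result = {"n": n, "value": n * n}  # stand-in for the heavy computation

    response = jsonify(result)
    # CloudFront honours Cache-Control: identical GETs within the next hour
    # can be served from the edge cache without reaching the ECS tasks at all.
    response.headers["Cache-Control"] = "public, max-age=3600"
    return response
```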

Hope that helps :)

MLu
  • Thanks for the answer! I'm not sure which steps to take to make my ECS tasks talk to Redis/Memcached -- to be more precise, my API is a Flask app. If I understand you correctly, I'd have to make Flask talk to the cache? That said, I'm imagining a scenario where in theory my ECS task could be down completely, but for previous requests I could still return the cached response (hence I'd want to put it in front of ECS). Would that be something I'd use a CDN for? – joko Feb 25 '21 at 20:45