I'm working on an image hosting website tailored to a particular niche. The website is made with Django. I'm currently planning to run it on Linode.
So far so good. The problem is: I will need to perform very CPU-intensive tasks on high-resolution images. We're talking about scientific-grade computation that can take up to 15 minutes on Linode's 4 Xeon CPUs.
I'm not sure if EC2 works like this, but is the following scenario something that rings a bell?
- User uploads an image on the website, which is hosted on Linode
- The application (somehow?) requests that EC2 run the CPU-intensive task (I've sketched what I mean in code after this list)
- EC2 boots a new instance and runs the software with the data provided
- The data is somehow returned to the web application
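To make that concrete, here is roughly what I picture the Celery side looking like. This is only a sketch of my idea, not working code: the broker URL, the `process_image` task, the `heavy` queue name, and the view are all placeholders I made up.

```python
# tasks.py -- a sketch, not working code; the broker URL, task name,
# and queue name are placeholders.
from celery import Celery

app = Celery('tasks', broker='amqp://user:password@rabbitmq-host:5672//')

@app.task
def process_image(image_path):
    """The CPU-intensive scientific computation, stubbed out here."""
    # ...up to 15 minutes of number crunching on the uploaded image...
    return image_path + '.result'
```

```python
# views.py -- sketch: the Django view saves the upload on the Linode
# box, enqueues the task, and returns immediately.
from django.core.files.storage import default_storage
from django.http import HttpResponse

from tasks import process_image

def upload(request):
    image = request.FILES['image']
    saved_path = default_storage.save('uploads/' + image.name, image)
    # Hand the slow part off to whatever consumes the 'heavy' queue.
    process_image.apply_async(args=[saved_path], queue='heavy')
    return HttpResponse('processing started')
```

The open question is what actually consumes that `heavy` queue, which is where EC2 would come in.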
Obviously I have lots of gaps in how this would actually work. Can somebody please help me fill them?
EDIT: I forgot to mention that I use Celery for the tasks, with RabbitMQ as the message broker. I wonder if it's possible to create Celery tasks on my web server but then actually run them on EC2 instances created on demand. Ideally, this would also take care of the communication protocol between the parties involved (since I would just be pickling the task data on the web server's side).
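To clarify what I'm picturing with Celery: the heavy task would be routed to a dedicated queue that only EC2-hosted workers consume, so the web server never executes it itself. Everything below (queue name, region, AMI id, instance type) is a made-up placeholder, just to show the shape of the idea:

```python
# Celery config sketch: route the heavy task to its own queue and keep
# pickle as the serializer, so task arguments can be pickled on the
# web server's side. All names are placeholders.
app.conf.task_routes = {'tasks.process_image': {'queue': 'heavy'}}
app.conf.task_serializer = 'pickle'
app.conf.result_serializer = 'pickle'
app.conf.accept_content = ['pickle']
```

A worker on an EC2 instance would then attach to the same RabbitMQ broker with `celery -A tasks worker -Q heavy`, and I assume the on-demand part could be something like a boto3 call from the web server:

```python
# boot_worker.py -- a guess at the on-demand piece using boto3; the
# AMI is assumed to be pre-baked with the project code and Celery.
import boto3

USER_DATA = """#!/bin/bash
# At boot, attach to the shared broker and consume only 'heavy' tasks.
celery -A tasks worker -Q heavy --concurrency=4
"""

def boot_worker():
    ec2 = boto3.client('ec2', region_name='us-east-1')  # placeholder region
    ec2.run_instances(
        ImageId='ami-00000000',    # placeholder: pre-baked worker image
        InstanceType='c5.xlarge',  # placeholder instance size
        MinCount=1,
        MaxCount=1,
        UserData=USER_DATA,        # boto3 base64-encodes this for us
    )
```

Whether the image data itself can travel through the broker as pickled arguments, or needs some shared storage, is exactly one of the gaps I'd like filled.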