
I'm not sure if this is the best place for this question, but for a few days I've been struggling to find a good deployment process for Elastic Beanstalk, and I'd like the opinion of people with more experience with it.

At first, my plan was to have a generic Docker image, bind my code into it as a volume, and do all the necessary installations when the container starts.

The problem was that installing everything takes too long, and EB aborts the deployment with a timeout. I tried to increase the timeout, but it didn't help (no idea why; it doesn't take that long to run the container locally).
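(For context, the timeout I mean is the deployment command timeout, which as far as I know is raised with an .ebextensions config roughly like the sketch below; the filename and the value are just examples.)

```yaml
# .ebextensions/increase-timeout.config  (example filename)
option_settings:
  aws:elasticbeanstalk:command:
    Timeout: 1800   # seconds; the default is 600
```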

So my next idea was to have an image with everything already pre-installed, but for that I'd have to build it beforehand and push it to a registry.

OK, so I decided that using ECR for that wouldn't be a problem.

I created the image and pushed it to the registry.
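Roughly, the build-and-push steps look like this (the account ID, region, and repository name are placeholders; older AWS CLIs use `aws ecr get-login` instead of `get-login-password`):

```sh
# Authenticate Docker against ECR (AWS CLI v2 syntax)
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

# Build the image with everything pre-installed, tag it, and push it
docker build -t my-app .
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
```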

Then I created a Dockerrun.aws.json and pointed it at my repository.
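The Dockerrun.aws.json is essentially just this (single-container v1 format; the image URI and port are placeholders, and the instance profile needs permission to pull from ECR):

```json
{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest",
    "Update": "true"
  },
  "Ports": [
    { "ContainerPort": "8080" }
  ]
}
```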

After doing that I kept getting the same timeout error as before.

A few hours later I realized that Elastic Beanstalk keeps building the Dockerfile, even with the Dockerrun.aws.json in the directory.

I don't understand why it does that, but to work around it I had to stop tracking my Dockerfile in git, which is really annoying.
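(Side note: if deploying with the EB CLI, my understanding is that an .ebignore file, which uses the same syntax as .gitignore, controls what goes into the upload bundle, so a sketch like the one below might keep the Dockerfile in git but out of the bundle; I haven't verified this.)

```
# .ebignore (EB CLI): excludes files from the deployment bundle, gitignore syntax
Dockerfile
```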

OK, it works now, but it's a clumsy workflow. Every time I update something in my Dockerfile, I have to rebuild everything and push to ECR, and only after that can I deploy.

And if someone else wants to deploy the code, I have to send them the Dockerfile manually, because I can't put it in git; otherwise Elastic Beanstalk tries to build it.

I'm pretty sure there is an easier workflow for this. Could someone point me in a better direction?

dfranca
  • You should probably use Jenkins or CodePipeline to automate the workflow. As far as I remember, Beanstalk builds the Dockerfile only for single-container environments; it ignores it for multi-container Docker environments. But either way, you have to build and upload new Docker images to ECR somehow. – ALex_hha Feb 19 '18 at 13:09

0 Answers