
I have two running Docker containers:

  1. A Docker container running Nginx that serves the Angular app resource files that were copied into the Docker image on docker build
  2. A Docker container running an API that is currently listening on port 5007 inside the container and is mapped to port 5007 on the host machine

I am setting up a staging environment with the following external URLs:

  • app.staging.mysite.com
  • account.staging.mysite.com

app.staging.mysite.com hits port 80 and is served the Angular app resources, no problem.

Now I want external API requests to account.staging.mysite.com to hit my API Docker container listening on port 5007. Requests to account.staging.mysite.com will hit port 80 of the host machine.

Nginx should receive the account.staging.mysite.com port 80 requests and proxy them to the Docker container at 127.0.0.1:5007, while still serving my Angular app files by default for all external URLs/domains that I do not explicitly proxy in my Nginx config.

Instead of serving my Angular app, I would like Nginx to forward these requests to port 5007 so that my Account API can respond. So I have altered my Nginx config to the following:

upstream accountstaging {
    server 0.0.0.0:5007;
}


server {
    listen 80 default_server;
    listen [::]:80 default_server;
    server_name _;

    # Main
    location / {
        root   /usr/share/nginx/html;
        index  index.html index.htm;
        try_files $uri$args $uri$args/ /index.html;
    }


    error_page   500 502 503 504  /50x.html;
    location = /50x.html {
        root   /usr/share/nginx/html;
    }
}

server {

    listen 80;
    server_name account.staging.mysite.com;

    location / {
        proxy_pass         http://accountstaging;
        proxy_redirect     off;
        proxy_set_header   Host $host;
        proxy_set_header   X-Real-IP $remote_addr;
        proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header   X-Forwarded-Host $server_name;
    }
}

The thing that really surprises me is that there are now no default_server hits. In other words, app.staging.mysite.com, which was working and responding with my Angular app resources, stopped working when I added the second server block listening on port 80 specifically for account.staging.mysite.com.

So really this question is not so much about Docker containers; it is really about Nginx configuration. Though I am not entirely sure that my Docker containers can be excluded as part of the problem. So here is my docker-compose.yml for account.staging.mysite.com:

version: '3'

services:
  apistaging:
    build: 
      context: ./
      dockerfile: docker/staging/Dockerfile
    image: tsl.api.account.image
    container_name: tsl.api.account.container
    ports:
      - "5007:5007"
    environment: 
      ASPNETCORE_URLS: http://+:5007

And here is my docker-compose.yml for app.staging.mysite.com:

version: '3'

services:
  frontend:
    build: 
      context: ./
      dockerfile: docker/Dockerfile
    image: tsl.web.frontend.image
    container_name: tsl.web.frontend.container
    ports:
      - "80:80"

Here is my Dockerfile for Nginx, which serves my Angular app and will hopefully act as a reverse proxy as well. You can see that the Nginx default_server serves my Angular app resource files that were copied into this Docker image on docker build:

FROM centos:7
MAINTAINER Brian Ogden

# Not currently being used but may come in handy
ARG ENVIRONMENT
ENV NODE_VERSION 6.11.1

RUN yum -y update && \
    yum clean all && \
    yum -y install http://nginx.org/packages/centos/7/noarch/RPMS/nginx-release-centos-7-0.el7.ngx.noarch.rpm && \
    yum -y makecache && \
    yum -y install nginx-1.12.0 wget

# Cleanup some default NGINX configuration files we don’t need
RUN rm /etc/nginx/conf.d/default.conf

#############################################
# NodeJs Install
#############################################

#Download NodeJs package
RUN wget -q -O - https://nodejs.org/dist/v$NODE_VERSION/node-v$NODE_VERSION-linux-x64.tar.gz \
    | tar --strip-components=1 -xzf - -C /usr/local

# https://stackoverflow.com/a/35774741/1258525
# use changes to package.json to force Docker not to use the cache
# when we change our application's nodejs dependencies:
COPY ./package.json /tmp/package.json
RUN cd /tmp && npm install
RUN mkdir /app && cp -a /tmp/node_modules /app/

WORKDIR /app
COPY . /app

RUN npm run build-$ENVIRONMENT

RUN cd /app && cp -a dist/* /usr/share/nginx/html
COPY ./docker/conf/frontend.conf /etc/nginx/conf.d/frontend.conf
COPY ./docker/conf/nginx.conf /etc/nginx/nginx.conf


EXPOSE 80

CMD ["nginx"]

Is this issue possibly caused by the 80:80 port binding for the Angular app frontend Docker container?
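
One way I could check which process actually owns host port 80 (a rough sketch, assuming ss and docker are available on the host):

# Show what is listening on host port 80
sudo ss -tlnp | grep ':80 '

# Show the port mappings of the running containers
docker ps --format 'table {{.Names}}\t{{.Ports}}'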

Brian

2 Answers


Are you running Docker and Nginx on the same server? If so, I think you have a conflict in the port configuration. At least the Docker instance for app.staging.mysite.com is using external port 127.0.0.1:80, and so is Nginx. It may then seem that you are connecting to app.staging.mysite.com:80 correctly, but you are not connecting via the Nginx reverse proxy; you are connecting directly to the Docker instance. Try to map app.staging.mysite.com to a different port than 80 and create an upstream server configuration and the other related Nginx settings for it, if this is possible for you. If not, I probably do not understand your intent :)
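
To illustrate the idea, here is a rough sketch of how the Nginx side could look, assuming the frontend container is remapped to host port 8080 (for example ports: "8080:80" in its docker-compose.yml) so that this Nginx is the only process bound to host port 80. The upstream names and the 8080 port are assumptions for the example, not taken from your setup:

upstream frontendstaging {
    # Angular app container, remapped from host port 80 to 8080 (assumption)
    server 127.0.0.1:8080;
}

upstream accountstaging {
    # API container published on host port 5007
    server 127.0.0.1:5007;
}

server {
    listen 80 default_server;
    listen [::]:80 default_server;
    server_name _;

    # Anything not matched by a more specific server_name goes to the Angular app
    location / {
        proxy_pass         http://frontendstaging;
        proxy_set_header   Host $host;
        proxy_set_header   X-Real-IP $remote_addr;
        proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

server {
    listen 80;
    server_name account.staging.mysite.com;

    location / {
        proxy_pass         http://accountstaging;
        proxy_set_header   Host $host;
        proxy_set_header   X-Real-IP $remote_addr;
        proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

With a layout like this, both hostnames arrive at Nginx on port 80 and are routed by server_name, instead of by whichever process happened to bind the port first.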

patok
  • I think you are right! Will be trying your advice in a couple hours – Brian Feb 02 '18 at 15:22
  • I thought about what you said some more: there is one Docker container running Nginx listening on port 80, and it serves my Angular app resource files to app.staging.mysite.com requests. I have added my Dockerfile for Nginx to the question; you will see that my Angular app files are copied into the image in this Dockerfile – Brian Feb 02 '18 at 17:24

I was finally able to figure out what was going on here. I first broke the problem down even further, and you can find my solution here

Brian