

The Ultimate NodeJS Development Setup with Docker

3rd February 2016

I've found myself writing a lot of JavaScript lately, both client- and server-side. And over the past several projects, a common pattern has emerged: I'll have to develop a single page application with a matching back-end web API. This post explains the development and deployment setup I've centered on for these situations. It's by no means complete, but has increased my productivity immensely. Enjoy!

Above, the ultimate unnamed pond near Alta Peak, CA

My initial struggle was simply about how to organize the codebases for a JavaScript project requiring both front- and back-ends. For a while, there was a single codebase (typically an ExpressJS server), with an embedded directory containing the website's source code. For my web editor of choice, WebStorm, this meant having one project embedded within another—and it just felt messy.

I began to realize that JavaScript websites are different enough from back-end applications that they deserve their own project repositories. While both use a package.json file to declare their name, version numbers, and dependencies, each has a specialized ecosystem of tools. And as a general design principle, keeping your front- and back-ends separate allows you to swap them out with minimal friction—e.g. moving an API from ExpressJS to Sails, or jumping to ES6/7, all without disturbing the front-end codebase.

Secondly, I wanted a simpler and snappier way to develop in the dual-end JavaScript situation. I've used gulp-watch, live-reload, and nodemon to achieve live changes for front- and back-ends before, with great success. But I also wanted a dev environment that was easier to get going, e.g. fewer locally installed dependencies, no long-running tasks in the terminal, and minimal cross-domain complaint workarounds.


Given these considerations, I propose a NodeJS development setup with the following requirements:

  1. Separate codebases for front- and back-ends
  2. Develop in a portable runtime environment
  3. Live changes from both codebases in development
  4. Bonus: Deploy as a single Docker container

The Setup

Note: As mentioned above, this setup assumes a case where a website has a matching web API, or at least a NodeJS server behind it. If all you need is an HTTP server for your static files in the front, see here.

1. Project Structure

I like to keep all of my projects in a big directory called ~/source. Each project lives in a top-level directory in ~/source, and is typically a cloned git repository. To maintain our requirement of separate codebases for front- and back-ends, we'll have a structure like this:

    ~/source/
    ├── frontend-codebase/
    │   └── package.json
    └── backend-codebase/
        └── package.json

where each sub-directory is a NodeJS project with a package.json file at the top. In my typical setup, frontend-codebase will be an AngularJS-based website with bower dependencies and a Gulpfile, while backend-codebase is an ExpressJS-based HTTP server.

So, we have two codebases, and we'd like to start working on them. Ideally we want the front-end to be served up for perusal with a local web browser, and the back-end resources to be available locally as well. A straightforward approach would be to serve the website with a local HTTP server like Apache or Nginx, and run node index.js from the terminal for the back-end. This is not ideal for at least two reasons:

  1. The setup depends on locally installed dependencies like NodeJS and an HTTP server. If another developer needs to reproduce the setup, they'd have to recreate this environment manually. This also plainly violates requirement (2).
  2. Potential cross-origin complaints from the browser. Out of the box, you won't be able to make XHR requests from the front-end to the back-end unless you support JSONP, wire up your HTTP server and/or hosts file correctly, or use an insecure CORS setting on the server. We'll have to cross this bridge in production anyway, so we'd just be putting it off.

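For reference, the "insecure CORS setting" mentioned above would look something like the middleware below. This is an illustrative sketch, not part of the setup in this post—the function and variable names are mine, and the stand-in req/res objects exist only so the snippet runs without Express:

```javascript
// Sketch of the permissive CORS middleware this setup lets us avoid.
// Allowing any origin ('*') lets any site call your API from a browser.
function allowAllCors(req, res, next) {
  res.header('Access-Control-Allow-Origin', '*');
  next();
}

// Minimal stand-ins to show the middleware in action without Express:
var headers = {};
var res = { header: function (name, value) { headers[name] = value; } };
allowAllCors({}, res, function () {});

console.log(headers['Access-Control-Allow-Origin']); // '*'
```

By serving the front-end and back-end from the same origin, the setup below makes this kind of wide-open header unnecessary.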
2. Docker Tooling

This is where we use Docker to circumvent the above issues and pass requirements (2) and (3) with flying colors. I assume a basic familiarity with Docker. Our running development environment will look like the following:

Node and Docker Development Diagram

What's happening is that we're running a local Docker container that has all of the back-end dependencies inside of it. Two directories on the host (our two codebases) are mounted into the container. The back-end code is watched by nodemon, while the compiled front-end is served as a set of static files (I'm still using gulp-watch outside of the container to rebuild these). This allows us to make changes directly to our code on the host using our editor of choice, while the container picks up our changes and refreshes everything.

There are a few relevant files here. Since the back-end is the server that will drive the whole application, I put these in backend-codebase:

1. scripts/



#!/bin/bash
# Assumes CONTAINER_NAME and WEBSITE_ASSETS are set before this point

# Stop and remove any previous instance of the development container
docker stop ${CONTAINER_NAME}
docker rm ${CONTAINER_NAME}

# Start a new container with both codebases mounted in
docker run -d -p 80:3000 --name ${CONTAINER_NAME} -e "NODE_ENV=development" -v ${WEBSITE_ASSETS}:/static/ -v `pwd`:/app library/node:5.0 /app/scripts/

All we're doing here is (1) stopping and removing the previous instance of the development container (yes, it will complain if the container doesn't exist), and (2) running a new container with the two volume mounts noted above. A few things to note here:

  1. WEBSITE_ASSETS is the path to the compiled front-end files, i.e. all your linted and/or uglified website stuff.
  2. This example assumes a back-end running on port 3000, wired out to port 80 on the host.
  3. `pwd` is the current working directory, so make sure to run this from backend-codebase with ./scripts/
  4. We're starting from the public NodeJS Docker image, although nothing is keeping us from using our own prebuilt development image.
  5. We're starting the container on a custom entrypoint…

2. scripts/


#!/bin/bash
# Runs inside the container: install dependencies, then start the
# server under nodemon so code changes trigger a restart
cd /app
npm install
./node_modules/nodemon/bin/nodemon.js --legacy-watch index.js

This script runs when the development container starts. We move into the /app directory (previously mounted) and install npm dependencies. Dependencies have to be installed after the container starts because we're starting from a public base image.

Then instead of running node, we run nodemon, which wraps node and restarts the server when our project files change. We find the nodemon executable to avoid a global install, and the --legacy-watch option is needed per this note. Finally, be sure to install nodemon in the package.json of backend-codebase.
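As a convenience, the nodemon invocation could also live behind an npm script in backend-codebase's package.json—an illustrative addition, not part of the original setup:

```json
{
  "scripts": {
    "dev": "nodemon --legacy-watch index.js"
  }
}
```

npm adds node_modules/.bin to the PATH when running scripts, so npm run dev finds the locally installed nodemon without spelling out its full path or requiring a global install.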

3. index.js

All that's left is to make sure the back-end (in this case Express) knows to serve the mounted front-end volume as static files when we're in development mode:

// ...

if (process.env.NODE_ENV === 'development') {
    app.use('/', express.static('/static'));
} else {
    app.use('/', express.static('node_modules/my-website-module/assets'));
}

We'll cover non-development mode in the next section. Now you should be able to hit http://<local_docker_ip_address>:80 and see live changes reflected from both codebases! You can modify the paths that files and API routes are served on to suit your project's needs.
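The environment switch above can also be factored into a small helper—a sketch of my own, not code from the original setup; the staticDir name is illustrative:

```javascript
// Hypothetical helper mirroring the if/else above: pick the
// static-files directory based on NODE_ENV.
function staticDir(env) {
  return env === 'development'
    ? '/static'                                  // dev: the mounted volume
    : 'node_modules/my-website-module/assets';   // prod: the installed npm module
}

// Usage (with an Express app): app.use('/', express.static(staticDir(process.env.NODE_ENV)));
console.log(staticDir('development')); // '/static'
```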

3. Bonus: Deployment

Up to this point we've done enough groundwork with Docker and separating our codebases to achieve a clean implementation of requirement (4): deploy the setup as a single Docker container. The deployment will look like:

Node and Docker Deployment Diagram

The trick is to publish frontend-codebase as an NPM module, and backend-codebase as a Docker image. I won't get into the details of working with a Docker or NPM registry, but assuming you have registries at your disposal, here's a basic sketch of the deployment process:

  1. In frontend-codebase, update the version number in package.json and run npm publish.
  2. In backend-codebase, ensure package.json specifies frontend-codebase as a dependency with the appropriate version.
  3. Build and push the back-end Docker image.
  4. On your server, or as part of your deployment process, pull and run the image.

Relevant files (again in backend-codebase):


1. package.json

{
  "name": "my-backend-codebase",
  "version": "1.0.0",
  "dependencies": {
    "my-website-module": "^0.1.0"
  },
  "devDependencies": {
    "nodemon": "^1.8.1"
  }
}
This is a trimmed-down package.json file that illustrates using your own front-end website as a dependency, and nodemon as a development dependency.


2. Dockerfile

FROM node:5.0

ADD . /app  
WORKDIR /app  
RUN npm install  

CMD ["node", "index.js"]  

Here's a simple Dockerfile that works for the purposes of this example. Notice we can run node as opposed to nodemon in production.

All that's left to talk about is the else block from index.js:

} else {
    app.use('/', express.static('node_modules/my-website-module/assets'));
}

In production mode, we're telling Express to look for static files in our installed dependencies. This assumes that the front-end's published npm module contains a directory called assets containing production-ready files for the website.

Possible Improvements

This setup is a work in progress and there are some rough edges. Here are some ways it could improve:

  1. Use a Dockerfile to build the development container. This would allow more customization of the build environment, and keep us from having to install the dependencies every time the container restarts.
  2. Run Gulp in the dev container? You may have noticed that this is not a completely portable development environment outside of Docker. The front-end build process still runs on the host. I'm thinking of ways to subsume this into the container.
  3. Clean up scripts. I'm sure there are ways these bash scripts could be improved.

Thanks for reading, and feel free to leave feedback!

Caleb Sotelo

I'm a Software Engineer and Director of OpenX Labs. I try to write about software in a way that helps people truly understand it. Follow me @calebds.
