Dockerizing My Website: From Development to Production

December 3, 2024 · 7 min read


Deploying web applications can sometimes feel overwhelming—managing dependencies, configurations, and environments is no small feat. When I started scaling my personal website, these challenges became all too familiar. Manual deployments felt clunky, and debugging environment issues? A total headache.

To streamline my workflow and make my Next.js website deployment both reliable and scalable, I decided to embrace Docker.

In this post, I'll walk you through:

  1. Why I dockerized my website.
  2. How the Docker setup works.
  3. Integrating the new Dockerized setup with GitHub Actions for deployment.

Let's dive in!


Why Docker?

Before adopting Docker, managing my deployment pipeline meant manually configuring environments and dependencies on the server. That worked, but it left plenty of room for inefficiencies and subtle failures. Here's what Docker brings to the table instead:

  • Consistent Environments: Docker ensures that the app runs the same way on any server, eliminating "it works on my machine" scenarios.
  • Simplified Dependency Management: All dependencies are defined in the Dockerfile, removing the need for manual installations on the server.
  • Streamlined Updates: Making changes or updates is as simple as rebuilding the Docker image and redeploying.
  • Future Scalability: While scaling isn't a concern yet, Docker's portability ensures the app is ready for horizontal scaling when needed.
  • Better Debugging: With Docker logs and isolated containers, diagnosing issues becomes much easier than debugging a system-wide configuration problem.

In short, Docker tackles those pain points by packaging the app and its dependencies into a portable, self-contained container, which guarantees consistency across environments and minimizes surprises during deployments.


The Docker Setup

Here's how I structured the Docker setup for my website. This approach defines the environment and build process, ensuring every deployment runs smoothly—no more surprises!

1. Dockerfile

The Dockerfile is the heart of a Dockerized application. It defines how the app is built, so every deployment behaves identically. Here's how I set up mine:

dockerfile
FROM node:22.9.0-alpine3.20

WORKDIR /app

COPY package.json package-lock.json* ./
RUN npm install

COPY . .
RUN npm run build

RUN addgroup -S appgroup && adduser -S <YOUR_NOT_ROOT_USER> -G appgroup \
    && chown -R <YOUR_NOT_ROOT_USER>:appgroup /app
USER <YOUR_NOT_ROOT_USER>

EXPOSE 3000
CMD ["npm", "run", "start"]

Highlights:

  • Base Image: The node:22.9.0-alpine3.20 image offers a lightweight Node.js runtime optimized for containerized environments.
  • Build Process: Copies all application files and runs the production build, ensuring the container includes the latest version of the app.
  • Security Enhancements: A non-root user (<YOUR_NOT_ROOT_USER>) is created and used, reducing the risk of privilege escalation vulnerabilities—a small but vital step for production security.
  • Port Exposure: Declares port 3000, the port the Next.js server listens on, so the reverse proxy knows where to send traffic. The container never binds host ports directly; that job is left to Nginx.

This setup ensures the container is lightweight, secure, and specifically optimized for a production-ready Next.js app.
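
Before wiring any of this into a pipeline, it helps to build and run the image locally to confirm the Dockerfile behaves as expected. A minimal sketch, assuming the image is tagged personal-website (an illustrative placeholder) and the Dockerfile lives at docker/Dockerfile, as in the Compose file later in this post:

bash
# Build the image from the repository root, pointing at the Dockerfile
docker build -f docker/Dockerfile -t personal-website .

# Run it and map the container's port 3000 to the host
docker run --rm -p 3000:3000 personal-website

# The site should now respond at http://localhost:3000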


2. Nginx Configuration

To serve the app in production, I used Nginx as a reverse proxy. Instead of managing Node.js directly on public-facing ports, Nginx handles incoming traffic and routes it securely to the app container running on port 3000.

I detailed how to configure Nginx as a reverse proxy, set up SSL with Let's Encrypt, and secure the server in my previous post about using Nginx. If you're new to Nginx or want a deeper dive into those configurations, I recommend checking it out, as I'll skip the details here and only include a minimal sketch below.
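
For context, here's a minimal sketch of what that reverse-proxy configuration can look like. It assumes the certificates are mounted at /etc/nginx/ssl/ and that the app container is reachable on the Docker network as app on port 3000 (matching the Compose setup shown later); the config described in my previous post is more complete.

nginx
# Redirect plain HTTP to HTTPS
server {
    listen 80;
    server_name <YOUR_DOMAIN>;
    return 301 https://$host$request_uri;
}

# Terminate TLS and proxy requests to the app container
server {
    listen 443 ssl;
    server_name <YOUR_DOMAIN>;

    ssl_certificate     /etc/nginx/ssl/fullchain.pem;
    ssl_certificate_key /etc/nginx/ssl/privkey.pem;

    location / {
        proxy_pass http://app:3000;  # "app" is the Compose service name
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}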


3. .dockerignore

To optimize the Docker build, the .dockerignore file ensures that only essential files are copied into the container:

dockerignore
**
!src/*
!content/*
!public/*
!package.json
!next.config.mjs
!tailwind.config.ts
!postcss.config.mjs
!tsconfig.json
!components.json

How It Works & Why It Matters:

  • Excluding Unneeded Files: The ** pattern excludes everything by default, so files like node_modules, .git, and local build artifacts never enter the build context; they are either unnecessary or regenerated during the image build. This keeps the context small and the build fast.
  • Selective Copying: The ! patterns negate the exclusion for exactly the files and directories the build needs (src, content, public, and the config files), so nothing else is sent to the Docker daemon.
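
If you want to verify that only the intended files made it into the image, a quick check (reusing the personal-website tag from the earlier local build) is to list the working directory inside the container:

bash
# List what actually landed in /app inside the built image
docker run --rm personal-website ls -la /app

# node_modules and .next still show up because they are generated
# during the image build, not copied in from the host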

4. Docker Compose

Docker Compose is where the pieces come together. Defining the services and how they interact made the whole deployment feel cohesive and predictable, from app logic to traffic handling.

yaml
services:
  app:
    image: <YOUR_DOCKER_IMAGE>
    build:
      context: ./
      dockerfile: docker/Dockerfile

  web-server:
    image: nginx:alpine
    volumes:
      - ./docker/<NGINX_CONFIG_FILE>:/etc/nginx/conf.d/default.conf:ro
      - <SSL_CERT_PATH>/fullchain.pem:/etc/nginx/ssl/fullchain.pem:ro
      - <SSL_KEY_PATH>/privkey.pem:/etc/nginx/ssl/privkey.pem:ro
    ports:
      - "80:80"
      - "443:443"
    depends_on:
      - app
Highlights:

  • Multi-Service Setup: The web-server (Nginx) proxies incoming traffic to the app service (Node.js app).
  • Volume Mounts: Maps SSL certificates (fullchain.pem and privkey.pem) and the Nginx configuration file into the web-server container. Replace <NGINX_CONFIG_FILE>, <SSL_CERT_PATH>, and <SSL_KEY_PATH> with your actual paths to keep the configuration adaptable and reusable.
  • Port Mapping: Exposes HTTP (80) and HTTPS (443) ports to external traffic.
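
With the Compose file in place, bringing the whole stack up and checking that both services are healthy looks roughly like this (the same commands work locally and on the server):

bash
# Build (if needed) and start both services in the background
docker compose up --build -d

# Confirm both containers are running and inspect their port mappings
docker compose ps

# Tail the app's logs if something looks off
docker compose logs -f app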

The Updated Deployment Workflow

Updating the deployment workflow we talked about in my previous post to incorporate Docker wasn't just about adding commands—it transformed how I manage and deploy updates. By removing manual steps, I've made deployments faster and less error-prone. Here's the updated GitHub Actions workflow:

yaml
name: Deploy to DigitalOcean

on:
  push:
    branches: ["main"]
    paths:
      - "src/**"
      - "content/**"
      - "public/**"
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
    - name: Deploy to DigitalOcean droplet via SSH
      uses: appleboy/ssh-action@v1.2.0
      with:
        host: ${{ secrets.DO_HOST }}
        username: ${{ secrets.DO_USERNAME }}
        key: ${{ secrets.DO_PRIVATE_KEY }}
        port: ${{ secrets.DO_PORT }}
        script: |
          cd <PATH_TO_YOUR_APP>
          git pull
          docker compose up --build -d
Key Changes:

  • Removed Manual Build Steps: The workflow no longer needs to install dependencies or build the app manually.
  • Docker Integration: The docker compose up --build -d command builds and starts the Docker containers, automating the deployment process.
  • Simplified Server Maintenance: The containerized setup makes scaling and updates much more efficient.

These changes ensure that every deployment is seamless, consistent, and ready for future enhancements—whether that's scaling services or adding more automation steps.
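
One small housekeeping habit worth pairing with this workflow (my own addition, not part of the workflow above): every --build deploy leaves the previous image layers dangling on the droplet, so disk usage creeps up over time unless you prune occasionally:

bash
# See how much space images, containers, and volumes are using
docker system df

# Reclaim space from dangling (untagged) image layers left by old builds
docker image prune -f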


Benefits of Dockerizing My Website

Switching to Docker has brought significant improvements:

  1. Environment Consistency: No more “it works on my machine” issues.
  2. Simplified Scaling: Adding replicas or services is straightforward.
  3. Improved Security: Isolating the app and running it as a non-root user enhances security.
  4. Faster Deployments: Automating the build and deployment process reduces downtime.

While these benefits are practical, the biggest win for me has been peace of mind. Knowing that deployments are consistent and secure has freed me to focus more on improving the site itself instead of firefighting deployment issues.


What's Next?

With my website fully Dockerized, there's still plenty of room to optimize:

  • CI/CD Enhancements: Integrate automated tests into the deployment pipeline.
  • Blue-Green Deployments: Implement a blue-green deployment strategy for zero-downtime updates.

Dockerizing my website has completely transformed how I handle both development and production workflows. It's more than just simplifying deployments—it's about building confidence and gaining back precious time to focus on creating value. If you've been considering Docker, now's the perfect time to dive in and see the difference for yourself.

I hope this post has inspired you to explore Docker and its benefits. If you have any questions or want to share your Docker journey, feel free to reach out. Happy coding!


You can find me as J1Loop on GitHub or connect with me on LinkedIn.