Setting Up a Local Development Environment with Next.js, Nest.js, Docker, and NGINX for Hot Reloading

While working on my side project, I encountered numerous questions on different forums about the setup for NGINX, Next.js, and Nest.js in a Docker environment.

Getting them to run in a local Docker environment was a common theme in these questions. A clear benefit of this approach is that it mirrors the production environment as closely as possible, which allows us to catch configuration-related issues early in development. Hot reload is also necessary for an efficient workflow: the browser should reflect changes automatically after every code change.

I managed to get everything working after some research, and here is how I structured the approach. The full code example can be found here.

Step 1: Prepare Nest.js Repository

To create the repository for the backend service, run the Nest CLI:

nest new server

Change the port in main.ts from 3000 to 3001 to avoid clashing with the frontend app. In real projects, this value should come from an environment variable.

// server/src/main.ts
await app.listen(3001);
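As a sketch of that environment-variable approach (the helper name and the PORT variable are my assumptions, not from the original repo):

```typescript
// Sketch: resolve the listen port from an assumed PORT environment variable,
// falling back to 3001 when it is unset or not a valid port number.
export function resolvePort(env: Record<string, string | undefined>): number {
  const parsed = Number(env.PORT);
  return Number.isInteger(parsed) && parsed > 0 ? parsed : 3001;
}

// In server/src/main.ts this could then be used as:
//   await app.listen(resolvePort(process.env));
```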

Step 2: Prepare Next.js Repository

Start the frontend project with the create-next-app CLI, naming it client. I'm using the App Router with the other default configurations.

npx create-next-app@latest client

Since Nest.js already provides a GET / route that returns "Hello World", let's hook it up from the frontend using both a React Server Component and a React Client Component.

In client/app, create a folder client-hello and add a page.tsx with the following code to fetch the message and display it in a Client Component. Notice that we are fetching from /api/; this path will be rewritten in the Nginx configuration to become /.

// client/app/client-hello/page.tsx

"use client";

import { useEffect, useState } from "react";

export default function ClientHelloPage() {
  const [text, setText] = useState("");

  useEffect(() => {
    fetch("/api/")
      .then((res) => res.text())
      .then((data) => setText(data));
  }, []);

  return <h1 className="text-3xl font-bold">{text}</h1>;
}

On the same level, create a folder server-hello with a page.tsx containing the following code to set up the React Server Component. Here we fetch http://nginx/api/ from the frontend server to the backend server, proxied through the Nginx server. This path will also be rewritten to / in the Nginx configuration, as we will see below.

// client/app/server-hello/page.tsx

export default async function ServerHelloPage() {
  try {
    const res = await fetch("http://nginx/api/");
    const data = await res.text();
    if (!res.ok) {
      throw new Error(data);
    }
    return <h1 className="text-3xl font-bold">{data}</h1>;
  } catch (error) {
    return <h1 className="text-3xl font-bold">Error</h1>;
  }
}

Step 3: Nginx Configuration

In the root level of the project, create an nginx folder and add the following configuration in nginx.conf.

# nginx/nginx.conf

worker_processes 1;

events {
  worker_connections 1024;
}

http {
  sendfile on;

  upstream client {
    server client:3000;
  }

  upstream api {
    server api:3001;
  }

  server {
    listen 80;

    location /api {
      # Strips '/api' from the URI before passing it to the backend
      rewrite ^/api/(.*)$ /$1 break;
      proxy_pass http://api;
      proxy_http_version 1.1;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection 'upgrade';
      proxy_set_header Host $host;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-Forwarded-Proto $scheme;
    }

    location / {
      proxy_pass http://client;
      proxy_http_version 1.1;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection 'upgrade';
      proxy_set_header Host $host;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-Forwarded-Proto $scheme;
    }
  }
}
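The effect of the rewrite rule can be sketched as the equivalent string transformation in TypeScript (illustrative only; Nginx applies the PCRE regex itself):

```typescript
// Sketch of `rewrite ^/api/(.*)$ /$1 break;`:
// strip the leading /api/ segment and keep the rest of the URI.
export function stripApiPrefix(uri: string): string {
  return uri.replace(/^\/api\/(.*)$/, "/$1");
}
```

So a request to /api/users is proxied to the backend as /users, while URIs that don't start with /api/ are left untouched.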

Step 4: Set Up Docker for Local Environment

In each of the client and server folders, create a Dockerfile.local for running the containers locally. Create docker-compose.local.yml in the root folder to run the containers.

Client Dockerfile.local

# client/Dockerfile.local

FROM node:21.7-alpine AS base

FROM base AS deps
# libc6-compat may be needed on Alpine; see the node:alpine image notes.
RUN apk add --no-cache libc6-compat

WORKDIR /app

# Install dependencies before copying the source to leverage layer caching
COPY package*.json ./
RUN npm ci
COPY . .

CMD ["npm", "run", "dev"]
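One optional addition (my suggestion, not part of the original setup): a .dockerignore in each folder keeps the host's node_modules and build output out of the build context, which speeds up the COPY . . step.

```
# client/.dockerignore (similarly for server/, adjusting the build output folder)
node_modules
.next
```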

Server Dockerfile.local

# server/Dockerfile.local

FROM node:21.7-alpine AS base

WORKDIR /app

COPY package*.json ./
RUN npm ci
COPY . .

CMD ["npm", "run", "start:dev"]

Root Folder docker-compose.local.yml

# docker-compose.local.yml

version: "3"

services:
  api:
    container_name: api
    build:
      context: ./server
      dockerfile: Dockerfile.local
    working_dir: /app
    ports:
      - "3001:3001"
    develop:
      watch:
        - action: sync
          path: ./server
          target: /app
          ignore:
            - node_modules/
        - action: rebuild
          path: package.json
  client:
    container_name: client
    build:
      context: ./client
      dockerfile: Dockerfile.local
    restart: always
    develop:
      watch:
        - action: sync
          path: ./client/app
          target: /app/app
          ignore:
            - node_modules/
        - action: rebuild
          path: package.json
  nginx:
    container_name: nginx
    image: nginx:alpine
    depends_on:
      - api
      - client
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf
    ports:
      - "3040:80"

Step 5: Building & Running Docker Containers

Run the following docker compose watch command in the root folder, specifying docker-compose.local.yml with -f:

docker-compose -f docker-compose.local.yml watch

Then, visit http://localhost:3040/server-hello and http://localhost:3040/client-hello and ensure they are loading without errors.

You can also connect an API client such as Postman or Insomnia and send requests to the backend server directly via http://localhost:3001.

Make changes to the code in client or server, and you will see them reflected without restarting the containers. Now we have everything running with hot reload.

Useful Insights

Preventing Unnecessary Builds

Referring to either of the Dockerfile.local files, the COPY . . line comes after RUN npm ci. If the order were reversed, Docker would invalidate the cache and rerun npm ci (and every subsequent step) on each code change, reinstalling packages unnecessarily.

With the above Dockerfile.local files, make a change to the code and notice that the RUN npm ci step is read from cache:

=> CACHED [5/6] RUN npm ci 0.0s
=> [6/6] COPY . . 1.1s

Now, edit the Dockerfile.local and move the COPY . . step before RUN npm ci. Run docker-compose -f docker-compose.local.yml watch and make a change to the code. You will notice that RUN npm ci now runs every time, which isn't something you want after each code change.

=> [5/6] COPY . . 1.2s
=> [6/6] RUN npm ci 0.8s

Therefore, with the correct ordering, npm ci reruns only when package.json (or package-lock.json) changes.

How Did We Set Up Hot Reload?

Referring to docker-compose.local.yml, we use Compose's file watch for the client and api services so that the running containers pick up our edits as we save code.

We define this development configuration under develop, with the sync action ensuring that local changes are automatically mirrored to the corresponding files inside the service container. We then start the stack with docker compose watch instead of the usual docker compose up.
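In short, each service pairs two watch rules (same fields as in docker-compose.local.yml above): sync copies changed source files into the container, while rebuild triggers a fresh image build.

```yaml
develop:
  watch:
    - action: sync       # mirror changed source files into the running container
      path: ./server
      target: /app
      ignore:
        - node_modules/  # don't sync host dependencies into the container
    - action: rebuild    # rebuild the image when dependencies change
      path: package.json
```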


In this guide, we successfully configured a local development environment using Next.js, Nest.js, Docker, and Nginx, complete with hot reloading. I welcome your feedback on this setup to help refine and improve it further.

Code can be found in this repository.

This article was originally published in Level Up Coding on Medium.

