Docker for the .NET Developer: From "It Works on My Machine" to Production Confidence
How containers and Docker Compose became my essential tools for building, testing, and running robust .NET microservices.

Hey there, dev! We’ve all been there. That moment of relief when you finish a feature, everything compiles, the tests pass, and you proudly declare: "It works on my machine!" Hours later, chaos ensues: the application crashes in the QA environment, or worse, a teammate can't even get the project to run.
For years, we've battled environment drift. Different .NET SDK versions, local environment variables that are never versioned, that one-of-a-kind SQL Server instance running on a machine that behaves "uniquely." This lack of consistency isn't just annoying; it's expensive, creates bugs, and kills productivity.
It was against this backdrop that Docker stopped being "that DevOps tool" and became a cornerstone of my .NET development workflow. And its loyal companion, Docker Compose, became the conductor for our local microservices orchestra. Let's unpack how this duo transformed the way we build software.

1. Why Bother? Environment as Code
The first question many .NET developers ask is, "But Visual Studio already handles everything for me. Why do I need Docker?" The answer lies in a shift in mindset: treating your development environment as code.
A Dockerfile is the exact, versionable recipe for building the environment where your application will run. There's no more, "you forgot to install dependency X," or, "which SDK version are we using?" It's all right there, in code.
When combined with Docker Compose, the power multiplies. You can define not just your API but its entire ecosystem of dependencies in a single file:
The database: Need SQL Server or Postgres? Spin up a container for it.
The cache: Using Redis? That’s just another service in your docker-compose.yml.
Other microservices: Does your app depend on another team's service? Add it to the compose file.
With a single command—docker-compose up—anyone on the team can recreate the complete, identical development environment. The era of the "10-page setup guide" is over.
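The day-to-day loop looks something like this (a sketch using the classic `docker-compose` v1 spelling from above; newer Docker installs use `docker compose`, and the service name is whatever your compose file defines):

```shell
# Build (or rebuild) images and start every service in the background
docker-compose up --build -d

# Follow the logs of one service (service names come from docker-compose.yml)
docker-compose logs -f my-api

# Stop everything and remove the containers and network Compose created
docker-compose down
```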
2. Anatomy of a Modern Dockerfile for .NET 8
The real elegance of using Docker with .NET lies in multi-stage builds. Instead of a monolithic file, we create a production line that optimizes for security, size, and performance.

Let's break down the stages, inspired by a real-world Azure Functions microservice I built:
Stage 1: The Builder (build) We start with a base image that has the full .NET SDK (mcr.microsoft.com/dotnet/sdk:8.0). It's large and packed with tools, perfect for compiling our code. The trick here is to copy the .csproj files first and run dotnet restore. Thanks to Docker's layer caching, dependencies are only downloaded again if the project files change.
# Build Stage - Uses the full SDK
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /source
# Copy and restore dependencies first to leverage caching
COPY *.sln .
# Note: a wildcard COPY like src/*/*.csproj flattens directories, which breaks
# the solution's project paths, so copy each .csproj into its original folder
COPY src/MyWebApp.Functions/*.csproj ./src/MyWebApp.Functions/
RUN dotnet restore
# Copy the rest of the source code
COPY . .
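One detail that makes the caching pay off: the final `COPY . .` invalidates the cache whenever anything in the build context changes, so keep build output and tooling noise out of the context with a .dockerignore. A minimal sketch (adjust to your repo layout):

```
# .dockerignore
**/bin/
**/obj/
.git/
.vs/
Dockerfile
docker-compose.yml
```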
Stage 2: The Validator (test) One of my favorite practices is embedding unit tests right into the image build process. Before publishing anything, we create a stage that simply runs dotnet test.
# Test Stage - Ensures quality
FROM build AS test
WORKDIR /source
RUN dotnet test
If a test fails, the image build fails. This provides fast feedback and a fantastic quality gate.
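Because the stages are named, you can get that feedback without running the whole pipeline: `docker build --target` stops at a given stage. A quick sketch (the image tag is just a placeholder):

```shell
# Build only up to the "test" stage; a failing test fails the build
docker build --target test -t my-api:test .
```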
Stage 3: The Publisher (publish) Here, we use the build output to generate optimized release artifacts.
# Publish Stage - Creates the final artifacts
FROM build AS publish
WORKDIR /source/src/MyWebApp.Functions
RUN dotnet publish -c Release -o /app/publish
Stage 4: The Final Image (runtime) This is the crown jewel. We discard all previous stages and start fresh with a lean, secure runtime image (e.g., mcr.microsoft.com/azure-functions/dotnet-isolated:4-dotnet-isolated8.0). We copy only the published artifacts from the publish stage.
# Final Stage - Lean and secure runtime image
FROM mcr.microsoft.com/azure-functions/dotnet-isolated:4-dotnet-isolated8.0
WORKDIR /home/site/wwwroot
COPY --from=publish /app/publish .
The result? A small image with a minimal attack surface (no SDK, no source code), ready for production.
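Building and running that final image locally is then a two-liner (tag and host port are illustrative; the Azure Functions base image listens on port 80 inside the container):

```shell
# Build the full multi-stage Dockerfile; only the final stage gets tagged
docker build -t my-api:latest .

# Run it, exposing the Functions host on http://localhost:8080
docker run --rm -p 8080:80 my-api:latest
```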
3. Orchestrating the Local Symphony with Docker Compose
The Dockerfile gives us the image. The docker-compose.yml file gets it to play along with the rest of the band. Here’s a practical (and simplified) example of how to orchestrate a .NET API with a SQL Server database:
version: '3.8'
services:
  # Our API microservice
  my-api:
    container_name: my-clean-api
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8080:80" # Maps container port 80 to port 8080 on the host
    environment:
      - ASPNETCORE_ENVIRONMENT=Development
      - ConnectionStrings__SqlConnectionString=Server=db;Database=MyDb;User Id=sa;Password=${DB_PASSWORD};TrustServerCertificate=True
    depends_on:
      - db # Starts the database container first
  # Our database dependency
  db:
    container_name: local-sql-server
    image: mcr.microsoft.com/mssql/server:2022-latest
    environment:
      - ACCEPT_EULA=Y
      - SA_PASSWORD=${DB_PASSWORD}
    ports:
      - "1433:1433" # Exposes the SQL Server port to the host machine
A Quick Analysis:
build: Compose uses our Dockerfile to build the API image.
environment: We inject the connection string, pointing to the db service. The password comes from an environment variable, keeping secrets out of the code.
depends_on: The API container starts only after the db container has started. Keep in mind that depends_on waits for the container to start, not for SQL Server inside it to be ready to accept connections.
A single network: By default, Compose creates a virtual network for these services, allowing them to communicate using their service names (like db).
The complexity of setting up communication between the API and the database is reduced to a few lines of YAML. That’s productivity.
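One caveat worth engineering around: since depends_on only waits for the container to start, the API can still race SQL Server's own startup. A sketch of a readiness gate using a Compose healthcheck (the sqlcmd path varies by image version, e.g. /opt/mssql-tools/bin/sqlcmd on older images, and `$$` escapes `$` so the variable resolves inside the container):

```yaml
services:
  db:
    image: mcr.microsoft.com/mssql/server:2022-latest
    healthcheck:
      # Succeeds only once SQL Server actually accepts logins
      test: ["CMD-SHELL", "/opt/mssql-tools18/bin/sqlcmd -C -S localhost -U sa -P \"$$SA_PASSWORD\" -Q 'SELECT 1' || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 10
  my-api:
    depends_on:
      db:
        condition: service_healthy # waits for the healthcheck, not just container start
```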
Conclusion: More Than a Tool, a Mindset
Adopting Docker in my .NET workflow was a game-changer. The conversation shifted from, "What version of the SDK do you have?" to, "Did you run docker-compose up?"
The benefits are clear and impactful:
Radical Consistency: The same environment from dev to test to production.
Isolation: Multiple projects with different dependencies can run side-by-side without conflict.
Automation and CI/CD: The Dockerfile becomes the single contract for the build and deploy pipeline.
Confidence: The artifact you test locally is the exact same immutable artifact that goes to production.
At the end of the day, Docker isn’t about containers. It’s about predictability. It's about focusing our time on solving complex business problems, with the peace of mind that comes from knowing the foundation—the environment—just works. Every single time.
Clean code is not about beauty — it’s about predictability.
📣 Let's Keep the Conversation Going!
What's been your experience using Docker with .NET? Have you ever been caught in the "it works on my machine" trap? Share your stories and tips in the comments below!





