
🐋 Docker MCP server

An MCP server for managing Docker with natural language!

đŸĒŠ What can it do?

❓ Who is this for?

Demo

A quick demo showing a WordPress deployment using natural language:

mcp-docker-wp-demo.mp4

đŸŽī¸ Quickstart

Install

Claude Desktop

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json

On Windows: %APPDATA%/Claude/claude_desktop_config.json

Install from PyPI with uv

If you don't have uv installed, follow the uv installation instructions for your system.

Then add the following to your MCP servers file:

"mcpServers": {
  "mcp-server-docker": {
    "command": "uvx",
    "args": [
      "mcp-server-docker"
    ]
  }
}

Install with Docker

Purely for convenience, the server can run in a Docker container.

After cloning this repository, build the Docker image:

docker build -t mcp-server-docker .

And then add the following to your MCP servers file:

"mcpServers": {
  "mcp-server-docker": {
    "command": "docker",
    "args": [
      "run",
      "-i",
      "--rm",
      "-v",
      "/var/run/docker.sock:/var/run/docker.sock",
      "mcp-server-docker:latest"
    ]
  }
}

Note that we mount the Docker socket as a volume; this ensures the MCP server can connect to and control the local Docker daemon.
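To make the effect of that mount concrete: any process inside the container that opens the default Unix socket is talking to the host's Docker daemon. Here is a minimal sketch using the Python Docker SDK (the library this server is built on), assuming the default socket path; it is illustrative, not part of the server itself:

import docker

# Inside the container, /var/run/docker.sock is the host's Docker socket,
# so connecting to the default Unix socket reaches the *host* daemon.
client = docker.DockerClient(base_url="unix:///var/run/docker.sock")

print(client.ping())                # True if the host daemon is reachable through the mount
print(client.version()["Version"])  # the host daemon's version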

📝 Prompts

đŸŽģ docker_compose

Use natural language to compose containers. See above for a demo.

Provide a Project Name and a description of the desired containers, and let the LLM do the rest.

This prompt instructs the LLM to enter a plan+apply loop. Your interaction with the LLM will involve the following steps:

  1. You give the LLM instructions for which containers to bring up
  2. The LLM calculates a concise natural language plan and presents it to you
  3. You either:
    • Apply the plan
    • Provide the LLM feedback, and the LLM recalculates the plan

Examples

Resuming a Project

When starting a new chat with this prompt, the LLM will receive the status of any containers, volumes, and networks created with the given project name.

This is mainly useful for cleaning up, in case you lose a chat that was responsible for many containers.
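If the original chat is gone entirely, you can also do the cleanup by hand. Below is a hedged sketch with the Python Docker SDK that lists resources whose names contain the project name; the assumption that the project name appears in the created resource names is ours, so treat it as illustrative rather than the server's exact scheme:

import docker

client = docker.from_env()
project = "wordpress"  # the Project Name you gave the prompt (placeholder)

# Assumption: resources created for the project include the project name in
# their names. List them so you can review or remove any leftovers.
for c in client.containers.list(all=True, filters={"name": project}):
    print("container:", c.name, c.status)
for v in client.volumes.list(filters={"name": project}):
    print("volume:   ", v.name)
for n in client.networks.list(names=[project]):
    print("network:  ", n.name)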

📔 Resources

The server implements a couple of resources for every container:

🔨 Tools

Containers

Images

Networks

Volumes

🚧 Disclaimers

Sensitive Data

DO NOT CONFIGURE CONTAINERS WITH SENSITIVE DATA. This includes API keys, database passwords, etc.

Any sensitive data exchanged with the LLM is inherently compromised, unless the LLM is running on your local machine.

If you are interested in securely passing secrets to containers, file an issue on this repository with your use-case.

Reviewing Created Containers

Be careful to review the containers that the LLM creates. Docker is not a secure sandbox, and therefore the MCP server can potentially impact the host machine through Docker.

For safety reasons, this MCP server doesn't support sensitive Docker options like --privileged or --cap-add/--cap-drop. If these features are of interest to you, file an issue on this repository with your use-case.
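One convenient way to review a created container is to inspect its HostConfig, either with docker inspect or with the Python Docker SDK. A small sketch follows; the container name is a placeholder:

import docker

client = docker.from_env()
container = client.containers.get("my-project-app")  # placeholder name

host_config = container.attrs["HostConfig"]
print("Privileged:", host_config.get("Privileged"))    # should be False
print("CapAdd:    ", host_config.get("CapAdd"))        # added Linux capabilities
print("Binds:     ", host_config.get("Binds"))         # host paths mounted into the container
print("Ports:     ", host_config.get("PortBindings"))  # host ports exposed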

đŸ› ī¸ Configuration

This server uses the Python Docker SDK's from_env method. For configuration details, see the documentation.
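In practice this means the server honors the standard Docker environment variables. A minimal sketch of what from_env resolution looks like:

import docker

# docker.from_env() builds a client from the standard Docker environment
# variables, falling back to the local Unix socket:
#   DOCKER_HOST        daemon address, e.g. unix:///var/run/docker.sock or tcp://...
#   DOCKER_TLS_VERIFY  enable TLS verification for tcp:// hosts
#   DOCKER_CERT_PATH   directory containing the TLS certificates
client = docker.from_env()

print(client.ping())                # True if the daemon is reachable
print(client.version()["Version"])  # the daemon this server will control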

đŸ’ģ Development

Prefer using Devbox to configure your development environment.

See the devbox.json for helpful development commands.

After setting up devbox, you can point your Claude MCP config at it:

  "docker": {
    "command": "/path/to/repo/.devbox/nix/profile/default/bin/uv",
    "args": [
      "--directory",
      "/path/to/repo/",
      "run",
      "mcp-server-docker"
    ]
  },