
MCPShell


MCPShell is a tool that allows LLMs to safely execute command-line tools through the Model Context Protocol (MCP). It provides a secure bridge between LLMs and operating system commands.

Quick Start

Imagine you want Cursor (or some other MCP client) to help you with disk-space problems on your hard disk.

1. Create a configuration file `/my/example.yaml` defining your tools (a rendered-command example is shown after this list):

   ```yaml
   mcp:
     description: |
       Tool for analyzing disk usage to help identify what's consuming space.
     run:
       shell: bash
     tools:
       - name: "disk_usage"
         description: "Check disk usage for a directory"
         params:
           directory:
             type: string
             description: "Directory to analyze"
             required: true
           max_depth:
             type: number
             description: "Maximum depth to analyze (1-3)"
             default: 2
         constraints:
           - "directory.startsWith('/')"           # Must be absolute path
           - "!directory.contains('..')"           # Prevent directory traversal
           - "max_depth >= 1 && max_depth <= 3"    # Limit recursion depth
           - "directory.matches('^[\w\s./\-_]+$')" # Only allow safe path characters, prevent command injection
         run:
           command: |
             du -h --max-depth={{ .max_depth }} {{ .directory }} | sort -hr | head -20
         output:
           prefix: |
             Disk Usage Analysis (Top 20 largest directories):
   ```

   Take a look at the examples directory for more sophisticated and useful examples. Maybe you prefer to let the LLM know about your Kubernetes cluster with kubectl? Or let it run some AWS CLI commands?

2. Configure the MCP server in Cursor (or in any other LLM client with support for MCP).

   For example, for Cursor, create `.cursor/mcp.json`:
   ```json
   {
     // you need the "go" command available
     "mcpServers": {
       "mcp-cli-examples": {
         "command": "go",
         "args": [
           "run", "github.com/inercia/MCPShell@v0.1.8",
           "mcp", "--tools", "/my/example.yaml",
           "--logfile", "/some/path/mcpshell/example.log"
         ]
       }
     }
   }
   ```
   You can also use relative paths and omit the `.yaml` extension:
   ```json
   {
     "mcpServers": {
       "mcp-cli-examples": {
         "command": "go",
         "args": [
           "run", "github.com/inercia/MCPShell@v0.1.8",
           "mcp", "--tools", "example",
           "--logfile", "/some/path/mcpshell/example.log"
         ]
       }
     }
   }
   ```
   This will look for `example.yaml` in the tools directory (`~/.mcpshell/tools/` by default). An alternative setup that installs the binary instead of using `go run` is sketched after this list.
   See more details on how to configure Cursor or Visual Studio Code. Other LLMs with support for MCP should be configured in a similar way.

3. Make sure your MCP client is refreshed (Cursor should recognize it automatically the first time, but any change in the config file will require a refresh).

4. Ask your LLM some questions it should be able to answer with the new tool. For example: "I'm running out of space in my hard disk. Could you help me find the problem?".
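
To make the command template in step 1 concrete: for a call with `directory="/var/log"` and `max_depth=2` (sample values, chosen here purely for illustration), the template renders to the following shell pipeline:

```bash
# du reports per-directory sizes down to two levels; sort -hr orders them
# largest-first; head keeps the top 20 entries
du -h --max-depth=2 /var/log | sort -hr | head -20
```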
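
If you would rather not compile on every start with `go run`, a standard Go workflow is to install the binary once and point `"command"` at it instead. This is a sketch based on ordinary `go install` behavior (the binary name `MCPShell` follows from the module path, and `$HOME/go/bin` is assumed to be on your PATH):

```bash
# Install the pinned version once (standard Go tooling)
go install github.com/inercia/MCPShell@v0.1.8

# The server can then be started directly, with the same arguments
# used in the mcp.json above:
MCPShell mcp --tools /my/example.yaml --logfile /some/path/mcpshell/example.log
```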

Usage and Configuration

Take a look at all the commands in this document.

Configuration files use a YAML format defined here. See this directory for some examples.

For deploying MCPShell in containers and Kubernetes, see the Container Deployment Guide.

Agent Mode

MCPShell can also be run in agent mode, providing direct connectivity between Large Language Models (LLMs) and your command-line tools without requiring a separate MCP client. In this mode, MCPShell connects to an OpenAI-compatible API (including local LLMs like Ollama), makes your tools available to the model, executes requested tool operations, and manages the conversation flow. This enables the creation of specialized AI assistants that can autonomously perform system tasks using the tools you define in your configuration. The agent mode supports both interactive conversations and one-shot executions, and allows you to define system and user prompts directly in your configuration files.
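
As a rough sketch, starting an agent-mode session might look like the following. Note that this is an assumption for illustration: the `agent` subcommand and flag names are modeled on the `mcp` invocation above and are not confirmed by this README; see the documentation linked below for the real invocation.

```bash
# Hypothetical agent-mode invocation: subcommand and flags are assumed,
# not taken from this README. It would connect to an OpenAI-compatible
# API and expose the tools defined in the YAML file to the model.
go run github.com/inercia/MCPShell@v0.1.8 agent \
  --tools /my/example.yaml \
  --model "gpt-4o"
```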

For detailed information on using agent mode, see the Agent Mode documentation.

Security Considerations

So you will probably think: _"this AI has helped me find all those big files. What if I create another tool for removing files?"_. Don't do that!

Please read the Security Considerations document before using this software.

Contributing

Contributions are welcome! Take a look at the development guide. Please open an issue or submit a pull request on GitHub.

License

This project is licensed under the MIT License - see the LICENSE file for details.