
Code Assistant


A CLI tool built in Rust for assisting with code-related tasks.

Features

Installation

Ensure you have Rust installed on your system. Then:

Clone the repository

git clone https://github.com/stippi/code-assistant

Navigate to the project directory

cd code-assistant

Build the project

cargo build --release

The binary will be available in target/release/code-assistant

Configuration in Claude Desktop

The code-assistant implements the Model Context Protocol by Anthropic. This means it can be added as a plugin to MCP client applications such as Claude Desktop.

Configure Your Projects

Create the file ~/.config/code-assistant/projects.json. It defines the projects made available in MCP server mode (via list_projects and the file operation tools). It has the following structure:

{
  "code-assistant": {
    "path": "/Users/<username>/workspace/code-assistant"
  },
  "asteroids": {
    "path": "/Users/<username>/workspace/asteroids"
  },
  "zed": {
    "path": "/Users/<username>/workspace/zed"
  }
}
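Assuming a POSIX shell, the file can be created in one step. The project name and path below are placeholders; substitute your own absolute paths:

```shell
# Create the config directory and write a minimal projects.json.
# "my-project" and its path are placeholders -- use your own.
mkdir -p ~/.config/code-assistant
cat > ~/.config/code-assistant/projects.json <<'EOF'
{
  "my-project": { "path": "/absolute/path/to/my-project" }
}
EOF
```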

Notes:

Configure MCP Servers

In Claude Desktop, open Settings, switch to the Developer tab, and click Edit Config. A Finder window opens highlighting the file claude_desktop_config.json. Open that file in your favorite text editor.

An example configuration is given below:
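The example did not survive in this copy, so here is a sketch of what a typical entry for a stdio MCP server looks like; the binary path is a placeholder for wherever you built code-assistant:

```json
{
  "mcpServers": {
    "code-assistant": {
      "command": "/Users/<username>/workspace/code-assistant/target/release/code-assistant",
      "args": ["server"]
    }
  }
}
```

After saving the file, restart Claude Desktop so it picks up the new server.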

Usage

Code Assistant can run in two modes:

Agent Mode (Default)

code-assistant --task <TASK> [OPTIONS]

Available options:

Environment variables:

Examples:

Analyze code in current directory using Anthropic's Claude

code-assistant --task "Explain the purpose of this codebase"

Use OpenAI to analyze a specific directory with verbose logging

code-assistant -p open-ai --path ./my-project -t "List all API endpoints" -v

Use Google's Vertex AI with a specific model

code-assistant -p vertex --model gemini-1.5-flash -t "Analyze code complexity"

Use Ollama with a specific model (model is required for Ollama)

code-assistant -p ollama -m codellama --task "Find all TODO comments in the codebase"

Use AI Core provider

code-assistant -p ai-core --task "Document the public API"

Use the working-memory agent mode instead of the default message-history mode

code-assistant --task "Find performance bottlenecks" --agent-mode working_memory

Continue a previously interrupted task

code-assistant --continue-task

Start with GUI interface

code-assistant --ui

Record a session for later playback

code-assistant --task "Optimize database queries" --record ./recordings/db-optimization.json

Play back a recorded session with fast-forward (no timing delays)

code-assistant --playback ./recordings/db-optimization.json --fast-playback

Server Mode

Runs as a Model Context Protocol server:

code-assistant server [OPTIONS]

Available options:
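Once started, the server communicates over stdin/stdout using JSON-RPC 2.0, as defined by the Model Context Protocol. The first message an MCP client such as Claude Desktop sends is an initialize request, roughly of this shape (the field values here are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "claude-desktop", "version": "1.0.0" }
  }
}
```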

Roadmap

This section is not really a roadmap, since the items are in no particular order. Below are some topics that are likely to be the next focus.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.