MCP Ollama

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

Requirements

  • Python 3.10 or higher
  • Ollama installed and running (https://ollama.com/download)
  • At least one model pulled with Ollama (e.g., ollama pull llama2)
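A quick way to confirm that Ollama is running and has at least one model available (the curl call assumes the default local API address, http://localhost:11434):

ollama list
curl http://localhost:11434/api/tags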

Configure Claude Desktop

Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
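After saving the file, restart Claude Desktop so it picks up the new server. If you would rather have Claude Desktop run a local checkout instead of the published package, a configuration along these lines should work (the path is a placeholder for wherever you cloned the repository):

{
  "mcpServers": {
    "ollama": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-ollama",
        "run",
        "mcp-ollama"
      ]
    }
  }
}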

Development

Install in development mode:

git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync
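From the repository root you should then be able to start the server over stdio as a sanity check (assuming the mcp-ollama entry point defined by the package, the same one uvx invokes above):

uv run mcp-ollama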

Test with MCP Inspector:

mcp dev src/mcp_ollama/server.py

Features

The server provides the following tools:

  • list_models - List all downloaded Ollama models
  • show_model - Get detailed information about a specific model
  • ask_model - Ask a question to a specified model
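The exact parameter names depend on the server's tool definitions; the snippet below is only a minimal sketch of exercising the server from Python with the official mcp client library, assuming ask_model takes a model name and a question string. Check the schemas returned by list_tools() for the real argument names.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way Claude Desktop does (via uvx).
server = StdioServerParameters(command="uvx", args=["mcp-ollama"])

async def main():
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover and print the names of the tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call ask_model; the argument names here are assumptions,
            # not taken from this README.
            result = await session.call_tool(
                "ask_model",
                {"model": "llama2", "question": "In one sentence, what is MCP?"},
            )
            print(result.content)

asyncio.run(main())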

License

MIT
