Ollama is an AI & ML MCP server that lets Claude Code, Cursor, Windsurf, and any other MCP-compatible AI agent run local LLMs via an Ollama server. Install it in one minute with `mcpizy install ollama`.
Quick install: `mcpizy install ollama`, or run the server directly with `npx -y ollama-mcp`.

If Ollama doesn't fit your stack, these AI & ML MCP servers solve similar problems:

- Open-source LLM observability: collaborative prompt editing, tracing, and evaluation.
- AI forecasting: time series predictions, anomaly detection, and trend analysis.
- Production-ready RAG for searching and retrieving document data; semantic search over your files.
- Models, datasets, and Spaces on Hugging Face.
The Ollama MCP server is an AI & ML Model Context Protocol server that lets Claude Code, Cursor, Windsurf, VS Code with Copilot, and other MCP-compatible AI agents run local LLMs through an Ollama server. It exposes Ollama's capabilities as tools the AI can call directly from your editor or CLI.
The fastest way is the MCPizy CLI: run `mcpizy install ollama` and MCPizy will add the server to your `.claude.json` automatically. You can also install it manually by adding an entry under `mcpServers` in `.claude.json` with the command `npx -y ollama-mcp` and restarting Claude Code.
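For reference, here is a minimal sketch of that manual entry. The `"ollama"` key is just a label you choose; the command and args come straight from the install instructions above.

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
```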
Yes. The Ollama MCP server is free and open source. Ollama itself runs locally, so no account or API key is required, and the MCP layer has no MCPizy subscription cost.
Yes. Any MCP-compatible client works — including Claude Code, Claude Desktop, Cursor (via `.cursor/mcp.json`), Windsurf, VS Code with Copilot Chat, and custom agents built on the MCP SDK. The same install command targets all of them; only the config file path differs.
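As a sketch of that point, the same entry in Cursor's `.cursor/mcp.json` is identical in shape to the Claude Code one; only the file's location differs:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
```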
Once installed, your AI agent can run local LLMs through Ollama directly inside your conversation. Typical use cases include asking Claude Code or Cursor to run Ollama operations, inspect results, chain Ollama with other MCP servers (see our Workflow Recipes), and automate repetitive AI & ML tasks without leaving your editor.