Langfuse is an AI & ML MCP server that brings open-source LLM observability to Claude Code, Cursor, Windsurf, and any MCP-compatible AI agent. Install in one minute with `mcpizy install langfuse`.
AI & ML
Open-source LLM observability. Collaborative prompt editing, tracing, and evaluation.
mcpizy install langfuse
npx -y @langfuse/mcp-server

If Langfuse doesn't fit your stack, these AI & ML MCP servers solve similar problems:
- AI-powered codebase context. Understand any GitHub repo instantly with semantic code search.
- Predict anything with AI forecasting. Time series predictions, anomaly detection, and trend analysis.
- Production-ready RAG for searching and retrieving document data. Semantic search over your files.
- Models, datasets, and spaces on HF.
The Langfuse MCP server is an AI & ML Model Context Protocol server that brings open-source LLM observability to Claude Code, Cursor, Windsurf, VS Code with Copilot, and other MCP-compatible AI agents. It exposes Langfuse's capabilities as tools the AI can call directly from your editor or CLI.
The fastest way is the MCPizy CLI: run `mcpizy install langfuse` and MCPizy will add the server to your `.claude.json` automatically. You can also install it manually by adding an entry under `mcpServers` in `.claude.json` with the command `npx -y @langfuse/mcp-server` and restarting Claude Code.
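As a sketch, the manual entry described above would look like this in `.claude.json`. The `mcpServers`/`command`/`args` shape follows Claude Code's standard config schema; only the `npx -y @langfuse/mcp-server` command comes from this page, so check the server's README for any extra fields it expects:

```json
{
  "mcpServers": {
    "langfuse": {
      "command": "npx",
      "args": ["-y", "@langfuse/mcp-server"]
    }
  }
}
```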
Yes. The Langfuse MCP server is free and open source (see the GitHub repository linked on this page). You may still need a Langfuse account or API key to connect the server to the underlying service, but the MCP layer itself has no MCPizy subscription cost.
Yes. Any MCP-compatible client works — including Claude Code, Claude Desktop, Cursor (via `.cursor/mcp.json`), Windsurf, VS Code with Copilot Chat, and custom agents built on the MCP SDK. The same install command targets all of them; only the config file path differs.
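For Cursor, the same entry goes in `.cursor/mcp.json`, optionally with credentials passed as environment variables. The variable names below (`LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, `LANGFUSE_HOST`) are assumptions based on the Langfuse SDK's conventions, not confirmed by this page — verify the exact names in the server's documentation:

```json
{
  "mcpServers": {
    "langfuse": {
      "command": "npx",
      "args": ["-y", "@langfuse/mcp-server"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-lf-...",
        "LANGFUSE_SECRET_KEY": "sk-lf-...",
        "LANGFUSE_HOST": "https://cloud.langfuse.com"
      }
    }
  }
}
```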
Once installed, your AI agent can use Langfuse's open-source LLM observability directly inside your conversation. Typical use cases include asking Claude Code or Cursor to run Langfuse operations, inspect results, chain Langfuse with other MCP servers (see our Workflow Recipes), and automate repetitive AI & ML tasks without leaving your editor.