Firecrawl is a Web Scraping MCP server that lets Claude Code, Cursor, Windsurf and any MCP-compatible AI agent extract clean data from any website. Install in 1 minute with `mcpizy install firecrawl`.
Web Scraping
Extract clean data from any website. Turn entire websites into LLM-ready markdown or structured data.
Official homepage
Connect your Firecrawl account once — MCPizy stores the credentials encrypted and uses them whenever you run a recipe in managed mode.
mcpizy install firecrawl
npx -y firecrawl-mcp

firecrawl_scrape: Scrape a single URL and return markdown + metadata
Inputs: url (string, required); formats (string[], optional)

firecrawl_crawl: Crawl a website recursively up to a depth
Inputs: url (string, required); limit (number, optional)

firecrawl_map: Map all URLs on a website
Inputs: url (string, required)

firecrawl_search: Search the web and scrape top results
Inputs: query (string, required); limit (number, optional)

firecrawl_extract: Extract structured data using a schema
Inputs: urls (string[], required); schema (object, required)

Works identically across clients. Only the config file path differs.
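Under the hood, a client invokes these tools with standard MCP `tools/call` requests. A minimal sketch of the JSON-RPC payload for `firecrawl_scrape` (the envelope follows the MCP specification; the URL is purely illustrative):

```python
import json

# Hypothetical tools/call request for firecrawl_scrape.
# The "arguments" keys mirror the inputs listed above:
# url (string, required) and formats (string[], optional).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "firecrawl_scrape",
        "arguments": {
            "url": "https://example.com",  # illustrative URL
            "formats": ["markdown"],
        },
    },
}

# Serialize as the client would before sending it over stdio/HTTP.
payload = json.dumps(request)
print(payload)
```

Your MCP client builds and sends these requests for you; the sketch only shows how the input tables above map onto the wire format.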
~/.claude.json
{
"mcpServers": {
"firecrawl": {
"command": "npx",
"args": [
"-y",
"firecrawl-mcp"
],
"env": {
"FIRECRAWL_API_KEY": "fc-..."
}
}
}
}
.cursor/mcp.json
{
"mcpServers": {
"firecrawl": {
"command": "npx",
"args": [
"-y",
"firecrawl-mcp"
],
"env": {
"FIRECRAWL_API_KEY": "fc-..."
}
}
}
}
~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"firecrawl": {
"command": "npx",
"args": [
"-y",
"firecrawl-mcp"
],
"env": {
"FIRECRAWL_API_KEY": "fc-..."
}
}
}
}
Create an API key in the Firecrawl dashboard and set it as `FIRECRAWL_API_KEY`.
Paste any of these prompts into Claude Code, Cursor or another MCP-compatible client.
“Scrape https://news.ycombinator.com and give me the top 10 headlines”
Uses: firecrawl_scrape
“Crawl anthropic.com and build a sitemap of docs pages”
Uses: firecrawl_crawl, firecrawl_map
“Extract product name, price, and rating from these 20 Amazon pages”
Uses: firecrawl_extract
“Search the web for 'MCP server best practices 2025' and summarise”
Uses: firecrawl_search
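The `firecrawl_extract` prompt above relies on the required `schema` input describing the fields to pull out. A minimal sketch of what the arguments might look like for the product example (field names and URL are illustrative, not a fixed Firecrawl contract):

```python
import json

# Illustrative JSON Schema for the "product name, price, rating" prompt.
product_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "price": {"type": "number"},
        "rating": {"type": "number"},
    },
    "required": ["name", "price"],
}

# The schema travels as the required `schema` input of firecrawl_extract,
# alongside the required `urls` string array.
arguments = {
    "urls": ["https://example.com/product/1"],  # illustrative URL
    "schema": product_schema,
}
print(json.dumps(arguments, indent=2))
```

In practice you describe the fields in plain English and the agent builds a schema like this before calling the tool.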
If Firecrawl doesn't fit your stack, these Web Scraping MCP servers solve similar problems.
Scrape websites with Oxylabs Web API. Dynamic rendering, SERP data, and e-commerce extraction.
Real-time web data extraction with HTML, markdown, and screenshot support.
Easy web data access and simplified information retrieval with proxy support.
Real-time Google SERP results integration for LLM applications.
Use 3,000+ pre-built cloud tools to extract data from websites, social media, and e-commerce platforms.
Extract and interact with the web at scale. Automated access with proxy networks and SERP API.
Render high-quality website screenshots. Full page captures, element selection, and custom viewports.
The Firecrawl MCP server is a Web Scraping Model Context Protocol server that lets Claude Code, Cursor, Windsurf, VS Code with Copilot, and other MCP-compatible AI agents extract clean data from any website. It exposes Firecrawl's capabilities as tools the AI can call directly from your editor or CLI.
The fastest way is the MCPizy CLI: run `mcpizy install firecrawl` and MCPizy will add the server to your `.claude.json` automatically. You can also install it manually by adding an entry under `mcpServers` in `.claude.json` with the command `npx -y firecrawl-mcp` and restarting Claude Code.
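A manual install amounts to merging one JSON object into the client's config file. A minimal sketch in Python (the entry mirrors the `.claude.json` example above; the helper name and the scratch-file path are illustrative — pass your real config path if you prefer scripting the edit by hand):

```python
import json
import tempfile
from pathlib import Path

def add_firecrawl(config_path: str, api_key: str) -> dict:
    """Merge the firecrawl server entry into an MCP client config file."""
    path = Path(config_path).expanduser()
    # Preserve any existing servers; create the file if it's missing.
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["firecrawl"] = {
        "command": "npx",
        "args": ["-y", "firecrawl-mcp"],
        "env": {"FIRECRAWL_API_KEY": api_key},
    }
    path.write_text(json.dumps(config, indent=2))
    return config

# Example run against a scratch file rather than the real ~/.claude.json:
cfg_file = Path(tempfile.mkdtemp()) / "claude-test.json"
cfg = add_firecrawl(str(cfg_file), "fc-placeholder")
```

The same function works for `.cursor/mcp.json` and the Windsurf config, since all three share the `mcpServers` shape; only the path differs.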
Yes. The Firecrawl MCP server is free and open source (see the GitHub repository linked on this page). You may still need a Firecrawl account or API key to connect the server to the underlying service, but the MCP layer itself has no MCPizy subscription cost.
Yes. Any MCP-compatible client works — including Claude Code, Claude Desktop, Cursor (via `.cursor/mcp.json`), Windsurf, VS Code with Copilot Chat, and custom agents built on the MCP SDK. The same install command targets all of them; only the config file path differs.
Once installed, your AI agent can extract clean data from any website directly inside your conversation. Typical use cases include asking Claude Code or Cursor to run Firecrawl operations, inspect results, chain Firecrawl with other MCP servers (see our Workflow Recipes), and automate repetitive web scraping tasks without leaving your editor.