Leverage Firecrawl’s advanced web scraping tools to search the web, extract content from URLs, crawl entire websites, and pull structured data using LLM-powered extraction.
For background on working with MCP nodes in general, see How to Use MCP Nodes.
What is Firecrawl MCP?
The Firecrawl MCP creates a customized node that understands Firecrawl’s web data API and responds to natural language prompts. Describe what you need, and the node executes Firecrawl tools to search, scrape, crawl, map, or extract data from websites. Results come back as structured data that you can pass to the next step in your workflow.
What Can It Do for You?
- Search the web with optional scraping, supporting multiple sources, time filtering, geo-targeting, and category filtering
- Scrape single URLs and extract content in various formats including markdown, HTML, links, screenshots, and images
- Map entire websites to discover all URLs with sitemap control
- Crawl websites and extract content from multiple pages with natural language prompts, path filtering, and depth control
- Extract structured data from URLs using LLM with JSON Schema for precise data extraction
Available Tools
| Tool | What It Does | Example Use |
|---|---|---|
| Search | Search the web with optional scraping. Supports multiple sources, time filtering, geo-targeting, and category filtering | “Search the web for [search term] and return structured data with title, url, and snippet for the top [result count] results” |
| Scrape | Scrape a single URL and extract content in various formats (markdown, summary, HTML, links, screenshots, images, branding) | “Scrape [url] and return structured data with title, markdown content, and all links found on the page” |
| Map | Get all URLs from a website with sitemap control | “Map the website [url] and return structured data with all discovered URLs” |
| Crawl | Crawl a website and extract content from multiple pages. Supports natural language prompts, path filtering, and depth control | “Crawl [url] with depth [depth] and return structured data with page titles, URLs, and markdown content” |
| Get Crawl Status | Get the status and results of a crawl job using the crawl ID | “Using crawl ID [crawl_id], get the crawl status and return structured data with status, completed pages, and results” |
| Batch Scrape | Scrape multiple URLs at once with shared scrape options | “Batch scrape [urls] and return structured data with title, url, and markdown content for each page” |
| Get Batch Scrape Status | Get the status and results of a batch scrape job using the batch ID | “Using batch ID [batch_id], get the batch scrape status and return structured data with status and results” |
| Extract | Extract structured data from URLs using LLM with JSON Schema (see the schema sketch below) | “Extract [fields] from [url] and return structured data matching the schema” |
| Get Extract Status | Get the status and results of an extract job using the extract ID | “Using extract ID [extract_id], get the extract status and return structured data with status and extracted data” |
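The “JSON Schema” used by the Extract tool is a standard JSON Schema object describing the fields you want back. A minimal sketch is shown below; the field names are illustrative examples for a product page, not fields Firecrawl requires:

```json
{
  "type": "object",
  "properties": {
    "title": { "type": "string" },
    "price": { "type": "string" },
    "in_stock": { "type": "boolean" }
  },
  "required": ["title", "price"]
}
```

In a prompt, you describe these fields in plain language (for example, “Extract title, price, and in_stock from [url]”), and the node returns structured data matching that shape.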
Credit Costs
| Tool | Credits Per Use |
|---|---|
| Search | 85 credits per item |
| Scrape | 80 credits |
| Map | 10 credits |
| Crawl | 80 credits per item |
| Get Crawl Status | 80 credits per item |
| Batch Scrape | 80 credits per item |
| Get Batch Scrape Status | 80 credits per item |
| Extract | 300 credits |
| Get Extract Status | 3 credits |
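As a rough worked example, assuming “per item” counts each returned page or result: a crawl that returns 25 pages would cost 25 × 80 = 2,000 credits, while mapping the same site is a flat 10 credits.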
How to Use
1. Create Your Firecrawl MCP Node: Go to your node library, search for Firecrawl, and click “Create a node with AI”.
2. Add Your Prompt: Drag the Firecrawl MCP node to your canvas and add your prompt in the text box.
3. Test Your Node: Run the node to see the results. If it works as expected, you’re all set. If you need adjustments, check the troubleshooting tips below.
4. Save and Reuse: Once your Firecrawl MCP node is working, save it to your library. You can now use this customized node in any workflow.
Example Prompts
Here are some prompts that work well with Firecrawl MCP:
Web Search with Scraping: “Search the web for [search term] with scraping enabled and return structured data with title, url, and markdown content for the top results”
Troubleshooting
If your Firecrawl MCP node isn’t working as expected, try these best practices:
Keep Prompts Simple and Specific
- Good: “Scrape [url] and return title and markdown content”
- Less Efficient: “Scrape [url], summarize the content, extract all company information, and format it as a report”
Match What Firecrawl Can Do
- Good: “Crawl [url] with depth 2 and return page titles and URLs”
- Less Efficient: “Crawl [url] and send the results to a Google Sheet”
Break Complex Tasks Into Steps
Trying to do everything in one prompt can sometimes lead to timeouts or unexpected results. Split the work into separate nodes instead:
1. Map the Website: “Map the website [url] and return structured data with all discovered URLs”
2. Scrape Product Pages: “For each product URL, scrape and return structured data with title, price, and description”
3. Generate Report: “Using the scraped data, create a summary report with Ask AI and return structured data with key findings”
In your workflow, connect these nodes sequentially. The output from Step 1 becomes the input for Step 2, and Step 2’s data feeds into Step 3.
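As a rough illustration of that hand-off, the structured output of the Map step might look like the sketch below (the field name urls and the example URLs are assumptions for this illustration; the actual fields depend on what your prompt asks for). The Scrape step then runs once per entry in that list and returns its own fields, such as title, price, and description:

```json
{
  "urls": [
    "https://example.com/products/widget-a",
    "https://example.com/products/widget-b"
  ]
}
```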
Focus on Data Retrieval
Firecrawl MCP is great at getting information from websites. For analysis or content creation, connect it to AI nodes in your workflow or agent. Example:
- Good prompt: “Scrape [url] and return structured data with title, author, published date, and main content”
- Less Efficient: “Scrape [url] and write a blog post summary about the content”
Troubleshooting Node Creation
Empty Outputs
If you’re seeing empty outputs in the node creation window (or if you’ve already created the node, hover over it and click “Edit”), use the chat interface to prompt the AI to add debug logs and verify the API response. Specifically mention that you received empty outputs.
Incorrect Results
In the node creation window (or if you’ve already created the node, hover over it and click “Edit”), use the chat interface to describe in detail what you expected versus what you received.
Iterate Using the Chat
MCP node creation often requires a few tweaks. Use the chat interface in the node creation window to refine filters, output fields, or pagination. The AI will adjust the node based on your feedback.
Need More Help?
- Watch the What are MCP Nodes video tutorial
- Check out MCP Best Practices in Gumloop University
- Join the Gumloop Community for support
- View the Firecrawl MCP setup guide for Claude and Cursor
- Contact support at [email protected]
