# Scrapfly Documentation

## Table of Contents

### Dashboard

- [Intro](https://scrapfly.io/docs)
- [Project](https://scrapfly.io/docs/project)
- [Account](https://scrapfly.io/docs/account)
- [Workspace & Team](https://scrapfly.io/docs/workspace-and-team)
- [Billing](https://scrapfly.io/docs/billing)

### Products

#### MCP Server

- [Getting Started](https://scrapfly.io/docs/mcp/getting-started)
- [Tools & API Spec](https://scrapfly.io/docs/mcp/tools)
- [Authentication](https://scrapfly.io/docs/mcp/authentication)
- [Examples & Use Cases](https://scrapfly.io/docs/mcp/examples)
- [FAQ](https://scrapfly.io/docs/mcp/faq)
##### Integrations

- [Overview](https://scrapfly.io/docs/mcp/integrations)
- [Claude Desktop](https://scrapfly.io/docs/mcp/integrations/claude-desktop)
- [Claude Code](https://scrapfly.io/docs/mcp/integrations/claude-code)
- [ChatGPT](https://scrapfly.io/docs/mcp/integrations/chatgpt)
- [Cursor](https://scrapfly.io/docs/mcp/integrations/cursor)
- [Cline](https://scrapfly.io/docs/mcp/integrations/cline)
- [Windsurf](https://scrapfly.io/docs/mcp/integrations/windsurf)
- [Zed](https://scrapfly.io/docs/mcp/integrations/zed)
- [Roo Code](https://scrapfly.io/docs/mcp/integrations/roo-code)
- [VS Code](https://scrapfly.io/docs/mcp/integrations/vscode)
- [LangChain](https://scrapfly.io/docs/mcp/integrations/langchain)
- [LlamaIndex](https://scrapfly.io/docs/mcp/integrations/llamaindex)
- [CrewAI](https://scrapfly.io/docs/mcp/integrations/crewai)
- [OpenAI](https://scrapfly.io/docs/mcp/integrations/openai)
- [n8n](https://scrapfly.io/docs/mcp/integrations/n8n)
- [Make](https://scrapfly.io/docs/mcp/integrations/make)
- [Zapier](https://scrapfly.io/docs/mcp/integrations/zapier)
- [Vapi AI](https://scrapfly.io/docs/mcp/integrations/vapi)
- [Agent Builder](https://scrapfly.io/docs/mcp/integrations/agent-builder)
- [Custom Client](https://scrapfly.io/docs/mcp/integrations/custom-client)


#### Web Scraping API

- [Getting Started](https://scrapfly.io/docs/scrape-api/getting-started)
- [API Specification](https://scrapfly.io/docs/scrape-api/specification)
- [Monitoring](https://scrapfly.io/docs/monitoring)
- [Customize Request](https://scrapfly.io/docs/scrape-api/custom)
- [Debug](https://scrapfly.io/docs/scrape-api/debug)
- [Anti Scraping Protection](https://scrapfly.io/docs/scrape-api/anti-scraping-protection)
- [Proxy](https://scrapfly.io/docs/scrape-api/proxy)
- [Proxy Mode](https://scrapfly.io/docs/scrape-api/proxy-mode)
- [Proxy Mode - Screaming Frog](https://scrapfly.io/docs/scrape-api/proxy-mode/screaming-frog)
- [Proxy Mode - Apify](https://scrapfly.io/docs/scrape-api/proxy-mode/apify)
- [(Auto) Data Extraction](https://scrapfly.io/docs/scrape-api/extraction)
- [Javascript Rendering](https://scrapfly.io/docs/scrape-api/javascript-rendering)
- [Javascript Scenario](https://scrapfly.io/docs/scrape-api/javascript-scenario)
- [SSL](https://scrapfly.io/docs/scrape-api/ssl)
- [DNS](https://scrapfly.io/docs/scrape-api/dns)
- [Cache](https://scrapfly.io/docs/scrape-api/cache)
- [Session](https://scrapfly.io/docs/scrape-api/session)
- [Webhook](https://scrapfly.io/docs/scrape-api/webhook)
- [Screenshot](https://scrapfly.io/docs/scrape-api/screenshot)
- [Errors](https://scrapfly.io/docs/scrape-api/errors)
- [Timeout](https://scrapfly.io/docs/scrape-api/understand-timeout)
- [Throttling](https://scrapfly.io/docs/throttling)
- [Troubleshoot](https://scrapfly.io/docs/scrape-api/troubleshoot)
- [Billing](https://scrapfly.io/docs/scrape-api/billing)
- [FAQ](https://scrapfly.io/docs/scrape-api/faq)

#### Crawler API

- [Getting Started](https://scrapfly.io/docs/crawler-api/getting-started)
- [API Specification](https://scrapfly.io/docs/crawler-api/specification)
- [Retrieving Results](https://scrapfly.io/docs/crawler-api/results)
- [WARC Format](https://scrapfly.io/docs/crawler-api/warc-format)
- [Data Extraction](https://scrapfly.io/docs/crawler-api/extraction-rules)
- [Webhook](https://scrapfly.io/docs/crawler-api/webhook)
- [Billing](https://scrapfly.io/docs/crawler-api/billing)
- [Errors](https://scrapfly.io/docs/crawler-api/errors)
- [Troubleshoot](https://scrapfly.io/docs/crawler-api/troubleshoot)
- [FAQ](https://scrapfly.io/docs/crawler-api/faq)

#### Screenshot API

- [Getting Started](https://scrapfly.io/docs/screenshot-api/getting-started)
- [API Specification](https://scrapfly.io/docs/screenshot-api/specification)
- [Accessibility Testing](https://scrapfly.io/docs/screenshot-api/accessibility)
- [Webhook](https://scrapfly.io/docs/screenshot-api/webhook)
- [Billing](https://scrapfly.io/docs/screenshot-api/billing)
- [Errors](https://scrapfly.io/docs/screenshot-api/errors)

#### Extraction API

- [Getting Started](https://scrapfly.io/docs/extraction-api/getting-started)
- [API Specification](https://scrapfly.io/docs/extraction-api/specification)
- [Rules Template](https://scrapfly.io/docs/extraction-api/rules-and-template)
- [LLM Extraction](https://scrapfly.io/docs/extraction-api/llm-prompt)
- [AI Auto Extraction](https://scrapfly.io/docs/extraction-api/automatic-ai)
- [Webhook](https://scrapfly.io/docs/extraction-api/webhook)
- [Billing](https://scrapfly.io/docs/extraction-api/billing)
- [Errors](https://scrapfly.io/docs/extraction-api/errors)
- [FAQ](https://scrapfly.io/docs/extraction-api/faq)

#### Proxy Saver

- [Getting Started](https://scrapfly.io/docs/proxy-saver/getting-started)
- [Fingerprints](https://scrapfly.io/docs/proxy-saver/fingerprints)
- [Optimizations](https://scrapfly.io/docs/proxy-saver/optimizations)
- [SSL Certificates](https://scrapfly.io/docs/proxy-saver/certificates)
- [Protocols](https://scrapfly.io/docs/proxy-saver/protocols)
- [Pacfile](https://scrapfly.io/docs/proxy-saver/pacfile)
- [Secure Credentials](https://scrapfly.io/docs/proxy-saver/security)
- [Billing](https://scrapfly.io/docs/proxy-saver/billing)

#### Cloud Browser API

- [Getting Started](https://scrapfly.io/docs/cloud-browser-api/getting-started)
- [Proxy & Geo-Targeting](https://scrapfly.io/docs/cloud-browser-api/proxy)
- [Unblock API](https://scrapfly.io/docs/cloud-browser-api/unblock)
- [File Downloads](https://scrapfly.io/docs/cloud-browser-api/file-downloads)
- [Session Resume](https://scrapfly.io/docs/cloud-browser-api/session-resume)
- [Human-in-the-Loop](https://scrapfly.io/docs/cloud-browser-api/human-in-the-loop)
- [Debug Mode](https://scrapfly.io/docs/cloud-browser-api/debug-mode)
- [Bring Your Own Proxy](https://scrapfly.io/docs/cloud-browser-api/bring-your-own-proxy)
- [Browser Extensions](https://scrapfly.io/docs/cloud-browser-api/extensions)
##### Integrations

- [Puppeteer](https://scrapfly.io/docs/cloud-browser-api/puppeteer)
- [Playwright](https://scrapfly.io/docs/cloud-browser-api/playwright)
- [Selenium](https://scrapfly.io/docs/cloud-browser-api/selenium)
- [Vercel Agent Browser](https://scrapfly.io/docs/cloud-browser-api/agent-browser)
- [Browser Use](https://scrapfly.io/docs/cloud-browser-api/browser-use)
- [Stagehand](https://scrapfly.io/docs/cloud-browser-api/stagehand)
- [Vibium](https://scrapfly.io/docs/cloud-browser-api/vibium)

- [Billing](https://scrapfly.io/docs/cloud-browser-api/billing)
- [Errors](https://scrapfly.io/docs/cloud-browser-api/errors)


### Tools

- [Antibot Detector](https://scrapfly.io/docs/tools/antibot-detector)

### SDK

- [Golang](https://scrapfly.io/docs/sdk/golang)
- [Python](https://scrapfly.io/docs/sdk/python)
- [TypeScript](https://scrapfly.io/docs/sdk/typescript)
- [Scrapy](https://scrapfly.io/docs/sdk/scrapy)

### Integrations

- [Getting Started](https://scrapfly.io/docs/integration/getting-started)
- [LangChain](https://scrapfly.io/docs/integration/langchain)
- [LlamaIndex](https://scrapfly.io/docs/integration/llamaindex)
- [CrewAI](https://scrapfly.io/docs/integration/crewai)
- [Zapier](https://scrapfly.io/docs/integration/zapier)
- [Make](https://scrapfly.io/docs/integration/make)
- [n8n](https://scrapfly.io/docs/integration/n8n)

### Academy

- [Overview](https://scrapfly.io/academy)
- [Web Scraping Overview](https://scrapfly.io/academy/scraping-overview)
- [Tools](https://scrapfly.io/academy/tools-overview)
- [Reverse Engineering](https://scrapfly.io/academy/reverse-engineering)
- [Static Scraping](https://scrapfly.io/academy/static-scraping)
- [HTML Parsing](https://scrapfly.io/academy/html-parsing)
- [Dynamic Scraping](https://scrapfly.io/academy/dynamic-scraping)
- [Hidden API Scraping](https://scrapfly.io/academy/hidden-api-scraping)
- [Headless Browsers](https://scrapfly.io/academy/headless-browsers)
- [Hidden Web Data](https://scrapfly.io/academy/hidden-web-data)
- [JSON Parsing](https://scrapfly.io/academy/json-parsing)
- [Data Processing](https://scrapfly.io/academy/data-processing)
- [Scaling](https://scrapfly.io/academy/scaling)
- [Walkthrough Summary](https://scrapfly.io/academy/walkthrough-summary)
- [Scraper Blocking](https://scrapfly.io/academy/scraper-blocking)
- [Proxies](https://scrapfly.io/academy/proxies)

---

# n8n

 Powerful workflow automation platform. Integrate Scrapfly web scraping into your n8n workflows for automated data collection, monitoring, and processing pipelines.

Workflow Automation · Cloud · Self-Hosted · [Official Website](https://n8n.io/)

 ## Prerequisites

Before getting started, make sure you have the following:

- [n8n](https://n8n.io/) instance (cloud or self-hosted)
- Your Scrapfly API key ([create account](https://scrapfly.io/register))
 
## Setup Instructions

Integrate Scrapfly into your n8n workflows using the **MCP Client** node. This enables powerful automated web scraping pipelines.

1. **Create New n8n Workflow** Start a new workflow in n8n:
    
    
    1. Log in to your n8n instance
    2. Click "New Workflow"
    3. Add a trigger node (Schedule, Webhook, Manual, etc.)
    4. From the search bar, search for the node **MCP Client** and add it to your workflow
     
      Tip: Common Trigger Nodes
    - **Schedule Trigger:** Run scraping on a schedule (hourly, daily, etc.)
    - **Webhook:** Trigger scraping via HTTP request
    - **Manual Trigger:** Run scraping on demand
    - **Email Trigger:** Scrape URLs from incoming emails
2. **Configure Your MCP Client Node**
    1. **Server Transport:** **HTTP Streamable**
    2. **MCP Endpoint URL:** `https://mcp.scrapfly.io/mcp`
    3. **Authentication:** **Bearer Auth**
    4. **Credential for Bearer Auth:** Click the edit button and add your Scrapfly API key
3. **Save MCP Client Configuration** After entering the configuration, you should see the supported Scrapfly MCP tools in the **Tool** dropdown menu.
4. **Use the Scrapfly MCP Client with an AI Agent Node** To get the most out of MCP clients in n8n, combine your MCP tool with an AI Agent node:
    
    
    1. From the workflow, search for the node **AI Agent** and add it to your workflow
    2. Link the Scrapfly MCP Client as a tool for the AI agent node
    3. Link a chat model to the AI agent node to allow agentic use of the Scrapfly MCP tool
5. **Process Scrapfly Response** Extract and process the scraped content using n8n's data transformation nodes:
    
    **Extract Content with Set Node:**
    
    
    1. Add a "Set" node after the AI Agent (or MCP Client) node
    2. Configure it to extract the content:
     
    
    - **Field Name:** `content`
    - **Value:** `{{ $json.result.content }}`
     
    **Alternative: Use Code Node for Complex Processing:**
    
    ```javascript
    // Extract and process scraped content
    const scrapedData = $input.all();
    
    return scrapedData.map(item => ({
      json: {
        url: item.json.result.url,
        content: item.json.result.content,
        title: item.json.result.content.split("\n")[0], // Extract first line as title
        scrapedAt: new Date().toISOString()
      }
    }));
    ```
    
      **Pro Tip:** Use the Code node to clean, parse, or transform scraped data before saving it!
6. **Add Output Action** Send the scraped data to your desired destination:
    
    **Popular Output Nodes:**
    
    
    - **Google Sheets:** Save data to a spreadsheet
    - **Airtable:** Store in a database
    - **Slack/Discord:** Send alerts or notifications
    - **Email:** Email reports
    - **Webhook:** Send to your custom API
    - **File:** Save as JSON, CSV, or other formats
     
      Example: Save to Google Sheets. Add a Google Sheets node with these settings:
    
    
    - **Operation:** Append Row
    - **Spreadsheet:** Select your sheet
    - **Columns:** Map `url`, `content`, `scrapedAt`
7. **Test and Activate Workflow** Test your workflow and activate it for production use:
    
    
    1. Click "Execute Workflow" to test manually
    2. Review the output from each node
    3. Once working correctly, toggle the workflow to "Active"
    4. Monitor executions in the "Executions" tab
     
      Tip: Error Handling. Add error handling to your workflow:
    
    
    - Enable "Continue On Fail" on the MCP Client node
    - Add an IF node to check for errors: `{{ $json.error }}`
    - Route errors to a notification or logging node
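
The Code-node processing and error-flagging described in the steps above can be sketched as a plain function. This is a minimal sketch assuming the `result.url` / `result.content` response shape shown earlier; in n8n the items would come from `$input.all()` and the function body would be the Code node's script.

```javascript
// Sketch of a Code node: clean scraped items and flag failures so a
// downstream IF node checking {{ $json.error }} can route them.
// In n8n, `items` would be $input.all().
function processItems(items) {
  return items.map((item) => {
    const result = item.json && item.json.result;
    if (!result || !result.content) {
      // Flag the item instead of throwing, so the workflow keeps running
      return { json: { error: "missing scraped content" } };
    }
    const content = result.content.trim();
    return {
      json: {
        url: result.url,
        content,
        title: content.split("\n")[0], // first line as a rough title
        scrapedAt: new Date().toISOString(),
      },
    };
  });
}
```

Items that come out with an `error` field can then be routed by the IF node to a notification or logging branch, while clean items continue to the output node.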

## Example Workflow Templates

###### Daily News Aggregator

Schedule daily at 9am: scrape top stories from multiple news sites and email a digest.

###### Price Monitor

Monitor competitor pricing hourly and send a Slack alert on price changes.

###### Competitive Intelligence

Scrape competitor sites on a webhook trigger, parse the data, and save it to Airtable.

###### Content Aggregation

On a schedule: scrape blog RSS feeds, extract articles, summarize with AI, and post to a CMS.

## Troubleshooting

##### HTTP Request Node Fails

**Problem:** HTTP Request to Scrapfly returns errors

**Solution:**

- Verify API key is correct in query parameters
- Check URL parameter is properly formatted (http:// or https://)
- Review error message in node execution data
- Test API call in browser or Postman first
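
For the "test in browser or Postman" step, the request URL can be built the same way the node does. A minimal sketch, assuming the Web Scraping API's standard `key` and `url` query parameters (the key and target below are placeholders):

```javascript
// Build a Scrapfly Web Scraping API request URL for manual testing.
// URLSearchParams percent-encodes the target URL, which is a common
// source of malformed-URL errors when done by hand.
function buildScrapeUrl(apiKey, targetUrl) {
  if (!/^https?:\/\//.test(targetUrl)) {
    throw new Error("target URL must start with http:// or https://");
  }
  const params = new URLSearchParams({ key: apiKey, url: targetUrl });
  return `https://api.scrapfly.io/scrape?${params.toString()}`;
}
```

Paste the resulting URL into a browser or Postman; if it works there but not in n8n, the problem is in the node configuration rather than the key or target.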

##### Response Data Not Accessible

**Problem:** Cannot access scraped content from response

**Solution:**

- Use correct JSON path: `{{ $json.result.content }}`
- Check "Response Format" is set to "JSON"
- Inspect node output data to see response structure
- Add a Set node to extract specific fields
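
When the response shape is unclear, a small path helper in a Code node avoids `undefined` errors while you inspect the structure. A sketch (the helper name is illustrative, not an n8n built-in):

```javascript
// Read a dotted path out of a response object, returning a fallback
// instead of throwing when any level of the path is missing.
function getPath(obj, path, fallback = null) {
  const value = path
    .split(".")
    .reduce((acc, key) => (acc == null ? undefined : acc[key]), obj);
  return value == null ? fallback : value;
}
```

For example, `getPath($json, "result.content", "")` returns an empty string rather than crashing the node when `result` is absent.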

##### Workflow Execution Timeout

**Problem:** Workflow times out during scraping

**Solution:**

- Increase timeout in HTTP Request node settings
- Disable JavaScript rendering if not needed (faster scraping)
- Split large jobs into smaller batches
- Check n8n instance timeout settings

##### Rate Limiting Issues

**Problem:** Hitting Scrapfly API rate limits

**Solution:**

- Add "Wait" nodes between scraping operations
- Use Scrapfly cache parameter for repeat requests
- Reduce scraping frequency in schedule trigger
- Upgrade Scrapfly plan for higher limits
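
For spacing out requests, n8n's Split In Batches (Loop Over Items) node is the usual tool; the same chunking can be sketched in a Code node (the batch size is illustrative), with a Wait node placed between batches:

```javascript
// Split scrape targets into fixed-size batches; each batch can then be
// followed by a Wait node so requests stay under the rate limit.
function chunk(items, size) {
  if (size <= 0) throw new Error("batch size must be positive");
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}
```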

##### Credentials Not Working

**Problem:** n8n credentials not applying API key correctly

**Solution:**

- Use query parameter method instead of header auth
- Verify credential is selected in HTTP Request node
- Check credential type matches authentication method
- Test with hardcoded API key first to isolate issue

##### Workflow Not Activating

**Problem:** Cannot activate workflow or schedule trigger does not fire

**Solution:**

- Ensure all required fields are filled in trigger node
- Check n8n instance has active executions enabled
- Review execution logs for activation errors
- Verify schedule trigger timezone settings
 
## Alternative Automation Platforms

While n8n offers powerful open-source automation, you might also want to explore these no-code automation alternatives for Scrapfly integration:

##### [Make](https://scrapfly.io/docs/mcp/integrations/make)

Visual automation platform with advanced workflow capabilities. Best for complex multi-step scenarios without self-hosting.

- Fully-managed cloud platform
- Visual drag-and-drop scenario builder
- Enterprise support options
 
##### [Zapier](https://scrapfly.io/docs/mcp/integrations/zapier)

Simpler trigger-action automation connecting 5000+ apps. Best for straightforward workflows with minimal setup.

- Largest app ecosystem (5000+ integrations)
- Easier learning curve for beginners
- Instant setup, no hosting needed
 
## Next Steps

- [Explore available MCP tools](https://scrapfly.io/docs/mcp/tools) and their capabilities
- [See real-world examples](https://scrapfly.io/docs/mcp/examples) of what you can build
- [Learn about authentication methods](https://scrapfly.io/docs/mcp/authentication) in detail
- [Read the FAQ](https://scrapfly.io/docs/mcp/faq) for common questions
 
 [  Back to All Integrations ](https://scrapfly.io/docs/mcp/integrations)