Make


Visual automation platform with advanced workflow capabilities. Build powerful data collection scenarios with Scrapfly's web scraping integrated into Make's drag-and-drop scenario builder.


Prerequisites

Before getting started, make sure you have the following:

  • A Make account (a free plan is enough to start)
  • A Scrapfly API key (sign up for free to get one)

Setup Instructions

Integrate Scrapfly into your Make scenarios using HTTP Request modules. This enables powerful automated web scraping workflows with Make's visual scenario builder.

  1. Create New Make Scenario

    Start a new scenario in Make (Note: In Make terminology, a Scenario is equivalent to a workflow):

    1. Log in to your Make account
    2. Click "Create a new scenario"
    3. Add a trigger module (Schedule, Webhook, Google Sheets, Email, etc.)
    Tip: Common Trigger Modules
    • Schedule: Run scraping on a schedule (every 15 minutes, hourly, daily, etc.)
    • Webhooks: Trigger scraping via HTTP request
    • Google Sheets: Scrape URLs when new rows are added
    • Email: Scrape URLs from incoming emails
    • Watch RSS Feed: Scrape new articles automatically
  2. Add HTTP Request Module for Scrapfly

    Configure an HTTP module to call the Scrapfly API (Note: In Make, Modules are equivalent to nodes in other platforms):

    1. Click the "+" button to add a new module after your trigger
    2. Search for and select "HTTP" → "Make a request"
    3. Configure the module with these settings:

    HTTP Request Configuration:

    • URL: https://api.scrapfly.io/scrape
    • Method: GET
    • Query String:
      • key: __API_KEY__
      • url: {{1.url}} (mapped from the trigger module, or a hardcoded URL)
      • format: markdown
    • Parse response: Yes
    Sign up for free to get your API key.
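    Tip: Verify the API Call Outside Make

    Before wiring up the module, it can help to confirm the same request works outside of Make. A minimal Python sketch (the __API_KEY__ placeholder and target URL are illustrative):

    ```python
    import requests

    # The same GET request the HTTP module makes.
    params = {
        "key": "__API_KEY__",                         # your Scrapfly API key
        "url": "https://web-scraping.dev/product/1",  # page to scrape
        "format": "markdown",                         # return content as markdown
    }
    resp = requests.get("https://api.scrapfly.io/scrape", params=params)
    resp.raise_for_status()

    # The field Make exposes as {{2.result.content}} once "Parse response" is on.
    print(resp.json()["result"]["content"][:500])
    ```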
    Tip: Using Make Connections

    Instead of hardcoding the API key, create a reusable connection:

    • In the HTTP module, click "Add" next to Connection
    • Choose "Custom" connection type
    • Add a custom header: X-API-Key with your Scrapfly API key
    • Save and select this connection in future HTTP modules
  3. Process Scrapfly Response

    Extract and process the scraped content using Make's data transformation modules:

    Option 1: Direct Field Mapping (Simple Use Case):

    1. Add your output module (Google Sheets, Airtable, etc.)
    2. Map fields directly from the HTTP module output:
    • Content: {{2.result.content}}
    • URL: {{2.result.url}}
    • Scraped At: {{now}}
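
    For orientation, the part of the parsed response these mappings navigate looks roughly like this (a sketch showing only the fields used above; the real response carries additional metadata):

    ```python
    # Abridged shape of the Scrapfly response exposed by the HTTP module (module 2).
    scrapfly_response = {
        "result": {
            "content": "...scraped page as markdown...",  # mapped as {{2.result.content}}
            "url": "https://web-scraping.dev/product/1",  # mapped as {{2.result.url}}
        },
    }
    ```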

    Option 2: Use Iterator for Complex Data (Advanced):

    If you need to process arrays or split data into multiple items:

    1. Add an "Iterator" module after the HTTP module
    2. Configure it to iterate over data arrays
    3. Process each item individually in subsequent modules

    Option 3: Use Tools → Text Parser (Extract Patterns):

    For extracting specific patterns (emails, prices, phone numbers) from scraped content:

    1. Add a "Tools" → "Text parser" → "Match pattern" module after the HTTP module
    2. Configure pattern extraction:
    Common Pattern Examples:

    Extract Email Addresses:

    • Text to parse: {{2.result.content}}
    • Pattern: [A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}
    • Output: Array of all email addresses found

    Extract Prices (with currency symbols):

    • Text to parse: {{2.result.content}}
    • Pattern: [$€£¥]\s?\d+(?:[.,]\d{2})?
    • Output: Array of prices (e.g., $99.99, €45, £120.00)

    Extract Phone Numbers (US format):

    • Text to parse: {{2.result.content}}
    • Pattern: \(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}
    • Output: Array of phone numbers

    Extract URLs from Content:

    • Text to parse: {{2.result.content}}
    • Pattern: https?://[^\s]+
    • Output: Array of all URLs found
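
    To sanity-check these patterns before configuring the Text parser module, you can run them against sample text first. A quick Python sketch (the sample string is made up):

    ```python
    import re

    sample = ("Contact sales@example.com or call (555) 123-4567. "
              "Now $99.99, was €120.00. Details: https://example.com/deal")

    patterns = {
        "emails": r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}",
        "prices": r"[$€£¥]\s?\d+(?:[.,]\d{2})?",
        "phones": r"\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}",
        "urls":   r"https?://[^\s]+",
    }

    for name, pattern in patterns.items():
        print(name, re.findall(pattern, sample))
    # emails ['sales@example.com']
    # prices ['$99.99', '€120.00']
    # phones ['(555) 123-4567']
    # urls ['https://example.com/deal']
    ```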
    Example Workflow: Price Extraction from Multiple Products

    To extract prices from a scraped e-commerce page and process each one:

    1. HTTP Request → Scrape product listing page with Scrapfly
    2. Text Parser → Extract all prices using pattern: \$\d+\.\d{2}
    3. Iterator → Loop through each extracted price
    4. Google Sheets → Append each price to tracking spreadsheet

    Result: Automated price monitoring with individual price entries in your spreadsheet!

    Pro Tip: Use Make's built-in functions like split(), replace(), or parseJSON() to clean and transform scraped data before saving!
  4. Add Output Action Module

    Send the scraped data to your desired destination using Make's extensive app library:

    Popular Output Modules:

    • Google Sheets: Append rows to a spreadsheet
    • Airtable: Create or update records in a base
    • Slack: Send alerts or data summaries to channels
    • Email: Send formatted reports or notifications
    • Webhook: Send to your custom API endpoint
    • Data Store: Save to Make's built-in database
    • Google Drive: Save as CSV, JSON, or other files
    Example: Save to Google Sheets

    Add a Google Sheets module with these settings:

    • Action: Add a Row
    • Spreadsheet: Select or create your spreadsheet
    • Sheet: Select the sheet name
    • Column Mapping:
      • URL: {{2.result.url}}
      • Content: {{2.result.content}}
      • Timestamp: {{now}}
  5. Test and Activate Scenario

    Test your scenario and activate it for production use:

    1. Click "Run once" to test the scenario manually
    2. Review the output from each module (bubbles show data flow)
    3. Check for errors in any module (red indicators)
    4. Once working correctly, toggle the scenario to "ON"
    5. Monitor executions in the scenario history tab
    Tip: Error Handling with Router

    Add error handling to your scenario using Make's Router module:

    • Add a "Router" module after the HTTP request
    • Create two paths:
      • Success path: Filter: {{2.success}} = true
      • Error path: Filter: {{2.success}} = false
    • On the error path, add notification or logging modules
    • Enable "Continue the execution of the route even if this module returns no results" on critical modules
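
    Outside of Make, the same branching looks roughly like this (a Python sketch, assuming the response exposes the boolean success field the filters above rely on):

    ```python
    import requests

    resp = requests.get(
        "https://api.scrapfly.io/scrape",
        params={"key": "__API_KEY__", "url": "https://web-scraping.dev/product/1"},
    )
    data = resp.json()

    # Mirrors the two Router paths.
    if resp.ok and data.get("success", False):
        content = data["result"]["content"]  # success path: continue processing
    else:
        # Error path: notify or log instead of failing silently.
        print(f"Scrape failed (HTTP {resp.status_code}): {data}")
    ```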

Example Scenario Templates

Price Monitoring Dashboard
Schedule hourly: scrape competitor product prices, compare with stored values, send Slack alert on price drops, update Google Sheets
Competitor Content Tracking
Schedule daily: scrape competitor blog RSS feeds, extract articles, summarize with OpenAI, post to Airtable with tags
Real Estate Listing Aggregator
Schedule every 30 minutes: scrape property listing sites, parse details (price, beds, location), deduplicate, save to Data Store, email digest
Job Board Scraper
Webhook trigger: receive job search query, scrape multiple job sites, extract job details, filter by criteria, send to email with formatting

Practical Example: Complete E-commerce Price Tracker

Here's a step-by-step example of building a complete price monitoring scenario in Make that demonstrates multiple features:

Use Case: Track Competitor Prices and Alert on Changes

Monitor competitor product prices hourly, extract pricing data, store in Google Sheets, and send Slack alerts when prices drop.

Module 1: Schedule → Every 1 hour

Configuration:

  • Interval: 1 hour
  • Start: Immediately after activation

Module 2: HTTP → Make a request

Configuration:

  • URL: https://api.scrapfly.io/scrape
  • Method: GET
  • Query String:
    • key: __API_KEY__
    • url: https://web-scraping.dev/product/1 (competitor product page)
    • format: markdown
    • render_js: true (if page uses JavaScript)
  • Parse response: Yes

Module 3: Tools → Text parser → Match pattern

Configuration:

  • Text to parse: {{2.result.content}}
  • Pattern: \$(\d+\.\d{2}) (extracts price like $49.99)
  • Global match: No (just first price)

Output: Price value without dollar sign (e.g., "49.99")

Module 4: Google Sheets → Search Rows

Configuration:

  • Spreadsheet: Price Tracking Sheet
  • Sheet: Products
  • Filter: Product URL equals {{2.result.url}}
  • Limit: 1 (most recent entry)

Output: Previous price from spreadsheet (or empty if first check)

Module 5: Router

Routes:

  • Route 1 - Price Decreased: Filter: {{3.1}} < {{4.Price}}
    • Leads to Slack notification (price drop alert)
  • Route 2 - Price Changed (Any): Filter: {{3.1}} != {{4.Price}}
    • Leads to Google Sheets update
  • Route 3 - No Change: Fallback route (no filter)
    • No action needed

Module 6: Slack → Create a Message

Configuration:

  • Channel: #price-alerts
  • Text:
    🔔 Price Drop Alert!
    Product: {{2.result.url}}
    Old Price: ${{4.Price}}
    New Price: ${{3.1}}
    Savings: ${{4.Price - 3.1}}
    
    Check it out: {{2.result.url}}

Module 7: Google Sheets → Add a Row

Configuration:

  • Spreadsheet: Price Tracking Sheet
  • Sheet: Products
  • Column Mapping:
    • URL: {{2.result.url}}
    • Price: {{3.1}}
    • Previous Price: {{4.Price}}
    • Timestamp: {{now}}
    • Change: {{3.1 - 4.Price}}
Result: You now have an automated price tracker that runs every hour, extracts prices, compares them with historical data, sends Slack alerts on price drops, and maintains a complete price history in Google Sheets!
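
For comparison, here is roughly the same pipeline as a standalone Python sketch. It is illustrative only: the Google Sheet is stood in for by a local CSV file, and the Slack webhook URL is a placeholder.

```python
import csv
import re
from datetime import datetime, timezone

import requests

API_KEY = "__API_KEY__"                                 # your Scrapfly API key
PRODUCT_URL = "https://web-scraping.dev/product/1"
HISTORY_FILE = "price_history.csv"                      # stands in for the Google Sheet
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX"  # placeholder incoming webhook

# Module 2: scrape the product page.
resp = requests.get(
    "https://api.scrapfly.io/scrape",
    params={"key": API_KEY, "url": PRODUCT_URL, "format": "markdown", "render_js": "true"},
)
content = resp.json()["result"]["content"]

# Module 3: extract the first price, e.g. "$49.99" -> 49.99.
match = re.search(r"\$(\d+\.\d{2})", content)
if not match:
    raise SystemExit("No price found on page")
new_price = float(match.group(1))

# Module 4: look up the previous price (last stored row for this URL).
old_price = None
try:
    with open(HISTORY_FILE) as f:
        for row in csv.reader(f):
            if row and row[0] == PRODUCT_URL:
                old_price = float(row[1])
except FileNotFoundError:
    pass  # first run: no history yet

# Modules 5-6: route on change and alert on price drops.
if old_price is not None and new_price < old_price:
    requests.post(SLACK_WEBHOOK, json={
        "text": (f"🔔 Price Drop Alert!\nProduct: {PRODUCT_URL}\n"
                 f"Old Price: ${old_price:.2f}\nNew Price: ${new_price:.2f}\n"
                 f"Savings: ${old_price - new_price:.2f}")
    })

# Module 7: append the new observation when the price changed.
if old_price is None or new_price != old_price:
    with open(HISTORY_FILE, "a", newline="") as f:
        csv.writer(f).writerow(
            [PRODUCT_URL, f"{new_price:.2f}", datetime.now(timezone.utc).isoformat()]
        )
```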

Troubleshooting

Problem: HTTP Request to Scrapfly returns connection errors or timeouts

Solution:

  • Verify API key is correct in query parameters (no spaces or extra characters)
  • Check URL parameter is properly formatted (must start with http:// or https://)
  • Review error message in module execution data (click on the module bubble)
  • Test API call in browser or Postman first to isolate issue
  • Check Make's status page for service outages

Problem: Make cannot parse Scrapfly response or shows "invalid JSON"

Solution:

  • Ensure "Parse response" is enabled in HTTP module settings
  • Check response format in module output (click bubble to inspect raw data)
  • Verify Scrapfly API returned valid JSON (not HTML error page)
  • Add a "JSON" → "Parse JSON" module if automatic parsing fails
  • Use correct JSON path: {{2.result.content}} for scraped content

Problem: Scenario stops executing due to Make's operation limits

Solution:

  • Check your Make plan limits (operations per month/execution)
  • Reduce scraping frequency in schedule trigger
  • Split large scraping jobs into multiple scenarios
  • Use Scrapfly's cache parameter to avoid re-scraping the same URLs (see the sketch after this list)
  • Upgrade Make plan for higher operation limits
  • Monitor usage in Make's dashboard to track operation consumption
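
The cache parameter mentioned above can cut both API calls and Make operations when the same URL is scraped repeatedly. A hedged sketch (confirm the exact parameter names and semantics, in particular cache_ttl, against Scrapfly's API docs):

```python
import requests

params = {
    "key": "__API_KEY__",
    "url": "https://web-scraping.dev/product/1",
    "cache": "true",      # serve repeat requests from Scrapfly's cache
    "cache_ttl": "3600",  # assumed: keep the cached copy for one hour
}
resp = requests.get("https://api.scrapfly.io/scrape", params=params)
```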

Problem: Cannot map fields from Scrapfly response to output modules

Solution:

  • Click on module bubbles to inspect exact data structure
  • Use Make's mapping panel to explore available fields
  • Add a "Set Variable" or "Text Parser" module to extract specific data
  • Use functions like get() to access nested JSON fields
  • Test with hardcoded values first, then replace with dynamic mappings

Problem: Hitting Scrapfly API rate limits or Make's operation quota

Solution:

  • Add "Sleep" or "Repeater" modules with delays between scraping operations (see the sketch after this list)
  • Use Scrapfly's cache parameter for repeat requests (reduces API calls)
  • Reduce scraping frequency in schedule trigger settings
  • Implement batching: collect URLs, then process in smaller groups
  • Upgrade Scrapfly plan for higher API rate limits
  • Monitor Make's operation usage in your account dashboard
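
The pacing idea behind the Sleep module looks like this in plain Python (the delay value is illustrative; tune it to your plan's limits):

```python
import time

import requests

urls = [
    "https://web-scraping.dev/product/1",
    "https://web-scraping.dev/product/2",
]

for url in urls:
    resp = requests.get(
        "https://api.scrapfly.io/scrape",
        params={"key": "__API_KEY__", "url": url, "format": "markdown"},
    )
    # ... process resp.json()["result"]["content"] ...
    time.sleep(5)  # pause between calls, like a Sleep module between operations
```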

Problem: Webhook trigger does not execute scenario when called

Solution:

  • Verify webhook URL is correct (copy from Make's webhook module)
  • Check scenario is activated (ON toggle in top-right)
  • Test the webhook by posting to it directly, e.g. with curl or Postman (see the sketch after this list)
  • Review webhook execution history in Make's scenario runs
  • Ensure webhook payload matches expected structure in trigger settings
  • Check firewall or network restrictions blocking webhook delivery
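
A quick way to test the trigger is to post a payload straight to the webhook URL, e.g. from Python (the hook URL below is a placeholder; copy the real one from your webhook module):

```python
import requests

hook_url = "https://hook.make.com/XXXXXXXXXXXX"          # placeholder webhook URL
payload = {"url": "https://web-scraping.dev/product/1"}  # must match the expected structure
resp = requests.post(hook_url, json=payload)
print(resp.status_code, resp.text)  # Make typically replies with "Accepted"
```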

Alternative Automation Platforms

While Make offers powerful visual workflow automation, you might also want to explore these no-code automation alternatives for Scrapfly integration:

Zapier

Simpler trigger-action automation connecting 5000+ apps. Best for straightforward linear workflows with task-based pricing.

  • Largest app ecosystem (5000+ integrations)
  • Easier learning curve for beginners
  • Task-based pricing model
n8n

Open-source workflow automation with self-hosting option. Perfect for developers who need full control and customization.

  • Self-hosted or cloud deployment
  • Open-source with active community
  • Advanced workflow capabilities
