
Oxylabs to Scrapfly Migration Guide

Complete parameter mapping and code examples for migrating from Oxylabs Web Scraper API to Scrapfly. Most teams complete migration in under 2 hours with zero downtime.

Key Differences in API Architecture

Oxylabs uses a source-based approach with different scrapers for different targets. Scrapfly uses a unified API where the same parameters work across all targets; a short sketch after the lists below shows the same call pattern applied to different target types.

Oxylabs Approach

  • Different source for each target type
  • google_search, amazon_product, universal
  • Each source has unique parameters
  • Per-result pricing model

Scrapfly Approach

  • Single unified API for all targets
  • Just provide url parameter
  • Consistent parameters across all requests
  • Credit-based pricing (pay for complexity)
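To make the contrast concrete, here is a minimal sketch using the Scrapfly Python SDK (introduced in the code examples later in this guide). The two target URLs are placeholders; the point is that the call itself does not change per target.

from scrapfly import ScrapflyClient, ScrapeConfig

client = ScrapflyClient(key="YOUR_SCRAPFLY_API_KEY")

# The same configuration works for any target type; only the URL changes.
for url in [
    "https://www.google.com/search?q=web+scraping",  # would need source=google_search in Oxylabs
    "https://www.amazon.com/dp/EXAMPLE",             # would need source=amazon_product in Oxylabs
]:
    result = client.scrape(ScrapeConfig(url=url, asp=True, render_js=True))
    print(url, len(result.content))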

Complete Parameter Mapping

Oxylabs and Scrapfly use different parameter naming conventions. The table below shows exact mappings for all common features, and a combined example follows it.

Oxylabs Parameter | Scrapfly Parameter | Notes
username:password (Basic Auth) | key | API authentication (Scrapfly uses query param or header)
url | url | Target URL to scrape (same)
source | Not needed | Scrapfly automatically detects target type from URL
render=html | render_js=true | Enable JavaScript rendering with headless browser
geo_location | country | 2-letter ISO country code (e.g., "us", "gb")
locale | lang | Accept-Language header for locale/language preference
parse=true | extraction_model | Use Scrapfly's Extraction API for structured data
user_agent_type | os | Operating system: linux, macos, windows, android, ios
context[key]=value | headers[key]=value | Custom HTTP headers
callback_url | webhook_name | Async webhook notifications (configure in dashboard)
browser_instructions | js_scenario | Browser automation: clicks, form fills, scrolls
Web Unblocker (separate product) | asp=true | Anti-bot bypass (included in Scrapfly API)
N/A | cache | Enable response caching (Scrapfly exclusive)
N/A | cache_ttl | Cache time-to-live in seconds (Scrapfly exclusive)
N/A | auto_scroll | Auto-scroll to load lazy content (Scrapfly exclusive)
N/A | proxy_saver | Bandwidth optimization (Scrapfly exclusive)
N/A | tags | Custom tags for request tracking (Scrapfly exclusive)
N/A | wait_for_selector | Wait for CSS selector to appear (Scrapfly exclusive)
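For orientation, here is a hedged sketch combining several of the mapped parameters above in one Scrapfly request. The parameter names follow the table; verify the exact keyword arguments against the Python SDK documentation for your version, and treat the selector as a placeholder.

from scrapfly import ScrapflyClient, ScrapeConfig

client = ScrapflyClient(key="YOUR_SCRAPFLY_API_KEY")

result = client.scrape(ScrapeConfig(
    url="https://example.com",
    render_js=True,               # Oxylabs: render=html
    country="us",                 # Oxylabs: geo_location
    asp=True,                     # Oxylabs: separate Web Unblocker product
    cache=True,                   # Scrapfly exclusive: response caching
    cache_ttl=3600,               # Scrapfly exclusive: cache for one hour
    wait_for_selector="#content"  # Scrapfly exclusive: wait for a CSS selector (placeholder)
))
print(result.content)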

Oxylabs Source to Scrapfly Mapping

Oxylabs requires different source values for different targets. In Scrapfly, just use the URL. The API handles everything automatically.

Oxylabs Source | Scrapfly Approach | Optional Extraction Model
universal | Just use url parameter | Any extraction model
google_search | Use Google URL directly | search_engine_results
amazon_product | Use Amazon product URL | product
amazon_search | Use Amazon search URL | product_listing
walmart_product | Use Walmart product URL | product
google_shopping | Use Google Shopping URL | product_listing
ebay_product | Use eBay product URL | product

Scrapfly's unified API eliminates the need to remember different source names for each target.
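For example, here is a hedged sketch of replacing Oxylabs' amazon_product source: pass the product URL directly and, optionally, request structured data. The URL is a placeholder, and the extraction_model keyword is assumed to be exposed by the Python SDK the same way the HTTP API exposes it; check the Extraction API docs for your SDK version.

from scrapfly import ScrapflyClient, ScrapeConfig

client = ScrapflyClient(key="YOUR_SCRAPFLY_API_KEY")

# Oxylabs equivalent: source="amazon_product" plus the product identifier.
result = client.scrape(ScrapeConfig(
    url="https://www.amazon.com/dp/EXAMPLE",  # placeholder product URL
    asp=True,
    render_js=True,
    extraction_model="product",  # optional structured output via the Extraction API
))
print(result.content)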

Migration Code Examples

Side-by-side code examples showing how to migrate from Oxylabs to Scrapfly, in Python, JavaScript (Node.js), and cURL.

Oxylabs (Python)
import requests
from pprint import pprint

username = "USERNAME"
password = "PASSWORD"

payload = {
    "source": "universal",
    "url": "https://example.com",
    "geo_location": "United States",
    "render": "html"
}

response = requests.post(
    "https://realtime.oxylabs.io/v1/queries",
    auth=(username, password),
    json=payload
)

pprint(response.json())
Scrapfly (Python)
from scrapfly import ScrapflyClient, ScrapeConfig

client = ScrapflyClient(key="YOUR_SCRAPFLY_API_KEY")

result = client.scrape(ScrapeConfig(
    url="https://example.com",
    country="us",
    render_js=True,
    asp=True  # Anti-bot bypass
))

print(result.content)
Oxylabs (JavaScript)
const axios = require('axios');

const username = 'USERNAME';
const password = 'PASSWORD';

const payload = {
    source: 'universal',
    url: 'https://example.com',
    geo_location: 'United States',
    render: 'html'
};

// Wrap in an async function: top-level await is not available with require().
(async () => {
    const response = await axios.post(
        'https://realtime.oxylabs.io/v1/queries',
        payload,
        {
            auth: { username, password }
        }
    );

    console.log(response.data);
})();
Scrapfly (JavaScript)
const { ScrapflyClient } = require('scrapfly-sdk');

const client = new ScrapflyClient({
    key: 'YOUR_SCRAPFLY_API_KEY'
});

// Wrap in an async function: top-level await is not available with require().
(async () => {
    const result = await client.scrape({
        url: 'https://example.com',
        country: 'us',
        render_js: true,
        asp: true  // Anti-bot bypass
    });

    console.log(result.result.content);
})();
Oxylabs (cURL)
curl 'https://realtime.oxylabs.io/v1/queries' \
--user 'USERNAME:PASSWORD' \
-H 'Content-Type: application/json' \
-d '{
    "source": "universal",
    "url": "https://example.com",
    "geo_location": "United States",
    "render": "html"
}'
Scrapfly (cURL)
curl "https://api.scrapfly.io/scrape\
?key=YOUR_SCRAPFLY_API_KEY\
&url=https%3A%2F%2Fexample.com\
&country=us\
&render_js=true\
&asp=true"

Browser Instructions Migration

Oxylabs' browser_instructions parameter maps to Scrapfly's js_scenario. Both support clicks, form fills, and waits.

Oxylabs
{
    "source": "universal",
    "url": "https://example.com",
    "render": "html",
    "browser_instructions": [
        {"type": "click", "selector": "#button"},
        {"type": "input", "selector": "#search", "value": "test"},
        {"type": "wait", "wait_time_s": 2}
    ]
}
Scrapfly
ScrapeConfig(
    url="https://example.com",
    render_js=True,
    js_scenario=[
        {"click": {"selector": "#button"}},
        {"fill": {"selector": "#search", "value": "test"}},
        {"wait": 2000}
    ]
)

See full JS Scenario documentation
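As a runnable counterpart to the snippet above, here is a minimal sketch with the Python SDK that combines js_scenario with wait_for_selector. The selectors are placeholders.

from scrapfly import ScrapflyClient, ScrapeConfig

client = ScrapflyClient(key="YOUR_SCRAPFLY_API_KEY")

result = client.scrape(ScrapeConfig(
    url="https://example.com",
    render_js=True,  # js_scenario requires JavaScript rendering
    js_scenario=[
        {"click": {"selector": "#button"}},
        {"fill": {"selector": "#search", "value": "test"}},
        {"wait": 2000},  # milliseconds
    ],
    wait_for_selector="#results",  # placeholder: wait until the results element appears
))
print(result.content)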

πŸ€– AI Migration Assistant

Use Claude or ChatGPT to automatically convert your Oxylabs code to Scrapfly. Copy this prompt and paste it along with your existing code.

Copy This Prompt

I'm migrating from Oxylabs Web Scraper API to Scrapfly. Here's my current code using Oxylabs' API.
Please convert it to use Scrapfly's Python SDK (or JavaScript SDK if my code is in JavaScript).

Key differences and parameter mappings:
- Remove "source" parameter (Scrapfly auto-detects from URL)
- username:password auth β†’ key parameter
- render="html" β†’ render_js=True
- geo_location="United States" β†’ country="us" (use 2-letter ISO codes)
- locale β†’ lang (Accept-Language header)
- parse=True β†’ extraction_model="..." (see Extraction API docs)
- browser_instructions β†’ js_scenario (similar structure)
- Web Unblocker product β†’ asp=True (built into Scrapfly API)
- callback_url β†’ webhook_name (configure webhook in dashboard first)
- user_agent_type β†’ os parameter

Scrapfly-exclusive features to consider:
- cache=True for response caching
- auto_scroll=True for lazy-loaded content
- wait_for_selector for element-based waits
- proxy_saver for bandwidth optimization

Scrapfly SDK Docs (markdown for LLM): https://scrapfly.io/docs/sdk/python?view=markdown
Scrapfly API Docs (markdown for LLM): https://scrapfly.io/docs/scrape-api/getting-started?view=markdown

My current Oxylabs code:
[PASTE YOUR CODE HERE]
How to Use:
  1. Copy the prompt above
  2. Open Claude or ChatGPT
  3. Paste the prompt and replace [PASTE YOUR CODE HERE] with your Oxylabs code
  4. Review the generated Scrapfly code and test it with your free 1,000 credits

Developer Tools: Use our cURL to Python converter and selector tester to speed up development.

Frequently Asked Questions

What replaces Oxylabs' Web Unblocker in Scrapfly?

Scrapfly's ASP (Anti-Scraping Protection) is built into the main API:

# Oxylabs: separate Web Unblocker product
# Scrapfly: just add asp=True
result = client.scrape(ScrapeConfig(
    url="https://protected-site.com",
    asp=True  # Enables anti-bot bypass
))

Learn more about ASP

How do I handle geo_location in Scrapfly?

Oxylabs uses full location names, while Scrapfly uses 2-letter ISO country codes:

# Oxylabs
geo_location = "United States"
geo_location = "California,United States"

# Scrapfly
country = "us"  # 2-letter ISO code
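If your codebase stores Oxylabs-style location strings, a small lookup table can translate them during migration. This is a hypothetical helper, not part of either SDK; extend the mapping with the countries you actually use.

# Hypothetical helper: map Oxylabs geo_location strings to ISO codes for Scrapfly.
GEO_TO_COUNTRY = {
    "United States": "us",
    "United Kingdom": "gb",
    "Germany": "de",
}

def to_scrapfly_country(geo_location: str) -> str:
    # Oxylabs also accepts "City,Country" or "State,Country" strings;
    # keep only the country part, then look up the ISO code.
    country_name = geo_location.split(",")[-1].strip()
    return GEO_TO_COUNTRY[country_name]

assert to_scrapfly_country("California,United States") == "us"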

How do I migrate Oxylabs' parse=True to Scrapfly?

Scrapfly's Extraction API is more flexible:

# Use pre-built extraction models
extraction_model="product"
extraction_model="article"
extraction_model="job_listing"

# Or use AI prompts for custom extraction
extraction_prompt="Extract the product title, price, and reviews"
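Here is a hedged sketch of what that looks like end to end, assuming the Python SDK accepts extraction_model as a ScrapeConfig keyword argument the way the HTTP API does; see the Extraction API docs for the exact response field that carries the extracted data. The product URL is a placeholder.

from scrapfly import ScrapflyClient, ScrapeConfig

client = ScrapflyClient(key="YOUR_SCRAPFLY_API_KEY")

# Oxylabs: parse=True on a product source.
# Scrapfly: pass the URL and pick a pre-built extraction model.
result = client.scrape(ScrapeConfig(
    url="https://example.com/product/123",  # placeholder product URL
    asp=True,
    render_js=True,
    extraction_model="product",
))
print(result.content)  # raw page content; the extracted data is in the API response (see Extraction API docs)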

How do I test my migration?

  1. Sign up for free: Get 1,000 API credits with no credit card required
  2. Run parallel testing: Keep Oxylabs running while testing Scrapfly
  3. Compare results: Verify that Scrapfly returns the same data (a comparison sketch follows this list)
  4. Gradual migration: Switch traffic gradually (e.g., 10% β†’ 50% β†’ 100%)
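Here is a hedged sketch for step 3, reusing the request patterns from the code examples above. The Oxylabs response is read as results[0].content per their realtime API format (verify against your own responses), and the comparison is a simple length check to adapt to whatever data you actually extract.

import requests
from scrapfly import ScrapflyClient, ScrapeConfig

OXYLABS_AUTH = ("USERNAME", "PASSWORD")
scrapfly = ScrapflyClient(key="YOUR_SCRAPFLY_API_KEY")

def scrape_oxylabs(url: str) -> str:
    response = requests.post(
        "https://realtime.oxylabs.io/v1/queries",
        auth=OXYLABS_AUTH,
        json={"source": "universal", "url": url, "render": "html"},
    )
    # Oxylabs realtime responses carry the page in results[0]["content"].
    return response.json()["results"][0]["content"]

def scrape_scrapfly(url: str) -> str:
    result = scrapfly.scrape(ScrapeConfig(url=url, render_js=True, asp=True))
    return result.content

for url in ["https://example.com"]:  # replace with a sample of your real targets
    oxy_html, sfly_html = scrape_oxylabs(url), scrape_scrapfly(url)
    print(url, len(oxy_html), len(sfly_html))  # both should return comparable pages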

Start Your Migration Today

Test Scrapfly on your targets with 1,000 free API credits. No credit card required.

  • 1,000 free API credits
  • Full API access
  • Migration support
  • Same-day response from our team
Start For Free

Need help with migration? Contact our team