Bypass Akamai
97% success on Akamai-protected targets. One API parameter.
Sensor payload signed. _abck minted. Page cleared.
- Real browser fingerprints. TLS, HTTP/2, Canvas, and WebGL all match real Chrome on every request.
- Adaptive challenge solving. JavaScript execution, CAPTCHAs, proof-of-work — handled without solver keys.
- Session intelligence. Unblocked browser sessions reused across requests so Akamai sees a single coherent visitor.
One API parameter. Add asp=True. See the ASP docs.
Every Akamai Signal, Matched
Bit-for-bit fingerprint parity with real Chrome. Not monkey-patching, not surface-level — engineered at the protocol and C++ layers.
_abck Cookie Signed. Sensor Data Accepted.
Akamai Bot Manager issues the _abck cookie after the sensor_data payload verifies. The payload carries device telemetry, browser properties, and hardware fingerprints; a single mismatch against the observed request fingerprint invalidates the cookie instantly and blocks every follow-up request. Scrapfly generates authentic sensor_data under a real browser and keeps _abck valid across the session.
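A quick way to see this handshake from the outside is the _abck value itself. By community-reported convention (not official Akamai documentation), the second `~`-delimited field of the cookie flips from `-1` to `0` once the sensor_data payload has been accepted; a rough debugging sketch:

```python
# Heuristic check of an _abck cookie's validation state.
# Community-reported convention, not an official Akamai format spec:
# the second ~-delimited field reads "-1" before the sensor_data
# payload verifies and "0" after. Treat as a debugging aid only.

def abck_looks_valid(abck_value: str) -> bool:
    """Return True if the _abck cookie appears server-validated."""
    fields = abck_value.split("~")
    return len(fields) > 1 and fields[1] == "0"

# Typical shapes (values truncated for illustration):
print(abck_looks_valid("ABCD1234~0~YAAQtf8..."))   # validated
print(abck_looks_valid("ABCD1234~-1~YAAQtf8..."))  # not yet validated
```

Useful when inspecting cookie jars in the monitoring log to confirm a session cleared the sensor check.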
TLS Fingerprint — Byte-Perfect Chrome
Akamai inspects the TLS fingerprint at the edge. A JA4 hash that doesn't match a known browser profile triggers the bmak script path immediately. Curlium emits the exact Chrome ClientHello (cipher order, GREASE, ALPN, signature algorithms) and matches Chrome's HTTP/2 SETTINGS frame.
Why DIY Bypasses Fail
Open-source Akamai bypass stacks break on every bmak rotation.
| Approach | Outcome |
| --- | --- |
| curl-impersonate | no sensor_data |
| akamai-bypass (OSS) | broken per bmak rev |
| Selenium + stealth | fingerprint leaks |
| Playwright | Canvas + Audio mismatches |
| Scrapfly | 97% success, tracked daily |
_abck Refresh
Akamai rotates the cookie on a tight cadence. Scrapfly caches and rotates in lockstep, so a stale cookie never triggers a block.
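Cookie rotation pairs with Scrapfly's session feature: naming the same session on consecutive requests keeps the rotated _abck attached to one coherent visitor. A minimal sketch of the per-request parameters, shown as plain dicts so no network call is needed; the target URL and session name are illustrative, and the `asp`/`session` parameters are the documented API options:

```python
# Sketch: two requests sharing one named Scrapfly session so the
# rotated _abck cookie stays with a single coherent visitor.
# URL and session name below are hypothetical placeholders.

def scrape_params(url: str, session_name: str) -> dict:
    return {
        "url": url,
        "asp": True,              # anti-scraping protection bypass
        "session": session_name,  # sticky session: same proxy + cookie jar
    }

first = scrape_params("https://example-akamai-site.com/", "akamai-visitor-1")
follow_up = scrape_params("https://example-akamai-site.com/page/2", "akamai-visitor-1")

# Same session name -> Scrapfly reuses the established cookies
# (including the refreshed _abck) instead of re-clearing from scratch.
assert first["session"] == follow_up["session"]
```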
Fortune 500
~30% of Fortune 500 sites use some form of Akamai protection. Common in retail, airlines, and finance.
- Retail
- Airlines
- Finance
Akamai Detection Stack — Every Layer Matched
Akamai chains TLS → bmak sensor payload → _abck cookie → device fingerprint → behavioral telemetry. Break one layer and the cookie invalidates.
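Before enabling ASP, it helps to confirm a target actually sits behind this stack. A rough, non-authoritative sketch: Akamai deployments commonly surface the _abck/bm_sz family of cookies or an Akamai-branded Server header (the names below are community-observed, not an official list):

```python
# Rough Akamai-presence check from a response's cookies and headers.
# Cookie and header names are community-observed conventions.
AKAMAI_COOKIES = {"_abck", "bm_sz", "ak_bmsc", "bm_sv"}

def looks_like_akamai(headers: dict, cookie_names: set) -> bool:
    """Heuristic: does this response fingerprint suggest Akamai Bot Manager?"""
    if cookie_names & AKAMAI_COOKIES:
        return True
    server = headers.get("server", "").lower()
    return server.startswith("akamai")

print(looks_like_akamai({"server": "AkamaiGHost"}, set()))    # True
print(looks_like_akamai({}, {"_abck", "bm_sz"}))              # True
print(looks_like_akamai({"server": "cloudflare"}, set()))     # False
```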
bmak Sensor Collection
Akamai's bmak.js fingerprints the device and mints a signed sensor payload. Scrapium runs the collector under real Chrome semantics; the payload verifies server-side.
- Device telemetry matched
- Browser properties native
- Hardware signals coherent
- Timing signals realistic
One Parameter. Akamai Cleared.
Add asp=True. Scrapfly detects Akamai automatically and routes each request through the correct engine: Curlium for HTTP-layer targets, Scrapium for JS-heavy ones.
from scrapfly import ScrapeConfig, ScrapflyClient, ScrapeApiResponse

client = ScrapflyClient(key="API KEY")
api_response: ScrapeApiResponse = client.scrape(
    ScrapeConfig(
        url='https://httpbin.dev/html',
        # bypass anti-scraping protection
        asp=True,
    )
)
print(api_response.result)
import {
    ScrapflyClient, ScrapeConfig
} from 'jsr:@scrapfly/scrapfly-sdk';

const client = new ScrapflyClient({ key: "API KEY" });
let api_result = await client.scrape(
    new ScrapeConfig({
        url: 'https://httpbin.dev/html',
        // bypass anti-scraping protection
        asp: true,
    })
);
console.log(api_result.result);
package main

import (
    "fmt"

    "github.com/scrapfly/go-scrapfly"
)

func main() {
    client, _ := scrapfly.New("API KEY")
    result, _ := client.Scrape(&scrapfly.ScrapeConfig{
        URL: "https://httpbin.dev/html",
        // bypass anti-scraping protection
        ASP: true,
    })
    fmt.Println(result.Result.Content)
}
use scrapfly_sdk::{Client, ScrapeConfig};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::builder().api_key("API KEY").build()?;
    let cfg = ScrapeConfig::builder("https://httpbin.dev/html")
        // bypass anti-scraping protection
        .asp(true)
        .build()?;
    let result = client.scrape(&cfg).await?;
    println!("{}", result.result.content);
    Ok(())
}
http https://api.scrapfly.io/scrape \
    key==$SCRAPFLY_KEY \
    url==https://httpbin.dev/html \
    asp==true
{
  "result": {
    "status_code": 200,
    "success": true,
    "url": "https://httpbin.dev/html",
    "content": "<!DOCTYPE html><html>...</html>",
    "content_format": "raw",
    "content_type": "text/html; charset=utf-8",
    "response_headers": {
      "content-type": "text/html; charset=utf-8",
      "server": "cloudflare"
    },
    "cookies": [],
    "duration": 1842,
    "log_url": "https://scrapfly.io/dashboard/monitoring/log/01J...",
    "asp_cost": 30
  },
  "context": {
    "asp": true,
    "proxy": {
      "country": "us",
      "type": "datacenter"
    }
  }
}
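The response is plain JSON; a short sketch pulling out the fields most useful for monitoring. The sample below mirrors a subset of the example response (content string truncated as shown):

```python
import json

# Trimmed copy of the example response above.
sample = """
{
  "result": {"status_code": 200, "success": true, "asp_cost": 30,
             "content": "<!DOCTYPE html><html>...</html>"},
  "context": {"asp": true, "proxy": {"country": "us", "type": "datacenter"}}
}
"""

data = json.loads(sample)
result = data["result"]

# Gate on success before touching content; a failed ASP attempt is
# reported in these fields rather than as a transport error.
if result["success"] and result["status_code"] == 200:
    html = result["content"]
    print(f"fetched {len(html)} bytes, cost {result['asp_cost']} credits")
```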
Frequently Asked Questions
Can I test on my specific Akamai targets?
Yes. The free plan includes 1,000 API credits with no credit card required. Enable asp=True and test your exact targets before committing. Scrapfly achieves 97% success on Akamai-protected sites; failed requests are not charged.
How much does ASP cost?
ASP starts at 30+ credits per request, scaling with target complexity. You pay for what a specific target needs, not a flat premium. See pricing.
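For capacity planning, the per-request asp_cost can be projected forward; a back-of-the-envelope sketch assuming the 30-credit floor stated above (real costs vary per target):

```python
# Back-of-the-envelope credit budgeting. asp_cost varies per target;
# 30 is the stated floor, harder targets cost more per request.

def credits_needed(requests_per_day: int, avg_asp_cost: int = 30) -> int:
    """Estimate monthly credit usage for a steady scraping workload."""
    return requests_per_day * avg_asp_cost * 30  # ~30 days/month

# 1,000 requests/day at the 30-credit floor:
print(credits_needed(1_000))  # 900000 credits/month
```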
What is the _abck cookie?
Akamai's _abck cookie carries the bot score derived from sensor data: device telemetry, browser properties, and hardware fingerprints. A mismatch between _abck and the observed request fingerprint marks the visitor as a bot. Scrapfly regenerates the sensor data under a real browser and keeps the cookie valid across requests.
What industries use Akamai?
Akamai is common in retail, airlines, and finance. About 30% of Fortune 500 sites use some form of Akamai protection. DIY guide: how to bypass Akamai.
Bypass every other major anti-bot vendor too.
ASP handles every major anti-bot stack with the same flag. Switch targets, keep the parameter.