Bypass Incapsula
96% success on Incapsula-protected targets. One API parameter.
reese84 signed. incap_ses minted. Chain coherent.
- Real browser fingerprints. TLS, HTTP/2, Canvas, and WebGL all match real Chrome on every request.
- Adaptive challenge solving. JavaScript execution, CAPTCHAs, proof-of-work — handled without solver keys.
- Session intelligence. Unblocked browser sessions reused across requests so Incapsula sees a single coherent visitor.
One API parameter. Add asp=True. See the ASP docs.
Every Incapsula Signal, Matched
Bit-for-bit fingerprint parity with real Chrome. Not monkey-patching, not surface-level — engineered at the protocol and C++ layers.
reese84 Token Signed. incap_ses Cookie Valid.
Imperva's reese84 challenge fingerprints the browser and mints a signed token. The token feeds the incap_ses_* cookie family along with visid_incap_* tracking and the nlbi_* load-balancer cookie. Without the matching token, every subsequent request is blocked. Scrapium executes the challenge under real Chrome semantics; Scrapfly caches the unblocked session.
Cookie Chain — All Four Aligned
Imperva inspects the full cookie chain on every request; a single missing or malformed cookie triggers a re-challenge. Scrapfly keeps all four (reese84, incap_ses_*, visid_incap_*, nlbi_*) coherent across sessions.
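The coherence requirement can be sketched in a few lines of Python. The cookie names follow the well-known Imperva patterns; the check itself is illustrative, not Imperva's server-side logic:

```python
# Illustrative sketch: the four Imperva cookie families that must stay coherent.
# Real deployments append site-specific suffixes to these prefixes.
import re

REQUIRED_PATTERNS = [
    r"^incap_ses_",    # session cookie minted from the reese84 token
    r"^visid_incap_",  # long-lived visitor id
    r"^nlbi_",         # load-balancer affinity cookie
    r"^reese84$",      # signed challenge token
]

def chain_coherent(cookies: dict) -> bool:
    """True only if every required cookie family is present."""
    return all(any(re.match(p, name) for name in cookies) for p in REQUIRED_PATTERNS)

jar = {
    "incap_ses_123_456": "a1b2",
    "visid_incap_456": "c3d4",
    "nlbi_456": "e5f6",
    "reese84": "signed-token",
}
print(chain_coherent(jar))  # True: all four aligned
del jar["nlbi_456"]
print(chain_coherent(jar))  # False: one missing cookie means re-challenge
```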
Why DIY Bypasses Fail
reese84 rotates aggressively; static replay doesn't survive a single deploy.
| Approach | Result |
| --- | --- |
| curl-impersonate | no JS challenge execution |
| incapsula-bypass (OSS) | breaks on each reese84 revision |
| undetected-chromedriver | cookie chain falls out of sync |
| Playwright + stealth | fingerprint mismatches |
| Scrapfly | 96% success, tracked daily |
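Why static replay dies on deploy can be modeled in a few lines: a captured token is only valid against the challenge revision that issued it. The validation below is an illustrative model, not Imperva's actual signature scheme:

```python
# Illustrative model: a replayed reese84 token is bound to the challenge
# revision that issued it, so a single redeploy invalidates it.
def validate(token: dict, current_rev: int) -> bool:
    return token["rev"] == current_rev and token["sig"] == f"sig-{token['rev']}"

captured = {"rev": 41, "sig": "sig-41"}    # token scraped yesterday
print(validate(captured, current_rev=41))  # True: still on the same revision
print(validate(captured, current_rev=42))  # False: reese84 redeployed, replay dies
```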
TLS Match
Curlium emits a Chrome-exact TLS fingerprint, so Imperva's edge issues the cookie chain in the first place instead of blocking at the handshake.
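The TLS signal in play is typically summarized as a JA3-style hash over ClientHello fields. A minimal sketch (field values are illustrative, not a real Chrome hello) shows why even reordering ciphers changes the fingerprint:

```python
# Sketch of JA3-style TLS fingerprinting: MD5 over ordered ClientHello fields.
# Values below are illustrative placeholders, not a real Chrome ClientHello.
import hashlib

def ja3(version, ciphers, extensions, curves, point_formats):
    raw = ",".join([
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ])
    return hashlib.md5(raw.encode()).hexdigest()

chrome_like = ja3(771, [4865, 4866, 4867], [0, 23, 65281], [29, 23, 24], [0])
python_like = ja3(771, [4866, 4867, 4865], [0, 23, 65281], [29, 23, 24], [0])
print(chrome_like != python_like)  # True: cipher order alone changes the hash
```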
Enterprise + Banking
Imperva is heavily deployed across enterprise, banking, and government: high-value targets with deep budgets.
- Enterprise
- Banking
- Government
Imperva Detection Stack — Every Layer Matched
Imperva chains TLS → reese84 challenge → cookie chain → JS validation → device fingerprint. All five must align.
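The five-layer gate above can be sketched as a short-circuit check. Layer names follow the chain in the text; the admission logic is illustrative:

```python
# Sketch: a request passes only if every detection layer agrees.
LAYERS = ["tls", "reese84", "cookie_chain", "js_validation", "device_fingerprint"]

def admit(signals: dict) -> bool:
    # Any failing (or absent) layer means block or re-challenge.
    return all(signals.get(layer, False) for layer in LAYERS)

print(admit({layer: True for layer in LAYERS}))                    # True: all five align
print(admit({**{layer: True for layer in LAYERS}, "tls": False}))  # False: one layer fails
```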
JavaScript Challenges
Imperva ships _Incapsula_Resource, the _Incapsula object, utmvc validation, and an interstitial. All cleared inline.
- _Incapsula_Resource handler
- _Incapsula object reads
- utmvc validation
- Interstitial cleared
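Detecting whether a target is behind Incapsula at all can be done by scanning a response for the markers above. A minimal sketch, using strings commonly seen on Incapsula-protected responses:

```python
# Sketch: spot Incapsula by its well-known markers in a blocked response.
INCAPSULA_MARKERS = ["_Incapsula_Resource", "Incapsula incident ID", "visid_incap"]

def looks_like_incapsula(html: str) -> bool:
    return any(marker in html for marker in INCAPSULA_MARKERS)

blocked = '<html><iframe src="/_Incapsula_Resource?"></iframe></html>'
print(looks_like_incapsula(blocked))            # True
print(looks_like_incapsula("<html>ok</html>"))  # False
```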
One Parameter. Incapsula Cleared.
Add asp=True and Scrapfly detects Incapsula automatically, routing through the correct engine: Curlium for HTTP-layer targets, Scrapium for JS-heavy ones.
from scrapfly import ScrapeConfig, ScrapflyClient, ScrapeApiResponse
client = ScrapflyClient(key="API KEY")
api_response: ScrapeApiResponse = client.scrape(
ScrapeConfig(
url='https://httpbin.dev/html',
# bypass anti-scraping protection
asp=True
)
)
print(api_response.result)
import {
ScrapflyClient, ScrapeConfig
} from 'jsr:@scrapfly/scrapfly-sdk';
const client = new ScrapflyClient({ key: "API KEY" });
let api_result = await client.scrape(
new ScrapeConfig({
url: 'https://httpbin.dev/html',
// bypass anti-scraping protection
asp: true,
})
);
console.log(api_result.result);
package main

import (
    "fmt"
    "log"

    "github.com/scrapfly/go-scrapfly"
)

func main() {
    client, err := scrapfly.New("API KEY")
    if err != nil {
        log.Fatal(err)
    }
    result, err := client.Scrape(&scrapfly.ScrapeConfig{
        URL: "https://httpbin.dev/html",
        // bypass anti-scraping protection
        ASP: true,
    })
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(result.Result.Content)
}
use scrapfly_sdk::{Client, ScrapeConfig};
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
let client = Client::builder().api_key("API KEY").build()?;
let cfg = ScrapeConfig::builder("https://httpbin.dev/html")
// bypass anti-scraping protection
.asp(true)
.build()?;
let result = client.scrape(&cfg).await?;
println!("{}", result.result.content);
Ok(())
}
http https://api.scrapfly.io/scrape \
key==$SCRAPFLY_KEY \
url==https://httpbin.dev/html \
asp==true
{
"result": {
"status_code": 200,
"success": true,
"url": "https://httpbin.dev/html",
"content": "<!DOCTYPE html><html>...</html>",
"content_format": "raw",
"content_type": "text/html; charset=utf-8",
"response_headers": {
"content-type": "text/html; charset=utf-8",
"server": "cloudflare"
},
"cookies": [],
"duration": 1842,
"log_url": "https://scrapfly.io/dashboard/monitoring/log/01J...",
"asp_cost": 30
},
"context": {
"asp": true,
"proxy": {
"country": "us",
"type": "datacenter"
}
}
}
Frequently Asked Questions
Can I test on my specific Incapsula targets?
Yes. The free plan includes 1,000 API credits with no credit card required. Enable asp=True and test your exact targets before committing. Scrapfly achieves 96% success on Incapsula-protected sites; failed requests are not charged.
How much does ASP cost?
ASP starts at 30+ credits per request, scaling with target complexity. You pay for what a specific target needs, not a flat premium. See pricing.
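Back-of-envelope on the numbers above: at the minimum 30-credit ASP cost, the free plan's 1,000 credits cover roughly 33 test requests:

```python
# Free-plan budget at the minimum ASP cost (complex targets cost more per request).
FREE_CREDITS = 1_000
MIN_ASP_COST = 30
print(FREE_CREDITS // MIN_ASP_COST)  # 33 test requests before committing
```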
What is the reese84 challenge?
reese84 is Imperva's JavaScript challenge that fingerprints the browser and mints a signed token. The token feeds the incap_ses_* cookie family. Without the right token, every subsequent request is blocked. Scrapium runs the challenge under real Chrome semantics; Scrapfly caches the unblocked session. DIY guide: bypassing Incapsula.
Bypass every other major anti-bot vendor too.
ASP handles every major anti-bot stack with the same flag. Switch targets, keep the parameter.