
Post-Quantum TLS: Why Scraping Tools Are Now Exposed

by Hisham · Apr 08, 2026 · 16 min read

When your scraping tool opens an HTTPS connection, its TLS handshake contains a new fingerprinting signal most scrapers lack: the post-quantum key share. By early 2026, 57.4% of all browser-initiated connections include an X25519MLKEM768 key share that adds 1,088 bytes to the ClientHello message.

CDNs cross-reference key share presence against your User-Agent header. A request claiming Chrome 131 but lacking a PQ key share is a red flag that fires before the first byte of HTTP traffic arrives.

This isn't a future risk. Chrome enabled post-quantum key exchange by default in April 2024. Firefox followed in November 2024. Apple's ecosystem joined in October 2025. Akamai made PQ the default for all origin connections in January 2026.

Two named CVEs exposed flaws in Go-based scraping libraries that went unnoticed for over two years. In this guide, we'll cover how post-quantum TLS creates new detection vectors, the uTLS vulnerability chain affecting Go scrapers, how Cloudflare and Akamai enforce PQ detection today, and what scraping teams should do before the window closes.

Key Takeaways

  • Post-quantum TLS is already a live bot-detection signal, with Chrome, Firefox, and Apple sending PQ key shares by default across 57.4% of browser traffic.
  • A missing X25519MLKEM768 key share is now a direct fingerprint mismatch for traffic claiming a modern browser, and CDNs can detect that before any HTTP request is processed.
  • Go-based scraping tools are especially exposed because pinned pre-PQ uTLS fingerprints and the 2026 uTLS CVEs make many Chrome impersonation profiles trivially distinguishable from real browsers.
  • Scrapfly is the safest solution for scraping teams that need browser-grade TLS fingerprinting without maintaining their own PQ-aware TLS stack, because it handles post-quantum key exchange, ClientHello parity, and fingerprint rotation at the infrastructure level.
  • Teams managing their own stack should upgrade uTLS to 1.8.2+, stop pinning pre-PQ Chrome fingerprints, and validate packet structure and key share behavior immediately.

The Post-Quantum TLS Transition Is Already Here

The shift to post-quantum TLS started in early 2024 and moved faster than most scraping tool maintainers anticipated.

Post-quantum key exchange in TLS works through hybrid key agreements. The X25519MLKEM768 algorithm pairs the classic X25519 elliptic curve with ML-KEM-768 (formerly Kyber-768). The hybrid approach keeps connections secure against both classical and quantum-era attacks without breaking compatibility with servers that don't yet support PQ. From a bot detection perspective, the cryptography matters less than the signal it creates in the ClientHello.

Browser Adoption: Chrome, Firefox, and the Safari Gap

Google enabled X25519MLKEM768 by default in Chrome 124 in April 2024. Mozilla followed with Firefox 132 in November 2024. Apple shipped PQ support for iOS and macOS in October 2025.

According to F5 Labs, 57.4% of all browser-based transactions are now post-quantum ready. Chrome drives most of that number: Chrome accounts for 59% of browser connections, and 93% of those Chrome connections are PQ-capable. Firefox covers 2.4% of connections, with 85% of Firefox sessions PQ-ready.

The major holdout was Safari. Apple devices account for 38% of all connections, and Safari had no PQ cipher support on any iOS device until October 2025. That gap pushed the overall browser average down to 57.4%, but it doesn't help scrapers: CDNs flag PQ absence whenever a request claims to be Chrome or Firefox, regardless of Safari's status.

| Browser | PQ Default Since | Share of Connections | PQ-Ready Rate |
| --- | --- | --- | --- |
| Chrome 131+ | April 2024 | 59% | 93% |
| Firefox 132+ | November 2024 | 2.4% | 85% |
| Safari | October 2025 | 38% | Rolling out |
| All browsers | | | 57.4% |

CDN Rollout: Cloudflare and Akamai Go Default

Cloudflare rolled out post-quantum connections to origins across all plan tiers through 2024. By October 2025, over 50% of human-initiated traffic reaching Cloudflare used PQ key exchange. Cloudflare now treats PQ presence as part of the baseline for classifying human traffic.

Akamai set a firm deadline. Post-quantum key exchange became the default for all client-to-Akamai connections on January 31, 2026, with full network rollout completing in March 2026. Any scraping traffic reaching Akamai without a PQ key share now operates outside the baseline Akamai defines as normal browser behavior.

CDN enforcement has now caught up with browser adoption. The next section covers the three mechanisms that make PQ key exchange an effective detection signal.

How Post-Quantum Creates New Bot Detection Vectors

Post-quantum key exchange doesn't only change what's in the TLS handshake. It changes the shape, size, and structure of the ClientHello in ways that create new fingerprinting signals. Three mechanisms matter most for bot detection.

The Binary Signal: PQ Key Share Presence

The X25519MLKEM768 key share is 1,124 bytes. The classic X25519 key share it replaces is 36 bytes. Both Chrome and Firefox send this PQ key share by default in every TLS 1.3 connection.

Any scraping tool that doesn't include the PQ key share is making a claim no real Chrome 131+ or Firefox 132+ browser would make. CDNs detect this by cross-referencing the User-Agent header against the key shares present in the ClientHello. A request claiming Chrome 131 without the PQ key share doesn't match any known-good Chrome 131 ClientHello in their database.

The CDN key-share check needs no machine learning and no behavioral analysis. It runs at the TLS handshake stage, before HTTP traffic starts. The detection is also retroactive: a scraper can update its User-Agent string to Chrome 131+ without updating its TLS stack, and the key share size catches that immediately.
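As a sketch of how that cross-reference works: the group ID 0x11EC (4588) for X25519MLKEM768 comes from the IANA TLS Supported Groups registry, while the User-Agent parsing and version thresholds here are deliberately naive assumptions for illustration, not a real CDN's implementation.

```go
package main

import (
	"fmt"
	"strings"
)

// IANA codepoint for the hybrid X25519MLKEM768 key share group.
const groupX25519MLKEM768 uint16 = 0x11EC

// pqExpected reports whether a browser claimed by the User-Agent should be
// sending a PQ key share by default, per the rollout dates in this article
// (Chrome 124+, Firefox 132+). The parsing is a simplified sketch.
func pqExpected(userAgent string, majorVersion int) bool {
	switch {
	case strings.Contains(userAgent, "Chrome/"):
		return majorVersion >= 124
	case strings.Contains(userAgent, "Firefox/"):
		return majorVersion >= 132
	}
	return false
}

// mismatch flags traffic whose User-Agent claims a PQ-era browser while
// the ClientHello key_share extension offers no hybrid PQ group.
func mismatch(userAgent string, majorVersion int, offeredGroups []uint16) bool {
	if !pqExpected(userAgent, majorVersion) {
		return false
	}
	for _, g := range offeredGroups {
		if g == groupX25519MLKEM768 {
			return false // PQ share present: consistent with the claim
		}
	}
	return true // claims a PQ browser but sent no PQ key share
}

func main() {
	// Classic-only key share (X25519 = group 29) under a Chrome 131 claim.
	fmt.Println(mismatch("Mozilla/5.0 ... Chrome/131.0", 131, []uint16{29}))         // true
	fmt.Println(mismatch("Mozilla/5.0 ... Chrome/131.0", 131, []uint16{0x11EC, 29})) // false
}
```

Note that the check fires on the handshake alone: no request body, headers, or behavioral data is needed beyond the User-Agent the scraper volunteers.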

ClientHello Fragmentation: The Multi-Packet Problem

A pre-PQ ClientHello is 300-500 bytes and fits in a single TCP packet. A post-PQ ClientHello exceeds 1,400 bytes, putting it at or beyond the typical TCP maximum segment size (MSS) of approximately 1,460 bytes. The ClientHello now splits across two or more TCP packets.

[Figure: pre-PQ vs post-PQ ClientHello size comparison, single-packet vs multi-packet fragmentation]

Post-PQ ClientHello fragmentation is new behavior for TLS 1.3. Prior to post-quantum key exchange, the ClientHello nearly always fit in a single packet. Scraping libraries that use a single write() call to send the ClientHello produce different TCP segmentation patterns compared to how real browsers handle fragmentation. That difference in segmentation becomes a secondary fingerprint.

Middleboxes that don't handle fragmented ClientHellos may also drop connections, creating a detectable failure rate in scraping traffic that doesn't match real browser connection success rates.
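The arithmetic behind the fragmentation is simple enough to sketch. The 1,460-byte MSS and the example sizes follow the figures above; a 1,500-byte post-PQ hello is an illustrative value, and real stacks vary with MSS negotiation and segmentation offload.

```go
package main

import "fmt"

// packetsForClientHello estimates how many TCP segments a ClientHello of
// the given size occupies under a fixed maximum segment size. This is a
// back-of-the-envelope sketch, not a model of any particular TCP stack.
func packetsForClientHello(helloBytes, mss int) int {
	if helloBytes <= 0 {
		return 0
	}
	return (helloBytes + mss - 1) / mss // ceiling division
}

func main() {
	const mss = 1460 // typical Ethernet-path TCP MSS
	fmt.Println(packetsForClientHello(400, mss))  // pre-PQ: 1 packet
	fmt.Println(packetsForClientHello(1500, mss)) // post-PQ: 2 packets
}
```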

JA4 Fingerprint Expansion

JA4 fingerprinting captures more handshake fields than the older JA3 standard. Post-quantum key exchange adds new data to JA4: the hybrid group offered, the key share size, and the extension ordering that accompanies PQ support.

A February 2026 arXiv paper on bot detection via TLS fingerprints found that a CatBoost classifier using JA4 features achieved an AUC of 0.998 and classification accuracy of 0.9863. A separate March 2026 arXiv paper showed that TLS traffic using classical versus post-quantum key exchange can be distinguished with 98% accuracy from handshake data alone.

Both results confirm what CDN infrastructure teams already act on: PQ presence or absence is one of the strongest classification signals in modern TLS traffic. JA4 with PQ features outperforms JA3 at peak adoption. You can verify your own JA3 and JA4 fingerprints using Scrapfly's JA3 fingerprint tool.
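To make the feature expansion concrete, here is a sketch of the kinds of ClientHello fields JA4-era fingerprinting keys on. This is NOT the real JA4 algorithm (which has a precise published spec); the struct, the summary format, and the byte counts are illustrative only, sized to match the figures earlier in this article.

```go
package main

import "fmt"

// handshakeFeatures collects feature classes that PQ adoption changes:
// the offered groups and the total key_share payload size.
type handshakeFeatures struct {
	Groups       []uint16 // supported_groups, in offered order
	KeyShareSize int      // total bytes across key_share entries
}

// summary renders an illustrative fingerprint-like string: PQ presence,
// group count, and key share size.
func (h handshakeFeatures) summary() string {
	pq := "nopq"
	for _, g := range h.Groups {
		if g == 0x11EC { // X25519MLKEM768
			pq = "pq"
			break
		}
	}
	return fmt.Sprintf("%s_g%d_ks%d", pq, len(h.Groups), h.KeyShareSize)
}

func main() {
	classic := handshakeFeatures{Groups: []uint16{29, 23, 24}, KeyShareSize: 36}
	hybrid := handshakeFeatures{Groups: []uint16{0x11EC, 29, 23, 24}, KeyShareSize: 1160}
	fmt.Println(classic.summary()) // nopq_g3_ks36
	fmt.Println(hybrid.summary())  // pq_g4_ks1160
}
```

Even this toy summary separates the two populations cleanly, which is why the classifiers cited above reach near-perfect accuracy with far richer feature sets.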

These detection mechanisms apply across all HTTP client stacks, but Go-based scrapers face an additional exposure from two named CVEs in the uTLS library.

The uTLS Vulnerability Chain: Go Scrapers Exposed

The Go scraping ecosystem relies heavily on uTLS, a fork of Go's crypto/tls that allows custom ClientHello fingerprints. Two named CVEs in uTLS showed that the strategy of pinning to a stable fingerprint breaks down as browsers evolve.

Why Pinned Fingerprints Exist (and Why PQ Breaks Them)

Scraping libraries pin to specific browser fingerprints (for example, HelloChrome_120) because stability matters more than freshness. A known-good fingerprint that passes detection is safer than auto-updating to an untested profile. If the fingerprint changes, something might break.

This strategy worked when browser TLS evolved gradually. A new cipher suite here, an extension reorder there. Changes were small enough that pinned fingerprints stayed plausible for months.

Post-quantum key exchange broke that assumption. Chrome's ClientHello grew from approximately 300 bytes to over 1,400 bytes in a single browser update. A tool pinning to HelloChrome_120 without PQ support now diverges from the real Chrome 120 ClientHello on the first connection. The stability advantage became a detection liability.

CVE-2026-26995: The Missing Padding Extension

The affected package is github.com/refraction-networking/utls, versions 1.6.0 through 1.8.1.

Real Chrome adds a padding extension to the ClientHello when the message is shorter than 512 bytes. Chrome does this to prevent certain network equipment from rejecting unusually short packets. The HelloChrome_120 non-PQ fingerprint in uTLS didn't replicate this padding.

The result: any ClientHello using Chrome 120 parameters but with a total length under 512 bytes is not a real Chrome 120 ClientHello. A deep packet inspection system can detect this with a single packet length measurement. For domain-fronted requests, the per-connection detection probability is approximately 25%. For direct IP connections, it jumps to approximately 50%.
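The padding rule reduces to a single length comparison; this sketch encodes the 512-byte rule described above (the boolean interface is a simplification of what a real DPI system would consume):

```go
package main

import "fmt"

// impossibleChromeHello flags a ClientHello that claims Chrome parameters
// but is shorter than 512 bytes. Real Chrome pads any ClientHello under
// 512 bytes up to 512 via the padding extension, so this combination
// never occurs in genuine Chrome traffic.
func impossibleChromeHello(claimsChrome bool, clientHelloLen int) bool {
	return claimsChrome && clientHelloLen < 512
}

func main() {
	fmt.Println(impossibleChromeHello(true, 310)) // true: flagged on one measurement
	fmt.Println(impossibleChromeHello(true, 512)) // false: plausible Chrome
}
```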

The CVE program later rejected the CVE on 2026-02-20, classifying it as an "external dependency vulnerability." The technical exposure is real regardless of the classification status. Fix: upgrade uTLS to version 1.8.2 or later.

CVE-2026-27017: ECH/GREASE Mismatch

This vulnerability covers versions 1.6.0 through 1.8.0, and it affects HelloChrome_120, HelloChrome_120_PQ, HelloChrome_131, and HelloChrome_133.

Chrome selects cipher suites for the outer ClientHello and for the Encrypted Client Hello (ECH) inner message consistently. When Chrome prefers AES for the outer cipher suite, it uses AES for ECH too. The affected uTLS fingerprints hardcode AES preference for the outer cipher suite but select the ECH cipher suite randomly between AES and ChaCha20.

That random selection gives a 50% chance of choosing ChaCha20 for ECH while using AES for the outer suite. That combination doesn't exist in real Chrome. Any server or middlebox checking this consistency will detect the mismatch on roughly half of all connections.
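A consistency check along these lines catches the mismatch. The cipher-family enum is a simplification for illustration; a real check would operate on concrete TLS cipher suite IDs rather than families.

```go
package main

import "fmt"

type cipherFamily int

const (
	famAES cipherFamily = iota
	famChaCha20
)

// echConsistent mirrors the rule described above: real Chrome picks the
// same cipher family for the outer ClientHello and the ECH inner message,
// so an AES outer suite paired with a ChaCha20 ECH suite never occurs.
func echConsistent(outer, ech cipherFamily) bool {
	return outer == ech
}

func main() {
	fmt.Println(echConsistent(famAES, famAES))      // true: plausible Chrome
	fmt.Println(echConsistent(famAES, famChaCha20)) // false: CVE-2026-27017 pattern
}
```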

The two CVEs compound: the padding issue runs at a 25-50% detection probability, and the ECH mismatch adds an independent 50% probability. Tools running both vulnerabilities approach near-certain detection over a session. Fix: upgrade uTLS to version 1.8.1 or later (1.8.2+ covers both CVEs).
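Treating the two checks as independent (as the text does), the compounding works out like this; the probabilities are the per-connection figures quoted above:

```go
package main

import "fmt"

// combined returns the per-connection detection probability when two
// independent checks fire with probabilities p1 and p2.
func combined(p1, p2 float64) float64 {
	return 1 - (1-p1)*(1-p2)
}

// overSession returns the probability of being detected at least once
// across n independent connections.
func overSession(perConn float64, n int) float64 {
	survive := 1.0
	for i := 0; i < n; i++ {
		survive *= 1 - perConn
	}
	return 1 - survive
}

func main() {
	// Direct-IP padding check (~50%) plus ECH mismatch (~50%), per the text.
	perConn := combined(0.50, 0.50)
	fmt.Printf("per-connection: %.2f\n", perConn) // per-connection: 0.75
	fmt.Printf("over 10 connections: %.6f\n", overSession(perConn, 10))
}
```

At a 75% per-connection rate, detection over even a short session is effectively certain, which is what "near-certain detection over a session" means quantitatively.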

Ecosystem Impact: Colly, go-rod, and Beyond

uTLS is the foundation for TLS fingerprint spoofing across the Go scraping ecosystem. Web crawlers built on Colly, browser automation tools using go-rod, and custom Go HTTP clients all depend on uTLS for fingerprint management.

The CVE disclosures revealed that tools pinning to HelloChrome_120 or related pre-PQ fingerprints had been detectable for over two years. Even after upgrading to 1.8.2+, tools face the broader problem: a manually maintained fingerprint approach doesn't adapt when browsers update their TLS behavior. Post-quantum key exchange is the most visible example of this fragility, but it won't be the last.

CDN Enforcement: Who's Checking What

Three major CDN providers have publicly documented their approach to post-quantum TLS. Their enforcement postures differ in aggressiveness, but the direction is the same.

Cloudflare protects more than 20% of websites and handles over a third of human web traffic with PQ encryption as of late 2025. Cloudflare uses both JA3 and JA4 fingerprinting as part of its bot management pipeline, comparing incoming ClientHello fingerprints against a database of known-good browser patterns. PQ presence is part of that comparison. Bot traffic that spoofs Chrome fingerprints without the matching PQ key share creates a fingerprint with no known-good match in Cloudflare's database.

Akamai disclosed a patent on TLS fingerprinting for bot detection and published research showing 92-98% bot classification accuracy through cross-layer analysis. Akamai correlates TLS fingerprints with HTTP behavior and round-trip time. Post-quantum TLS became the default for all Akamai connections on January 31, 2026. Any scraping traffic reaching Akamai without a PQ key share now operates outside the baseline Akamai defines as normal browser behavior.

Akamai also detects "cipher stunting" (a technique where scrapers intentionally downgrade TLS cipher suite selection to avoid fingerprinting). PQ key exchange makes cipher stunting more detectable because the downgrade now affects packet size, not just cipher suite selection.

F5 published the 57.4% PQ browser readiness figure and monitors PQ adoption across enterprise TLS infrastructure. F5's role in the detection stack is primarily monitoring and reporting, but the data F5 publishes sets the industry baseline that other CDNs use to calibrate their detection thresholds.

| CDN | PQ Status | Fingerprinting Method | Detection Capability |
| --- | --- | --- | --- |
| Cloudflare | >50% of human traffic protected | JA3/JA4 against known-good DB | PQ presence as classification signal |
| Akamai | Default since January 2026 | Patented cross-layer TLS analysis | 92-98% accuracy |
| F5 | Infrastructure monitoring | TLS traffic analysis | PQ readiness baseline |

Understanding the enforcement posture helps, but the timeline shows how quickly the window for adapting has narrowed.

Timeline: When PQ Becomes Table Stakes

The PQ adoption curve isn't linear. Browser adoption has been building since 2024, but CDN enforcement is compressing into a short window in 2026.

The Compressed Timeline Thesis

Here's how the timeline looks from a scraping team's perspective:

April 2024: Chrome enables X25519MLKEM768 by default. Any request claiming Chrome 124+ without PQ starts diverging from real browser traffic. Most scraping tools don't notice because CDN enforcement hasn't caught up.

November 2024: Firefox follows. Both major desktop browsers now send PQ key shares by default. The baseline for a "real browser" TLS handshake has shifted.

October 2025: Apple ships PQ support for iOS and macOS. The last major platform holdout joins default PQ deployment.

January 2026: Akamai makes PQ the default for all connections. CDN enforcement catches up with browser adoption. A scraping tool without PQ support is now operating outside the browser baseline on Akamai-protected sites.

Late 2026: As PQ absence becomes rarer in real browser traffic, its value as a bot signal increases. PQ absence moves from "useful signal" to "near-certain indicator" for traffic claiming modern browsers.

The post-quantum transition is a step function, not a gradient. Unlike cipher suite evolution (where fingerprints diverged gradually over months), PQ added 1,088 bytes to the ClientHello in a single browser update. There's no gradual middle ground.

Spoofing library maintenance hasn't kept pace with browser changes. Most of the Go scraping ecosystem still uses manually maintained fingerprint profiles. PQ key generation, hybrid algorithm formatting, and TCP segmentation handling all add complexity to maintaining an accurate fingerprint. Encrypted Client Hello (ECH) adds the next layer of complexity: ECH encrypts the Server Name Indication (SNI) field, adding further fingerprinting signals that CDNs already track.

Given where enforcement stands today, here are the steps that address the most immediate exposure.

What Scraping Teams Should Do Now

The window to adapt is open but closing. These steps apply to scraping teams managing their own TLS stack.

Audit your TLS library for PQ support. Check whether your HTTP client sends an X25519MLKEM768 key share. Verify this using Scrapfly's JA3 fingerprint tool or by inspecting your ClientHello in Wireshark. If you see only X25519 in the key_shares extension, your stack doesn't have PQ support.
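If you capture the key_share extension bytes in Wireshark, a small parser can list the offered groups. This sketch parses only the extension body (a 2-byte list length, then entries of 2-byte group ID, 2-byte key length, and the key bytes, per RFC 8446), not a full ClientHello; the synthetic input uses dummy 4-byte keys since the parser doesn't validate key sizes.

```go
package main

import (
	"encoding/binary"
	"fmt"
)

// parseKeyShareGroups extracts the named-group IDs from the body of a
// TLS 1.3 key_share extension (extension type 51).
func parseKeyShareGroups(data []byte) ([]uint16, error) {
	if len(data) < 2 {
		return nil, fmt.Errorf("truncated key_share extension")
	}
	listLen := int(binary.BigEndian.Uint16(data[:2]))
	body := data[2:]
	if listLen > len(body) {
		return nil, fmt.Errorf("list length exceeds data")
	}
	var groups []uint16
	for off := 0; off+4 <= listLen; {
		group := binary.BigEndian.Uint16(body[off : off+2])
		keyLen := int(binary.BigEndian.Uint16(body[off+2 : off+4]))
		groups = append(groups, group)
		off += 4 + keyLen
	}
	return groups, nil
}

func main() {
	// Synthetic extension body: X25519 (29), then X25519MLKEM768 (0x11EC),
	// each with a 4-byte dummy key.
	ext := []byte{
		0x00, 0x10, // client_shares list length: 16 bytes
		0x00, 0x1D, 0x00, 0x04, 1, 2, 3, 4, // group 29
		0x11, 0xEC, 0x00, 0x04, 5, 6, 7, 8, // group 4588
	}
	groups, err := parseKeyShareGroups(ext)
	fmt.Println(groups, err) // [29 4588] <nil>
}
```

If the returned groups contain 29 but not 4588, your stack is offering only classic X25519 and has no PQ support.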

Upgrade uTLS if you're building with Go. Versions 1.6.0 through 1.8.1 carry CVE-2026-26995. Versions 1.6.0 through 1.8.0 carry CVE-2026-27017. Upgrade to 1.8.2 or later. Then verify the HelloChrome fingerprint you're using includes the PQ key share.

Stop pinning pre-PQ Chrome fingerprints. HelloChrome_120 without PQ support is now a detectable fingerprint mismatch. Switch to HelloChrome_131 or a profile that includes X25519MLKEM768. Don't update your User-Agent string without updating the TLS stack to match.

Test your ClientHello against expected packet structure. httpcloak provides a way to verify that your ClientHello matches the expected size and extension order for your claimed fingerprint. Pay specific attention to key share size and TCP packet count.

Monitor HelloRetryRequest rates. A HelloRetryRequest (HRR) occurs when the server doesn't support the key group the client offered. A higher HRR rate in your scraping traffic compared to real browser baselines creates a detectable behavioral signal. Proper PQ support reduces HRR frequency to match browser behavior.
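A minimal HRR monitor might look like the sketch below. The 2x-baseline threshold and the 2% baseline in the example are arbitrary illustrations, not published cutoffs; calibrate against your own browser-traffic measurements.

```go
package main

import "fmt"

// hrrRateSuspicious compares the observed HelloRetryRequest rate of your
// scraping traffic against a browser baseline rate. A stack offering only
// classic groups to PQ-preferring servers will see elevated HRRs.
func hrrRateSuspicious(hrrCount, totalConns int, baselineRate float64) bool {
	if totalConns == 0 {
		return false
	}
	observed := float64(hrrCount) / float64(totalConns)
	return observed > 2*baselineRate // illustrative threshold
}

func main() {
	fmt.Println(hrrRateSuspicious(120, 1000, 0.02)) // true: 12% vs 2% baseline
	fmt.Println(hrrRateSuspicious(15, 1000, 0.02))  // false: within range
}
```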

PQ support is a moving target. Maintaining fingerprint accuracy is ongoing engineering work, not a one-time fix.

If your team would rather not own that ongoing TLS work, the alternative is to push fingerprint management down to the infrastructure layer.

Handle PQ Fingerprinting with Scrapfly

[Diagram: a request passing through Scrapfly's anti-scraping protection middleware before reaching the target website]

Scrapfly manages TLS fingerprinting, post-quantum key exchange, and fingerprint rotation at the infrastructure level. When you send a request with anti-scraping protection enabled, Scrapfly handles the ClientHello using browser-grade TLS stacks. You don't need to track CVEs, maintain Chrome fingerprint profiles, or rebuild your TLS stack as browsers update.

For teams that don't want to run a parallel TLS engineering program alongside their data pipeline, Scrapfly handles the arms race automatically.

FAQ

Does post-quantum TLS affect web scraping?

Yes. PQ key exchange creates a new fingerprinting signal that CDNs use to classify traffic. A scraper without PQ support signals its identity before any HTTP request is processed.

What is X25519MLKEM768?

X25519MLKEM768 is a hybrid key exchange algorithm combining X25519 elliptic curve cryptography with ML-KEM-768 (formerly Kyber). Chrome and Firefox send this key share by default in every TLS 1.3 connection as of 2024-2025.

Is uTLS still safe for web scraping?

Versions 1.6.0 through 1.8.1 carry CVE-2026-26995, and versions 1.6.0 through 1.8.0 carry CVE-2026-27017. Upgrading to 1.8.2+ fixes both CVEs, but tools using uTLS still face the broader problem of maintaining accurate fingerprints as browsers update their TLS behavior.

How do CDNs detect scraping tools via TLS?

CDNs compare the ClientHello fingerprint against a database of known-good browser patterns. They check cipher suites, extensions, and key shares, including PQ key share presence. Cross-referencing the User-Agent against the TLS fingerprint catches most mismatches without machine learning.

When will post-quantum TLS be required?

PQ is already the default in Chrome, Firefox, and Apple's ecosystem. Akamai made it the default for all connections in January 2026. By late 2026, PQ absence will be a near-certain bot indicator for traffic claiming modern browsers.

Summary

Post-quantum TLS is already a detection vector, not a future concern. By early 2026, more than half of all browser traffic includes a PQ key share. CDNs cross-reference that presence against User-Agent headers. A scraping tool claiming Chrome 131 without PQ key exchange fails a binary check that runs before HTTP traffic starts.

Two named CVEs in uTLS showed that pinning to a stable fingerprint becomes a liability when browsers make step-function changes to their TLS stacks. Go-based scrapers using HelloChrome_120 or related pre-PQ profiles were exposed for over two years. Upgrading uTLS to 1.8.2+ fixes the immediate CVEs, but the broader challenge is keeping fingerprints accurate as browsers keep updating.

Akamai's January 2026 enforcement deadline marks the inflection point where CDN detection caught up with browser adoption. For scraping teams managing their own TLS stack, the time to audit and upgrade is now.
