WebGL Fingerprint Test
Analyze your WebGL and WebGL2 capabilities. Test GPU rendering, shader precision, extensions, and 3D performance. See how your graphics card creates a unique browser fingerprint.
WebGL Fingerprint Hash
This hash uniquely identifies your GPU configuration (click to copy)
GPU Information
GPU Vendor: The manufacturer of your graphics card (NVIDIA, AMD, Intel, Apple, ARM, etc.)
GPU Renderer: The exact model and driver version (e.g., "NVIDIA GeForce RTX 3080", "Apple M1 Pro")
WebGL Version: The WebGL specification version your browser supports
This combination is extremely identifying. Even among users with the same GPU model, driver versions create variations. Browsers expose this information through the WEBGL_debug_renderer_info extension.
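A minimal sketch of how a page reads these strings. The `getGpuInfo` helper name is ours; the extension and the `UNMASKED_*` enums are standard WebGL. The context object is passed in, so the logic can be exercised with a stub outside a browser:

```javascript
// Read the GPU vendor/renderer strings from a WebGL context.
// Accepts any object exposing getExtension/getParameter.
function getGpuInfo(gl) {
  const ext = gl.getExtension('WEBGL_debug_renderer_info');
  if (!ext) {
    // Some browsers (e.g. Firefox with privacy.resistFingerprinting)
    // hide the extension; fall back to the masked strings.
    return {
      vendor: gl.getParameter(gl.VENDOR),
      renderer: gl.getParameter(gl.RENDERER),
    };
  }
  return {
    vendor: gl.getParameter(ext.UNMASKED_VENDOR_WEBGL),
    renderer: gl.getParameter(ext.UNMASKED_RENDERER_WEBGL),
  };
}

// In a browser:
// const gl = document.createElement('canvas').getContext('webgl');
// console.log(getGpuInfo(gl)); // e.g. { vendor: 'Google Inc. (Apple)', renderer: 'Apple M1 Pro' }
```

Note the fallback path: even when the debug extension is blocked, the masked VENDOR/RENDERER strings still leak some information.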
WebGL Extensions
Common Extensions:
• WEBGL_debug_renderer_info - Exposes GPU vendor/renderer (fingerprinting risk)
• OES_texture_float - Floating-point textures for advanced rendering
• WEBGL_compressed_texture_* - Hardware texture compression formats
• EXT_texture_filter_anisotropic - High-quality texture filtering
• OES_standard_derivatives - Shader derivatives for advanced effects
The exact set of extensions varies by GPU, driver, and browser, making it a strong fingerprint vector.
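A sketch of how the extension list becomes part of a fingerprint hash. `extensionFingerprint` and the FNV-1a helper are illustrative names of ours; `getSupportedExtensions()` is the standard WebGL call, and real fingerprinters mix many more inputs into the hash:

```javascript
// 32-bit FNV-1a hash of a string, rendered as 8 hex characters.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16).padStart(8, '0');
}

// Hash the set of supported extensions into a compact fingerprint.
// Sorting makes the result independent of enumeration order.
function extensionFingerprint(gl) {
  const exts = gl.getSupportedExtensions() || [];
  return fnv1a(exts.slice().sort().join(';'));
}

// In a browser:
// const gl = document.createElement('canvas').getContext('webgl');
// console.log(extensionFingerprint(gl));
```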
Shader Precision Formats
Shader Types:
• Vertex Shader: Processes each vertex (3D point) in your scene
• Fragment Shader: Processes each pixel to determine its final color
Precision Levels:
• HIGH - Maximum precision (typically 32-bit float or 16-bit int)
• MEDIUM - Balanced precision and performance (typically 16-bit float or 10-bit int)
• LOW - Minimum precision for performance (typically 10-bit float or 8-bit int)
Data Types:
• FLOAT - Decimal numbers (positions, colors, normals)
• INT - Whole numbers (indices, counters)
Precision Format Values:
• rangeMin/rangeMax - The minimum/maximum representable value (log2 scale)
• precision - Number of bits of precision available
Different GPUs support different precision capabilities, making this a strong fingerprinting vector. Mobile GPUs often have different precision support than desktop GPUs.
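The full shader/precision matrix can be collected with the standard `getShaderPrecisionFormat()` call. The `precisionMatrix` helper below is a sketch of ours, iterating every shader type and precision level named above:

```javascript
// Collect rangeMin/rangeMax/precision for every shader type and
// precision level - 12 entries that differ across GPUs and drivers.
function precisionMatrix(gl) {
  const shaders = ['VERTEX_SHADER', 'FRAGMENT_SHADER'];
  const types = ['LOW_FLOAT', 'MEDIUM_FLOAT', 'HIGH_FLOAT',
                 'LOW_INT', 'MEDIUM_INT', 'HIGH_INT'];
  const out = {};
  for (const s of shaders) {
    for (const t of types) {
      const fmt = gl.getShaderPrecisionFormat(gl[s], gl[t]);
      // Each entry contributes three fingerprintable numbers.
      out[`${s}.${t}`] = fmt ? [fmt.rangeMin, fmt.rangeMax, fmt.precision] : null;
    }
  }
  return out;
}

// In a browser, a desktop GPU typically reports HIGH_FLOAT as
// [127, 127, 23], i.e. full IEEE 754 single precision.
```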
WebGL vs WebGL2
WebGL 1.0
WebGL 2.0
New Features:
• 3D textures and texture arrays
• Multiple render targets (MRT)
• Transform feedback
• Sampler objects
• Uniform buffer objects
• Integer textures and attributes
• Improved shading language: GLSL ES 3.00
Browser Support: Chrome 56+, Firefox 51+, Edge 79+, Safari 15+
WebGL2 support varies by browser and GPU, adding another dimension to fingerprinting.
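Detecting which versions a browser exposes is a one-liner per context name. In this sketch the canvas factory is injected (an assumption of ours) so the logic itself is testable outside a browser; the context names are standard:

```javascript
// Probe WebGL and WebGL2 support via the standard context names.
function webglSupport(createCanvas) {
  const canvas = createCanvas();
  return {
    webgl: !!(canvas.getContext('webgl') ||
              canvas.getContext('experimental-webgl')),
    webgl2: !!canvas.getContext('webgl2'),
  };
}

// In a browser:
// webglSupport(() => document.createElement('canvas'));
// e.g. { webgl: true, webgl2: true } on a modern desktop browser
```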
Privacy Implications
- GPU Information - Vendor and renderer are highly unique
- Rendering Differences - Even identical code produces different pixels across GPUs
- Extension Support - Varies by GPU, driver, and browser combination
- Shader Capabilities - Precision formats differ between hardware
- Performance Characteristics - Rendering speed and frame rates are distinctive
Protection Strategies:
1. Disable WebGL: Turn off WebGL entirely in browser settings (breaks WebGL-dependent sites)
2. Use Tor Browser: Standardizes WebGL parameters to reduce uniqueness
3. Firefox Privacy Settings: Enable privacy.resistFingerprinting and webgl.disable-extensions
4. Browser Extensions: Canvas Defender, Trace, or similar can spoof WebGL parameters
5. Virtual Machines: Use VMs with common GPU configurations for sensitive browsing
6. Anti-detect Browsers: For web scraping, use Scrapfly with automated fingerprint management
How WebGL Fingerprinting Works
WebGL fingerprinting uses your graphics card (GPU) to create a unique identifier by testing rendering capabilities and extracting hardware-specific information.
Common Techniques:
- GPU Vendor & Renderer Detection: The WEBGL_debug_renderer_info extension exposes the exact GPU model and driver version
- Rendering Comparison: Drawing identical shapes produces different pixel-level results across GPUs due to floating-point precision variations
- Extension Enumeration: Testing for 50+ WebGL extensions creates a unique capability matrix
- Parameter Probing: Querying hundreds of WebGL parameters (texture sizes, viewport limits, aliasing support, etc.)
- Shader Precision Testing: Different GPUs support different precision levels for floating-point and integer operations
- Performance Benchmarking: Measuring rendering speed and frame rates reveals GPU performance characteristics
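The parameter-probing step above can be sketched in a few lines. `parameterProfile` is an illustrative helper of ours probing just five of the hundreds of limits a real fingerprinter queries; the parameter names are standard WebGL enums:

```javascript
// A small sample of the WebGL limits a fingerprinter queries.
const PARAMS = ['MAX_TEXTURE_SIZE', 'MAX_VIEWPORT_DIMS', 'MAX_VERTEX_ATTRIBS',
                'MAX_RENDERBUFFER_SIZE', 'MAX_COMBINED_TEXTURE_IMAGE_UNITS'];

// Pack each name=value pair into one profile string; the values vary
// by GPU, driver, and browser, so the string is itself identifying.
function parameterProfile(gl) {
  return PARAMS.map((name) => `${name}=${gl.getParameter(gl[name])}`).join(';');
}

// In a browser:
// const gl = document.createElement('canvas').getContext('webgl');
// parameterProfile(gl);
// e.g. "MAX_TEXTURE_SIZE=16384;MAX_VIEWPORT_DIMS=16384,16384;..."
```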
Why It's Highly Effective:
- Hardware Diversity: Thousands of GPU models with different capabilities
- Driver Variations: Even identical GPUs produce different results with different driver versions
- Pixel-Perfect Differences: Anti-aliasing and sub-pixel rendering vary by hardware
- Difficult to Spoof: Software-level spoofing is challenging due to hardware constraints
- Cross-browser Consistent: GPU fingerprints remain similar across different browsers on the same machine
Real-World Impact:
Studies show WebGL fingerprinting can uniquely identify 99.24% of desktop users and 94.5% of mobile users. Combined with other fingerprinting techniques (Canvas, Audio, Fonts), it creates a near-perfect tracking system that works without cookies.
For web scrapers and automation, matching target users' GPU characteristics is critical to avoid detection by anti-bot systems.