Filtering & Detection

CloakRadar uses multiple detection layers to identify and filter unwanted traffic. This guide explains each detection method and how to configure them for optimal results.

Detection Overview

CloakRadar analyzes every visitor using multiple detection methods:

- Multi-Layer Detection: each visitor is analyzed across 6+ detection layers simultaneously.
- Real-Time Analysis: all detection happens in under 5 ms, invisible to visitors.

Detection Layers

| Layer | What It Checks | Accuracy |
| --- | --- | --- |
| IP Analysis | Datacenter, VPN, proxy, Tor, geolocation | 95%+ |
| User Agent | Bot signatures, browser validity | 90%+ |
| TCP/IP Fingerprint | OS stack characteristics | 98%+ |
| TLS Fingerprint | JA3/JA4 signatures, SSL client type | 99%+ |
| JS Fingerprint | Canvas, WebGL, browser environment | 99%+ |
| Behavioral | Mouse, scroll, keyboard patterns | 95%+ |
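Internally, per-layer results have to be combined into one allow/block decision. As a minimal sketch of that idea (the weights, threshold, and function names here are illustrative assumptions, not CloakRadar's actual implementation):

```python
# Sketch: combining per-layer bot scores into an allow/block decision.
# Weights and threshold are illustrative, not CloakRadar's real values.

LAYER_WEIGHTS = {
    "ip": 0.95, "user_agent": 0.90, "tcp_fingerprint": 0.98,
    "tls_fingerprint": 0.99, "js_fingerprint": 0.99, "behavioral": 0.95,
}

def risk_score(layer_scores: dict) -> float:
    """Weighted average of per-layer bot probabilities (0.0 = human, 1.0 = bot)."""
    total = sum(LAYER_WEIGHTS[k] for k in layer_scores)
    return sum(LAYER_WEIGHTS[k] * v for k, v in layer_scores.items()) / total

def should_block(layer_scores: dict, threshold: float = 0.5) -> bool:
    return risk_score(layer_scores) >= threshold

print(should_block({"ip": 0.9, "tls_fingerprint": 0.8}))  # likely bot -> True
print(should_block({"ip": 0.1, "behavioral": 0.05}))      # likely human -> False
```

Weighting layers by their accuracy means a high-confidence signal such as a TLS fingerprint match counts for more than a weaker one such as a user-agent check.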

Bot Detection

Identifies automated visitors including search engine bots, scrapers, and automation tools.

What Gets Detected

| Category | Examples | Detection Method |
| --- | --- | --- |
| Search Bots | Googlebot, Bingbot, Yandex, Baidu | User agent, IP ranges |
| Headless Browsers | Puppeteer, Playwright, Selenium | WebDriver, JS checks |
| Automation Tools | PhantomJS, SlimerJS, HtmlUnit | Environment fingerprint |
| Scrapers | HTTrack, Wget, Curl | User agent, headers |
| Spy Tools | AdSpy, BigSpy, AdPlexity | IP ranges, patterns |
| Ad Verification | DoubleVerify, IAS, Moat | IP ranges, UA patterns |

Bot Detection Levels

| Level | Description | Recommended For |
| --- | --- | --- |
| Off | No bot detection | Testing only |
| Low | Only obvious bots (search crawlers) | Minimal filtering |
| Medium | Standard detection (recommended) | Most campaigns |
| High | Aggressive detection, may catch edge cases | High-risk campaigns |
| Paranoid | Maximum filtering, strictest rules | Very sensitive campaigns |
Warning: "Paranoid" mode may filter out some legitimate users with unusual browser configurations. Monitor your traffic quality when using this setting.

Fingerprinting

Fingerprinting creates unique identifiers for each visitor based on their device and browser characteristics.

TCP/IP Fingerprinting

Analyzes low-level TCP/IP stack characteristics to identify the visitor's true operating system, regardless of what the user agent claims.

Why This Matters

If a visitor claims to use Windows but their TCP/IP stack shows Linux, they're likely spoofing their user agent (common for bots).
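A minimal sketch of that mismatch check. The TTL heuristic is a well-known rule of thumb (Windows stacks start at an initial TTL of 128; Linux, macOS, and Android start at 64), but these function names and the simplified two-way classification are illustrative, not CloakRadar's actual logic:

```python
# Sketch: flag visitors whose user-agent OS contradicts the OS implied by
# the TCP/IP stack. Heuristic: initial TTL 128 = Windows, 64 = Linux/macOS.

def os_from_ttl(observed_ttl: int) -> str:
    # Observed TTL is the initial TTL minus hop count, so anything above 64
    # almost certainly started at 128 (a Windows stack).
    return "windows" if observed_ttl > 64 else "linux"

def os_from_user_agent(ua: str) -> str:
    return "windows" if "windows" in ua.lower() else "linux"

def is_spoofed(ua: str, observed_ttl: int) -> bool:
    """True when the claimed OS and the stack-level OS disagree."""
    return os_from_user_agent(ua) != os_from_ttl(observed_ttl)

# A "Windows" user agent arriving with a Linux-like TTL is suspicious:
print(is_spoofed("Mozilla/5.0 (Windows NT 10.0)", observed_ttl=54))  # True
```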

TLS/JA3 Fingerprinting

Analyzes the TLS handshake to identify the client type:

| Component | What It Tells Us |
| --- | --- |
| Cipher Suites | Which encryption methods the client supports |
| Extensions | TLS extensions requested |
| Curves | Elliptic curves supported |
| Point Formats | EC point format support |

The JA3 hash is a 32-character MD5 fingerprint that is effectively unique to each client application, allowing CloakRadar to identify the real software behind a request even when the user agent is spoofed.
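The JA3 construction itself is public: the five handshake fields are joined with commas, with the values inside each field joined by dashes, and the resulting string is MD5-hashed. A self-contained sketch with made-up handshake values (not a real client's):

```python
import hashlib

def ja3_hash(version, ciphers, extensions, curves, point_formats):
    """JA3: MD5 over 'version,ciphers,extensions,curves,point_formats',
    where the numbers inside each field are joined by dashes."""
    fields = [str(version)] + [
        "-".join(str(v) for v in part)
        for part in (ciphers, extensions, curves, point_formats)
    ]
    return hashlib.md5(",".join(fields).encode()).hexdigest()

# Illustrative handshake values, not captured from a real client:
fp = ja3_hash(771, [4865, 4866], [0, 10, 11], [29, 23], [0])
print(len(fp))  # 32: an MD5 hex digest is always 32 characters
```

Because the hash is deterministic, the same client software produces the same JA3 on every connection, which is what makes it usable as a blocklist key.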

JavaScript Fingerprinting

Collects 1500+ browser attributes including:

| Category | Attributes Collected |
| --- | --- |
| Canvas | Unique rendering patterns from drawing operations |
| WebGL | GPU vendor, renderer, supported extensions |
| Audio | AudioContext fingerprint |
| Fonts | Installed fonts list |
| Screen | Resolution, color depth, pixel ratio |
| Hardware | CPU cores, memory, touch support |
| Browser | Plugins, languages, timezone |
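Conceptually, the collected attributes are collapsed into a single stable identifier per device. A sketch of that idea (the hashing scheme and attribute names here are assumptions, not CloakRadar's actual pipeline):

```python
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Collapse collected browser attributes into one stable identifier.
    Sorting keys makes the hash independent of collection order."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

# Same attributes in a different order yield the same fingerprint:
a = device_fingerprint({"webgl_vendor": "NVIDIA", "cpu_cores": 8, "timezone": "UTC"})
b = device_fingerprint({"timezone": "UTC", "cpu_cores": 8, "webgl_vendor": "NVIDIA"})
print(a == b)  # True
```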

Behavioral Analysis

Analyzes how visitors interact with the page to detect non-human behavior.

Signals analyzed include:

- Mouse movement patterns
- Scroll patterns
- Keyboard patterns

How to Enable

Behavioral analysis requires our JavaScript snippet on your landing page. Add it to collect behavioral data for enhanced detection.
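To illustrate the kind of signal behavioral analysis looks for: automation tends to move the cursor along perfectly straight paths, while human movement wobbles. A sketch of one such heuristic (the tolerance value and function name are illustrative, not CloakRadar's actual rules):

```python
def path_is_linear(points, tolerance=1.0):
    """Heuristic: a cursor path is bot-like if every point lies (nearly) on
    the straight line between the first and last point. Human paths wobble."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5 or 1.0
    for x, y in points[1:-1]:
        # Perpendicular distance from (x, y) to the line through the endpoints.
        dist = abs(dy * (x - x0) - dx * (y - y0)) / length
        if dist > tolerance:
            return False
    return True

print(path_is_linear([(0, 0), (5, 5), (10, 10)]))  # robotic path -> True
print(path_is_linear([(0, 0), (3, 9), (10, 10)]))  # wobbly path  -> False
```

A real system would combine many such heuristics (timing, acceleration, scroll cadence) rather than rely on any single one.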

Geographic Filtering

Filter traffic based on location.

Country Filter

Allow or block specific countries:

Mode: Whitelist
Countries: United States, Canada, United Kingdom

Result: Only visitors from US, CA, UK see your offer
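In whitelist mode the check reduces to set membership on the visitor's ISO country code. A hypothetical helper (note that GB is the ISO 3166-1 code for the United Kingdom):

```python
ALLOWED_COUNTRIES = {"US", "CA", "GB"}  # whitelist mode

def passes_geo_filter(country_code: str) -> bool:
    """Whitelist mode: only visitors from listed countries see the offer."""
    return country_code.upper() in ALLOWED_COUNTRIES

print(passes_geo_filter("us"))  # True
print(passes_geo_filter("DE"))  # False
```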

Region/City Filter

For more granular control, you can also filter by specific regions or cities.

Best Practice: Match your geo filters to your ad targeting. If you're only running ads in the US, only allow US traffic.

Device Filtering

Filter visitors by:

- Device type
- Operating system
- Browser

IP & Network Filtering

VPN Detection

Detects traffic from 25+ commercial VPN providers.

Proxy Detection

Detects various types of proxies.

Datacenter Detection

Blocks traffic from 30+ datacenter and hosting provider ASNs.

Tor Detection

Blocks Tor exit nodes using regularly updated lists.

WebRTC Leak Detection

Uses WebRTC to discover a visitor's real IP address even when it is hidden behind a VPN.

Custom Rules

Referrer Filter

Allow or block based on referring URL:

Mode: Whitelist
Referrers: facebook.com, fb.com

Result: Only traffic from Facebook is allowed
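A whitelist match like this typically compares the referrer's hostname against the allowed domains, including subdomains. A sketch of that check (the helper name is an assumption):

```python
from urllib.parse import urlparse

ALLOWED_REFERRERS = {"facebook.com", "fb.com"}

def referrer_allowed(referrer_url: str) -> bool:
    """Match the referrer hostname against the whitelist, including
    subdomains (m.facebook.com matches facebook.com)."""
    host = (urlparse(referrer_url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in ALLOWED_REFERRERS)

print(referrer_allowed("https://m.facebook.com/ads"))  # True
print(referrer_allowed("https://evil-facebook.com/"))  # False
```

Matching on the parsed hostname, rather than a substring of the full URL, prevents look-alike domains such as `evil-facebook.com` from slipping through.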

User Agent Filter

Block specific user agent patterns using regex:

Pattern: .*curl.*|.*wget.*|.*python.*
Action: Block

Result: Blocks common scraping tools
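The same pattern applied in Python, to show how it behaves against real user agents. Case-insensitive matching is assumed here, since user-agent casing varies:

```python
import re

# Same pattern as the example above; IGNORECASE because UA casing varies.
BLOCK_PATTERN = re.compile(r".*curl.*|.*wget.*|.*python.*", re.IGNORECASE)

def ua_blocked(user_agent: str) -> bool:
    return BLOCK_PATTERN.search(user_agent) is not None

print(ua_blocked("curl/8.4.0"))                        # True
print(ua_blocked("python-requests/2.31"))              # True
print(ua_blocked("Mozilla/5.0 (Windows NT 10.0)"))     # False
```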

URL Parameter Filter

Filter based on URL parameters:

Parameter: source
Value: facebook
Action: Allow only

Result: Only clicks with ?source=facebook are allowed
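In "Allow only" mode this is a straightforward query-string check. A sketch (the landing-page URL and helper name are hypothetical):

```python
from urllib.parse import parse_qs, urlparse

def passes_param_filter(url: str, param: str = "source", value: str = "facebook") -> bool:
    """'Allow only' mode: the click must carry ?source=facebook."""
    query = parse_qs(urlparse(url).query)
    return value in query.get(param, [])

print(passes_param_filter("https://lp.example.com/?source=facebook"))  # True
print(passes_param_filter("https://lp.example.com/?source=google"))    # False
print(passes_param_filter("https://lp.example.com/"))                  # False
```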

ISP Filter

Block or allow specific ISPs/organizations.

Detection Levels

Quick configuration presets for common scenarios:

| Preset | Best For | Filters Enabled |
| --- | --- | --- |
| Light | High-quality traffic sources | Bot detection (Low), Geo filter |
| Standard | Most campaigns | Bot (Medium), VPN, Datacenter, Geo |
| Strict | Sensitive offers | All filters, Bot (High), Fingerprinting |
| Maximum | Highest protection needed | All filters (Paranoid), Behavioral |
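The presets above can be thought of as bundles of individual filter settings. A sketch of that mapping (the keys and structure are assumptions, not CloakRadar's actual configuration schema):

```python
# Illustrative preset-to-settings mapping; keys are assumptions,
# not CloakRadar's real configuration schema.
PRESETS = {
    "light":    {"bot_level": "low",      "geo": True, "vpn": False, "datacenter": False, "fingerprint": False, "behavioral": False},
    "standard": {"bot_level": "medium",   "geo": True, "vpn": True,  "datacenter": True,  "fingerprint": False, "behavioral": False},
    "strict":   {"bot_level": "high",     "geo": True, "vpn": True,  "datacenter": True,  "fingerprint": True,  "behavioral": False},
    "maximum":  {"bot_level": "paranoid", "geo": True, "vpn": True,  "datacenter": True,  "fingerprint": True,  "behavioral": True},
}

print(PRESETS["standard"]["bot_level"])  # medium
```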
Recommendation: Start with Standard settings and adjust based on your traffic quality reports. If you see too many bots getting through, increase detection; if legitimate users are being blocked, decrease it.


Next: Learn how to read your data in Analytics & Reports.