Filtering & Detection
CloakRadar uses multiple detection layers to identify and filter unwanted traffic. This guide explains each detection method and how to configure them for optimal results.
Detection Overview
CloakRadar analyzes every visitor using multiple detection methods:
Multi-Layer Detection
Each visitor is analyzed across 6+ detection layers simultaneously.
Real-Time Analysis
All detection happens in under 5ms, invisible to visitors.
Detection Layers
| Layer | What It Checks | Accuracy |
|---|---|---|
| IP Analysis | Datacenter, VPN, proxy, Tor, geolocation | 95%+ |
| User Agent | Bot signatures, browser validity | 90%+ |
| TCP/IP Fingerprint | OS stack characteristics | 98%+ |
| TLS Fingerprint | JA3/JA4 signatures, SSL client type | 99%+ |
| JS Fingerprint | Canvas, WebGL, browser environment | 99%+ |
| Behavioral | Mouse, scroll, keyboard patterns | 95%+ |
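The per-layer verdicts above feed into a single allow/block decision. As an illustration only, here is a minimal sketch of one way such verdicts could be combined with a weighted score; the layer names, weights, and threshold are hypothetical and do not reflect CloakRadar's actual scoring model.

```python
# Hypothetical combination of per-layer bot verdicts into one decision.
# Weights loosely mirror the accuracy column above; they are illustrative.
LAYER_WEIGHTS = {
    "ip": 0.95,
    "user_agent": 0.90,
    "tcp_fingerprint": 0.98,
    "tls_fingerprint": 0.99,
    "js_fingerprint": 0.99,
    "behavioral": 0.95,
}

def combined_bot_score(verdicts: dict) -> float:
    """Weighted fraction of evaluated layers that flagged the visitor."""
    total = sum(LAYER_WEIGHTS[name] for name in verdicts)
    flagged = sum(LAYER_WEIGHTS[name] for name, is_bot in verdicts.items() if is_bot)
    return flagged / total if total else 0.0

def should_block(verdicts: dict, threshold: float = 0.5) -> bool:
    """Block when a weighted majority of layers agree the visitor is a bot."""
    return combined_bot_score(verdicts) >= threshold
```

Because each layer contributes independently, a single false positive in one layer is not enough to block a visitor on its own.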
Bot Detection
Identifies automated visitors including search engine bots, scrapers, and automation tools.
What Gets Detected
| Category | Examples | Detection Method |
|---|---|---|
| Search Bots | Googlebot, Bingbot, Yandex, Baidu | User agent, IP ranges |
| Headless Browsers | Puppeteer, Playwright, Selenium | WebDriver, JS checks |
| Automation Tools | PhantomJS, SlimerJS, HtmlUnit | Environment fingerprint |
| Scrapers | HTTrack, Wget, Curl | User agent, headers |
| Spy Tools | AdSpy, BigSpy, AdPlexity | IP ranges, patterns |
| Ad Verification | DoubleVerify, IAS, Moat | IP ranges, UA patterns |
Bot Detection Levels
| Level | Description | Recommended For |
|---|---|---|
| Off | No bot detection | Testing only |
| Low | Only obvious bots (search crawlers) | Minimal filtering |
| Medium | Standard detection (recommended) | Most campaigns |
| High | Aggressive detection, may catch edge cases | High-risk campaigns |
| Paranoid | Maximum filtering, strictest rules | Very sensitive campaigns |
"Paranoid" mode may filter some legitimate users with unusual browser configurations. Monitor your traffic quality when using this setting.
Fingerprinting
Fingerprinting creates unique identifiers for each visitor based on their device and browser characteristics.
TCP/IP Fingerprinting
Analyzes the TCP/IP stack to identify the true operating system:
- TTL (Time to Live) - Different OSes use different default values
- Window Size - TCP window size patterns
- TCP Options - Options order and values
- MTU - Maximum transmission unit
If a visitor claims to use Windows but their TCP/IP stack shows Linux, they're likely spoofing their user agent (common for bots).
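The TTL check above can be sketched in a few lines. This is a simplified illustration of the principle (p0f-style passive fingerprinting also weighs window size, TCP options order, and MTU); the OS labels map to common default initial TTLs.

```python
# Common default initial TTL values by OS family.
DEFAULT_TTLS = {64: "Linux/macOS", 128: "Windows", 255: "Cisco/Solaris"}

def guess_os_from_ttl(observed_ttl: int) -> str:
    """Round the observed TTL up to the nearest common initial value.
    Each router hop decrements TTL by one, so an observed TTL of 117
    implies an initial TTL of 128 (Windows) after 11 hops."""
    for initial in sorted(DEFAULT_TTLS):
        if observed_ttl <= initial:
            return DEFAULT_TTLS[initial]
    return "unknown"

def ua_matches_stack(claimed_os: str, observed_ttl: int) -> bool:
    """False when the claimed OS contradicts the TCP/IP stack."""
    return claimed_os.lower() in guess_os_from_ttl(observed_ttl).lower()
```

For example, a user agent claiming Windows arriving with an observed TTL of 57 (initial TTL 64, a Linux default) is a strong spoofing signal.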
TLS/JA3 Fingerprinting
Analyzes the TLS handshake to identify the client type:
| Component | What It Tells Us |
|---|---|
| Cipher Suites | Which encryption methods the client supports |
| Extensions | TLS extensions requested |
| Curves | Elliptic curves supported |
| Point Formats | EC point format support |
The JA3 hash is a 32-character MD5 fingerprint that stays consistent for a given client application, which allows us to identify:
- Headless browsers (Puppeteer, Playwright)
- Automation frameworks (Selenium)
- Programming language HTTP clients (Python requests, Node.js axios)
- Real browsers (Chrome, Firefox, Safari)
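Per the open JA3 specification, the hash is the MD5 of five ClientHello fields written as decimal values, joined by dashes within a field and commas between fields. A minimal sketch:

```python
import hashlib

def ja3_hash(version: int, ciphers: list, extensions: list,
             curves: list, point_formats: list) -> str:
    """Build the JA3 string (fields comma-separated, values within a
    field dash-separated) and return its 32-character MD5 hex digest."""
    fields = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    return hashlib.md5(",".join(fields).encode()).hexdigest()
```

Two clients presenting identical handshake parameters produce identical hashes, which is why a Python `requests` script can never masquerade as Chrome at this layer: their TLS stacks negotiate different cipher and extension lists.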
JavaScript Fingerprinting
Collects 1500+ browser attributes including:
| Category | Attributes Collected |
|---|---|
| Canvas | Unique rendering patterns from drawing operations |
| WebGL | GPU vendor, renderer, supported extensions |
| Audio | AudioContext fingerprint |
| Fonts | Installed fonts list |
| Screen | Resolution, color depth, pixel ratio |
| Hardware | CPU cores, memory, touch support |
| Browser | Plugins, languages, timezone |
Behavioral Analysis
Analyzes how visitors interact with the page to detect non-human behavior.
Mouse Analysis
- Movement Entropy - Real users have random, curved movements; bots move in straight lines
- Speed Variance - Humans vary their speed; bots are consistent
- Click Patterns - Natural click timing vs robotic patterns
Scroll Analysis
- Scroll Speed - Natural vs programmatic scrolling
- Direction Changes - Humans change direction; bots scroll continuously
- Scroll Depth - How far down the page visitors scroll
Keyboard Analysis
- Keystroke Timing - Intervals between key presses
- Typing Speed - Variance in typing speed
- Error Patterns - Humans make mistakes; bots don't
Behavioral analysis requires our JavaScript snippet on your landing page. Add it to collect behavioral data for enhanced detection.
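To make the movement-entropy idea concrete, here is a simplified sketch of one such heuristic: the ratio of straight-line distance to total path length for a recorded mouse path. The function is illustrative, not CloakRadar's actual behavioral model.

```python
import math

def path_linearity(points: list) -> float:
    """Ratio of straight-line distance to total path length for a list
    of (x, y) mouse positions. Values near 1.0 mean a near-perfectly
    straight path (bot-like); human paths curve, giving lower values."""
    if len(points) < 2:
        return 1.0
    total = sum(math.dist(points[i], points[i + 1])
                for i in range(len(points) - 1))
    direct = math.dist(points[0], points[-1])
    return direct / total if total else 1.0
```

A script that moves the cursor directly to a button scores exactly 1.0; a real user's curved, hesitating path scores noticeably lower.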
Geographic Filtering
Filter traffic based on location.
Country Filter
Allow or block specific countries:
```
Mode: Whitelist
Countries: United States, Canada, United Kingdom
Result: Only visitors from US, CA, UK see your offer
```
Region/City Filter
For more granular control:
- Filter by state/province
- Filter by city
- Combine with country for precise targeting
Match your geo filters to your ad targeting. If you're only running ads in the US, only allow US traffic.
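Internally a country whitelist reduces to a set-membership check on the visitor's geolocated country code. A minimal sketch, assuming ISO 3166-1 alpha-2 codes (note the United Kingdom's code is `GB`):

```python
# Whitelist matching the US/CA/UK example above, as ISO alpha-2 codes.
ALLOWED_COUNTRIES = {"US", "CA", "GB"}

def geo_allowed(country_code: str) -> bool:
    """True when the IP-geolocated country code is whitelisted."""
    return country_code.upper() in ALLOWED_COUNTRIES
```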
Device Filtering
Device Type
- Mobile - Smartphones
- Tablet - iPads, Android tablets
- Desktop - Windows, Mac, Linux computers
Operating System
- Windows (all versions)
- macOS
- iOS
- Android
- Linux
- Chrome OS
Browser
- Chrome
- Firefox
- Safari
- Edge
- Opera
- Samsung Browser
IP & Network Filtering
VPN Detection
Detects 25+ VPN providers including:
- NordVPN, ExpressVPN, Surfshark
- CyberGhost, PIA, IPVanish
- Windscribe, ProtonVPN, and more
Proxy Detection
Detects various proxy types:
- HTTP/HTTPS proxies
- SOCKS proxies
- Anonymous proxies
- Web-based proxies
Datacenter Detection
Blocks 30+ datacenter/hosting provider ASNs:
- AWS, Google Cloud, Azure
- DigitalOcean, Linode, Vultr
- OVH, Hetzner, Cloudflare
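Datacenter filtering boils down to checking the visitor's ASN against a blocklist. An illustrative sketch; the ASNs below are real allocations for some of the providers named above, but a production list would be far longer and kept up to date:

```python
# Primary ASNs for a few well-known hosting providers (illustrative subset).
DATACENTER_ASNS = {
    16509,  # Amazon AWS
    15169,  # Google
    8075,   # Microsoft Azure
    14061,  # DigitalOcean
    63949,  # Linode (Akamai)
    20473,  # Vultr
    16276,  # OVH
    24940,  # Hetzner
    13335,  # Cloudflare
}

def is_datacenter(asn: int) -> bool:
    """True when the visitor's IP belongs to a known hosting ASN."""
    return asn in DATACENTER_ASNS
```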
Tor Detection
Blocks Tor exit nodes using regularly updated lists.
WebRTC Leak Detection
Uses WebRTC requests to reveal a visitor's real IP address even when it is hidden behind a VPN.
Custom Rules
Referrer Filter
Allow or block based on referring URL:
```
Mode: Whitelist
Referrers: facebook.com, fb.com
Result: Only traffic from Facebook is allowed
```
User Agent Filter
Block specific user agent patterns using regex:
```
Pattern: .*curl.*|.*wget.*|.*python.*
Action: Block
Result: Blocks common scraping tools
```
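The pattern above behaves like a standard case-insensitive regex search against the User-Agent header. A minimal sketch of the equivalent check:

```python
import re

# The example pattern from above. The leading/trailing .* are redundant
# with a search (rather than full match) but kept to mirror the config.
BLOCK_UA = re.compile(r".*curl.*|.*wget.*|.*python.*", re.IGNORECASE)

def ua_blocked(user_agent: str) -> bool:
    """True when the User-Agent matches the configured block pattern."""
    return BLOCK_UA.search(user_agent) is not None
```

Note that `python` also matches legitimate-looking strings such as `python-requests/2.31.0`, which is exactly the intent here.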
URL Parameter Filter
Filter based on URL parameters:
```
Parameter: source
Value: facebook
Action: Allow only
Result: Only clicks with ?source=facebook are allowed
```
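The rule above amounts to parsing the landing-page URL's query string and requiring an exact parameter match. A minimal sketch:

```python
from urllib.parse import urlparse, parse_qs

def param_allowed(url: str, name: str = "source", value: str = "facebook") -> bool:
    """Allow only URLs whose query string carries name=value,
    mirroring the ?source=facebook example above."""
    params = parse_qs(urlparse(url).query)
    return value in params.get(name, [])
```

Clicks missing the parameter entirely, or carrying a different value, are filtered out.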
ISP Filter
Block or allow specific ISPs/organizations.
Detection Levels
Quick configuration presets for common scenarios:
| Preset | Best For | Filters Enabled |
|---|---|---|
| Light | High-quality traffic sources | Bot detection (Low), Geo filter |
| Standard | Most campaigns | Bot (Medium), VPN, Datacenter, Geo |
| Strict | Sensitive offers | All filters, Bot (High), Fingerprinting |
| Maximum | Highest protection needed | All filters (Paranoid), Behavioral |
Start with Standard settings and adjust based on your traffic quality reports. If you see too many bots getting through, increase detection. If legitimate users are being blocked, decrease it.
Next: Learn how to read your data in Analytics & Reports.