Tor traffic is a small slice of the abuse you'll see — typically 0.3% to 1.5% of total visits — but it punches above its weight on fraud-per-visit. Here's how Tor exit detection actually works, why static blocklists are useless, and how to act on the signal without false-positiving people who use Tor for legitimate privacy reasons.
Tor's exit-node list is published openly. check.torproject.org/torbulkexitlist returns the current set, refreshed continuously. So in principle, detection is a one-line lookup against a pre-fetched file. In practice, almost every implementation that does this is wrong.
Why static Tor lists are useless
The Tor network has roughly 1,500–2,000 exit relays at any given moment. Half of them rotate IPs within 24 hours. Several hundred only stay online for a few hours per day. The list at check.torproject.org is a snapshot — accurate at the moment you fetch it, stale within hours.
Most engineering teams set up a daily cron to fetch the list. By the time the next fetch runs, ~12% of the IPs in the file are no longer exit nodes (they've gone offline, rotated, or shifted to a different role inside Tor), and a comparable number of new exits have come online that aren't in the file. So a daily-cached list is wrong about roughly a quarter of the relevant traffic.
Hourly is better. Real-time is better still. The Tor consensus is published every hour at https://collector.torproject.org/recent/relay-descriptors/consensuses/, but the parser is non-trivial and most people get the descriptor format wrong on first attempt.
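If you do want to run the lookup yourself, the simpler hourly option looks like this. A minimal Node sketch: the torbulkexitlist URL is real, but the refresh cadence, error handling, and `parseExitList` helper are illustrative choices, not a prescribed implementation.

```javascript
// Keep an in-memory set of Tor exit IPs fresh by re-fetching hourly.
const EXIT_LIST_URL = 'https://check.torproject.org/torbulkexitlist';

let torExits = new Set();

// One IP per line; skip blanks and comment lines.
function parseExitList(text) {
  return new Set(
    text.split('\n')
      .map(line => line.trim())
      .filter(line => line && !line.startsWith('#'))
  );
}

async function refreshExitList() {
  const res = await fetch(EXIT_LIST_URL);
  if (!res.ok) return; // on failure, keep serving the previous (stale) set
  torExits = parseExitList(await res.text());
}

// Refresh hourly; call refreshExitList() once at startup as well.
setInterval(refreshExitList, 60 * 60 * 1000).unref();

const isTorExit = ip => torExits.has(ip);
```

Even done correctly, this only catches listed exits — which is exactly the gap the next section is about.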
Beyond the list: catching Tor traffic the lists miss
A surprising amount of Tor-using traffic comes from bridges, not standard exit nodes. Bridges are unlisted relays specifically designed to evade IP-based blocking — they're how Tor users in censored countries get through national firewalls. By design they don't appear on any public list.
Detecting bridge-routed Tor traffic requires moving past IP lookup into protocol-layer signals:
1. TLS / JA3 fingerprint
The Tor Browser is a Firefox fork with deliberate fingerprint normalisation. Its TLS handshake signature is one of a handful of distinctive JA3/JA4 hashes. If you see an inbound JA3 that matches the Tor Browser specifically, the traffic is from a Tor user — regardless of whether the IP is on any list.
2. Browser fingerprint flatness
Tor Browser intentionally rounds and lies about screen resolution, timezone, language, and plugin list. Most users land on the same handful of canonical fingerprints. A "1000×1000 viewport at UTC with English-only Accept-Language and no installed fonts beyond the system defaults" is one of them. Real Firefox users are not this homogeneous.
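One way to operationalise "flatness" is to score how many attributes land on the canonical Tor Browser profile. A sketch — the attribute names, the canonical values, and the 0.75 threshold are illustrative assumptions, not Sentinel's internals:

```javascript
// Score a fingerprint against the flattened Tor Browser profile.
const TOR_CANONICAL = {
  timezone: 'UTC',
  languages: 'en-US,en',
  extraFonts: 0,
};

function torFlatnessScore(fp) {
  const checks = [
    fp.timezone === TOR_CANONICAL.timezone,
    fp.languages === TOR_CANONICAL.languages,
    // Tor Browser rounds the viewport, so real sizes land on round numbers
    fp.viewportWidth % 100 === 0 && fp.viewportHeight % 100 === 0,
    fp.extraFonts === TOR_CANONICAL.extraFonts,
  ];
  return checks.filter(Boolean).length / checks.length;
}

const looksTorShaped = fp => torFlatnessScore(fp) >= 0.75;
```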
3. HTTP-header ordering
Tor Browser sends headers in a normalised order that differs from upstream Firefox. The header set is also stripped — no DNT permutations, no Client Hints, no Sec-CH-UA. A request with that exact header signature, even from a residential IP, is almost always Tor.
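Checking this amounts to comparing the relative order of a request's header names against a reference ordering, plus checking for the headers that should be absent. The reference list below is a hypothetical example — capture the real ordering from your own Tor Browser traffic samples:

```javascript
// Hypothetical reference ordering — replace with values observed in your traffic.
const REFERENCE_ORDER = [
  'host', 'user-agent', 'accept', 'accept-language',
  'accept-encoding', 'connection', 'upgrade-insecure-requests',
];

// True if the headers the request shares with the reference appear
// in the same relative order.
function headerOrderMatches(headerNames, reference = REFERENCE_ORDER) {
  const seen = headerNames.map(h => h.toLowerCase()).filter(h => reference.includes(h));
  const expected = reference.filter(h => seen.includes(h));
  return seen.join(',') === expected.join(',');
}

// Tor Browser strips Client Hints, so their presence rules it out.
const hasClientHints = names => names.some(h => h.toLowerCase().startsWith('sec-ch-'));
```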
4. Behavioural mismatch
Real visitors have a session story: page-1 dwell, scroll, navigation, conversion or abandonment. Tor Browser users dropping in for fraud have an unusually short, unusually purposeful path — straight to login or signup, no exploration. By itself this is weak; combined with a Tor-shaped fingerprint it's enough.
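The "short, purposeful path" heuristic can be sketched as a session check. The route list, dwell threshold, and page-count cutoff here are illustrative numbers, and on its own this should only ever contribute to a combined score:

```javascript
// Flag sessions that go almost straight to a sensitive route
// with very little dwell time and no exploration.
const SENSITIVE_ROUTES = new Set(['/login', '/signup']);

function shortPurposefulPath(session) {
  const { pages, totalDwellMs } = session; // pages: ordered array of visited paths
  const hitSensitive = pages.some(p => SENSITIVE_ROUTES.has(p));
  return hitSensitive && pages.length <= 2 && totalDwellMs < 10000;
}
```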
What policy makes sense?
This is the part most teams get wrong. Tor is used legitimately by journalists, activists, security researchers, abuse-survivors looking for help, and people in countries with surveillance that makes plain-text browsing unsafe. Blanket-blocking Tor at the network layer punishes them. It also visibly signals to attackers that you're a soft target who responded to a single signal.
Better policy patterns:
- Allow read-only access from Tor. Browsing your marketing site, blog, and docs from Tor should work. The fraud risk there is negligible.
- Step up auth, don't block, on signup and login. A Tor visitor signing up should clear an extra friction layer (email + phone, or device-bound passkey) rather than be rejected outright. This filters spam without locking out legitimate privacy users.
- Block Tor on payment, withdrawal, KYC. Tor for cash-out is almost always fraud. Tor for browsing is usually fine.
- Use Tor as one signal of N, never on its own. Tor + new account + datacenter card BIN + tampered browser fingerprint is a clear block. Tor alone, on a 12-month-old account in good standing, is not.
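The policy table above collapses into a single decision function. A sketch — the action classes, the tampering threshold, and the step_up mechanism are illustrative choices, not a prescribed design:

```javascript
// Map (action, signals) to allow / step_up / block per the policy above.
function torPolicy(action, signals) {
  if (!signals.isTor) return 'allow';
  switch (action) {
    case 'read':       return 'allow';    // marketing site, blog, docs
    case 'signup':
    case 'login':      return 'step_up';  // extra friction, not a rejection
    case 'payment':
    case 'withdrawal':
    case 'kyc':        return 'block';    // cash-out over Tor is fraud-shaped
    default:
      // Tor alone is never enough; require a corroborating signal.
      return signals.browserTampering > 0.5 ? 'step_up' : 'allow';
  }
}
```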
How Sentinel handles Tor
Sentinel's /v1/evaluate response includes:
```json
{
  "isSuspicious": true,
  "details": {
    "isTor": true,
    "torExitNode": "9aa1bef...",
    "torConfidence": 0.97,
    "vpn": false,
    "proxied": false,
    "ipScore": 0.91,
    "browserTampering": 0.62
  }
}
```
The exit-node identifier is the public relay nickname/fingerprint, useful for logs and audit trails. torConfidence reflects how strongly the signal is backed up by IP-list match plus protocol-layer fingerprint. A confidence above 0.9 means both the IP and the JA3/header signature match — it's not just an IP-list hit. Below 0.5, you have a soft positive that should not gate a critical action on its own.
One-line integration
Same as any other Sentinel call — there's no separate Tor endpoint:
```javascript
const verdict = await fetch('https://sntlhq.com/v1/evaluate', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer sk_live_YOUR_KEY',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({ token: req.body.sentinelToken })
}).then(r => r.json());

const d = verdict.details;
if (d.isTor && d.torConfidence > 0.85 && isHighRiskAction(req.path)) {
  return res.status(403).json({ error: 'tor_blocked_for_this_action' });
}
```
The full IP/Tor/proxy/bot signal returns in under 40ms, well below the point where users notice added latency.
What about VPN traffic?
VPN traffic is a separate signal — see our VPN detection API guide. The detection mechanics are similar in shape (IP enrichment + protocol fingerprint) but the populations are very different. VPN users are a much larger, more heterogeneous group, and the policy decisions are different. Don't conflate the two.
Tor is also distinct from residential proxy networks, which are the harder-to-detect class — Tor advertises itself, residential proxies hide.
Try it
Free API key at sntlhq.com/signup. The Tor signal is included in the standard response — no extra endpoint, no extra cost. 1,000 requests/hour on the free tier, no card. Node SDK on npm at @sentinelsup/sdk; REST docs for any other language.