Live · monitoring AI bot traffic right now
  • GPTBot
  • ClaudeBot
  • Perplexity
  • Google-Extended
  • OAI-SearchBot

Know which AI crawlers are training on your content.

Verified against official IP ranges — the fakes don't make it through.

Every GPTBot, ClaudeBot, OAI-SearchBot and Perplexity hit — page by page. Impostors spoofing those UAs from random IPs get flagged for blocking.

Real-time. Not tomorrow morning's report.

See what your site is doing right now — not yesterday.

It's like Google Analytics for log files.

Real-time logs from Cloudflare, Vercel or CloudFront — tied to your sitemap, GSC and robots.txt so you know exactly what Googlebot, ChatGPT and every bot is doing to your site.

Real Googlebot vs. UA-spoofing scraper.

Stop trusting the User-Agent header.

Every bot verified against its provider's published IP ranges.

Real Googlebot, Bingbot, GPTBot — verified. Scrapers borrowing their UAs from datacentre IPs — flagged for blocking. 200+ bots covered.

Attack probes spotted before your origin starts to smoke.

Catch the attack before the 5xx alert fires.

Hacking probes, scanner UAs, exposed secrets — in minutes.

Sqlmap, nuclei, /.env probes, SQLi patterns — surfaced with named source IPs, so you block at the WAF before a probe becomes a breach.

Search Console takes 48 hours. We take five minutes.

Spot a deindexing risk the same hour.

Live alerts when Googlebot starts hitting errors.

404s after a deploy. 5xx on key product pages. Crawl frequency dropping on a section. Surfaced in real time — before the rankings notice.

No tracking pixel. No JS snippet. No new agent.

Use the logs your CDN already produces.

Cloudflare. Vercel. CloudFront. One stream, every signal.

Skip JS-snippet bloat and analytics-vendor sampling. We read your CDN logs in real time — bots, SEO, security, performance — from one source.

Alerts that name names — IPs, paths, crawlers.

Alerts that tell you what to do.

Every alert ships with an AI triage: source, paths, action.

When 5xxs spike or scrapers hit, your inbox names the IP, the UA, the URLs, and what to block. Not "something happened — check your logs".

Botify-class data, without the Botify price tag.

Enterprise log analysis, from £19/mo.

Same crawl-budget data, sitemap correlation, SEO alerts.

All the crawler analytics legacy tools sell for $75k/yr. Multi-site, multi-seat, no per-consultant pricing. Free trial, no card needed.

Half of Googlebot's hits go to pages you don't want indexed.

See where Googlebot wastes your crawl budget.

Real content vs. /wp-admin vs. attack probes — per section.

When 60% of Googlebot's hits land on admin paths, redirect loops and 404 ghosts, real content starves. We show the split so you can fix robots.txt.

Every sitemap URL, every crawler hit, side by side.

Find the URLs Googlebot has never seen.

Live sitemap coverage — not Search Console's "discovered last month".

We cross-reference your sitemap.xml with live crawler activity. Sitemap URLs never crawled — flagged. Orphaned crawled URLs — flagged. The gap is the work.

We'll detect your CDN and tell you how to connect.

LogLens dashboard showing real-time traffic, bot, and AI-crawler analytics with site-health metrics and a 24-hour traffic chart
The dashboard. Live traffic, bot share, AI crawler share, and site health — at a glance.

What LogLens does

Four jobs every customer hires LogLens for. All four run from the same real-time log stream — no extra setup per use case.

Detect deindexing risk

Spot Googlebot 404s, redirect storms and crawl-rate drops the hour they happen — not 48 hours later in Search Console.

5-minute alerts vs. ~24h batch

Verify AI bots

GPTBot, ClaudeBot, Perplexity, OAI-SearchBot — every claim checked against the official IP ranges. Impostors flagged for blocking.

200+ bots, IP-verified

Catch attack probes early

Sqlmap, nuclei, /.env probes, exposed-secret patterns — surfaced with named source IPs before a probe becomes a breach.

Caught in the first 5-min window

Audit crawl-budget waste

See where Googlebot spends its time. /wp-admin hits, redirect loops, 404 ghosts — broken down by section so you can fix robots.txt with intent.

Per-URL-category breakdown

Who is LogLens for?

Six user types we built LogLens around. Each has a specific problem existing tools don't solve in time.

The technical SEO

In-house SEO lead at a growing content or SaaS site

"I deployed sitemap changes yesterday. No idea if Googlebot has seen them. Search Console won't tell me for 48 hours."

Real scenario: Friday 4pm URL-structure deploy. By Saturday 8am LogLens alerts: Googlebot hit 1,247 of your old URLs and got 404s because the redirect rules missed a pattern. You fix it before Monday traffic. Search Console wouldn't have shown you until Tuesday — and by then Google has already started deindexing.

Live logs · sitemap cross-analysis · alerts the moment Googlebot hits a broken URL

The SEO agency

Technical-SEO agency managing 20–200 client sites

"Screaming Frog seats × every client × every consultant doesn't scale. Botify starts at $75,000/yr."

Real scenario: You manage 50 client sites. One LogLens account, every consultant on the team has access, every client gets a shared read-only dashboard link. When something breaks on any client's site, the team sees it in real time — not when the consultant remembers to re-run a desktop log analysis next month.

Multi-tenant · unlimited team seats · shared client dashboards · one invoice

The AI-GEO specialist

New 2026 role — "SEO, but for AI answers"

"I need to know if GPTBot, ClaudeBot and OAI-SearchBot are actually reading my content. Are they? Which pages? Is my crawl-to-refer ratio good or bad?"

Real scenario: GPTBot fetched 847 of your product pages last week, verified against OpenAI's official IP ranges. Your /pricing page is being crawled 3× more than competitors'. Your ClaudeBot activity dropped 40% last Wednesday — LogLens shows the timing coincides with a robots.txt update. You revert the change the same day.

AI crawler verification · per-bot breakdown · crawl-to-refer tracking · robots.txt change correlation
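
For the technically curious, the robots.txt correlation check has a simple shape. A minimal TypeScript sketch (illustrative, not LogLens's production code): fetch on a schedule, hash the body, and pin any change to a window you can overlay on the per-crawler chart.

// Minimal robots.txt change watcher. Fetch on a schedule, hash the body,
// and record when it changes so a later crawl-rate drop can be matched to
// the exact change window.
import { createHash } from "node:crypto";

const seen = new Map<string, string>(); // site -> last body hash

async function checkRobots(site: string): Promise<void> {
  const res = await fetch(`https://${site}/robots.txt`);
  const body = await res.text();
  const hash = createHash("sha256").update(body).digest("hex");

  const prev = seen.get(site);
  if (prev && prev !== hash) {
    // A real system would store the diff and timestamp, then overlay it
    // on the per-crawler request chart to spot correlated drops.
    console.log(`[${new Date().toISOString()}] robots.txt changed on ${site}`);
  }
  seen.set(site, hash);
}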

The developer / DevOps lead

Engineering at a high-traffic site or web app

"Attack traffic hammered /wp-login.php yesterday and took the site down. Our infra alerts saw 'high CPU', not 'attack in progress'. SEO tooling missed it because the traffic was tagged 'human'."

Real scenario from a LogLens customer: A single datacentre IP sent 114,838 POSTs to /blog//wp-login.php in six hours, rotating 20+ fake browser UAs to evade detection. Traditional monitoring saw only "high CPU load". LogLens alerted within the first hour with the attacker IP named, the attack path identified, and a "scraper rotating UAs" label. Rate-limited at the CDN before the on-call engineer had their first coffee.

Attack-path classifier · absolute-rate floor · per-IP + UA analysis · real-time alerts
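
The "scraper rotating UAs" label comes from a heuristic in this spirit, sketched below with illustrative thresholds rather than LogLens's exact values: one IP, high volume, many distinct user-agents.

// One IP sending high volume under many distinct user-agents is almost
// never a browser. Thresholds here are examples.
interface LogLine { ip: string; userAgent: string }

function flagRotatingScrapers(
  lines: LogLine[],
  minRequests = 1000,   // absolute-rate floor for the window
  minDistinctUAs = 10,  // browsers don't rotate UAs; scrapers do
): string[] {
  const byIp = new Map<string, { count: number; uas: Set<string> }>();
  for (const { ip, userAgent } of lines) {
    const entry = byIp.get(ip) ?? { count: 0, uas: new Set<string>() };
    entry.count++;
    entry.uas.add(userAgent);
    byIp.set(ip, entry);
  }
  return [...byIp]
    .filter(([, v]) => v.count >= minRequests && v.uas.size >= minDistinctUAs)
    .map(([ip]) => ip);
}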

The e-commerce manager

SEO / ops at a Shopify Plus, BigCommerce, or Vercel Commerce store

"My platform doesn't give me server logs. Google Search Console is always 2–3 days behind. I'm flying blind on what crawlers are doing."

Real scenario: Black Friday prep pages launched Tuesday. By Friday morning LogLens shows Googlebot hadn't touched them — a leftover Disallow: /collections/holiday-* in robots.txt from last year's campaign. You fix it with 10 days to traffic week. With batch-based SEO tools, you'd have found out in Search Console after the sale.

Hydrogen / Vercel Log Drain receiver · Cloudflare integration · sitemap + robots.txt diff monitoring

The security-conscious SEO

SEO and web-ops lead at a site routinely crawled by scrapers and bot impersonators

"Everyone claims to be Googlebot. Half of them aren't. My analytics tool can't tell the difference. Meanwhile the real Googlebot is getting rate-limited by my WAF because it can't distinguish either."

Real scenario: LogLens verifies every claimed bot against its official published IP ranges. Fake Googlebots get flagged bot_verified: false, status: unverified. You set alerts on unverified-bot surges. Your WAF whitelists real Googlebot IPs (from LogLens's verified list) so legitimate crawl isn't throttled.

Bot IP verification · official range matching · impersonation detection · verified-bot allowlist
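
Verification itself is simple once you have the published ranges. A minimal IPv4 sketch (a real implementation also handles IPv6 and refreshes the range lists):

// Providers publish their crawler ranges as CIDR blocks, e.g. Google's
// googlebot.json. A hit is "verified" only if the claiming IP falls inside one.
function ipToInt(ip: string): number {
  return ip.split(".").reduce((acc, octet) => (acc << 8) | parseInt(octet, 10), 0) >>> 0;
}

function inCidr(ip: string, cidr: string): boolean {
  const [base, bitsStr] = cidr.split("/");
  const bits = parseInt(bitsStr, 10);
  const mask = bits === 0 ? 0 : (~0 << (32 - bits)) >>> 0;
  return (ipToInt(ip) & mask) === (ipToInt(base) & mask);
}

function verifyBot(claimedIp: string, officialRanges: string[]): boolean {
  return officialRanges.some((cidr) => inCidr(claimedIp, cidr));
}

// verifyBot("66.249.66.1", ["66.249.64.0/19"]) -> true (a real Googlebot range)
// verifyBot("45.86.12.9",  ["66.249.64.0/19"]) -> false: flag as impersonator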

Why live data changes everything

Most SEO log tools take yesterday's logs, load them overnight, and give you a report next morning. That means every regression has 12–48 hours to hurt you before anyone notices. LogLens sees events as they happen, classifies them in-flight, and alerts within five minutes.

Batch log tools (most competitors)

"Upload yesterday's logs. Report by tomorrow morning."

Logs exported daily from your CDN, uploaded to the analyzer, processed overnight, reported on in the morning. By then the damage has been done.

  • Monday 14:00 Deploy accidentally breaks product URL structure
  • Monday 14:00–23:59 Googlebot hits 15,000 404s. No one knows.
  • Tuesday 02:00 Logs exported for the day
  • Tuesday 09:00 Analyzer report generated
  • Tuesday 10:00 Someone reads the report
  • Tuesday 11:00 Fix deployed
  • Total blast window: 21 hours
LogLens — real-time

"Stream from CDN. Alert in 5 minutes."

Cloudflare, Vercel, or CloudFront logs stream into LogLens as they happen. Every request is classified, crawler-verified, and anomaly-scored in-flight. Alerts fire the moment something looks off.

  • Monday 14:00 Same deploy breaks URLs
  • Monday 14:03 First Googlebot 404s land in LogLens
  • Monday 14:08 Error-spike alert fires with specific URLs
  • Monday 14:15 Fix deployed before the crawler returns
  • Total blast window: 15 minutes

This is not hypothetical. A LogLens customer — a UK comparison site — had a brute-force attack against /blog//wp-login.php from a single datacentre IP pushing 114,838 hits. LogLens caught it within one 5-minute window. A batch tool would have reported it the next morning, long after the attacker had already taken the site down.

LogLens Alerts page showing the baseline-intelligence heatmap that detects anomalies across requests, errors, bots, crawlers and latency
Alerts page. Every metric continuously baselined against itself — anomalies surface in minutes, not the next morning.
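
"Baselined against itself" means something like the sketch below: compare each 5-minute bucket to a rolling window of its own recent history and score the deviation. Window size and threshold here are illustrative.

// Compare the current 5-minute bucket against a rolling window of recent
// buckets and flag when it deviates by more than k sigma.
function anomalyScore(history: number[], current: number): number {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance =
    history.reduce((a, b) => a + (b - mean) ** 2, 0) / history.length;
  const std = Math.sqrt(variance) || 1; // avoid divide-by-zero on flat series
  return (current - mean) / std;        // z-score: how unusual is "now"?
}

// e.g. requests per 5-minute bucket over the last two hours:
const recent = [310, 295, 322, 301, 289, 315, 330, 298];
if (Math.abs(anomalyScore(recent, 2140)) > 3) {
  console.log("Anomaly: traffic far outside its own baseline");
}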

The only SEO log tool that ties everything together

Logs tell you what actually happened. Sitemaps tell you what should have happened. Search Console tells you what Google saw. robots.txt tells you what you allowed. Most tools have one of these. LogLens has all four, tied together in real time.

Logs

What actually happened on the origin. Every request, every bot, every status code.

Sitemap

What you want crawled. Fetched and diffed daily, changes detected automatically.

Search Console

What Google actually shows in search. Pulled via the official API.

robots.txt

What you allowed. Change-tracked so you can correlate drops to policy edits.

Questions only LogLens can answer

  • Which URLs in my sitemap has Googlebot never crawled in the last 30 days? (Sketched in code below.)
  • Which URLs does Google show in search but Googlebot hasn't crawled in 90 days? (Likely deindexing soon.)
  • Did last Tuesday's robots.txt change cause Googlebot's drop in visits? Show me the exact hour it stopped.
  • What's my crawl-to-refer ratio for each AI crawler? Is ChatGPT citing my content proportional to how much it crawls?
  • Which URLs are burning crawl budget with 404s or redirects when they should be 200s?
  • Is the Googlebot I'm seeing actually Googlebot — or a scraper spoofing its user-agent?
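
The first question on that list reduces to a set difference. The regex-based sitemap parse below is illustrative (real sitemaps can be nested indexes), but the cross-reference is the core of it:

// Parse <loc> entries out of sitemap.xml and diff them against the set of
// URLs that real, verified Googlebot hits have touched in the window.
async function sitemapGaps(
  sitemapUrl: string,
  crawledUrls: Set<string>, // URLs hit by verified Googlebot, from the logs
): Promise<{ neverCrawled: string[]; orphanCrawled: string[] }> {
  const xml = await (await fetch(sitemapUrl)).text();
  const sitemapUrls = new Set(
    [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1].trim()),
  );
  return {
    // In the sitemap but never seen by the crawler: indexing risk.
    neverCrawled: [...sitemapUrls].filter((u) => !crawledUrls.has(u)),
    // Crawled but not in the sitemap: orphans burning crawl budget.
    orphanCrawled: [...crawledUrls].filter((u) => !sitemapUrls.has(u)),
  };
}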
LogLens SEO Overview showing crawler requests, pages crawled, daily crawl rate and crawl-health metrics with a 7-day crawler activity chart
SEO Overview. Crawler requests, pages crawled, crawl rate, and crawl-budget waste — sourced from real Googlebot hits, not estimated from rank.

Crawl budget monitoring


Stop wasting Googlebot's time on pages you don't care about

Every site has a finite amount of time Googlebot is willing to spend. If 40% of that goes to 404s, redirects, and attack probes, only 60% lands on the pages you actually want indexed.

LogLens shows you — in real time — where Googlebot is spending its budget. Which URLs waste it. Which sections are over- or under-served. Whether that cheap redirect rule you added last month now accounts for 18% of all crawler hits.

Nothing else on the market shows you this. Screaming Frog shows you what you uploaded. Search Console shows you aggregates. Only LogLens shows you the behaviour, live, broken down by intent.

Googlebot · last 7 days · by URL category
/products/*    ████████████ 42% good
/content/*     ██████      21% good
404 / redirect ███████     24% wasted
/wp-admin/*    ███          8% attack
/search                    5% low-value
32% of Googlebot's time on your site is not going to indexable content.
Recovering even half of that would move ~140 URLs/day from unindexed to indexed.
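
A breakdown like that comes from classifying each crawler hit by path and status code. The rules below are illustrative examples, not LogLens's exact taxonomy:

type Category = "content" | "wasted" | "attack" | "low-value";

function classify(path: string, status: number): Category {
  if (/^\/(wp-admin|wp-login|xmlrpc)/.test(path)) return "attack";
  if (status === 404 || (status >= 300 && status < 400)) return "wasted";
  if (path.startsWith("/search")) return "low-value";
  return "content";
}

function budgetSplit(hits: { path: string; status: number }[]) {
  const counts: Record<Category, number> = { content: 0, wasted: 0, attack: 0, "low-value": 0 };
  for (const h of hits) counts[classify(h.path, h.status)]++;
  return counts; // divide by hits.length for the percentages shown above
}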

Alerts that name names


"Your site might be down" is not an alert

Every alerting tool can tell you something is wrong. Most of them can't tell you what is wrong or who is doing it. LogLens alerts name the specific IP, the specific crawler, the specific paths — so you know exactly what to do when one lands in your inbox.

Every alert runs an enrichment query against your log history and checks reachability. We won't tell you your site is down if a 200 HEAD probe says it isn't. We won't tell you about a redirect storm without naming the IP causing it.

Warning · Redirect Storm
yoursite.com · 70% redirects in last 15 min
76% came from a single IP45.86.xxx.xxx (Singapore datacentre) sent 1,508 requests using 20 different user-agents. That's a scraper rotating UAs to evade detection, not a site problem. The redirects are your apex→www canonicalisation working correctly. Consider rate-limiting this IP at your CDN.
Critical · Crawler Disappearance
yoursite.com · Search engine crawl dropped 96%
A major search engine has stopped crawling: Googlebot (234 → 8). Check recent robots.txt changes, deployments that may have broken pages, elevated 4xx/5xx in crawler-accessed paths, or Search Console crawl errors.
Info · Crawler Frequency Change
yoursite.com · AI crawler activity up 4.8×
GPTBot is driving this — 998 hits (81% of crawler traffic). Top crawled pages: /blog (245), /products/xyz (189), /pricing (87). An AI crawler stepping up ingestion — check robots.txt if you don't want your content used for training.
What we monitor for you

16 anomaly detectors on a 5-minute cron, plus a daily exposed-secret scan

Each one watches a different signal. Every alert names the IPs, paths, and crawlers responsible — and runs a reachability probe before sending so we don't tell you your site is down when it isn't.
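
The reachability check is deliberately boring, as the sketch below shows: if a direct HEAD probe of the origin answers, a traffic drop is an ingestion problem rather than an outage, and the alert is suppressed. Timeout and status handling are illustrative.

async function siteIsReachable(origin: string, timeoutMs = 5000): Promise<boolean> {
  try {
    const res = await fetch(origin, {
      method: "HEAD",
      signal: AbortSignal.timeout(timeoutMs),
      redirect: "follow",
    });
    return res.status < 500; // the origin answered; not an outage
  } catch {
    return false; // network error or timeout: treat as unreachable
  }
}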

SEO & crawlers

Critical
Crawler disappearance

Googlebot, Bingbot, or another major search bot has stopped crawling. Names which one + the rate change. Catches deindexing risks before Search Console does.

Info
Crawler frequency change

A search or AI crawler's activity changed by 3×+. Names the bot, top crawled pages, and whether it's verified against official IP ranges.

Warning
Crawler error spike

Search engines are hitting errors on your pages. Lists affected bots, status codes, and the specific URLs failing.

Warning
404 spike

Surge in 404 responses. Lists the top broken URLs, which are being hit by Googlebot/Bingbot, and how many unique IPs each.

Errors & performance

Critical
Server error spike (5xx)

Origin or backend failing. Lists the top failing endpoints with hit counts so engineers know what to deploy-fix.

Warning
Client error spike (4xx)

Surge in 4xx responses. Status-code breakdown shows whether it's 404, 401, 403, or 429. Suppresses noise from WAF blocks unless verified search bots are affected.

Critical
Traffic blackout

Traffic has dropped to ~zero. Runs a live HTTP probe of your homepage before alerting — so you only get woken up when the site really is down.

Warning
Latency degradation

Average response time has spiked. Lists the slowest URLs with average + max latency, weighted by traffic, so you know what to fix first.

Traffic anomalies

Warning
Massive traffic spike

Volume far above baseline. Lists the most-hit URLs — viral content concentrates on a few pages, attacks target endpoints like /wp-login.php.

Warning
Human traffic drop

Real-user traffic dropped while bot traffic stayed stable. Suggests routing, SEO, or UX issues affecting people but not crawlers.

Warning
Redirect storm

3xx responses surged. Names the top source IP + UA-rotation pattern (so you can tell scrapers apart from real config issues), plus the most-redirected URLs.

Info
Traffic pattern anomaly

Catch-all for unusual traffic that doesn't match a specific issue. Tightly throttled to avoid noise on small sites.

Bot abuse & security

Critical
Bot impersonation surge

Bots claiming to be Googlebot/Bingbot from unverified IP ranges. Lists the targeted URLs — usually a scraper trying to bypass robots.txt rules.

Info
Bot ratio shift

Mix of bot vs human traffic has changed materially. Often signals new crawler activity, scraping, or a CDN/WAF rule change.

Critical
Automated hacking probe

Scanner activity targeting known-vulnerable paths (/.env, /wp-config.php, /.git/config), known scanner UAs (sqlmap, nuclei, nikto), and SQLi / path-traversal / XSS / Log4Shell signatures. Promoted to critical if any probe gets a 200/3xx — meaning the resource may exist on your origin. (A sketch of representative signatures follows this section.)

Critical
Exposed secret in URL

Daily scan for tokens, JWTs, AWS access keys, Stripe live keys, and similar high-precision patterns appearing in your URLs. Once a secret lands in a URL it lives in CDN logs, browser history, and referer headers — assume it’s compromised and rotate.

Info
AI crawler activity

Sudden drop or surge in GPTBot, ClaudeBot, PerplexityBot, Google-Extended. Useful for tracking AI training exposure and the impact of robots.txt changes.

Cadence: every 5 minutes on Starter and above, every 30 minutes on Basic, every hour on Free. All alerts include enrichment context — the affected IPs, URLs, crawlers — so you know exactly what to do.
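
For reference, the hacking-probe detector above keys on signatures of this kind. A representative sample, not the full set:

const SCANNER_UAS = /\b(sqlmap|nuclei|nikto|wpscan|dirbuster)\b/i;
const PROBE_PATHS = /^\/(\.env|\.git\/config|wp-config\.php|\.aws\/credentials)/;
const INJECTION = /(union\s+select|\.\.\/\.\.\/|<script|%24%7Bjndi)/i; // SQLi, traversal, XSS, Log4Shell

function probeSeverity(req: { path: string; ua: string; status: number }): "critical" | "warning" | null {
  const isProbe =
    SCANNER_UAS.test(req.ua) || PROBE_PATHS.test(req.path) || INJECTION.test(req.path);
  if (!isProbe) return null;
  // Promoted to critical when the probe gets a 2xx/3xx: the resource may exist.
  return req.status < 400 ? "critical" : "warning";
}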

How many requests do you have?

Our pricing is based on requests, not visitors — because logs see everything: every page, every image, every JS file, every bot. Drop in your monthly visitors and we'll give you a conservative estimate.

Estimated requests / month: ~2M
50,000 monthly visitors × 2 pages/visit × 15 requests/page × 1.33 (25% bot share) ≈ 2M
Recommended plan
Basic $19/mo
2M requests/month · 120 days retention · email + Slack alerts.
Start 14-day free trial

Why visitors ≠ requests. A single visitor loading one page generates dozens of requests — the HTML, plus every CSS file, image, JavaScript bundle, font, and any AJAX/API calls the page fires. On a typical content site, one page view is around 25 requests; on a heavier ecommerce or app surface, it can be 50+.

And then there's bot traffic. On most public-facing sites, 30–60% of all requests come from search crawlers, AI bots, monitoring tools, scrapers, and feed readers. Logs see every one of them — that's the point.

Don't overthink the estimate. Once you sign up, the free 14-day trial runs against your real traffic. After about a week we'll have enough data to recommend the exact plan that matches your actual volume — and you can switch (up or down) any time. We also offer opt-in overage protection so you never get a surprise bill if a viral spike or a scraper hits.
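
The calculator's arithmetic, spelled out as code. Defaults match the conservative assumptions above:

function estimateMonthlyRequests(
  monthlyVisitors: number,
  pagesPerVisit = 2,
  requestsPerPage = 15,
  botShare = 0.25, // bots as a fraction of total requests
): number {
  const humanRequests = monthlyVisitors * pagesPerVisit * requestsPerPage;
  return Math.round(humanRequests / (1 - botShare)); // gross up for bot traffic
}

// estimateMonthlyRequests(50_000) -> 2,000,000, the Basic plan's 2M/month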

Simple, transparent pricing

Start free, upgrade as you grow. All paid plans include a 14-day free trial. Annual billing gives you two months free.

Free

$0/month

Try LogLens on one site — no card needed

  • 1M requests/month
  • 1 website
  • 30 days data retention
  • Real-time analytics
  • Bot & AI crawler detection
  • 5 AI insights/month
Get started free

Basic

$19/month

or $190/year — two months free

  • 2M requests/month
  • 1 website (+$10/mo per extra, up to 3)
  • 120 days data retention
  • Sitemap & GSC integration
  • Email + Slack alerts
  • CSV export
  • Unlimited users
Start free trial

Growth

$149/month

or $1,490/year — two months free

  • 20M requests/month
  • 15 websites (+$10/mo per extra, up to 30)
  • 1 year data retention
  • Unlimited users
  • Everything in Starter
  • Priority support
Start free trial

Scale

$499/month

or $4,990/year — two months free

  • 200M requests/month
  • 50 websites + unlimited extras at $10/mo each
  • 2 years data retention
  • Unlimited users
  • Everything in Growth
  • Dedicated support
Start free trial

Enterprise

Custom

For 200M+ requests, SSO, SLA, DPA

  • Custom request volume
  • Unlimited everything
  • Custom retention
  • SSO / SAML
  • DPA & custom MSA
  • Uptime SLA
  • Dedicated CSM & Slack

All paid plans include overage protection (you opt in, set a cap, we bill in $10 increments). Overage rates scale with tier: Basic $6/M, Starter $4/M, Growth $2/M, Scale $1/M.

Common questions

The fastest answers — and an AI you can ask anything else.

How is LogLens different from Google Analytics?

Google Analytics only tracks visitors who execute JavaScript — humans with cookies enabled. LogLens reads server / CDN access logs, so it sees every request: bots, crawlers, AI agents, mobile-app HTTP traffic, scrapers, anything that can't or won't run JS.

They're complementary, not competing. Use GA for human user behaviour and conversion. Use LogLens for SEO, bot management, security, and ops.

Which CDNs and platforms does LogLens work with?

Real-time: Cloudflare (Worker), AWS CloudFront (Kinesis Firehose), Vercel (Log Drain).

Historical / file upload: Apache or Nginx access logs, CloudFront standard or real-time logs, and Screaming Frog Log File Analyser Events CSV exports. Auto-detected on upload.

Not on the list? Use the compatibility checker at the top of the page — it'll detect your stack and tell you what to do.

How fast is "real-time"?

First request to dashboard: typically 5–10 seconds end-to-end. Alerts fire within minutes of a threshold being crossed.

Can I import historical log files?

Yes — drag-and-drop Apache, Nginx, CloudFront, or Screaming Frog Events CSV exports. Formats are auto-detected, deduplicated against any real-time data, and merged cleanly into your dashboard. Imports don't count against your monthly request quota.

Where is my data stored, and is it GDPR-friendly?

AWS eu-west-2 (London). Encrypted at rest and in transit. We don't sell data, we don't aggregate across customers, we don't run ads. DPA available on request.

How does the free trial work?

14-day full-access trial. Connect your site, import historical logs, see exactly what real-time analytics looks like for your specific traffic. After 14 days the trial expires; data stays for 7 more days, then is deleted unless you claim the account. No credit card required to start.

Can I have multiple sites or team members?

Yes. Plan determines max sites and seats — Free is 1/1, Starter is 3/3, Growth is 10/10, Scale is unlimited. Extra sites can be added as a $10/site/month add-on on plans with finite caps. Team members get role-based access (owner, admin, member, viewer).

What happens if I exceed my monthly request limit?

Basic and above can opt in to overage billing — pay-as-you-go for requests beyond your plan's quota, with a cap you set. Without opt-in, real-time ingestion pauses (alerting continues) until the next billing cycle. Imports never count against the quota.

Do I need to install anything on my server?

In most cases, no. Cloudflare = a Worker (no server changes). CloudFront = a Kinesis Firehose stream pointed at our endpoint. Vercel = a Log Drain. Apache or Nginx logs = drag-and-drop upload.
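
For Cloudflare specifically, the Worker pattern looks like the sketch below: pass the request through untouched and forward metadata asynchronously. The payload shape and the INGEST_URL binding are illustrative, not the actual LogLens Worker.

// Types (Request, ExecutionContext) come from @cloudflare/workers-types.
// The key property: the visitor's response is never delayed by the logging call.
export default {
  async fetch(request: Request, env: { INGEST_URL: string }, ctx: ExecutionContext) {
    const response = await fetch(request); // pass through to the origin

    const line = {
      ts: Date.now(),
      method: request.method,
      url: request.url,
      status: response.status,
      ua: request.headers.get("user-agent"),
      ip: request.headers.get("cf-connecting-ip"),
    };
    // waitUntil lets the POST complete after the response has been returned.
    ctx.waitUntil(
      fetch(env.INGEST_URL, { method: "POST", body: JSON.stringify(line) }),
    );
    return response;
  },
};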

Can LogLens detect AI crawlers like GPTBot and ClaudeBot?

Yes — and it verifies them against the bots' published IP ranges, so you can tell legitimate GPTBot/ClaudeBot/PerplexityBot traffic from scrapers using their User-Agent string. Per-bot crawl frequency, top URLs, and verification status live on the Bots & Crawlers page.

Don't see your question?

Ask LogLens AI directly — same brain as the chat bubble, inline below.

Ready to see what's actually happening on your site?

Free tier includes real-time ingestion, full SEO integration, and unlimited alerts. No credit card. Five-minute setup.