Free GEO audit — see if AI search can find you.
Check your site against nine AI crawlers (ChatGPT × 3 user agents, Claude, Perplexity, Gemini, Bing Copilot, Apple, Common Crawl), plus llms.txt validation, JSON-LD schema, content depth, and structural readability, all in under 10 seconds with no email required.
What the free audit covers
AI bot access check
See which of the nine AI crawlers — GPTBot, OAI-SearchBot, ChatGPT-User, ClaudeBot, PerplexityBot, Google-Extended, Bingbot, Applebot-Extended, CCBot — are blocked in your robots.txt.
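A minimal sketch of this check using Python's standard-library robots.txt parser. The function name `blocked_ai_bots` and the example domain are illustrative, not our production code:

```python
from urllib.robotparser import RobotFileParser

AI_BOTS = [
    "GPTBot", "OAI-SearchBot", "ChatGPT-User", "ClaudeBot",
    "PerplexityBot", "Google-Extended", "Bingbot",
    "Applebot-Extended", "CCBot",
]

def blocked_ai_bots(robots_txt: str, site: str = "https://example.com") -> list:
    """Return the AI bots that may not fetch the site root per robots.txt."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not rp.can_fetch(bot, site + "/")]
```

Because `can_fetch` is asked only about the root path, a partial-path rule such as `Disallow: /private` does not mark a bot as blocked.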
llms.txt presence + validation
Detect whether you have an llms.txt file: we fetch it from your domain root, validate it, and surface the byte count and a preview.
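The summarization step can be sketched like this. The validity check is a simple heuristic (the llms.txt proposal calls for a markdown file opening with an H1 title); the function name is hypothetical:

```python
def summarize_llms_txt(body: bytes, preview_chars: int = 200) -> dict:
    """Report byte count, a short preview, and a rough validity heuristic."""
    text = body.decode("utf-8", errors="replace")
    return {
        "byte_count": len(body),
        "preview": text[:preview_chars],
        # Heuristic only: the llms.txt proposal expects markdown
        # that starts with an H1 title line ("# ...").
        "looks_valid": text.lstrip().startswith("# "),
    }
```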
JSON-LD schema detection
We fetch your URL, scan it for application/ld+json blocks, and list every @type we detect. Missing LocalBusiness or FAQPage? You will see it.
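A stdlib-only sketch of the detection step. It handles top-level @type values (including lists) but ignores nested @graph nodes for brevity; class and function names are illustrative:

```python
import json
from html.parser import HTMLParser

class JsonLdCollector(HTMLParser):
    """Collect the text content of <script type="application/ld+json"> tags."""
    def __init__(self):
        super().__init__()
        self._in_ld = False
        self._buf = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_ld = True
            self._buf = []

    def handle_endtag(self, tag):
        if tag == "script" and self._in_ld:
            self._in_ld = False
            self.blocks.append("".join(self._buf))

    def handle_data(self, data):
        if self._in_ld:
            self._buf.append(data)

def detected_types(html: str) -> list:
    """Return every top-level @type found in the page's JSON-LD blocks."""
    parser = JsonLdCollector()
    parser.feed(html)
    types = []
    for raw in parser.blocks:
        try:
            doc = json.loads(raw)
        except json.JSONDecodeError:
            continue  # skip malformed blocks rather than failing the audit
        for node in (doc if isinstance(doc, list) else [doc]):
            t = node.get("@type")
            if isinstance(t, list):
                types.extend(t)
            elif t:
                types.append(t)
    return types
```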
Meta + content depth
Title tag, meta description, H1 presence, and word count — the surface signals AI engines use to extract citable passages.
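A rough sketch of the surface-signal extraction. Regex-based HTML scanning is fragile on real-world markup (a production version would use a proper parser), and the function name is hypothetical:

```python
import re

def surface_signals(html: str) -> dict:
    """Extract title, meta description, H1 presence, and a visible word count."""
    # Strip script/style so their contents don't inflate the word count.
    cleaned = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html,
                     flags=re.S | re.I)
    title = re.search(r"<title[^>]*>(.*?)</title>", cleaned, re.S | re.I)
    desc = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)',
        html, re.I)
    visible = re.sub(r"<[^>]+>", " ", cleaned)  # drop remaining tags
    return {
        "title": title.group(1).strip() if title else None,
        "meta_description": desc.group(1) if desc else None,
        "has_h1": bool(re.search(r"<h1[\s>]", html, re.I)),
        "word_count": len(visible.split()),
    }
```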
Frequently asked questions
Is this really free?
Yes — no sign-up, no credit card, no email gate. Enter your URL and get the audit. We score five dimensions: citability, structural readability, multi-modal, authority signals, and technical accessibility. The full GEO audit on the paid platform adds per-page schema generation, passage-level rewrites, llms.txt download, and historical tracking.
How does it actually work?
We fetch your robots.txt, llms.txt, and the page itself server-side (not from your browser). Then we parse robots.txt for AI-bot user-agent rules, detect JSON-LD blocks, extract heading hierarchy / meta tags / lists / tables / multi-modal elements / author signals, classify paragraphs by word length, and score the result 0-100. The whole process takes under 10 seconds.
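The paragraph-classification step can be sketched like this. The 134-167 word band comes from the citability scoring described further down; the function name and bucket labels are illustrative:

```python
def classify_paragraph(text: str, optimal=(134, 167)) -> str:
    """Bucket a paragraph by word count relative to the optimal
    citation-passage length used in the citability score."""
    words = len(text.split())
    lo, hi = optimal
    if words < lo:
        return "short"
    if words > hi:
        return "long"
    return "optimal"
```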
Why are AI engines important if Google still dominates?
Google AI Overviews appear above the regular organic results for an increasing share of queries, and ChatGPT search, Perplexity, and Claude are now used by millions of buyers daily for "recommendation" queries (e.g. "best mortgage broker in Melbourne"). Brands cited by AI engines win those recommendations before any organic click happens.
What is llms.txt and do I need one?
llms.txt is a proposed standard (similar to robots.txt) placed at the root of your domain that gives AI engines a markdown summary of your key pages. It is not yet honored by every major engine, but Perplexity reads it and others are following. Our paid platform generates one for you with all your live URLs.
What gets scored and how?
Citability accounts for 25% of the score (optimal 134-167-word passages plus "X is..." definition openers). Structural readability is 20% (single H1, at least two H2s, question-style headings, lists/tables, FAQ). Multi-modal content is 15% (images, video, iframes; pages with multi-modal elements see 156% higher AI selection rates). Authority signals are 20% (author byline, published date, modified date, og:image). Technical accessibility is 20% (the nine AI bots in robots.txt, llms.txt presence, JSON-LD schema, meta tags, content depth).
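The weighting above reduces to a simple weighted sum. A sketch, assuming each dimension is already scored 0-100 before weighting; the dictionary keys are illustrative names, not our API:

```python
WEIGHTS = {
    "citability": 0.25,
    "structural_readability": 0.20,
    "multi_modal": 0.15,
    "authority_signals": 0.20,
    "technical_accessibility": 0.20,
}

def overall_score(dimension_scores: dict) -> float:
    """Combine per-dimension 0-100 scores into the overall 0-100 score."""
    return round(sum(WEIGHTS[d] * dimension_scores[d] for d in WEIGHTS), 1)
```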
Which AI bots do you check?
Nine AI crawlers: GPTBot (ChatGPT training), OAI-SearchBot (ChatGPT Search), ChatGPT-User (ChatGPT browsing), ClaudeBot (Anthropic), PerplexityBot, Google-Extended (Gemini / AI Overviews), Bingbot (Copilot), Applebot-Extended (Apple Intelligence), and CCBot (Common Crawl). For each we check robots.txt for root-path access. Partial-path rules (e.g. "Disallow: /private") are not evaluated — only "Disallow: /" and "Allow: /".
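The root-path-only evaluation can be sketched as follows. This hypothetical helper matches user-agent groups case-insensitively and, as a simplification, lets the first blanket rule in a group win (full robots.txt precedence rules are more involved):

```python
def root_access(robots_txt: str, bot: str) -> bool:
    """True if `bot` may fetch the site root, judging only the blanket
    rules 'Disallow: /' and 'Allow: /' (partial paths are ignored)."""
    groups = {}            # user-agent (lowercased) -> list of (field, value)
    agents, rules = [], []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        field = field.lower()
        if field == "user-agent":
            if rules:                          # previous group just ended
                for agent in agents:
                    groups.setdefault(agent, []).extend(rules)
                agents, rules = [], []
            agents.append(value.lower())
        elif field in ("allow", "disallow"):
            rules.append((field, value))
    for agent in agents:                       # flush the final group
        groups.setdefault(agent, []).extend(rules)

    applicable = groups.get(bot.lower(), groups.get("*", []))
    for field, value in applicable:
        if value == "/":
            return field == "allow"            # first blanket rule wins
    return True                                # no blanket rule -> allowed
```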
Want the full GEO audit?
Per-page schema generation, passage-level citability rewrites, llms.txt download, and score history are part of the paid platform. 30-day free trial.