How to improve your GeoScored audit results
Your GeoScored report shows what to fix. This guide explains how. Start with the checks marked "fail" (red). Those have the biggest impact on your Generative Engine Optimization (GEO) score.
Quick wins (under 30 minutes)
These five changes take minutes, not days. Any of them can move your score on the next scan.
1. Unblock AI crawlers in robots.txt
Open your website's robots.txt file (usually at yoursite.com/robots.txt). Look for lines that say Disallow for GPTBot, ClaudeBot, or Google-Extended. Remove those lines. If you don't have a robots.txt file, you're fine. AI crawlers are allowed by default.
Not technical? Ask your developer: "Can you check our robots.txt and make sure we're not blocking GPTBot, ClaudeBot, or Google-Extended?"
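For reference, blocking rules look like this (the bot names are real; everything else in your file will differ). These are the lines to remove:

```text
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```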
2. Add a /llms.txt file
This is a plain text file at the root of your site that tells AI models which pages to prioritize. Think of it as a table of contents for AI crawlers. List your most important pages with a short description of each. The format is simple: page title, URL, one-line summary.
Read the full spec at llmstxt.org.
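A minimal llms.txt following the llmstxt.org format might look like this (the company name, URLs, and descriptions are placeholders):

```markdown
# Example Co

> Example Co makes scheduling software for small dental practices.

## Key pages

- [Pricing](https://example.com/pricing): Plans, per-seat pricing, and billing FAQ
- [Product tour](https://example.com/tour): What the software does, with screenshots
- [Docs](https://example.com/docs): Setup guides and API reference
```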
3. Fix your heading hierarchy
Every page should have exactly one H1 (the page title). Below that, use H2 for main sections and H3 for subsections. Never skip levels. Going from an H2 straight to an H4 confuses AI crawlers because they use headings to understand how your content is organized.
Not technical? Ask your developer: "Can you check our pages for heading level skips? We need one H1, then sequential H2s and H3s."
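As an illustration (the headings are made up):

```html
<!-- Good: one H1, levels descend one step at a time -->
<h1>Complete Guide to Widgets</h1>
  <h2>Choosing a Widget</h2>
    <h3>Budget options</h3>
  <h2>Installing Your Widget</h2>

<!-- Bad: a second H1, and an H2 that jumps straight to H4 -->
<h1>Complete Guide to Widgets</h1>
<h1>Widgets Explained</h1>
<h2>Choosing a Widget</h2>
<h4>Budget options</h4>
```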
4. Add JSON-LD Organization schema to your homepage
JSON-LD (JavaScript Object Notation for Linked Data) is a small block of code you add to your homepage. It tells search engines and AI your company name, website URL, logo, and social profiles in a structured format they can read directly. You can test your schema with Google's Rich Results tool.
Not technical? Ask your developer: "Can you add JSON-LD Organization schema to our homepage with our name, URL, logo, and social links?"
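A minimal Organization block looks like this; it goes inside your homepage's head. All of the values here are placeholders to replace with your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/example-co",
    "https://x.com/exampleco"
  ]
}
</script>
```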
5. Update your "last modified" date if content is current
AI engines prefer fresh content. If your page is up to date but the last modified date says 2022, you're leaving points on the table. Update the dateModified meta tag to today's date. If your CMS (Content Management System) has an "update" button, clicking it usually refreshes the date automatically.
Not technical? Ask your developer: "Can you update the dateModified meta tag and Last-Modified header on our key pages?"
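The meta tag described above looks like this in your page's head (the date is a placeholder; use the ISO year-month-day format):

```html
<meta name="dateModified" content="2026-02-10">
```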
Technical AI-Readiness fixes
These checks measure whether AI crawlers can reach and read your pages. If they can't get in, nothing else matters.
AI Crawler Access
This check looks at your robots.txt file and HTTP response headers. Both can block AI crawlers. The most common problem is a blanket Disallow: / rule for AI bots, sometimes added by security plugins or CDN (Content Delivery Network) settings without you knowing.
If you're technical: Open your robots.txt and remove any Disallow rules for GPTBot, ClaudeBot, Google-Extended, PerplexityBot, and Bytespider. Also check your HTTP response headers for X-Robots-Tag: noindex directives that target these bots.
If you're not technical: Ask your developer to check both robots.txt and HTTP headers. Also ask them to check your CDN or WAF (Web Application Firewall) settings, because Cloudflare and similar services sometimes block AI bots by default.
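One quick way to inspect what your server actually sends, assuming you have curl installed (example.com is a placeholder for your domain):

```text
# Look for blocking directives in the response headers
curl -sI https://example.com/ | grep -i "x-robots-tag"

# Check whether the server treats an AI crawler differently
# (a 403 here usually means a WAF or CDN rule is blocking the bot)
curl -sI -A "GPTBot" https://example.com/ | head -n 1
```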
Schema Markup
Schema markup is structured data that tells AI exactly what your page contains. Without it, AI has to guess. The most important types are Organization (on your homepage), Article (on blog posts), FAQ (on help pages), and Breadcrumb (on any page with navigation hierarchy).
If you're technical: Add JSON-LD script blocks in your page's <head>. Include required properties: name, url, and description for Organization. For Article, add headline, datePublished, dateModified, and author. Validate with Google's Rich Results Test.
If you're not technical: Many CMS platforms have schema plugins. WordPress users can try Yoast SEO or Rank Math. Ask your developer: "Can you add JSON-LD structured data to our pages?"
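For a blog post, a minimal Article block with the required properties might look like this (the headline, dates, and author are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Choose a Widget",
  "datePublished": "2025-11-02",
  "dateModified": "2026-02-10",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```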
LLMs.txt
The llms.txt file is a much newer convention than robots.txt, and adoption is still growing. It gives AI models a prioritized list of your most important content. Think of it as a site map designed specifically for AI rather than for search engine crawlers.
If you're technical: Create a plain text file at /llms.txt on your domain. List each important page with its URL and a one-line description. Follow the format at llmstxt.org.
If you're not technical: Ask your developer: "Can you create an llms.txt file at the root of our site? Here's a list of our 10 most important pages."
Content Freshness
AI engines use freshness signals to decide how much to trust your content. A page last modified in 2021 gets cited less than one updated this month, even if the information hasn't changed. Freshness signals include the Last-Modified HTTP header, the dateModified meta tag, and dates visible on the page itself.
If you're technical: Set the Last-Modified header in your server config. Add a <meta name="dateModified"> tag. If you use Article schema, include the dateModified property.
If you're not technical: Republish or update your key pages in your CMS. Most platforms automatically update the modified date when you save. Ask your developer to confirm the Last-Modified header is being sent correctly.
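The three freshness signals mentioned above look like this side by side (all dates are placeholders; they should agree with each other):

```text
HTTP response header:  Last-Modified: Tue, 10 Feb 2026 00:00:00 GMT
Meta tag in <head>:    <meta name="dateModified" content="2026-02-10">
Article schema:        "dateModified": "2026-02-10"
```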
JavaScript Rendering Gap
Some websites load all their content using JavaScript. That means the initial HTML (Hypertext Markup Language) is mostly empty. A browser fills it in, but many AI crawlers do not run JavaScript. If your important content only exists after JavaScript runs, AI crawlers may never see it.
If you're technical: Switch to server-side rendering (SSR) or static site generation (SSG) for your key pages. If you're using React, Next.js, or Nuxt, enable SSR. If that's not possible, use a pre-rendering service like Prerender.io to serve static HTML to crawlers.
If you're not technical: Ask your developer: "Is our main page content in the initial HTML, or does it load with JavaScript? If it's JavaScript-only, can we switch to server-side rendering for our most important pages?"
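A quick diagnostic, assuming you have curl installed: fetch the raw HTML the way a non-rendering crawler would, and search it for a phrase from your main content (the domain and phrase are placeholders):

```text
# If this prints nothing, that phrase only exists after JavaScript runs
curl -s https://example.com/ | grep -o "a phrase from your page"
```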
Structural Extractability fixes
These checks measure whether AI can pull useful, quotable passages from your content. Getting crawled is step one. Getting cited is step two.
Heading Hierarchy
AI models use headings like a table of contents. When the hierarchy is clean (H1, then H2, then H3), AI can break your page into sections and find the right passage to cite. When headings skip levels, or when a page has multiple H1 tags, AI gets confused about what the page is actually about.
What to do: Use exactly one H1 per page. It should be your page title. Use H2 for main sections. Use H3 for subsections within an H2. Never skip a level. Make each heading descriptive. "Our Pricing" is better than "Section 3."
Not technical? Open your CMS editor and check the heading dropdown for each section. The page title should be Heading 1. Main sections should be Heading 2. Subsections should be Heading 3. If you see Heading 4 directly under Heading 2, add a Heading 3 between them.
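For developers who want to automate this audit, here is a minimal sketch using Python's standard-library HTML parser. The sample markup is made up; run your own page's HTML through it instead:

```python
from html.parser import HTMLParser

class HeadingChecker(HTMLParser):
    """Walks a page's HTML and flags heading-hierarchy problems."""

    def __init__(self):
        super().__init__()
        self.levels = []  # heading levels (1-6) in document order
        self.skips = []   # level skips found, e.g. "h2 -> h4"

    def handle_starttag(self, tag, attrs):
        # Match h1..h6 tags only
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            level = int(tag[1])
            if self.levels and level > self.levels[-1] + 1:
                self.skips.append(f"h{self.levels[-1]} -> h{level}")
            self.levels.append(level)

# A page with a level skip: H2 jumps straight to H4
sample = "<h1>Title</h1><h2>Section</h2><h4>Subsection</h4>"
checker = HeadingChecker()
checker.feed(sample)
print("H1 count:", checker.levels.count(1))  # → H1 count: 1
print("Skips:", checker.skips)               # → Skips: ['h2 -> h4']
```

A page passes when the H1 count is exactly 1 and the skips list is empty.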
Answer First
AI models build answers from the opening sentences of each section. If your first paragraph is setup or backstory, the AI skips it or quotes something less useful. Lead every section with the main point. Explain why after.
What to do: For each section on your page, read the first sentence. Does it answer the question the heading asks? If not, rewrite it so the first 1-2 sentences contain the key takeaway. Move context and history below the answer.
Think of it like a news article. The headline says what happened. The first sentence gives the important facts. The rest of the article fills in the details.
Fact Density
AI engines prefer content with specific, verifiable facts. Vague statements like "we have years of experience" give AI nothing to cite. Specific statements like "founded in 2019, serving 2,400 customers across 14 countries" give AI concrete material to quote.
What to do: Go through each paragraph and count the verifiable facts: numbers, dates, percentages, named entities, locations, specifications. Aim for at least 2-3 facts per paragraph. Replace vague language with specifics wherever possible.
Example: Instead of "Our product is fast," write "Average response time is 47ms based on 1.2 million requests in January 2026."
Passage Self-Containment
AI often extracts a single paragraph from your page and drops it into an answer. If that paragraph starts with "As mentioned above" or "This approach," the AI citation makes no sense to the reader. Every paragraph should stand on its own.
What to do: Read each paragraph in isolation. Does it make sense without the paragraph before it? If not, rewrite the opening sentence to restate the subject. Replace pronouns like "it," "this," and "they" with the actual noun when starting a new paragraph.
Test: Copy any single paragraph and paste it into a blank document. If a reader can understand it without context, it passes. If they'd ask "what does 'this' refer to?", rewrite it.
Markdown Fidelity
AI models often convert your HTML to markdown before processing it. Complex CSS layouts, deeply nested divs, and content hidden behind interactive elements can get stripped away in that conversion. Tables, lists, headings, and simple paragraphs survive well. Fancy layouts do not.
What to do: Use semantic HTML. Present data in real <table> elements, not styled divs. Use <ul> and <ol> for lists. Avoid putting important text inside images, accordions, or tabs that require interaction to reveal.
Not technical? Ask your developer: "Is our important content in plain HTML elements like paragraphs, tables, and lists? Or is it in complex layouts that might not convert to markdown cleanly?"
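For example, tabular data in a real table survives the markdown conversion; the same data in styled divs usually loses its structure (the class names and prices are illustrative):

```html
<!-- Converts cleanly to a markdown table -->
<table>
  <tr><th>Plan</th><th>Price</th></tr>
  <tr><td>Starter</td><td>$29/mo</td></tr>
</table>

<!-- Often flattened into a jumble of words: layout divs carry no structure -->
<div class="pricing-grid">
  <div class="cell">Plan</div><div class="cell">Price</div>
  <div class="cell">Starter</div><div class="cell">$29/mo</div>
</div>
```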
Entity Strength fixes
These checks measure whether AI knows who you are. Technical access and good content structure get you crawled and cited. Entity strength determines whether AI describes your brand correctly.
Knowledge Graph
AI models pull brand facts from knowledge graphs, primarily Wikidata. If your company doesn't have a Wikidata entry, or if the entry is missing key details, AI may describe you incorrectly or not mention you at all.
What to do: Search for your brand on Wikidata. If you don't have an entry, create one. If you do, fill in as many properties as possible: official name, website URL, founding date, industry, headquarters location, social media profiles, and a short description. Wikidata is free and open to anyone.
Beyond Wikidata: Make sure your Google Business Profile (if applicable), Crunchbase profile, LinkedIn company page, and Wikipedia article (if notable enough) all contain consistent information. AI models cross-reference multiple sources. When the facts match across sources, AI trusts them more.
AI Brand Check
This check asks real AI models "What is [your brand]?" and compares their answers to your actual about page. If AI describes you inaccurately, people asking AI about your industry are getting wrong information about your company.
What to do: Start with your about page. Write a clear, factual description of what your company does, who it serves, and what makes it different. Use specific language. Add Organization schema (JSON-LD) that matches your about page text exactly. AI will pick up both the human-readable and machine-readable versions.
Build third-party signals: AI models learn about your brand from mentions on other websites. Get listed in industry directories, write guest posts, respond to journalist queries (services like HARO), and maintain active social media profiles. Each consistent mention reinforces what AI knows about you.
Be patient. AI Brand Check results don't change overnight. AI models re-train on new data periodically. After you improve your about page, schema, and third-party mentions, it may take weeks or months before AI models update their descriptions of your brand. Re-scan monthly to track progress.
After you make changes
Re-scan to verify your fixes
Run a new GeoScored scan after making changes. Your report will show updated scores for each check. Compare the before and after results to confirm the fixes worked. Some improvements show up immediately (like unblocking crawlers or adding schema). Others, like AI Brand Check improvements, take longer because AI models need time to re-crawl and re-index your content.
Compare before and after
Your dashboard shows all your past scans. Look at the score trend over time. If a check went from "fail" to "pass," you're moving in the right direction. If it didn't change, double-check that your changes are live on the public site (not just in a staging environment).
Set a re-scan schedule
Your GEO score can change even if you don't touch your site. AI models update their training data and behavior continuously. A site that scores 72 today could score 58 next month without any changes on your end. CMS updates, plugin changes, CDN configuration changes, and AI crawler behavior shifts all affect your score.
| Scenario | Frequency | Why |
|---|---|---|
| After making changes | Immediately | Verify your fixes actually landed |
| Active optimization | Monthly | Track progress and catch regressions early |
| Competitive monitoring | Monthly | Competitor scores shift as AI models update |
| Maintenance mode | Quarterly | Minimum for catching unintended regressions |
If you're scanning monthly, a GeoScored subscription plan is more cost-effective than individual scans. Subscribers get full reports on every scan within their monthly allotment.
Why GEO scores change over time
Your GEO score can shift even if you haven't changed anything on your site. Several factors outside your control affect how AI search engines evaluate your content:
- AI model updates. ChatGPT, Gemini, and Perplexity regularly update their models and training data. Each update can change which sites get cited, how citations are selected, and what content patterns are preferred.
- Schema interpretation changes. AI engines are still developing how they interpret Schema.org markup. Structured data that boosts your visibility today may be weighted differently after the next model update.
- Competitor content changes. Your competitors publish new content, improve their structured data, and optimize for AI visibility. Their changes affect the relative strength of your scores.
- AI Brand Check volatility. The AI Brand Check directly queries live AI models. Their responses can shift within weeks as models retrain and update their knowledge. This is typically the most volatile check in your report.
- Infrastructure and platform changes. CMS updates, plugin changes, CDN configuration, security certificate renewals, and hosting changes can all affect how AI crawlers access and interpret your content.
Traditional SEO (Search Engine Optimization) rankings shift gradually over weeks or months. GEO (Generative Engine Optimization) visibility can shift meaningfully after a single AI model update. This is why we recommend monthly re-scans for anyone actively optimizing, and quarterly scans as a minimum baseline.
Ready to check your progress?
Run a new GeoScored scan