Something strange is happening to web traffic. Visitors are arriving, reading your content, and leaving—but they're not clicking anything because they're not human.
AI agents now account for a significant and growing share of website traffic. Cloudflare's 2025 data shows GPTBot (OpenAI's crawler) grew from 4.7% to 11.7% of verified bot traffic in just one year. ClaudeBot, Meta's crawlers, and dozens of others are right behind it.
These agents aren't just indexing your site for future searches. They're reading it right now, on behalf of real people asking real questions. And if your site doesn't work for them, you're invisible to a growing segment of your potential audience.
This is where Agent Experience Optimization comes in.
The shift already happened
Here's the reality: 69% of Google searches now end without a click, according to Similarweb's July 2025 report. Google's AI Overviews answer the question right there in the search results. Perplexity synthesizes information from multiple sources. ChatGPT browses the web and summarizes what it finds.
Users still get answers. They just don't visit your site to get them.
For queries where AI Overviews appear, organic click-through rates dropped 61%, according to Seer Interactive's September 2025 study. That's not a rounding error. That's a fundamental change in how information flows on the web.
The web is becoming intermediated. Instead of users visiting sites directly, they increasingly interact through AI systems that fetch, synthesize, and act on their behalf. Your content still matters—maybe more than ever—but the way it reaches people has changed.
What is AXO?
Agent Experience Optimization (AXO) is the practice of making websites work for AI agents. Not instead of humans. Alongside them.
Think of it as the machine-readable layer of your site. Just as you design navigation for human visitors and optimize page speed for their experience, AXO ensures AI agents can effectively access, understand, and use your content.
This matters because AI agents have different needs than humans:
They can't see your design. Visual hierarchy, images, and interactive elements mean nothing to an agent parsing your HTML.
They generally can't execute JavaScript. OpenAI's crawlers, for instance, don't render client-side content. If your product information loads dynamically, those agents see an empty page.
They need explicit context. Humans infer meaning from layout and design patterns. Agents need structured data that explicitly states what things are and how they relate.
They're looking for specific answers. They're not browsing. They arrive with a question and need to find the answer quickly and unambiguously.
The four pillars of AXO
AXO isn't a single technique. It's a set of practices that work together to make your site agent-friendly.
1. Structured data
This is the foundation. Schema.org markup tells AI agents exactly what your content represents—not through inference, but through explicit declaration.
Without structured data, an AI agent has to guess that the text "£299" next to "iPhone 15 Case" is probably a price. With proper schema markup, you're telling it directly: this is a Product, this is its price, this is its availability.
JSON-LD is the format to use. Google recommends it, AI systems parse it easily, and it keeps your markup separate from your content.
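Here's a minimal sketch of what that looks like in practice, using the case example from above. The description and wording are placeholders; the point is that the price, currency, and availability are stated explicitly rather than left for the agent to infer.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "iPhone 15 Case",
  "description": "Slim protective case for the iPhone 15.",
  "offers": {
    "@type": "Offer",
    "price": "299.00",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Drop it into the page's HTML (head or body, either works) and it travels with the content without touching your visible markup.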
The key schemas to implement:
- Organization for your business identity
- Product for anything you sell
- Article/BlogPosting for content with clear authorship and dates
- FAQPage for question-and-answer content
- Speakable for content suitable for voice assistants
Consistency matters here. If you're "ABC Corp" on one page and "ABC Corporation" on another, you're creating confusion in how agents build their understanding of your business.
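One practical way to keep that identity consistent is to declare the organization once, with a stable @id, and reference it from every page. A sketch, with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "@id": "https://www.example.com/#organization",
  "name": "ABC Corporation",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/abc-corporation",
    "https://twitter.com/abccorporation"
  ]
}
</script>
```

Articles and Products elsewhere on the site can then point their publisher or brand at that same @id, so agents assemble one entity rather than several near-duplicates.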
2. Content accessibility
Everything important needs to exist as text. Not text in images. Not text rendered by JavaScript. Actual HTML text that's present when the page loads.
This is straightforward to audit: view your page source and search for your key information. If your product specs, pricing, or critical content isn't there, AI agents can't see it either.
This doesn't mean abandoning rich media. It means ensuring the essential information has a text equivalent. Alt text for images. Transcripts for videos. Server-rendered content for critical pages.
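That audit is easy to automate. Here's a minimal sketch in TypeScript that fetches the raw HTML the way a non-rendering crawler would and checks for key phrases; the URL and phrases are placeholders, and it assumes Node 18+ for the built-in fetch.

```ts
// check-raw-html.ts — fetch a page the way a non-rendering crawler does
// and confirm key facts appear in the initial HTML.
// Run with e.g.: npx tsx check-raw-html.ts
const url = "https://www.example.com/products/iphone-15-case"; // placeholder
const mustContain = ["iPhone 15 Case", "£299", "In stock"];     // placeholder phrases

async function main() {
  const res = await fetch(url, {
    headers: { "User-Agent": "axo-audit/0.1" }, // identify yourself politely
  });
  const html = await res.text(); // raw HTML only; no JavaScript is executed

  for (const phrase of mustContain) {
    const found = html.includes(phrase);
    console.log(`${found ? "OK     " : "MISSING"} ${phrase}`);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

Anything reported missing here is invisible to crawlers that never run your JavaScript.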
3. Clear information architecture
AI agents navigate programmatically. They follow links, respect robots.txt directives, and try to build a map of your site's content.
Help them by making your structure predictable:
- Logical URL patterns
- Comprehensive internal linking
- Updated sitemaps
- Clear hierarchy in your navigation
The goal is a site that an agent can traverse systematically without hitting dead ends or getting stuck in loops.
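Part of that predictability is simply telling agents what they may crawl and where the map lives. Here's a sketch of a robots.txt along those lines; the policy and paths are illustrative, not a recommendation for your site.

```
# Allow the AI crawlers you want reading your content
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Everyone else: normal rules
User-agent: *
Allow: /
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```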
4. Server-rendered content
This is where most sites fail. AI crawlers generally don't execute JavaScript. They fetch your HTML and read what's there. If your content loads via client-side JavaScript, those crawlers see a blank page or a loading spinner.
Check any page on your site: view source and search for your key content. If it's not in the initial HTML, you have a problem.
The fix depends on your stack:
- Static sites are already fine—content is in the HTML
- Server-side rendering (SSR) delivers complete HTML on first request (see the sketch below)
- Static site generation (SSG) pre-builds pages with full content
- Client-side apps need hybrid approaches or pre-rendering for critical pages
This isn't just about AI agents. It's also faster for users, better for accessibility, and more resilient. The JavaScript dependency that breaks for AI agents is often the same one that breaks for users on slow connections or older devices.
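To make the server-side rendering option concrete, here's a minimal sketch using Express (any SSR framework works the same way in principle). The product data and route are placeholders; the point is that the first response already contains the full content, including the structured data from earlier.

```ts
// server.ts — minimal server-side rendering sketch (assumes Express is installed)
import express from "express";

const app = express();

// Placeholder lookup; in practice this would query your database or CMS.
const products: Record<string, { name: string; price: string; currency: string }> = {
  "iphone-15-case": { name: "iPhone 15 Case", price: "299.00", currency: "GBP" },
};

app.get("/products/:slug", (req, res) => {
  const product = products[req.params.slug];
  if (!product) return res.status(404).send("Not found");

  // The full content — and the structured data — is in the HTML on first request.
  res.send(`<!doctype html>
<html lang="en">
  <head>
    <title>${product.name}</title>
    <script type="application/ld+json">
      ${JSON.stringify({
        "@context": "https://schema.org",
        "@type": "Product",
        name: product.name,
        offers: { "@type": "Offer", price: product.price, priceCurrency: product.currency },
      })}
    </script>
  </head>
  <body>
    <h1>${product.name}</h1>
    <p>Price: £${product.price}</p>
  </body>
</html>`);
});

app.listen(3000, () => console.log("Listening on http://localhost:3000"));
```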
Why bother?
Fair question. If AI agents are going to summarize your content without sending visitors, why help them?
Three reasons:
Attribution and authority. When AI systems cite sources, they favor sites they can parse reliably. Being cited in AI-generated answers maintains your brand visibility even in a zero-click world.
Higher-quality traffic. The visitors who do click through from AI systems are often further along in their decision-making. Early industry reports suggest AI-referred visitors convert at noticeably higher rates than traditional organic search traffic.
Future-proofing. AI agents are becoming more capable, not less. The systems being built today will handle bookings, purchases, and complex transactions. Sites that work well for agents will be included in these workflows. Sites that don't will be skipped.
Where to start
You don't need to rebuild your site. Start with these steps:
Audit your structured data. Use Google's Rich Results Test or Schema.org's validator to see what machines currently understand about your pages.
Check your source. View your most important pages without JavaScript. Is the critical information present? If not, that's your first priority.
Test with actual agents. Ask ChatGPT or Perplexity about your products or services. The gaps in their answers reveal gaps in your site's machine-readability.
Consider an llms.txt file. There's an emerging proposal for a markdown file at your site root that helps AI agents understand your site's purpose and structure. It's not yet a standard, but some sites are experimenting with it.
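For reference, here's a sketch of what such a file might contain under the current proposal: a short markdown summary of the site plus links to the pages agents are most likely to need. All names and URLs below are illustrative.

```
# ABC Corporation

> Slim protective cases and accessories for popular phones.

## Products

- [iPhone 15 Case](https://www.example.com/products/iphone-15-case): specs, pricing, availability

## Support

- [Shipping and returns](https://www.example.com/support/shipping): delivery times and return policy
- [FAQ](https://www.example.com/support/faq): answers to common pre-sales questions
```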
What comes next
We're at the beginning of a significant shift in how the web works. The sites that adapt will maintain visibility and relevance. The ones that don't will gradually fade from AI-mediated discovery—technically online but functionally invisible.
AXO isn't about gaming algorithms or chasing trends. It's about ensuring your content remains accessible as the ways people find and consume information continue to evolve.
The agents are already here. The question is whether your site is ready for them.

