BLOG
Articles about AI agent readiness, web standards, and preparing for autonomous agents.
Content Negotiation for AI Agents: Why Sentry Serves Markdown Over HTML
Sentry co-founder David Cramer shows how content negotiation — a 25-year-old HTTP standard — saves AI agents 80% of tokens. We break down the implementation: Accept headers, markdown delivery, authenticated page redirects, and what this means for every website preparing for agent traffic.
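The mechanism behind this is plain HTTP: the client lists the formats it can handle in an `Accept` header, and the server picks one. A minimal stdlib-only Python sketch of the idea (the response bodies and the preference logic are illustrative, not Sentry's implementation):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

HTML_BODY = b"<html><body><h1>Docs</h1><p>Hello</p></body></html>"
MD_BODY = b"# Docs\n\nHello\n"

class NegotiatingHandler(BaseHTTPRequestHandler):
    """Serve Markdown to clients that ask for it via the Accept header."""

    def do_GET(self):
        accept = self.headers.get("Accept", "")
        # Very rough negotiation: prefer Markdown whenever the client
        # lists it. A real server would parse quality values (q=) too.
        if "text/markdown" in accept:
            body, ctype = MD_BODY, "text/markdown; charset=utf-8"
        else:
            body, ctype = HTML_BODY, "text/html; charset=utf-8"
        self.send_response(200)
        self.send_header("Content-Type", ctype)
        self.send_header("Vary", "Accept")  # caches must key on Accept
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet
```

The `Vary: Accept` header matters in practice: without it, a cache could serve the HTML variant to an agent that asked for Markdown.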
Vercel's agent-browser: Why a CLI Beats MCP for Browser Automation
Vercel's agent-browser hit 22,000 GitHub stars in two months. It's a CLI, not an MCP server, and the data shows why: 94% fewer tokens, 3.5x faster execution, 100% success rate. We break down how it works, why it uses the accessibility tree, and what the 'less is more' finding means for your website.
Cloudflare /crawl Endpoint: One API Call to Crawl Any Website
Cloudflare launched a /crawl endpoint that crawls entire websites with one API call — returning HTML, Markdown, or AI-extracted JSON. We break down what this means for AI agent readiness: why your robots.txt, sitemap, semantic HTML, and server-side rendering now matter more than ever.
Playwright: From Test Runner to AI Agent Interface
Playwright overtook Cypress, then Microsoft shipped Playwright MCP — turning the same tool into the standard browser runtime for AI agents. We break down why the data-testid vs getByRole debate now determines whether agents can use your site, the testing-accessibility-agent flywheel, and what this means for frontend teams.
AI Crawlers Ignore llms.txt — But AI Agents Don't
Dries Buytaert's data shows zero AI crawlers use llms.txt. But he measured the wrong thing. Crawlers scrape for training data — agents complete tasks. We break down why the crawler vs agent distinction matters, which coding agents already use llms.txt and content negotiation, and what you should implement today.
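For reference, llms.txt is just a Markdown file served at `/llms.txt`: an H1 title, a blockquote summary, then H2 sections listing links for agents to follow. A minimal illustrative example (all URLs and descriptions hypothetical):

```markdown
# Example Docs

> Concise summary of what this site offers, written for language models.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): install and first steps
- [API reference](https://example.com/docs/api.md): endpoints and authentication

## Optional

- [Changelog](https://example.com/changelog.md): release history
```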
Anthropic's AI Exposure Index: What Real-World Usage Data Means for Your Website
Anthropic's new 'observed exposure' metric reveals a 61-point gap between theoretical AI capability and actual usage. We break down the data — from 75% task coverage for programmers to 14% hiring slowdowns for young workers — and explain why this adoption gap is a countdown for website AI agent readiness.
Build for Agents: Why CLIs Are the New Distribution Channel
Andrej Karpathy argues CLIs are exciting because they're 'legacy' technology — battle-tested, standardized, and universally parseable by AI agents. We break down why CLIs, MCP servers, and machine-readable docs are becoming the primary distribution channel for software, and what product teams need to build right now.
How AI Agents See Your Website: The Accessibility Tree Explained
AI agents don't see your website the way humans do. They navigate via the accessibility tree — a browser-generated structure originally built for screen readers. We explain how it works, which AI frameworks use it, and why accessible websites outperform in the age of AI agents.
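The markup-to-tree mapping can be sketched with nothing but the standard library. This toy parser derives (role, accessible name) pairs from a handful of implicit ARIA role mappings; it is a tiny subset of the real HTML-AAM rules, and real accessibility trees are nested structures computed by the browser, not a flat list like this:

```python
from html.parser import HTMLParser

# Tiny subset of the implicit element-to-role mappings defined by the
# HTML Accessibility API Mappings spec; illustration only.
IMPLICIT_ROLES = {
    "main": "main",
    "nav": "navigation",
    "h1": "heading",
    "button": "button",
    "a": "link",  # only when it has an href, checked below
}

class AccessibilityTreeSketch(HTMLParser):
    """Collect (role, accessible name) pairs the way an agent might see them.

    Nodes are emitted flattened, in the order their elements close.
    """

    def __init__(self):
        super().__init__()
        self.nodes = []   # (role, name) pairs
        self._stack = []  # open elements: [tag, role-or-None, text]

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        role = attrs.get("role") or IMPLICIT_ROLES.get(tag)
        if tag == "a" and "href" not in attrs:
            role = None  # an <a> without href is not exposed as a link
        self._stack.append([tag, role, ""])

    def handle_data(self, data):
        # Text contributes to the accessible name of every open role node.
        for frame in self._stack:
            if frame[1]:
                frame[2] += data + " "

    def handle_endtag(self, tag):
        while self._stack:
            t, role, text = self._stack.pop()
            if role:
                self.nodes.append((role, " ".join(text.split())))
            if t == tag:
                break
```

This is also why semantic markup matters for agents: frameworks that query by role (for example Playwright's `getByRole`) can only target what the tree exposes, and a `<div onclick>` never appears as a button.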
The State of AEO: Key Insights from Vercel's 2026 Report
Vercel's State of AEO report lays out the shift from the attention economy to the answer economy. We break down the key statistics, platform landscape, content architecture recommendations, and what it all means for your website's AI visibility.
What Is agents.json? Advertising AI Agent Capabilities on Your Website
agents.json is the emerging complement to robots.txt: a machine-readable file that tells AI agents what your website can do. We cover the Wildcard specification, compare it to A2A, MCP, and OpenAPI, and show you how to implement it step by step.
EXPLORE MORE
Most websites score under 45. Find out where you stand.