A new tool scans websites for AI agent compatibility across standards, both emerging and established, like MCP, robots.txt AI directives, and agentic commerce protocols.
A web-based scanner, apparently tied to Cloudflare's Agents platform and documentation, lets anyone enter a URL and check how ready their site is for AI agent interactions. It evaluates several emerging standards, including robots.txt rules for AI bots, Markdown content negotiation, MCP (Model Context Protocol), OAuth, Agent Skills, and agentic commerce signals, then returns a readiness score with prioritized recommendations.
MCP, Agent Skills, and agentic commerce signals are shaping up as the new SEO: sites that don't expose them risk being bypassed by AI agents in favor of ones that do. This scanner gives you a concrete checklist: robots.txt directives for AI bots, Markdown negotiation support, OAuth flows for agent auth, and structured discovery metadata. These aren't hypothetical; the checks map onto standards Cloudflare's agent stack already supports.
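The Markdown-negotiation check is easy to reproduce by hand. A minimal sketch of the idea, assuming the convention that agent-friendly sites honor `Accept: text/markdown`; the inlined headers here are illustrative, and in practice you would capture them with `curl -sI -H 'Accept: text/markdown' https://yourdomain.com/`:

```shell
#!/bin/sh
# Sketch of the Markdown content-negotiation check: given response headers,
# decide whether the site honored `Accept: text/markdown`.
# Headers are inlined for illustration; capture real ones with:
#   curl -sI -H 'Accept: text/markdown' https://yourdomain.com/
headers='HTTP/2 200
content-type: text/markdown; charset=utf-8
vary: accept'

if printf '%s\n' "$headers" | grep -qi '^content-type: *text/markdown'; then
  echo "serves Markdown to agents"
else
  echo "no Markdown negotiation"
fi
```

A `Vary: Accept` header alongside the Markdown content type is a good sign the site is genuinely negotiating rather than serving Markdown to everyone.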
Run your production domain through the scanner this week, then fix the top two failing checks: robots.txt AI directives and discovery headers. Both can be deployed in under an hour with no backend changes.
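For the robots.txt check, a file that addresses AI crawlers explicitly might look like the following. The bot names are real published crawler user agents (OpenAI's GPTBot, Anthropic's ClaudeBot, Common Crawl's CCBot); the allow/deny choices are illustrative policy, not a recommendation:

```
# Explicit rules for AI crawlers (policy choices here are illustrative)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: CCBot
Disallow: /

# Default for all other crawlers
User-agent: *
Allow: /
```

Bots with no explicit section fall back to the `User-agent: *` group, so listing a crawler by name is what makes your intent unambiguous to scanners like this one.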
To spot-check, fetch your robots.txt with an AI crawler's user agent: curl -A 'GPTBot' https://yourdomain.com/robots.txt