I was experimenting with AI crawlers (like GPTBot and ClaudeBot) recently and noticed a massive gap: most modern SPAs are practically invisible to them. If you're running a React or Next.js app without proper SSR, the bot often just sees an empty body tag.
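To see why, remember that a crawler that doesn't execute JavaScript only gets the initial HTML payload. Here's a quick stdlib-only sketch (the sample markup is hypothetical, but it's the typical shell a client-rendered React app ships):

```python
from html.parser import HTMLParser

class VisibleTextExtractor(HTMLParser):
    """Collects the text a non-JS crawler would see: raw HTML only."""
    def __init__(self):
        super().__init__()
        self._skip = 0  # depth inside <script>/<style>
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    p = VisibleTextExtractor()
    p.feed(html)
    return " ".join(p.chunks)

# Typical client-rendered SPA shell: an empty mount point plus a bundle.
spa_shell = """
<!doctype html>
<html><head><title>My App</title></head>
<body><div id="root"></div><script src="/bundle.js"></script></body></html>
"""
print(repr(visible_text(spa_shell)))  # → 'My App' — only the <title>; the body contributes nothing
```

Everything the user actually reads arrives later via `/bundle.js`, which a non-rendering crawler never runs.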
Standard SEO isn't enough anymore. Bots are now looking for specific machine-readable files like /llms.txt, structured JSON-LD, and clear robots.txt permissions tailored for AI. I tested my own company's site and it scored horribly.
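For context, the robots.txt side of this is just explicit per-crawler directives (illustrative sketch — adapt the policy to your own site):

```text
# robots.txt — address AI crawlers by their user-agent tokens
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /
```

The /llms.txt file is different: per the llmstxt.org proposal it's a markdown file served at the site root, with an H1 title, a blockquote summary, and sections of links pointing LLMs at your most useful pages. It's still an emerging convention rather than a ratified standard.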
So I spent the last few days building AIO Checker to automate this audit. It scans a URL and provides:
A 0-100 AI Visibility score based on 7 technical factors.
A breakdown of missing elements (SSR, llms.txt, etc.).
An exportable .md audit file. The idea is that you can drop this markdown file directly into Cursor, Windsurf, or Claude, and prompt it to "fix the codebase based on this audit."
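To make the scoring idea concrete, here's a rough sketch of the kind of checks such an audit might run. The factor names, weights, and thresholds below are mine for illustration, not AIO Checker's actual internals:

```python
import json
import re

# Illustrative factor weights (hypothetical — not the real scoring model)
WEIGHTS = {"json_ld": 30, "llms_txt": 20, "rendered_body": 50}

def has_json_ld(html: str) -> bool:
    """Detect at least one parseable <script type="application/ld+json"> block."""
    pattern = r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>'
    for m in re.finditer(pattern, html, re.DOTALL | re.IGNORECASE):
        try:
            json.loads(m.group(1))
            return True
        except ValueError:
            pass  # malformed JSON-LD doesn't count
    return False

def visibility_score(html: str, has_llms_txt: bool, body_text_len: int) -> int:
    """Sum the weights of the factors that pass, yielding 0-100."""
    score = 0
    if has_json_ld(html):
        score += WEIGHTS["json_ld"]
    if has_llms_txt:
        score += WEIGHTS["llms_txt"]
    if body_text_len > 200:  # arbitrary threshold for "real" server-rendered text
        score += WEIGHTS["rendered_body"]
    return score

page = '<script type="application/ld+json">{"@type": "Organization"}</script>'
print(visibility_score(page, has_llms_txt=False, body_text_len=0))  # → 30
```

A site with JSON-LD but no llms.txt and a client-rendered body scores low, which matches what I saw on my own company's site.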
The initial scan is free, and the full markdown export is a one-time $4.99 fee.

I would love your feedback. Let me know if it breaks on your site's architecture, if you disagree with the scoring weights, or what you think about the emerging /llms.txt standard. Roast the UI/UX if you must.