    I built Inlay (https://inlay.dev) to solve a simple problem: AI agents can’t reliably discover most websites.

    When Claude/ChatGPT answers questions about a business, it often relies on stale or incomplete info. Inlay is my attempt at a “crawl + expose for AI” layer.

    It does 3 things:

    1) Audits your site across 11 AI-readiness criteria (structured data, llms.txt, semantic HTML, crawler policy, meta quality, content accessibility, API discoverability, security/trust signals, freshness, image accessibility, MCP readiness)
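
    To give a concrete picture of what one of these checks looks like, here's a rough TypeScript sketch of two of the simpler ones (function and field names are mine, not Inlay's actual internals): probe /llms.txt at the site root and look for a JSON-LD block on the homepage.

      // Hypothetical sketch of two of the simpler audit checks; not Inlay's real internals.
      async function quickAudit(origin: string) {
        // llms.txt check: does the file exist at the site root?
        const llmsRes = await fetch(new URL("/llms.txt", origin));

        // Structured-data check: is there any JSON-LD on the homepage?
        const html = await (await fetch(origin)).text();
        const hasJsonLd = /<script[^>]*type="application\/ld\+json"/i.test(html);

        return {
          llmsTxt: llmsRes.ok ? 100 : 0,
          structuredData: hasJsonLd ? 100 : 0,
        };
      }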

    2) Hosts AI-facing infrastructure for you (MCP endpoint + generated llms.txt / llms-full.txt + sitemap/feed from crawled pages)
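
    The hosted llms.txt is roughly this shape (a minimal sketch, assuming each crawled page carries a title and a short summary; the real generator does more):

      interface CrawledPage { url: string; title: string; summary: string; }

      // Assemble a markdown-style llms.txt from crawled pages (minimal sketch).
      function buildLlmsTxt(siteName: string, pages: CrawledPage[]): string {
        const lines = [`# ${siteName}`, ""];
        for (const p of pages) {
          lines.push(`- [${p.title}](${p.url}): ${p.summary}`);
        }
        return lines.join("\n");
      }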

    3) Auto-adds missing discoverability signals (via one script tag: JSON-LD, canonical/meta improvements, MCP discovery tags)
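
    To make the third point concrete, the injected signals look roughly like this (illustrative only; the exact tag names and values are assumptions, not Inlay's actual output):

      // Illustrative client-side injection; real tag names/values may differ.
      const jsonLd = document.createElement("script");
      jsonLd.type = "application/ld+json";
      jsonLd.textContent = JSON.stringify({
        "@context": "https://schema.org",
        "@type": "Organization",
        name: document.title,
        url: location.origin,
      });
      document.head.appendChild(jsonLd);

      // Hypothetical MCP discovery hint; the rel value and endpoint are placeholders.
      const mcpLink = document.createElement("link");
      mcpLink.rel = "mcp-server";
      mcpLink.href = "https://example.com/mcp"; // placeholder endpoint URL
      document.head.appendChild(mcpLink);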

    The part I'm most interested in feedback on is the scoring methodology: each analyzer produces a 0–100 score, and those roll up into a weighted overall score.
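
    In code, the roll-up is basically a weighted average, something like this (the weights shown are placeholders, not the real ones):

      // Placeholder weights; choosing the real ones is what I'd like feedback on.
      const weights: Record<string, number> = {
        structuredData: 0.15,
        llmsTxt: 0.10,
        semanticHtml: 0.10,
        // ...the remaining eight analyzers
      };

      // Weighted average over whichever analyzers produced a score.
      function overallScore(scores: Record<string, number>): number {
        let total = 0, weightSum = 0;
        for (const [key, w] of Object.entries(weights)) {
          if (key in scores) {
            total += w * scores[key];
            weightSum += w;
          }
        }
        return weightSum > 0 ? total / weightSum : 0;
      }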

    Free audit: https://inlay.dev/audit (no signup)