2 points by inem 3 hours ago | 1 comment
  • inem 3 hours ago
    ChatGPT renders every message in the DOM at once. At 2,000 messages that's ~500K nodes. The tab freezes or crashes.

    The model handles long conversations fine. It's purely a frontend problem: React re-rendering an ever-growing tree.

    The fix is simple: intercept the fetch response for /backend-api/conversation/{id}, truncate the mapping to the last N messages for rendering, and keep the full context for the model. The whole thing is ~30KB, with no dependencies and no external requests.
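    A minimal sketch of that interception approach, assuming the conversation payload is shaped roughly like `{ mapping: { id: node }, current_node }` with nodes linked by `parent` pointers (these field names and the `KEEP_LAST_N` constant are assumptions for illustration, not taken from the post):

    ```javascript
    const KEEP_LAST_N = 50; // hypothetical cutoff for how many messages to render

    // Walk parent links back from the active leaf and keep only the last n nodes.
    function truncateMapping(conversation, n = KEEP_LAST_N) {
      const { mapping, current_node } = conversation;
      const kept = {};
      let id = current_node;
      let last = null;
      while (id && mapping[id] && Object.keys(kept).length < n) {
        kept[id] = { ...mapping[id] }; // copy so the original payload is untouched
        last = kept[id];
        id = last.parent;
      }
      if (last) last.parent = null; // detach the oldest kept node from the dropped prefix
      return { ...conversation, mapping: kept };
    }

    // Patch fetch so only the truncated payload reaches the UI. Truncation happens
    // client-side after the response arrives, so server-side model context is unaffected.
    if (typeof window !== 'undefined') {
      const origFetch = window.fetch;
      window.fetch = async (...args) => {
        const res = await origFetch(...args);
        const url = typeof args[0] === 'string' ? args[0] : args[0].url;
        if (/\/backend-api\/conversation\/[^/]+$/.test(url)) {
          const data = await res.clone().json();
          return new Response(JSON.stringify(truncateMapping(data)), {
            status: res.status,
            headers: res.headers,
          });
        }
        return res;
      };
    }
    ```

    A real extension would also need to handle branched conversations (nodes with multiple children) and inject this before the page's own scripts run, e.g. via a content script at `document_start`.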

    I built this because my most useful conversations were becoming unusable. The Chrome Web Store submission is pending review, so I'm distributing via Gumroad for now.

    Happy to answer questions about the implementation.