3 points by _boffin_ a day ago | 1 comment
  • _boffin_ a day ago
    Quick clarification: the `/hn` page is a no-login, no-API-key interactive demo (autoplay is just the default walkthrough; you can pause and click around).

    OpenAI BYOK (bring your own key) is only needed in the full app, when you want real model calls.

    More detail on what’s different under the hood:

      - Branches are anchored to a source message + selected span (not freeform threads).
      
      - Collector items are references back to those spans, so "Compose" can build a prompt from explicit citations rather than from drifting chat history (rough data-model sketch after this list).
    
      - "Context compiler" shows the exact prompt stack + token budget, and lets you exclude/pin items to control what survives truncation.
    
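    If it helps to make the data model concrete, here's a rough TypeScript sketch of how Branch + Collector + Compose fit together. It's a simplification and the names are illustrative, not the actual code:

      // Simplified data model: branches and collector items both hang off span anchors.
      interface SpanAnchor {
        messageId: string;  // source message the selection came from
        start: number;      // character offsets of the selected span
        end: number;
      }

      interface Message {
        id: string;
        role: "user" | "assistant";
        text: string;
      }

      interface Branch {
        id: string;
        anchor: SpanAnchor;   // anchored to a specific span, not a freeform thread
        messages: Message[];
      }

      interface CollectorItem {
        id: string;
        anchor: SpanAnchor;   // a reference back to the span, not a pasted copy
        note?: string;        // optional note added when collecting
      }

      // "Compose" builds the prompt from explicit citations instead of raw chat history.
      function compose(
        items: CollectorItem[],
        resolve: (a: SpanAnchor) => string,  // looks up the span text by its anchor
        task: string
      ): string {
        const citations = items
          .map((item, i) => `[${i + 1}] ${resolve(item.anchor)}`)
          .join("\n");
        return `Context (cited spans):\n${citations}\n\nTask: ${task}`;
      }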
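    The context compiler is then basically a pin/exclude-aware budgeter over that stack. Another rough sketch (word counts stand in for real token counts):

      interface StackItem {
        id: string;
        text: string;
        pinned: boolean;    // pinned items are never dropped
        excluded: boolean;  // excluded items never enter the prompt
      }

      // Stand-in tokenizer; the real count comes from the model's tokenizer.
      const countTokens = (text: string): number =>
        text.split(/\s+/).filter(Boolean).length;

      // Skip excluded items, place pinned items first, then fill the remaining
      // budget in order. Whatever lands in `dropped` is what the UI surfaces as truncated.
      function compileContext(stack: StackItem[], budget: number) {
        const candidates = stack.filter((item) => !item.excluded);
        const ordered = [
          ...candidates.filter((item) => item.pinned),
          ...candidates.filter((item) => !item.pinned),
        ];

        const included: StackItem[] = [];
        const dropped: StackItem[] = [];
        let used = 0;

        for (const item of ordered) {
          const cost = countTokens(item.text);
          if (used + cost <= budget) {
            included.push(item);
            used += cost;
          } else {
            dropped.push(item);
          }
        }
        return { included, dropped };
      }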
    Feedback I'd love: does Branch + Collector + Compose feel faster than "open a second chat window + copy/paste", or does it feel like extra steps?