20 points by taspeotis 10 hours ago | 4 comments
  • gnabgib 10 hours ago
    Earlier (29 points, 4 comments) https://news.ycombinator.com/item?id=47367129
  • tyleo 9 hours ago
    I mentioned this at work, but context still rots at the same rate: 90k tokens consumed gives results just as bad in a 100k context window as in a 1M one.

    Personally, I’m on a 6M+ line codebase and had no problems with the old window.

  • atonse 9 hours ago
    CC seems to have gotten pretty good at auto-compacting and continuing where it left off. Are there any good use cases for this?

    I guess it would be to avoid the tool-use overhead?

    • satring 9 hours ago
      [flagged]
      • atonse 8 hours ago
        But interestingly, every now and then I look at the compaction result, and it now says that if you need to reference the previous conversation you can open <file>. So technically that context is still connected.

        I’ve noticed MCPs get unstable after compaction, but even that’s been less of an issue lately.

  • shablulman 10 hours ago
    [flagged]