6 points by MetaWhirledPeas 7 hours ago | 8 comments
  • MehdiBelkacem 2 hours ago
    Token anxiety is real. What worked for me: prompt caching on fixed system prompts cut my Anthropic bill by ~60% overnight. Most devs don't realize cache reads are 10x cheaper than regular input tokens on Claude (writes cost about 25% more than a normal input token, so caching pays off fast on repeated prompts).
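    The arithmetic behind that savings claim can be sketched as a quick cost model. The multipliers below are assumptions based on Anthropic's published pricing ratios at one point in time (cache writes ~1.25x base input, cache reads ~0.1x); check current pricing before relying on them.

    ```python
    # Rough cost model for caching a fixed system prompt across many requests.
    # Multipliers are assumed: cache write = 1.25x base input token price,
    # cache read = 0.1x. Verify against the provider's current pricing page.
    def caching_savings(prompt_tokens: int, requests: int,
                        write_mult: float = 1.25, read_mult: float = 0.1) -> float:
        """Fraction of the fixed prompt's input-token cost saved by caching."""
        uncached = prompt_tokens * requests  # every request pays full price
        # One cache write on the first request, cache reads on the rest.
        cached = prompt_tokens * (write_mult + (requests - 1) * read_mult)
        return 1 - cached / uncached

    savings = caching_savings(prompt_tokens=2000, requests=1000)
    print(f"{savings:.1%}")  # ~89.9% of the fixed-prompt input cost
    ```

    Note this is the savings on the cached portion only; the overall bill reduction depends on what share of your traffic is the fixed prompt, which is why a ~60% total cut is plausible.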

    Local models for classification/routing + frontier only for generation is the other move — but the latency tradeoff is real if you're in a user-facing flow.
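    A minimal sketch of that split, with placeholder model names and prices (nothing here is a real API; it just shows the routing shape):

    ```python
    # Hypothetical task router: cheap local model for classification/routing,
    # frontier model only for generation. Names and prices are placeholders.
    from dataclasses import dataclass

    @dataclass
    class Route:
        model: str
        cost_per_mtok: float  # assumed input price, USD per million tokens

    ROUTES = {
        "classify": Route("local-small", 0.0),      # runs on your own hardware
        "route":    Route("local-small", 0.0),
        "generate": Route("frontier-large", 3.0),   # placeholder frontier price
    }

    def pick_model(task: str) -> Route:
        # Fall back to the frontier model for anything unrecognized.
        return ROUTES.get(task, ROUTES["generate"])

    print(pick_model("classify").model)   # local-small
    print(pick_model("generate").model)   # frontier-large
    ```

    The latency caveat from the comment applies to the local hop itself: in a user-facing flow, the classification step adds wall-clock time before the frontier call even starts.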

  • CM30 6 hours ago
    A lot of companies will probably shut down or drastically limit their AI usage due to rising costs. A small- or medium-sized business dependent on ever-growing AI expenses is in a real bad position, and could well go under.

    I heard a few companies ended up going back to hiring actual employees for work that was previously done by LLMs, so there's a chance we could see more of that too. We might also see a few try to make it work with older or local models.

  • ipaddr 4 hours ago
    Fewer people will use the frontier models, and those who do will pay more. Progress will slow. OpenAI will sell your chat data. You will get an AI tax. Companies will use less of it.

    Hopefully new ways to deliver similar quality will be discovered.

    Stock market will pop.

    Prices will go up for people inside the moat.

  • scorpioxy 6 hours ago
    What always happens. A market correction followed by going back to a reasonable state, until the next bubble of course.

    In my opinion, LLMs are useful for many things, but not anything and everything, and definitely not in the way the boosters are claiming. This is not a popular opinion when you are inside the bubble or have something to gain from it. So when there's a downturn, things will hopefully stabilize, with LLMs being another tool that can be used to automate certain things. It feels crazy saying this these days; I have been told I'm out of touch for thinking this way, and who knows, maybe that's true.

  • atleastoptimal 5 hours ago
    Prices are going down. Just look at open source models: you can run the equivalent of a SOTA model from 8 months ago on your laptop.
  • krapp 6 hours ago
    What do you think will happen? How does supply and demand work? Practically every business and government in existence is existentially dependent on AI, speculation on it is the only thing keeping the world from global financial collapse. It's "too big to fail" at a scale that dwarfs the financial crisis of 2008.

    You'll pay the fucking danegeld is what you'll do, and keep paying it, because you reorganized your entire existence around and mortgaged your future on a closed proprietary third party service's business model that is now a single point of failure for our entire technological civilization, making its market value practically infinite.

    That's a collective "you" there, by the way, not "you" personally.

    • scorpioxy 6 hours ago
      Isn't it strange? You'd think some lessons were learned from the 2008 crisis, but apparently not. It was not that long ago to be forgotten already.
      • andrei_says_ 6 hours ago
        The lesson is that if you’re too big to fail, no laws apply to you and there is unlimited money to be made.

        It has been learned very well.

        The brazen violation of intellectual property was a precondition of making this technology useful. Taking the risk of breaking the law at this unprecedented scale was an informed decision made based on this very lesson.

    • atleastoptimal 5 hours ago
      AI model prices are getting cheaper over time per unit of capability.