4 points by 101008 3 hours ago | 4 comments
  • andsoitis 3 hours ago
    You can run smaller models locally that are pretty good at code gen.

    Give it a try.

    To get started: https://simonwillison.net/2024/Nov/12/qwen25-coder/

    or https://simonwillison.net/2024/Dec/9/llama-33-70b/
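
    A minimal sketch of what that looks like in practice (Python, assuming the Ollama server is running, a coder model such as qwen2.5-coder has already been pulled, and the ollama Python package is installed; the model tag and prompt here are only illustrative):

      # Query a locally running code-gen model through Ollama's Python client.
      # Assumes `ollama pull qwen2.5-coder` has been run and the server is up.
      import ollama

      response = ollama.chat(
          model="qwen2.5-coder",  # illustrative tag; use whichever model you pulled
          messages=[
              {"role": "user",
               "content": "Write a Python function that reverses a linked list."},
          ],
      )

      # The generated code comes back as the message content of the reply.
      print(response["message"]["content"])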

  • Eridrus 2 hours ago
    Your assumption is wrong.

    The LLM companies are profitable on the current gen models. Inference is profitable, rather than subsidized.

    The biggest chunk of the capital they're raising is going toward data center compute that will come online roughly two years from now and be an order of magnitude larger than what they have today.

    The bear case for the labs is that they're Cisco, not Pets.com.

  • NKosmatos 3 hours ago
    AI is here to stay, but not in the way big corporations dream of. People will continue using AI, but when the AI bubble pops, sooner rather than later, things will stabilize and adapt to real usage under a different business model.

    As you correctly state, the cost of AI as a Service (AIaaS) will increase for end users, but this isn't necessarily a bad thing. It will let the "real" users keep their access while filtering out the ones who are just playing around. Prices for RAM, GPUs, and SSDs will normalize a lot, and more people will move towards local models.

    As with the dot-com bubble (I saw it happen), a pop doesn't mean everything disappears, only that things change and adapt. All of us AI realists are currently being treated like technophobes when we say things like that ;-)

  • theandrewbailey 3 hours ago
    When the dot-com bubble popped, the internet and websites did not go away. (I'm posting this on a website ending in .com.) When the mortgage crisis happened, mortgages certainly didn't go away.

    When the AI/LLM bubble pops, LLMs will still exist and be used. They just won't be hyped and pushed everywhere.