Give it a try.
To get started: https://simonwillison.net/2024/Nov/12/qwen25-coder/
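If you want something concrete to paste in, here is a minimal sketch of querying a locally hosted Qwen2.5-Coder model. It assumes you are running it through Ollama (e.g. after `ollama pull qwen2.5-coder`), which exposes an OpenAI-compatible endpoint on its default port; the model tag and prompt are just placeholders, so adjust them to whatever you actually pulled.

```python
# Minimal sketch: chat with a locally hosted Qwen2.5-Coder model.
# Assumes Ollama is running and serving its OpenAI-compatible API on the
# default port 11434; swap base_url/model for your own local setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama endpoint, not OpenAI's cloud
    api_key="ollama",  # any non-empty string; the local server ignores it
)

response = client.chat.completions.create(
    model="qwen2.5-coder",  # whichever tag you pulled locally
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
)

print(response.choices[0].message.content)
```

From there you can swap in whatever local runner and model size your hardware can handle.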
The LLM companies are profitable on the current-gen models. Inference is profitable rather than subsidized.
The biggest chunk of the capital they are raising is to buy data-center compute that will come online roughly two years from now and be an order of magnitude larger.
The bear case for the labs is that they're Cisco, not Pets.com.
As you correctly state, the cost of AI as a Service (AIaaS) will increase for end users, but that isn't necessarily a bad thing. It will let the "real" users keep their access while sifting out the ones who are just playing around. Prices for RAM, GPUs, and SSDs will normalize considerably, and more people will move towards local models.
As with the dot-com bubble (I saw it happen), a pop doesn't mean everything disappears; it means things change and adapt. All of us AI realists are currently being treated like technophobes when we say things like that ;-)
When the AI/LLM bubble pops, LLMs will still exist and be used. They just won't be hyped and pushed everywhere.