When it comes to local AI, I’m of the opinion that this is where things should go in the long term. However, I want to see the market mature more before I understand what that will look like. I don’t want to buy today in hopes of something down the road.
So pretty much all my buy signals are telling me to kick the can down the road.
This site [0] was posted a couple of weeks back. You can select the M5 Max from the list and see how it would run various local models; that may help with the decision.
[0] https://www.reddit.com/r/LocalLLaMA/comments/1rzkw4x/m5_max_...
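For a rough sanity check before even opening a tool like that, you can do the back-of-envelope memory math yourself. This is just my own sketch, not anything from the linked site: it assumes a given quantization (4-bit here), a made-up fixed overhead for the OS and KV cache, and all the function names are invented for illustration.

```python
# Rough check: does a quantized model plausibly fit in a given amount of unified memory?
# (My own back-of-envelope sketch; numbers and overhead are assumptions.)

def est_model_gb(params_b: float, bits_per_weight: float = 4.0) -> float:
    """Approximate weight footprint in GB for a model with params_b billion parameters."""
    return params_b * 1e9 * (bits_per_weight / 8) / 1e9

def fits(params_b: float, ram_gb: float, bits: float = 4.0, overhead_gb: float = 8.0) -> bool:
    """Leave headroom for the OS, KV cache, and other apps (overhead_gb is a guess)."""
    return est_model_gb(params_b, bits) + overhead_gb <= ram_gb

# e.g. a 70B model at 4-bit quantization is ~35 GB of weights,
# so it plausibly fits on a 64 GB machine but not on 36 GB.
print(fits(70, 64))  # True
print(fits(70, 36))  # False
```

Token throughput is a separate question (that's where memory bandwidth and GPU core count matter), which is what the linked tool actually tries to estimate.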
I didn't go for a Max chip because I value the better battery life on the Pro more than I value the additional GPU cores.
Personally, I think that until LLMs start to plateau, it will always be more valuable to run a frontier LLM than a merely very capable local one. I have no idea when that will happen, so I decided not to overbuy on hardware for now.