If, a couple more iterations from now, say Gemma 6 is as good as the current Opus and runs completely locally on a Mac, I won't really bother with the cloud models.
That’s a problem.
For the others anyway.
Glad it wasn't just me. I was impressed with the quality of Gemma 4; it just couldn't write the changes to file 9/10 times when using it with opencode.
There was an update to tool calling 3 days ago. I haven't tested it myself but hope it helps.
The world has moved on, that code-golf time is now spent on ad algorithms or whatever.
Escaping the constraint delivered a different future than anticipated.
Plus having Gemma on my device for general chat ensures I will always have a privacy respecting offline oracle which fulfils all of the non-programming tasks I could ever want. We are already at the point where the moat for these hyper scalers has basically dissolved for the general public's use case.
If I was OpenAI or Anthropic I would be shitting my pants right now and trying every unethical dark pattern in the book to lock in my customers. And they are trying hard. It won't work. And I won't shed a single tear for them.
It simply.. doesn't. The SotA models are enormous now, and there's no free lunch on compression/quantization here.
Opus 4.6 capabilities are not coming to your (even 64–128 GB) laptop or phone in the popular architecture that current LLMs use.
Now, that doesn't mean that a much narrower-scoped model with very impressive results can't be delivered. But that narrower model won't have the same breadth of knowledge, and TBD if it's possible to get the quality/outcomes seen with these models without that broad "world" knowledge.
It also doesn't preclude a new architecture or other breakthrough. I'm simply stating it doesn't happen with the current way of building these.
edit: forgot to mention the notion of ASIC-style models on a chip. I haven't been following this closely, but last I saw the power requirements are too steep for a mobile device.
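To put rough numbers on the memory argument above (the parameter counts below are hypothetical, chosen just to illustrate the scale, not published figures for any named model), even aggressive 4-bit quantization leaves a trillion-parameter model far outside a 128 GB machine:

```python
# Back-of-the-envelope memory math for the claim above.
# Parameter counts are illustrative assumptions only.

def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """Memory for the weights alone (ignores KV cache, activations, OS)."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

for params_b in (70, 400, 1000):      # hypothetical model sizes, in billions
    for bits in (16, 8, 4):           # fp16, int8, int4 quantization
        gb = weights_gb(params_b, bits)
        fits = "fits" if gb <= 128 else "does not fit"
        print(f"{params_b:>5}B @ {bits}-bit: {gb:>6.0f} GB -> {fits} in 128 GB")
```

Even at 4 bits per weight, a 1T-parameter model needs ~500 GB for weights alone; a 70B model at 4-bit (~35 GB) is about the ceiling of what sits comfortably in a 128 GB laptop with room left for the KV cache.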
You needed a supercomputer to win at chess, until you didn't.
Today, local models perform far better at natural language than any algorithm running on a supercomputer cluster did just a few years ago.
Contradicting that trend takes more than "It simply.. doesn't."
There's plenty of room for RAM sizes to double, along with bus speeds. They stagnated for a long time simply because there was little need for more.
For LLM providers, I believe the key is to focus on high-value problems such as coding or knowledge work, because of the high marginal cost of serving new customers (the tokens burned) and the low marginal revenue if the problem isn't valuable enough. In this sense, no LLM provider can scale the way the earlier social media platforms did without taking huge losses. No meaningful user stickiness can be built unless you have users' data, and there is no meaningful business model unless people are willing to pay a high price for the problem you solve, the same way they pay for a SaaS.
I am really not optimistic about LLM providers other than Anthropic. The rest seem to be just burning money, and for what? There is no clear path to monetization.
And once local LLMs are powerful enough, those providers will quickly become obsolete, because of their cost and their unsustainable business model. At the end of the day, I agree that it is the consumer hardware makers who can win this game.
They did do the smart thing of not throwing too much capital behind it. Once the hype crumbles, they will be able to do something amazing with this tech. That will be a few years off but probably worth the wait.
Firefox is also marketing how easy it is to disable AI.
Decently accessible automation and discovery, without having to go figure out a bunch of stuff
The user does not give two shits if the new laptop "has AI". This is how Apple has been killing it lately: they market the MacBooks as being powerful, cheap, with long battery life, and a premium feel. Things the user cares about. Most of the stuff marketers are just blanket-labeling "AI" will eventually be shuffled to the background and rebranded with a more specific term that highlights the feature being delivered rather than the fact that it's AI.
Apple seems to follow the values that Steve laid out. Tim isn’t a visionary but he seems to follow the principles associated with being disciplined with cash quite well. They haven’t done any stupid acquisitions either. Quite the contrast with OAI.
But this approach may not work in other areas: e.g. building electric batteries, wireless modems, electric cars, solar cell technology, quantum computing etc.
Essentially, Apple got lucky with AI, but it needs to keep investing in cutting-edge technology across the broad areas it operates in and not let others get too far ahead!
Obviously that was built upon years of iPhone experience, but it shows they can lag behind, buy from other vendors, and still win when it becomes worth it to them.
They could change the architecture again tonight, and start releasing new machines with it. The users will adopt because there is literally no other choice.
Every machine they release will be the fastest and most capable on the platform, because there is no other option.
We will see if they ever release a new VisionOS device, but it's not the first time they did that; see also the Apple Watch.
- Apple Watch
- AirTag
Those are a few that come to mind. All do multi-billions in revenue per year.
And since iPhones form the largest single company's device network in the rich countries, that is a pretty big advantage.
My parents use Android to ask “What are the 5 biggest towers in Chicago” or “Remove the people on my picture” while apparently iPhone is only capable of doing “Hey Siri start the Chronometer / There is no contact named Chronometer in your phone”.
My iPhone is lagging a ridiculous 10 years behind. It’s just that I don’t trust Google with my credit card.
The only reason to care about it being OS integrated is to interact with functions of the OS, which siri does fine.
When they made the iPhone, iPod, and Apple Watch they had no specific hardware advantage over competitors. Especially with early iPhone and iPod: no moat at all, make a better product with better marketing and you’ll beat Apple.
Now? Good luck getting any kind of reasonably priced laptop or phone that can run local AI as well as the iPhone/MacBook. It doesn’t matter that Apple Intelligence sucks right now, what matters is that every request made to Gemini is losing money and possibly always will.
This is especially true in 2026 where Windows laptops are climbing in price while MacBooks stay the same.
It's not. People make this claim with zero evidence.
But Google made around $20B profit on Google search in 2025 Q4, and that includes AI search.
In hindsight it’s obvious why they pulled it off - nobody else could do it. They all had pieces missing.
1) Apple is not a data company.
2) Apple hasn't found a compelling, intuitive, and most of all, consistent, user experience for AI yet.
Regarding point 2: I haven't seen anyone share a hands down improved UX for a user driven product outside of something that is a variation of a chat bot. Even the main AI players can't advertise anything more than, "have AI plan your vacation".
When I open up JIRA or Slack I am always greeted with multiple new dialogues pointing at some new AI bullshit, in comparison. We hates it precious
However, I have even less patience for companies forcing paid-for third-party ads down my throat on a paid product. Slack at least doesn't sell my eyeballs. Facebook, Twitter, Google's ads are worse to me than new feature dialogues.
Which brings me to Apple. I pay for a $1k+ device, and yet the app store's first result is always a sponsored bit of spam, adware, or sometimes even malware (like the fake ledger wallet on iOS, that was a sponsored result for a crypto stealer). On my other devices, I can at least choose to not use ad-ridden BS (like on android you can use F-Droid and AuroraStore, on Linux my package manager has no ads), but on iOS it's harder to avoid.
Apple hasn't sunk to Google levels in terms of ads, but they've crossed a line.
I'm actually pretty disappointed in the lack of discovery available in the App Store, but I rarely go there. I'm fine with advertising being there. I wish it was better but I'm not offended that there is paid promotion in a store.
>"to fix this, please install our app"
>search BankName
>comes up with other banks, BankName's US app (not the country you are in)
>Revolut etc. (can't use in the country you are in)
>ten minutes later
Even worse when it's your telecom telling you to install their Official App so you can pay your bills or they will cut your cellular service, and you can't find it.
I have a separate Dutch Apple ID I can switch to, but each time I log out I risk accidentally deleting all my data.
I get an app recommendation from a friend, I go to the App Store and search for it. I have to be super careful about which link I'm actually clicking on and which app I'm installing, because the App Store is riddled with spam and malware.
I wouldn't mind, except that Apple charge 30% of everything with the justification that they are keeping the ecosystem free of spam and malware...
I just don't find it hard to find the app I want, when I want something specific, and install, and then _get the hell out of that shithole_.
For me, the second tile is an ad for Upside, some cashback app
Honestly the last time I remember using the App Store was years ago and I can't recall if they had ads or not. Imo it's distasteful and I wish they didn't have them. Still leagues better than the fucking ads in the start menu which caused me to give up on gaming and Windows forever.
If I search for my bank, I get another bank. If I search for "Wordle", I get a bunch of ad-supported spamware (both the ad and non-ad results) before the real NYT Games app.
The app store has ads in search results. This is the primary way that my technologically inept relatives end up with the wrong app installed btw, is by searching and clicking the first result, and getting complete trash adware.
Apple should be ashamed of selling out their users.
Consumers want iPhones and (if Apple are right) some form of AR glasses in the next decade. That’s their focus. There’s a huge amount of machine learning and inference that’s required to get those to work. But it’s under the hood and computed locally. Hence their chips. I don’t see what Apple have to gain by building a competitor to what OpenAI has to offer.
Imagine a future where Nvidia sells the exact same product at completely different prices, cheap for those using local models, and expensive for those deploying proprietary models in data centers.
[WSJ] sources expect.. first units in H1 2026, with GTC as the most likely unveiling stage.. NPU reportedly exceeds both Intel and AMD's current neural processing units.. If the integrated GPU delivers RTX 5070-class performance in a thin laptop form factor, it would eliminate the need for a separate GPU die, fundamentally changing how gaming laptops are designed.

I find this intriguing. Does anyone here have enough insight to speculate more?
Doing this, you will make all kinds of fun predictions.
In other news, people keep buying iPhones, and Apple just had its best quarter ever in China. AAPL is up 24% from last year.
That's the other part of the story that matters, not Apple Intelligence. This writeup tries to touch on that: Apple is uniquely positioned to do really well in this arena if/when local LLMs become commodities that can do really impressive stuff. We're getting there a lot faster than we thought; someone had a trillion-parameter Qwen 3.5 model going on his 128 GB MacBook, and now people are thinking of more creative ways to swap out what's in memory as needed.
Here's to another 10 years of scuffed Metal Compute Shaders, I guess.
But... what's the argument that the bulk of "AI value" in the coming decade is going to be... Siri Queries?! That seems ridiculous on its face.
You don't code with Siri, you don't coordinate automated workforces with Siri, you don't use Siri to replace your customer service department, you don't use Siri to build your documentation collation system. You don't implement your auto-kill weaponry system in Siri. And Siri isn't going to be the face of SkyNet and the death of human society.
Siri is what you use to get your iPhone to do random stuff. And it's great. But ... the world is a whole lot bigger than that.
Maximizing the available options is in fact a "strategy", and often a winning one when it comes to technology. I would love to be reminded of a list of tech innovators who were first and still the best.
Anyway, hasn't this always been Apple's strategy?
This was really unsurprising [0].
Well.. no. The Stargate expansion was cancelled, but the originally planned 1.2 GW (!) datacenter is going ahead:
> The main site is located in Abilene, Texas, where an initial expansion phase with a capacity of 1.2 GW is being built on a campus spanning over 1,000 acres (approximately 400 hectares). Construction costs for this phase amount to around $15 billion. While two buildings have already been completed and put into operation, work is underway on further construction phases, the so-called Longhorn and Hamby sections. Satellite data confirms active construction activity, and completion of the last planned building is projected to take until 2029.
> The Stargate story, however, is also a story of fading ambitions. In March 2026, Bloomberg reported that Oracle and OpenAI had abandoned their original expansion plans for the Abilene campus. Instead of expanding to 2 GW, they would stick with the planned 1.2 GW for this location. OpenAI stated that it preferred to build the additional capacity at other locations. Microsoft then took over the planning of two additional AI factory buildings in the immediate vicinity of the OpenAI campus, which the data center provider Crusoe will build for Microsoft. This effectively creates two adjacent AI megacampus locations in Abilene, sharing an industrial infrastructure. The original partnership dynamics between OpenAI and SoftBank proved problematic: media reports described disagreements over site selection and energy sources as points of contention.
https://xpert.digital/en/digitale-ruestungsspirale/
> Micron’s stock crashed. [the link included an image of dropping to $320]
Micron’s stock is back to $420 today
> One analysis found a max-plan subscriber consuming $27,000 worth of compute with their $200 Max subscription.
Actually, no. They'd miscalculated and consumed $2700 worth of tokens.
The same place that checked that claim also points out:
> In fact, Anthropic’s own data suggests the average Claude Code developer uses about $6 per day in API-equivalent compute.
https://www.financialexpress.com/life/technology-why-is-clau...
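For what it's worth, the corrected figures are easy to sanity-check. A quick sketch using only the numbers quoted above (the 30-day month is my assumption):

```python
# Sanity check on the corrected numbers: all dollar figures are
# taken from the comment above; days_per_month is an assumption.

avg_daily_api_cost = 6     # USD of API-equivalent compute per dev per day
days_per_month = 30
subscription = 200         # USD, Max plan

avg_monthly_cost = avg_daily_api_cost * days_per_month
print(avg_monthly_cost)                 # 180
print(avg_monthly_cost < subscription)  # True: the average user is profitable
print(2700 / subscription)              # 13.5x for the outlier, not 135x
```

So the average Claude Code user costs less in compute than the subscription brings in, and even the corrected outlier is a 13.5x overage rather than the 135x the viral claim implied.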
I like Apple's chips, but why do we put up with crappy analysis like this?
Rather, I feel that Apple has forgotten its roots. The Mac was “the computer for the rest of us,” and there were usability guidelines backed by research. What made the Mac stand out against Windows during a time when Windows had 95%+ marketshare was the Mac’s ease of use. The Mac really stood out in the 2000s, with Panther and Tiger being compelling alternatives to Windows XP.
I think Apple is less perfectionistic about its software than it was 15-20 years ago. I don’t know what caused this change, but I have a few hunches:
0. There’s no Steve Jobs.
1. When the competition is Windows and Android, and there are no other commercial competitors, there's a temptation to be just marginally better than Windows/Android rather than the absolute best. Windows shooting itself in the foot doesn't help matters.
2. The amazing performance and energy efficiency of Apple Silicon is carrying the Mac.
3. Many of the people who shaped the culture of Apple’s software from the 1980s to the 2000s are retired or have even passed away. Additionally, there are not a lot of young software developers who have heard of people like Larry Tesler, Bill Atkinson, Bruce Tognazzini, Don Norman, and other people who shaped Apple’s UI/UX principles.
4. Speaking of Bruce Tognazzini and Don Norman, I am reminded of this 2015 article (https://www.fastcompany.com/3053406/how-apple-is-giving-desi...) where they criticized Apple’s design as being focused on form over function. It’s only gotten worse since 2015. The saving grace for Apple is that the rest of the industry has gone even further in reducing usability.
I think what it will take for Apple to readopt its perfectionism is competition forcing it to.