570 points by stygiansonic 8 hours ago | 56 comments
  • Fiveplus7 hours ago
    The writing was on the wall the moment Apple stopped trying to buy their way into the server-side training game like what three years ago?

    Apple has the best edge inference silicon in the world (neural engine), but they have effectively zero presence in a training datacenter. They simply do not have the TPU pods or the H100 clusters to train a frontier model like Gemini 2.5 or 3.0 from scratch without burning 10 years of cash flow.

    To me, this deal is about the bill of materials for intelligence. Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own. Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence. Am I missing the elephant in the room?

    It's a smart move. Let Google burn the gigawatts training the trillion parameter model. Apple will just optimize the quantization and run the distilled version on the private cloud compute nodes. I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.
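
    For a concrete sense of what "optimize the quantization" means, here is a minimal sketch of symmetric int8 weight quantization (purely illustrative; real on-device pipelines use fancier schemes like 4-bit grouped quantization):

        // Shrink float weights to int8 plus one scale factor: roughly 4x smaller
        // and much cheaper to run on an NPU, at some accuracy cost.
        func quantize(_ weights: [Float]) -> (values: [Int8], scale: Float) {
            let maxAbs = weights.map { abs($0) }.max() ?? 0
            let scale = maxAbs > 0 ? maxAbs / 127 : 1
            let values = weights.map { w -> Int8 in
                let q = (w / scale).rounded()
                return Int8(max(-127, min(127, q)))  // clamp against rounding drift
            }
            return (values, scale)
        }

        // Reconstruct approximate floats at inference time.
        func dequantize(_ values: [Int8], scale: Float) -> [Float] {
            values.map { Float($0) * scale }
        }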

    • CharlesW6 hours ago
      > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.

      Setting aside the obligatory HN dig at the end, LLMs are now commodities and the least important component of the intelligence system Apple is building. The hidden-in-plain-sight thing Apple is doing is exposing all app data as context and all app capabilities as skills. (See App Intents, Core Spotlight, Siri Shortcuts, etc.)
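
      To make that concrete, here is roughly what "capabilities as skills" looks like with App Intents. The intent and parameter names below are invented for illustration; only the protocol shapes come from Apple's framework:

          import AppIntents

          // A hypothetical app exposes "order a coffee" as a structured action
          // that Siri (or whatever model sits behind it) can discover and invoke.
          struct OrderCoffeeIntent: AppIntent {
              static var title: LocalizedStringResource = "Order Coffee"

              @Parameter(title: "Drink")
              var drink: String

              func perform() async throws -> some IntentResult & ProvidesDialog {
                  // The app's own ordering logic would run here.
                  return .result(dialog: "Ordered a \(drink).")
              }
          }

      Multiply that by every app on the device and the model gets a large, well-typed action space without ever seeing an app's internals.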

      Anyone with an understanding of Apple's rabid aversion to being bound by a single supplier understands that they've tested this integration with all foundation models, that they can swap Google out for another vendor at any time, and that they have a long-term plan to eliminate this dependency as well.

      > Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own.

      I'd be interested in a citation for this (Apple introduced two multilingual, multimodal foundation language models in 2025), but in any case anything you hear from Apple publicly is what they want you to think for the next few quarters, vs. an indicator of what their actual 5-, 10-, and 20-year plans are.

      • dktp5 hours ago
        My guess is that this is bigger lock-in than it might seem on paper.

        Google and Apple together will post-train Gemini to Apple's specification. Google has the know-how as well as the infra and will happily do this (for free-ish) to continue the mutually beneficial relationship - as well as lock out competitors that asked for more money (Anthropic).

        Once this goes live, provided Siri improves meaningfully, it is quite an expensive experiment to then switch to a different provider.

        For any single user, the switching costs to a different LLM are next to nothing. But at Apple's scale they need to be extremely careful and confident that the switch is an actual improvement

        • TheOtherHobbes3 hours ago
          It's a very low baseline with Siri, so almost anything would be an improvement.
          • anamexisan hour ago
            The point is that once Siri is switched to a Gemini-based model, the baseline presumably won't be low anymore.
          • eastbound3 hours ago
            Ollama! Why didn't they just run Ollama and a public model? They've spent the last 10 years with a Siri that doesn't know any contact named Chronometer, only to now require the best-in-class LLM?
            • chankstein382 hours ago
              The other day I was trying to navigate to a Costco in my car. So I opened google maps on Android Auto on the screen in my car and pressed the search box. My car won't allow me to type even while parked... so I have to speak to the Google Voice Assistant.

              I was in the map search, so I just said "Costco" and it said "I can't help with that right now, please try again later" or something of the sort. I tried a couple more times until I changed up to saying "Navigate me to Costco" where it finally did the search in the textbox and found it for me.

              Obviously this isn't the same thing as Gemini but the experience with Android Auto becomes more and more garbage as time passes and I'm concerned that now we're going to have 2 google product voice assistants.

              Also, tbh, Gemini was great a month ago but since then it's become total garbage. Maybe it passes benchmarks or whatever but interacting with it is awful. It takes more time to interact with than to just do stuff yourself at this point.

              I tried Google Maps AI last night and, wow. The experience was about as garbage as you can imagine.

              • woahan hour ago
                Siri on my Apple Home will default to turning off all the lights in the kitchen if it misunderstands anything. Much hilarity ensues
      • hadlock5 hours ago
        > what their actual 5-, 10-, and 20-year plans are

        Seems like they are waiting for the "slope of enlightenment" on the Gartner hype cycle to flatten out. Given you can just lease or buy a SOTA model from leading vendors, there's no advantage to training your own right now. My guess is that the LLM/AI landscape will look entirely different by 2030 and any 5-year plan won't be in the same zip code, let alone playing field. Leasing an LLM from Google with a support contract seems like a pretty smart short-term play as things continue to evolve over the next 2-3 years.

        • IgorPartola3 hours ago
          This is the key. The real issue is that you don't need superhuman intelligence in a phone AI assistant; most of the time you don't need it at all. Current SOTA models do a decent job of approximating college-grad-level human intelligence, let's say 85% of the time, which is helpful and cool but clearly could be better. But the pace at which models are getting smarter is accelerating AND they are getting more energy- and memory-efficient. So if something like DeepSeek is roughly 2 years behind the SOTA models from Google and the other frontier labs, then in 2030 you can expect 2028-level performance out of open models. There will come a time when a model capable of college-grad-level intelligence 99.999% of the time will run on a $300 device. If you are Apple you do not need to lead the charge on a SOTA model; you can just wait until one is available for much cheaper. Your product is the devices and services consumers buy. If you are OpenAI you have no other products. You must become THE AI to have in an industry that will, in the next few years, be dominated by open models that are good enough, or else close up shop or come up with another product that has more of a moat.
          • ipaddr2 hours ago
            "pace at which the models are getting smart is accelerating". The pace is decelerating.
      • VirusNewbie44 minutes ago

        > LLMs are now commodities and the least important component of the intelligence system Apple is building

        If that was even remotely true, Apple, Meta, and Amazon would have SoTA foundational models.
      • bigyabai5 hours ago
        That's not an "obligatory HN dig" though; you're watching, in medias res, X escape removal from the App Store and Play Store. Concepts like privacy, legality and high-quality software are all theater. We have no altruists defending these principles for us at Apple or Google.

        Apple won't switch Google out as a provider for the same reason Google is your default search provider. They don't give a shit about how many advertisements you're shown. You are actually detached from 2026 software trends if you think Apple is going to give users significant backend choices. They're perfectly fine selling your attention to the highest bidder.

        • theshrike7943 minutes ago
          There are second-order effects of Google or Apple removing Twitter from their stores.

          Guess who's the bestie of Twitter's owner? Any clues? Could that be a vindictive old man with unlimited power and no checks and balances to temper his tantrums?

          Of course they both WANT Twitter the fuck out of the store, but there are very very powerful people addicted to the app and what they can do with it.

        • kennywinker3 hours ago
          Caveat: as long as it doesn’t feel like you’re being sold out.

          Which is why privacy theatre was an excellent way to put it

        • yunohn3 hours ago
        Apple's privileged device-level ads, trials that stop the instant you cancel, and special notification rules for their own paid services (Fitness+, Music, Arcade, iCloud+, etc.) are all proof that they do not care about the user anymore.
    • concinds6 hours ago
      An Apple-developed LLM would likely be worse than SOTA, even if they dumped billions on compute. They'll never attract as much talent as the others, especially given how poorly their AI org was run (reportedly). The weird secrecy will be a turnoff. The culture is worse and more bureaucratic. The past decade has shown that Apple is unwilling to fix these things. So I'm glad Apple was forced to overcome their Not-Invented-Here syndrome/handicap in this case.
      • microtherion2 hours ago
        Reportedly, Meta is paying top AI talent up to $300M for a 4-year contract. As much as I'm in favor of paying engineers well, I don't think salaries like this (unless they are across the board for the company, which they are of course not) are healthy for the company long term (cf. Anthony Levandowski, who had money thrown at him by Google, only to rip them off).

        So I'm glad Apple is not trying to get too much into a bidding war. As for how well orgs are run, Meta has its issues as well (cf the fiasco with its eponymous product), while Google steadily seems to erode its core products.

      • blitzar6 hours ago
        Apple might have gotten very lucky here ... the money might be in finding uses, and selling physical products rather than burning piles of cash training models that are SOTA for 5 minutes before being yet another model in a crowded field.

        My money is still on Apple and Google to be the winners from LLMs.

        • Melatonic5 hours ago
          Apple has also never been big on the server side, in either software or hardware - don't they already outsource most of their cloud stack to Google via GCP?

          I can see them eventually training their own models (especially smaller and more targeted / niche ones) but at their scale they can probably negotiate a pretty damn good deal renting Google TPUs and expertise.

          • ghaff3 hours ago
            Xserve was always kind of a loss. Wrote a piece about it a number of years back. It became pretty much a commodity business--which isn't Apple.
            • no_wizard3 hours ago
              I always wondered what they were hoping for with their server products back when they had them. Consumers and end users benefit greatly from the vertical integration that Apple is good at. This doesn't translate to servers. Commodity hardware + Linux is not only cheaper, it's often easier, and was definitely less proprietary.

              It's also a race-to-the-bottom type scenario. Apple would never have been able to keep up with server release schedules.

              Was an interesting but ultimately odd moment of history for servers.

            • pstuartan hour ago
              With Thunderbolt 5 and M5 Ultras, Apple could be building lower cost clusters that could possibly scale enough while keeping a lower power budget. Obviously that can't compete with NVIDIA racks, but for mobile consumer inference maybe that would be enough?
        • lamontcg5 hours ago
          And when the cost of training LLMs starts to come down to under $1B/yr, Apple can jump on board, having saved >$100B in not trying to chase after everyone else to try to get there first.
    • maxloh6 hours ago
      Is the training cost really that high, though?

      The Allen Institute (a non-profit) just released the Molmo 2 and Olmo 3 models. They trained these from scratch using public datasets, and they are performance-competitive with Gemini in several benchmarks [0] [1].

      AMD was also able to successfully train an older version of OLMo on their hardware using the published code, data, and recipe [2].

      If a non-profit and a chip vendor (training for marketing purposes) can do this, it clearly doesn't require "burning 10 years of cash flow" or a Google-scale TPU farm.

      [0]: https://allenai.org/blog/molmo2

      [1]: https://allenai.org/blog/olmo3

      [2]: https://huggingface.co/amd/AMD-OLMo

      • PunchyHamsteran hour ago
        my prediction is that they might switch once the AI craze simmers down to a more reasonable level
      • turtlesdown116 hours ago
        No, of course the training costs aren't that high. Apple's ten years of future free cash flow is greater than a trillion dollars (they are above $100b per year). Obviously, the training costs are a trivial amount compared to that figure.
        • ufmacean hour ago
          What I'm wondering - their future cash flow may be massive compared to any conceivable rational task, but the market for servers and datacenters seems to be pretty saturated right now. Maybe, for all their available capital, they just can't get sufficient compute and storage on a reasonable schedule.
        • bombcar4 hours ago
          I have no idea what AI involves, but "training" sounds like a one-and-done - but how is the result "stored"? If you have trained up a Gemini, can you "clone" it and if so, what is needed?

          I was under the impression that all these GPUs and such were needed to run the AI, not only ingest the data.

          • DougBTX2 hours ago
            > but how is the result "stored"

            The weights are just a big file of tensors that you can copy like any other file. Like this: https://huggingface.co/docs/safetensors/index

          • esafak4 hours ago
            Yes, serving requires infra, too. But you can use infra optimized for serving; nvidia GPUs are not the only game in town.
          • tefkah4 hours ago
            Theoretically it would be much less expensive to just continue to run the existing models, but ofc none of the current leaders are going to stop training new ones any time soon.
            • bombcaran hour ago
              So are we on a hockey stick right now where a new model is so much better than the previous that you have to keep training?

              Because almost every example of previous cases of things like this eventually leveled out.

        • amelius3 hours ago
          Hiring the right people should also be trivial with that amount of cash.
      • lostmsu3 hours ago
      No, it doesn't beat Gemini in any benchmarks. It beats Gemma, which isn't SOTA even among open models of that size. That would be Nemotron 3 or GPT-OSS 20B.
    • robotresearcheran hour ago
      For some context with numbers, in mid-2024 Apple publicly described 3B parameter foundation models. Gemini 3 Pro is about 1T today.

      https://machinelearning.apple.com/research/apple-intelligenc...
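
      Back-of-the-envelope, taking the ~1T figure at face value and assuming ~4-bit weights (a common on-device quantization):

          3e9 params  × 0.5 bytes ≈ 1.5 GB   (fits on a phone)
          1e12 params × 0.5 bytes ≈ 500 GB   (datacenter territory, before any KV cache)

      That's roughly the gap between what Apple described then and what Gemini-class serving needs now.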

    • drob5187 hours ago
      Yea, I think it’s smart, too. There are multiple companies who have spent a fortune on training and are going to be increasingly interested in (desperate to?) see a return from it. Apple can choose the best of the bunch, pay less than they would have to to build it themselves, and swap to a new one if someone produces another breakthrough.
      • Fiveplus7 hours ago
        100%. It feels like Apple is perfectly happy letting the AI labs fight a race to the bottom on pricing while they keep the high-margin user relationship.

        I'm curious if this officially turns the foundation model providers into the new "dumb pipes" of the tech stack?

        • drob5187 hours ago
          It’ll be interesting to see how it plays out. The question is, what’s the moat? If all they have is scaling to drive better model performance, then the winner is just whoever has the lowest cost of capital.
          • ivell6 hours ago
            Google seems to thrive on commodity products. Search, email, etc.

            It is their strength to take commodity products and scale them well.

          • raw_anon_11117 hours ago
            This isn’t a mystery - it’s Google
            • drob5185 hours ago
              Yea, I think that’s probably right, unless something unexpected changes the game.
        • whywhywhywhy4 hours ago
          As if they really have a choice though. Competing would be a billion dollar Apple Maps scenario.
    • LeoPanthera2 hours ago
      Google says: "Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards."

      So what does it take? How many actual commitments to privacy does Apple have to make before the HN crowd stops crowing about "theater"?

    • Sevii3 hours ago
      Apple's goal is likely to run all inference locally. But models aren't good enough yet and there isn't enough RAM in an iPhone. They just need Gemini to buy time until those problems are resolved.
      • O5vYtytb8 minutes ago
        Well DRAM prices aren't going down soon so I see this as quite the push away from local inference.
      • kennywinker3 hours ago
        That was their goal, but in the past couple years they seem to have given up on client-side-only ai. Once they let that go, it became next to impossible to claw back to client only… because as client side ai gets better so does server side, and people’s expectations scale up with server side. And everybody who this was a dealbreaker for left the room already.
    • hmokiguess5 hours ago
      I always think about this, can someone with more knowledge than me help me understand the fragility of these operations?

      It sounds like the value of these very time-consuming, resource-intensive, and large scale operations is entirely self-contained in the weights produced at the end, right?

      Given that we have a lot of other players enabling this in other ways, like Open Sourcing weights (West vs East AI race), and even leaks, this play by Apple sounds really smart and the only opportunity window they are giving away here is "first to market" right?

      Is it safe to assume that eventually the weights will be out in the open for everyone?

      • bayarearefugee2 hours ago
        > and the only opportunity window they are giving away here is "first to market" right?

        A lot of the hype in LLM economics is driven by speculation that eventually training these LLMs is going to lead to AGI and the first to get there will reap huge benefits.

        So if you believe that, being "first to market" is a pretty big deal.

        But in the real world there's no reason to believe LLMs lead to AGI, and given the fairly lock-step nature of the competition, there's also not really a reason to believe that even if LLMs did somehow lead to AGI that the same result wouldn't be achieved by everyone currently building "State of the Art" models at roughly the same time (like within days/months of each other).

        So... yeah, what Apple is doing is actually pretty smart, and I'm not particularly an Apple fan.

      • pests3 hours ago
        > is entirely self-contained in the weights produced at the end, right?

        Yes, and the knowledge gained along the way. For example, the new TPUv4 that Google uses requires rack and DC aware technologies (like optical switching fabric) for them to even work at all. The weights are important, and there is open weights, but only Google and the like are getting the experience and SOTA tech needed to operate cheaply at scale.

    • ceejayoz6 hours ago
      > I'm oversimplifying but this effectively turns the iPhone into a dumb terminal for Google's brain, wrapped in Apple's privacy theater.

      This sort of thing didn't work out great for Mozilla. Apple, thankfully, has other business bringing in the revenue, but it's still a bit wild to put a core bit of the product in the hands of the only other major competitor in the smartphone OS space!

      • apercu6 hours ago
        I dunno, my take is that Apple isn't outsourcing intelligence; rather, it's outsourcing the most expensive, least defensible layer.

        Down the road Apple has an advantage here: a super large training data set that includes messages, mail, photos, calendar, health, app usage, location, purchases, voice, biometrics, and your behaviour over YEARS.

        Let's check back in 5 years and see if Apple is still using Gemini or if Apple distills, trains and specializes until they have completed building a model-agnostic intelligence substrate.

    • aurareturn6 hours ago
      Seems like there is a moat after all.

      The moat is talent, culture, and compute. Apple doesn't have any of these 3 for SOTA AI.

      • elzbardico5 hours ago
        It is more like Apple has no need to spend billions on training with questionable ROI when it can just rent from one of the commodity foundation model labs.
        • nosman4 hours ago
          I don't know why people automatically jump to Apple's defense on this... They absolutely did spend a lot of money and hired people to try this. They 100% do NOT have the open and bottom-up culture needed to pull off large-scale AI and software projects like this.

          Source: I worked there

          • elzbardico3 hours ago
            Well, they stopped.

            Culture is overrated. Money talks.

            They did things far more complicated from an engineering perspective. I am far more impressed by what they accomplished alongside TSMC with Apple Silicon than by what the AI labs do.

            • tech-historianan hour ago
              Is Apple silicon really that impressive compared to LLMs? Take a step back. CPUs have been getting faster and more efficient for decades.

              Google invented the transformer architecture, the backbone of modern LLMs.

      • jpfromlondon2 hours ago
        is it that surprising? they're a hardware company after all.
    • overfeed5 hours ago
      > The writing was on the wall the moment Apple stopped trying to buy their way into the server-side training game like what three years ago?

      It goes back much further than that - up until 2016, Apple wouldn't let its ML researchers add author names to published research papers. You can't attract world-class talent in research with a culture built around paranoid secrecy.

    • segmondy5 hours ago
      10 years worth of cash? So all these Chinese labs that came out and did it for less than $1 billion must have 3 heads per developer, right?
      • andreyf2 hours ago
        Rumor has it that they weren't trained "from scratch" the way US labs would, i.e. Chinese labs benefitted from government-"procured" IP (the US $B models) in order to train their $M models. I also understand there to be real innovation in the many-expert MoE architecture on top of that. Would love to hear a more technical understanding from someone who does more than repeat rumors, though.
      • 4fterd4rk27 minutes ago
        A lot of HN commentators are high on their own supply with regard to the AI bubble... when you realize that this stuff isn't actually that expensive the whole thing begins to quickly unravel.
    • dabockster6 hours ago
      It also lets them keep a lot of the legal issues regarding LLM development at arms length while still benefiting from them.
    • chatmasta2 hours ago
      It’s also a bet that the capex cost for training future models will be much lower than it is today. Why invest in it today if they already have the moat and dominant edge platform (with a loyal customer base upgrading hardware on 2-3 year cycles) for deploying whatever future commoditized training or inference workloads emerge by the time this Google deal expires?
    • ysnp7 hours ago
      Could you elaborate a bit on why you've judged it as privacy theatre? I'm skeptical but uninformed, and I believe Mullvad are taking a similar approach.
      • natch5 hours ago
        They transitioned from “nobody can read your data, not even Apple” to “Apple cannot read your data.” Think about what that change means. And even that is not always true.

        They also were deceptive about iCloud encryption where they claimed that nobody but you can read your iCloud data. But then it came out after all their fanfare that if you do iCloud backups Apple CAN read your data. But they aren’t in a hurry to retract the lie they promoted.

        Also if someone in another country messages you, if that country’s laws require that Apple provide the name, email, phone number, and content of the local users, guess what. Since they messaged you, now not only their name and information, but also your name and private information and message content is shared with that country’s government as well. By Apple. Do they tell you? No. Even if your own country respects privacy. Does Apple have a help article explaining this? No.

        • threatofrain5 hours ago
          If you want to turn on full end-to-end encryption you can; if you want to share your pubkey so that people can't fake your identity on iMessage you can; and there's still a higher tier of security beyond that, presumably for journalists and important people.

          It's something a smart niece or nephew could handle in terms of managing risk, but the implications could mean getting locked out of your device which you might've been using as the doorway to everything, and Apple cannot help you.

        • dpoloncsak5 hours ago
          >Also if someone in another country messages you, if that country’s laws require that Apple provide the name

          I don't mean to sound like an Apple fanboy, but is this true just for SMS or iMessage as well? It's my understanding that for SMS, Apple is at the mercy of governments and service providers, while iMessage gives them some wiggle room.

          Anecdotal, but when my messages were subpoenaed, it was only the SMS messages. US citizen fwiw

        • richwater3 hours ago
          You people will never be happy until the only messaging that exists is in a dusty basement and Richard Stallman is sleeping on a dirty futon.
        • classicsc4 hours ago
          [dead]
      • greentea236 hours ago
        Mullvad is nothing like Apple. For apple devices:

        - need real email and real phone number to even boot the device
        - cannot disable telemetry
        - app store apps only, even though many key privacy preserving apps are not available
        - /etc/hosts are not your own, DNS control in general is extremely weak
        - VPN apps on idevices have artificial holes
        - can't change push notification provider
        - can only use webkit for browsers, which lacks many important privacy preserving capabilities
        - need to use an app you don't trust but want to sandbox it from your real information? Too bad, no way to do so.
        - the source code is closed so Apple can claim X but do Y, you have no proof that you are secure or private
        - without control of your OS you are subject to Apple complying with the government and pushing updates to serve them not you, which they are happy to do to make a buck

        Mullvad requires nothing but an envelope with cash in it and a hash code and stores nothing. Apple owns you.

        • Melatonic4 hours ago
          Agreed on most points, but you can set up a pretty solid device-wide DNS provider using configuration profiles. Similar to how iOS can be enrolled in corporate MDM - but under your control.

          Works great for me with NextDNS.

          Orion browser - while also based on WebKit - is also awesome, with great built-in ad blocking and supposedly privacy-respecting ideals.

          • greentea233 hours ago
            Apple has records that you are installing that, probably putting you on a list.

            And it works until it's made illegal in your country and removed from the app store. You have no guarantees that anything that works today will work tomorrow with Apple.

            Apple is setting us up to be under a dictator's thumb one conversion at a time.

        • MrDarcy6 hours ago
          This comment confuses privacy with anonymity.
          • asadotzler2 hours ago
            Anonymity is a critical aspect of privacy. If you cannot prevent your name being associated with your data, you do not have real privacy.
          • whilenot-dev5 hours ago
            Anonymity is an inherent measure to preserve one's individual privacy. What value did you intend to add with your remark?
          • greentea233 hours ago
            Not for all points. And not being anonymous means your identity is not private...
        • apparent3 hours ago
          You do not need an email address to set up an iPhone, and you do not need an email address or phone number to set up an iPad/Mac.

          If you want to use the App Store on these devices, you do need to have an email address.

      • drnick16 hours ago
        Because Apple makes privacy claims all the time, but all their software is closed source and it is very hard or impossible to verify any of their claims. Even if messages sent between iPhones are end-to-end encrypted, for example, the client apps and the operating system may be backdoored (and likely are).

        https://en.wikipedia.org/wiki/PRISM

      • tempodox7 hours ago
        The gov’t can force them to reveal any user’s data and slap them with a gag order so no one will ever know this happened.
        • MontyCarloHall6 hours ago
          All user data is E2E encrypted, so the government literally cannot force this. This has been the source of numerous disputes [0, 1] that either result in the device itself being cracked [0] (due to weak passwords or vulnerabilities in device-level protection) or governments attempting to ban E2E encryption altogether [1].

          [0] https://en.wikipedia.org/wiki/Apple%E2%80%93FBI_encryption_d...

          [1] https://en.wikipedia.org/wiki/Crypto_Wars

          • mmh00006 hours ago
            Maybe E2E, but the data eventually has to be decrypted to read it.

            Then you learn that every modern CPU has a built-in backdoor, a dedicated processor core, running a closed-source operating system, with direct access to the entire system RAM, and network access. [a][b][c][d].

            You can not trust any modern hardware.

            https://en.wikipedia.org/wiki/Intel_Management_Engine

            https://en.wikipedia.org/wiki/AMD_Platform_Security_Processo...

            https://en.wikipedia.org/wiki/ARM_architecture_family#Securi...

            https://en.wikipedia.org/wiki/Security_and_privacy_of_iOS

          • greentea236 hours ago
            What you cited is for data on a device that was turned off. Not daily internet connected usage. No one is saying you have no protection at all with Apple, it is just very limited compared to what it should be by modern security best practices, and much worse than what can be achieved on android and linux.
            • nozzlegear2 hours ago
              > much worse than what can be achieved on android and linux.

              * Certain types of Android

          • natch5 hours ago
            E2E encrypted is nothing if key escrow is happening.

            Why did they change their wording from:

            Nobody can read your data, not even Apple

            to:

            Apple cannot read your data.

            You know why.

            • ajam1507an hour ago
              When did they change their wording?
            • nozzlegear2 hours ago
              If they didn't want you to think key escrow might be possible, why wouldn't they just leave the wording the way it was? Why go through the effort and thereby draw attention to it? The court system doesn't use sovcit rules where playful interpretation of wording can get a trillion dollar corporation out of a lawsuit or whatever.
    • Melatonic5 hours ago
      Personally I also think it's a very smart move - Google has TPUs and will do it more efficiently than anyone else.

      It also lets Apple stand by while the dust settles on who will out-innovate in the AI war - they could easily enter the game in a big way much later on.

    • hadlock5 hours ago
      Seems like the LLM landscape is still evolving, and training your own model provides no technical benefit as you can simply buy/lease one, without the overhead of additional eng staffing/datacenter build-out.

      I can see a future where LLM research stalls and stagnates, at which point the ROI on building/maintaining their own commodity LLM might become tolerable. Apple has had Siri as a product/feature for the better part of a decade and has proven that voice assistants are not something they're willing to build a proficiency in. My wife has had an iPhone for at least a decade now, and I've heard her use Siri perhaps twice in that time.

    • PunchyHamsteran hour ago
      > To me, this deal is about the bill of materials for intelligence. Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own. Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence. Am I missing the elephant in the room?

      Probably not missing the elephant. They certainly have the money to invest, and they do like vertical integration, but putting a massive investment into a bubble that can pop or flatline at any point seems pointless if they can just pay to use the current best, and in the future switch to something cheaper or buy some of the smaller AI companies that survive the purge.

      Given how AI-capable their hardware is, they might just move most of it locally too.

    • stronglikedan5 hours ago
      > Seems like they are pivoting to becoming the premium "last mile" delivery network for someone else's intelligence.

      They have always been a premium "last mile" delivery network for someone else's intelligence, except that "intelligence" was always IP until now. They have always polished existing (i.e., not theirs) ideas and made them bulletproof and accessible to the masses. Seems like they intend to just do more of the same for AI "intelligence". And good for them, as it is their specialty and it works.

    • haritha-j7 hours ago
      Agreed, especially since this is a competitive space with multiple players, with a high price of admission, and where your model is outdated in a year, so it's not even capex so much as recurring expenditure. Far better to let someone else do all the hard work and wait and see where things go. Maybe someday this will be a core competency you want in-house, but when that day comes you can make that switch, just like with Apple Silicon.
    • ChildOfChaos6 hours ago
      The trouble is this seems to me like a short-term fix. Longer term, once the models are much better, Google can just lock Apple out, take everything for themselves, and leave Apple nowhere and even further behind.
      • raw_anon_11115 hours ago
        Of course there is going to be an abstraction layer - this is like Software Engineering 101.

        Google really couldn't care less about Android being good. It is a client for Google search and Google services - just like the iPhone is a client for Google search and apps.

    • hashta2 hours ago
      This also addresses something else...

      Apple to some users: "Are you leaving for Android because of their AI assistant? Don't leave, we're bringing it to iPhone."

    • kernalan hour ago
      >Apple has the best edge inference silicon in the world (neural engine),

      Can you cite this claim? The Qualcomm Hexagon NPU seems to be superior in the benchmarks I've seen.

    • semiquaver5 hours ago

      > without burning 10 years of cash flow.

      Sorry to nitpick, but Apple's free cash flow is ~$100B/yr. Training a model to power Siri would not cost more than a trillion dollars.
    • baxuz4 hours ago
      > bill of materials for intelligence

      There is no intelligence

    • _joel7 hours ago
      > without burning 10 years of cash flow.

      Don't they have the highest market cap of any company in existence?

      • jayd166 hours ago
        You don't need to join every fight you see, even if you would do well.
      • fumblebee7 hours ago
        I believe both Nvidia and Google have higher market caps
      • turtlesdown116 hours ago
        They have the largest free cash flow (over $100 billion a year). Meta and Amazon have less than half that a year, and Microsoft/Nvidia are between $60b-70b per year. The statement reflects a poor understanding of their financials.
    • fooblaster7 hours ago
      Calling the Neural Engine the best is pretty silly. The best, perhaps, of what is uniformly a failed class of IP blocks - mobile inference NPU hardware. Edge inference on Apple is dominated by the CPUs and Metal, which don't use the NPU.
    • whereismyacc7 hours ago
      best inference silicon in the world generally or specialized to smaller models/edge?
      • properbrew7 hours ago
        Not even an Apple fan, but from what I've been testing for my dev use case (models only up to 14B), it absolutely rocks for general models.
        • whereismyacc5 hours ago
          That I can absolutely believe but the big competition is in enterprise gpt-5-size models.
    • scotty797 hours ago
      > without burning 10 years of cash flow.

      Wasn't Apple sitting on a pile of cash and having no good ideas what to spend it on?

      • ceejayoz6 hours ago
        That doesn't make lighting it on fire a great option.
      • internetter7 hours ago
        Perhaps spending it on inference that will be obsoleted in 6 months by the next model is not a good idea either.

        Edit: especially given that Apple doesn't do B2B, so all the spend would be just to make consumer products

        • greentea236 hours ago
          Apple of course does an enormous amount of B2B.
      • turtlesdown116 hours ago
        The cash pile is gone, they have been active in share repurchase.

        They still generate about ~$100 billion in free cash per year, that is plowed into the buybacks.

        They could spend more cash than every other industry competitor. It's ludicrous to say that they would have to burn 10 years of cash flow on trivial (relative) investment in model development and training. That statement reflects a poor understanding of Apple's cash flow.

  • Workaccount27 hours ago
    If nothing else, this was likely driven by Google being the most stable of the AI labs. Gemini is objectively a good model (whether it's #1 or #5 in ranking aside) so Apple can confidently deliver a good (enough) product. Also for Apple, they know their provider has ridiculously deep pockets, a good understanding and infrastructure in place for large enterprises, and a fairly diversified revenue stream.

    Going with Anthropic or OpenAI, despite on the surface having that clean Apple smell and feel, carries a lot of risk on Apple's part. Both companies are far underwater, liable to take risks, and liable to drown if they fall even a bit behind.

    • cush5 hours ago
      > Gemini is objectively a good model (whether it's #1 or #5 in ranking aside) so Apple can confidently deliver a good (enough) product

      Definitely. At this point, Apple just needs to get anything out the door. It was nearly two years ago that they sold a phone with features that still haven't shipped, on the promise that Apple Intelligence would come in two months.

      • baxtr2 hours ago
        What are the top 3 features you’re missing right now?
        • codepoet802 minutes ago
          ANY ability to answer simple questions without telling me to open Safari and read a webpage for myself...?
        • woahan hour ago
          Siri to function above the level of Dragon NaturallySpeaking '95
    • mbirth7 hours ago
      I was more thinking about this being driven by the fact that Google pays Apple $20B a year for being the pre-selected search engine and this way, Apple still gets $19B and a free AI engine on top.
      • asadotzler2 hours ago
        It was $20 billion years ago, back in 2022. There's little doubt it's closer to $25B now, perhaps more.
    • segmondy5 hours ago
      Yup, Anthropic has constant performance problems (not enough GPU), OpenAI is too messy with their politics and Altman.
    • tempodox6 hours ago
      Nothing about OpenAI smells clean.
    • knallfrosch6 hours ago
    Nothing about OpenAI is clean. Their complete org is controlled by Altman, who was able to rehire himself after he was fired.

    Anthropic doesn't have a single data centre; they rent from AWS/Microsoft/Google.

  • runjake4 hours ago
    Apple has seemingly confirmed that the Gemini models will run under their Private Cloud Compute and so presumably Google would not have access to Siri data.

    https://daringfireball.net/linked/2026/01/12/apple-google-fo...

    • cpeterso4 hours ago
      Neither Apple's nor Google's announcement says Siri will use Gemini models. Both announcements say, word for word, "Google’s technology provides the most capable foundation for Apple Foundation Models". I don't know what that means, but Apple and Google's marketing teams must have crafted that awkward wording carefully to satisfy some contractual nuance.
      • w10-13 minutes ago
        > "Google’s technology provides the most capable foundation for Apple Foundation Models"

        Beyond Siri, Apple Foundation Models are available as API; will Google's technologies thus also be available as API? Will Apple reduce its own investment in building out the Foundation models?

      • runjake2 hours ago
        Direct quote from Google themselves:

        "Apple Intelligence will continue to run on Apple devices and Private Cloud Compute, while maintaining Apple's industry-leading privacy standards."

      • Workaccount22 hours ago
        Apple likely wants to post-train a pre-trained model, probably along with some of Google's heavily NDA'd training techniques too.
      • Ninjinka3 hours ago
        Check again: https://x.com/NewsFromGoogle/status/2010760810751017017?s=20

        "These models will help power future Apple Intelligence features, including a more personalized Siri coming this year."

      • baxtr2 hours ago
        Mostly likely the wording was crafted by an artificially intelligent entity.
  • quitit7 hours ago
    This is a bit of a layer cake:

    1. The first issue is that there is significant momentum in calling Siri bad, so even if Apple releases a higher-quality version it will still be labelled bad. It can enhance the user's life and make their device easier to use, but the overall press will be cherry-picked examples where it did something silly.

    2. Basing Siri on Google's Gemini can help to alleviate some of that bad press, since a non-zero share of that doomer commentary comes from brand-loyalists and astroturfing.

    3. The final issue is that on-device Siri will never perform like server-based ChatGPT. So in a way it's already going to disappoint some users who don't realise that running something on mobile hardware involves compromises which aren't present on a server farm. To help illustrate that point: we even have the likes of John Gruber making stony-faced comparisons between Apple's on-device image generator toy (one that produces about an image per second) and OpenAI's server-farm-based image generator, which takes about 1-2 minutes per image. So if a long-running tech blogger can't find charity in those technical limitations, I don't expect users to.

    • JohnMakin6 hours ago
      Siri is objectively bad though. It isn't some vendetta. I am disabled and there are at least 50 different things that I'd love Siri to do that should be dead simple, yet it cannot. My favorite one was when I suffered a small but not serious fall and decided to test whether Siri could be alerted to call 911; while it was less than 6 feet away from me, it absolutely could not understand, let alone execute, my request. It's a lot of stuff like this. Its core functionality often just does not work.

      > The final issue is that on-device Siri will never perform like server-based ChatGPT. So in a way it's already going to disappoint some users who don't realise that running something on mobile device hardware is going to have compromises which aren't present on a server farm.

      For many years, Siri requests were sent to an external server. It still sucked.

      • margalabargala5 hours ago
        I don't think the parent said that Siri wasn't bad; on the contrary, it sounds like they agree.

        Their point is that if Apple totally scraps the current, bad, product called "Siri" and replaces it with an entirely different, much better product that is also named "Siri" but shares nothing but the name, people's perceptions of the current bad Siri will taint their impressions of the new one.

        • quitit3 hours ago
          It's pretty clear they tried their best to miss or reinterpret the points I made so they could talk about something else.
          • JohnMakin3 hours ago
            I'm sorry for whatever cynicism leads you to believe this. I don't believe there is an "astroturfer" problem with siri, and that is mostly what I was responding to. Sorry you missed that.
            • quitit2 hours ago
              Oh that riled you up? Too bad. Better get out your alt accounts to address this great injustice.
      • Workaccount22 hours ago
        I'd be skeptical about even a new LLM-based Siri being able to dial 911.

        These models tend to have a "mind of their own", and I can totally, absolutely, see a current SOTA LLM convincing itself it needs to call 911 because you asked it how to disinfect a cut.

        • array_key_firstan hour ago
          Ideally you have a layer before the LLM that filters out stuff the phone can do without an LLM. The LLM probably shouldn't even have the power to call 911, that should be a layer lower. And probably you don't want to send simple queries like "call XYZ" to the cloud, best to just do it locally.
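
          A hypothetical sketch of that routing layer (all names invented): safety-critical or trivially deterministic requests get handled by local code, and only the leftovers go to a cloud model.

              enum Route {
                  case emergencyServices          // handled by the OS dialer, never an LLM
                  case callContact(String)        // local contacts lookup + dial
                  case cloudLLM(String)           // everything else
              }

              func route(_ utterance: String) -> Route {
                  let text = utterance.lowercased()
                  if text.contains("911") || text.contains("emergency") {
                      return .emergencyServices
                  }
                  if text.hasPrefix("call ") {
                      return .callContact(String(text.dropFirst("call ".count)))
                  }
                  return .cloudLLM(utterance)
              }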
    • apparent3 hours ago
      There are many people who lament that Siri sucks but would be happy to admit if/when this changes. Even if it goes from super shitty (as evidenced by randomly calling people I have never called/texted when I ask it to call my wife) to "pretty good" I will be the first to admit that it is better. I look forward to it getting better and being able to use it more often.
    • mucle66 hours ago
      re 3: I doubt Google is going to hand over the weights to Apple to put on device.
      • MaysonL5 hours ago
        They wouldn’t fit.
      • quitit3 hours ago
        Nor was such a thing implied. The information in the various news articles about it also don't make that claim.
  • gnabgib8 hours ago
    Related: Apple nears $1B Google deal for custom Gemini model to power Siri (71 points, 2 months ago, 47 comments) https://news.ycombinator.com/item?id=45826975
    • johnthuss8 hours ago
      The biggest NEW thing here is that this isn't white-labeled. Apple is officially acknowledging that Google's model will be powering Siri. That explicit acknowledgment is a pretty big deal. It will make it harder for Apple to switch to its own models later on.
      • mdasen7 hours ago
        Where does it say that it won't be white-labeled?

        Yes, Apple is acknowledging that Google's Gemini will be powering Siri and that is a big deal, but are they going to be acknowledging it in the product or is this just an acknowledgment to investors?

        Apple doesn't hide where many of their components come from, but that doesn't mean that those brands are credited in the product. There's no "fab by TSMC" or "camera sensors by Sony" or "display by Samsung" on an iPhone box.

        It's possible that Apple will credit Gemini within the UI, but that isn't contained in the article or video. If Apple uses a Gemini-based model anonymously, it would be easy to switch away from it in the future - just as Apple had used both Samsung and TSMC fabs, or how Apple has used both Samsung and Japan Display. Heck, we know that Apple has bought cloud services from AWS and Google, but we don't have "iCloud by AWS and GCP."

        Yes, this is a more public announcement than Apple's display and camera part suppliers, but those aren't really hidden. Apple's dealings with Qualcomm have been extremely public. Apple's use of TSMC is extremely public. To me, this is Apple saying "hey CNBC/investors, we've settled on using Gemini to get next-gen Siri happening so you all can feel safe that we aren't rudderless on next-gen Siri."

        • a_paddy7 hours ago
          Apple won't take the risk of being blamed for AI answers being incorrect. They will attribute Google/Gemini so users know who to be mad at if it doesn't work as expected.
          • qnpnpmqppnp6 hours ago
            Apple is already taking the risk of being blamed for their own AI right now, though (an AI that is much more prone to incredibly dumb errors than Gemini), so I don't find it that obvious that they wouldn't just continue taking the blame for Siri as they already do, except with an actually smarter Siri.
        • HarHarVeryFunny5 hours ago
          If I were Google, I'd offer Apple a very significant discount to have visible "powered by Gemini" branding.
          • gallerdude4 hours ago
            I'm sure Apple is more than happy to pay the premium for cleanness.
            • HarHarVeryFunny3 hours ago
              Maybe they'd prefer it for aesthetics, but OTOH in iOS 18.2+ they support off-device ChatGPT and apparently refer to it as "ChatGPT" both in settings and when prompting the user to ask if they want to use it.

              If they do refer to it as "Gemini" then this is a huge win for Google, and huge loss for OpenAI, since it really seems that the "ChatGPT" brand is the only real "moat" that OpenAI have, although recently there has been about a 20% shift in traffic from ChatGPT to Gemini, so the moat already seems to be running dry.

      • Angostura7 hours ago
        I don't see why - iOS originally shipped with Google Maps as standard, for example. Macs shipped with Internet Explorer as standard before Safari existed
        • johnthuss7 hours ago
          The Google Maps situation is a great example of why this will be hard. When Apple switched to their own maps it was a huge failure resulting in a rare public apology from the company. In order to switch you have to be able to do absolutely everything that the previous solution offered without loss of quality. Given Google's competence in AI development that will be a high bar to meet.
          • thinkindie7 hours ago
            Several years after that they still have their own Maps, though; they didn't go back to Google Maps.
            • robertlagrant2 hours ago
              That's the point of what the person you replied to is saying.
              • thinkindiean hour ago
                it's hard but not impossible. Unless Apple didn't learn the Google Maps/Maps lesson.
          • eli7 hours ago
            Well, yeah, Apple's Maps.app wasn't good enough when it launched (it's solid now though). That feels like a separate thing from white labeling and lock-in. Obviously they would have to switch to something of similar or better quality or users will be upset.

            But it's a whole lot easier to switch from Gemini to Claude or Gemini to a hypothetical good proprietary LLM if it's white label instead of "iOS with Gemini"

            • heraldgeezer6 hours ago
              >it's solid now though

              Depends on where you are. In my experience here in Sweden Google Maps is still better, Apple maps sent us for a loop in Stockholm (literally {{{(>_<)}}} )

          • MBCook7 hours ago
            They switched despite Apple Maps having poor data for a reason:

            Google wanted to shove ads in it. Apple refused and had to switch.

            Their hand was forced by that refusal.

            • LexGray6 hours ago
              I thought it was Google refusing to provide turn-by-turn directions?

              Apple announced last year that they are putting their own ads in Maps, so if that was the real problem, corporate leadership has done a complete 180 on user experience.

              • array_key_firstan hour ago
                Apple does ads but they have a very particular taste with it. Not necessarily better taste, but they do it in their own Apple way. They're very much control freaks.
              • MBCook6 hours ago
                I think Google was withholding them unless Apple was willing to put the ads in.

                Apple is a very VERY different company than they were back then.

                Back then they didn’t have all sorts of services that they advertised to you constantly. They didn’t have search ads in the App Store. They weren’t trying to squeeze every penny out of every customer all the time no matter how annoying.

          • burnte7 hours ago
            The problem with the analogy is that users were asked to change their habits. Apple switching Siri models behind the scenes is much less problematic.
          • wat100006 hours ago
            It wouldn't have gone any better if the original mapping solution had been a white-labeled "Apple Maps" secretly powered by Google Maps.
          • drcongo7 hours ago
            I was in agreement with the parent before I read this, and now I'm in agreement with you. It is a great example, I know so many people who never switched back to Apple Maps because it was so poor initially. Personally I find it a considerably better experience than Google Maps these days, but those lost users still aren't coming back.
            • mathieuh7 hours ago
              Mobile digital mapping was already a useful thing though. Even though Apple Maps was initially a failure I still came back to it every so often to see how it was progressing and eventually it ended up pretty good.

              Maybe I'm weird but mobile assistants have never been useful for me. I tried Siri a couple of times and it didn't work. I haven't tried it since because even if it worked perfectly I'm not sure I'd have any use for it.

              I see it more like the Vision Pro. Doesn't matter how good the product ends up being, I just don't think it's something most people are going to have a use for.

              As far as I'm concerned no one has proved the utility of these mobile assistants yet.

            • 9rx7 hours ago
              In this case, though, Siri has already successfully scared off anyone who isn't willing to reevaluate products.
      • charliebwrites8 hours ago
        Why so?

        Apple explicitly acknowledged that they were using OpenAI’s GPT models before this, and now they’re quite easily switching to Google’s Gemini

        • johnthuss7 hours ago
          The ChatGPT integration was heavily gated by Apple and required explicit opt-in. That won't be the case with the Gemini integration. Apple wants this to just work. The privacy concerns will be mitigated because Apple will be hosting this model themselves in their Private Cloud Compute. This will be a much more tightly integrated solution than ChatGPT was.
          • Angostura7 hours ago
            And you don't think they will include an abstraction layer?
            • layer86 hours ago
              An abstraction layer doesn’t prevent Google from seeing the data. Last year the story was that Apple would be running a Google model on their (Apple’s) own server hardware.
              • Angostura6 hours ago
                Yes, and that's still the story, as far as I can tell. So an abstraction layer would let them swap out the underlying model
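
                In the simplest form, something like this (purely illustrative; none of these type names are Apple's):

                    protocol FoundationModelBackend {
                        func respond(to prompt: String) async throws -> String
                    }

                    struct GeminiBackend: FoundationModelBackend {
                        func respond(to prompt: String) async throws -> String {
                            // Would call the Google-trained model hosted in Private Cloud Compute.
                            return "(cloud model response)"
                        }
                    }

                    struct OnDeviceBackend: FoundationModelBackend {
                        func respond(to prompt: String) async throws -> String {
                            // Would run a small local model on the Neural Engine.
                            return "(local model response)"
                        }
                    }

                    // Swapping vendors is then a configuration change, not an OS rewrite.
                    let backend: FoundationModelBackend = GeminiBackend()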
        • hu37 hours ago
          I guess the question is, when are they going to use their own model?

          Surely research money is not the problem. Can't be lack of competence either, I think.

          • nothercastle7 hours ago
            I think they want it to work well with web search. That's why Google is the obvious choice. Also, their AI offering is at low risk of getting eliminated, whereas OpenAI could fail at any time.
          • LexGray6 hours ago
            There is just too much money being burned in AI for Apple to keep researchers. Also models have no respect for original art which leads to a branding issue of being a platform for artists.

            Apple is competent at timing when to step into a market and I would guess they are waiting for AI to evolve beyond being considered untrustworthy slop.

          • IOT_Apprentice7 hours ago
            It appears to be lack of competence given they lied about the initial features of Apple Intelligence.

            First, they touted features that no one actually built and then fired their AI figurehead “leader” who had no coherent execution plan—also, there appears to have been territorial squabbling going on, about who would build what.

            How on earth did Apple Senior Management allow this to unravel? Too much focus on Services, yet ignoring their absolute failures with Siri and the bullshit that was Apple Intelligence, when AI spending is in the trillions?

      • dewey7 hours ago
        Don't think that's an especially big deal, they've always included third party data in Siri or the OS which is usually credited (Example: Maps with Foursquare or TomTom, Flight information from FlightAware, Weather data and many more).
      • insin5 hours ago
        They can also put "Google" in the forever-necessary disclaimer

        Google AI can make mistakes

    • dylan6047 hours ago
      Is this another one of those AI deals where no real money changes hands? In this case, doesn't this just offset the fee Google pays Apple for having their search as the default on Apple devices?
      • asadotzleran hour ago
        I'll wager the accounting for the two contracts is separate. There may be stipulations that connect the two, but the payment from Google to Apple of $20B+/yr is a long-established contract (set of contracts, actually) that Apple would not jeopardize for the relatively small Apple-to-Google $1B/yr contract, one still unproven and which may not stand the test of time.

        So, yes, practically speaking, the Apple to Google payment offsets a tiny fraction of the Google to Apple payment, but real money will change hands for each and very likely separately.

      • aoeusnth16 hours ago
        So changing cash flows (fee money) isn't real enough now?
  • apitman3 hours ago
    > After careful evaluation, we determined that Google’s technology provides the most capable foundation for Apple Foundation Models

    Sounds like Apple Foundation Models aren't exactly foundational.

  • asadm5 hours ago
    OpenAI had it, they had the foot in the door with their integration last year with Siri. But they dropped that ball and many other balls.
    • czscout4 hours ago
      Yeah, I was really expecting them to just continue the partnership that Apple announced when the iPhone 16/iOS 18 came out, but I suppose it's been pretty much radio silence on both fronts since then. Although the established stability and good enough-ness that Google offers with Gemini are probably more than enough reason for Apple to pivot to them as a model supplier instead.
    • toasterlovin2 hours ago
      I'm sure hiring Jony Ive to design hardware for them didn't help.
    • jquery4 hours ago
      Yeah. Super disappointing. I may end up switching to Gemini entirely at this rate.
  • OJFordan hour ago
    My experience with Gemini (3 Flash) has been pretty funny, not awful (but worse than Kimi K2 or GPT 5.2 Mini), but it's just so much worse at (or rather hyper focused on) following my custom instructions, I keep getting responses like:

        The idiomatic "British" way of doing this ...
    
        Alternatively, for an Imperial-style approach, ...
    
        As a professional software engineer you really should ...
    
    in response to programming/Linux/etc. questions!

    (Because I just have a short blurb about my educational background, career, and geography in there, which with every other model I've tried works great to ensure British spelling, UK information, metric units, and cut the cruft because I know how to mkdir etc.)

    It's given me a good laugh a few times, but just about getting old now.

  • jmacd6 hours ago
    This is one of those announcements that actually just excites me as a consumer. We give our children HomePods as their first device when they turn 8 years old (Apple Watch at 10 years, laptop at 12), and in the 6 years I have been buying them, they have not improved one ounce. My kids would like to listen to podcasts, get information, etc. All stuff that a voice conversation with ChatGPT or Gemini can do today, but Siri isn't just useless -- it's actually quite frustrating!
    • 464931686 hours ago
      It’s absolutely insane that you can’t say “Siri, play my audiobook” and have it play the last audiobook you listened to. Like, come on.
      • http-teapot5 hours ago
        Or when you are driving, someone sends a yes-no question where the answer is no.

        Siri: Would you like to answer?

        Me: Yes

        Siri: ...

        Me: No + more words

        Siri: Ok (shuts off)

    • layer86 hours ago
      It remains to be seen what the existing HomePods will support. There’s been a HomePod hardware update in the pipeline for quite some time, and it appears like they are waiting for the new Siri to be ready.
    • knallfrosch6 hours ago
      Siri still can't play an Apple Music album when there is a song of the same name.

      Even "Play the album XY" leads to Siri only playing the single song. It's hilariously bad.

      • billti2 hours ago
        Or the even more frustrating:

        Me: "Hey Siri, play <well known hit song from a studio album that sold 100m copies"

        Siri: "OK, here's <correct song but a live version nobody ever listens to, or some equally obscure remix>"

        Given that these things are, at their core, probability machines... how? Why?

  • paxysan hour ago
    I wonder if we will see them take the final step and just make Gemini the default AI assistant on iPhone.

    Might sound crazy but remember they did exactly this for web search. And Maps as well for many years.

    This way they go from having to build and maintain Siri (which has negative brand value at this point) and pay Google's huge inference bills to actually charging Google for the privilege.

  • lolive2 hours ago
    What happens to on-device intelligence? Does it stay a massive part of the Apple Intelligence offer? Or can we expect everything to be offloaded to the cloud?
  • mark_l_watson2 hours ago
    Old news now I think, but good news. Except for my Apple Watch I have given up using Siri, but I use Gemini and think it is good in general, and awesome on my brother's Pixel phone.

    Because Apple Silicon is so good for LLM inferencing, I hope they also do a deal for small on-device Gemma models.

  • elzbardico5 hours ago
    Models are becoming commodities, and their economy doesn't justify the billions required to train a SOTA model. Apple just recognized that.
    • 2 hours ago
      undefined
  • Yash163 hours ago
    Yes, the Gemini models are really good right now—especially the image models. The Nano Banana Pro model looks super promising. I’m planning to integrate image generation into mobile apps and other platforms. Apple has massive distribution, but it still feels like they haven’t fully integrated this kind of tech yet, so users have to rely on third-party apps to get top-quality results.

    At the moment, I’m using https://picxstudio.com to generate 4K-quality images with the Nano Banana Pro model, but my goal is to build my own app that delivers the same level of quality and control.

    • 2 hours ago
      undefined
  • tolerance2 hours ago
    This morning I was wondering what happened to whatever arrangement I thought Apple had with OpenAI. In a way I think OpenAI is a competitor and “new money”. Pairing with Google makes sense especially considering that this is “normie-facing” technology. And from what I recall, a lot of Apple fans prefer “Hey Google” in their cars over CarPlay. Or something to that effect.
  • zeras6 hours ago
    This is actually a smart and common sense move by Apple.

    The non-hardware AI industry is currently in an R&D race to establish and maintain marketshare, but with Apple's existing iPhone, iPad and Mac ecosystem they already have a market share they control so they can wait until the AI market stabilizes before investing heavily in their own solutions.

    For now, Apple can partner with solid AI providers to provide AI services and benefits to their customers in the short term and then later on they can acquire established AI companies to jumpstart their own AI platform once AI technology reaches more long term consistency and standardization.

  • gehsty3 hours ago
    The actual transactions around this deal will be interesting - will Google simply withhold $1B from their search deal, or will they pay it and then have Apple pay it back (or a split)? I doubt we’ll even know.
  • 3 hours ago
    undefined
  • 3 hours ago
    undefined
  • hashta3 hours ago
    I’m a long time Android user and almost switched to iPhone last year. Mostly because I use macOS and wanted better integration and also wanted to try it. Another big factor was the AI assistant. I stayed with Android because I think Google will win here. Apple will probably avoid losing users to their biggest competitor by reaching rough parity using the same models
  • elzbardico5 hours ago
    Really, Siri is an agent. Agents thrive when the underlying model's capabilities are stronger, as that unlocks a series of other use cases that are hard to accomplish when the basic natural language processing layer is weak.

    The better the basic NLP tasks like named entity recognition, PoS tagging, Dependency Parsing, Semantic Role Labelling, Event Extraction, Constituency parsing, Classification/Categorization, Question Answering, etc, are implemented by the model layer, the farther you can go on implementing meaningful use-cases in your agent.

    Apple can now concentrate on making Siri a really useful and powerful agent.

  • kenjackson7 hours ago
    Somewhat surprising. AI is such a core part of the experience. It feels like a mistake to outsource it to arguably your biggest competitor.
    • crazygringo7 hours ago
      It's clear they don't have the in-house expertise to do it themselves. They aren't an AI player. So it's not a mistake, just a necessity.

      Maybe someday they'll build their own, the way they eventually replaced Google Maps with Apple Maps. But I think they recognize that that will be years away.

      • kenjackson6 hours ago
        I agree that they don't appear poised to do it themselves. But why not work with Meta or OpenAI (maybe a bit more questionable with MS) or some other player, rather than Google?
        • crazygringo6 hours ago
          The optics of working with Meta make it a non-starter. Apple symbolizes privacy, Meta the opposite.

          With OpenAI, will it even be around 3 years from now, without going bankrupt? What will its ownership structure look like? Plus, as you say, the MS aspect.

          So why not Google? It's very common for large corporations to compete in some areas and cooperate in others.

          • anonymouskimmer4 hours ago
            SORRY TO EVERYONE ELSE FOR GOING OFF TOPIC.

            I didn't see your 41-day-old reply until it was too late to comment on it. So here's a sarcastic "thanks" for ignoring what I wrote and telling me that exactly what I was complaining about is the solution to the problem I was complaining about.

            https://news.ycombinator.com/item?id=46114935

            1) I told you my household can't use Target or Amazon for unscented products, without costly remediation measures, BECAUSE EVEN SCENT-FREE ITEMS COME SMELLING FROM PERFUME CROSS-CONTAMINATION THANKS TO CLEANING, STORAGE, AND TRANSPORTATION CONDITIONS. SOMETIMES REALLY BADLY.

            FFS. If you are going to respond, first read.

            I also mentioned something other than "government intervention to dictate how products are made" as a solution to this issue, namely adequate segregation between perfumed and non-perfumed products.

            And I care less about my wallet than I do about my time and actual ability to acquire products that are either truly scent free, or like yesteryear, don't have everlasting fragrance fixatives.

            For people in my position, which make up a small percentage of the population (that still numbers in the millions), the free market has failed. We are a specialized niche that trades tips on how to make things tolerable.

            SORRY TO EVERYONE ELSE FOR GOING OFF TOPIC.

      • WithinReason6 hours ago
        Apple has surprisingly good-quality AI papers and a lot of work on bridging research and product.
    • asadotzleran hour ago
      Web search is a core part of browsing and Apple is Google's biggest competitor in browsers. Google is paying Apple about 25x for integrating Google Search in Safari as Apple will be paying Google to integrate Google's LLMs into Siri. If you think depending on your competitor is a problem, you should really look into web search where all the real money is today.
    • deergomoo3 hours ago
      > AI is such a core part of the experience

      For who? Regular people are quite famously not clamouring for more AI features in software. A Siri that is not so stupendously dumb would be nice, but I doubt it would even be a consideration for the vast majority of people choosing a phone.

    • gregoriol7 hours ago
      They could use it like Google Search, not as the first thing the user sees, but as a fallback
    • xtoilette7 hours ago
      How much of the two revenue streams overlap in reality?
  • dogmayor5 hours ago
    How soon does Elon sue them for not choosing xAI? He's gonna cry some nonsense about antitrust, just like he did last year when Apple partnered with OpenAI for the Apple Intelligence opt-in.
  • thayne7 hours ago
    This seems like a pretty significant anti-trust issue. One of the two mobile OS makers is using a product from the other for its AI assistance. And that means that basically all mobile devices will be using the same AI technology.

    I don't expect the current US government to do anything about it though.

    • qnpnpmqppnp6 hours ago
      What antitrust rule do you think would be breached?

      I admit I don't see the issue here. Companies are free to select their service providers, and free to dominate a market (as long as they don't abuse such dominant position).

      • benoau6 hours ago
        Gatekeeping - nobody else can be the default voice assistant or power Siri, so where does this leave eg OpenAI? The reason this is important is their DOJ antitrust case, about to start trial, has made this kind of conduct a cornerstone of their allegations that Apple is a monopoly.

        It also lends credence to the DOJ's allegation that Apple is insulated from competition - the result of failing to produce their own winning AI service is an exclusive deal to use Google while all competing services are disadvantaged, which is probably not the outcome a healthy and competitive playing field would produce.

        • its_ethan6 hours ago
          So because Apple chose not to spend money to develop its own AI, it must be punished for then choosing to use another company's model? And the reason that this is an issue is because both companies are large?

          This feels a little squishy... At what size of each company does this stop being an antitrust issue? It always just feels like a vibe check; people cite market cap or market share numbers, but there are no hard criteria (at least that I've seen) that actually define it (legally, not just someone's opinion).

          The result of that is that it's sort of just up to whoever happens to be in charge of the governing body overseeing the case, and that's just a bad system for anyone (or any company) to be subjected to. It's bad when actual monopolistic abuse is happening and the governing body decides to let it slide, and it's bad when the governing body has a vendetta or directive to just hinder certain companies/industries regardless of actual monopolistic abuse.

          • benoau5 hours ago
            > So because Apple chose not to spend money to develop its own AI, it must be punished for then choosing to use another company's model? And the reason that this is an issue is because both companies are large?

            No they were already being sued for antitrust violations, it just mirrors what they are accused of doing to exploit their platform.

            https://storage.courtlistener.com/recap/gov.uscourts.njd.544...

            • its_ethan5 hours ago
              So if it mirrors something they were already accused of (like you're saying), my questioning should be pretty easy to map onto that issue as well?

              It's the line of thinking that I'm trying to dig into more, not the specifics of this case. Now it feels like you're saying "this is anti-trust because someone accused them of anti-trust before".

              If that case was prosecuted and Apple was found guilty, I suppose you can point to it as precedent. But again, does it only serve as precedent when it's a deal between Apple and Google? Is it only a precedent when there's a case between two "large" companies?

              Again, this is all really squishy. If companies aren't allowed to outsource development of a feature once they pass some sense of "large", when does it apply? What about the $1T pharmaceutical company that wants to use AI modeling? They're a large, technically competent company; if Eli Lilly partnered with Gemini, would you be sitting here saying that they also are abusing a monopolistic position that prevents competition in the AI model space?

              • benoau4 hours ago
                > Now it feels like you're saying "this is anti-trust because someone accused them of anti-trust before".

                No it's antitrust because they have a failed product, but purely by virtue of shutting out competitors from their platform they have been able to turn three years of flailing around into a win-by-outsourcing. What would Siri's position be like today if they hadn't blocked default voice assistants? Would they be able to recover from their plight to dominate the market just by adopting Google's technology? How would that measure against OpenAI, Anthropic or just using Google directly? This is why it's an antitrust issue.

                • its_ethan3 hours ago
                  No other thoughts on my actual questions? You're just addressing one-off sentences from my responses.

                  "it's antitrust because they have a failed product" is objectively hilarious

                  > What would Siri's position be like today if they hadn't blocked default voice assistants?

                  Probably pretty much the same. What would Gemini's position be like today if they hadn't blocked out default voice assistants? You only get Gemini when you use Gemini, just like you only got Siri when you use Siri (up until this deal takes effect). Also Siri has used ChatGPT already, so I'm not even convinced this is a valid criticism. They already didn't block OpenAI from being part of Siri.

                  > Would they be able to recover from their plight to dominate the market just by adopting Google's technology?

                  This is relevant how?

                  > How would that measure against OpenAI, Anthropic or just using Google directly?

                  How would what measure against other ai models? How would their ability to recover from a lack of investing in a better "homemade" AI model differ if they used OpenAI instead of Gemini? How does that have anything to do with antitrust? That's a business case study type of question. Also, shouldn't they be allowed to recover from their own lack of developing a model by using the best tool available to them?

              • troupo3 hours ago
                In Japan you can run other voice assistants than Siri (well, at least some of the functionality like calling them up via a button shortcut): https://developer.apple.com/documentation/appintents/launchi...

                Why only in Japan? Because Japan forced them to: https://9to5mac.com/2025/12/17/apple-announces-sweeping-app-...

          • thayne4 hours ago
            > it must be punished for then choosing to use another companies model

            The problem isn't that they used another company's model. It's that they are using a model made by the only company competing with them in the market of mobile OS.

        • qnpnpmqppnpan hour ago
          > Gatekeeping - nobody else can be the default voice assistant or power Siri, so where does this leave eg OpenAI?

          Sorry if I'm missing the point but if Apple had picked OpenAI, couldn't you have made the same comment? "nobody else can be the default voice assistant or power Siri, so where does this leave eg Gemini/Claude?".

        • KerrAvon36 minutes ago
          IANAL, but I don't believe either of these things is a recognized concept in US antitrust law.
      • thayne4 hours ago
        Apple and Google have a duopoly on Mobile OS. If Apple uses Google's model for Siri, that means Apple and Google are using their duopoly in one market (mobile OS) to enforce a monopoly for Google in another (model for mobile personal assistant AI).
        • qnpnpmqppnpan hour ago
          They are in a duopoly on the Mobile OS market, with no other significant player available. Google would be the sole integrated mobile AI, though there are competitors available if customers wanted to switch (customers for such products being the OS companies buying the AI services, not the end-users).

          However I don't see the link, how they are "using their duopoly", and why "they" would be using it but only one of them benefits from it. Being a duopoly, or even a monopoly, is not against anti-trust law by itself.

  • 1vuio0pswjnm73 hours ago
    "Google already pays Apple billions each year to be the default search engine on iPhones. But that lucrative partnership briefly came into question after Google was found to hold an illegal internet search monopoly.

    In September, a judge ruled against a worst-case scenario outcome that could have forced Google to divest its Chrome browser business.

    The decision also allowed Google to continue to make deals such as the one with Apple."

    How much is Google paying Apple now?

    If these anti-competitive agreements^1 were public,^2 headlines could be something like,

    (A) "Apple agrees to use Google's Gemini for AI-powered Siri for $[payment amount]"

    Instead, headlines are something like,

    (B) "Apple picks Google's Gemini to run Ai-powered Siri"

    1. In other words, they are exclusive and have anticompetitive effects

    2. Neither CNBC nor I am suggesting that there is any requirement for the parties to make these agreements public. I am presenting a hypothetical relating to headlines, (A) versus (B), as indicated by the words "If" and "could".

    • ericmay3 hours ago
      It's probably anti-competitive, but I'm not sure about your argument which is that Apple and Google must disclose details of their business relationships just because they are Apple and Google. You could maybe argue something like this should be a requirement of publicly traded companies, but the long-term effect there would be fewer publicly traded companies so they don't have to disclose every deal they make.
  • TYPE_FASTER5 hours ago
    Apple and Google already have the search relationship. Makes sense for this to happen. Am curious what kind of data Google gets out of the deal.
    • ftchd3 hours ago
      none, it runs on Apple's cloud privately
      • sidibe2 hours ago
        And where does Apples cloud run?
  • lvl1552 hours ago
    This is the sad state of Apple right now. It is ridiculous that they basically had unlimited access to TSMC and achieved nothing in AI. Management is a joke.
    • estearum2 hours ago
      During his tenure as CEO, Tim Cook has added $700 million per day in enterprise value to Apple. Per day! For 14 years!
    • throwfaraway42 hours ago
      Ask the shareholders if they're a joke
      • asadotzleran hour ago
        Tim Cook replaced paying customers with Wall St. That's not a win for any of us except Apple shareholders, a number far, far lower than Apple users.
  • bflesch4 hours ago
    So this surely means that in the medium term Google will siphon off all of the iCloud data. A dark pattern here, a new EULA popup for the user to accept there, and just like with copilot on windows the users will "allow" Apple to share all data with Google.
    • Cockbrand3 hours ago
      I wouldn't expect this to happen, as Apple's resistance against this would be too strong. The data of Google's paying [enterprise] customers stays private as well, so the safeguards are in place already.
  • nashashmi5 hours ago
    I don't understand why Apple cannot implement their own LLM at the phone level for the easy pickings: settings control, app-specific shortcuts, or local data searching.

    I understand other things like image recognition, Wikipedia information, etc. require external data sets, and transferring over local data to that end can be a privacy breach. But the local stuff should be easy, at least in one or two languages.

    • coder5434 hours ago
      All signs are that they are doing exactly that. They already have an on-device LLM which powers certain features, and I expect they will have a better-trained version of that on-device model that comes out with the "new Siri" update.

      In the original announcement of the Siri revamp a couple of years ago, they specifically talked about having the on-device model handle everything it can, and only using the cloud models for the harder or more open-ended questions.

  • rootusrootus6 hours ago
    This is good for Siri, in many ways. But I was kind of hoping we would see a time soon when phone hardware became good enough to do nearly 100% of the Siri-level tasks locally rather than needing Internet access.
    • Someone12346 hours ago
      I suspect we'll see that; but Siri is in such a bad state of disrepair that Apple really needs something now while they continue to look for micro-scale LLM models that can run well-enough locally. The two things aren't mutually exclusive.

      The biggest thing Apple has to do is get a generic pipeline up and running, that can support both cloud and non-cloud models down the road, and integrate with a bunch of local tools for agent-style workloads (e.g. "restart", "audio volume", "take screenshot" as tools that agents via different cloud/local models can call on-device).

    • layer86 hours ago
      I don’t think there’s a clear boundary of “Siri-level” tasks. In particular, properly determining whether a task is “Siri-level” or not is likely to require off-device AI.
      • rootusrootus5 hours ago
        I'd hope it could be the other way around. Some stuff should be relatively straightforward -- summarizing notifications, emails, setting timers, things like that should be obviously on-device. But aside from that, I would hope that the on-device AI can make the determination on whether it is necessary to go to a datacenter AI for a better answer.

        But you may be right, maybe on-device won't be smart enough to decide it isn't smart enough. Though it does seem like the local LLMs have gotten awfully good.

        • layer84 hours ago
          I can see them going that route, but it would cause similarly annoying breaks in the flow as current Siri offering to delegate to ChatGPT, or on-device Siri deciding it can do the task but actually failing or doing it wrong. It certainly wouldn’t be an “it just works” experience.
  • jm_redwood6 hours ago
    Does anyone know what Apple's "Private Cloud Compute" servers actually are? I recall murmurings about racked M chips or some custom datacenter-only variant?

    I'm really curious how Apple is bridging the gap between consumer silicon and the datacenter scale stack they must have to run a customized Gemini model for millions of users.

    RDMA over Thunderbolt is cool for small lab clusters but they must be using something else in the datacenter, right?

  • toroszo2 hours ago
    Can they ask Siri to fix the Tahoe catastrophe, please?
  • beardyw7 hours ago
    " ... and Anthropic’s Clause."

    That will be their contract writing AI.

  • kachapopopow3 hours ago
    Given that Gemini 3 Pro is presumably a relatively small model, it wouldn't be too surprising to see an even more optimized model fit onto the latest iPhones. I wish we knew the data behind Gemini 3 Flash, because if my estimate that it's <50B parameters is true, holy shit.
    • mudkipdev3 hours ago
      Google has Gemini Nano for on-device capabilities but basically never uses it and defers to cloud models instead
  • locusofself4 hours ago
    I wonder if this will make my original HomePods interesting to talk to, or if they won't provide this on older devices.
    • Cockbrand3 hours ago
      Not sure if that's too much of a crutch for you, but it's quite easy to create an "Ask Gemini" shortcut that calls a Cloud Function and returns a spoken response. I use this on my HomePods all the time, and it's working great.
      • apparent2 hours ago
        How do you do this on a HomePod? I could definitely see Apple limiting this to newer hardware, as a way to bump sales.
  • golfer7 hours ago
    Is the era of Apple exceptionalism over? Has it been over for a while now?
  • Animatsan hour ago
    Will Apple and Google merge now? That would create a new #1, bigger than NVidia.

    It would take US antitrust approval, but under Trump, that's for sale.

  • soperj2 hours ago
    It's hilarious how Apple can't compete in the space and so many people here are just saying "Smart move by Apple" as if they had another choice at this point. It's not like they haven't tried.
    • pretext-12 hours ago
      If they wanted to, they could throw massive amounts of cash at it like Google and Facebook are, with the latter poaching Apple employees with $200 million pay packages: https://www.bloomberg.com/news/articles/2025-07-09/meta-poac...

      But why on earth would they do that? It's both cheaper and safer to buy Google's model, with whom they already have a longstanding relationship. Examples include the search engine deal, and using Google Cloud infrastructure for iCloud and other services. Their new "private cloud compute" already runs on GCP too, perfect! Buying Gemini just makes sense, for now. Wait a few years until the technology becomes more mature/stable and then replace it with their own for a reasonable price.

      • soperj2 hours ago
        Why did they even have Ruoming Pang on staff? Because they were trying. Failing, and then saying we're waiting is objectively hilarious.
      • asadotzleran hour ago
        No, they couldn't, because all the current and future training hardware is already tied up by contracts from the frontier labs. Apple could not simply buy its way in given how constrained the supply is.
  • seydor7 hours ago
    I guess this is just a continuation of the Search deal, and an admission that LLMs are replacing search.

    I can't wait for gemini to lecture me why I should throw away my android

  • baal80spam7 hours ago
    I like it. I like Gemini.
  • eimrine7 hours ago
    Why are they constantly so bad at AI but so good at everything else?
    • MontyCarloHall7 hours ago
      Because their focus on user privacy makes it difficult for them to train at scale on users' data in the way that their competitors can. Ironically, this focus on privacy initially stemmed from fumbling the ball on Siri: recall that Apple never made privacy a core selling point until it was clear that Siri was years behind Google's equivalent, which Apple then retroactively tried to justify by claiming "we keep your data private so we can't train on it the way Google can." The result was a vicious cycle: initially botch AI rollout -> justify that failure with a novel marketing strategy around privacy that only makes it harder to improve their AI capabilities -> botch subsequent AI rollouts as a result -> ...

      To be clear, I'd much rather have my personal cloud data private than have good AI integration on my devices. But strictly from an AI-centric perspective, Apple painted themselves into a corner.

      • potamic5 hours ago
        That's a poor justification. There are companies that will sell you all kinds of labelled data. OpenAI and Anthropic didn't train on their own user data.
      • tensor2 hours ago
        This is nonsense. You don't need Apple user data to build a good AI model, plenty of startups building base models have shown that. But even if you did it's nonsense as Apple has long had opt-in for providing data to train their machine learning models, and many of those models, like OCR or voice recognition, are excellent.
      • wat100006 hours ago
        Apple's privacy focus started long before the current AI wave. It got major public attention in the fight with the FBI over unlocking the San Bernardino shooter's phone. I don't think Google's equivalent even existed at that point.
    • jjtheblunt7 hours ago
      It's pretty Apple-ish to not jump into a frenzy and to wait for the turbulence to settle, I believe. Delegating to Gemini fits that theory?
      • dewey7 hours ago
        They tried to have an AI assistant before AI was a big thing... it's just pretty bad, and Siri never got better.

        If it suddenly got better, like they teased (some would say lied about) with Apple Intelligence, that would fit pretty well. That they now delegate it to Gemini is a defeat.

      • tibbar7 hours ago
        I mean, Siri has been bad for what, 15 years now? It does seem like a bit of an outlier.
        • wooger6 hours ago
          Siri got substantially worse over time, in fact. I swear it used to at least be able to give you answers to basic factual questions rather than just offering to google things.
        • raisedbyninjas6 hours ago
          Gemini only replaced Google assistant on Android a few weeks ago. I gave up on Google assistant a few years ago, but I'd guess it wasn't a worthwhile upgrade from Siri.
          • krupan2 hours ago
            Still using Google assistant after trying Gemini on my pixel about 6 months ago. It was not an assistant replacement, it couldn't even perform basic operations on my phone, it would just say something like, "I'm sorry, I'm just an LLM and I can't send text messages." Has that changed?
            • 34 minutes ago
              undefined
    • layer86 hours ago
      They aren’t so good at everything else either.
      • eimrine4 hours ago
        I wouldn't have lured in so many comments if I hadn't put it that way. Let's fish an answer out of that pool of Apple fanboys.
    • lunar_rover7 hours ago
      Apple is almost purely customer products, they don't have the resources to compete with the giants in this field.

      Their image classification happens on-device, in comparison Google Photos does that server side so they already have ML infra.

    • blibble6 hours ago
      have you used iOS 26?

      "liquid ass" is how most of my friends describe it

    • DetroitThrow7 hours ago
      It's been a long-running thing that Apple can't do software as well as competitors, though in my experience they've beaten Google and a few others at devex and UX in their mobile frameworks over time, despite initial roughness. Slow and steady might win this race eventually, too.
    • xnx6 hours ago
      There's no reason to think that Apple would have any more skill at making a frontier AI model than they do at making airplanes or growing soybeans. There's not much overlap between consumer electronics design and the expertise, data, training, and datacenters needed for AI.
      • its_ethan5 hours ago
        I feel like this ignores how big of a part the software is for those "consumer electronics" Apple is so good at making.

        Apple definitely has software expertise, maybe it's not as specialized into AI as it is about optimizing video or music editors, but to suggest they'd be at the same starting point as an agriculture endeavor feels dishonest.

    • mdasen7 hours ago
      I think that's the thing: Apple is good at very little, but they seem like they're good at "everything else" because they don't do much else. Lots of companies spread themselves really thin trying to get into lots of unrelated competencies and tons of products. Apple doesn't.

      Why does a MacBook seem better than PC laptops? Because Apple makes so few designs. When you make so few things, you can spend more time refining the design. When you're churning out a dozen designs a year, can you optimize the fan as well for each one? You hit a certain point where you say "eh, good enough." Apple's aluminum unibody MacBook Pro was largely the same design 2008-2021. They certainly iterated on it, but it wasn't "look at my flashy new case" every year. PC laptop makers come out with new designs with new materials so frequently.

      With iPhones, Apple often keeps a design for 3 years. It looks like Samsung has churned out over 25 phone models over the past year while Apple has 5 (iPhone, iPhone Plus, iPhone Pro, iPhone Pro Max, iPhone 16e).

      It's easy to look so good at things when you do fewer things. I think this is one of Apple's great strengths - knowing where to concentrate its effort.

      • jen206 hours ago
        This is some magical thinking. Even if Samsung took all their manpower, all their thought process and all their capital, they still couldn’t produce a laptop that competes with the MacBook (just to take one example), because they fundamentally don’t have any taste as a company.

        Hell, they can’t even make a TV this year that’s less shit than last year's version of it, and all that requires is doing literally nothing.

        • layer86 hours ago
          I haven’t seen a lot of good taste from Apple in recent years.
    • TiredOfLife6 hours ago
      > but so good at everything else?

      They aren't.

  • Havoc6 hours ago
    They already use GCP for storage so I guess there is some precedent for big ties between them
  • dubeye6 hours ago
    Oh God, please do this tomorrow.
  • 346792 hours ago
    Apple picks [ad company's] [ai ad server] to power Siri.
  • worldsavior2 hours ago
    How has Apple made some of the greatest phones in history, with amazing engineering and a lot more, yet can't make a simple model to run locally on the phone when many others have?
  • ProofHouse5 hours ago
    It tells you how bad their product management and engineering team is that they haven’t just decided to kill Siri and start from scratch. Siri is utterly awful and that’s an understatement, for at least half a decade.
  • dhruv30067 hours ago
    Didn't they make a deal with OpenAI sometime back?
  • spwa45 hours ago
    The question we're all waiting for ... for how many billions?
  • sodafountan5 hours ago
    Can someone explain to me how this was allowed to happen? Wasn't Siri supposed to be the leading AI agent not ten years ago? How was there such a large disconnect at Apple between what Siri could do and what "real" AI was soon to be capable of?

    Was this just a massive oversight at Apple? Were there not AI researchers at Apple sounding the alarm that they were way off with their technology and its capabilities? Wouldn't there be talk within the industry that this form of AI assistant would soon be looked at as useless?

    Am I missing something?

    • raw_anon_11115 hours ago
      Source: while I don’t have any experience with the inner workings of Siri, I have extensive experience with voice based automation with call centers (Amazon Connect) and Amazon Lex (the AWS version of Alexa).

      Siri was never an “AI agent”. With intent-based systems, you give the system phrases to match on (intents), and to fulfill an intent, all of the “slots” have to be filled. For instance, “I want to go from $source to $destination”, and then the system calls an API.

      There is no AI understanding - it’s a “1000 monkeys implementation”: you just start giving the system a bunch of variations and templates you want to match on, in every single language you care about, and map the intents to an API. That’s how Google and Alexa also worked pre-LLM. They just had more monkeys dedicated to creating matching sentences.

      Post-LLM, you tell the LLM what the underlying system is capable of and the parameters the API requires to fulfill an action, and the LLM can figure out the user’s intentions and ask follow-up questions until it has enough info to call the API. You can specify the prompt in English and it works in all of the languages that the LLM has been trained on.

      Yes I’ve done both approaches

  • didntknowyou5 hours ago
    I think it's good. Google has a record of being stable and of working with large partners (governments, etc.), and it avoids the controversial cult of Altman.
  • ChrisArchitect7 hours ago
    Previously only back in June the discussion was barely a mention of Google Gemini, and leaned more towards why weren't they doing it themselves:

    Apple weighs using Anthropic or OpenAI to power Siri

    https://news.ycombinator.com/item?id=44426643

  • ChrisArchitect6 hours ago
    Google release hints at this being more than just Siri:

    > Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology. These models will help power future Apple Intelligence features, including a more personalized Siri coming this year.

    ... https://blog.google/company-news/inside-google/company-annou...

    • tensor2 hours ago
      This is actually the most important part of this announcement, and excellent news. I was pretty disappointed that they were going with an existing player rather than building their own models. But this implies that they will continue to build their own base models, just using Gemini as a starting point, which is a pretty good solution.
  • Noaidi8 hours ago
    Guess I am not using Siri anymore…

    By the way, have any of you ever tried to delete and disabled Siri’s iCloud backup? You can’t do it.

    • jjice7 hours ago
      Why not? Apple's ChatGPT integration has been pretty explicitly anonymizing requests and doesn't require an account. Maybe I'm missing something.
    • yohannparis5 hours ago
      In the article they clearly mention that the Gemini model will be used for the Foundation Models running on device or on Apple's own servers. They are not sending Siri requests to Google's servers.
    • runjake7 hours ago
      Unless Apple is lying:

      On iPhone, Settings → iCloud → Storage → Siri → Disable and Delete

      Edit: Tried it. It works for me. Takes a minute though.

      • Noaidi2 hours ago
        I have a current case open with Apple with this issue. It does not work. And I don’t believe you. I’m sorry I just don’t believe you because Apple says there is a technical problem preventing this. That does not just affect me. Because I also tried it on three other phones of three other friends of mine and it does not work.
      • Noaidi7 hours ago
        TRY IT!
        • runjake6 hours ago
          I did. It works, as far as I can tell?
          • Noaidi3 hours ago
            You’re able to disable Siri in iCloud? Not turn off Siri, disable Siri back up in iCloud. And when you go back to it after not turning it on, it’s still off?
            • runjake2 hours ago
              Yes. But, I'm not surprised that you're having issues. In my experience managing a large enterprise, iCloud accounts seem to have all kinds of weird, account-specific issues.
    • hu37 hours ago
      You guys use Siri?
      • manuelmoreale7 hours ago
          My exact reaction every time I hear people discuss Siri. I don't think I've used it once in my life, and it's one of the first things I turn off every time I have a new device. So interesting to see how different people use the same devices in completely different ways.
        • moi23887 hours ago
          Siri is extremely useful. That is, if your use cases are limited to:

          - setting a timer

          - dictating a title to search on Apple TV

          • spinningarrow7 hours ago
            Creating calendar events and reminders too!

            A feature set that has remained unchanged since Siri’s launch…

          • mbirth6 hours ago
            You can use Siri to call custom Shortcuts which in turn can ask for more details if required. And now that Shortcuts can make use of the LLMs (Apple’s or ChatGPT), there are a lot more ways to make Siri smarter.
          • manuelmoreale7 hours ago
            Makes sense then; considering I set timers on my watch and I don't watch TV.
      • godzillabrennus7 hours ago
        I used it when it launched to figure out it was useless and haven't gone back.
      • redwall_hp6 hours ago
        For CarPlay, yes. I don't need a virtual assistant to do things I can do but worse; I need reliable voice controls to send messages, start phone calls, change the map destination and such with as little friction as possible.

        Siri needs faster and more flexible handling of Spotify, Google Maps and third-party messaging apps, not a slop generator.

      • rootusrootus6 hours ago
        Only for opening/closing the garage door, setting timers, and sending texts. What else do people use the digital assistants for?
      • jen206 hours ago
        Hundreds of times a day for HomeKit, though rarely anything else. It’s _mostly_ fine, provided there are no HomePods around.
      • Noaidi7 hours ago
        Only when I wake up in the middle of the night to ask it what is the current time of the dystopia. That and the calculator.
    • rvz7 hours ago
      You're using Siri? lmao

      That's the Internet Explorer of chatbots.

      • Angostura7 hours ago
        That's where people get confused - it's not a chatbot or an LLM - it's a voice command interface. Adding something to the shopping list, setting a timer, turning up the heating in the back room, playing some music, skipping a track, sending a message - it works perfectly well for - and that's what I use it for virtually every day.

        This work is to turn it into something else, more like a chatbot, presumably.

        • HarHarVeryFunny3 hours ago
          Siri is already transitioning from an intent-based NLU system to an LLM.

          In iOS 18.1 (on iPhone 15+) Siri is part intent-based, part on-device "Apple Intelligence" small LLM, and in iOS 18.2 it also supports off-device ChatGPT.

          This year Siri 2.0 is expected to ditch the legacy intent-based system and instead use just the small on-device Apple Intelligence LLM plus (opt-in) off-device Gemini (running in some private cloud).

      • Noaidi7 hours ago
        Jeez, I only use it for the time and for the calculator, and to ask it to call someone. I am shocked anyone thinks I used it for anything more than that.

        Also, I have never turned on Apple "Intelligence".

    • volemo8 hours ago
      You’ve used Siri before?! /j
  • 7 hours ago
    undefined
  • oojuliuso7 hours ago
    Steve Jobs rolling in his grave. The mortal enemy. Thermonuclear war.
    • benoau7 hours ago
      Enemies? Google contributes about 20% of Apple's profits annually through their default search engine deal, that's more profitable than just about everything they do or make except selling iPhones.

      > The U.S. government said Apple Chief Executive Officer Tim Cook and Google CEO Sundar Pichai met in 2018 to discuss the deal. After that, an unidentified senior Apple employee wrote to a Google counterpart that “our vision is that we work as if we are one company.”

      https://www.bloomberg.com/news/articles/2020-10-20/apple-goo...

    • 6 hours ago
      undefined
    • golfer6 hours ago
      The original iPhone came pre-loaded with Google Search, Maps, and YouTube. Jobs competed with Google, but he also knew Google had best-in-class products.
    • relium3 hours ago
      Jobs brokered a $150M deal with Apple's arch enemy Microsoft in 1997.
    • 6 hours ago
      undefined
  • willdotphipps3 hours ago
    Why couldn't Apple pull their finger out of their asses and make their own AI nonsense better than Crap GPT?