93 points by ishener 5 hours ago | 15 comments
  • jackyinger 2 minutes ago
    I get the feeling Apple is the next Intel.

    Intel went through a phase in the 2010s of buying gobs of companies with fancy tech and utterly failing to integrate those acquisitions.

    Even more fundamentally, Intel rested on its laurels of having good hardware and got bitten hard in the end. Something similar seems to be happening at Apple.

  • tchalla 3 hours ago
    > Notably, this is the second time CEO Aviad Maizels has sold a company to Apple. In 2013, he sold PrimeSense, a 3D-sensing company that played a key role in Apple’s transition from fingerprint sensors to facial recognition on iPhones. Q.ai launched in 2022 and is backed by Kleiner Perkins, Gradient Ventures, and others. Its founding team, including Maizels and co-founders Yonatan Wexler and Avi Barliya, will join Apple as part of the acquisition.

    Twice, well done!

    • tartoran 3 hours ago
      What kind of tech does Q.ai bring to the table?
      • causalmodels 3 hours ago
        " As first reported by Reuters, Apple has acquired Q.ai, an Israeli startup specializing in imaging and machine learning, particularly technologies that enable devices to interpret whispered speech and enhance audio in noisy environments."
        • cyrusradfar 2 hours ago
          [puts on tin foil]

          you mean something that improves the detection and transcription of voices when the person doesn't realize the mic is on, like when it's in our pocket?

          • golbez93 5 minutes ago
            that was my first thought, big bump to their ad program
        • mNovak 2 hours ago
          Maybe to allow sub-vocalized commands when wearing AirPods, for example? I think this was a theme in the later books of the Ender's Game series.
        • Noaidi 2 hours ago
          Yeah, so, I am never turning on Apple Intelligence...
          • tanseydavid 2 hours ago
            Hope they do not adopt the MS approach to updates, where every update shakes the Etch-a-Sketch on your settings.
  • clueless 4 hours ago
    Could Q.ai be commercializing the AlterEgo tech coming out of the MIT Media Lab? i.e. "detects faint neuromuscular signals in the face and throat when a person internally verbalizes words"

    Yep, looks like that is it. Recent patent from one of the founders: https://scholar.google.com/citations?view_op=view_citation&h...

    • mikestorrent 3 hours ago
      Yeah...

      Pardon the AI crap, but:

      > ...in most people, when they "talk to themselves" in their mind (inner speech or internal monologue), there is typically subtle, miniature activation of the voice-related muscles — especially in the larynx (vocal cords/folds), tongue, lips, and sometimes jaw or chin area. These movements are usually extremely small — often called subvocal or sub-articulatory activity — and almost nobody can feel or see them without sensitive equipment. They do not produce any audible sound (no air is pushed through to vibrate the vocal folds enough for sound). Key evidence comes from decades of research using electromyography (EMG), which records tiny electrical signals from muscles: EMG studies consistently show increased activity in laryngeal (voice box) muscles, tongue, and lip/chin areas during inner speech, silent reading, mental arithmetic, thinking in words, or other verbal thinking tasks

      So, how long until my AirPods can read my mind?
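
      As a very rough illustration of what that EMG-based detection boils down to (a hypothetical sketch assuming numpy/scipy, not anything Q.ai or the research quoted above actually describes): band-pass the raw signal to the usual surface-EMG range, take a short-window RMS envelope, and flag windows that rise well above the resting baseline. The sample rate, the 20-450 Hz band, and the 3x-median threshold are all made-up illustration values.

        import numpy as np
        from scipy.signal import butter, filtfilt

        FS = 1000  # assumed EMG sample rate, Hz

        def subvocal_activity(emg, fs=FS, band=(20.0, 450.0), win_s=0.1, threshold=3.0):
            """Return a boolean mask of short windows with elevated EMG activity."""
            # Band-pass to the typical surface-EMG range; drops drift and mains hum.
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            filtered = filtfilt(b, a, emg)

            # Short-window RMS envelope.
            win = int(win_s * fs)
            n = len(filtered) // win
            rms = np.sqrt(np.mean(filtered[:n * win].reshape(n, win) ** 2, axis=1))

            # Flag windows well above the resting (median) level.
            return rms > threshold * np.median(rms)

        rng = np.random.default_rng(0)
        rest = rng.normal(0, 1.0, 5 * FS)    # quiet baseline
        burst = rng.normal(0, 4.0, 2 * FS)   # simulated subvocal burst
        mask = subvocal_activity(np.concatenate([rest, burst, rest]))
        print(mask.sum(), "of", mask.size, "windows flagged")

      Actual subvocal-speech decoding would of course need multi-channel electrodes and a trained model on top of something like this; the envelope/threshold step is just roughly where it starts.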

  • deepfriedchokes 4 hours ago
    Sounds pretty invasive for privacy if this were ever paired with smart glasses in public.
    • Lammy 3 hours ago
      Hence the name, I assume.
  • Sir_Twist 3 hours ago
    “Q.ai is a startup developing a technology to analyze facial expressions and other ways for communication.”

    This is an interesting acquisition given their rumored Echo Show / Nest Hub competitor [1]. Maybe this is part of their (albeit flawed and delayed) attempt to revitalize the Siri branding under the Apple Intelligence marketing. When you have to say exactly the right words to Siri, or else she will add “Meeting at 10” as an all-day calendar event, people get frustrated, and the illusion of a “digital assistant” is lost for non-technical users. If this is the understanding Apple have of their customers’ perception of Siri, then maybe their thinking is that giving Siri more non-verbal, personable capability could be a differentiating factor in the smart hub market, along with the LLM rebuild. I could also see this tying into some sort of strategy for the Vision Pro.

    Now, whether this hypothetical differentiating factor is worth $2 billion, I’m not so sure, but I guess time will tell.

    [1] https://www.macrumors.com/2025/11/05/apple-smart-home-hub-20...

  • concavebinator 3 hours ago
    In case there are any Ender's Game fans here, the capability to understand micro-expressions reminds me of how Ender subvocalizes to Jane. Orson Scott Card predicted yet another technological norm.
    • danhite 2 hours ago
      Earlier credit is also due to Isaac Asimov, in Second Foundation (1953): "...

      The same basic developments of mental science that had brought about the development of the Seldon Plan, thus made it also unnecessary for the First Speaker to use words in addressing the Student.

      Every reaction to a stimulus, however slight, was completely indicative of all the trifling changes, of all the flickering currents that went on in another's mind. The First Speaker could not sense the emotional content of the Student's mind instinctively, as the Mule would have been able to do – since the Mule was a mutant with powers not ever likely to become completely comprehensible to any ordinary man, even a Second Foundationer – rather he deduced them, as the result of intensive training.

  • stefanos82 3 hours ago
    Why do I have a feeling that one of their reasons was so they can trademark "iQ", to match the iSomething "franchise", so to speak?
    • gralab 3 hours ago
      Apple dropped the "i" naming scheme many years ago.
      • sgjohnson 3 hours ago
        iCloud, iPad, iPhone, iMac, iMessage, iOS/iPadOS, iMovie?

        Granted, they are slowly but surely killing it, but it’s still going quite strong.

  • assaddayinh 4 hours ago
    The ability to impress CEOs and signal hotness to investors may not correlate at all with the ability to produce breakthrough technology. Thus companies like Google grow up unbought and then become ..
  • alecco 3 hours ago
    It's kind of sad watching Apple drift into irrelevancy. I know I'm not going to buy more products from them because nothing they have is worth the premium price.
  • loudandskittish an hour ago
    This story has Apple + $2B acquisition + AI

    ...how is this not at the top of the page?

  • bnchrch 4 hours ago
    Wake me up when they let one of these acqui-hires update Siri to be on par with a voice assistant I could make in an afternoon with off-the-shelf tools.
    • alighter 4 hours ago
      This. And next word prediction / autocorrect that doesn’t look like it’s from the previous century.
      • tobmlt 4 hours ago
        On both my Nokia and my BlackBerry it was far, far better than on my iPhone. That wasn't quite 199X, but pretty close.

        I wish the iPhone had word prediction and autocorrect from the previous century.

        • thewebguyd 3 hours ago
          BlackBerry's keyboards & autocorrect were top notch. Nothing has matched them yet on a purely virtual touch-screen keyboard.

          Crazy that the tech of typing out text on a smartphone had pretty much been perfected, and then it was all thrown away by the move to all-screen devices instead. A virtual keyboard with no tactile feel will never compare until screens can recreate the tactile bumps of a physical keyboard.

      • darth_avocado 4 hours ago
        Apple autocorrect has actually gotten worse over the last decade. Before, it used to be duck instead of a similar-sounding word, and it took one action to correct it. Now it’s just fuchsia, and it takes 5 minutes to correct the correction to the autocorrect.
        • tartoran 3 hours ago
          I agree with this sentiment. It was so annoying that I turned autocorrect off. I've found that writing on the iPhone has gotten worse as well, or at least that's my observation. On the other hand, voice dictation has improved enough that I can just dictate into my phone when needed. For more serious work I use a work device, not a consumption one.
    • wahnfrieden 4 hours ago
      That already made the news. It will be powered by Gemini and may launch before the next WWDC.
  • robinsoncrusue 4 hours ago
    [flagged]
    • tiffanyh 3 hours ago
      The full quote:

      > enable devices to interpret whispered speech and enhance audio in noisy environments.

      I personally see a lot of people using Siri on speakerphone in public places, and given the background noise I’m amazed that Siri can even capture half of what’s said.

    • null_deref 3 hours ago
      Why did your comment omit the American company that thought it was a good idea to buy it? Do you think it implies something about all American companies?
    • blastro 3 hours ago
      unreal
  • yomansat 2 hours ago
    It still surprises me how everyone was closing their Russia-based stores when Russia invaded Ukraine, but here's a much worse situation and it's business as usual...