A new Apple purchase would solve my biggest AI problem

Apple acquired Q.ai in a deal valued at around $2 billion. While there’s still a lot we don’t know, some details about Q.ai’s work have me very excited about the future of Apple’s AI and Siri offerings.
Apple’s latest acquisition, Q.ai, focuses on understanding ‘silent’ voice input

Apple acquires companies all the time, but it’s rare for a company to make an acquisition as expensive as Q.ai.
The reported purchase price of $2 billion makes Q.ai the second largest acquisition in Apple’s history. It trails only Beats’ $3 billion acquisition a decade ago.
Yet despite the hefty price tag, much of Q.ai’s work is shrouded in mystery.
As 9to5Mac Editor-in-Chief Chance Miller wrote, the company “developed machine learning technology to add ‘sound and silence’ to speech.”
Its website has the tagline: “In a world full of noise we create a new kind of silence.”
Israeli tech publication Geektime dug into the patent details to reveal Q.ai’s work. Here is a translation:
According to its patent applications, the company appears to work by reading what is being said, not using voice, but using visual sensors that detect muscle and skin movements on the face, to translate it into words or commands. Some patents show the use of a headset that also scans the user’s cheek and jaw, and will apparently allow you to talk to Siri, Apple’s voice assistant, using only lip movements.
This technology is expected to be integrated with smart glasses and/or earbuds like AirPods.
And if these early reports are accurate, Apple may be well on its way to solving my biggest AI problem.
Why the Q.ai acquisition could be big for AI and Siri

Like millions of others around the world, I’ve found my use of AI chatbots to skyrocket over the past few years.
If I have a question about something, I’ll quickly go to ChatGPT and/or Google Gemini, among others.
LLM-based chatbots all have their problems, including providing erroneous information at times. But for the most part, I’ve found them to be a great time saver.
When I use these AI chatbots, however, I almost never interact using my voice.
I’m often around other people, whether that’s my family at home or strangers on the street or at the coffee shop where I work.
Because of that, I always type my questions into AI apps instead. But using the iOS keyboard can feel clunky at times and slows me down. It would be much quicker and easier to just speak my questions, if it weren’t for the social hangup.
But if a future version of Siri can understand facial movements and inaudible whispers, that will open up a whole new world of AI possibilities.

Whether I’m at home or out and about, I’ll be able to speak near-silently to Siri and have her understand me. No need to pull out my iPhone first. No need to type the question.
I can use AI help whenever I need it—every time I engage with the world around me, and without being “that guy” talking to himself.
There are a lot of question marks about how this might actually work. I think AirPods with cameras and/or Apple glasses will be involved.
If the future of computing involves ever-present AI chatbots, then Q.ai’s technology could be an important part of that.
How do you primarily interact with AI chatbots today? Does Q.ai technology sound appealing? Let us know in the comments.