Apple's latest attempt to launch the new Siri runs into snags
bloomberg.com | 84 points by petethomas 13 hours ago
I feel like the difference between Steve Jobs’ and Tim Cook’s leadership styles is that Cook is really good at optimizing existing processes, but does not have the vision to capitalize on what’s next.
Apple got into the smartphone game at the right time with a lot of new ideas. But whatever the next big shift in technology is, they will be left behind. I don’t know if that is AI, but it’s clear that in AI they are already far behind other companies.
Just my opinion:
Apple doesn't need to solve AI. It's not core to their business in the same way that search engines aren't core to their business.
What Apple does best lies at the combination of hardware, software, physical materials, and human-computer interface design. This is why they're spending so much more on mixed reality than AI, even knowing that a product like the Vision Pro today isn't going to be a big seller. It's why they're investing in their own silicon. This strategy tends to yield unexpected wins, like the Mac Mini suddenly becoming one of the hottest computers in the world because it turns out it's amazing for sandboxing agents if you don't want to use the cloud, or the Mac Studio becoming arguably the best way to run local AI models (a nascent space that is on the cusp of becoming genuinely relevant), or the MacBook Pro becoming by far the best laptop in the world for productivity in the AI age (and it's not even close).
Your conclusion is that they're going to be left behind, but the evidence is that they're already well ahead in the areas that are core to their business. They can trivially pay Google a billion a year for Gemini. Nobody else can do what they can in the fusion of hardware, software, and materials as long as they stay focused.
Where they genuinely slipped up was their marketing -- an unusual mistake for Apple. And that does indeed lie with the CEO.
> It's not core to their business
You're not thinking ahead. AI isn't just chat bots and image editing. I want to tell my phone:
I'm road tripping to XYZ tomorrow, 10 am to 5 pm.
and have my phone become a guide for the day, including stops it knows I like and a hotel in my price range with the amenities it knows I need. If I get hungry, it just slips in a stop wherever I ask. This can come as an "everything app" or it can be a "new OS". Either way it will change how people interact with their phone.
If Android becomes this OS, which may very well happen, iOS is toast. Apple's branding moat isn't that deep.
I think that sounds like an incredible feature, but like so many things my phone can already do I'd never actually use it. I just don't want to become someone who does what their phone suggests.
Plus I have a partner and friends, so unless we all want to follow my phone's instructions it's not going to work.
Do you actually want that built into the OS? To me, this belongs in an app; integrated into the OS, it's an invasion of privacy.
I saw a video of a Chinese phone that did something like that. Their implementation was a privacy and security nightmare, but basically it shared an active feed of your screen with an LLM, which would literally tap, type, and swipe to achieve your objective. Like ordering noodles from an app: it would only interrupt its actions at payment processing screens.
Looked really cool and like the AI I've always imagined.
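The loop described above can be sketched in a few lines. This is purely illustrative: `Action`, `stub_model`, and the text-based screen matching are invented stand-ins, since the real system would send screenshots to a vision model and parse a structured reply.

```python
from dataclasses import dataclass

@dataclass
class Action:
    kind: str           # "tap", "type", "swipe", "wait_for_user", or "done"
    target: str = ""    # UI element the model chose to act on

PAYMENT_HINTS = ("payment", "checkout", "card number")

def is_payment_screen(screen_text: str) -> bool:
    """The safety rail: never let the model act on a payment screen."""
    lowered = screen_text.lower()
    return any(hint in lowered for hint in PAYMENT_HINTS)

def stub_model(objective: str, screen_text: str) -> Action:
    """Stand-in for the real vision-LLM call."""
    if "order placed" in screen_text.lower():
        return Action("done")
    return Action("tap", target="Order noodles")

def run_agent(objective: str, screens, ask_model=stub_model):
    """Feed each screen to the model and execute its chosen action,
    pausing for the user whenever a payment screen appears."""
    executed = []
    for screen_text in screens:
        if is_payment_screen(screen_text):
            # hand control to the user; loop resumes on the next screen
            executed.append(Action("wait_for_user"))
            continue
        action = ask_model(objective, screen_text)
        if action.kind == "done":
            break
        executed.append(action)
    return executed
```

The interesting design choice is that the payment check runs before the model is consulted at all, so the "interrupt at payment screens" behavior doesn't depend on the model cooperating.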
Even if that worked, how do you know it will choose the stops you like and not the ones that paid Apple more to be featured?
How much data about you does an application like that need to store? Do you really think it can be stored and processed locally or will it have to go to some server that's a secret court order away - or a bribe away - from leaking it?
And last, why do you think an LLM - which is what "AI" means this year - can do that?
Oh, and last last thing, honest guv, do read the chapter in Accelerando where the main character loses his smart glasses and is basically crippled because he can't remember anything on his own. (Don't ask an "AI" for a summary, because Stross books aren't as popular as React and it will make a mish-mash of everything he has ever published. I just checked.)
> What Apple does best lies at the combination of hardware, software, physical materials, and human-computer interface design.
This was true maybe a decade ago, but not so now (under the watch of Tim Cook).
You listed Mac hardware becoming popular in the age of AI as examples of "unexpected wins". Maybe that's true (I don't know if it is) - but Macs were only 8% of Apple's 2025 revenue. Apple has become an iPhone company (50% of revenue) that sells services (26% of revenue).
And AI can eat away at both. If Siri sucks so hard that people switch away, that would also reduce Services revenue from lost App Store revenue cuts. If Google bundles Gemini with YouTube and Google Photos storage, people might cancel their iCloud subscriptions.
I think the parent comment was making the point that Tim Cook's Apple has missed the boat and it doesn't show signs that it's going to catch the next wave.
I have an iPhone 16 and I'm locked in because of all my photos being on my iCloud subscription. But in 2030, if my colleague can use their Pixel phone to record a work meeting, have it diarized, send out minutes, grab relevant info and surface it before the next relevant meeting, and Siri can still only set a timer for 5 minutes, then I might actually switch.
> Macs were only 8% of Apple's 2025 revenue.
If the Mac were its own standalone business, it would rank at no. 134 on the Fortune 500 with $33.7 billion in revenue. Also, that's a 12% increase in revenue compared to 2024.
If anything, AI has brought more attention to the Mac. Just about every major AI app is released for the Mac first. I've seen complaints about it on HN.
The latest is Claude Cowork. It was released for macOS on January 12th; it didn't ship for Windows until February 10th; it's still not available for Windows running on ARM.
It's been nearly a year since Dia launched [1], the first AI browser, and it's still not available for Windows.
We just had the frenzy over OpenClaw [2] with AI enthusiasts lining up at Apple Stores to buy a Mac mini just to run it!
The most popular AI channels on YouTube are almost exclusively using Macs. Apple seems to have enough runway until they get their act together.
[1]: https://browsercompany.substack.com/p/letter-to-arc-members-...
[2]: https://builder.aws.com/content/399VbZq9tzAYguWfAHMtHBD6x8H/...
> Siri sucks so hard that people switch away
I don’t think people choose iPhone for Siri.
> my colleague can use their Pixel phone to record a work meeting
Lots of startups are tackling this space. It hardly needs to be a native feature; it's attainable an app install away.
Tried an Android phone given to me by my company. Gemini is at your fingertips, with a single button press. That’s INCREDIBLE! [everything Siri never delivered]. Put that into a headphone or headphone-enabled glasses. Plus a ring. And the need for an advanced UI-based phone fades away for many usages.
I have a Pixel alongside my iPhone (for reasons). When I got a Pixel 9 about a year ago, my feeling was the same (Gemini is at your fingertips. INCREDIBLE!). A few months later, after the novelty wore off, I just found the push of AI everywhere in Pixel OS and Google apps annoying. I now use GrapheneOS on my Pixel. One of the many reasons is that it does not try to push AI anywhere.
Now I just have a single LLM (Le Chat) isolated in its own little app sandbox, never getting in my face unless I choose to open it myself.
Isn't this just Facebook's Ray-Bans in a nutshell?
Facebook is lacking access to the interesting data. If you are in the Google ecosystem then your private and business life is likely already there.
I get your general point, but specifically regarding:
> have it diarized, send out minutes, grab relevant info and surface it before the next relevant meeting
Slack already has this integrated and it works quite well.
Also, since AI will mean most are just let go, why would they need meeting minutes? AI would be so crucial as to be the make-or-break phone/laptop feature, but people would still have meetings?
At best they will use it to tell them about special offers that they can buy with food coupons.
I’m not sure how to say it without sounding like an Apple fanboy, but Tim Cook has been the CEO for the past 15 years. Every single year people have been whining that “he’s not visionary, etc.”, but at some point you have to give him some credit. The Apple of 2026 faces a completely different landscape than the Apple of 2010 and earlier. Scaling from millions to billions of sales is incredibly hard, and he’s been able to accomplish it.
Those Macs you are talking about are still very niche and mostly used by loyal customers that do basic/common things or very vocal fanboys who always find a way to shill for whatever Apple comes up with, no matter how flawed and lackluster the product is.
Even if you want to run local AI, Macs are not really a good deal when you account for the price of soldered RAM and the limitations of AI tools on macOS. But as always, the minority is very vocal, so it looks like it's all the rage but for the most part, people doing work are still using PCs and they don't have that much time to argue about it on the internet.
>They can trivially pay Google a billion a year for Gemini.
But they can’t vertically integrate the feature, not with acceptable levels of reliability and security.
That’s the key issue here, an apple AI would be something that can read and interact with your mail, pictures, contacts, location, and so on, but right now giving such access to an LLM would be a ticking timebomb. And those kind of integrated products are probably coming to competitors, even if their security plan is just YOLO.
You're not wrong, but you can also get most (or even nearly all) of that right now today on any device by just getting a Google AI subscription. Gemini already does most of that through its own personal intelligence feature. You do need to use Gmail at minimum for it to be useful, but the vast majority of iPhone users do already.
I agree that they should focus on hardware, software, UX, etc.
I think the problem of current Apple management and especially Tim Cook is that they want to squeeze out as much profit as possible and they see AI as another _Services_ profit center.
A better Apple would say AI is just an app and provide extension points into the OS so that users can plug their favorite LLM, anything from ChatGPT to Mistral, but in a privacy-preserving way if the user wants.
While that would lead to less profit in the short term, Apple's moat was its UX and halo effects (cynically: social signalling). The draw to Apple may last for a bit during enshittification of the platform, but long-term the brand value is more important than short-term profits.
I think this is wrong. Google is a competitor both in devices and in the OS for mobile devices. Apple charge a premium that they justify by superior features, ease of use, effortless integration with other Apple products and so on. I wonder how well they will be able to produce differentiating iOS AI features whilst they use Gemini. I suspect it will more or less have parity with Android devices. If more and more interactions with the device occur through this AI interface I wonder what that does to the perception of Apple products. I suppose they already have the worst AI voice assistant and it hasn't damaged them all that much.
Google is not really a competitor to Apple in devices. I mean, they sell devices, but at a way lower volume. The Pixel phone is essentially a tech demo that exists to push their Android partners into making more competitive devices themselves.
The corporate strategies are not directly comparable. The entire Android project is essentially a loss leader to feed data back into Google’s centralized platform, which makes money on ads and services. Whereas Apple makes money directly from the device sales, supported by decentralized services.
Apple never produced a differentiated experience in search or social, two of the largest tech industries by revenue. Yet Apple grew dramatically during that time. Siri might never be any better than Google’s own assistant, and it might never matter.
Your framing fits well for the Nexus era and even the earliest Pixel iterations, where Google’s hardware largely functioned as a reference implementation and ecosystem lever, nudging OEMs into making better devices.
However, the current Pixel strategy appears materially (no pun intended) different. Rather than serving as an “early adopter” pathfinder for the broader ecosystem, Pixel increasingly positions itself as the canonical expression of Android—the device on which the “true” Android experience is defined and delivered. Far from nudging OEMs, it's Google desperately reclaiming strategic control over their own platform.
By tightening the integration between hardware, software, and first-party silicon, Google appears to be pursuing the same structural advantages that underpin Apple’s hardware–software symbiosis. The last few generations of Pixel are, effectively, Google becoming more like Apple.
I think you're assuming that no durable or at-scale changes in compute form factor will occur, so that their success pretty much just solely comes down to differentiated iPhone software features. That seems unlikely to me. I don't see phones going away in the next decade like some have predicted, but I do think new compute form factors are going to start proliferating once a certain technological "take off" point is reached.
The broader point I'm making is that Apple likely couldn't do all the other things they're excelling at right now and compete head-on with Google / OpenAI / Anthropic on frontier AI. Strategically, I think they have more wiggle room on the latter for now than many give them credit for so long as they continue innovating in their core space, and I think those core innovations are yielding synergies with AI that they would've lost out on if they'd pivoted years ago to just training frontier LLMs. There's a very real risk that if they'd poured resources into LLMs too early, they would've ended up liquidating their reserves in a race-to-the-bottom on AI against competitors who specialize in it, while losing their advantages in fundamental devices and compute form factors over time.
I think the issue here is the public promises that were made. Jobs tended not to do that. Things were announced and released when they were ready, which gave them the time to do it right, without any delays.
Sure, there were things like AirPower and the MobileMe widgets… things that were announced, but never shipped. However, by and large, a big new thing was announced, and a week later it would ship. The iPhone was only announced 6 months early to avoid it being leaked by compliance filing (or maybe it was patents).
Cook would be wise to go back to this instead of promising the shareholders things he can’t deliver on.
I think slow playing AI is the right move for Apple. Third party apps give their customers access today, and Apple can take the time to figure out how AI fits into a large cohesive vision for their products and ecosystem… or if it fits at all. Rushing something out doesn’t do anyone any favors, and has never been Apple’s competitive advantage.
> The iPhone was only announced 6 months early to avoid it being leaked by compliance filing (or maybe it was patents).
Is there evidence of this? I think the phone and watch were announced early because Apple's vertical integration strategy and quality standards required them to reveal the product's existence to the rest of the company, so they could get everything else working well with the new product category on release. It wouldn't be possible to keep it secret after revealing it internally, so they did a simultaneous internal/external reveal to control the message, maximize impact, deny rumor sites, etc.
Yes, or at least that’s the reason Jobs gave in the original Jan 2007 keynote:
> We’re going to be shipping these in June. We’re announcing it today because with products like this we’ve got to go ahead and get FCC approval which takes a few months, and we thought it would be better if we introduced this rather than ask the FCC to introduce it for us. So here we are.
I think comparisons between Jobs and Cook are trite and cliche by now, and also pointless. Jobs was a generational talent; everyone looked up to him when it came to defining products. Of course Cook is not able to do what Jobs did. No one can.
Apple has already been left behind by many tech shifts: web, search, social, crypto, metaverse, etc. At various times popular opinion had them left behind by netbooks, by tablets, by smart phones, by Windows, by web browsers… until they weren’t.
Apple does not have to lead all categories of tech to be a very successful company.
I'd say that after the Apple Maps launch, Tim Cook learned a lesson about allowing features that need additional work the time they need to fully bake.
Google just launched the Pixel 10 with several promised AI features broken, and could really stand to learn the same lesson.
https://www.androidauthority.com/google-pixel-10-magic-cue-o...
https://arstechnica.com/google/2025/09/google-pulls-daily-hu...
Apple Maps is still such a sucky service, at least where I live and where I travel to.
It regularly directs me to incorrect addresses and businesses and labels places obviously incorrectly.
Every use of the search function promotes guide content for a single city I'm not currently in, with no way to configure or turn it off. Good products shouldn't go out of their way to annoy you, IMHO.
They only managed to get their cycle routing for the UK and Ireland working in 2025 after years and years of complaints.
I'm not a fan of Google but I feel compelled to keep Google Maps because Apple Maps is still so unreliable.
I'd offer the balance here that I still don't enjoy using Android and generally prefer iOS to it, warts and all.
Not sure about that, since the Vision Pro needed the M5 to work as promised, instead of the blurry mess that ran on the M2.
It seems to me that people have been saying that Apple will be left behind since the Apple II. That they keep doing non-obvious things that somehow succeed is what makes them an interesting company.
I don't know that Apple will dominate AI, personally I dislike Siri and iOS, but I think Apple have a very good shot at delivering workable local AI for professionals.
If Apple can lift the inference performance of their forthcoming M5 Ultra chip, I think they may become an off-the-shelf standard for those who want to run large models locally.
That in itself is probably enough to keep them relevant until actual useful uses of Apple Intelligence come to light.
I don’t think Apple or Microsoft (via Windows) will dominate AI. There’s just too much value in the AI being in the cloud (big powerful models vs local) and across your devices (more context on you, running on low powered edge devices like watches, glasses, smart home devices), and the idea of an OS being a decisive factor is already fading with how much work people do in a browser or cloud app.
I think there's room for multiple approaches here.
Cloud-based AI obviously has a lot of advantages, e.g. batched processing on the best hardware, low-power edge devices, data sharing, etc.
There's still room for local inference though. I don't know that I want "more context on me" all the time. I want some context, some of the time and I want to be in full control of it.
I'd pay for that. I don't think it will be for everyone, but a number of people would pay a premium for an off-the-shelf product that provides privacy and control that cloud vendors by their nature just can't offer.
100% agreed. You speak the truth, but already the apologists are writing textbooks justifying Apple's failed strategy :)) and this is why the company thrives. Just blind loyalty to a company that couldn't care less about them.
All I really want from Apple is to continue perfecting their computers, phones and tablets to be the absolute best computing devices possible. As long as they keep iteratively improving those things I don’t care if they’re thought or innovation leaders in whatever hot new thing comes along.
Steve Jobs had the AI vision. Siri was a (sorta) early AI. And it was acquired under his watch.
To be fair, no one has solved the AI assistant at the consumer level yet.
I agree. It’s still being figured out.
My prediction is that Apple is the hardware and platform provider (like it’s always been). We’re not asking them to come up with a better social media, or a better Notion or a better Netflix.
I think their proprietary chips and GPUs are being undervalued.
My feeling is that they’re letting everyone move fast and break things while trailing behind and making safe bets.
>My feeling is that they’re letting everyone move fast and break things while trailing behind and making safe bets.
That's what is happening but I don't think it was by choice. They clearly had plans to deliver a lot more and have repeatedly failed.
Funny you should mention social media in the context of Apple, because they seem to have been attempting that with iTunes Ping[1] and then Apple Music.
iTunes Ping was a Jobs-era attempt to create a social network for music. It seems that they were trying to rely on integrating with Facebook, who pulled out of the collaboration at the last minute before Ping's release.
Apple doesn't seem to have given up on social networks for music. Apple Music has a nascent networking feature where users can see what their friends are listening to.[2] It seems that Apple learned its lesson from Ping and does not rely on a third party for a social graph, which is instead powered by iOS contacts.
While social media is not Apple's bread and butter, they have maintained an interest in having a presence in this market. I would assume this stems from Apple's overall desire to maintain influence over the over-the-top services that define the iOS experience. If they let third parties flourish even further, third parties gain leverage that they can use during negotiations with Apple. If third parties successfully negotiate for features that create parity with apps on non-Apple devices, Apple loses its differentiation in the device market, thereby losing revenue.
(I think Stratechery wrote about Apple's service strategy that was motivated by its past relationships with Adobe and Spotify. Couldn't find the link.)
> We’re not asking them to come up with a better social media, or a better Notion or a better Netflix.
You're right that we haven't asked them for better over-the-top services. But it seems to be in Apple's interest to compete with third-party service providers and make sure they do not supersede Apple in terms of their influence over over-the-top experiences.
[1] https://en.wikipedia.org/wiki/ITunes_Ping
[2] https://support.apple.com/en-gb/guide/iphone/iphdf490a9e9/io...
Just as an aside, I do not get a social media platform for music. I don’t need a separate social network to manage, and certainly wouldn’t care what 99% of the people I know are listening to 99.9% of the time.
They should start by getting very basic speech recognition working in cars; that would be a big help.
> it’s clear that in AI they are already far behind other companies.
I think it's exactly the opposite, actually. They've integrated AI flawlessly into existing products to an extent nobody else has even come close to. Photos, for instance, makes better use of AI than any other photo management app in existence. If anything, ChatGPT/Microsoft/Google/etc are absolutely crippled because they don't have access to the data people actually use on a day to day basis—instead, it's scattered across a million browser apps and private silos.
And you don't have to look like a fool using an asinine chatbot integration to get at it.
Perhaps Google comes the closest to being able to capitalize on this, but I can't say I can remember using any AI integration they have, and I stopped giving them my data over a decade ago.
If you haven't used any of Google's services with your actual data for over a decade, there's a pretty simple explanation about why you don't remember any times you've interacted with one of their AI integrations for those services, and it has nothing do with the relative quality of them.
"I think it's exactly the opposite, actually. They've integrated AI flawlessly into existing products to an extent nobody else has even come close to. Photos, for instance"
Have you used a Samsung? Apple's AI miserably fails in every comparison out there in the photos app.
There's also Google Photos with Gemini which helps you find any photo you want with AI better than anyone else.
But sure, Apple has the best AI integration
That's just the typical Apple distortion field.
I have been using Apple devices and supporting many of their users for over 20 years, and they are all extremely invested in their choice of computing device. It's really a source of pride for many of them, weirdly. For this reason, anything Apple does is necessarily better than everything else on the market. It's a bit pointless to argue because they come from an emotional standpoint; if you point at the many things not working properly, they always have an excuse to handwave it away. It's really funny because I use Apple stuff, and I find many qualities in it, but I'm unwilling to be blind to the faults and weaknesses.
This sort of ego investment exists for other brands as well; I think it is a lack of emotional maturity and an inability to realize that a brand does not care if you do not fully "love" their products.
Apple is in a better situation now than with Gil Amelio, thanks to printing money with iDevices.
However everything else is quite similar for those of us that were around.
Except now there isn't a Be or NeXT to acquire, nor the former founder to get back.
They have a software issue (I mean, who doesn't?), but Cook has tried his hand at lots of products. Some worked, like the AirPods, AirTag, and Watch; and a particular one flopped: the Vision. It’s a marvelous tech device that unfortunately had no demand.
Also the m series can be attributed to him and it’s as good as innovation can get.
Right or wrong, at least he takes risks. The Apple Vision Pro was launched two years too early, but you can’t say that he just relied on existing products.
2 years early? I don’t think the time for Vision is now, or even 3 years from now based on the hardware and use cases.
Was he taking risks or just reactionary after Facebook going all-in on the Metaverse (and by the time the product was done, the Metaverse was pretty dead already)?
I feel like, today, most of the other LLM providers can do what "Apple Intelligence" promised - it'll link with my email/calendar/etc and it can find stuff I ask with a fuzzy search.
That said, I don't really use this functionality all that often, because it didn't really (effortlessly) solve a big need for me. Apple sitting out LLMs means they didn't build this competency along the way, even when the technology is now proven.
I think the same thing is true of VR - except Apple did invest heavily in it and bring a product to market. Maybe we won't see anything big for a while, and Silicon Valley becomes the next Detroit.
You mean you don't have an everyday need to find an authentic Italian restaurant and make dinner reservations? Without actually doing it yourself?
> it'll link with my email/calendar/etc
Wait, how does that work? I've never heard of this outside of closed ecosystems (iPhone is obviously the best at this, but I guess also google crap if you're invested into gmail/gcal/etc)
It's easy: make my life easier.
Instead they choose to optimize for shareholder value.
This was such a self-inflicted own goal. Siri has needed work for years, and every year they neglected it. When they first bought Siri it was state of the art, and then it just languished. Pulling an Intel and sweating your assets until it is too late is never a good idea.
Siri has had many, many, many engineers on it for a while.
I don't doubt it, but what were they all doing? The Metaverse had 10k employees on it for multiple years and seemed to almost be a standstill for long periods of time. What do these massive teams do all day?
Have meetings to figure out how to interact with the other 9,990 employees. Then try to make the skeleton app left behind by a team of transient engineers who quit after 18 months for their next gig, before throwing it out and starting again from scratch.
Exactly. What Meta accomplished could have been done by a team of less than 40 mediocre engineers. It’s really just not even worth analyzing the failure. I am in complete awe when I think about how bad the execution of this whole thing was. It doesn’t even feel real.
I wish I could be assigned a project and make no progress in over a decade and still have a job.
This thread is an example of how the 24-hour news cycle hurts brain cells.
They are late with a release, so they must have unlearned how to build software.
Given the way current LLMs hallucinate, and given that Apple (presumably) won’t accept this behaviour in Siri, I’m skeptical that existing technology (or existing technology scaled up) can ever create the Siri Apple and its customers want.
I'll settle for "gets voice to text right most of the time". Seriously, Apple is so far behind on the cheapest table stakes at this point that I highly doubt their high standards are the issue.
Yeah, but isn't the voice recognition (as opposed to voice comprehension) separate from the supposedly LLM powered bit of Siri? I want better voice comprehension too, but I don't think that moving to a LLM powered Siri will solve that.
Wouldn’t it? Something like Whisper is great for recognition, and is built on a transformer architecture, like most of the SOTA voice stuff is.
Oh absolutely. The amount of times I have to pause, take a deep breath, and OVER-enunciate (still with mixed success) because my pulse rises and my patience decreases with every absolute butchering (not even "close but no cigar" but "how on earth did you come up with that?") that Siri does to a dictated text message in CarPlay...
I don’t even bother anymore. When it reads back the text message and asks if I want to send it I just laugh heartily and say yeah. Sometimes the recipient has to read it aloud and try to phonetically guess what the original words were.
Literally what's the difference between that and Siri now.
Siri can't understand or pronounce very well.
A few weeks ago Siri via Car Play responded to a text and sent it without me saying a word or radio on, and with the setting where it asks first before sending enabled. It responding "Why?" to a serious text was seriously inconvenient in the moment. I watched it happen in disbelief.
(Edit: Didn't see your last paragraph before writing the response below)
I think there is a distinction between Siri misunderstanding what was said (which you can see/hear), and Siri understanding what you said but hallucinating an answer. In both cases, you strictly have to check the result, but in the first case it's clear that you've been misunderstood.
Yeah. Apple don’t half-ass things. This is why people take their products seriously.
I don't think that's at all a safe presumption, given that AI still happily hallucinates summaries of text messages/email that contradict the actual content of the message.
I worked at Siri (post acquisition) 13 years ago as one of the early data scientists. Let's just say I am not a bit surprised.
I'd rather they get it right than released it unfinished.
But they can't get it right. Siri seems to be just the most conspicuous indicator that Apple has unlearned how to do software. Everything is going to shit there.
It doesn't surprise me that Siri continues to be bad - Apple's current plan is to use a low-quality LLM to build a top-quality product, which turned out to be impossible.
What does surprise me is that Google Home is still so bad. They rolled out the new Gemini-based version, but if anything it's even worse than the old one. Same capabilities but more long-winded talking about them. It is still unable to answer basic questions like "what timer did you just cancel".
I can't even get gemini on my phone, configured as my assistant, to schedule a timer. It just googles the answer now or tells me "Gemini can't do that". 16 years ago it was doing that perfectly.
That's all I use Siri for. I'll be really cheesed off if this gets lost in the new version.
Indeed, especially compared to ChatGPT, which runs so much better on the same iPhone where Siri shits the bed. Voice transcription sucks in every aspect on my iPhone, except, surprise, ChatGPT gets what I am saying 90% of the time.
I got myself an iPhone 16 Pro because of the promised AI features. I had a vision in my mind of what it ought to be like:
While driving past a restaurant, I wanted to know if they were open for lunch and if they had gluten-free items on their menu.
I asked the "new" Siri to check this for me while driving, so I gave it a shot.
"I did some web searches for you but I can't read it out to you while driving."
Then what on earth is its purpose if not that!? THAT! That is what it's for! It's meant to be a voice assistant, not a laptop with a web browser!
I checked while stopped, and it literally just googled "restaurant gluten free menu" and... that's it. Nothing specific about my location. That's nuts.
Think about what data and access the phone has:
1. It knows I'm driving -- it is literally plugged into the car's Apple CarPlay port.
2. It knows where I am because it is doing the navigating.
3. It can look at the map and see the restaurant and access its metadata such as its online menu.
4. Any modern LLM can read the text of the web page and summarize it given a prompt like "does this have GF items?"
5. Text-to-voice has been a thing for a decade now.
How hard can this be? Siri seems to have 10x more developer effort sunk into refusing to do the things it can already do instead of... I don't know... just doing the thing.
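The five steps above describe a straightforward pipeline. As a minimal sketch of what that could look like (every function name here is hypothetical and stubbed for illustration; none of this is a real Apple API, and the LLM call is replaced by a trivial keyword check so the sketch runs offline):

```python
# Hypothetical assistant pipeline: location -> restaurant metadata ->
# menu page -> summary -> string handed to text-to-speech.

def nearest_restaurant(lat, lon):
    # A real assistant would query the maps database the phone is
    # already using for navigation. Stubbed with fixed metadata here.
    return {"name": "Example Bistro", "menu_url": "https://example.com/menu"}

def fetch_menu_text(url):
    # Stand-in for fetching the restaurant's online menu and
    # stripping it down to plain text.
    return "Lunch 11am-3pm. Gluten-free pasta and salads available."

def summarize(question, page_text):
    # Stand-in for an LLM call like "does this menu have GF items?".
    # A keyword check keeps the sketch self-contained.
    if "gluten-free" in page_text.lower():
        return "Yes, they list gluten-free items, and lunch runs 11am to 3pm."
    return "I couldn't find gluten-free items on the menu."

def answer_while_driving(lat, lon, question):
    place = nearest_restaurant(lat, lon)
    menu = fetch_menu_text(place["menu_url"])
    answer = summarize(question, menu)
    # Final step: this string goes to text-to-speech instead of
    # opening a web page the driver can't look at.
    return f"{place['name']}: {answer}"

print(answer_while_driving(37.33, -122.01, "Do they have gluten-free lunch?"))
```

The point isn't that the real problem is this trivial (it isn't), it's that every individual capability in the chain already exists on the device.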
Siri on the Apple Watch is even more fun. It can never answer a question and always opens a webpage. Then you try to read it on the teensy display, and you are rewarded for your effort by Siri/watchOS/whatever closing the view after just a few seconds (even while you're still scrolling with the crown).
I am pretty sure they aren't doing any QA or the QA results don't get to the developers. With Pixel Watch I can still understand all the little bugs, it is well-known by now that (some of) the Pixel Watch PMs themselves use iPhones and Apple Watches. But you'd think that the Apple Watch PMs themselves use Apple Watches? The only other explanation that I can think of is that the org is pretty dysfunctional by now.
Seems weird to comment on delayed new features from Apple. Obviously if it doesn't meet the quality bar it would get pushed back, that's just how they do things.
But I wonder how much of the problem is due to trying to minimise data processing off-device. Even with OpenAI as a last resort, I don't imagine you get much value choosing betwixt the local model or a private cloud that doesn't save context.
Meanwhile the average user is yeeting their PII into Altman's maw without much thought so Siri is always going to seem rubbish by comparison.
This is obviously a death march project. Just delay it indefinitely until the Google Gemini based Siri chatbot is ready. Why ship something half-assed?
The referenced Bloomberg source (https://www.bloomberg.com/news/articles/2026-02-11/apple-s-i...) says this about the delayed effort:
But it’s been a complex undertaking. The revamped Siri is built on an entirely new architecture dubbed Linwood. Its software will rely on the company’s large language model platform — known as Apple Foundations Models — which is now incorporating technology from Alphabet Inc.’s Google Gemini team.
Do similar issues exist with Gemini on Android?
Or are these challenges very Siri/iOS specific?
Gemini can and does send everything to Google.
Apple's challenge is they want to maintain privacy, which means doing everything on-device.
Which is currently slower than the servers that others can bring to the table - because they already grab every piece of data you have.
> Apple's challenge is they want to maintain privacy, which means doing everything on-device.
Apple is not trying to do everything on-device, though it prefers this as much as possible. This is why it built Private Cloud Compute (PCC) and as I understand it, it’s within a PCC environment that Google’s Gemini (for Apple’s users) will be hosted as well.
This isn't planned to be exclusively on-device. Siri isn't exclusively on-device now, to begin with.
Not sure whether it's a language/pronunciation issue, but in the 15 years since Siri was released I have not seen a single person use it successfully without having to yell at it for not waking up or not understanding the request correctly.
I was cleaning my room today, and while wiping dust from the bookshelf the HomePod sits on, Siri out of the blue went "I thought so".
It'll be 15 years this October and I still can't use Siri in my language.
Are Apple AI agent delays bearish for AI agents in general? Unless something else is the issue it’s normal behavior for Apple not to implement something everyone else already has until it’s very good and solid.
Apple wants to vertically integrate. Their AI strategy until recently was to develop their own LLM models that were small enough to run on device. But massive scaling is what makes LLMs so powerful, so all their internal models were terrible and unusable.
Basically they bet that compute efficient LLMs were the future. That bet was wrong and the opposite came true.
These aren't "agent" features. They're the features announced in 2024, like AI summaries in web search and natural language image editing.
> Siri doesn’t always properly process queries or can take too long to handle requests, they said
I mean, for anyone familiar with LLMs this is not exactly a surprise. There is no way Apple can remove the inherent downsides of this technology regardless of how enthusiastic the ai bros are about it.
In a twisted way, I’m happy there are at least some teams at Apple where it doesn’t get a pass for bugs just because it has AI on the sticker
I think we can conclude at this point that the guy yelling at engineers to "just stick ChatGPT into Siri" doesn't understand that the result is unusable, for whatever reason. That reason might be that the UX is bad, or that it grossly violates user privacy, but it might also be that Apple would lose $$$$$ because LLM inference is expensive.
Damn paywalls! Sorry, I shouldn't be so negative. I'd just like to be able to read the article.
[flagged]
Is it not impressive what xAI did with Grok? It's already integrated into Twitter and my Tesla, and so quickly. What prevented Apple from building out their equivalent of Grok the same way?
This is not a bad example. Tesla is indeed running a custom LLM, available in their vehicles, capable of acting as a general chatbot and issuing commands to the car, developed in-house. While Grok is not up-to-par with other frontier models, it's certainly far beyond Siri.
Grok runs on a cloud server. I think Apple is trying to do as much as possible on-device, which makes it a lot harder.
I just wish they would fix the out-of-memory disaster on my MacBook that is macOS 26.