A Look Ahead to iOS 27: When Your Band-Aid Needs a Band-Aid

Catching Up

Back in October, I wrote about Apple's fastest design reversal ever—the Liquid Glass toggle that appeared in iOS 26.1 just weeks after iOS 26's launch. At the time, I was genuinely impressed by Apple's speed in responding to user feedback.

Now? I'm wondering if that toggle was less "Apple listening" and more "Apple scrambling to contain a dumpster fire."

The Problems Keep Piling Up

Let's recap what Liquid Glass actually broke:

Readability took a nosedive. The transparency makes UI elements fuzzy and icons hard to distinguish. It's like trying to read your phone through a shower door—technically possible, but why would you want to?

Contrast became optional. Buttons camouflage themselves against certain backgrounds. I've legitimately lost interactive elements on my screen. When playing "Where's the Button?" becomes a daily activity, your UI has failed.

Spacing is all wrong. Elements that should be clearly separated just... aren't. The visual hierarchy that made iOS intuitive? Gone.

Performance took a hit. Animation lags, stuttering, battery drain. Turns out making everything translucent and bubbly is computationally expensive. Who knew? (Everyone. Everyone knew.)

Accessibility is a nightmare. The Nielsen Norman Group—the folks who literally wrote the book on usability—published a scathing analysis titled "Liquid Glass Is Cracked, and Usability Suffers in iOS 26." When the usability experts are calling you out by name, you've got a problem.
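
None of this is just vibes, either. Contrast is measurable: WCAG defines a contrast ratio from the relative luminance of the foreground and background colors, and recommends at least 4.5:1 for normal text. Here's a quick sketch of that standard formula in plain Swift (my own illustration, not anything from Apple's SDK), handy for checking whether a translucent button actually clears the bar:

```swift
import Foundation

// Relative luminance per WCAG 2.x, from sRGB components in 0...1.
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func linearize(_ c: Double) -> Double {
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1:1 up to 21:1.
func contrastRatio(_ a: Double, _ b: Double) -> Double {
    (max(a, b) + 0.05) / (min(a, b) + 0.05)
}

// Black text on a white background is the best case: exactly 21:1.
let white = relativeLuminance(r: 1, g: 1, b: 1)
let black = relativeLuminance(r: 0, g: 0, b: 0)
print(contrastRatio(white, black))  // 21.0
```

White text floating over a bright, blurred wallpaper can easily land below 2:1, which is exactly the "Where's the Button?" problem.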

The Toggle Isn't Enough

Remember how I praised Apple for adding the Liquid Glass toggle in iOS 26.1? I stand by that—the speed was impressive. But here's the thing: the toggle shouldn't have been necessary in the first place.

The fact that Apple had to add a "make this usable" option within weeks of launch tells you everything you need to know about how ready this design was for prime time.

And let's be real: the Tinted mode is better, but it's still not right. It's like they built a house with no windows, realized that was a problem, added frosted glass windows, and then acted like they'd solved everything. Sure, it's an improvement, but maybe just... regular windows?
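
For what it's worth, third-party developers don't have to wait on Apple here. SwiftUI already surfaces the user's Reduce Transparency accessibility setting, and a view can swap its translucent material for a solid background when it's on. A minimal sketch (my own code, using real SwiftUI APIs, not anything Apple ships for Liquid Glass specifically):

```swift
import SwiftUI

// Falls back to a solid, readable background when the user has
// Settings > Accessibility > Display & Text Size > Reduce Transparency on.
struct GlassCard: View {
    @Environment(\.accessibilityReduceTransparency) private var reduceTransparency

    var body: some View {
        Text("Hello, Liquid Glass")
            .padding()
            .background(
                reduceTransparency
                    ? AnyShapeStyle(.background)        // solid system background
                    : AnyShapeStyle(.ultraThinMaterial) // translucent
            )
    }
}
```

It's a one-line conditional. That's the frustrating part: respecting readability was never hard.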

A lot of users are staying on iOS 18 until Apple figures this out. Can't say I blame them.

The Real Question: Why?

Here's what really gets me: iOS 26 was supposed to be a stability-focused release.

Bug fixes. Polish. The boring stuff that makes your phone work better. Instead, Apple decided to completely overhaul the UI with a design that clearly wasn't ready. It's like they were so focused on making it look cool in the marketing materials that they forgot people actually have to use this thing every day.

And this isn't happening in a vacuum...


The AI Features Apple Can't Deliver: Where's My Smart Siri?

Or: A Timeline of Broken Promises, Shifting Strategies, and Potential Legal Landmines

The Promise

Remember when Apple announced Apple Intelligence? The promise of a truly smart Siri that could:

  • Understand personal context
  • See what's on your screen
  • Actually do things in your apps

That was supposed to launch with iOS 18.

Then it was iOS 19 (which, after Apple's jump in version numbering, became iOS 26).

Now we're looking at iOS 26.4 in spring 2026.

That's over a year of delays, folks. And according to reports, Apple employees are still expressing concerns about Siri's performance in early builds. Not exactly confidence-inspiring.

The Waitlist Fiasco

Let's talk about that initial waitlist. Yeah, the one where you updated to iOS 18.1 or later on your shiny new iPhone 16 (or iPhone 15 Pro), and then had to join a queue just to activate Apple Intelligence features on your own device.

You read that right. You bought expensive hardware specifically marketed around these AI features, you downloaded the update, and then Apple said "cool, now wait while we activate the features you already paid for."

The waitlist typically cleared in anywhere from a few minutes to a few hours, but the whole concept is absurd. Imagine buying a car with advertised features, driving it off the lot, and then being told to park it in the driveway until the dealer gets around to switching those features on remotely.

Apple's explanation? The features put pressure on their cloud resources, specifically the registration process for Apple's Private Cloud Compute (PCC) platform. Which... okay, but maybe figure that out before you ship the product?

The AI Strategy Pivot

Here's where it gets really interesting. Apple's original plan was to build their own AI models—they even had an internal framework called "Ajax" and were working on what people called "Apple GPT."

But then reality hit. Building competitive AI models is hard. So Apple pivoted to a partnership strategy:

First came ChatGPT. Apple announced a deal with OpenAI in 2024, integrating ChatGPT into iOS. The idea was that Siri could hand off complex queries to ChatGPT when needed. Fine, makes sense.

Then came Google. In what has to be one of the most surprising moves, Apple reportedly signed a **$1 billion deal with Google** to use a custom Gemini AI model to power the next version of Siri, targeted for spring 2026.

Let me repeat that: Apple is paying Google—Google—a billion dollars to make Siri smarter. The same Google that Apple has spent years positioning itself against on privacy grounds.

Now, I actively try to avoid Google products and services. So the idea of a Google-powered Siri? I'm not thrilled. At all. And I suspect I'm not alone in this.

But Wait, There's a Legal Problem

Here's where this gets really interesting: this deal might be destined for a day in court.

In August 2024, a federal judge ruled that **"Google is a monopolist"** in the search and advertising markets. In September 2025, the court issued remedies that included:

  • Forcing Google to share search data with competitors
  • Putting restrictions on exclusive payment deals that Google uses to ensure prime placement in browsers and on smartphones
  • Barring Google from making new exclusive deals like the ones it had

While Google was allowed to keep its existing $20+ billion deal with Apple for default search placement, the court made it clear that these kinds of exclusive arrangements are problematic.

Now Apple is signing a new $1 billion AI deal with Google to power Siri. Given that:

  1. Google has been found to be a monopolist
  2. Courts have specifically targeted Google's exclusive deals with companies like Apple
  3. This is a brand new agreement, not a continuation of an existing one
  4. AI partnerships are already attracting regulatory scrutiny

...it seems pretty likely this deal is going to attract antitrust attention. The Department of Justice just spent years proving that Google's exclusive deals harm competition. Why would they ignore Apple handing Siri over to Google?

The timing is particularly bad. Google CEO Sundar Pichai testified in April 2025, during the ongoing antitrust proceedings, that Google hoped to reach a Gemini deal with Apple by mid-year. In other words: regulators were literally in the process of restricting Google's exclusive deals while Google was negotiating a new exclusive AI deal with Apple.

The Rollout Has Been a Mess

Even the features that have launched are problematic:

AI Notification Summaries
Introduced in October, then promptly halted for news apps after the summaries repeatedly botched headlines in beta software. Nothing says "ready for prime time" like having to pull a feature immediately after launch.

Limited Device Support
Only the iPhone 15 Pro, iPhone 15 Pro Max, iPhone 16 series, and M-series iPads and Macs can run Apple Intelligence. Everyone else is left out entirely.

Staggered Feature Releases
The features that were supposed to launch together have been spread across multiple updates, leaving users confused about what they actually have access to.

iOS 27: Lowered Expectations

Here's where Apple is finally being realistic. They've delayed iOS 27 and scaled back its AI ambitions. Instead of pushing forward with more AI features, they're focusing on:

  • Fixing animation lags
  • Addressing app launch delays
  • Solving Wi-Fi and Bluetooth disconnection issues
  • Fixing sporadic camera problems

In other words, they're going back to fix all the stuff that should have been working in the first place.

The Pattern

Look at the timeline:

  1. iOS 26: Released with controversial UI changes and missing AI features
  2. iOS 26.1: Band-aid fix for UI complaints (the toggle I wrote about)
  3. iOS 26.4: Maybe delivers promised Siri features powered by Google (spring 2026)
  4. iOS 27: Scaled back to focus on stability

This is Apple playing catch-up with their own promises. And honestly? I think the stability focus is the right call. But it shouldn't have come to this.

What This Actually Means

Apple is trying to compete in the AI space while maintaining their reputation for polish and reliability. Right now, they're failing at both.

The Liquid Glass design shows they're willing to sacrifice usability for aesthetics.

The AI delays and partnerships show they're struggling to deliver on technical promises and are having to rely on competitors' technology—specifically, a competitor that's been found to be an illegal monopolist.

The iOS 27 pivot shows they know they've overextended.

The Google deal is particularly problematic. Apple spent years building their brand around privacy and independence from Google. Now they're paying Google a billion dollars to power Siri. That's not a strategic partnership—that's an admission that they can't build competitive AI on their own timeline.

And for users like me who actively avoid Google? We're stuck with a choice: use a gimped Siri, or accept Google integration into one of the most personal devices we own.

Plus, there's a decent chance this whole deal gets challenged in court anyway. The antitrust ruling against Google specifically targeted these kinds of exclusive arrangements. Apple and Google might be betting that AI is different enough from search that regulators won't care, but I wouldn't take that bet.

Where We Go From Here

I'm cautiously optimistic about iOS 27's stability focus. Sometimes the best move is to step back, fix what's broken, and then move forward with a solid foundation.

But Apple needs to rebuild trust. They need to deliver on promises. And they need to be honest about what they can and can't do in-house. The Google partnership might be pragmatic, but it's also a huge shift in Apple's philosophy—and potentially a legal minefield.

My iOS 26.1 post praised Apple for listening. I still think the speed of that response was notable. But listening isn't enough—they need to stop creating problems that require listening in the first place.

Until then? I'll be over here with my Tinted mode enabled, dreading my future Google-powered Siri, and wondering what happened to the Apple that shipped things when they were ready—and didn't need to partner with monopolists to do it.
                    
What do you think about Apple's plans for iOS 27? Have thoughts about their recent development strategies? I'd love to hear your thoughts—find me on Mastodon at @ppb1701@ppb.social and let's talk Apple and iOS.