iOS 26 and Liquid Glass feel like one big cover-up operation. Here’s why


iOS 26 has been officially announced (rolling out in September), and along with the new numbering convention, one of the most substantial changes lies in the design. Many industry experts, alongside Apple itself, are calling the new Liquid Glass look the biggest design overhaul in years.

“Inspired by the depth and dimensionality of visionOS, the new design takes advantage of Apple’s powerful advances in hardware, silicon, and graphics technologies. The new material, Liquid Glass, is translucent and behaves like glass in the real world,” says Apple in an official press release.

This is all fine and dandy, but it immediately raised a couple of questions in my mind. Do we NEED a translucent interface that behaves like real glass? And what does that even mean? That if you swipe or tap too fast or hard, it will break?

The second thing is right there in the text itself. Liquid Glass takes advantage of the processing power of Apple silicon. I can't help but wonder: don't we have better and more useful ways to use that hardware than making things look like "glass in the real world"?


And finally, what are the chances Apple is throwing sand in our eyes to deflect attention from much more important stuff? Such as the fact that Siri is nowhere to be found (mentioned just twice during the official event), and that Apple Intelligence still feels like a bad copy of what every other LLM has been doing for a long time now.

But first – let’s talk about Liquid Glass.

Form over function – people like pretty things

There are a lot of reasons why we love polished and good-looking things, smartphone interfaces included. Some of these things are rooted deeply in our brains from prehistoric times, while others we’ve learnt to value a bit later in our evolutionary journey.

Evolutionary basis of aesthetics

Our ability to recognize symmetry goes back to prehistoric times, and it's tied to our survival mechanisms.

We look for it not only in faces, but in everything around us – cars, furniture, the shape of your house, smartphones, clothes – the list goes on and on.

It's a well-known phenomenon, and even though we try to escape it and judge things on their other qualities, the pull is so visceral that most of the time it operates beyond our conscious control.

Neuroscience and psychology of aesthetics

How does this work? Without getting into the boring scientific details, visually pleasing things activate circuits in our brain tied to the dopamine reward system.

Put simply, you feel good when you see pretty things, and you also seek them actively. 

This leads to some very interesting effects on our ability to be objective, and one of these is called “the aesthetic-usability effect.”

The aesthetic-usability effect

The aesthetic-usability effect refers to users’ tendency to perceive attractive products as more usable. People tend to believe that things that look better will work better — even if they aren’t actually more effective or efficient.

This lies at the core of many interface design decisions, including iOS 26 and Liquid Glass. The effect has also been exploited for decades – anyone who has ever used Linux alongside Windows or macOS knows how powerful the former can be, and yet most people still prefer the better-looking OS. The decision happens almost instantly.

Testing has shown that users form an opinion about a webpage within 50 milliseconds of exposure.

iOS 26 – Liquid Glass or smoke and mirrors?

Back to the topic at hand. So, people like pretty things, and Apple decided to make iOS 26 pretty with Liquid Glass. What lies behind this decision?

Probably the fact that the smarter Siri Apple has promised us is still somewhere in school. In contrast, other smart assistants and LLMs can now write scripts, generate videos and podcasts, create movies from still frames, organize your emails, and be much more useful in general with multimodal input and cloud access.

Apple Intelligence has been a major talking point ever since its official announcement last year, but it continues to lag behind the competition.

Apple made a big deal of adding Live Translate to iOS 26, but this feature has been part of Samsung's One UI since January 2024.

The same goes for Visual Intelligence. The ability to contextually search for an object on your screen has been part of the Android world for more than a year now in the form of Circle to Search.

Is Apple playing catch-up with iOS?

I believe the answer is “yes.” However, there are some positives to be taken from the latest iOS 26 announcement.

The major one is consistency. The Liquid Glass overhaul may or may not prove useful, but it will arrive across the whole Apple ecosystem – iPhones, iPads, Macs, and Vision Pro headsets.

That’s a bold move and probably required more work than we imagined.

But the truth of the matter is that Liquid Glass looks to me like a big cover-up for the lack of new and original features. iOS 26 is playing catch-up, and it seems this will continue to be the case for some time.

What do you think about Liquid Glass and iOS 26 in general? Happy with the new features and look? Do you think it’s all hype and no real innovation? Slap your opinions down in the comment section below.


