apple announced its new liquid glass initiative (?) at wwdc 25 and i have thoughts. this likely will not be well structured.

having used it for a limited amount of time across macOS, iOS, and iPadOS, one thing became instantly clear (lol).. content is king. menus now melt away under the glassy interface when you aren't focusing on them, and even though content-oriented UX/UI has become almost the new normal, i think this approach has been the most human-focused yet. interestingly though, the way search functions have been separated from tab bars and placed right next to the thumb does give us some clues as to how apple wants us to think about utility and usability. maybe reaching a bit here, but the thinking around UI and its proximity to input seems to draw a lot from visionOS and spatial computing in general. as i've written about before, the future of computing revolves around input.

i've seen a lot of graphic designers and founders and techbros on twitter talk about the visual "inconsistencies" found across liquid glass in its current beta state (which is a whole other thing), and i think it's frankly a little ridiculous. people have been so figmapilled into thinking that all UI is an equation, that decisions should be made strictly along the lines of user surveys, data points and whatever else.. that, along with the industry moving into a generative age of UI/UX design, makes it blatantly obvious to me that the humans designing for "humans".. just aren't anymore. so yea, it seems ridiculous to some that the corner radius is 3px off, or that objects on top of glass have to be focused on to be read, but life isn't an equation! different materials, objects, etc have different affordances, inherent properties, downsides, upsides.. and i think it's ok, and even encouraged, for devices that aim to be more personal and human to be a bit inconsistent, just like life is.

i really feel like nobody remembers the first beta of ios 7 (yall deadass wasnt there!!!!!!), where apple decided that the majority of people had grown past needing constant literal depictions of the materials and objects we were used to using (skeuomorphism), and took full advantage of high dpi (retina) displays. liquid glass is the next step of tech intertwining closer with the materiality of our everyday lives. i think that makes sense.

the wsj interview with the apple execs also showed apple's disconnect with "ai" as most people see it, and i think it reveals some opportunities for them in the future. obviously, apple flopped hard with the improved siri implementation, and now they're being upfront about it, but i think there's also a misunderstanding that apple has become completely disillusioned with the idea of improving computing with "AI".. i mean, we're talking about the same company that has had NPUs in their phones since 2017! personally i think apple could totally do the chatbot thing and make it all personal and context rich (call it newton, please, the branding would be perfect), but as they said, getting that to where it "just works" on device takes time to not fuck up.. the current on-device power of the neural engine and apple intelligence is so good that we don't even notice it! i think that's exactly what apple wants, and i'm sure it'll get there with siri/chat/wherever they go with it.