WWDC 2024 observations
Yesterday, Apple unveiled most of the new software features coming to macOS, iOS, visionOS, iPadOS, watchOS, etc. These are my initial impressions; I haven’t read much about it yet, so please bear with me.
The whole AI stuff looks well implemented, especially the “Writing Tools” part, and I’m glad they went the “private” way, doing everything themselves with their own on-device machine learning, except in certain situations where it will prompt users to accept using ChatGPT (and possibly other services in the future).
The image generation thing has an interesting UI design, but all the examples given were strange and cringey, and I can’t imagine anyone using it except maybe your parents in awkward Facebook or WhatsApp group messages. The birthday wishes example? Come on. Even the lifestyle video showed two young people apparently using these AI-generated images while texting, as if it were something cool and popular people do all the time. Maybe I’m too old, but it felt forced and awkward.
I’m not a fan of the custom emoji feature either, but I understand how this will appeal to the media and the general public. It’s even the first feature mentioned in Apple’s own summary video.
I’m unsure about the real-life use cases for the handwriting feature on the iPad. Do people actually do that? The handwriting improvement is interesting, and generating text in your own writing style is definitely a neat party trick, but I’m curious what percentage of users will actually benefit from it. Or is it one of those features made for reviewers, to boost the sales of Apple Pencils?
Same question with iPhone mirroring: it’s cool that it exists, and I’m sure I’ll try it out, maybe even benefit from it once or twice a year, but it looks like a very niche thing. Also, does it work on the iPad? With the touchscreen, it would make more sense, and iPads are more likely to be used lying down or sitting comfortably in an armchair, where people may be less tempted to get up and fetch their phones than if they are just at the desk with their Mac. I get the concept, but it seems to me that 90% of it would already have been solved by notifications alone. To paraphrase my grandmother complaining about people parking right in front of the village bakery instead of the car park a hundred metres away, while advertising their weekend hikes: “People would rather mirror their iPhone screen on the Mac than get up to fetch their phone, while bragging about closing their Apple Watch activity ring.”1
Still no mention of search improvements in Safari. This is a wait-and-see situation; I hope more details will come out about Safari, in particular whether custom search engine settings will be available, or at least more options than just Google- and Bing-related search engines.
Still no tool to use one’s selfie camera to generate a Memoji that looks like you, which would seem like an obvious feature to add, no?
Not a word on the App Store situation, commissions, or current issues.
Also, no mention of the expected Safari ad blocker, which makes me hopeful about the previous point: we will learn more about Safari’s upcoming features later. Yes, this is wishful thinking.
The AI highlight card in Safari is a great feature for the user, but I think this is something that could also be handled by all browsers and websites at the metadata level, without automatically needing AI. Like I mentioned when talking about blog post publishing dates, browsers could display some information from a website directly in the toolbar: publishing date, phone number, address, reading time, etc.
I found Tim Cook a bit out of tune with the rest of the speakers, who were all pretty good and energetic; Cook seemed tired and not very enthusiastic. There was also a weird transition before he mentions Apple Intelligence for the first time, which makes me think they shot this video not too long ago and rushed some parts of the editing. Were they waiting for the OpenAI deal to be green-lit?
If I were in charge of the editing, I would have started the whole show with the AI stuff, so that later, when they mentioned all their “machine learning” enabled features, they could also have used the Apple Intelligence name, avoided repetition, and spread out AI use cases more evenly throughout the video.
Siri’s improvements are very promising, but the initial bar was admittedly very low. It’s also interesting that they kept the Siri name for the assistant only, and used “Apple Intelligence” for the other use cases of the AI.
Finally: the ability to change shortcut buttons on the lock screen, and — lo and behold — to move icons anywhere on the home screen. No more need to put random widgets at the top of the screen to keep your most-used apps at the bottom, where they’re easier to reach.
Math Notes was, for me, the most convincing and interesting demo, but I’m struggling to see it as anything other than a demo, except maybe for teachers? This would be so good on a blackboard.
I like the UI for the AI proofreading thing: highlighting where the changes are instead of just replacing the text, etc. I’m glad they didn’t go the chatbot way for everything, à la ChatGPT.2
I’m not convinced by the new Photos app layout; I find it a bit strange and confusing. In the current version of Photos, I only navigate through albums and the library. The “For You” moments are good when I see them, but I don’t want that part upfront.
The AI stuff was announced even though it won’t be available to most iPhone users, if I understood correctly: only the iPhone 15 Pro, in the coming months, and only in the US. It feels like this special AI segment was made for Wall Street; otherwise, I think it would have made more sense for Apple to mention AI along with the other OS updates.
I like that email summaries can be displayed right from the inbox, but I wonder what it will mean for the email subject field in the future if this kind of feature becomes more popular.
I’m a bit confused by the new Vitals app and how it is different from the Health app.
I still don’t get the fuss, frustration, and excitement around RCS, or how many people don’t understand what the blue bubbles mean in Apple Messages.
I don’t own an Apple TV, but I loved the attention given to tiny UI details like automatically showing subtitles when you rewind or when the sound is muted. This is a feature I want everywhere, to be honest.
Did the new iPadOS floating toolbar really deserve a visual demo and 45 seconds of the livestream?
See also:
- Kev Quirk’s excellent knee-jerk reaction to WWDC24
- Apple’s Artificial Approach to ‘Apple Intelligence’ (MG Siegler)