Hands-on with Apple Intelligence

I've had some time with Apple Intelligence now, having used iOS 18.1 for several weeks and iOS 18.2 for most of today.

I like the notification summaries. There have been several instances where, just by glancing at my phone, I could see and even understand what was waiting for me in my inbox. It could be better; sometimes it gets the order of information wrong, or states the opposite of what the message actually says, but those mistakes are relatively infrequent, and the feature is legitimately helpful.

Pressing the camera button and doing an image search is compelling. I took a picture of my dog, a corgi-shepherd mix, and when I asked it what kind of dog it saw, it was dead on. It provided info on everything I tried, including pulling up details about the game I was playing from a shot of my TV. This was the first time AI felt like magic to me. I'm still trying to figure out how useful it will be in the real world; having it on a work trip I'm taking in early November will be interesting.

The Image Playground stuff is as ridiculous as I assumed it would be. I don't see myself ever using it beyond some initial experimentation; thankfully, you can remove the app. There are some image generation features elsewhere in the system, but they're easy enough to ignore.

As I mentioned in my initial thoughts back at WWDC, I'm more impressed with Apple's approach to all this than with the heavy-handed, borderline dangerous way Google and Microsoft have gone about it. Hopefully, it stays that way.