Trying Out the Latest Apple Intelligence Features on iOS
I tried the latest Apple Intelligence features on the iOS Developer Beta, and the first thing I felt was that I was being scammed. So, hear me out.
Apple Intelligence: What’s Really Happening?
Apple brands its suite of AI features, including Visual Intelligence, as “Apple Intelligence.” But when I press and hold the Camera Control to do a visual search and then tap the magnifying glass, it says “Searching with Google.” Then it slaps the Google Lens search results on the screen in a small overlay window.
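To picture how little is going on here, here’s a minimal sketch of the “results in a small overlay” pattern, assuming an ordinary Google search URL and Apple’s standard SFSafariViewController. This is just an illustration of the idea, not Apple’s actual Visual Intelligence code:

```swift
import SafariServices
import UIKit

// Minimal sketch: presenting Google search results in a compact,
// sheet-style web overlay. The query stands in for whatever the
// visual search identified; this is not Apple's implementation.
func showGoogleResults(for query: String, from host: UIViewController) {
    var components = URLComponents(string: "https://www.google.com/search")!
    components.queryItems = [URLQueryItem(name: "q", value: query)]
    guard let url = components.url else { return }

    let overlay = SFSafariViewController(url: url)
    overlay.modalPresentationStyle = .pageSheet  // small overlay window
    host.present(overlay, animated: true)
}
```

In other words, the overlay itself is commodity plumbing; the actual intelligence lives on Google’s side.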
The Reality of Apple’s Visual Search
Similarly, if I want to ask a question about the image by tapping the “Ask” button, it invokes ChatGPT. So, can anyone tell me, where is the Apple Intelligence here? Apple is literally using Google’s and OpenAI’s smart features and calling it “Apple Intelligence.” But this is just the beginning.
Basic Features: The Apple Intelligence Limitations
The second problem is that you are getting the most basic version of both.
Comparing Google Lens and Apple Intelligence
So, let’s start with Google Lens. On the left, I’m using Apple Intelligence; on the right, I’m using the Google app to run the exact same visual search on the same iPhone Pro Max. With Apple Intelligence, all I get is the search results in a web container, and that’s pretty much it. In the Google app, I get the same results plus the ability to add extra words to the query to make it more specific: checking the price, finding a different color, locating nearby stores that sell the same product, and so on.
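That query refinement is conceptually simple: take the recognized subject and append your extra words before searching. Here’s a rough sketch of the idea; “running shoes” is a made-up stand-in for whatever the image recognition identified, and this uses a plain Google search URL rather than the private Lens API:

```swift
import UIKit

// Rough sketch: refining a visual search by appending extra words to
// the query, the way the Google app allows. The base query stands in
// for whatever the image recognition identified.
func refinedSearch(baseQuery: String, refinement: String) {
    var components = URLComponents(string: "https://www.google.com/search")!
    components.queryItems = [URLQueryItem(name: "q", value: "\(baseQuery) \(refinement)")]
    if let url = components.url {
        UIApplication.shared.open(url)  // hand off to the browser
    }
}

refinedSearch(baseQuery: "running shoes", refinement: "nearby stores")
```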
Limited Capabilities with ChatGPT Integration
This example makes it clear that Apple Intelligence gives you less, while the Google app gives you far more, completely free of charge, on your iPhone. And the same story repeats itself with ChatGPT. When I use it through Apple Intelligence, all I get is some text on the screen, and it isn’t read back to me. That makes the ChatGPT app a much better option, since I can listen to the answer while doing something else.
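The read-aloud part that Apple leaves out isn’t exotic, either; iOS has shipped a text-to-speech API for years. Here’s a minimal sketch using AVSpeechSynthesizer, with a placeholder string standing in for a real ChatGPT reply:

```swift
import AVFoundation

// Minimal sketch: reading an assistant's reply aloud with
// AVSpeechSynthesizer, Apple's long-standing text-to-speech API.
// The reply string is a placeholder, not real ChatGPT output.
let synthesizer = AVSpeechSynthesizer()

func speak(_ reply: String) {
    let utterance = AVSpeechUtterance(string: reply)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}

speak("First, preheat your oven to 375 degrees Fahrenheit...")
```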
Device Limitations and In-App Restrictions
Apple Intelligence on iPhone Pro Models Only
So, why isn’t Apple Intelligence available on iPhones other than the Pro models? It isn’t even Apple’s technology! On top of that, Google and ChatGPT offer more features in their own apps, which you can install on any iPhone running iOS 16.0 or later for the Google app and iOS 16.1 or later for ChatGPT.
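Those requirements are purely software checks on the third-party side. Here’s a sketch of how apps typically gate a feature by OS version at runtime; the version number simply mirrors the apps’ store listings:

```swift
// Sketch: gating a feature by iOS version in software, with no hardware
// check involved. This mirrors how third-party apps support every iPhone
// that can run the required iOS release.
func enableAssistantFeature() {
    if #available(iOS 16.1, *) {
        print("Feature available")               // modern path
    } else {
        print("Update iOS to use this feature")  // graceful fallback
    }
}
```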
Camera Control and Visual Search Restrictions
Another thing that made the whole experience feel even more like a scam is that only the iPhone Pro models support visual search. And guess what? That’s because only the iPhone Pro models have the Camera Control. It’s not that Apple can’t put a button on the screen to trigger the same feature on other models; they want to give the otherwise useless Camera Control a fake value by intentionally making it mandatory when it doesn’t need to be. Again, I can download these apps and get an even better experience without having to buy an iPhone Pro.
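And putting a software trigger on screen really is trivial. Here’s a hypothetical SwiftUI sketch, where startVisualSearch() is a placeholder for the feature’s entry point, not a real Apple API:

```swift
import SwiftUI

// Hypothetical sketch: an on-screen control that triggers a visual search.
// startVisualSearch() is a placeholder for the feature's entry point;
// it is not a real Apple API.
struct VisualSearchButton: View {
    var body: some View {
        Button("Visual Search") {
            startVisualSearch()
        }
    }

    private func startVisualSearch() {
        // Placeholder: present the camera and search overlay here.
        print("Visual search triggered from an on-screen button")
    }
}
```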
Wallpaper Customization by In-Depth Tech Reviews
Before moving to the next chapter, let me remind you about the Wallpapers by In-Depth Tech Reviews app. If you like any of the wallpapers I use in my articles, that’s where you can find them; I release new wallpapers every week. The app provides multiple styling options like blur, brightness, and hue to make your wallpaper stand out, lets you edit your home and lock screen wallpapers separately, syncs your favorites across all your devices, and more. The Google Play Store download link is in the description.
Siri with Apple Intelligence: How It Works
That’s it when it comes to visual search. Now, let’s talk about Siri. Once more, Apple used ChatGPT to make Siri look smarter, but the overall experience is not great.
A Siri and ChatGPT Comparison
So, let me show you a quick demo. I asked Siri how to make lasagna, and it handed the question to ChatGPT. As you can see, all it gives me is text on the screen, and I have to read it myself. My only option is to copy the text; I cannot listen to it, and even with my volume set to max, nothing is read back. So, let’s try the ChatGPT app:
“How to make lasagna?”
“Making a lasagna is a bit of an art, but here’s a simple way to do it. First, preheat your oven to 375°F, cook your lasagna noodles according to the package, and while they’re boiling, prepare your meat sauce by browning some ground beef or Italian sausage.”
As you saw, Siri acts as an operator who takes my own words and passes them over to ChatGPT, but it doesn’t read back the responses, which is a huge bummer. Meanwhile, the ChatGPT app gives a more intuitive experience, as if I’m talking to someone, so I can use it while driving, cooking, or walking.
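What Siri is doing here is a plain relay, and the missing read-back step is a line of code away. Here’s a hypothetical sketch of that operator pattern, reusing the speech approach from earlier; askChatGPT(_:) is a placeholder standing in for the actual handoff, not a real API:

```swift
import AVFoundation

// Hypothetical sketch of the "operator" pattern: forward the user's words
// to a chat model, then read the reply aloud, the step Siri currently skips.
// askChatGPT(_:) is a placeholder, not a real API.
let voice = AVSpeechSynthesizer()

func relay(_ question: String) {
    let reply = askChatGPT(question)               // pass the words along
    voice.speak(AVSpeechUtterance(string: reply))  // read the answer back
}

func askChatGPT(_ prompt: String) -> String {
    // Stand-in for a network call to a chat model.
    return "Preheat your oven to 375 degrees, then boil the noodles..."
}
```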
Final Thoughts: Apple’s Approach to Siri and Visual Intelligence
Apple’s Missed Opportunity
So, overall, when it comes to Siri and Visual Intelligence, Apple didn’t offer anything new. Not only did they rely on other companies to do the heavy lifting, but they also delivered the most basic experience you can get from these services. On top of that, they made these features exclusive to the latest and greatest iPhones, while any iPhone user can download the third-party apps and enjoy the same experience, if not a better one.
Future Updates and Comparison Plans
That said, I think Apple did well in other areas, like the writing tools and the AI features in the Photos app, which I’m going to cover in future articles and compare with the equivalent features from Samsung and Google.
For now, let me know what you think in the comments: Do you agree that Apple messed up with Siri and Visual Intelligence, or do you think otherwise?