
Is Apple’s New 'Intelligence' Feature Really Smart? A Deep Dive into iOS Visual Search and Siri AI Limitations

Introduction

I tried Apple’s new ‘Intelligence’ features on the iOS developer beta, and the first thing I felt was that I was being scammed, so hear me out.

The Misleading “Apple Intelligence”

Apple calls it Apple Intelligence, a suite of AI-powered Visual Intelligence features. But when I press and hold the Camera Control to do a visual search and then tap the magnifying glass, it says “Searching with Google,” then slaps Google Lens results on the screen in a small overlay window. Similarly, if I want to ask a question about the image by tapping the “Ask” button, it invokes ChatGPT.

Where is the Apple Intelligence?

So can anyone tell me, where is the Apple intelligence here? Apple is literally using Google’s and OpenAI’s smart features and calling them Apple Intelligence. But this is just the beginning.

Limited Functionality: A Comparison with Google Lens

The second problem is that you’re getting the most basic version of both.

Apple Intelligence vs. Google App

Let’s start with Google Lens. On the left, I’m using Apple Intelligence; on the right, I’m using the Google app to run the exact same visual search on the same iPhone Pro Max. With Apple Intelligence, all I get is the search results in a web container, and that’s pretty much it. In the Google app, I can do the same, plus I can add extra words to the query to make it more specific: checking the price, looking for a different color, finding nearby stores that sell the same product, and so on.

The Value Comparison: Google App is Free

In this example, you can clearly see that Apple Intelligence gives you less, while you can get far more features by using the Google app, completely free of charge, on your iPhone. And the same story repeats itself with ChatGPT.

The Limitations of ChatGPT Through Apple Intelligence

Using ChatGPT via Apple Intelligence vs. ChatGPT App

When I use ChatGPT through Apple Intelligence, all I get is some text on the screen, and it doesn’t read it back. That makes the ChatGPT app a much better option, since I can listen to the answer while doing something else.

Why Apple Intelligence is Not Available on Older iPhones

So why is Apple Intelligence not available on iPhones prior to the Pro? It isn’t even Apple’s technology. Plus, Google and ChatGPT offer more features in their own apps, which you can install on any iPhone running iOS or later for the Google app and iOS or later for ChatGPT.

The iPhone Models Limitation

Another thing that made the whole experience feel more like a scam is that only the iPhone models support visual search. And guess what? That’s because only the iPhone models have the Camera Control, as if Apple couldn’t put a button on the screen to trigger the same feature on older models. They want to give the otherwise useless Camera Control a fake value by intentionally making it mandatory when it isn’t.

Getting the Same Experience on Older iPhones

Again, I can download these apps and get an even better experience without needing to purchase the iPhone.


Siri with Apple Intelligence: Limited Functionality

Visual Search Limitations

That’s it when it comes to visual search. Now let’s move on to the next feature: Siri with Apple Intelligence.

Siri and ChatGPT Integration

Now let’s talk about Siri. Once more, Apple used ChatGPT to make Siri look smarter, but the overall experience is not great.

A Demo: Siri and ChatGPT for Recipe Assistance

Let me show you a quick demo: “Ask ChatGPT how to make lasagna.” As you can see, all it gives me is text on the screen that I have to read myself. My only option is to copy the text; I cannot listen to it. I have my volume set to the max, and it doesn’t read anything back.

ChatGPT App vs. Siri for Recipe Instructions

So let’s try the ChatGPT app: “How to make a lasagna.” “Making a lasagna is a bit of an art, but here’s a simple way to do it: first, preheat your oven to °F, cook your lasagna noodles according to the package, and while they’re boiling, prepare your meat sauce. Brown some ground beef or Italian sausage.”

Siri as an Operator for ChatGPT

As you saw, Siri acts as an operator that takes my own words and passes them over to ChatGPT. Plus, it doesn’t read back the responses, which is a huge bummer. The ChatGPT app offers a more intuitive experience, as if I’m talking to someone, so I can use it while driving, cooking, or walking.

Final Thoughts

The Overall Shortcomings of Apple Intelligence

So overall, when it comes to Siri and visual intelligence, Apple didn’t offer anything of its own. Not only did they rely on other companies to do their part, they also delivered the most basic experience you can get from these services. Plus, they made these features exclusive to the latest and greatest iPhone, while any iPhone user can download the third-party apps and enjoy the same experience, if not a better one.

Where Apple Did Well

That said, I think Apple did well in other areas, like the writing tools and the AI features in the Photos app, which I’m going to cover in future articles and compare with the same features offered by Samsung and Google.
