Google vs. Apple: The Battle of Visual Intelligence

Google may have left Apple’s visual intelligence in the dust before it even arrives on our phones. Let me explain.

How Apple’s Visual Intelligence Works

When Apple announced visual intelligence at the iPhone 16 event, a lot of us were thinking the same thing: it sounds a lot like Google Lens, right? You press the Camera Control button, hold the iPhone up to something you’re curious about, and your phone tells you all about it. It can add an event to your calendar, bring up ChatGPT for more help, or search Google if you spot, say, a bike you might want to buy. Much of this is similar to what you’ve been able to do with Google Lens for a while now, and Google recently announced new features that push Lens even further.

New Features in Google Lens

Google recently added a range of new features, like voice input and video search, to the Google app, and they go beyond what Apple has shown us so far with visual intelligence. And get this: these features work in both the iOS and Android versions of the Google app. Even better, you can get this functionality on the iPhone right now, without waiting for Apple to officially roll out visual intelligence in iOS 18.2 or whatever version it finally hits our iPhone 16s on. In just a second, I’ll show you how to get a similar experience using the Google app on any iPhone, complete with a custom button press.

The Promise of Visual Intelligence

Okay, I know what you’re thinking. We don’t exactly know how visual intelligence will work yet, beyond what Apple showed us at the keynote, because it’s not out and it’s not even in a beta release yet. And yes, I 100% agree: there are a lot of unknowns here. There’s another element at play too. Apple already has something called Visual Lookup.

Visual Lookup vs. Visual Intelligence

Visual Lookup, within the Photos app, lets you take a photo or video of a landmark, plant, or pet and then tap the Visual Lookup icon to learn more about it. How will visual intelligence be any different, or will it tie into this existing system? So many unanswered questions. And given how much I can already do with the existing Google app on the iPhone right now, I’m not holding my breath for game-changing visual intelligence features.

On top of the Google Lens functionality you might already know about, like identifying things you take photos of and finding stuff to buy, here are the new things it can do. Voice input lets you hold the phone up to something, press and hold the shutter button on screen, and ask a question. So if I have a random item in front of me, I can ask, “What is this?” and it pulls up an AI Overview, using Gemini, that gives me an idea of what it is and any other information I might want.

You can also record a video and feed that into Google Lens to learn more about something, and even narrate a question while you do it, just like you would with a still capture. Video search is available if you’re enrolled in the AI Overviews and more experiment within Labs. You can do that by tapping the Labs icon in the top left of the Google app and turning on AI Overviews.

How to Get Google Lens on Your iPhone

Alright, so you’re ready to get some of that visual intelligence life on the iPhone right now. Here’s the formula for getting Google Lens on the iPhone in a way that simulates what it might be like to use visual intelligence when it finally rolls out. First, make sure you have the Google app installed. Then open the Shortcuts app, tap the plus icon to create a new shortcut, and type “Google” into the search actions bar. Google Lens should show up; select it and tap Done. Now go into the Action Button settings in the Settings app, swipe over to the Shortcut option, and select the shortcut you just made.
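If you’d like another way to fire that same shortcut, Shortcuts also supports a documented run-shortcut URL scheme. Here’s a minimal Swift sketch, not part of the original walkthrough, that asks the Shortcuts app to run it from your own code; the shortcut name “Google Lens” and the function name are my assumptions, so match the name to whatever you called yours.

```swift
import UIKit

// Minimal sketch (assumption: your shortcut is named "Google Lens").
// Uses Apple's documented Shortcuts URL scheme:
//   shortcuts://run-shortcut?name=<shortcut name>
func runGoogleLensShortcut() {
    let shortcutName = "Google Lens"
    guard
        let encoded = shortcutName.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed),
        let url = URL(string: "shortcuts://run-shortcut?name=\(encoded)")
    else { return }

    // Hands off to the Shortcuts app, which runs the shortcut and opens Google Lens.
    UIApplication.shared.open(url, options: [:]) { success in
        print("Launched Shortcuts:", success)
    }
}
```

The same shortcuts://run-shortcut URL also works if you paste it into Safari or call it from another shortcut, so you don’t strictly need any code at all.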

Utilizing Shortcuts for Google Lens

Now, if your iPhone doesn’t have the Action Button, all is not lost: you can use Back Tap to start the shortcut. With that same shortcut you just made, go to Settings, then Accessibility, Touch, and Back Tap, and select either Double Tap or Triple Tap, depending on which you prefer. Then scroll down to choose that same shortcut.

One housekeeping note: if you try and run this shortcut from the lock screen, you might get a prompt asking you if you want to run it. You’ll need to unlock your iPhone here, but if you do have Face ID turned on, you’re probably looking at the phone anyway when you’re pulling it out, and it should just unlock automatically.

And one final trick: you can even run this shortcut from Control Center. Swipe down to open it, then long press to edit. Tap to add a control, search for Shortcuts, and add the shortcut you just made.

Final Thoughts

I’m curious to know whether searching with video or voice is something you’d find helpful, or whether you’d even use it. Or do you feel like these tech companies are falling over themselves to rush new AI features to market without a strong enough use case? For me, it’s still early days with all of these new features, but I can tell you for sure that my iPhone’s action button has well and truly been mapped to that Google Lens shortcut now.

I really hope there is something more to visual intelligence than meets the eye when it finally rolls out. So game on, Apple. Time to show us what you’ve got! Thanks so much for watching. I hope you enjoyed the episode. Make sure to drop me a comment with your thoughts about visual intelligence and Google Lens.

 
