
5 Ways Visual Intelligence Will Make Your Life Easier

I’ve been using Apple’s new Visual Intelligence feature since the iOS 18.2 developer beta was released a few weeks ago. At first I thought it would be one of those features that would just be cool to show off to my friends, but that I wouldn’t actually use. But as the days and weeks went by, I found myself using it repeatedly in a multitude of situations. Here are some of the best ways to use Visual Intelligence, and I’ll even show you how to use it on devices that don’t have the Camera Control button! Let’s go.

Be sure to watch our video below to see all of these use cases in action; I even show five additional use cases you might find useful.

Finally, a quick reminder: iOS 18.2 is still in developer and public beta as of this writing. So if you are running iOS 18.1, this feature will not be available yet. iOS 18.2 is expected to be released to the public in mid-December.

How to activate Visual Intelligence

Oddly, there is no dedicated setting for this feature. You receive a prompt when you first update to iOS 18.2, telling you how to enable it. But then that’s it. Technically, this only works on iPhone 16 models, as you need the Camera Control button to activate it. All you need to do is long-press the button to access Visual Intelligence.

1. Summarize and read books aloud

One of the best use cases I’ve found is that you can use Visual Intelligence to read almost any text aloud. I found this extremely useful for books, and it could even somewhat replace audiobooks in certain situations. Using it is very simple:

  • Open Visual Intelligence
  • Point your phone at the text you want summarized or read aloud
  • Your iPhone will recognize that it’s text and ask if you want it summarized or read aloud
  • Select an option
  • Sit back and relax

As someone who needs an audiobook and the physical book to follow along, this is amazing. I can simply take a photo of a page and ask Siri to read it out loud to me. Try this with any text, from contracts to instructions to books. It’s extremely useful.
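If you’re curious what’s happening under the hood, Apple hasn’t said how Visual Intelligence actually does this, but you can get a rough idea from the public frameworks developers already have. Here’s a minimal Swift sketch, purely as an illustration and not Apple’s implementation, that recognizes text in a photo with the Vision framework and reads it aloud with AVFoundation:

```swift
import AVFoundation
import UIKit
import Vision

// Keep a reference to the synthesizer so speech isn't cut off when the
// function returns.
let synthesizer = AVSpeechSynthesizer()

/// Purely illustrative: recognize the text in a photo and read it aloud.
/// This is NOT Apple's Visual Intelligence implementation.
func readPageAloud(from image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // Ask the Vision framework to find text in the image.
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Join the best candidate for each detected line into one block of text.
        let pageText = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")

        // Hand the recognized text to the speech synthesizer.
        let utterance = AVSpeechUtterance(string: pageText)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }
    request.recognitionLevel = .accurate

    // Run the text-recognition request on the photo.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Presumably the real feature layers summarization and nicer voices on top of something similar, but the basic recognize-then-speak idea is the same.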

2. Reverse image search

One of the main functions of Visual Intelligence is search. This kind of feature has been around for a while on other phones and in other apps, but it’s great that it’s now built into iOS. You can take a Visual Intelligence photo of anything, and it will give you two options: Ask and Search. For a reverse image search, just tap the Search option and it will find similar items on Google. Here you can see that I took an image of an iPhone and it found iPhones near me that I could buy. It works with almost anything you’re looking for!

3. ChatGPT Ask Feature

As I mentioned above, two main functions will always be available when using Visual Intelligence: Ask and Search. We’ve already covered Search, so what is Ask? This is where ChatGPT comes in. If you take a Visual Intelligence photo and tap the Ask option, it will upload that image to ChatGPT and let it interpret what it sees. It works with anything and everything, and it’s surprisingly good at picking up on small details. One thing I’ve noticed is that it tends to avoid naming specific types of intellectual property, like cartoon characters. But it will still describe them and give you as much information as possible without saying the character’s name.
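Apple hasn’t documented how the Ask hand-off works behind the scenes, but conceptually it boils down to sending your photo and your question to a vision-capable model. Here’s a rough Swift sketch of that idea using OpenAI’s public chat-completions API; the model name, function name, and API-key handling are my own assumptions, not how Apple’s integration actually works:

```swift
import Foundation

/// Purely illustrative: send a photo plus a question to a vision-capable model
/// via OpenAI's chat-completions API. Not Apple's ChatGPT integration.
func ask(question: String, aboutImage imageData: Data, apiKey: String) async throws -> String {
    let base64Image = imageData.base64EncodedString()

    // Build a message containing both the question text and the image.
    let body: [String: Any] = [
        "model": "gpt-4o",
        "messages": [[
            "role": "user",
            "content": [
                ["type": "text", "text": question],
                ["type": "image_url",
                 "image_url": ["url": "data:image/jpeg;base64,\(base64Image)"]]
            ]
        ]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the model's reply text out of the JSON response.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return (message?["content"] as? String) ?? ""
}
```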

4. Real-time business information

This Visual Intelligence feature gives you a taste of augmented reality. Now you can simply open Visual Intelligence, point it at a business, and you’ll get all the information you need. You don’t even need to take a photo; just point the Visual Intelligence camera at the storefront and it will start working its magic. The example I used in my video was a local coffee shop: I pointed my phone at the cafe and it was immediately recognized. The Apple Maps listing appears along with options such as:

  • Call
  • Menu
  • Order
  • Hours
  • Rating on Yelp
  • Photos

All of this is intelligently recognized and you don’t need to dig through menus. Just point your phone and everything is at your fingertips.
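To give a rough sense of where that information comes from, here’s a small Swift sketch using the public MapKit API. It’s only an illustration with hypothetical function and parameter names: once a business name has been recognized, a local search can return the same kind of listing details (name, phone number, website) that the Visual Intelligence card surfaces.

```swift
import MapKit

/// Purely illustrative: look up a recognized business and print a few of the
/// details that the Visual Intelligence card shows.
func lookUpBusiness(named name: String, near region: MKCoordinateRegion) {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = name   // e.g. the coffee shop name the camera recognized
    request.region = region               // search around the user's current location

    MKLocalSearch(request: request).start { response, _ in
        guard let place = response?.mapItems.first else { return }

        // A few of the details shown on the Apple Maps listing.
        print(place.name ?? "Unknown business")
        print(place.phoneNumber ?? "No phone number listed")
        print(place.url?.absoluteString ?? "No website listed")
    }
}
```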

5. Problem Solving

This is one of those features I wish I had back in 2012 when I was in calculus class. You can take a Visual Intelligence image of any math problem and then ask ChatGPT to solve it. You get the step-by-step method for solving these equations or problems. I remember spending so long on geometry proofs and always having to show my work. Now you can just take a photo and Siri will do the rest.

Final Take

As I mentioned, all of these features are still in beta, so they will continue to learn and improve over time. But it’s amazing that this is now built into the operating system and we don’t need other apps to achieve this type of use case. I use Visual Intelligence more and more as the days go by, both in my personal and professional life. The crazy thing is, this is the worst it will ever be, and it will only get better!

Be sure to watch our video here to get even more use cases and see what using these new features really looks like. What do you think about Visual Intelligence? Is this something you would use? Have you installed the beta version on your devices? Let’s discuss it in the comments below!

FTC: We use automatic, revenue-generating affiliate links. More.