Apple’s Visual Intelligence rivals Google Lens — here’s how to use it

Apple’s latest AI innovation: Visual Intelligence, a new feature in the iOS 18.2 developer beta, brings Google Lens-style functionality to iPhone 16 models, changing how users interact with their surroundings through the device’s camera.

Key features and functionality: Visual Intelligence allows users to perform a variety of tasks using their iPhone’s camera, including object description, price lookup, and business information retrieval.

  • The feature utilizes ChatGPT to provide detailed descriptions of captured images and can answer follow-up questions.
  • Visual Intelligence can extract key details, such as email addresses and phone numbers, from images (see the code sketch after this list).
  • Users can opt to use Google search instead of ChatGPT for visual searches, providing access to similar images and shopping prices.
  • The tool also offers text summarization and translation capabilities for captured images.
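
Apple has not published a developer API for Visual Intelligence, so its extraction step cannot be called directly from code. As a rough, hedged sketch of how comparable on-device extraction works with Apple’s public frameworks, the Swift example below uses the Vision framework’s VNRecognizeTextRequest to read text from an image and NSDataDetector to pull out phone numbers and email addresses; the function name and image URL parameter are illustrative, not part of Visual Intelligence itself.

```swift
import Foundation
import Vision
import ImageIO

// Hedged sketch: Visual Intelligence has no public API, so this uses Apple's
// public Vision framework plus NSDataDetector to illustrate the same kind of
// extraction (phone numbers and email addresses) from a captured image.
func extractContactDetails(from imageURL: URL) throws -> [String] {
    // Load the image as a CGImage.
    guard let source = CGImageSourceCreateWithURL(imageURL as CFURL, nil),
          let cgImage = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
        return []
    }

    // Step 1: recognize text in the image with the Vision framework.
    let textRequest = VNRecognizeTextRequest()
    textRequest.recognitionLevel = .accurate
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([textRequest])

    let recognizedText = (textRequest.results ?? [])
        .compactMap { $0.topCandidates(1).first?.string }
        .joined(separator: "\n")

    // Step 2: scan the recognized text for phone numbers and links
    // (NSDataDetector reports email addresses as mailto: links).
    let detector = try NSDataDetector(
        types: NSTextCheckingResult.CheckingType.phoneNumber.rawValue
             | NSTextCheckingResult.CheckingType.link.rawValue)
    let fullRange = NSRange(recognizedText.startIndex..., in: recognizedText)

    return detector.matches(in: recognizedText, options: [], range: fullRange)
        .compactMap { $0.phoneNumber ?? $0.url?.absoluteString }
}
```

In a real app this would run on the photo captured in the shutter step described below; the sketch only mirrors the text-extraction bullet above, since the ChatGPT and Google search integrations are handled by the system and have no public entry points.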

How to use Visual Intelligence:

  1. Long press the Camera Control button to launch Visual Intelligence, which is indicated by a rainbow-colored outline on the screen.
  2. Point the camera at the desired object or scene and tap the on-screen shutter button to capture and analyze the image.
  3. For additional information, tap the chat bubble icon to interact with ChatGPT or use the Google search button for alternative results.

Availability and system requirements: The feature is currently limited to iPhone 16 models running the iOS 18.2 developer beta, with an official release expected in December.

  • Visual Intelligence requires an active data connection to function properly.
  • The tool currently works only in portrait (vertical) orientation; landscape support is not yet available.

Broader context: Visual Intelligence is part of Apple’s larger push into AI-powered features, collectively known as Apple Intelligence.

  • Other Apple Intelligence features introduced in iOS 18.1 include Clean Up for object removal in photos and Writing Tools for style adjustments.
  • Additional functionalities like call recording, proofreading, and text-based Siri interactions have also been introduced.

Looking ahead: As Apple continues to develop and refine its AI capabilities, Visual Intelligence marks a significant step toward competing with established tools like Google Lens and improving the overall iPhone experience.

  • The official release of iOS 18.2 may bring further improvements or changes to Visual Intelligence based on user feedback and testing during the beta phase.
  • This development signals Apple’s commitment to integrating AI technology more deeply into its ecosystem, potentially leading to more advanced features in future updates.