Apple’s Visual Intelligence rivals Google Lens — here’s how to use it

Apple’s latest AI innovation: Visual Intelligence, a feature in the iOS 18.2 developer beta, brings Google Lens-like functionality to iPhone 16 models, changing how users interact with their surroundings through the device’s camera.

Key features and functionality: Visual Intelligence allows users to perform a variety of tasks using their iPhone’s camera, including object description, price lookup, and business information retrieval.

  • The feature uses ChatGPT to provide detailed descriptions of captured images and can answer follow-up questions.
  • Visual Intelligence can extract important information like email addresses and phone numbers from images.
  • Users can opt to use Google search instead of ChatGPT for visual searches, providing access to similar images and shopping prices.
  • The tool also offers text summarization and translation capabilities for captured images.

How to use Visual Intelligence:

  1. Long press the Camera Control button to launch Visual Intelligence, which is indicated by a rainbow-colored outline on the screen.
  2. Point the camera at the desired object or scene and tap the on-screen shutter button to capture and analyze the image.
  3. For additional information, tap the chat bubble icon to interact with ChatGPT or use the Google search button for alternative results.

Availability and system requirements: The feature is currently limited to iPhone 16 models running the iOS 18.2 developer beta, with an official release expected in December.

  • Visual Intelligence requires an active internet connection to function.
  • The tool currently works only in portrait orientation; landscape support is not yet available.

Broader context: Visual Intelligence is part of Apple’s larger push into AI-powered features, collectively known as Apple Intelligence.

  • Other Apple Intelligence features introduced in iOS 18.1 include Clean Up for object removal in photos and Writing Tools for style adjustments.
  • Additional functionalities like call recording, proofreading, and text-based Siri interactions have also been introduced.

Looking ahead: As Apple continues to develop and refine its AI capabilities, Visual Intelligence represents a significant step toward competing with established tools like Google Lens and improving the overall iPhone user experience.

  • The official release of iOS 18.2 may bring further improvements or changes to Visual Intelligence based on user feedback and testing during the beta phase.
  • This development signals Apple’s commitment to integrating AI technology more deeply into its ecosystem, potentially leading to more advanced features in future updates.