Since iOS 15, the Photos app has been able to analyze all the images stored in your iPhone or iPad’s photo library and extract the text they contain. This extension of the excellent Live Text feature is extremely handy for finding a photo from a snippet of text, but, oddly enough, the Photos app’s own search engine couldn’t search that text. Finding the images required going through Spotlight, iOS’s system-wide search engine.
[Image: Live Text preview in iOS 15]
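Apple doesn’t document the indexing pipeline Photos uses behind the scenes, but the same on-device text recognition is exposed to developers through the Vision framework. As a rough sketch of how an app could pull the text out of a single photo (the `recognizedText` helper name is purely illustrative):

```swift
import UIKit
import Vision

/// Runs on-device OCR over a UIImage and returns the recognized strings.
/// Illustrative helper; Photos' own indexing pipeline is not public.
func recognizedText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { completion([]); return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the best candidate for each piece of detected text.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])       // OCR runs entirely on device
    }
}
```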
Luckily, iOS 16 fixes this odd oversight. The Photos app can now dig into its own database and find images based on the text they contain. It’s also more convenient than going through Spotlight, especially since the search persists even if you switch to another app; with iOS 15 you had to start the search over every time, which quickly became tedious.
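The index Photos builds for itself isn’t exposed to third-party apps, but a naive approximation of the same search could enumerate the library with PhotoKit and filter assets by a query string, reusing the `recognizedText` sketch above (again, purely illustrative; a real app would cache results instead of re-running OCR, and photo library authorization is assumed to have been granted):

```swift
import Photos
import UIKit

/// Naive sketch: fetch image assets and keep those whose recognized text
/// contains the query. Assumes PHPhotoLibrary access was already authorized.
func searchLibrary(for query: String, completion: @escaping ([PHAsset]) -> Void) {
    let assets = PHAsset.fetchAssets(with: .image, options: nil)
    var matches: [PHAsset] = []
    let group = DispatchGroup()

    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true       // allow iCloud originals to download
    options.deliveryMode = .highQualityFormat   // single callback per request

    assets.enumerateObjects { asset, _, _ in
        group.enter()
        PHImageManager.default().requestImage(
            for: asset,
            targetSize: CGSize(width: 1024, height: 1024),
            contentMode: .aspectFit,
            options: options
        ) { image, _ in
            guard let image = image else { group.leave(); return }
            recognizedText(in: image) { strings in
                DispatchQueue.main.async {
                    if strings.contains(where: { $0.localizedCaseInsensitiveContains(query) }) {
                        matches.append(asset)
                    }
                    group.leave()
                }
            }
        }
    }
    group.notify(queue: .main) { completion(matches) }
}
```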

Otherwise, search works and looks the same in Spotlight and in the Photos app. In particular, the text you searched for is highlighted in yellow on the photos. When the text is too small to read without zooming in, as in this Maps screenshot, a yellow dot is displayed first to mark its position.
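That yellow highlight corresponds to the bounding box Vision reports for each piece of recognized text, in normalized coordinates with the origin at the bottom-left. A small sketch of converting those rectangles into UIKit image coordinates for drawing an overlay (assuming the observations come from a request like the one above):

```swift
import UIKit
import Vision

/// Converts Vision's normalized bounding boxes (origin at bottom-left)
/// into UIKit image coordinates (origin at top-left) for drawing highlights.
func highlightRects(for observations: [VNRecognizedTextObservation],
                    in imageSize: CGSize) -> [CGRect] {
    observations.map { observation in
        let box = observation.boundingBox
        return CGRect(x: box.origin.x * imageSize.width,
                      y: (1 - box.origin.y - box.height) * imageSize.height,
                      width: box.width * imageSize.width,
                      height: box.height * imageSize.height)
    }
}
```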

The text search built into Photos removes the need to go through Spotlight. Still, so as not to break old habits, Spotlight in iOS 16 continues to show the results provided by the Photos app, and there is now a shortcut to continue the search in the app itself. Photos takes over and displays the results as shown above.

This isn’t the only Live Text-related feature in iOS 16, or even the most important:
iOS 16: “Live Text” works in videos, in third-party apps and is even more convenient