You may have spotted the white Apple cars on the road, each equipped with a camera system, or even encountered backpackers carrying cameras. They are photographing streets to feed the image database behind Look Around, Apple's 360° equivalent of Google Maps' Street View.
Blurred photos in Maps
These image-collection campaigns regularly cover regions and cities in France and elsewhere in the world, and you can check where Apple's collection teams will be operating on the company's dedicated page. That page was recently updated to state that, in addition to Look Around, the photos will also be used to develop and improve other Apple products and services, starting this March.
These are not raw images, however: Apple specifies that it only uses photos that have already been processed to blur faces and license plates. It also remains possible to request that a face, a license plate, or a house be made entirely unidentifiable.
What will these photos be used for, specifically? Apple explains that the data from these images will be used to "train models that power Apple products and services, including models related to image recognition, creation, and enhancement." Apple Intelligence already offers several image-related functions, such as the Clean Up tool in the Photos app for erasing unwanted objects, the creation of Memory movies, and Visual Intelligence.
Visual Intelligence is the feature most likely to benefit. This Google Lens equivalent, which recognizes places and objects, currently relies on Google and ChatGPT to provide information about an image; Apple could eventually replace these two providers with its own models.
Source: 9to5Mac