With the launch of iOS 17.2, Apple has outlined the Maps-related data that it is collecting in order to improve the augmented reality location function. In a new support document, Apple says that it is aiming to bolster the speed and accuracy of augmented reality features in the Maps app.
When using augmented reality features in Maps, including immersive walking directions or the refine location option, Apple collects information on "feature points" that represent the shape and appearance of stationary objects like buildings. The data does not include photos or images, and the feature points collected are not readable by a person.
According to Apple, Maps uses on-device machine learning to compare feature points to Apple Maps reference data that is sent to the iPhone. The camera filters out moving objects like people and vehicles, with Apple collecting just the feature points of stationary objects.
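Apple doesn't document the matching algorithm, but conceptually the comparison is a nearest-neighbor search: each feature point the camera observes is paired with the closest point in the downloaded reference data. A minimal sketch, assuming for illustration that feature points are simple 2-D coordinates (the real descriptors are opaque and higher-dimensional):

```python
import math

def match_feature_points(observed, reference):
    """Pair each observed feature point with its nearest reference point.

    Toy illustration only: actual feature points are opaque,
    high-dimensional descriptors, not 2-D coordinates.
    """
    matches = []
    for point in observed:
        nearest = min(reference, key=lambda ref: math.dist(point, ref))
        matches.append((point, nearest))
    return matches
```

A strong set of such matches is what lets the device work out where the camera sits relative to the mapped environment.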
The comparison between the feature points and the Apple Maps reference data allows Maps to pinpoint a user location and provide detailed walking directions with AR context. Using either the AR Walking directions or Refine Location refreshes Apple's reference data to improve augmented reality accuracy.
Data that Apple collects is encrypted and not associated with an individual user or Apple ID. Apple also uses on-device machine learning to add "noise" to the feature point data, introducing irregular variations that make it unlikely the feature points could be used to reconstruct an image.
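Apple hasn't published how the noise is generated, but the idea resembles randomized perturbation as used elsewhere in privacy-preserving data collection: a small, random offset is added to each value so the original geometry can no longer be recovered exactly. A hypothetical sketch (the function and the `scale` parameter are illustrative, not Apple's mechanism):

```python
import random

def add_noise(points, scale=0.05):
    """Perturb each coordinate of each feature point with a small,
    irregular random offset. Illustrative only; Apple's actual
    noise mechanism is not public."""
    return [tuple(c + random.uniform(-scale, scale) for c in point)
            for point in points]
```

Because the perturbation is random per point, even an attacker holding the noisy data cannot undo it to recover the exact shapes the camera saw.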
According to Apple, only an "extremely sophisticated attacker" that has access to the company's encoding system would be able to recreate an image from feature points, but because the data is encrypted and limited only to Apple, "an attack and recreation are extremely unlikely."
Users can opt out of this data collection entirely: the "Improve AR Location Accuracy" toggle can be found in the Settings app by going to Privacy & Security and then tapping Analytics & Improvements.
Top Rated Comments
You already can add public images to locations on Maps but it isn't a very easy or obvious process. That base of crowd-sourced images and reviews is what gives Google Maps an edge in certain cases.
The route to your destination is sometimes wrong, making trips longer. It has been a mess since iOS 6, though it's much better now than back then, that's for sure. Still, I can't see them equaling or overtaking Google in the next five years at least. I had good times with the native Google Maps app back then: good data and a good, smooth app, just like Apple Maps.
This is also a reason why they lag behind. In the USA it would have a big impact, since iPhones are a solid 50% of the market. Here in Europe it's less, but it would still make a big difference.
Just the other day I went to a supermarket on my street using Google directions, but the supermarket was placed wrong (rare, but it happens). I changed it in one minute. The fun thing is that Google tracks and tells you how many people have searched for that specific place since you changed it: 70k people had already searched for it, and it took just one person to fix it and help that many.
I'm not saying people have to basically work for them, but little changes here and there are useful. Though in the state Apple Maps is in now, you basically would have to work for them to get the map in order. Apple just needs to get it right.