Last week, Apple finally published a detailed review of Vision Pro and visionOS data privacy. While the company arguably could have made this available at launch, it helps explain exactly what the spatial computer collects from our environment, what it sends to third-party apps, and much more…
9to5Mac Security Bite is provided exclusively by Mosyle, the only Apple Unified Platform. Everything we do is to ensure Apple devices are ready and secure in the enterprise. Our unique integrated approach to management and security combines Apple's most advanced security solutions to fully automate the security and safety of your data: compliance, next-gen EDR, AI-powered zero trust, and exclusive privilege management with the most powerful and advanced Apple MDM on the market. The result is a fully automated, unified Apple platform trusted by more than 45,000 organizations, making millions of Apple devices work-ready effortlessly and affordably. Request an EXTENDED TRIAL today and see why Mosyle is all you need for your Apple experience.
Privacy, shmivacy: Why should I care?
Many of the security professionals and researchers I talk to mention mixed reality with great trepidation. While consumers are more concerned about the Vision Pro's roughly $4,000 price tag, those in the security industry seem more aware of the dangers. After all, this is a device with six microphones and twelve cameras that you carry around your home.
As I highlighted in a previous Security Bite post, the overall privacy risks associated with the Apple Vision Pro, or any headset, can be alarming. For example, the distance from the ground measured by depth sensors can reveal a user's height. The sound of a passing train can help indicate a physical location. A user's head movements can be used to infer emotional and neurological states. Data collected from the user's eyes is perhaps the most worrisome. Not only could this enable targeted advertising and behavioral profiling, it could also expose sensitive health information; optometrists can often help diagnose medical conditions simply by looking at a patient's eyes.
New Vision Pro environments
Although the environments in Apple Vision Pro appear real, they are created using a combination of camera data and LiDAR to provide a near real-time view of the user's space. In addition, visionOS uses audio ray tracing to simulate how sound waves behave as they interact with objects and surfaces. Apps overlay these scenes or, in some cases, create their own environments.
With the release of the new Vision Pro Privacy Review, we now have a better understanding of what environmental data is sent from the headset and shared with apps.
- Plane estimation: Detects nearby flat surfaces on which virtual 3D objects, or what Apple calls volumes, can be placed. This enhances immersion by allowing users to interact with virtual objects as part of their physical environment (a Swift sketch after this list shows how apps subscribe to this data).
- Scene reconstruction: Creates a polygon mesh that accurately depicts the outlines of objects in the user's physical space. This mesh helps virtual objects align correctly with physical objects in the user's environment.
- Image anchor: Ensures that virtual objects remain locked in specified positions relative to real-world objects, even when the user moves. WSJ's Joanna Stern demonstrated the technology at the beginning of a video posted on X, where she placed multiple timers over objects simmering on the stove.
- Object recognition: Apple says it uses object recognition to identify “objects of interest in your space.” In a broad sense, Vision Pro uses it to determine what is happening in your environment.
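To make the list above concrete, here is a minimal Swift sketch of how a visionOS app running in a Full Space might subscribe to plane-estimation and scene-reconstruction data through ARKit's data providers (image anchoring works similarly via ImageTrackingProvider). This is an illustration based on Apple's public ARKit APIs, not code from the privacy review, and the class name is mine:

```swift
import ARKit

// Minimal sketch: subscribing to plane and scene-mesh updates in a
// visionOS Full Space. Requires world-sensing permission (see below)
// and an open immersive space; error handling is abbreviated.
final class EnvironmentTracker {
    let session = ARKitSession()
    // Plane estimation: nearby flat surfaces where volumes can be placed.
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
    // Scene reconstruction: a polygon mesh outlining the user's space.
    let sceneMesh = SceneReconstructionProvider()

    func start() async {
        do {
            try await session.run([planes, sceneMesh])
        } catch {
            print("ARKitSession failed to start: \(error)")
            return
        }

        // Stream plane anchors as they are added, updated, or removed.
        Task {
            for await update in planes.anchorUpdates {
                print("Plane \(update.event): \(update.anchor.classification)")
            }
        }

        // Stream mesh anchors describing reconstructed scene geometry.
        Task {
            for await update in sceneMesh.anchorUpdates {
                print("Mesh \(update.event) for anchor \(update.anchor.id)")
            }
        }
    }
}
```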
By default, apps cannot access environmental data on Vision Pro. But to make gameplay and other experiences more realistic, third-party developers may need it. Granting access is a tap-to-allow process similar to giving an app access to Photos or the camera on iPhone; once permitted, an app running in a Full Space can use environmental data to provide a more immersive experience.
“For example, Encounter Dinosaurs requests access to your environment so that dinosaurs can break through your physical space. By giving an app access to environmental data, it can map the world around you using a scene mesh, recognize objects in your environment, and determine the location of specific objects in your environment,” Apple explains.
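For developers, that consent flow corresponds to ARKit's world-sensing authorization. A minimal sketch, assuming the standard ARKitSession API (the helper function name is mine):

```swift
import ARKit

// World sensing covers plane detection, scene meshes, and image anchors.
// visionOS shows a one-time prompt, and the app's Info.plist must include
// an NSWorldSensingUsageDescription string explaining why access is needed.
func requestWorldSensing(_ session: ARKitSession) async -> Bool {
    let results = await session.requestAuthorization(for: [.worldSensing])
    return results[.worldSensing] == .allowed
}
```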
However, an app only has access to information about your surroundings within five meters of you. This is why immersive elements such as shadows and reflections are not rendered beyond that distance.