Mobile Phone Motion Sensors Found to Leak Tracking Data
A user's information can be accessed by an attacker in many ways, not just from a device's system software.
As Apple addressed its developers at this week's conference, it outlined all sorts of changes coming in its new software releases.
All of them shared the theme of user privacy. Apple is acting to differentiate itself from other device makers by presenting itself as the most privacy-conscious.
But a user's information can be accessed by an attacker in many ways, not just from a device's system software.
A recent paper, "SensorID: Sensor Calibration Fingerprinting for Smartphones," outlined how a device can be fingerprinted (thereby giving a persistent link to its user) just from its hardware characteristics.
The paper's proof of concept used the M-series motion co-processors found in iPhones to generate a device fingerprint. The same kinds of sensors are present on Android devices as well.
Access to these sensors does not require any special permissions, and the data can be read both by a native app installed on the device and by JavaScript when the user visits a website on an iOS or Android device.
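To illustrate how little is needed from a web page, here is a minimal TypeScript sketch that collects gyroscope readings through the standard DeviceMotionEvent API. The 500-sample cutoff is an arbitrary choice for illustration, not something taken from the paper.

```typescript
// Minimal sketch: collecting raw motion samples from a web page.
// Uses the standard DeviceMotionEvent API; note that newer iOS versions
// gate this behind a settings toggle or permission prompt, which is the
// Safari mitigation described later in the article.

type Sample = { x: number; y: number; z: number; t: number };

const samples: Sample[] = [];

function onMotion(event: DeviceMotionEvent): void {
  const rate = event.rotationRate; // gyroscope output, degrees per second
  if (rate && rate.alpha !== null && rate.beta !== null && rate.gamma !== null) {
    samples.push({ x: rate.alpha, y: rate.beta, z: rate.gamma, t: Date.now() });
  }
  // A few hundred samples are plenty for the kind of analysis sketched below.
  if (samples.length >= 500) {
    window.removeEventListener("devicemotion", onMotion);
    console.log(`collected ${samples.length} gyroscope samples`);
  }
}

window.addEventListener("devicemotion", onMotion);
```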
The researchers also found that no user interaction is required to exploit this kind of data access. An advertiser or other third party can gather this tracking data invisibly, without the user ever knowing.
It's well known that MEMS (Micro-Electro-Mechanical Systems) sensors are inaccurate in unique ways. Natural variation during the manufacture of embedded sensors means that the output of each sensor is unique.
iPhones, as well as Google's Pixel 2 and 3 smartphones, compensate for this inaccuracy with a calibration process applied to each individual sensor.
The strength of that calibration correction is a value that inversely relates to the level of inaccuracy found in the sensor, and it can be recovered by analyzing the sensor's output. Bingo: you have been fingerprinted, to roughly 67 bits of globally unique entropy.
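As a rough illustration of why that works, the sketch below estimates a per-axis scale factor from quantized readings by measuring the spacing of the output grid; since the calibration scales the raw quantized output, that spacing reveals a device-specific value. This is a simplification for illustration only, not the paper's actual algorithm, and the NOMINAL_STEP constant is a made-up placeholder.

```typescript
// Hedged sketch of the core idea behind calibration fingerprinting:
// the raw ADC output is quantized, and the per-device calibration scales
// it, so observed readings cluster on a grid whose spacing exposes that
// scale factor. Estimating the spacing per axis recovers a stable value.

const NOMINAL_STEP = 0.061; // hypothetical uncalibrated quantization step

// Estimate the effective quantization step of one axis from the smallest
// positive gap between sorted, de-duplicated readings.
function estimateStep(values: number[]): number {
  const sorted = Array.from(new Set(values)).sort((a, b) => a - b);
  const gaps: number[] = [];
  for (let i = 1; i < sorted.length; i++) {
    const gap = sorted[i] - sorted[i - 1];
    if (gap > 1e-6) gaps.push(gap);
  }
  gaps.sort((a, b) => a - b);
  // The smallest recurring gap approximates one quantization step.
  return gaps.length > 0 ? gaps[0] : NaN;
}

// The ratio of observed step to nominal step acts as one per-axis
// component of a device-specific calibration fingerprint.
function fingerprintAxis(values: number[]): number {
  return estimateStep(values) / NOMINAL_STEP;
}

// Example: feed in the x-axis readings collected by the listener above.
// const gainX = fingerprintAxis(samples.map((s) => s.x));
```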
Now, the researchers told Apple about this back in August of 2018. Apple responded with a fix in iOS 12.2, released in March 2019.
Apple masked the low-level sensor output by adding random noise to the analog-to-digital converter output, and it removed default access to motion sensors in Safari. The company believes these steps will stop any exploitation of the effect in its tracks, at least until iOS 13 arrives.
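A hedged sketch of why the noise step works as a countermeasure: perturbing each reading by about one quantization step erases the grid spacing that the fingerprint depends on. This mirrors the idea rather than Apple's actual implementation.

```typescript
// Adding uniform noise on the order of one quantization step blurs the
// grid structure the fingerprinting relies on, while leaving readings
// accurate enough for ordinary motion sensing. Illustration only.
function addCalibrationNoise(value: number, step: number): number {
  // Uniform noise in [-step/2, +step/2] hides which grid point the
  // reading originally sat on.
  return value + (Math.random() - 0.5) * step;
}

// Example: noisy readings no longer share a clean common spacing, so
// estimateStep() above stops converging on the per-device gain.
// const noisy = samples.map((s) => addCalibrationNoise(s.x, NOMINAL_STEP));
```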
The authors say they told Google about the situation in December 2018. Android manufacturers usually skip the calibration step because of its cost, so those devices are not vulnerable. But higher-end Android devices, like Samsung's Galaxy line, do go the extra mile and so are vulnerable to the privacy intrusion. Google has made no patch response as of yet. Maybe the idea of tracking devices seems good to the company.
Apple is acting in a way that shows it to be serious about privacy, just as it told developers. Compared to other areas, SensorID is a relatively small privacy issue; it was far more "potential" than "actual." However, mitigation -- which was not going to be simple -- could only be done by Apple. It was Apple's responsibility, and Apple did it. Unlike Google, which didn't.
If Apple keeps doing this kind of privacy stuff, "Sign In with Apple" is going to end up a trusted-by-the-users competitor in the sure-to-be-coming OAuth sweepstakes.
— Larry Loeb has written for many of the last century's major "dead tree" computer magazines, having been, among other things, a consulting editor for BYTE magazine and senior editor for the launch of WebWeek.