Last week there were media reports of users complaining about off-the-mark readings from the Apple iPhone 5S compass. Compared to the previous iPhone 5, Apple’s native compass app shows discrepancies averaging 8 to 10 degrees, with both devices running iOS 7.
This has caused “wonky” experiences in games, such as driving and physics-based titles that rely on tilting the device for in-game motion.
This raises the question of whether the fault lies in the motion sensors or in some other chip. Several teardowns are available online, and an examination of the bill of materials shows that the iPhone 5S has:
The NXP LPC18 is an ARM Cortex-M3 MCU that serves as a coprocessor to the Apple A7 apps processor. It is the sensor hub controller of the iPhone 5S, and Apple refers to it as the M7.
The first possible causes that came to my mind:
Of course there could be actual hardware problems with the sensors and/or the MCU, but these are relatively mature, widely deployed parts, and it seems unlikely that Apple would be the only customer receiving bad chips. The design of motion sensors in smartphones is also well understood at this point, so board-level layout issues are improbable.
The sensor fusion algorithm running on the M7 (the NXP MCU) is Apple’s Core Motion framework, which processes the raw positioning data from the motion sensors. The M7 continues to operate even when the A7 apps processor sleeps or is busy with other work, and it passes motion data to applications via iOS 7.
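To make concrete what a sensor fusion layer like this must do, here is a minimal sketch of one of its core tasks: computing a tilt-compensated compass heading by combining the accelerometer (to estimate pitch and roll from gravity) with the magnetometer (to find north). This is an illustrative textbook formulation in Python, not Apple’s actual Core Motion code, and the axis conventions are assumptions for the example.

```python
import math

def tilt_compensated_heading(accel, mag):
    """Compass heading in degrees, from raw accelerometer and magnetometer
    vectors. Assumed axes: x to the right of the screen, y toward the top,
    z out of the screen; accel reads +1 g on z when the device lies flat.
    Conventions vary between sensor vendors, which is part of why fusion
    code must be tuned per device."""
    ax, ay, az = accel
    mx, my, mz = mag

    # Estimate pitch and roll from the gravity vector (valid when the
    # device is static or moving slowly).
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)

    # Rotate the magnetic field reading back into the horizontal plane.
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))

    # Heading is the angle of the horizontal field component.
    return math.degrees(math.atan2(yh, xh)) % 360.0
```

With the device flat (gravity entirely on z), the function reduces to the familiar `atan2` of the horizontal magnetometer components; the tilt compensation matters precisely in the tilted-device scenarios where users reported wonky behavior.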
The most likely culprit for the Apple iPhone 5S compass problems is therefore the Core Motion framework, which appears to be Apple’s own home-brew. Apple tries to do as much as possible in-house, such as its OS and the chip design for its apps processor family. However, the companies that specialize in sensor fusion algorithms, such as Movea, Sensor Platforms, PNI Sensors and Hillcrest Labs, have years of experience and knowledge that cannot be acquired in a short period of time. These companies are intimately familiar with the different sensors and how to work with their unique characteristics, and Apple is working with sensors from three different vendors. It appears, then, that the problem lies in the sensor fusion code embedded in the NXP MCU. Since that MCU is flash-based, I would assume the code could be upgraded in the field.
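A back-of-the-envelope calculation shows why a software-level calibration slip, rather than broken hardware, plausibly explains errors of the reported size. An uncorrected hard-iron bias of just a few microtesla on one magnetometer axis, something per-device calibration in the fusion layer is supposed to remove, skews the computed heading by roughly the 8 to 10 degrees users observed. The field and bias magnitudes below are illustrative assumptions, not measured iPhone values.

```python
import math

def heading_deg(mx, my):
    # Heading for a device held flat: angle of the horizontal field.
    return math.degrees(math.atan2(my, mx)) % 360.0

# Earth's horizontal magnetic field, roughly 25 microtesla, along +x ("north").
true_mx, true_my = 25.0, 0.0

# Hypothetical uncorrected hard-iron bias of a few microtesla on the y axis.
bias_my = 4.0

clean = heading_deg(true_mx, true_my)             # 0 degrees
skewed = heading_deg(true_mx, true_my + bias_my)  # about 9 degrees off
```

Because the error scales with the ratio of bias to the (fairly weak) horizontal field, a small calibration defect produces a conspicuous heading offset, which is consistent with a fixable fusion-code problem rather than a bad chip.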