Hard Iron Offset

By stueckrath.c, 12 September 2012

In the PC Demo UI, the transformation from raw magnetometer values to calibrated values appears to be:


  • Raw values are multiplied by 0.3 to convert to uT

  • Axes are remapped to match the accelerometer frame (i.e. x=y, y=x, z=-z)

  • Hard Iron offsets are applied
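
As code, the three steps above might look something like this. This is only a minimal sketch of my reading of the UI's behavior: the 0.3 uT/LSB scale factor and the axis remapping come from the list above, and treating "applied" as a subtraction of the offset is my assumption.

```python
def calibrate(raw_x, raw_y, raw_z, hard_iron):
    """Convert raw magnetometer counts to calibrated uT values.

    hard_iron is an (x, y, z) offset in uT. The subtraction of the
    offset is assumed; the PC Demo may differ in sign convention.
    """
    # 1. Scale raw counts to microtesla (assumed 0.3 uT/LSB)
    x, y, z = raw_x * 0.3, raw_y * 0.3, raw_z * 0.3
    # 2. Remap axes to the accelerometer frame (x=y, y=x, z=-z)
    x, y, z = y, x, -z
    # 3. Apply (subtract) the Hard Iron offset
    return x - hard_iron[0], y - hard_iron[1], z - hard_iron[2]
```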


The Hard Iron offsets are calculated somehow once the device has been rotated around each axis. I assume they are calculated by fitting a sphere to the cloud of raw magnetometer values (in x, y, and z); the center of the sphere is the Hard Iron offset. I found a very useful application note from Freescale, http://cache.freescale.com/files/sensors/doc/app_note/AN4248.pdf , that describes this.
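The sphere fit I'm assuming can be done with a simple linear least-squares solve. This is just a sketch of that idea, not the actual PC Demo code: each sample p on the sphere satisfies |p - c|^2 = r^2, which rearranges to 2*p.c + (r^2 - |c|^2) = |p|^2, linear in the center c and the scalar k = r^2 - |c|^2.

```python
import numpy as np

def hard_iron_offset(samples):
    """Fit a sphere to an N x 3 cloud of raw magnetometer samples
    and return its center, i.e. the Hard Iron offset.

    Solves the linear system A @ [cx, cy, cz, k] = b, where each row
    of A is [2x, 2y, 2z, 1] and b is x^2 + y^2 + z^2.
    """
    p = np.asarray(samples, dtype=float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]  # sphere center = Hard Iron offset
```

With real data the samples won't lie exactly on a sphere, so the least-squares solution gives the best-fit center; soft iron distortion (an ellipsoid rather than a sphere) would need a more general fit.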

My question is: why are the Hard Iron offsets being recalculated on every startup? The Hard Iron offsets are a function of ferrous metal that moves with the sensor. Once the chip is on the PCB and attached to whatever surface it will move with, the Hard Iron offset should not change, so calculating it should be a one-time calibration procedure. In my application, I need the sensors to start up already knowing their orientation relative to Earth (i.e. North); I am not able to move the sensors before taking readings and cannot control their initial heading on startup.
phpbb Topic ID
14667