Magnetometer Robust Deep Human Pose Regression With Uncertainty Prediction Using Sparse Body Worn Magnetic Inertial Measurement Units

Abstract

We propose a deep-learning-based framework that learns data-driven temporal priors to perform 3D human pose estimation from six body-worn magnetic inertial measurement unit (MIMU) sensors. Our work estimates 3D human pose with associated uncertainty from sparse body-worn sensors. We derive and implement a 3D angle representation that eliminates the yaw angle (and hence the magnetometer dependence) and show that 3D human pose can still be obtained from this reduced representation, albeit with increased uncertainty. We do not use kinematic acceleration as input and show that this improves both accuracy and generalization to real sensor data from different subjects. Our framework is based on a bidirectional recurrent autoencoder. At inference time, a sliding window is used instead of the full sequence (offline mode). The major contribution of our work is that 3D human pose is predicted from sparse sensors with a well-calibrated uncertainty that correlates with ambiguity and actual errors. We demonstrate our results on two real sensor datasets, DIP-IMU and Total Capture, and achieve state-of-the-art accuracy. Our work confirms that the main limitation of sparse-sensor-based 3D human pose prediction is the lack of temporal priors; fine-tuning on a small synthetic training set of the target domain therefore improves accuracy.
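To make the architecture described above concrete, the following is a minimal PyTorch-style sketch, not the authors' implementation: a bidirectional recurrent autoencoder that maps a window of sparse MIMU features to joint rotations together with a per-output log-variance (uncertainty), trained with a heteroscedastic Gaussian likelihood and evaluated with sliding-window inference. All layer sizes, the input/output dimensions, the window length, and the function names are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's released code) of a bidirectional
# recurrent autoencoder with an uncertainty head and sliding-window inference.
import torch
import torch.nn as nn


class BiRNNPoseAutoencoder(nn.Module):
    def __init__(self, imu_dim=54, hidden_dim=256, pose_dim=135):
        super().__init__()
        # Encoder: bidirectional GRU over the window of per-frame IMU features
        # (dimensions here are placeholders, not the paper's exact sizes).
        self.encoder = nn.GRU(imu_dim, hidden_dim, num_layers=2,
                              batch_first=True, bidirectional=True)
        # Decoder: a second bidirectional GRU over the encoded sequence.
        self.decoder = nn.GRU(2 * hidden_dim, hidden_dim, num_layers=2,
                              batch_first=True, bidirectional=True)
        # Two linear heads: pose mean and log-variance (aleatoric uncertainty).
        self.mean_head = nn.Linear(2 * hidden_dim, pose_dim)
        self.logvar_head = nn.Linear(2 * hidden_dim, pose_dim)

    def forward(self, imu_window):
        # imu_window: (batch, window_len, imu_dim)
        z, _ = self.encoder(imu_window)
        h, _ = self.decoder(z)
        return self.mean_head(h), self.logvar_head(h)


def gaussian_nll(mean, logvar, target):
    # Heteroscedastic Gaussian negative log-likelihood: a standard way to
    # learn a per-output variance alongside the regression target.
    return 0.5 * (logvar + (target - mean) ** 2 / logvar.exp()).mean()


def sliding_window_inference(model, imu_seq, win=60):
    # Predict each window and keep its centre frame, standing in for the
    # sliding-window (as opposed to full-sequence, offline) mode above.
    model.eval()
    preds = []
    with torch.no_grad():
        for t in range(imu_seq.shape[1] - win + 1):
            mean, logvar = model(imu_seq[:, t:t + win])
            preds.append((mean[:, win // 2], logvar[:, win // 2]))
    return preds
```

In this sketch the predicted log-variance can be exponentiated to obtain per-joint variances, which is one conventional way to expose a calibrated uncertainty that can be compared against the actual pose error.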