Low Cost Inertial Measurement Unit for Motion Capture in Biomedical Applications
A low-cost inertial measurement unit has been developed for accurate motion capture, allowing real-time spatial position registration (linear and angular) of the user's whole body. For this, we implemented a dedicated circuit for 9 degrees of freedom motion sensors, composed of an accelerometer, a gyroscope and a magnetometer. We also applied signal processing and data fusion algorithms to prevent the inherent drift of the position signal. This drift arises when the sensor signals are integrated, and the implemented algorithms showed promising results in suppressing it. The system is meant to be used in two specific biomedical applications. The first is the development of a low-cost system for whole-body gait analysis, which can be used in home-based rehabilitation systems. The second is the real-time analysis of working postures and the identification of ergonomic risk factors for musculoskeletal disorders.
Keywords: Motion capture · Sensor fusion · Cyber-physical systems · Home rehabilitation · Work risk analysis
1 Introduction
Pervasive and ubiquitous environments have been spreading through our society, especially for tracking users' location and activities by monitoring physiological signals, motion and orientation.
There are two main cyber-physical approaches to this field, each based on a different sensorial component. The first relies on computer-vision devices for human motion capture, such as a single camera or a network of cameras, infrared cameras or other optical devices [1, 2]. The second uses inertial sensors, such as the accelerometers and gyroscopes present in common smartphones, or dedicated sensors connected to microcontrollers (which can also communicate with smartphones or other computation devices) [2, 3].
“How to design a low-cost system that is capable of accurately capturing the user's whole-body motion in real time?”
In this paper we propose a system that answers this question (the relation of this work to Cyber-Physical Systems is presented in Sect. 2). Section 3 gives a more detailed overview of the state of the art and the literature related to inertial measurement units and their applications. Section 4 provides a summarized description of our proposed physical system and the associated signal processing algorithms, the 9 degrees of freedom data fusion and the implementation of a real-time data transmission protocol. The initial laboratory results are presented in Sect. 5, while Sect. 6 gives a summarized conclusion of our work in progress and a preview of our future work plan.
2 Research Connection with Cyber-Physical Systems
Cyber-Physical Systems (CPS) have been emerging as the next computing revolution because they integrate the available communication and computational capacities to create new interactions between cyber and physical components. These systems are merging the physical and virtual worlds, and several applications have been identified in communication, transportation, infrastructure, energy, robotics, manufacturing and healthcare [4, 5].
Cyber-Physical frameworks have already been proposed for use in rehabilitation systems. Such systems can support home-based physiotherapy services, such as the monitoring of users in senior care, patients with reduced mobility (for example after hip or knee replacement surgery), stroke and heart attack patients, and other subjects who require occupational therapy to regain day-to-day skills.
CPS applications are already present in the healthcare system, as has been extensively reviewed. They require a special architecture that preserves the privacy of the data and handles communication between the hospital/rehabilitation centre, the storage unit and the sensors. Finally, they must also manage the computational resources demanded by a feedback system that receives real-time data from many patients and a large network of sensors and must return a real-time classification or response to the users.
In the manufacturing area, there have been several advances in hazard risk management, and the approach to preserving workers' safety and health has improved considerably through the development of new work-equipment features and the definition of more secure tasks.
A major emerging trend in this area is the sensing, smart factory, which uses information from a network of physical sensors and virtual databases to make real-time risk assessments, an approach that has been extensively reviewed for the construction industry. These sensors can also be used to monitor and track the assembly line, using real-time information to guide the manual assembly process. Currently, most studies in this area use vision-based motion-capture frameworks to identify workers' ergonomic risk factors, even adapting systems such as the Kinect™ sensor for body kinematic measurement in the workplace. Most of these studies lack real-time management capabilities and would be hard to implement in a real setting because of problems inherent to such systems: the background must be adapted (other objects can enter the field of view, and lighting can affect the measurements) and the user must remain completely inside the field of view of the cameras. To overcome these problems, some groups have fused data from physiological sensors (e.g. ECG, EEG, EMG), location sensors (e.g. GPS or Ultra-Wideband technology) and ordinary cameras to monitor tasks in the workplace and analyse ergonomic risks in real time.
3 State of the Art
Motion capture sensorial systems are usually divided into two categories: optical sensors and non-optical sensors. Optical systems use cameras to acquire motion information from the subject being studied. Due to their scalability and high sensitivity to motion, they are regarded as the standard in several fields such as gait analysis, film production and video game animation. However, they have some disadvantages: they are usually very expensive, do not work in every lighting environment and require special software for different tasks. On the other hand, there are the non-optical systems, which comprise every other system, e.g. magnetic arrays and microelectromechanical systems (MEMS)-based devices.
MEMS-based inertial motion capture systems, which use inertial sensors (e.g. gyroscopes and accelerometers) to acquire motion information, have been the subject of great interest due to the constant miniaturization of the sensors, their ability to work in any environment and their relatively low cost compared with other systems. They do, however, have several drawbacks, namely the lack of information about the full 3D motion and the inherent noise in some sensors.
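To illustrate the noise drawback, the following minimal sketch (illustrative only: the sample rate, bias and noise level are assumed values of a typical MEMS order, not measurements from our sensor) shows how a small constant gyroscope bias accumulates into orientation drift when the angular rate is integrated:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                      # assumed sample rate in Hz
t = np.arange(0, 10, 1 / fs)    # 10 s of data
true_rate = np.zeros_like(t)    # the sensor is at rest
bias = 0.5                      # deg/s constant bias (assumed)
noise = rng.normal(0, 0.2, t.size)
measured = true_rate + bias + noise

# Naive integration of angular rate to angle: the bias integrates
# into a drift that grows linearly with time.
angle = np.cumsum(measured) / fs
print(f"drift after 10 s: {angle[-1]:.1f} deg")  # roughly 5 deg from bias alone
```

Even with a bias of only 0.5 deg/s, the integrated angle is off by about 5° after just ten seconds, which is why a drift-correction mechanism is indispensable.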
More recently, MEMS-based sensors have been applied to real-time human motion tracking in various physical activities, such as boxing, golf swinging and table tennis, and have even been able to identify some simple activities, such as walking, sitting or stair climbing, using fuzzy logic. Such human activity measurements can be useful for identifying bad postures in the workplace and the associated ergonomic hazard risks.
Both issues have been tackled simultaneously on two fronts. The first uses combinations of sensors to gather the most information (e.g. combining 3D accelerometers with 3D gyroscopes to obtain both acceleration and angular velocity in three dimensions). The second uses data fusion algorithms to mitigate the effect of measurement noise, for example Kalman filtering in a limited frequency band, or other methodologies that filter the inherent noise over the whole frequency band of interest.
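The data-fusion idea can be made concrete with a minimal complementary filter (a simpler relative of the Kalman approaches cited above, shown here for illustration; the weighting `alpha` and the test signals are assumptions, not values from this work):

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyroscope rate (deg/s) with accelerometer tilt (deg).

    The gyro term tracks fast motion; the accelerometer term anchors
    the low-frequency reference, so integration drift stays bounded.
    alpha is an assumed weighting, not a value from the paper.
    """
    angle = np.empty(len(gyro_rate))
    angle[0] = accel_angle[0]
    for k in range(1, len(gyro_rate)):
        angle[k] = (alpha * (angle[k - 1] + gyro_rate[k] * dt)
                    + (1 - alpha) * accel_angle[k])
    return angle

# A sensor at rest with a 0.5 deg/s gyro bias: naive integration
# drifts linearly, while the fused estimate stays bounded near zero.
dt = 0.01
gyro = np.full(1000, 0.5)
accel = np.zeros(1000)
fused = complementary_filter(gyro, accel, dt)
naive = np.cumsum(gyro) * dt
print(f"naive drift: {naive[-1]:.2f} deg, fused: {fused[-1]:.2f} deg")
```

The fused estimate settles at a small constant offset instead of growing without bound, which is the qualitative behaviour any of the cited fusion methods aims for.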
A final approach fuses visual sensors (with or without optical markers) and inertial sensors in order to overcome the intrinsic problems of each system. Concerning the rehabilitation of gait disorders, all of these approaches, using inertial sensors alone, optical sensors alone, or a fusion of both, have been tested and studied [20, 21, 22, 23].
4 Materials and Methods
In this section we describe the physical sensor architecture, the implemented sensor elements, and the fusion and signal processing algorithms.
To address the noise issue described in Sect. 3, we implemented Mahony's filter in Direction Cosine Matrix (DCM) form in both Python and MATLAB; the underlying problem and the filtering algorithms are discussed in detail in [24, 25], respectively.
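The core of the filter is a short update step. The sketch below shows one Mahony-style DCM update in Python under stated assumptions: only the accelerometer correction is included (the magnetometer term is omitted for brevity), the gain `kp` is an illustrative value rather than the one we used, and the renormalization uses an SVD projection instead of the Gram-Schmidt step of [24]. It is a sketch of the idea in [24], not our exact implementation:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def mahony_dcm_step(R, gyro, accel, dt, kp=1.0):
    """One update of a Mahony-style filter in DCM form.

    R maps the body frame to the earth frame; gyro is in rad/s;
    accel is the specific force in g units (points up at rest).
    """
    # Gravity direction measured by the accelerometer (body frame)
    v_meas = accel / np.linalg.norm(accel)
    # Gravity direction predicted by the current attitude estimate
    # (earth z-axis expressed in the body frame)
    v_pred = R.T @ np.array([0.0, 0.0, 1.0])
    # The cross product gives the rotation error that steers the
    # estimate toward the measurement (proportional feedback)
    error = np.cross(v_meas, v_pred)
    omega = gyro + kp * error
    # Small-angle DCM update, then re-orthogonalization
    R = R @ (np.eye(3) + skew(omega) * dt)
    u, _, vt = np.linalg.svd(R)
    return u @ vt
```

Starting from a deliberately wrong attitude and feeding the filter a stationary gravity measurement, repeated calls converge back to the true orientation, which is exactly the drift-correction behaviour we rely on.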
5 Initial Laboratory Results and Discussion
For the next experiment, we tested the sensor with rapid and slow movements to check the filter response, using a goniometer to rotate the sensor around the z axis (gamma) while trying to keep the other axes stable. The rotation sequence was the following: at 2.5 s, we moved from 0° (all angles are relative to this starting point) to 24°, and then to 165° with 2 additional steps of 5°. Afterwards we moved to –100° (8 s), followed by a fast rotation to 90° (11.5 s), then to –45°, then to 150°. At 15 s, we rotated back to 0°, then made a fast rotation to 160°, a slow rotation back to 90°, again to 160°, and a gradual slow rotation to –25°. At 36 s, we rotated to 45°, back to –110°, then to 120°, followed by a slow rotation to –125°, and finished with a rotation to 110°.
6 Conclusions and Future Work Plan
In this work, we implemented a low-cost inertial measurement unit for accurate motion capture, based on a dedicated circuit with 9 degrees of freedom sensors (accelerometer, gyroscope and magnetometer) connected to a Raspberry Pi, which transmits the data to a local network, allowing access by authorized computers. We implemented signal processing and data fusion algorithms to correct the inherent drift that occurs even when the sensor is stationary, as shown in Fig. 2. As future work, we may adopt an absolute position coordinate system (e.g. the north-east-down system) instead of the existing coordinate system.
We are planning to use this system in two specific biomedical applications. The first is the real-time analysis of working postures and the identification of ergonomic risk factors for musculoskeletal disorders. For this, we plan to start with a network of at least 3 sensors to test the analysis on the upper limbs. We will build a feedback system that uses the ISO 11226 recommendations for the holding time of upper-arm elevation in a specific posture and for the acceptable shoulder, forearm and hand positions in a static posture. To analyse dynamic working postures, we will use a framework similar to the FAST ERGO_X project, which is based on fuzzy rules; instead of camera sensors, we will use inertial sensors, which enable the analysis to run in real time.
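The planned feedback rule can be sketched as a simple classification of a static upper-arm posture from the estimated elevation angle. The thresholds and holding-time limit below are illustrative placeholders only; the actual limits must be taken from ISO 11226 itself:

```python
def classify_arm_elevation(angle_deg, holding_time_s, max_hold_s=60.0):
    """Return a simple risk label for a static upper-arm elevation.

    angle_deg: estimated elevation from the fused IMU orientation.
    holding_time_s: how long the posture has been held.
    max_hold_s: assumed holding-time limit for moderate elevation;
    a real system would look this value up in ISO 11226.
    """
    if angle_deg < 20.0:
        return "acceptable"          # near-neutral posture
    if angle_deg <= 60.0:
        # Elevated posture: acceptable only for a limited holding time
        return "acceptable" if holding_time_s <= max_hold_s else "not recommended"
    return "not recommended"         # arm raised above shoulder level
```

In the envisaged system, a rule like this would run on the real-time orientation stream from each upper-limb sensor and trigger feedback when a posture is held too long.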
The second application is the development of a low-cost system for whole-body gait analysis, which can be used in home-based rehabilitation systems. We will use the same network of upper-limb sensors and validate them against the gold-standard sensors used in rehabilitation centres.
Leonardo Martins is supported by a PhD Scholarship with the reference SFRH/BD/88987/2012 and is also partially funded by FCT Strategic Program UID/EEA/00066/203 of UNINOVA, CTS, funded by the Portuguese funding institution FCT - Fundação para a Ciência e a Tecnologia. We also acknowledge the technical support of the engineers of NGNS - Ingenious Solutions.
- 4. Geisberger, E., Cengarle, M.V., Keil, P., Niehaus, J., Thiel, C., Thönnißen-Fries, H.-J.: Cyber-Physical Systems - Driving force for innovation in mobility, health, energy and production (2011)
- 6. Ma, X., Tu, X., Huang, J., He, J.: A cyber-physical system based framework for motor. In: ACWR 2011 Proceedings of the 1st International Conference on Wireless Technologies for Humanitarian Relief, pp. 285–290 (2011)
- 8. Wang, J., Abid, H., Lee, S., Shu, L., Xia, F.: A secured health care application architecture for cyber-physical systems. Control Eng. Appl. Info. 13, 101–108 (2011)
- 9. Lazaro, O., Moyano, A., Uriarte, M., Gonzalez, A., Meneu, T., Fernández-Llatas, C., Traver, V., Molina, B., Palau, C., Lopez, O., Sanchez, E., Ros, S., Moreno, A., Gonzalez, M., Antonio, J., Sepulcre, M., Gozalvez, J., Collantes, L., Prieto, G.: Integrated and personalised risk management in the sensing enterprise. In: Banaitiene, N. (ed.) Risk Management - Current Issues and Challenges, pp. 285–312. InTech, Rijeka (2012)
- 11. Bader, S., Aehnelt, M.: Tracking assembly processes and providing assistance in smart factories. In: 6th International Conference on Agents and Artificial Intelligence, ICAART 2014, Proceedings, vol. 1, ESEO, Angers, Loire Valley, France, 6–8 March 2014, pp. 161–168. SciTePress - Science and Technology Publications (2014)
- 15. Liu, H., Wei, X., Chai, J., Ha, I., Rhee, T.: Realtime human motion control with a small number of inertial sensors. Symp. Interact. 3D(3), 133–140 (2011)
- 20. Ali, A., Sundaraj, K., Ahmad, B., Ahamed, N., Islam, A.: Gait disorder rehabilitation using vision and non-vision based sensors: a systematic review. Bosn. J. Basic Med. Sci. 12, 193–202 (2012)
- 23. Tao, Y., Hu, H., Zhou, H.: Integration of vision and inertial sensors for home-based rehabilitation. IEEE Int. Conf. Robot. Autom. (2005)
- 24. Premerlani, W., Bizard, P.: Direction cosine matrix IMU: Theory (2009)
- 25. Madgwick, S.O.H., Harrison, A.J.L., Vaidyanathan, R.: Estimation of IMU and MARG orientation using a gradient descent algorithm. In: IEEE International Conference on Rehabilitation Robotics, pp. 179–185 (2011)
- 26. Ma, E., Popovic, M., Masani, K.: Wearable gait analysis using vision-aided inertial sensor fusion. In: Book of abstracts - IUPESM 2015 World Congress on Medical Physics and Biomedical Engineering, Toronto, 7–12 June 2015, p. 386 (2015)
- 27. International Organization for Standardization: ISO 11226 - Ergonomics—Evaluation of static working postures (2000)
- 28. Nunes, I.: FAST ERGO_X - a tool for ergonomic auditing and work-related musculoskeletal disorders prevention. Work. 34, 133–148 (2009)