Technical Medical Centre

Session overview & Review presentations 

Addressing motion artifacts in biomedical imaging and sensing

Ata Chizari (TNW-BMPI), Mirjam J. Schaap (Dermatology-RadboudUMC), Tom Knop (TNW-BMPI), Marieke M.B. Seyger (Dermatology-RadboudUMC), and Wiendelt Steenbergen (TNW-BMPI)

Abstract 

When working with a medical device, a microscope, or a biomedical sensor, unwanted motion can distort the measurements; the resulting distortions are known as motion artifacts. In biology, aligning time-lapse images of certain cells can be labor-intensive because of displacement between successive frames. In medicine, consider performing computed tomography (CT) or magnetic resonance imaging (MRI) on a subject with a severe hand tremor. The resulting images may be blurred, which is inconvenient both for the patient, who may need to repeat the scan, and for the medical investigator, who must interpret the distorted images. Another example is recording an electrocardiogram or measuring oxygen saturation with an optical sensor on a walking or running subject: the recorded signals can be severely distorted by motion.
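
As an illustration of the time-lapse alignment problem mentioned above, the sketch below registers two frames by phase cross-correlation and undoes the estimated shift. It is a minimal, generic example built on scikit-image and SciPy, assuming a purely translational displacement; the frame names, window sizes, and upsampling factor are illustrative choices and not the processing pipeline of any instrument discussed here.

```python
# Minimal sketch: rigid (translation-only) alignment of two time-lapse frames.
# Generic illustration only; not the pipeline of any instrument discussed above.
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation


def align_frame(reference: np.ndarray, moving: np.ndarray) -> np.ndarray:
    """Estimate the translation between two frames and shift `moving` onto `reference`."""
    # Sub-pixel shift estimate via phase cross-correlation.
    estimated_shift, _error, _phasediff = phase_cross_correlation(
        reference, moving, upsample_factor=10
    )
    # Apply the estimated shift to register the moving frame.
    return nd_shift(moving, shift=estimated_shift, order=1, mode="nearest")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame0 = rng.random((256, 256))
    # Simulate a displaced second frame (3 px down, 5 px right).
    frame1 = nd_shift(frame0, shift=(3, 5), order=1, mode="nearest")
    aligned = align_frame(frame0, frame1)
    print("residual misalignment (MAE):", np.mean(np.abs(aligned - frame0)))
```

In practice, non-rigid deformation or rotation would call for more elaborate registration, but the same idea of estimating and compensating the displacement applies.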

In some imaging modalities, such as laser speckle perfusion imaging, motion artifacts not only blur the recorded blood flow images but also produce unrealistically high perfusion values. At the Biomedical Photonic Imaging Group of the University of Twente, in collaboration with the Department of Dermatology at RadboudUMC, we developed a handheld perfusion imager (HAPI) and conducted a study on psoriasis patients. We proposed a comprehensive data analysis algorithm that accounts for image deblurring and motion artifact correction. Our results showed a statistically significant difference between perfusion values obtained with the handheld modality and those from the mounted (control) measurement. However, this difference became non-significant after applying our correction method.
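
For context on why motion inflates speckle-based perfusion readings: laser speckle contrast imaging commonly estimates perfusion from the local speckle contrast K = sigma/mean, where lower contrast (more blurring of the speckle pattern within the exposure time) is interpreted as higher flow, so any extra blur introduced by camera or tissue motion masquerades as perfusion. The sketch below computes a basic contrast map and the widely used 1/K^2 perfusion index over a sliding window; the window size and the 1/K^2 index are common textbook choices, not the specific algorithm used in our HAPI study.

```python
# Minimal sketch of laser speckle contrast imaging (LSCI) quantities.
# Generic textbook formulation; not the HAPI correction algorithm itself.
import numpy as np
from scipy.ndimage import uniform_filter


def speckle_contrast(frame: np.ndarray, window: int = 7) -> np.ndarray:
    """Local speckle contrast K = sigma / mean over a sliding window."""
    frame = frame.astype(np.float64)
    local_mean = uniform_filter(frame, size=window)
    local_mean_sq = uniform_filter(frame**2, size=window)
    local_var = np.clip(local_mean_sq - local_mean**2, 0.0, None)
    return np.sqrt(local_var) / np.clip(local_mean, 1e-12, None)


def perfusion_index(frame: np.ndarray, window: int = 7) -> np.ndarray:
    """Common 1/K^2 perfusion proxy: blur (lower K) reads as higher perfusion,
    which is why unintended motion inflates the apparent flow."""
    contrast = speckle_contrast(frame, window)
    return 1.0 / np.clip(contrast, 1e-12, None) ** 2


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    raw = rng.exponential(scale=1.0, size=(256, 256))  # fully developed speckle
    blurred = uniform_filter(raw, size=3)               # crude stand-in for motion blur
    print("median 1/K^2, static :", np.median(perfusion_index(raw)))
    print("median 1/K^2, blurred:", np.median(perfusion_index(blurred)))
```

Running the toy example shows the blurred frame yielding a markedly higher 1/K^2 value than the static one, which is the artifact a motion correction step has to remove.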

While briefly touching upon the results of our study, we would like to discuss how other techniques that contend with motion artifacts (e.g., microscopy, electro-optical sensing, MRI, photoacoustics, and optical coherence tomography) could benefit from our solution. We are also interested in learning how others address motion artifacts in their data analysis.