Novel mixed reality dose visualisation system may help manage radiation exposure during IR procedures

An interventionalist wearing the HoloLens goggles

Mixed reality dose visualisation is expected to improve exposure dose management for interventional radiology (IR) patients and health professionals by depicting invisible radiation exposure in real space, a study published in Journal of Medical Systems reports. First author Takeshi Takata (Graduate School of Medical Care and Technology, Teikyo University, Tokyo, Japan), senior author Jun’ichi Kotoku (Graduate School of Medical Care and Technology, Central Radiology Division, Teikyo University, Tokyo, Japan) et al are the first in the relevant literature to examine the concept and feasibility of an immersive, real-time dose visualisation system that uses mixed reality for dose management.

Contextualising this research, the study authors write: “For interventional radiology, dose management has persisted as a crucially important issue to reduce radiation exposure to patients and medical staff.” Currently, IR patient dose management generally relies only on a displayed dose area product, they continue, pointing out that this parameter does not identify the exposed position. “For that reason, a dose distribution on the patient’s skin cannot be managed,” they note. Similarly, although the radiation exposure of the health professionals involved in IR procedures is measured with personal dosimeters, Takata and colleagues explain that the dose remains unknown until the dosimeter is read out.

“To optimise exposure for all involved, health professionals must note their exposure dose in real-time according to each situation,” they claim. “An intuitive, real-time dose management system is expected to facilitate management of the dose and to improve patients’ and health professionals’ medical safety.”

Their mixed reality dose visualisation system uses a wearable mixed reality holographic device (HoloLens; Microsoft). According to the investigators’ description, the HoloLens uses a simultaneous localisation and mapping algorithm for spatial mapping and detection of the wearer’s position, and provides an accurate three-dimensional holographic representation of digital objects with a full sense of depth. The objects can be viewed simultaneously by a group of people from all angles, enabling shared interaction, and the display can be flipped up if it gets in the way.

“Projection of the patient’s skin dose onto the patient’s body and estimation of the health professional’s dose requires accurate tracking of their positions and movements in a room,” they explain. “Nonetheless, position tracking using an external tracking system is adversely affected by disturbances such as the movement of staff members and fluoroscopy systems during a procedure. Therefore, an object recognition tracking method is preferred for [our] mixed reality dose visualisation system.” Using a three-dimensional graphics platform and the object detection and tracking functionalities built into a software development kit, the team registered images in real time by tracking the relative position and orientation of real objects with the RGB camera on the HoloLens; this positional information was then used to update the transformations within the virtual world for dose projection and estimation. “The optical detection and tracking of a target are useful for real-time high-accuracy registration with no need for an external tracking system,” Takata et al write.
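
The paper’s source code is not published, and the authors worked with a Unity-based development kit rather than Python; purely as a rough, hypothetical sketch of the idea described above, the snippet below composes a tracked real-object pose with a calibrated offset to keep a virtual dose overlay registered to the real object. All names (such as update_overlay_transform) are illustrative, not part of any real SDK.

# Hypothetical sketch: updating a virtual dose-overlay transform from a
# tracked real-object pose. Names are placeholders, not a real API.
import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Fixed offset from the tracked object (e.g. a marker on the patient couch)
# to the dose-overlay origin, calibrated once before the procedure (assumed).
marker_to_overlay = pose_matrix(np.eye(3), np.array([0.0, 0.10, 0.0]))

def update_overlay_transform(tracked_rotation, tracked_translation):
    """Compose the latest tracked pose with the calibrated offset so the
    virtual dose distribution stays registered to the real object."""
    world_to_marker = pose_matrix(tracked_rotation, tracked_translation)
    return world_to_marker @ marker_to_overlay

# Example frame update with a dummy tracked pose (identity rotation, 1 m away).
overlay_T = update_overlay_transform(np.eye(3), np.array([0.0, 0.0, 1.0]))
print(overlay_T)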

Although the HoloLens is a stand-alone device, it lacks sufficient computational power to run the real-time dose simulation, they add. “Therefore, we separated computation of the visualisation and dose calculation, respectively, to HoloLens and an external server.” The patient skin dose distribution and the operator’s eye lens dose were estimated in real time with a Monte Carlo simulation running on an external computer. The estimated doses were transferred sequentially to the HoloLens and projected into real space.
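
The article does not specify how the two devices communicate; the following toy sketch (in Python, with a queue standing in for the network link) simply illustrates the division of labour the authors describe, with a stub function in place of the Monte Carlo engine and a print statement in place of the HoloLens rendering.

# Hypothetical sketch of the compute/display split: a "server" thread runs a
# stub dose estimate (standing in for the Monte Carlo engine) and streams
# results to a "headset" consumer that only renders. All values are invented.
import queue
import threading
import time

dose_stream: "queue.Queue[dict]" = queue.Queue()

def dose_server(n_frames: int = 5) -> None:
    """Pretend external server: estimate doses and push them to the headset."""
    for frame in range(n_frames):
        # Placeholder for the Monte Carlo skin-dose / eye-lens-dose estimate.
        result = {"frame": frame,
                  "skin_dose_mGy": 0.4 * frame,
                  "eye_lens_dose_uSv_per_h": 12.0 + frame}
        dose_stream.put(result)
        time.sleep(0.2)
    dose_stream.put(None)  # sentinel: no more frames

def headset_display() -> None:
    """Pretend HoloLens side: receive sequential dose updates and render them."""
    while True:
        result = dose_stream.get()
        if result is None:
            break
        print(f"frame {result['frame']}: "
              f"skin dose {result['skin_dose_mGy']:.1f} mGy, "
              f"eye lens {result['eye_lens_dose_uSv_per_h']:.1f} uSv/h")

threading.Thread(target=dose_server, daemon=True).start()
headset_display()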

A fast, accurate measure of dose and distribution

The number recognition system accurately recognised all displayed fluoroscopic conditions every 0.2 seconds. In addition, validation of the sensor unit’s measured values against a laser displacement sensor showed that the average error between the mixed reality dose visualisation system’s sensors and the laser sensor was less than 0.18% in each of the three directions.
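
The exact validation protocol is described in the paper; as a simple illustration of the kind of comparison reported, with entirely made-up readings, the per-axis average relative error could be computed like this:

# Illustrative only: average relative error of the system's position sensors
# against a laser displacement sensor, per axis. The readings are invented.
system_mm = {"x": [100.1, 200.3, 299.8], "y": [50.0, 149.9, 250.2], "z": [10.0, 20.0, 30.1]}
laser_mm  = {"x": [100.0, 200.0, 300.0], "y": [50.1, 150.0, 250.0], "z": [10.0, 20.0, 30.0]}

for axis in ("x", "y", "z"):
    errors = [abs(s - l) / l * 100.0 for s, l in zip(system_mm[axis], laser_mm[axis])]
    print(f"{axis}: average error {sum(errors) / len(errors):.3f} %")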

The patient skin dose distribution was estimated within three seconds on average and visualised with a latency of four seconds. Takata and colleagues describe how the dose distribution appears in practice: “It is depicted as coloured cubes on the skin. A colour bar shown in the air represents the absorbed dose level. The health professional’s eye lens dose rate was updated continuously. It was indicated as numerical values in the air.”
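
The article does not say how the colour scale is defined; a minimal sketch of mapping an absorbed dose value to a colour for such a cube overlay, assuming a simple linear blue-to-red scale (the 0–2 Gy range is an assumption for illustration, not a value from the study), might look like this:

# Hypothetical colour mapping for the dose-distribution cubes: low doses
# render blue, high doses red, on a simple linear scale.
def dose_to_rgb(dose_gy: float, max_dose_gy: float = 2.0) -> tuple[float, float, float]:
    """Map an absorbed skin dose to an (R, G, B) triple in [0, 1]."""
    t = min(max(dose_gy / max_dose_gy, 0.0), 1.0)  # clamp to [0, 1]
    return (t, 0.0, 1.0 - t)  # blue at 0, red at max_dose_gy

for dose in (0.0, 0.5, 1.0, 2.0):
    print(dose, dose_to_rgb(dose))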

This led them to conclude: “More than other real-time dose estimation systems for IR, this mixed reality dose visualisation system provides an immersive experience and a detailed depiction of the dose distribution. Although several visualisation systems can be used for dose distributions, our mixed reality dose visualisation system can visualise the dose distribution in three dimensions in front of the user’s eyes. This visualisation mode will help procedures proceed smoothly, because the surgeon need not look at a distant monitor to know the dose.”

