10/27/2021

New vision-based robotic 3D ultrasound

The Technical University of Munich, in conjunction with Zhejiang University and Johns Hopkins University, developed a new vision-based robotic 3D ultrasound system powered by Cephasonics that creates accurate 3D scans even if the target limb changes position after the scan begins.
While 2D ultrasound imaging can provide images in real time even if the target moves, 3D ultrasound imaging is far more sensitive to target movement. Repositioning a limb, for example, may be necessary to present different parts of the limb to the ultrasound probe in order to image an entire artery. When that happens, the angle of each 2D image changes relative to its original position, making it difficult to stitch the images together into a 3D representation.

Robotic ultrasound systems (RUSS) produce high-quality images by precisely controlling the contact force and orientation of the ultrasound probe. Combining a robot arm with depth cameras to determine the optimal probe position further increases scan accuracy, but this alone is still not sufficient for 3D imaging because such a system cannot account for movement of the scan target.

The new system developed by the researchers, described in the paper titled “Motion-Aware Robotic 3D Ultrasound” (bit.ly/VSD-MTN3D), combines RUSS with depth cameras to make 3D imaging with ultrasound possible. The system uses an LBR iiwa 14 R820 robot arm from Kuka (Augsburg, Germany; www.kuka.com/en-us), a Cicada LX ultrasound machine from Cephasonics Ultrasound (Santa Clara, CA, USA; www.cephasonics.com), and an Azure Kinect 3D camera from Microsoft (Redmond, WA, USA; www.microsoft.com/en-us).

The process begins by drawing a red line on the patient’s skin to mark the path of the ultrasonic probe during the scan. The Kinect camera images the line, and the software registers the path to calculate a trajectory for the robot arm. Two marker spheres with retroreflective layers are placed at both ends of the drawn red line. The spheres provide reliable position data for the Kinect camera and aid in extracting region-of-interest and trajectory data, which helps the system account for limb repositioning during the scan. Because the system can assign 3D coordinates to images during the scan, it can compensate for limb movement and stitch the images together as if the movement never took place, producing accurate 3D scan data. Testing demonstrated that the system compensates for up to 40° of target rotation.

The researchers proposed that a laser system projecting the trajectory onto the surface of the limb could replace the drawn red line. Also, while the system was tested on simulated vascular scans, it could also perform bone visualization.
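The paper does not publish reference code, but the compensation step described above, re-expressing the planned trajectory and previously acquired image poses after the limb moves, can be illustrated with a standard rigid point-set registration. The sketch below is a minimal illustration only, assuming the Azure Kinect supplies 3D positions of the marker spheres (plus a few surface points along the drawn line) before and after the movement; the function and variable names are hypothetical and not taken from the paper.

```python
import numpy as np

def rigid_transform(p_before: np.ndarray, p_after: np.ndarray):
    """Estimate rotation R and translation t mapping p_before -> p_after.

    Uses the Kabsch/Procrustes solution. Needs at least three
    non-collinear correspondences (e.g. both marker spheres plus
    surface points sampled along the drawn line) for a unique rotation.
    """
    c_before = p_before.mean(axis=0)
    c_after = p_after.mean(axis=0)
    H = (p_before - c_before).T @ (p_after - c_after)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_after - R @ c_before
    return R, t

def compensate_poses(image_poses, R, t):
    """Map previously recorded 4x4 probe poses into the moved-limb frame
    so the 2D slices can still be stitched into one consistent volume."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return [T @ pose for pose in image_poses]

# Example: marker/surface points seen by the depth camera before and
# after the patient repositions the limb (values are made up).
pts_before = np.array([[0.10, 0.00, 0.40],
                       [0.25, 0.02, 0.42],
                       [0.18, 0.05, 0.41]])
angle = np.deg2rad(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
pts_after = pts_before @ R_true.T + np.array([0.01, -0.03, 0.00])

R, t = rigid_transform(pts_before, pts_after)
poses = [np.eye(4)]                    # placeholder probe pose
print(compensate_poses(poses, R, t)[0])
```

In the real system, the probe poses would come from the robot arm’s kinematics at the moment each 2D frame was captured; registering them into the moved-limb frame is what lets the reconstruction proceed as if the limb had never moved.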