This demo showcases how our framework makes robotic ultrasound work out of the box.
Ultrasound images can be frame-grabbed from any commercial ultrasound system and processed to extract the relevant pixels from the whole frame. Once the imaging depth set on the ultrasound system is provided, our software computes the pixel spacing and, hence, the position of each pixel in physical space. This information can be complemented with the output of a tracking system to combine the content of multiple ultrasound images into a single three-dimensional image.
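For readers who want the geometry spelled out, here is a minimal sketch of how a pixel can be mapped to physical space once the imaging depth is known. This is not the ImFusion API: the function name, the use of Eigen, and the assumption of square pixels with the image rows spanning the full imaging depth are illustrative choices.

```cpp
// Minimal sketch (not the ImFusion API): mapping an ultrasound pixel to 3D space.
// Assumptions: the cropped image has `height` rows covering `imagingDepthMm` of
// tissue, pixels are square, and `T_world_image` is the tracked pose of the image
// frame (e.g. robot forward kinematics combined with the probe calibration).
#include <Eigen/Dense>

Eigen::Vector3d pixelToWorld(int u, int v, int height, double imagingDepthMm,
                             const Eigen::Matrix4d& T_world_image)
{
    // Pixel spacing follows from the imaging depth set on the scanner:
    // the image's `height` rows cover exactly `imagingDepthMm` of depth.
    double spacingMm = imagingDepthMm / static_cast<double>(height);  // mm per pixel

    // Pixel position in the image coordinate frame (origin at the top-left corner,
    // u = lateral column index, v = depth row index, image plane at z = 0).
    Eigen::Vector4d p_image(u * spacingMm, v * spacingMm, 0.0, 1.0);

    // Transform into the world (tracking / robot base) frame.
    Eigen::Vector4d p_world = T_world_image * p_image;
    return p_world.head<3>();
}
```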
A robot can not only act as a tracking system but also move the ultrasound probe in space to progressively scan a three-dimensional region of the patient's body. While our software supports real-time low-level robotic control, this can be overkill for simple scenarios, particularly in academic settings. Our ROS plugin can act as an interface to any ROS-based system, such as iiwa_stack running on the KUKA Sunrise platform. This makes it possible to control a collaborative robot without a real-time operating system, as in this use case. The URDF description of the robot can then be used to display the robot in our advanced viewer and to compute forward kinematics from the joint data provided by the robot itself (see the sketch below).
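As a rough illustration of that forward-kinematics step, the sketch below parses a URDF with the standard kdl_parser / orocos KDL stack and evaluates the tool pose from a set of joint angles. This is not the ImFusion ROS plugin; the link names "iiwa_link_0" and "iiwa_link_ee" follow the iiwa_stack convention and may need to be adapted to your robot description.

```cpp
// Minimal sketch: forward kinematics from a URDF and the joint angles reported
// by the robot, using kdl_parser and orocos KDL (assumed to be installed).
#include <kdl_parser/kdl_parser.hpp>
#include <kdl/chainfksolverpos_recursive.hpp>
#include <stdexcept>
#include <string>
#include <vector>

KDL::Frame toolPoseFromJoints(const std::string& urdfPath,
                              const std::vector<double>& jointAngles)
{
    // Parse the robot description into a kinematic tree.
    KDL::Tree tree;
    if (!kdl_parser::treeFromFile(urdfPath, tree))
        throw std::runtime_error("Failed to parse URDF");

    // Extract the chain from the robot base to the flange (iiwa_stack link names).
    KDL::Chain chain;
    if (!tree.getChain("iiwa_link_0", "iiwa_link_ee", chain))
        throw std::runtime_error("Failed to extract kinematic chain");

    // Fill the joint array with the angles reported by the robot.
    KDL::JntArray q(chain.getNrOfJoints());
    for (unsigned int i = 0; i < q.rows(); ++i)
        q(i) = jointAngles.at(i);

    // Compute the pose of the end-effector in the robot base frame.
    KDL::Frame toolPose;
    KDL::ChainFkSolverPos_recursive fk(chain);
    fk.JntToCart(q, toolPose);
    return toolPose;
}
```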
A further prerequisite for accurate reconstruction of 3D ultrasound data with a robotic system is proper calibration, both spatial and temporal. The former consists of measuring the rigid transformation between the robot's tool center point and the ultrasound image reference frame. Our software offers an intuitive wizard for roughly estimating this transformation, which then initializes our image-based calibration built on our state-of-the-art image registration routines. The latter must compensate for the delay between the acquisition of tracking and image data, which is intrinsically present in any complex system. This delay would otherwise cause inaccuracies in the 3D reconstruction for complex probe trajectories and severe artifacts when the same voxel is imaged in successive ultrasound frames.
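To make the temporal-calibration idea concrete, the following sketch shows one common way to apply an estimated latency: shift each image timestamp by the delay and interpolate the tracking stream at the shifted time. The data structures and the function are hypothetical illustrations, not the ImFusion calibration wizard, and the sketch assumes a non-empty, time-sorted pose stream.

```cpp
// Minimal sketch of latency compensation after temporal calibration:
// given timestamped tracking poses and an estimated delay `delayS` between
// image and tracking data, find the pose to attach to an ultrasound frame.
#include <Eigen/Geometry>
#include <iterator>
#include <map>

struct TrackedPose
{
    Eigen::Quaterniond rotation;
    Eigen::Vector3d translation;
};

// Tracking poses keyed by their acquisition time in seconds (assumed non-empty).
using PoseStream = std::map<double, TrackedPose>;

TrackedPose poseForFrame(const PoseStream& stream, double frameTime, double delayS)
{
    // Shift the image timestamp into the tracking clock.
    double t = frameTime - delayS;

    // Find the neighboring tracking samples; clamp at the stream boundaries.
    auto upper = stream.lower_bound(t);
    if (upper == stream.begin()) return upper->second;
    if (upper == stream.end())   return std::prev(upper)->second;
    auto lower = std::prev(upper);

    // Interpolate between the two nearest samples (slerp for rotation,
    // linear interpolation for translation).
    double w = (t - lower->first) / (upper->first - lower->first);
    TrackedPose p;
    p.rotation    = lower->second.rotation.slerp(w, upper->second.rotation);
    p.translation = (1.0 - w) * lower->second.translation + w * upper->second.translation;
    return p;
}
```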
All these functionalities can be accessed from our C++ SDK or from the ImFusion Suite, which provides a convenient GUI for rapid prototyping and development. This video shows the steps required to set up a robotic ultrasound system for research purposes. Our libraries can then be called with the same parameters in production-grade software, as is already done in multiple certified medical devices on the market.
This demo was made possible by Stamatia Giannarou, who graciously let us use her robotics lab and equipment at the @hamlynsymposiumonmedicalro9103.