Abstract
We have been developing a liver surgical support system. By matching depth images of the real liver with a 3D liver model during surgery, the positions of the liver and of its invisible blood vessels and tumors are estimated. The tip position of the surgical knife is measured by a single-point-measurement camera using specific markers. By merging all of this information, the distance between the knife tip and target parts such as vessels or tumors is calculated, and the proximity of the knife to the target parts is determined. To indicate the proximity, we have been developing a surgical knife attachment with light-emitting diodes (LEDs). When the knife approaches the target parts, the LEDs on the attachment gradually light up. The newly developed attachment is more compact and lightweight than the previous one. It uses a wireless controller and ArUco markers, which can be tracked by an inexpensive USB camera. We conducted experiments to check the performance of the ArUco markers and the navigation of the operator using the new attachment. The results showed that the new attachment has navigation accuracy comparable to the previous one.
1 Introduction
The liver contains several types of blood vessels, such as arteries, veins, and portal veins, which are not externally visible. Using X-ray imaging, computed tomography (CT), magnetic resonance imaging (MRI), etc., the positions of these vessels in the liver are checked preoperatively. During surgery, surgeons touch the incision part of the liver with their fingers, feel for a pulse, and perceive the proximity of the vessels. However, this method is risky: it can injure the blood vessels and result in bleeding.
Surgical support systems for orthopedic, dental, and brain surgery have been developed [1,2,3]. These systems determine the position of the surgical tools, synchronize it with the tomographic images, and navigate the tool to the target position. The target body parts of these support systems are rigid and exhibit negligible deformation during surgery.
However, the liver is soft and changes its shape during surgery, so the existing support systems cannot be applied to liver surgery. Research on surgical support systems for soft organs has been limited, and no effective system has been developed yet. We have therefore been developing a liver surgery support system.
In this paper, the details of our liver surgery support system and the newly developed wireless knife attachment are described. The attachment alerts the operator to the proximity of the target parts by illuminating LEDs.
2 Our Liver Surgical Support System
The details of our liver surgical support system are described in [4]. It uses two depth cameras with different characteristics. The first camera has lower precision but a wide measurement range, and is used to determine the shape of the liver during surgery; Kinect for Windows v2, developed by Microsoft, is currently used. A three-dimensional (3D) polyhedron model is generated from tomographic images preoperatively. The 3D model contains the shape of the liver, the inner blood vessels, and the tumors. By matching the depth images from the depth camera against the Z-buffer of the 3D polyhedron model during surgery, the position of the liver is determined using a simulated annealing algorithm, and the locations of the inner blood vessels and tumors are estimated [5,6,7,8]. The second camera has higher precision and performs single-point measurements using single markers [9] to determine the tip position of the surgical knife; MicronTracker3 (type H3-60) by ClaroNav is currently used. By merging all of this information, the distance between the knife tip and the vessels or tumors is calculated [10], and the proximity of the knife to the target parts is determined. To indicate the proximity, the LEDs on our surgical knife attachment are gradually illuminated (Fig. 1).
3 Conventional Surgical Knife Attachment
The conventional surgical knife attachment was presented in [4]. The size and weight of the attachment are approximately 70 × 70 × 70 mm and 261 g, respectively. The size of the marker for MicronTracker3 is 30 × 40 mm. An LED bar module (OSX10201-GYR1) [11] and a piezoelectric speaker are mounted in the attachment, and a microcomputer (Arduino Nano) controls them. The LED module has 10 LEDs (five green, three yellow, and two red). The attachment has no internal power source; the electric power and control commands are supplied externally from the main PC through a USB cable (Fig. 2). Specification details are listed in Table 1.
4 Wireless Surgical Knife Attachment with Proximity Indicators Using ArUco Marker
In the new attachment, the case was made smaller and lighter, and a microcomputer with a Wi-Fi module and a rechargeable battery were built into the case, which is controlled wirelessly (Fig. 3). We confirmed that this battery lasts for more than 6 h. For the knife position measurement, ArUco [12] was used. ArUco is a marker recognition library for augmented reality (AR) which can estimate the poses of cameras and markers. Since ArUco is built into the OpenCV library, it can easily be used with an inexpensive USB camera. ArUco uses black-and-white square markers (ArUco markers, Fig. 4). Specification details are listed in Table 2. The circuit diagram of the attachment is illustrated in Fig. 5.
5 Knife Tip Position Calibration
To measure the relative vector from the marker on the attachment to the knife tip, the calibration procedure described in [4] was basically followed. The camera and knife coordinate systems are defined as \( \sum_{c} \) and \( \sum_{k} \), respectively. The marker attached to the knife and the marker fixed to the table are represented as \( {\text{M}}_{knife} \) and \( {\text{M}}_{table} \), respectively.
To acquire a relative vector from \( {\text{M}}_{knife} \) to the knife tip, the knife tip is placed at the origin point \( p_{table}^{c} \) of \( {\text{M}}_{table} \), and the position and orientation of each marker are measured in \( \sum_{c} \) coordinates (Fig. 6). The position and orientation of \( {\text{M}}_{knife} \) measured in \( \sum_{c} \) coordinates are defined as \( P_{knife}^{c} \) and \( R_{knife}^{c} \), respectively. The relative vector \( P_{rel}^{c} \) is calculated by the following equation.

$$ P_{rel}^{c} = p_{table}^{c} - P_{knife}^{c} $$
To convert \( P_{rel}^{c} \) into \( \sum_{k} \) coordinates, the following is used.

$$ P_{rel}^{k} = \left( R_{knife}^{c} \right)^{ - 1} P_{rel}^{c} $$
Finally, the knife tip position \( P_{tip}^{c} \) in \( \sum_{c} \) coordinates is estimated as follows.

$$ P_{tip}^{c} = P_{knife}^{c} + R_{knife}^{c} P_{rel}^{k} $$
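The calibration chain above can be sketched numerically as follows. The marker pose and table-origin values here are illustrative made-up numbers, not measured data; at the calibration pose, converting the relative vector into the knife frame and back should recover the table-marker origin exactly.

```python
import numpy as np

# Illustrative (made-up) values: pose of the knife marker M_knife in camera
# coordinates, and the origin of the table marker where the tip is placed.
R_knife_c = np.array([[0.0, -1.0, 0.0],    # orientation of M_knife in the camera frame
                      [1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0]])
P_knife_c = np.array([100.0, 50.0, 600.0])  # position of M_knife (mm)
p_table_c = np.array([130.0, 80.0, 610.0])  # knife tip placed at the M_table origin (mm)

# Calibration: relative vector in the camera frame, then expressed in the knife frame.
P_rel_c = p_table_c - P_knife_c
P_rel_k = np.linalg.inv(R_knife_c) @ P_rel_c  # R^-1 maps camera frame to knife frame

# During use: recover the tip position from the current marker pose.
P_tip_c = P_knife_c + R_knife_c @ P_rel_k
print(P_tip_c)  # round-trips to p_table_c at the calibration pose
```

Because \( P_{rel}^{k} \) is fixed to the knife, the last line also yields the tip position after the knife moves, using the newly measured \( P_{knife}^{c} \) and \( R_{knife}^{c} \).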
Note that OpenCV returns the orientation of an ArUco marker as a 3 × 1 rotation vector with reference to the marker coordinate system. The direction of this vector gives the rotation axis, and its norm gives the rotation angle. To convert the rotation vector into a rotation matrix in the camera coordinate system, Rodrigues' rotation formula was used.
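OpenCV provides this conversion as `cv2.Rodrigues`; a minimal dependency-free sketch of the same formula in NumPy looks like this (the function name is ours, not OpenCV's):

```python
import numpy as np

def rodrigues_to_matrix(rvec):
    """Convert a 3x1 rotation vector (axis * angle) into a 3x3 rotation matrix
    via Rodrigues' rotation formula (equivalent to OpenCV's cv2.Rodrigues)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)                                  # zero rotation
    k = np.asarray(rvec, dtype=float).ravel() / theta     # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],                     # skew-symmetric cross-product matrix
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    # R = I + sin(theta) * K + (1 - cos(theta)) * K^2
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

# Example: a 90-degree rotation about the z axis maps the x axis onto the y axis.
R = rodrigues_to_matrix(np.array([0.0, 0.0, np.pi / 2]))
print(np.round(R @ np.array([1.0, 0.0, 0.0]), 6))  # ≈ [0, 1, 0]
```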
6 Preliminary Experiments
6.1 Positional Precision in Distance Between Camera and Marker
We verified how the position measurement of the ArUco marker varies with the distance between the marker and the camera. Since it is quite difficult to verify the absolute accuracy of the distance, we examined the repeatability instead.
The experimental procedure is as follows. A 30 × 30 mm ArUco marker (ID 0 of DICT_4X4) was set on a horizontal table. A USB camera (Logicool C615, resolution 1600 × 896 pixels) was attached to a tripod and placed above the marker. By capturing the ArUco marker with the camera, the position of the marker was measured. The distance between the marker and the camera was set to approximately 500, 600, and 700 mm (Fig. 7), and at each distance the position of the marker was measured 100 times.
The measurement results of the marker position are shown in Fig. 8. With the camera position as the origin, the positive directions of the x, y, and z axes are right, downward, and backward, respectively. The graph shows that the measured distance is almost correct. The standard deviation of the measured position on each axis is shown in Fig. 9. The standard deviation in the z direction is larger than in the x and y directions at all measurement distances; however, all standard deviations are less than 1 mm. From this result, we found that ArUco has high distance repeatability and that the accuracy does not vary much with camera distance up to 700 mm.
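The per-axis repeatability reported above can be computed as in the following sketch. The measurement samples here are synthetic made-up noise standing in for the 100 real captures, with a larger spread along z to mirror the observed trend:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 100 repeated pose measurements at one camera distance:
# an assumed true marker position (mm) plus per-axis Gaussian noise.
true_pos = np.array([0.0, 0.0, 600.0])
noise_sd = np.array([0.1, 0.1, 0.5])       # assumed noise levels (mm), larger along z
samples = true_pos + rng.normal(0.0, noise_sd, size=(100, 3))

# Per-axis standard deviation over the repeated measurements (as in Fig. 9).
std_xyz = samples.std(axis=0, ddof=1)
print(dict(zip("xyz", np.round(std_xyz, 3))))
```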
6.2 Positional Precision in Changing Angle of Marker
We verified changes in marker positional precision due to the angle between the camera and the marker. Using the same setup as in the previous section, the position of the ArUco marker was measured by the camera placed above it. The marker was tilted to 0, 15, 30, 45, 60, and 75º around the x axis (Fig. 10), and the marker position was measured 100 times at each angle.
The measurement results of the marker position are shown in Fig. 11. The graph shows that the measured distance is almost correct. The standard deviation of the measured position on each axis is shown in Fig. 12. As the marker angle increased, the standard deviation along the z axis tended to decrease. On the other hand, the marker detection rate also tended to decrease as the marker angle increased (Fig. 13). Summarizing these results, it is best to use ArUco markers at an angle of around 60º.
7 Navigation Experiment
We conducted experiments to validate operator navigation using the new attachment. The task is to trace an invisible target circle with the knife tip based only on the LED information. The target circle is set on a flat horizontal table in front of the subject. The diameter of the circle is 80 mm. The attachment cube was fixed to a hard steel rod 130 mm in length and 6 mm in diameter.
With \( \left( {x_{\text{tip}} , y_{\text{tip}} } \right) \) as the knife tip position, \( \left( {x_{\text{c}} , y_{\text{c}} } \right) \) as the center of the circle, and \( r \) as the radius of the circle, the distance \( {\text{L}} \) between the knife tip and the circle is calculated from the following equation.

$$ {\text{L}} = \left| {\sqrt {\left( {x_{\text{tip}} - x_{\text{c}} } \right)^{2} + \left( {y_{\text{tip}} - y_{\text{c}} } \right)^{2} } - r} \right| $$
The LED array contains 10 LEDs, but the microcomputer of the new attachment has only 9 digital I/O pins. For this reason, the LEDs are gradually illuminated for \( 1 \le {\text{L}} \le 9 \) mm, and navigation is possible within a range of \( \pm 9 \) mm around the circle.
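The distance-to-LED mapping described above can be sketched as follows. The function names and the linear mapping from distance to LED count are our assumptions for illustration, not the authors' firmware:

```python
import math

def distance_to_circle(tip, center, r):
    """Distance L from the knife tip to the target circle (in the table plane)."""
    x_tip, y_tip = tip
    x_c, y_c = center
    return abs(math.hypot(x_tip - x_c, y_tip - y_c) - r)

def leds_to_light(L, n_leds=9):
    """Number of LEDs to illuminate: more LEDs as the tip nears the circle.
    All off beyond 9 mm; all 9 on within 1 mm (assumed linear mapping)."""
    if L >= n_leds:
        return 0
    return n_leds - max(int(math.ceil(L)), 1) + 1

# Example: tip 3 mm outside an 80 mm diameter circle centred at the origin.
L = distance_to_circle((43.0, 0.0), (0.0, 0.0), 40.0)
print(L, leds_to_light(L))  # → 3.0 7
```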
The five subjects (A to E) were undergraduate students from Osaka Electro-Communication University, not surgeons. The trajectories of the tip position are shown in Fig. 14. The green circle is the target circle, and the purple dots show the trajectory. The LEDs turn on within the range between the blue and yellow circles. The maximum and average navigation errors for each subject are shown in Fig. 15. The results show that navigation was possible with a maximum error of 9.3 mm using only the output of the LED array. This is nearly within the range presentable by the LED array, demonstrating the usefulness of the new device.
8 Conclusion
We developed a compact, lightweight, and inexpensive wireless surgical knife attachment that indicates proximity by gradually illuminating LEDs, using ArUco markers and a microcomputer with a Wi-Fi module. To investigate the positioning characteristics of the ArUco marker, we measured the repeatability of the marker position while statically changing the position and orientation of the marker, and clarified that the marker has an angle suitable for stable and robust use. We also conducted a navigation experiment with five subjects. Each subject held the knife with the attachment in hand and traced an invisible circle with the knife tip by watching the LEDs on the attachment, which were gradually illuminated according to the distance between the knife tip and the circle. The results showed that navigation is possible within the presentable range and that the new attachment has navigation accuracy comparable to the previous one.
In the future, we will consider using a higher-resolution USB camera, examining the marker size, and using multiple cameras for better precision, robustness, and stability.
References
Knee Navigation Application - Brainlab. https://www.brainlab.com/en/surgery-products/orthopedic-surgery-products/knee-navigation/
ClaroNav - Dental and ENT Navigation Solutions. http://www.claronav.com/
Surgical Theater - Surgical Navigation Advanced Platform (SNAP). http://www.surgicaltheater.net/site/products-services/surgical-navigation-advanced-platform-snap
Yano, D., Koeda, M., Onishi, K., Noborio, H.: Development of a surgical knife attachment with proximity indicators. In: Marcus, A., Wang, W. (eds.) DUXU 2017. LNCS, vol. 10289, pp. 608–618. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-58637-3_48
Noborio, H., Onishi, K., Koeda, M., Mizushino, K., Kunii, T., Kaibori, M., Kon, M., Chen, Y.: A fast surgical algorithm operating polyhedrons using Z-Buffer in GPU. In: Proceedings of the 9th Asian Conference on Computer Aided Surgery (ACCAS 2013), pp. 110–111 (2013)
Noborio, H., Watanabe, K., Yagi, M., Ida, Y., Nankaku, S., Onishi, K., Koeda, M., Kon, M., Matsui, K., Kaibori, M.: Experimental results of 2D depth-depth matching algorithm based on depth camera kinect v1. In: Proceedings of the International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS 2015), Track 3: Bioinformatics, Medical Imaging and Neuroscience, pp. 284–289 (2015)
Onishi, K., Noborio, H., Koeda, M., Watanabe, K., Mizushino, K., Kunii, T., Kaibori, M., Matsui, K., Kon, M.: Virtual liver surgical simulator by using Z-buffer for object deformation. In: Antona, M., Stephanidis, C. (eds.) UAHCI 2015. LNCS, vol. 9177, pp. 345–351. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-20684-4_34
Watanabe, K., Yoshida, S., Yano, D., Koeda, M., Noborio, H.: A new organ-following algorithm based on depth-depth matching and simulated annealing, and its experimental evaluation. In: Marcus, A., Wang, W. (eds.) DUXU 2017. LNCS, vol. 10289, pp. 594–607. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-58637-3_47
MicronTracker - ClaroNav. http://www.claronav.com/microntracker/
Noborio, H., Kunii, T., Mizushino, K.: GPU-based shortest distance algorithm for liver surgery navigation. In: Proceedings of the 10th Anniversary Asian Conference on Computer Aided Surgery, pp. 42–43 (2014)
OSX10201-GYR1 - AKIZUKI DENSHI TSUSHO CO., LTD. http://akizukidenshi.com/download/ds/optosupply/OSX10201-XXXX.PDF
Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F.J., Marín-Jiménez, M.J.: Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recogn. 47(6), 2280–2292 (2014)
Acknowledgement
This research was supported by Grants-in-Aid for Scientific Research (No. 26289069) from the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan.
© 2018 Springer International Publishing AG, part of Springer Nature
Koeda, M., Yano, D., Shintaku, N., Onishi, K., Noborio, H. (2018). Development of Wireless Surgical Knife Attachment with Proximity Indicators Using ArUco Marker. In: Kurosu, M. (eds) Human-Computer Interaction. Interaction in Context. HCI 2018. Lecture Notes in Computer Science(), vol 10902. Springer, Cham. https://doi.org/10.1007/978-3-319-91244-8_2