Accurate Evaluation of Rotational Angle and Translation Movement of Our Organ-Following Algorithm Based on Depth-Depth Matching
We present an algorithm, based on simulated annealing, that makes a virtual liver follow the motion of an actual liver. We evaluate its precision using the concordance rate between range images of the virtual and actual livers. The concordance rate is obtained by superimposing a range image, generated by graphically z-buffering a liver polyhedron in standard triangulated language (STL) form on a PC, onto a depth image of the actual liver captured with a Kinect v2. However, when the actual liver undergoes translational and rotational motion, this rate alone does not tell us how accurately the algorithm tracks the actual movement. In this study, we first build a mechanical system that moves a replica of an actual liver translationally and rotationally for measurement. The system has two translational degrees of freedom (X, Y) and three rotational degrees of freedom (yaw, roll, pitch), allowing it to move the liver replica with high accuracy. Next, we move the replica precisely, measure how far the simulated annealing-based algorithm moves the virtual liver, and evaluate its accuracy. Whereas previous experiments were conducted under fluorescent lamps and sunlight, this experiment is conducted in an operating room lit by two shadow-less lamps. The Kinect v2 captures depth images through a shade filter that blocks interference from the infrared light of the shadow-less lamps. We also evaluate the concordance rate, as in our earlier work, together with the precision of the measured translational and rotational movement.
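The core evaluation loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the concordance rate is assumed to be the fraction of overlapping valid pixels whose virtual (z-buffer) and measured (Kinect v2) depths agree within a tolerance, and `render` stands in for the actual z-buffer rendering of the virtual liver at a candidate 5-DOF pose (X, Y, yaw, roll, pitch). The annealing schedule and step sizes are likewise illustrative assumptions.

```python
import numpy as np


def concordance_rate(virtual_depth, real_depth, tol=5.0):
    """Fraction of valid pixels whose depths agree within tol.
    Pixels with depth 0 (no measurement) are ignored.
    (Assumed definition of the concordance rate, for illustration.)"""
    valid = (virtual_depth > 0) & (real_depth > 0)
    if not valid.any():
        return 0.0
    agree = np.abs(virtual_depth[valid] - real_depth[valid]) < tol
    return float(agree.mean())


def anneal_pose(render, real_depth, pose0, steps=500, t0=1.0, seed=0):
    """Simulated annealing over a 5-DOF pose (x, y, yaw, roll, pitch).
    `render(pose)` must return the virtual liver's z-buffer depth image
    at that pose; the annealer maximizes the concordance rate."""
    rng = np.random.default_rng(seed)
    pose = np.asarray(pose0, dtype=float)
    score = concordance_rate(render(pose), real_depth)
    best_score, best_pose = score, pose.copy()
    for k in range(steps):
        temp = t0 * (1.0 - k / steps) + 1e-6  # linear cooling schedule
        cand = pose + rng.normal(scale=temp, size=5)  # random pose perturbation
        cand_score = concordance_rate(render(cand), real_depth)
        # Accept improvements always; accept worse poses with
        # Boltzmann probability so the search can escape local optima.
        if cand_score > score or rng.random() < np.exp((cand_score - score) / temp):
            pose, score = cand, cand_score
            if score > best_score:
                best_score, best_pose = score, pose.copy()
    return best_pose, best_score
```

In the experiment described in this paper, the replica liver's true translation and rotation are known from the mechanical stage, so the recovered `best_pose` can be compared component-wise against the commanded motion.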
Keywords: Digital imaging and communications in medicine · Virtual liver polyhedron in standard triangulated language form · Replica of an actual liver · Simulated annealing · Liver surgery navigator
This research was partially supported by the Collaborative Research Fund for Graduate Schools (A) of Osaka Electro-Communication University and by a Grant-in-Aid for Scientific Research from the Ministry of Education, Culture, Sports, Science and Technology (Research Project Number: JP26289069).