DeepEye: A Dedicated Camera for Deep-Sea Tripod Observation Systems

  • Huimin Lu
  • Yujie Li
  • Hyoungseop Kim
  • Seiichi Serikawa
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 810)

Abstract

The deep-sea tripod systems were designed and built at the U.S. Geological Survey (USGS) Pacific Coastal and Marine Science Center (PCMSC) in Santa Cruz, California. They were recovered in late September 2014 after spending about half a year collecting data on the floor of the South China Sea. Named the Free-Ascending Tripod (FAT), the system was deployed at 2,100 m water depth, roughly 10 times as deep as most tripods dedicated to measuring currents and sediment movement at the seafloor. Deployment at this unusual depth was made possible by the tripod's ability to rise to the surface by itself rather than being pulled up by a line. Instruments mounted on the tripod took bottom photographs and measured variables such as water temperature, current velocity, and suspended-sediment concentration. FAT is used to better understand how and where deep-seafloor sediment moves and accumulates, and we also use it to study deep-sea biology. In the images obtained from the camera, however, marine animals are hard to distinguish. In this project, we therefore apply novel underwater imaging technologies to recover the deep-sea scene.
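Bottom photographs from deep-sea cameras typically show a strong colour cast and low contrast caused by absorption and scattering. As a rough illustration only, the sketch below shows a generic pre-processing step of this kind (gray-world white balance followed by CLAHE contrast enhancement) using OpenCV and NumPy; the file names and parameters are hypothetical, and this is not the DeepEye processing pipeline described in the chapter.

import cv2
import numpy as np

def enhance_underwater_image(path):
    """Generic colour compensation + contrast enhancement for a turbid
    underwater frame. Illustrative sketch only, not the DeepEye method."""
    img = cv2.imread(path)  # BGR, uint8
    if img is None:
        raise FileNotFoundError(path)

    # Gray-world white balance: scale each channel so its mean matches
    # the overall mean, compensating the blue-green cast of deep water.
    img_f = img.astype(np.float32)
    channel_means = img_f.reshape(-1, 3).mean(axis=0)
    img_f *= channel_means.mean() / channel_means
    balanced = np.clip(img_f, 0, 255).astype(np.uint8)

    # CLAHE on the luminance channel to recover local contrast lost to scattering.
    lab = cv2.cvtColor(balanced, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    lab = cv2.merge((clahe.apply(l), a, b))
    return cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

if __name__ == "__main__":
    # "fat_bottom_photo.png" is a hypothetical example file name.
    out = enhance_underwater_image("fat_bottom_photo.png")
    cv2.imwrite("fat_bottom_photo_enhanced.png", out)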

Keywords

Underwater camera · Deep-sea tripod · South China Sea

Notes

Acknowledgements

This work was supported by Leading Initiative for Excellent Young Researcher of Ministry of Education, Culture, Sports, Science and Technology-Japan (16809746), Grants-in-Aid for Scientific Research of JSPS (17K14694), Research Fund of State Key Laboratory of Marine Geology in Tongji University (MGK1803), Research Fund of State Key Laboratory of Ocean Engineering in Shanghai Jiaotong University (1510; 1315), Research Fund of The Telecommunications Advancement Foundation, Fundamental Research Developing Association for Shipbuilding and Offshore, Japan-China Scientific Cooperation Program (6171101454), International Exchange Program of National Institute of Information and Communications (NICT), and Collaboration Program of National Institute of Informatics (NII).

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Huimin Lu (1)
  • Yujie Li (2)
  • Hyoungseop Kim (1)
  • Seiichi Serikawa (1)
  1. Kyushu Institute of Technology, Fukuoka, Japan
  2. Fukuoka University, Fukuoka, Japan
