
Orchids Classification Using Spatial Transformer Network with Adaptive Scaling

  • Conference paper
  • First Online:
Intelligent Data Engineering and Automated Learning – IDEAL 2019 (IDEAL 2019)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 11871)

Abstract

The orchid family is a large and diverse group of flowering plants found mainly in tropical areas, and classifying orchid species from images is a challenging task. In this paper, we propose an adaptive classification model for orchid images based on a Deep Convolutional Neural Network (D-CNN). The first part of the model improves the quality of the input feature maps with an adaptive Spatial Transformer Network (STN) module, which performs spatial transformations to warp the input image into views at different locations and scales. A D-CNN then extracts image features from these warped views along four branches. The feature channels of the branches are concatenated and their dimensionality is reduced by an estimation block, and the resulting feature maps are forwarded to the prediction layers to predict the orchid species. We verified the efficiency of the proposed method on our data set of 52 orchid flower classes containing 3,559 samples. The model achieved an average classification accuracy of 93.32%, which is higher than that of existing D-CNN models.
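
The abstract describes a four-stage pipeline: an adaptive STN produces warped views at different locations and scales, four D-CNN branches extract features from those views, an estimation block reduces the concatenated feature channels, and a prediction head outputs the class. The sketch below illustrates this flow under stated assumptions; it is not the authors' implementation. The PyTorch framework, the shallow branch backbone, the 1x1-convolution estimation block, and all layer sizes are illustrative choices only.

```python
# Minimal sketch of the STN-with-adaptive-scaling pipeline outlined in the
# abstract. Backbone depth, layer sizes, and the 1x1-conv "estimation block"
# are assumptions for illustration, not the paper's exact configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveSTN(nn.Module):
    """Predicts translation and scale parameters for several warped views."""

    def __init__(self, num_crops=4):
        super().__init__()
        self.num_crops = num_crops
        self.localization = nn.Sequential(
            nn.Conv2d(3, 16, 7, stride=2, padding=3), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 3 * num_crops),  # (tx, ty, scale) per view
        )

    def forward(self, x):
        theta = self.localization(x).view(-1, self.num_crops, 3)
        views = []
        for i in range(self.num_crops):
            tx, ty = torch.tanh(theta[:, i, 0]), torch.tanh(theta[:, i, 1])
            s = torch.sigmoid(theta[:, i, 2])  # adaptive zoom factor in (0, 1)
            # Build a 2x3 affine matrix: isotropic scaling plus translation.
            mat = torch.zeros(x.size(0), 2, 3, device=x.device)
            mat[:, 0, 0] = s
            mat[:, 1, 1] = s
            mat[:, 0, 2] = tx
            mat[:, 1, 2] = ty
            grid = F.affine_grid(mat, x.size(), align_corners=False)
            views.append(F.grid_sample(x, grid, align_corners=False))
        return views  # warped views at different locations and scales


class OrchidClassifier(nn.Module):
    def __init__(self, num_classes=52, num_crops=4, branch_channels=64):
        super().__init__()
        self.stn = AdaptiveSTN(num_crops)
        # One small D-CNN branch per warped view (a real system would likely
        # use a deeper backbone such as a ResNet or Inception variant).
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, branch_channels, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4),
            )
            for _ in range(num_crops)
        ])
        # "Estimation block": reduce the concatenated channels with a 1x1 conv.
        self.estimation = nn.Sequential(
            nn.Conv2d(branch_channels * num_crops, branch_channels, 1), nn.ReLU(),
        )
        self.head = nn.Linear(branch_channels * 4 * 4, num_classes)

    def forward(self, x):
        views = self.stn(x)
        feats = [branch(v) for branch, v in zip(self.branches, views)]
        fused = self.estimation(torch.cat(feats, dim=1))
        return self.head(fused.flatten(1))


if __name__ == "__main__":
    model = OrchidClassifier()
    logits = model(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 52])
```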


Author information

Correspondence to Watcharin Sarachai, Jakramate Bootkrajang, Jeerayut Chaijaruwanich or Samerkae Somhom.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Sarachai, W., Bootkrajang, J., Chaijaruwanich, J., Somhom, S. (2019). Orchids Classification Using Spatial Transformer Network with Adaptive Scaling. In: Yin, H., Camacho, D., Tino, P., Tallón-Ballesteros, A., Menezes, R., Allmendinger, R. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2019. IDEAL 2019. Lecture Notes in Computer Science, vol. 11871. Springer, Cham. https://doi.org/10.1007/978-3-030-33607-3_1

  • DOI: https://doi.org/10.1007/978-3-030-33607-3_1

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-33606-6

  • Online ISBN: 978-3-030-33607-3

  • eBook Packages: Computer Science, Computer Science (R0)
