
Quantitative model of irrigation effect on maize yield by deep neural network


A rapidly expanding world population and extreme climate change have made food production a crucial challenge of the twenty-first century. Improving crop management could be an effective response to this challenge. However, because of the cost and time required for field work, researchers rely widely on agricultural systems models to examine the impacts of different crop management scenarios. The complexity of these models, in turn, limits their use in producing practical knowledge for producers. Meanwhile, deep learning has been recognized as the preferred machine learning approach when dealing with large datasets, and it is readily adopted by non-experts because it automatically learns useful representations from raw data. One drawback of deep learning is training time, which can range from a couple of weeks to a few months. The goal of this study is therefore to examine the applicability of deep learning for building a quantitative model of crop growth. An agricultural systems model, the Decision Support System for Agrotechnology Transfer (DSSAT), is used to evaluate the impacts of irrigation amount and timing of application on crop yield. A deep neural network is trained on large numbers of DSSAT model inputs (i.e., precipitation date, precipitation amount, irrigation date, irrigation amount) and outputs (i.e., maize yield at the end of the growing season). To simplify the process, the irrigation and rainfall amounts are combined and presented as a single amount of water per day. Experimental results demonstrate the effectiveness of the proposed deep learning technique for crop yield prediction.
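The pipeline described above (a growing season of combined daily water amounts fed to a feedforward network that predicts end-of-season yield) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the network sizes, learning rate, synthetic stand-in data, and yield formula are all assumptions; the actual study trains on DSSAT simulation outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

SEASON_DAYS = 120  # one input per day: rainfall + irrigation combined (mm/day)
HIDDEN = 32

# Synthetic stand-in data; the real study uses DSSAT-simulated maize yields.
X = rng.uniform(0.0, 10.0, size=(256, SEASON_DAYS))
y = X.mean(axis=1, keepdims=True) * 0.8 + rng.normal(0.0, 0.05, (256, 1))

# One-hidden-layer feedforward network trained by batch gradient descent.
W1 = rng.normal(0.0, 0.1, (SEASON_DAYS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, 1))
b2 = np.zeros(1)

lr = 1e-4
losses = []
for _ in range(1000):
    h = np.maximum(X @ W1 + b1, 0.0)   # hidden layer, ReLU activation
    pred = h @ W2 + b2                 # predicted end-of-season yield
    err = pred - y
    losses.append(float((err ** 2).mean()))
    # Backpropagate the mean-squared-error gradient
    g_pred = 2.0 * err / len(X)
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (h > 0)    # ReLU gradient mask
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    W2 -= lr * gW2
    b2 -= lr * gb2
    W1 -= lr * gW1
    b1 -= lr * gb1

print(f"training MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

In practice a deep learning framework and a deeper architecture would replace this hand-rolled loop, but the structure is the same: the daily water vector is the feature vector, and the scalar yield is the regression target.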






Author information



Corresponding author

Correspondence to A. Pouyan Nejadhashemi.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Saravi, B., Nejadhashemi, A.P. & Tang, B. Quantitative model of irrigation effect on maize yield by deep neural network. Neural Comput & Applic 32, 10679–10692 (2020).



  • Deep learning
  • Neural network
  • Parallel computing
  • Crop modeling