Further Idea on Optimal Q-Learning Fuzzy Energy Controller for FC/SC HEV

  • Jili Tao
  • Ridong Zhang
  • Yong Zhu


With the development of intelligent algorithms, learning-based methods have come to be regarded as viable solutions to a variety of optimization and control problems, and genetic algorithms (GA) can also be used to tune these emerging intelligent algorithms efficiently. Here, an adaptive fuzzy energy management strategy (EMS) based on the Q-Learning algorithm is presented for the real-time power split between the fuel cell and the supercapacitor in a hybrid electric vehicle (HEV), with the aim of adapting to dynamic driving patterns and reducing fuel consumption. Unlike methods based on driving pattern recognition, the Q-Learning controller observes the driving states, takes actions, and measures the effects of those actions. By processing the accumulated experience, the Q-Learning controller progressively learns an appropriate fuzzy EMS output tuning policy that associates suitable actions with the different driving patterns. The environment adaptation capability of the fuzzy EMS is thus improved without the need for driving pattern recognition. To enhance the learning capability and reduce sensitivity to the initial values of the Q-table, GA is further employed to optimize the initial values of the Q-Learning based fuzzy energy management.
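The learn-by-interaction loop described above can be sketched as a standard tabular Q-Learning update. This is a minimal illustration only: the state discretization (power-demand bins), the action set (tuning factors for the fuzzy EMS output), and the reward are assumptions for the sketch, not the chapter's actual design; in the chapter, GA would supply the initial Q-table values rather than the zeros used here.

```python
import random

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration rate
N_STATES = 5    # assumed: discretized driving states, e.g. power-demand bins
N_ACTIONS = 3   # assumed: fuzzy EMS output tuning actions (scale down / keep / scale up)

# Initial Q-table; in the GA-enhanced scheme these entries would be
# optimized offline by a genetic algorithm instead of starting at zero.
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]

def choose_action(state):
    """Epsilon-greedy selection: explore with probability EPSILON,
    otherwise pick the action with the highest Q-value for this state."""
    if random.random() < EPSILON:
        return random.randrange(N_ACTIONS)
    return max(range(N_ACTIONS), key=lambda a: Q[state][a])

def update(state, action, reward, next_state):
    """Q-Learning update: Q <- Q + alpha * (r + gamma * max_a' Q(s', a') - Q)."""
    best_next = max(Q[next_state])
    Q[state][action] += ALPHA * (reward + GAMMA * best_next - Q[state][action])
```

At each control step the controller would observe the driving state, call `choose_action`, apply the corresponding tuning to the fuzzy EMS output, and then call `update` with a reward reflecting fuel consumption, so the policy improves as experience accumulates.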



Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. School of Information Science and Engineering, NingboTech University, Ningbo, China
  2. The Belt and Road Information Research Institute, Hangzhou Dianzi University, Hangzhou, China
