IDA 2016 Industrial Challenge: Using Machine Learning for Predicting Failures

  • Camila Ferreira Costa
  • Mario A. Nascimento
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9897)

Abstract

This paper presents our solutions to the IDA 2016 Industrial Challenge, which consists of using machine learning to predict whether a specific component of a vehicle's Air Pressure System faces imminent failure. We model this as a binary classification problem, since the goal is to determine whether an unobserved instance represents a failure. We evaluate several state-of-the-art classification algorithms and investigate how to deal with the imbalanced dataset and the large amount of missing data. Our experiments show that the best classifier was, cost-wise, 92.56% better than a baseline that classifies instances at random.
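The two data issues the abstract names, class imbalance and missing values, are commonly handled with imputation plus class weighting. The following is a minimal sketch of such a pipeline using scikit-learn (which the paper's tooling is built on); the synthetic data, the mean-imputation strategy, and the choice of a random forest with balanced class weights are illustrative assumptions, not the authors' exact method:

```python
# Hedged sketch, not the paper's actual pipeline: impute missing
# sensor readings with column means, then train a classifier with
# class weighting to counter the rare failure class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for the APS data: ~2% positive (failure) class.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.98, 0.02], random_state=0)

# Knock out ~10% of entries to mimic missing sensor readings.
mask = rng.random(X.shape) < 0.10
X[mask] = np.nan

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

imputer = SimpleImputer(strategy="mean")  # fill NaNs with column means
clf = RandomForestClassifier(class_weight="balanced", random_state=0)
clf.fit(imputer.fit_transform(X_tr), y_tr)
pred = clf.predict(imputer.transform(X_te))
print("test accuracy:", (pred == y_te).mean())
```

In the actual challenge the evaluation is cost-based (false negatives are far more expensive than false positives), so a real solution would tune the decision threshold against that cost matrix rather than plain accuracy.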

Acknowledgements

We acknowledge partial financial support by NSERC Canada, as well as preliminary discussions on this challenge with Philippe Gaudreau.

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Department of Computing Science, University of Alberta, Edmonton, Canada