Leveraging Reproduction-Error Representations for Multi-Instance Classification

  • Sebastian Kauschke
  • Max Mühlhäuser
  • Johannes Fürnkranz
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11198)

Abstract

Multi-instance learning deals with the problem of classifying bags of instances when only the bag labels are known for learning and the instances themselves are unlabeled. In this work, we propose a method that trains an autoencoder on the instances of each class and re-encodes each instance into a representation that captures its reproduction error. The idea behind this approach is that an autoencoder trained only on instances of a single class cannot properly reproduce examples from other classes, which is then reflected in the encoding. The transformed instances are then fed into a propositional classifier that predicts the latent instance label. In a second classification layer, the bag label is determined from the outputs of the propositional classifier on all instances in the bag. We show that this reproduction-error encoding creates an advantage over classifying the non-encoded data, and that further research in this direction could benefit multi-instance learning.
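The following is a minimal sketch of the pipeline described above, not the authors' implementation: it assumes the representation stacks the per-class reconstruction errors, uses scikit-learn's MLPRegressor as a stand-in for the paper's denoising autoencoders, a logistic regression as the propositional instance classifier, and mean-probability thresholding as the bag-level aggregation rule. All function names, network sizes, and the toy data are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LogisticRegression


def train_class_autoencoders(instances_by_class, hidden=16, seed=0):
    """Train one autoencoder per class, using only that class's instances."""
    autoencoders = {}
    for label, X in instances_by_class.items():
        ae = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=2000,
                          random_state=seed)
        ae.fit(X, X)  # reconstruct the input
        autoencoders[label] = ae
    return autoencoders


def encode_reproduction_error(autoencoders, X):
    """Re-encode each instance as its reconstruction error under each class's autoencoder."""
    errors = [np.mean((X - autoencoders[label].predict(X)) ** 2, axis=1)
              for label in sorted(autoencoders)]
    return np.column_stack(errors)  # shape: (n_instances, n_classes)


def classify_bag(instance_clf, autoencoders, bag):
    """Second layer: aggregate instance-level predictions into a bag label.
    A simple mean of predicted positive probabilities with a 0.5 threshold
    is assumed here as the aggregation rule."""
    Z = encode_reproduction_error(autoencoders, bag)
    p_pos = instance_clf.predict_proba(Z)[:, 1]
    return int(p_pos.mean() > 0.5)


# Toy usage: two instance classes drawn from shifted Gaussians.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(200, 5))
X1 = rng.normal(3.0, 1.0, size=(200, 5))
aes = train_class_autoencoders({0: X0, 1: X1})
Z = encode_reproduction_error(aes, np.vstack([X0, X1]))
y = np.array([0] * 200 + [1] * 200)
clf = LogisticRegression().fit(Z, y)        # propositional instance classifier
bag = rng.normal(3.0, 1.0, size=(10, 5))    # a bag of positive-class instances
print(classify_bag(clf, aes, bag))          # expected: 1
```

In this sketch the reproduction-error space has only as many dimensions as there are classes, which is what makes the subsequent propositional classification step simple; the paper's actual encoding and aggregation choices may differ.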

Keywords

Multi-instance learning · Denoising autoencoder · Bag classification · Reproduction-error representation

Acknowledgements

This work has been sponsored by the German Federal Ministry of Education and Research (BMBF) Software Campus project Effiziente Modellierungstechniken für Predictive Maintenance [01IS17050]. We also gratefully acknowledge the use of the Lichtenberg high-performance computer at TU Darmstadt for our experiments.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Sebastian Kauschke (1, 2)
  • Max Mühlhäuser (2)
  • Johannes Fürnkranz (1)

  1. Knowledge Engineering Group, TU Darmstadt, Darmstadt, Germany
  2. Telecooperation Group, TU Darmstadt, Darmstadt, Germany