Case-Based Decision Support System with Contextual Bandits Learning for Similarity Retrieval Model Selection

  • Conference paper

Knowledge Science, Engineering and Management (KSEM 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11061)

Abstract

Case-based reasoning has become one of the most widely adopted approaches supporting the development of personalized medicine. It draws on previous experience, in the form of resolved cases, to provide a solution to a new problem. In developing a case-based decision support system with the case-based reasoning methodology, it is critical to have a good similarity retrieval model that retrieves the cases most similar to the query case. Various factors, including feature selection and weighting, similarity functions, case representation and the knowledge model, need to be considered in developing a similarity retrieval model. It is difficult to build a single most reliable similarity retrieval model, as the best choice may differ according to the context of the user, the demographic and the query case. To address this challenge, the present work presents a case-based decision support system with multiple similarity retrieval models and proposes a contextual bandits learning algorithm to dynamically choose the most appropriate similarity retrieval model based on the context of the user, the query patient and demographic data. The proposed framework is designed for the DESIREE project, whose goal is to develop a web-based software ecosystem for the multidisciplinary management of primary breast cancer.
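
To make the selection mechanism concrete, the sketch below shows how a contextual bandit could, in principle, arbitrate between several similarity retrieval models given a context vector built from user, query-patient and demographic features. It is a minimal illustration only, using a generic LinUCB-style linear bandit; the class name, the simulated reward and the number of models are hypothetical and are not taken from the DESIREE system or the paper.

```python
# Illustrative sketch only: a LinUCB-style contextual bandit that picks one of
# several similarity retrieval models per query. `LinUCBSelector` and the toy
# reward signal are hypothetical placeholders, not part of the DESIREE system.
import numpy as np


class LinUCBSelector:
    """One linear reward model per arm; the arm (retrieval model) with the
    highest upper confidence bound for the current context is chosen."""

    def __init__(self, n_arms, context_dim, alpha=1.0):
        self.alpha = alpha
        self.A = [np.eye(context_dim) for _ in range(n_arms)]    # d x d design matrices
        self.b = [np.zeros(context_dim) for _ in range(n_arms)]  # reward-weighted contexts

    def select(self, context):
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b                                    # ridge-regression estimate
            ucb = theta @ context + self.alpha * np.sqrt(context @ A_inv @ context)
            scores.append(ucb)
        return int(np.argmax(scores))                            # index of chosen retrieval model

    def update(self, arm, context, reward):
        self.A[arm] += np.outer(context, context)
        self.b[arm] += reward * context


# Toy usage: three hypothetical retrieval models and a 5-dimensional context.
rng = np.random.default_rng(0)
selector = LinUCBSelector(n_arms=3, context_dim=5)
for _ in range(100):
    context = rng.normal(size=5)        # user / patient / demographic features
    arm = selector.select(context)      # retrieval model used for this query
    reward = float(rng.random() < 0.5)  # stand-in for feedback on retrieved cases
    selector.update(arm, context, reward)
```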

Notes

  1. http://www.desiree-project.eu.

Acknowledgments

The DESIREE project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No. 690238.

Author information

Corresponding author

Correspondence to Booma Devi Sekar.

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper

Cite this paper

Sekar, B.D., Wang, H. (2018). Case-Based Decision Support System with Contextual Bandits Learning for Similarity Retrieval Model Selection. In: Liu, W., Giunchiglia, F., Yang, B. (eds) Knowledge Science, Engineering and Management. KSEM 2018. Lecture Notes in Computer Science, vol 11061. Springer, Cham. https://doi.org/10.1007/978-3-319-99365-2_37

  • DOI: https://doi.org/10.1007/978-3-319-99365-2_37

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-99364-5

  • Online ISBN: 978-3-319-99365-2

  • eBook Packages: Computer Science, Computer Science (R0)
