A Differential Privacy Workflow for Inference of Parameters in the Rasch Model

  • Teresa Anna Steiner
  • David Enslev Nyrnberg
  • Lars Kai Hansen (corresponding author)
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11054)

Abstract

The Rasch model is used to estimate student performance and task difficulty in simple test scenarios. We design a workflow for enhancing student feedback by releasing the difficulty parameters of the Rasch model under differential privacy. We give a first proof of differential privacy for Rasch models and derive the minimum noise level in objective perturbation needed to guarantee a given privacy budget. We test the workflow in simulations and on two real data sets.
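To make the abstract's approach concrete, the following is a minimal sketch (not the authors' implementation) of objective perturbation for Rasch difficulty estimation: a random linear term is added to the regularized negative log-likelihood before minimization. All names, the noise scale, and the assumption that student abilities are known are illustrative choices made here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate a small test: n students, m items, Rasch model
# P(student s answers item i correctly) = sigmoid(theta_s - delta_i)
n, m = 200, 5
theta = rng.normal(0.0, 1.0, n)                 # student abilities (assumed known here)
delta_true = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # true item difficulties
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta_true[None, :])))
X = (rng.random((n, m)) < p).astype(float)      # binary response matrix

def perturbed_objective(d, theta, X, lam, b):
    """Regularized Rasch negative log-likelihood in the difficulties d,
    plus the random linear term (b @ d) / n used in objective perturbation."""
    z = theta[:, None] - d[None, :]
    # Bernoulli log-likelihood with logit z: X*z - log(1 + exp(z))
    ll = np.sum(X * z - np.logaddexp(0.0, z))
    n = len(theta)
    return -ll / n + 0.5 * lam * (d @ d) + (b @ d) / n

# Noise vector b: norm drawn from a Gamma distribution scaled by 1/epsilon,
# direction uniform on the sphere (the shape/scale here are placeholders;
# the paper derives the minimum noise level for a given budget epsilon).
epsilon = 1.0
norm = rng.gamma(shape=m, scale=2.0 / epsilon)
direction = rng.normal(size=m)
b = norm * direction / np.linalg.norm(direction)

res = minimize(perturbed_objective, np.zeros(m), args=(theta, X, 1e-2, b))
print(np.round(res.x, 2))  # privately estimated item difficulties
```

Larger `epsilon` (a looser privacy budget) shrinks the perturbation, so the private estimates approach the ordinary regularized maximum-likelihood fit.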

Keywords

Rasch model · Differential privacy · Student feedback

Notes

Acknowledgements

We would like to thank Martin Søren Engmann Djurhuus, who worked with us on the project in its early stages during the course “Advanced Machine Learning” at DTU. Further, we would like to thank Morten Mørup for access to the DTU data.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Teresa Anna Steiner¹
  • David Enslev Nyrnberg¹
  • Lars Kai Hansen¹ (corresponding author)

  1. Department of Applied Mathematics and Computer Science, Technical University of Denmark, B324, Kongens Lyngby, Denmark