
Data-Driven Generation of Rubric Parameters from an Educational Programming Environment

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10331)

Abstract

We demonstrate that, using a small set of hand-graded student submissions, we can automatically generate rubric parameters with a high degree of validity, and that a predictive model incorporating these rubric parameters is more accurate than a previously reported model. We present this method as one approach to the often challenging problem of grading assignments in programming environments. A classic solution is to create unit tests that the student-generated program must pass, but the rigid, structured nature of unit tests is suboptimal for assessing more open-ended assignments. Furthermore, creating unit tests requires anticipating the various ways a student might correctly solve a problem, a challenging and time-intensive process. The current study proposes an alternative, semi-automated method for generating rubric parameters using low-level data from the Alice programming environment.
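To make the general shape of such a pipeline concrete, the Python sketch below derives simple count-based candidate rubric parameters from hypothetical low-level log events and fits a basic classifier to a small hand-graded sample. The event names, feature set, log format, and choice of logistic regression are illustrative assumptions only; they are not taken from the paper, whose actual method and data are not shown on this page.

    # Minimal, hypothetical sketch of a semi-automated rubric pipeline:
    # derive candidate rubric parameters from low-level environment logs,
    # then fit a predictive model on a small hand-graded sample.
    # Event names, features, and model choice are assumptions for illustration.

    from collections import Counter
    from typing import List, Tuple

    from sklearn.linear_model import LogisticRegression

    # One submission's low-level log: (event_type, detail) pairs, as they
    # might be exported from an Alice-like environment (assumed format).
    LogEvent = Tuple[str, str]

    FEATURES = ["create_method", "add_loop", "add_conditional", "run_program"]


    def extract_rubric_features(events: List[LogEvent]) -> List[int]:
        """Count selected event types as candidate rubric parameters."""
        counts = Counter(event_type for event_type, _ in events)
        return [counts[name] for name in FEATURES]


    def fit_grading_model(logs: List[List[LogEvent]],
                          hand_grades: List[int]) -> LogisticRegression:
        """Fit a simple classifier on a small set of hand-graded submissions."""
        X = [extract_rubric_features(events) for events in logs]
        model = LogisticRegression(max_iter=1000)
        model.fit(X, hand_grades)
        return model


    if __name__ == "__main__":
        # Tiny synthetic example: two hand-graded submissions (1 = meets rubric).
        graded_logs = [
            [("create_method", "hop"), ("add_loop", "count=3"), ("run_program", "")],
            [("run_program", ""), ("run_program", "")],
        ]
        model = fit_grading_model(graded_logs, hand_grades=[1, 0])
        new_submission = [("create_method", "spin"), ("add_conditional", "if visible")]
        print(model.predict([extract_rubric_features(new_submission)]))

In this framing, the fitted coefficients indicate which candidate parameters carry grading signal, which is one plausible way a small hand-graded set could seed rubric parameters; the paper itself should be consulted for how the authors actually generate and validate them.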




Author information

Correspondence to Nicholas Diana.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Diana, N., Eagle, M., Stamper, J., Grover, S., Bienkowski, M., Basu, S. (2017). Data-Driven Generation of Rubric Parameters from an Educational Programming Environment. In: André, E., Baker, R., Hu, X., Rodrigo, M., du Boulay, B. (eds) Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science (LNAI), vol. 10331. Springer, Cham. https://doi.org/10.1007/978-3-319-61425-0_47


  • DOI: https://doi.org/10.1007/978-3-319-61425-0_47


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-61424-3

  • Online ISBN: 978-3-319-61425-0

  • eBook Packages: Computer Science, Computer Science (R0)
