
Harnessing the Currents of the Digital Ocean

Abstract

The digital revolution marks the shift in human history by which experiential inputs and work products can be transformed into digital form and then immediately collected, modified, moved, stored, and computed upon. This shift has already had remarkable social consequences and raises fundamental questions about the nature of science and knowledge. In the context of educational research, it raises key questions about the nature of our relationship with data in scientific endeavor, the role of computing systems, and the computational skills researchers require.

This paper extends the discussion begun by DiCerbo and Behrens (2012, in Computers and their impact on state assessment: Recent history and predictions for the future, Information Age, Charlotte, NC, pp. 273–306), which outlined how the societal shift related to the digital revolution can be understood as a move from a pre-digital “digital desert” to a post-digital “digital ocean.” Using the framework of Evidence Centered Design (Mislevy, Steinberg, & Almond, Language Testing, 19(4), 477–496, 2002), they suggest that the core processes of educational assessment and data collection can be rethought in light of the new capabilities afforded by computing devices and large amounts of data, and further that many of our original categories of educational activity reflect views limited by their origins in the digital desert. After reviewing the core ideas of the shift from digital desert to digital ocean, implications for educational research are addressed in methodological terms, including the role of data in hypothesis generation, the role of data in theory testing, the impact of continuous data generation and analysis, and the changing role of statistical and computational tools. Implications for researcher training are addressed throughout.

Keywords

Work Product, Summative Assessment, Digital Device, Open Educational Resource, Fixed Response

References

  1. Almond, R. G., DiBello, L. V., Moulder, B., & Zapata-Rivera, J.-D. (2007). Modeling diagnostic assessments with Bayesian networks. Journal of Educational Measurement, 44(4), 341–359.
  2. Almond, R., Steinberg, L., & Mislevy, R. (2002). Enhancing the design and delivery of assessment systems: A four-process architecture. The Journal of Technology, Learning and Assessment, 1(5). Retrieved from https://escholarship.bc.edu/ojs/index.php/jtla/article/view/1671
  3. American Psychological Association, American Educational Research Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Psychological Association.
  4. Baker, R., Walonoski, J., Heffernan, N., Roll, I., Corbett, A., & Koedinger, K. (2008). Why students engage in “gaming the system” behavior in interactive learning environments. Journal of Interactive Learning Research, 19(2), 185–224.
  5. Behrens, J. T. (1997). Principles and procedures of exploratory data analysis. Psychological Methods, 2, 131–160.
  6. Behrens, J. T., & DiCerbo, K. E. (2013). Technological implications for assessment ecosystems: Opportunities for digital technology to advance assessment. Princeton, NJ: The Gordon Commission on the Future of Assessment.
  7. Behrens, J. T., DiCerbo, K. E., Yel, N., & Levy, R. (2012). Exploratory data analysis. In I. B. Weiner, J. A. Schinka, & W. F. Velicer (Eds.), Handbook of psychology: Vol. 2. Research methods in psychology (2nd ed., pp. 34–70). New York, NY: Wiley.
  8. Behrens, J. T., Frezzo, D., Mislevy, R., Kroopnick, M., & Wise, D. (2006). Structural, functional and semiotic symmetries in simulation-based games and assessments. In E. Baker, J. Dickieson, W. Wulfeck, & H. O’Neil (Eds.), Assessment of problem solving using simulations (pp. 59–80). New York, NY: Routledge.
  9. Behrens, J. T., & Robinson, D. H. (2005). The micro and the macro in the analysis and conceptualization of experimental data. In G. D. Phye, D. H. Robinson, & J. Levin (Eds.), Empirical methods for evaluating educational interventions (pp. 147–173). Burlington, MA: Elsevier.
  10. Behrens, J. T., & Smith, M. L. (1996). Data and data analysis. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of educational psychology (pp. 945–989). New York, NY: Macmillan.
  11. Bird, S., Klein, E., & Loper, E. (2009). Natural language processing with Python (1st ed.). Sebastopol, CA: O’Reilly Media.
  12. Brooks, S., Gelman, A., Jones, G., & Meng, X. (2011). The handbook of Markov chain Monte Carlo. Boca Raton, FL: Chapman & Hall/CRC.
  13. Cen, H., Koedinger, K., & Junker, B. (2006). Learning factors analysis: A general method for cognitive model evaluation and improvement. In Intelligent tutoring systems (pp. 164–175). Retrieved from http://link.springer.com/chapter/10.1007/11774303_17
  14. Conway, D., & White, J. M. (2012). Machine learning for hackers (1st ed.). Sebastopol, CA: O’Reilly Media.
  15. Daniel, J. (2012). Making sense of MOOCs: Musings in a maze of myth, paradox and possibility. Journal of Interactive Media in Education, 3. Retrieved from http://www-jime.open.ac.uk/jime/article/viewArticle/2012-18/html
  16. DiCerbo, K. (2014). Game-based measurement of persistence. Journal of Educational Technology and Society, 17(1), 17–28.
  17. DiCerbo, K., & Behrens, J. (2012). Implications of the digital ocean on current and future assessment. In R. Lissitz & H. Jiao (Eds.), Computers and their impact on state assessment: Recent history and predictions for the future (pp. 273–306). Charlotte, NC: Information Age.
  18. DiCerbo, K. E., & Behrens, J. T. (2014). The impact of the digital ocean on education [White paper]. London: Pearson.
  19. Engeström, Y., Miettinen, R., & Punamäki, R.-L. (1999). Perspectives on activity theory. New York, NY: Cambridge University Press.
  20. Feng, M., & Heffernan, N. T. (2006). Informing teachers live about student learning: Reporting in the Assistment system. Technology, Instruction, Cognition and Learning, 3(1/2), 63.
  21. Franks, B. (2012). Taming the big data tidal wave: Finding opportunities in huge data streams with advanced analytics. Hoboken, NJ: Wiley.
  22. Frezzo, D. C., Behrens, J. T., Mislevy, R. J., West, P., & DiCerbo, K. E. (2009). Psychometric and evidentiary approaches to simulation assessment in Packet Tracer software. In Fifth International Conference on Networking and Services (ICNS ’09) (pp. 555–560). Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4976818
  23. Frezzo, D. C., DiCerbo, K. E., Behrens, J. T., & Chen, M. (2014). An extensible micro-world for learning in the networking professions. Information Sciences, 264, 91–103.
  24. Gee, J. P. (2003). What video games have to teach us about learning and literacy. Computers in Entertainment (CIE), 1(1), 20.
  25. Gelman, A., & Hill, J. (2006). Data analysis using regression and multilevel/hierarchical models (1st ed.). New York, NY: Cambridge University Press.
  26. Gigerenzer, G. (2009). Surrogates for theory. APS Observer, 22(2). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2009/february-09/surrogates-for-theory.html
  27. Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5(10), 3–8.
  28. Gordon, M. E., Slade, L. A., & Schmitt, N. (1986). The “Science of the Sophomore” revisited: From conjecture to empiricism. Academy of Management Review, 11, 191–207.
  29. Gould, S. J. (1981). The mismeasure of man. New York, NY: W. W. Norton.
  30. Guazzelli, A., Lin, W.-C., & Jena, T. (2012). PMML in action: Unleashing the power of open standards for data mining and predictive analytics (2nd ed.). Los Angeles, CA: CreateSpace Independent Publishing Platform.
  31. Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis (1st ed.). New York, NY: Academic Press.
  32. IMS. (2006). IMS question & test interoperability specification v2.0/v2.1. Retrieved March 2013, from http://www.imsglobal.org/question/index.html
  33. Janert, P. K. (2010). Data analysis with open source tools (1st ed.). Sebastopol, CA: O’Reilly Media.
  34. Kappiarukudil, K. J., & Ramesh, M. V. (2010). Real-time monitoring and detection of “heart attack” using wireless sensor networks. In 2010 Fourth International Conference on Sensor Technologies and Applications (SENSORCOMM) (pp. 632–636). doi: 10.1109/SENSORCOMM.2010.99
  35. Kery, M., & Schaub, M. (2011). Bayesian population analysis using WinBUGS: A hierarchical perspective (1st ed.). New York, NY: Academic Press.
  36. Kruschke, J. K. (2010). Doing Bayesian data analysis: A tutorial with R and BUGS (1st ed.). New York, NY: Academic Press.
  37. Landauer, T. K., Foltz, P. W., & Laham, D. (1998). An introduction to latent semantic analysis. Discourse Processes, 25(2–3), 259–284.
  38. Levy, R., Mislevy, R. J., & Behrens, J. T. (2011). MCMC in educational research. In S. Brooks, A. Gelman, G. Jones, & X. L. Meng (Eds.), Handbook of Markov chain Monte Carlo (pp. 531–546). Boca Raton, FL: Chapman & Hall/CRC.
  39. McKinney, W. (2012). Python for data analysis. Sebastopol, CA: O’Reilly Media.
  40. Mislevy, R. J., Behrens, J. T., DiCerbo, K. E., & Levy, R. (2012). Design and discovery in educational assessment: Evidence centered design, psychometrics, and data mining. Journal of Educational Data Mining, 4(1), 11–48.
  41. Mislevy, R. J., Corrigan, S., Oranje, A., DiCerbo, K. E., John, M., Bauer, M. I., et al. (2014). Psychometrics and game-based assessment. New York, NY: Institute of Play.
  42. Mislevy, R. J., Steinberg, L. S., & Almond, R. G. (2002). Design and analysis in task-based language assessment. Language Testing, 19(4), 477–496.
  43. Pearl, J. (1988). Probabilistic reasoning in intelligent systems: Networks of plausible inference. Morgan Kaufmann. Retrieved from http://books.google.com/books?hl=en&lr=&id=AvNID7LyMusC&oi=fnd&pg=PA1&dq=Pearl+bayesian+networks&ots=FY-OSfkwZ6&sig=3J1ZLPeMMUfZNG_k73CkHClSj7o
  44. Perkins, J. (2010). Python text processing with NLTK 2.0 cookbook. Birmingham, UK: Packt Publishing.
  45. Reeves, D. (2001). 101 questions and answers about standards, assessment, and accountability. Englewood, CO: Lead + Learn Press.
  46. Ricci, F., Rokach, L., Shapira, B., & Kantor, P. B. (Eds.). (2011). Recommender systems handbook. Berlin: Springer.
  47. Rossant, C. (2013). Learning IPython for interactive computing and data visualization. Birmingham, UK: Packt Publishing.
  48. Russell, M. A. (2011). Mining the social web: Analyzing data from Facebook, Twitter, LinkedIn, and other social media sites (1st ed.). Sebastopol, CA: O’Reilly Media.
  49. Salen, K. (2012). Seminar. Presented at the Educational Testing Service, Princeton, NJ.
  50. Salsburg, D. S. (1985). The religion of statistics as practiced in medical journals. The American Statistician, 39, 220–223.
  51. Seidel, E., & Deift, A. (2011). Data: A centuries-old revolution in science, Part II. Society for Industrial and Applied Mathematics News, 44(7). Retrieved from https://www.siam.org/news/news.php?id=1908
  52. Shute, V. J. (2011). Stealth assessment in computer-based games to support learning. In Computer games and instruction. Charlotte, NC: Information Age Publishers. Retrieved from http://myweb.fsu.edu/vshute/pdf/shute%20pres_h.pdf
  53. Shute, V. J., & Ventura, M. (2013). Measuring and supporting learning in games: Stealth assessment. Cambridge, MA: The MIT Press.
  54. Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46(5), 30–32.
  55. Smolan, R., & Erwitt, J. (2012). The human face of big data (1st ed.). Sausalito, CA: Against All Odds Productions.
  56. Tukey, J. W. (1977). Exploratory data analysis. Reading, MA: Addison-Wesley.
  57. Vaingast, S. (2009). Beginning Python visualization: Crafting visual transformation scripts (1st ed.). Berkeley, CA: Apress.
  58. Van der Linden, W. J., & Hambleton, R. K. (1996). Handbook of modern item response theory. Springer. Retrieved from http://books.google.com/books?hl=en&lr=&id=aytUuwl4ku0C&oi=fnd&pg=PR5&dq=hambleton+item+response+theory&ots=JXdX5GjwfM&sig=KKbXwfRqSFqzMLsU_2BTY-rhcFk
  59. Wolf, G. (2002, October 5). The data driven life. The New York Times Magazine, pp. 38–45.
  60. Wolf, G., Carmichael, A., & Kelly, K. (2010). The quantified self. TED. Retrieved from http://www.ted.com/talks/gary_wolf_the_quantified_self.html

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Pearson, Notre Dame, USA
