Abstract
Contemporary research practice largely obscures formative research outcomes from public view. This exclusion, though often unintentional, holds even when the research is publicly funded. As a result, the public must search scholarly channels, such as academic journals, for research information that is not written for general comprehension. In essence, a breach in information transmission separates researchers from society at large. A similar communication gap exists in education between students and instructors: instructors rely on traditional assessment activities to measure student performance and rarely see the study efforts behind them. Consequently, important formative evidence goes largely unnoticed. Researchers are therefore exploring smart learning processes that exploit opportunities triggered by environmental affordances, personal needs, and professional expectations, and that mitigate various assessment difficulties.
This presentation introduces Open Research in the context of Smart Learning. First, it discusses the advantages of opening the research process to an authorized public, including fellow students, educators, and policymakers, arguing that greater accessibility can promote research growth and integrity. Second, it uses observational study methods to illustrate how students and educators can conduct their own experiments on continuously arriving data. This second section introduces three matching techniques (Coarsened Exact Matching, Mahalanobis Distance Matching, and Propensity Score Matching) and three imbalance metrics (the L1 vector norm, Average Mahalanobis Imbalance, and Difference in Means) for assessing the level of imbalance within matched sample datasets. It also explains key traits of observational studies that are relevant to smart learning environments, comparing them with the corresponding traits of blocked randomized experiments. Ultimately, the presentation promotes Smart Learning Environments that incorporate automated tools for the opportunistic capture, analysis, and remediation of formative study processes. Such environments can enable students to ethically share and receive study data that help them conduct personal observational studies on individual study-related questions. Notably, the presentation proposes a novel idea: connecting Open Research with persistent observational study methods. It explores how open research can support adaptive and self-regulated learning, and advocates innovative research practices that can produce better and smarter learning.
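To make the matching-and-imbalance idea concrete, the following is a minimal, illustrative sketch of one of the three techniques named above, Coarsened Exact Matching, together with a Difference-in-Means imbalance check. It is not the authors' implementation; the data, bin widths, and function names are assumptions chosen for the example. Units are tuples of covariate values; CEM coarsens each covariate into bins and retains only strata that contain both treated and control units.

```python
from collections import defaultdict


def coarsen(value, width):
    """Bin a continuous covariate value into a stratum of the given width."""
    return int(value // width)


def cem_match(treated, control, widths):
    """Coarsened Exact Matching: keep units whose coarsened covariate
    signature occurs in both the treated and the control group."""
    def signature(unit):
        return tuple(coarsen(v, w) for v, w in zip(unit, widths))

    strata = defaultdict(lambda: ([], []))
    for unit in treated:
        strata[signature(unit)][0].append(unit)
    for unit in control:
        strata[signature(unit)][1].append(unit)

    matched_treated, matched_control = [], []
    for t_units, c_units in strata.values():
        if t_units and c_units:  # stratum has units from both groups
            matched_treated.extend(t_units)
            matched_control.extend(c_units)
    return matched_treated, matched_control


def diff_in_means(treated, control, dim=0):
    """Difference-in-Means imbalance metric for one covariate dimension."""
    mean = lambda group: sum(u[dim] for u in group) / len(group)
    return mean(treated) - mean(control)


# Hypothetical one-covariate example: one treated outlier (9.0) has no
# control counterpart, so CEM prunes it and imbalance shrinks.
treated = [(1.0,), (2.0,), (9.0,)]
control = [(1.2,), (2.1,), (0.5,)]
mt, mc = cem_match(treated, control, widths=(1.0,))
print(abs(diff_in_means(treated, control)))  # imbalance before matching
print(abs(diff_in_means(mt, mc)))            # imbalance after matching
```

The same before/after comparison applies to the other metrics mentioned in the abstract (L1 norm, Average Mahalanobis Imbalance); CEM is shown here because its coarsen-then-exact-match logic is the simplest to express without a statistics library.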
Copyright information
© 2018 Springer Nature Singapore Pte Ltd.
Cite this paper
Kumar, V.S., Fraser, S., Boulanger, D. (2018). Open Research and Observational Study for 21st Century Learning. In: Chang, M., et al. Challenges and Solutions in Smart Learning. Lecture Notes in Educational Technology. Springer, Singapore. https://doi.org/10.1007/978-981-10-8743-1_17
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-8742-4
Online ISBN: 978-981-10-8743-1
eBook Packages: Education (R0)