Using Instructional Design to Support Community Engagement in Clinical and Translational Research: a Design and Development Case
While community stakeholder engagement is becoming increasingly common in health care, operationalized training materials to support this learner population (community members) are scarce. Instructional design principles were used to create an Open Educational Resource (OER) to support the involvement of community stakeholders in health care research at a university health science center. Prior to the development of this project, a formal group, whose members named themselves Citizen Scientists (CSs), already existed to offer a lay perspective on clinical and translational research studies. These CSs are involved in a wide range of active committees within the university’s college of medicine. The challenge of this program, however, is that the CSs require training to engage in these activities (e.g., reviewing research proposals). This design and development research case outlines the instructional design processes, formative evaluation methods, and results from the creation of the OER. While the description of the instructional design processes can be useful for similar project implementations, the information on formative evaluation methods and results adds the following benefits: (a) helping community stakeholders analyze whether a project’s goals have been met, (b) identifying project aspects that could be improved, and (c) supporting other communities by providing a model for project evaluation in similar contexts with similar project goals.
Keywords: Instructional design · Citizen Scientist · Clinical and translational research · Health care · Open educational resource · Formative evaluation
Citizen Scientist Program
Increasingly, stakeholder engagement from community members is recognized as essential for the conduct of more meaningful research that can ultimately lead to more rapid uptake and use of research findings in diverse health-related settings. The University of Florida (UF) Clinical and Translational Science Institute (CTSI) employs a team of Citizen Scientists (CSs) to offer a lay perspective on active and proposed clinical and translational research studies. Although CSs have had an important role in fields such as archeology, astronomy, and natural history (Silvertown 2009), such an approach is still relatively new in the domain of clinical research (Domecq et al. 2014). Therefore, instructional resources specifically designed for this learner population are scarce, making the formal training (Morrison et al. 2010) of new members a challenge for institutions seeking to establish CS programs at their sites.
The task of contributing to the generation of research ideas and actively participating in research projects may sound straightforward but, in reality, it requires special training and knowledge about the following areas: (a) the impact CSs can have as community members; (b) the precautions that research groups must take in relation to how they conduct the research and how they recruit, inform, and treat research participants ethically; (c) the rights of research participants; (d) how research studies are designed, proposed, and funded; (e) how research makes its way from the clinical setting to the community through translational research; (f) the value of multiple stakeholders in the research process; (g) how culture plays a role in health care; (h) how biomedical data can be used to improve health care outcomes; and (i) how to understand and speak the language of research.
Design and Development Research Approach
Given the many aforementioned topics that CSs have to learn for successful participation in clinical and translational research, the problem required a flexible solution that enabled the CSs to learn the materials at different paces and times. Further, members of the UF CS program could be added on an ongoing basis, which creates a logistical problem for training new members: integrating introductory-level content into existing CS activities is repetitive and time-consuming for veteran CSs, and can be overwhelming for new CSs. Therefore, a technology-enhanced, self-paced instructional system was necessary to address this problem. The solution needed to be flexible enough to accommodate the many use cases of the curriculum and to support face-to-face engagement among the other CSs and researchers. This article provides a design and development research case, defined as a study focused on a product’s design, development, and formative evaluation that provides contextually rich lessons learned, and illustrates the process of creating technology-enhanced instructional materials to support the training of CSs (Richey et al. 2004). More specifically, this research case is representative of the Product and Tool Research category, which addresses each phase of the instructional design project as well as the rationale for its design and development and the tools used (Richey and Klein 2007).
Research Questions and Purpose
The research questions that guided this study were based on content, learning, and curriculum validity (Reeves 2011): (a) Do the learning materials and pedagogy implemented in the CS curriculum create the depth and breadth necessary to prepare CSs to collaborate in clinical research?, and (b) Does the format of the learning materials used in the CS curriculum deliver the content effectively?
Based on a mixed methods research approach (Johnson and Onwuegbuzie 2004) that used quantitative and qualitative data to analyze the context, the learners, and the quality of the learning materials developed, the purpose of this research was to explore whether the practical approaches described for this context are effective in addressing the instructional problems identified, thereby contributing to the body of knowledge related to design and development research (Richey and Klein 2007). The main advantage of this design and development, mixed methods research case is the complementary types of data offered to account for the instructional problem, the project goals, and the framework used for the design and development of the technology-enhanced instructional materials.
The project subscribed to the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) instructional design model (Morrison et al. 2010) for the most part; however, the formative evaluation was performed prior to the final implementation; that is, the development and formative evaluation of each module happened in a cyclical scheme until a satisfactory result was achieved. This Analysis, Design, Development, Evaluation, and Implementation (ADDEI) sequence was chosen because of time and budget constraints: given the project timeline, if implementation had been conducted first, more time and money would have been necessary to revise the video tutorials, assessments, and website, as well as the other platforms used to host the learning materials. Implementation was therefore conducted only in the final stage of the project, when all the learning materials had been designed, validated, and formatively evaluated.
The analysis stage of the project was implemented through meetings with the program facilitator, Subject Matter Experts (SMEs), and CSs, as well as observations of the CSs’ regular meetings. The onsite observations and individual interviews with CSs were particularly useful in guiding decisions on which media to use based on their affordances and the nature of the content to be covered.
Three CSs were interviewed using the interview protocol shown in Appendix 1. Two of the interviewees were female and one was male. The CSs included one full-time college student, and two senior citizens. The CSs’ tenure in the program ranged from 8 months to a little over a year. Motivations for joining the program included previous experiences with the health care system, an affinity to the institution, and the desire to be able to assist with research in general. While the CSs interviewed ranged in experience, the interviewees provided a wealth of knowledge pertaining to the challenges of learning about their roles in the CS program. These challenges included the complex web of health care funding sources, types and classifications of research, complex terminology and acronyms used in the health care system, and most importantly, how CSs fit in the process of clinical and translational research.
As for their learning preferences, all interviewees preferred to receive instructional information through video; however, one of them emphasized that there should be a balance between video tutorials and written materials because, in her words, “Video at times is really good, but solely video is like watching Netflix or TV vs. trying to learn.” Based on our learner and contextual analysis (Morrison et al. 2010), a multimedia learning environment seemed to be the optimal choice to meet the project goals and the learners’ general preferences. This decision is supported by the Cognitive Theory of Multimedia Learning (CTML), which states that the cognitive processes involved in learning can be enhanced when appropriate pictorial and verbal information are integrated with each other and with prior knowledge (Mayer 2005).
The project team was composed of CSs, an educational technology researcher/practitioner, an educational technology graduate student, a public health professional, and health care researchers who aided the identification of SMEs for the design and development of the instructional materials. The SMEs who collaborated on the project were nurses, physicians, a research navigator, a grant specialist, research coordinators, a social worker, and faculty members in the areas of ethics, research, biomedical informatics, and translational science.
The design process started with the definition of learning objectives, which were based on the knowledge that lay people need in order to become well-informed, engaged CSs in clinical and translational research. Learning objectives for each lesson were crafted by project staff and were based on the takeaways necessary to gain a basic understanding of each topic. These learning objectives were designed to prepare the CSs to collaborate in meaningful ways with researchers and, most importantly, to empower them to offer critical, insightful reviews on a variety of topics.
The curriculum was developed to support the engagement of community stakeholders in clinical and translational research; however, its development also adhered to the notion of involving multiple stakeholders from the start until the completion of the project. This early stakeholder engagement approach (Schwalbe 2015) was crucial to the instructional design and quality of the final product as CSs provided feedback on, validation, and incremental approval of, the instructional materials.
The design of the curriculum and supporting instructional materials was based on previous instructional practices used by the CTSI staff and veteran CSs, and on the levels of engagement new CSs may develop to become involved in research activities. The existing instructional materials used for in-class training as well as the insights provided by the face-to-face instructors of the program were helpful to the instructional design team to define the scope of the project (Schwalbe 2015), learning objectives, instructional materials, and evaluation instruments (Morrison et al. 2010).
The CS curriculum and supporting instructional materials represent an interface between teaching- and learner-centered approaches (Brown 2003). For example, the web-based instructional materials and activities are based on what new CSs need to learn as they become involved in clinical and translational research, with no built-in individualized assistance; however, the curriculum can be easily modified to integrate the web-based materials into classroom activities, which allows the instructor to use the materials according to individual needs and learning profiles (Brown 2003).
The curriculum comprises seven modules:
- Welcome and Orientation: provides a general introduction to the CS curriculum and the role of CSs in research, as well as the importance of maintaining confidentiality
- Research Ethics: presents information on elements necessary for ethical research, including the Institutional Review Board, historical breaches in research ethics, and the informed consent process
- Sponsored Research: familiarizes new CSs with the concept of clinical research, including how it is designed, funded, and presented in scientific publications
- Clinical and Translational Science: offers information on types of research studies and the roles CSs can have in clinical and translational research
- Stakeholder Engagement: describes examples of and reasons for stakeholder engagement, and illustrates how research can benefit from multiple perspectives (e.g., researchers, vendors, patients, and caregivers)
- Cultural Diversity in Research: addresses cultural competence and how CSs can ensure multiple cultures are acknowledged and respected in research projects
- Biomedical Informatics: features information on the role of data in transforming health care, including how big data can be used to improve health outcomes
To ensure that the content in the CS curriculum was compatible with the project’s goal and learners’ needs, a comprehensive process of creation, revision, validation, and formative evaluation was conducted. SMEs were approached by project staff and offered a description of the project, draft learning objectives for their topic, and parameters such as format, duration, and project timeline requirements. After agreeing to participate, SMEs proceeded in several ways, sometimes utilizing all three approaches: (1) scheduling a meeting with the project manager to further discuss expectations and needs, (2) arranging to present draft content in person to the CS group to ensure the content was in lay terms and “digestible” in the allotted time, and (3) setting a date to record the video presentation.
Once the SME was comfortable with the assignment, a draft PowerPoint slide set and a written transcript of the presentation were created and sent to the project manager. Revisions were made to the presentation materials to ensure the visual materials were consistent across presenters and that the content was written for a lay audience. This meant minimizing the use of jargon and abbreviations, explaining key terminology prior to its first use, and careful review by the project manager to ensure compliance and consistency. Revision of content prior to recording was necessary, as any reshooting would need to be scheduled among the video studio, the SME, and the project manager. Given the timeline for this project, reshoots were discouraged; in fact, only one SME reshot their video lesson after formative evaluation by CSs.
The videos were recorded in a professional recording studio with a “green screen,” and were guided by best practices for creating instructional videos and multimedia learning materials (Clark and Mayer 2016; Swarts 2012). The final videos superimposed the presentation slides in the background, to the left of the SME (see Fig. 3). This ensured a uniform look-and-feel among different topics and presenters. The SMEs presented their video content by reading the transcript of the presentation content from a teleprompter. SMEs were allowed as many takes as necessary to record the video tutorial, and editing of major verbal stumbles or reading miscues occurred in post-production. Edited videos were reviewed by project staff to ensure quality before being shown to the CSs for formative evaluation.
The assessment items and associated feedback messages were created by project staff based on the final SME video presentations, PowerPoint slides, and teleprompter scripts. Revisions to the assessments focused on ensuring items were in lay terms and aligned with the lesson’s learning objectives. Ongoing input from the CS program manager was crucial for this component, as the manager’s experience training this specific learner population offered vital insight into assessment content. Once assessment items were created and revised, they were sent to SMEs for further revision and validation. All assessment items were validated by SMEs prior to the formative evaluation. There was only one instance in which minor changes suggested by the SME could not be implemented before the formative evaluation, because they were received only a few hours before the session; the items were still examined by project staff for accuracy. After the formative evaluation, additional revisions to the video tutorials and assessment items were implemented as needed, and in the case of video tutorials, only if absolutely necessary. In their final format, assessment items incorporate a color-coded interactive feedback functionality created in Adobe Captivate to aid CSs in regulating their own learning (Butler and Winne 1995). The use of feedback in computer-based learning environments has been shown to be an effective learning aid when the feedback provides elaboration (Van der Kleij et al. 2015).
Formative evaluation sessions and respective sub-modules:
- 2.3, 2.4b, and 6.1
- 2.2, 3.3, and 4.1
- 3.2, 4.2, and 5.1
- 3.1, 7.1, and 7.2
One sub-module, 2.4, received substantial revisions after its first formative evaluation (2.4a) and was granted another formative evaluation (2.4b), which included revisions to question stems, distractors, and a new video tutorial by the same presenter. This was the only video tutorial that was re-filmed. The project team relied on descriptive statistics of Likert-scale responses, along with open-ended questions, to evaluate the assessment items and the quality of the video tutorials. The data analysis and graphics were produced in R, a free software environment for statistical computing and graphics.
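The project's own analysis was conducted in R; as a rough Python illustration of the kind of descriptive summary involved, the following sketch computes basic statistics for a set of hypothetical 5-point Likert ratings of a video tutorial (the ratings themselves are invented for the example):

```python
# Illustrative sketch only: the project used R, and these ratings are hypothetical.
# Summarize 5-point Likert ratings collected during one formative evaluation session.
from statistics import mean, median, stdev
from collections import Counter

ratings = [4, 5, 3, 4, 5, 4, 2, 5, 4, 4]  # hypothetical CS ratings (1 = poor, 5 = excellent)

summary = {
    "n": len(ratings),
    "mean": round(mean(ratings), 2),
    "median": median(ratings),
    "sd": round(stdev(ratings), 2),                     # sample standard deviation
    "counts": dict(sorted(Counter(ratings).items())),   # frequency per scale point
}
print(summary)
```

A summary like this, computed per video and per assessment item, is enough to spot materials that fall below a quality target and warrant revision.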
Once the assessment items and videos achieved the desired quality, they were included with supplemental learning materials (e.g., journal articles, tip sheets, and links to additional resources) on the CS program website. Within the seven modules are 30 lessons, half of which are didactic. As noted, the didactic lessons each include thought questions, learning objectives, a video tutorial, a practice assessment with elaboration feedback, and in some cases, supplemental materials and video clips with tips shared by veteran CSs. The remaining lessons contain additional training information including case study videos, animated videos, and video interviews offering insights from both CSs and researchers who have engaged CSs.
All raw instructional materials created throughout the project are available as an Open Educational Resource (OER), meaning they are available to anyone upon request (Johnstone 2005), which can facilitate customization by other institutions implementing a CS program. Furthermore, the availability of the instructional materials as an OER reflects the institutional goals of supporting socially responsible practices in clinical and translational research (Reeves 2000, 2011), which include the involvement of CSs and potential engagement with other institutions and stakeholders with an interest in the CS program and curriculum. This research is based on practices that connect the learning, community stakeholder engagement, and clinical research domains, with the added benefit of outreach impact supported by its OER nature.
Internal consistency reliability was α = .885 for the assessment score and α = .863 for the video score. These high reliability coefficients support the use of composite scores in evaluating the instructional materials.
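The reported coefficients are Cronbach's alpha values from the project's own data. As an illustration of how such a coefficient is computed, the following Python sketch implements the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), on an invented response matrix (the project's analysis was done in R):

```python
# Illustrative sketch only: the response matrix below is hypothetical, not project data.
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of columns: one list of scores per item,
    aligned so that index i in every column belongs to the same respondent."""
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]        # per-respondent total scores
    item_variance = sum(pvariance(col) for col in items)
    return (k / (k - 1)) * (1 - item_variance / pvariance(totals))

# Four hypothetical items rated by five respondents on a 1-5 scale
items = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [5, 5, 2, 4, 4],
    [4, 5, 3, 4, 5],
]
print(round(cronbach_alpha(items), 3))
```

Values approaching 1 indicate that the items covary strongly, which justifies summing them into a single composite score as was done here.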
Research question 1: Do the learning materials and pedagogy implemented in the CS curriculum create the depth and breadth necessary to prepare CSs to collaborate in clinical research?
Sample recognition and knowledge-transfer questions
Which of the following would not be classified as intellectual property?
a. A researcher’s hypothesis
b. A researcher’s dataset
c. A researcher’s idea
d. A researcher’s last name
Please read the following scenarios and choose the most appropriate response for each question based on what you have learned about intellectual property and confidentiality.
Dr. Samuel Parker has completed a research study in which he and his team have tested a surgical process that they hypothesized would provide about 75% relief to most people suffering from a certain kind of chronic pain. As a Citizen Scientist, you worked on the grant proposal with Dr. Parker and members of his research team. Because of this, you are included in an e-mail message which states that the preliminary results of the study indicate that this technique does work.
You are very excited about the results of the study and you forward the e-mail to your sister, who suffers from the same form of chronic pain so that she can get the surgery as quickly as possible.
Is it acceptable for you to share this information with your sister?
a. As a Citizen Scientist and a member of the research team, you have the right to share this information.
b. It is acceptable to share this information with your sister because she is a relative.
c. All preliminary results are considered intellectual property and should not be disclosed.
d. Patients in urgent need of this procedure should have knowledge of and access to this information as soon as possible, so you should share it.
Even though the mean percentage of correct responses was less than 60% for only one lesson, sub-module 2.4, several questions in different lessons (Appendix 4) were also below this threshold, making additional revisions necessary for most lessons. These individual items were checked for clarity of language, appropriate stems, believable distractors, and clear instructions.
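The screening step described above amounts to flagging every item whose mean percentage of correct responses falls below the 60% criterion. A minimal Python sketch of that check follows; the item identifiers and scores are hypothetical, and only the 60% threshold comes from the text:

```python
# Illustrative sketch only: item IDs and per-item scores below are hypothetical;
# the 60% revision criterion is the threshold described in the text.
def flag_items(percent_correct, threshold=60.0):
    """Return the IDs of items whose mean percentage of correct
    responses falls below the revision threshold."""
    return [item for item, pct in percent_correct.items() if pct < threshold]

scores = {
    "2.4-Q1": 45.0,
    "2.4-Q2": 72.0,
    "3.2-Q1": 58.0,
    "5.1-Q3": 90.0,
}
print(flag_items(scores))  # the 45.0 and 58.0 items are flagged
```

Flagged items would then be reviewed for clarity of language, appropriate stems, believable distractors, and clear instructions, as described above.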
Research question 2: Does the format of the learning materials used in the CS curriculum deliver the content effectively?
Research questions and answers
a) Do the learning materials and pedagogy implemented in the CS curriculum create the depth and breadth necessary to prepare CSs to collaborate in clinical research?
The elaboration of video tutorials and the validation of assessment items by SMEs provided the necessary evidence that the curriculum would give CSs the information they need to become sufficiently knowledgeable about a given area to participate productively as stakeholders in clinical research. The cyclical revision of assessment items based on set performance goals was crucial to ensure that CSs would achieve at least a minimal understanding of the content presented, including the ability to transfer factual information to the more ill-defined situations introduced through the scenarios used in the assessment items. Some of the content intentionally overlaps to meet CSs where they are. This approach to facilitating the learning objectives offers a more customized learning experience to current and future CSs.
b) Does the format of the learning materials used in the CS curriculum deliver the content effectively?
The different learning materials (e.g., video tutorials, assessment items, and face-to-face support from the instructor) were all carefully crafted to represent the types of knowledge and skills that CSs need to engage in activities related to clinical and translational research. The materials were viewed favorably overall when tested in the formative evaluation sessions.
The instructional design processes described in this article were successful progressions toward the final technology-enhanced learning solution: a self-paced, online OER that utilizes video- and assessment-based instructional packages across seven distinct topics. During this process, many challenges were encountered in designing and developing the instructional materials. First and foremost, the modified ADDIE model (ADDEI) employed during this project proved to be an effective approach to address the constraints imposed by the allocated budget, tight deadlines, and busy schedules of all involved; it should therefore be considered a viable option for other projects facing similar constraints.
Additionally, it is imperative to develop clear guidelines and procedures for the development of the video tutorials. The SMEs, who also served as video tutorial presenters, needed clear instructions to focus their content, guidance in how to write their teleprompter scripts, direction on how to design their slides, guidelines for best practices in recording, and often multiple takes to finalize their recording. It is equally important to have strong SME buy-in and commitment (Van Rooij 2010) to both the process and the product to ensure the highest level of quality possible. For instance, at first, many of the SMEs did not want to write a script for their video tutorials. However, once the logistics of the video recording experience were described and project staff offered to assist in writing the content, SMEs were often happy to make edits to a draft script rather than write one in its entirety. This process required clear and concise communication, patience, and staff dedication to coordinate multiple video tutorial slides and scripts concurrently.
The design of the assessment items for each didactic lesson should be unmistakably aligned with the learning objectives, presentation slides, and teleprompter script (Martone and Sireci 2009). It was critically important to use language consistent with the video tutorials in the assessment items to avoid confusion and unnecessary extraneous cognitive load among learners during the practice assessment (Paas et al. 2003). In several cases, the instructional design team had to “wordsmith” the assessment stems and distractors to ensure consistency and clarity after the formative evaluation sessions with the CSs. Another challenge was creating consistent and effective slides with appropriate pictures, text, and animation effects that facilitate learning. SMEs without instructional design training tend to use unnecessary animations, redundant text, and decorative pictures, which can actually hinder learning outcomes (Morrison et al. 2010).
A final consideration is the formative evaluation process, instruments, and instructions to the target learners involved in the procedures. Initially, the formative evaluation sessions were open for CS comments about the video tutorials and assessment items. While some of the open conversation was helpful, the project team quickly learned that the open discussion format often led to irrelevant topics and unhelpful points being discussed. It was later decided to have the CSs provide all of their feedback on the videos and assessments in paper form only. This change in process resulted in more focused formative evaluation data. While the lessons discussed here are not exhaustive, they do provide contextually rich guidance for others to consider in the implementation of similar projects using similar processes. Note that as this project was completed within a specific context, these lessons may be difficult to generalize.
Current Status and Next Steps
The final step for the CS Instructional Design Project included the creation of a detailed instructor guide to support the adoption of the curriculum by other CS groups. This guide features additional resources, discussion prompts that can be used in online or face-to-face instruction, and suggestions on how to integrate this content into existing institutional training resources. The instructor guide will be used to implement the curriculum with a small cohort of untrained CSs at UF, where the curriculum content has been ported into a Learning Management System (LMS) that will allow for a customized experience with clear performance metrics. The organization of the curriculum in an LMS was important to ensure the quality of the training, provide flexibility for instructors to incorporate additional materials, allow instructors to assess learning performance, provide a platform that learners can revisit as needed, and give learners the opportunity to regulate their own learning (Vovides et al. 2007). Following implementation of the curriculum with this cohort, a summative evaluation is planned. It will include individual interviews with members of the cohort, evaluation of their performance on the assessment items, and their overall impressions of the quality of the video tutorials and learning materials. The similarity of these measures to those used during the formative evaluation will allow the quality and suitability of the curriculum for its target audience to be assessed.
The involvement of lay people in clinical and translational research has proven to be an essential (Domecq et al. 2014) and cost-effective measure (Bonney et al. 2009), not only to collaborate with researchers who benefit from the unique perspectives that CSs bring to the research process, but especially for the body of knowledge that is generated based on unprecedented levels of contributions. The instructional materials developed specifically for new CSs can contribute to a wide adoption of translational science (Zerhouni 2005), making it a closer reality to community stakeholders and increasing the scope of scientific knowledge in meaningful ways. Although the design and development research case presented here is, to some degree, context-bound due to its application to a particular project (Richey and Klein 2007), it provides valuable insights on how instructional design research and practice can be used to support the development of robust, technology-enhanced curriculum materials to facilitate learning.
Research reported in this publication was supported in part by the OneFlorida Clinical Data Network, funded by the Patient-Centered Outcomes Research Institute #CDRN-1501-26692, in part by the OneFlorida Cancer Control Alliance, funded by the Florida Department of Health’s James and Esther King Biomedical Research Program #4KB16, and in part by the University of Florida Clinical and Translational Science Institute, which is supported in part by the NIH National Center for Advancing Translational Sciences under award number UL1TR001427. The authors would like to acknowledge the effort and assistance provided by the University of Florida Citizen Scientist Program members: Anastasia Anderson, Ravi Bhosale, Shirley Bloodworth, Quintina Crawford, Christy Evans, Myrtle Graham, Claudia Harris, Nathan Hilton, Janelle Johnson, Bill Larsen, Carlos Maeztu, and Nadine Zemon.
Compliance with Ethical Standards
The content is solely the responsibility of the authors and does not necessarily represent the official views of the Patient-Centered Outcomes Research Institute (PCORI), its Board of Governors or Methodology, the OneFlorida Clinical Research Consortium, the University of Florida’s Clinical and Translational Science Institute, the Florida Department of Health, or the National Institutes of Health.
- Brown, K. L. (2003). From teacher-centered to learner-centered curriculum: improving learning in diverse classrooms. Education, 124(1), 49.
- Domecq, J. P., Prutsky, G., Elraiyah, T., Wang, Z., Nabhan, M., Shippee, N., Pablo Brito, J., Boehmer, K., Hasan, R., Firwana, B., Erwin, P., Eton, D., Sloan, J., Montori, V., Asi, N., Abu Dabrh, A. M., & Murad, M. H. (2014). Patient engagement in research: a systematic review. BMC Health Services Research, 14, 89.
- Gronlund, N. E. (2004). Writing instructional objectives for teaching and assessment. Upper Saddle River: Pearson/Merrill/Prentice Hall.
- Johnstone, S. M. (2005). Open educational resources serve the world. Educause Quarterly, 28(3), 15.
- Mayer, R. E. (Ed.). (2005). The Cambridge handbook of multimedia learning. Cambridge: Cambridge University Press.
- Morrison, G. R., Ross, S. M., Kemp, J. E., & Kalman, H. (2010). Designing effective instruction. Hoboken: John Wiley & Sons.
- Reeves, T. C. (2000). Socially responsible educational technology research. Educational Technology, 40(6), 19–28.
- Reeves, T. C. (2011). Can educational research be both rigorous and relevant? Educational Designer, 1(4), 1–24.
- Richey, R. C., & Klein, J. D. (2007). Design and development research: methods, strategies, and issues. Mahwah: Lawrence Erlbaum Associates.
- Richey, R. C., Klein, J. D., & Nelson, W. A. (2004). Developmental research: studies of instructional design and development. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology (pp. 1099–1130). Mahwah: Lawrence Erlbaum Associates.
- Schwalbe, K. (2015). Information technology project management. Cengage Learning.
- Swarts, J. (2012). New modes of help: best practices for instructional video. Technical Communication, 59(3), 195–206.
- Zerhouni, E. A. (2005). Translational and clinical science—time for a new vision. Retrieved from http://www.nejm.org/nejmspecial?query=cmgtl&utm_source=nejm&utm_medium=cm&utm_campaign=notablearticles16.