
Collection-Based Education by Distance and Face to Face: Learning Outcomes and Academic Dishonesty

  • Andrea Lucky
  • Marc Branham
  • Rachel Atchison
Open Access

Abstract

Assembling and curating specimen collections is a valuable educational exercise that integrates subject-specific skills such as field collection, curation, identification, organization, and interpretation of relationships. Collection projects have been used primarily in face-to-face classes, but they can be readily adapted for distance education. The primary challenges to using collection projects in distance education center on two concerns: (1) whether distance students learn as much as their face-to-face peers and (2) whether academic dishonesty occurs more often in distance education than face to face. This study addressed both concerns by assessing learning outcomes in two entomology courses with both face-to-face and distance sections and evaluating the frequency of specimen-based plagiarism (submitting specimens collected by someone else). An ungraded survey testing students’ insect identification knowledge found equivalent learning outcomes in face-to-face and distance classes. Insect collections were monitored for plagiarized (resubmitted or purchased) specimens in a 5-year “mark-release-recapture” investigation. Academic dishonesty was detected in fewer than 2% of collections; cheating was more than 12 times more likely in distance than in face-to-face classes. This study’s findings raise the possibility that distance learning assessments can be artificially inflated by cheating, suggesting that evaluations of distance learning should be considered in light of academic dishonesty. These results highlight the benefits and challenges of collections as teaching tools in distance education and underscore the need for instructors to be vigilant about academic integrity.

Keywords

Academic honesty · Plagiarism · Distance education · Learning outcomes · Educational assessment · Insect identification

Introduction

As higher education increasingly embraces distance education, one of the greatest challenges in developing effective STEM courses is creating laboratory experiences suited to distance delivery. Abundant evidence demonstrates that, under the right conditions, virtual classes and laboratories that employ computer simulations of real-world learning situations (augmented reality, virtual reality, virtual worlds, and computer graphics) can be as effective as face-to-face instruction (DeJong et al. 2013; Nguyen 2015; Potkonjak et al. 2016). However, the need for active, authentic, and student-driven experiences in science instruction is widely recognized by scientists, educators, and policymakers as a national educational priority (Brewer and Smith 2011; Beier et al. 2018; Austin 2018). Active learning, in particular, has been shown to increase educational performance outcomes (Gardner and Belland 2012; Freeman et al. 2014). As a result, hands-on science activities that can be deployed in distance-delivered science courses are especially valuable. Specimen collection and curation projects are examples of assignments that can engage face-to-face and distance learning students in authentic and integrative project-based learning, but to date, little has been published about the benefits and challenges of working with collections in either delivery mode.

The Value of Collection Projects

Collection projects are valuable for learning hands-on skills while integrating theory and practice. Students learn not only about the subjects themselves (insects, fungi, plants, rocks, etc.), but about the environmental context in which they occur, along with how to preserve, identify, and record data associated with each one (Sandvoss et al. 2003). To successfully complete a collection, students must also engage with relevant literature to understand the material they have collected. Because collections are place specific, they permit all students, even distance students, to engage meaningfully with their local environment. Traditionally, collections have been the backbone of hands-on curricula in specific “-ologies” (entomology, mycology, botany, geology, etc.), where they serve as formative assessments that help students engage in integrative learning and allow instructors to identify the degree to which mastery has been achieved (Goubeaud 2010). Students of natural sciences today engage extensively with digital technology and data management as well as with actual specimens. In a digital and connected world, specimen collections serve as bridges between disciplines and across time, forming the foundations for research that may reconstruct the past, characterize the present, or predict the future of biodiversity in a changing world (Suarez and Tsutsui 2004; Cook et al. 2014).

Collections in Context: Distance Education

Collection-based learning has the potential to meet a growing need for hands-on experiences in distance education. In fall 2014 alone, 5.8 million students were enrolled in distance education in the USA, with 2.85 million of these students taking courses exclusively by distance. As these numbers continue to rise, the number of students taking only face-to-face classes diminishes each year (Allen et al. 2016). The rise in popularity and availability of distance education has also led to concern about the equivalence of face-to-face and distance learning, including the prevalence of academic dishonesty.

Many studies have demonstrated that distance education can be as effective as face-to-face classes (USDE 2010; Lack 2013; Wu 2015), but as in all teaching, the specifics matter. How a class is developed and managed for the distance learning environment and how instructors interact with students can play a significant role in determining educational outcomes. Many educators, even those who teach distance courses, have limited experience as distance students themselves, and as such may have limited insight into the distance learning experience. Although this will change as the field grows, currently, many instructors may be unaware of aspects of student behavior that are more obvious face to face, such as learning trajectories and cheating over the course of the semester. This has, in part, led to a widespread perception that academic misconduct, in the form of plagiarism or cheating, is more prevalent in distance education (Ubell 2017).

Unique Challenges

Collection-based education has the potential to be an excellent match for distance curricula but has not yet been widely integrated into distance learning. This may have to do with the unique challenges associated with collection-based learning. Resources such as microscopes, preservation supplies, and reference materials may be unevenly available to students. Collection submission can involve direct mailing to the instructor, but this adds cost and the possibility of damage or loss. An alternative, a photo-based collection, requires photographic equipment and enough technical savvy to upload images to a digital platform. These challenges can be overcome when instructors collaborate with students to find appropriate solutions.

Another challenge in collection-based education is that plagiarism, i.e., students submitting work that is not their own, can be difficult to detect. Cheating poses a particularly tricky problem when performance metrics, such as test or assignment scores, are used to compare learning across teaching modes. If cheating in distance education is more common than face to face, as has been suggested by multiple studies (Rowe 2004; Lanier 2006; Bell and Federman 2013; but also see Grijalva et al. 2006; Walker 2010), then academic dishonesty may quietly confound learning metrics.

Enforcement of high standards of integrity is not trivial in an online environment (Singh and Hurley 2017), just as it has long been a challenge in face-to-face classrooms, as evidenced by decade-long studies (Haines et al. 1986; Diekhoff et al. 1996; Vandehey et al. 2007; Stiles et al. 2018). Academic dishonesty prevention and detection measures have primarily focused on written assignments (plagiarism) and test-taking (prohibited assistance or materials). For example, textual similarity in written assignments can be detected by programs such as Turnitin (www.Turnitin.com; Oakland, CA, USA), and a variety of proctoring programs are now available to monitor students during testing. In contrast, detecting cheating in a collection requires a different approach than in an essay or exam. It can be difficult to assess if the contents of a collection were collected, curated, and organized by a particular student and not the work of someone else, for example, a previous graduate of the course or a vendor of insect specimens.

The extent of cheating in general—in distance and in face-to-face classes—is poorly understood, in part because detected and self-reported rates of cheating may not accurately reflect actual cheating frequency. However, we do know that cheating in college is not uncommon. Although empirical research is limited, more than 60% of university undergraduate and 40% of graduate students self-report having cheated on tests or written assignments (ICAI 2017, https://academicintegrity.org/statistics; Park 2003). A limited amount of research has evaluated differences in cheating and learning in distance versus face-to-face settings, and many of these studies do not control for course-based differences such as instructor, content, student demographics, or level of instruction (Park 2003; Bell and Federman 2013). Especially needed in this field are comparative evaluations where these factors are kept consistent across distance and face-to-face delivery platforms.

This study investigated how student learning differed in face-to-face and distance entomology courses. Because of widespread concerns about the value and integrity of distance classes, we specifically asked (1) whether distance students’ knowledge gain was equivalent to that of face-to-face students and (2) whether academic dishonesty is more prevalent in distance classes. Our ultimate goal was to improve learning outcomes and reduce cheating in both teaching modalities. We focused on these questions in the context of specimen-based entomology education because being able to collect, curate, and identify insects is fundamental to this field. In doing so, this study contributes to the practice of collection-based education and, more generally, assessment in distance education.

Methods

This study took a two-part approach to examining differences in effectiveness of classes delivered by distance and face to face. First, we assessed learning gains in face-to-face and distance sections of an advanced insect classification course. Second, we assessed the prevalence of cheating by examining insect collections in classes where the project constituted an integral part of the curriculum. All analyses were performed using R statistical software (R Core Team 2018).

Location and Context

The Entomology and Nematology Department at the University of Florida houses one of the largest and most highly regarded entomology education programs in the world (CWUR 2017) with a well-developed distance education program. As a result, many course offerings are delivered both face to face and by distance. We focused on two classes delivered regularly in both modalities to assess the influence of instructional delivery in courses where other variables remained consistent (e.g., course content). Student enrollment ranged from 5 to 86 students per face-to-face or distance section per course, and the same suite of two to three instructors regularly taught each class during the fall, spring, and summer semesters that each was offered.

Student Population

Participants were students enrolled in two courses at the University of Florida (UF): the introductory Principles of Entomology (ENY 3005/5006) and advanced Insect Classification (ENY 4161/6166). Completing the introductory course is a prerequisite for enrollment in the advanced course, and successful completion of both is a requirement for the undergraduate major or minor in Entomology, as well as for the degree of MS or PhD in Entomology in the College of Agricultural and Life Sciences at UF. Both courses are offered face to face, by distance delivery, or by both methods during every semester (fall, spring, and summer). Students in face-to-face classes were local undergraduate and graduate students enrolled in degree programs at UF. Students in distance sections were a mix of degree-seeking UF students on campus and at remote campuses, students in degree programs at other universities, and non-degree seeking students. The collection requirements for distance students were identical to those for face-to-face students (i.e., number of correctly identified specimens and associated data); distance students’ collections were either mailed or directly delivered to instructors.

Learning Gain Comparisons

To compare learning gains in distance and face-to-face sections of a single course (Insect Classification), we assessed students’ pre- and post-course knowledge of insect identification with an ungraded 50-question photographic survey of insect orders, suborders, and families (Table 1; Appendix 1—Classification Survey). One of two regular instructors (AL or MB) administered this survey to students in each of 13 iterations of this course between fall 2014 and fall 2017. The survey presented the same questions in the same order and was completed either online and unproctored (seven iterations, by distance) or on paper under an instructor’s supervision (six iterations, face to face).
Table 1

Number of course iterations of the advanced course, Insect Classification, with the number of students who completed pre- and post-course insect identification surveys

|              | Number of course iterations | Number of students |
| ------------ | --------------------------- | ------------------ |
| Distance     | 7                           | 81                 |
| Face to face | 6                           | 101                |
| Total        | 13                          | 182                |

We used a linear mixed effects model to determine whether delivery mode, instructor, and timing (pre- vs post-course) had individual and/or interactive effects on survey performance. Because pre- and post-course scores are for the same individuals and students come with inherent variation, we included individual student (anonymized) in the model as a random factor. t tests were then performed to examine interactions between test time (pre- versus post-course) and delivery mode on assessment scores.
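The model structure described above can be written out explicitly; the following is a sketch of a standard random-intercept specification consistent with this description, not necessarily the authors' exact formulation:

```latex
% Score y of student i at survey time j (pre = 0, post = 1):
y_{ij} = \beta_0
       + \beta_1\,\mathrm{Time}_{ij}
       + \beta_2\,\mathrm{Delivery}_{i}
       + \beta_3\,\mathrm{Instructor}_{i}
       + \beta_4\,(\mathrm{Time}\times\mathrm{Delivery})_{ij}
       + u_i + \varepsilon_{ij},
\qquad u_i \sim \mathcal{N}(0,\sigma_u^2),\quad
\varepsilon_{ij} \sim \mathcal{N}(0,\sigma^2)
```

Here \(u_i\) is the per-student random intercept, which absorbs the repeated-measures correlation between each student's pre- and post-course scores.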

Assessing Cheating

A total of 1266 insect collections submitted in introductory (983) and advanced (283) entomology courses were examined for specimen-based plagiarism as part of routine course assessment between spring 2013 and fall 2017 (Table 2). This corresponds to 40 semester-long classes, 15 of which were distance-delivered (267 students) and 25 delivered face to face (999 students).
Table 2

Summary of the number of collection-based courses taught over 5 years from 2013 to 2017, including total number of students per course by delivery method, number of collection-associated cheating incidents in parentheses, and percent cheating rounded to two significant digits

| Course level | Delivery method | No. of times taught | No. of students enrolled (cheating) | % Cheating |
| ------------ | --------------- | ------------------- | ----------------------------------- | ---------- |
| Advanced     | Distance        | 7                   | 82 (5)                              | 6.1        |
|              | Face to face    | 10                  | 201 (2)                             | 1.0        |
|              | Total           | 17                  | 283 (7)                             | 2.5        |
| Introductory | Distance        | 8                   | 185 (11)                            | 6.0        |
|              | Face to face    | 15                  | 798 (3)                             | 0.4        |
|              | Total           | 23                  | 983 (14)                            | 1.4        |
| Total        | Distance        | 15                  | 267 (16)                            | 6.0        |
|              | Face to face    | 25                  | 999 (5)                             | 0.5        |
|              | All             | 40                  | 1266 (21)                           | 2.0        |
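The pooled percentages in Table 2, and the odds ratio later reported from the logistic regression, can be recovered directly from the raw counts; a quick arithmetic check in Python (variable names are ours):

```python
# Cheating counts and enrollments pooled across courses, from Table 2.
cheat_dist, n_dist = 16, 267   # distance sections
cheat_f2f, n_f2f = 5, 999      # face-to-face sections

pct_dist = 100 * cheat_dist / n_dist   # ~6.0% of distance collections
pct_f2f = 100 * cheat_f2f / n_f2f      # ~0.5% of face-to-face collections

# Odds ratio: odds of a plagiarized collection by distance vs face to face.
odds_dist = cheat_dist / (n_dist - cheat_dist)
odds_f2f = cheat_f2f / (n_f2f - cheat_f2f)
odds_ratio = odds_dist / odds_f2f      # ~12.7, matching the fitted model
```

The raw-count odds ratio agrees with the model-based estimate because delivery mode ends up as the only predictor in the final regression.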

To ensure that insect specimens were not being “recycled” from previously submitted collections, we used a mark-release-recapture approach. Each semester over 5 years, all insect specimens submitted for a grade were marked with invisible ink, then released back to students. All specimens in the UF Entomology Teaching Collection (more than 50,000 insects) were similarly marked. Any specimens recaptured in student collections with our distinctive marking were thus unmistakably recognizable as previously turned in for a grade (prohibited in both classes) or purloined from the teaching collection.

Mark

Specimens were marked with a UV-detectable invisible ink (Sirchie Co., Youngsville, NC) by dabbing one droplet of ink on the dorsum of each specimen with a fine paintbrush. This liquid dried to an invisible film that was undetectable under normal lighting.

Release

After being graded and marked, insect collections were returned to students who wanted to keep them; donated collections were incorporated into the UF Entomology Teaching Collection or used for other education and outreach activities.

Recapture

Each collection submitted for a grade was routinely checked for marked specimens by examining all insects under short-wave UV light in an otherwise darkened room. Scanning collections this way required good organization within and across classes but produced straightforward results; specimens that had been marked could be identified unambiguously. Each semester, the number of collections with recycled specimens was recorded, along with the number of specimens plagiarized in each collection.

Contract cheating was harder to identify, but we searched for it by first looking for obvious errors in identification or curation. Red flags included collection localities outside of species’ known ranges, misidentification of common species, and lack of supporting data such as field notes or references used for identification. We also familiarized ourselves with and screened for specific suites of specimens and label styles used by individual vendors. Once a collection was identified by these markers, further inspection often revealed obviously falsified data.

Collections with suspicious collection records were examined by entomologists associated with the University of Florida Entomology and Nematology Department, the Florida State Collection of Arthropods (FSCA) at the Florida Department of Agriculture and Consumer Services Division of Plant Industry (FDACS-DPI), and the US Department of Agriculture Agricultural Research Service (USDA-ARS). Expert knowledge of specimen identification and geographic ranges enabled us to determine likely cases of falsified records—in particular, specimens that were labelled as having been collected far outside of their geographic ranges. Students who were identified as having plagiarized or falsified material in a collection received a grade of zero on the assignment and were reported to the Dean of Students Office. These students were free to keep their (marked) specimens but were required to remove collection labels to prevent circulation of false data.

Statistical Analysis of Cheating

To determine if specimen-based cheating was more prevalent in distance than in face-to-face classes and whether there were differences in the frequency of cheating in the introductory or advanced course, we fit our data to a binomial logistic regression model that incorporated class (introductory vs advanced) and delivery method (face to face vs distance). Logistic regression allowed us to predict the probability for a categorical response (cheater versus non-cheater). To do this, we used a binomial error distribution, where the response variable is given as a two-column integer matrix: the first column contained the number of cheaters and the second the number of non-cheaters. An analysis of variance (ANOVA) with a chi-squared test statistic was used to select the model of best fit, i.e., which combination of predictor variable(s) best explained the data. Significance indicated that our best fit model better explained the data than the null model. We also calculated goodness-of-fit measures (Hosmer and Lemeshow, Cox and Snell, and Nagelkerke R2) as additional support of the model fit as assessed by ANOVA. The odds ratio from logistic regression indicated how each predictor influenced incidence of cheating.

Results

Learning

Comparison of insect identification survey scores by delivery method and instructor revealed that neither factor alone explained assessment score: pre- and post-course assessment scores were equivalent in face-to-face and distance classes (lmer: t = 0.004, df = 260.48, p = 0.997) and between instructors (lmer: t = 1.352, df = 180.54, p = 0.178). Time was the only individually significant explanatory variable. As expected, post-course scores were significantly higher than pre-course scores (lmer: t = −26.596, df = 182.64, p < 0.001; Fig. 1), indicating that student knowledge improved over the course of the semester. While post-course scores across delivery methods were equivalent (t = −0.62387, df = 157.05, p = 0.5336), overall learning gains were significantly greater in face-to-face sections because pre-course scores of distance students were, on average, 4.8884 points (approximately 12%) higher than those of students in face-to-face classes (t = −4.0109, df = 170.51, p < 0.001). In other words, the average score of students in face-to-face classes improved by 50% from pre- to post-course, while the average score of distance students improved by only 36%. This is reflected in a significant interaction between delivery and survey timing (lmer: t = 4.994, df = 182.64, p < 0.001).
Fig. 1

Average pre-course and post-course scores on the insect identification survey administered in the advanced course, Insect Classification. Letters represent groups that are significantly different from others. Error bars represent 95% confidence intervals

Cheating

Plagiarism was detected in 21 collections (of 1266 total), fewer than 2% of all submitted. We used logistic regression to examine the influence of course (introductory vs. advanced) and delivery (face to face vs. distance) on cheating probability. Model evaluation by Pearson’s chi-squared test revealed that course did not significantly contribute to model fit (Table 3), so course was removed and the logistic regression predicting cheating probability was re-run with delivery mode as the only explanatory variable (Eq. 1, Table 4).
$$ \operatorname{logit}\bigl(\Pr(\mathrm{Cheating})\bigr) = -5.2923 + 2.5394\,(\mathrm{Delivery}) $$
(1)
Table 3

Initial logistic model evaluation by Pearson’s chi-squared test to examine the influence of course (introductory vs. advanced) and delivery (face to face vs. distance) on cheating probability

|          | X²     | df | p value      |
| -------- | ------ | -- | ------------ |
| Null     | 70.422 |    |              |
| Delivery | 40.649 | 1  | 4.857 × 10⁻⁸ |
| Course   | 40.367 | 1  | 0.5951       |

Table 4

Logistic regression analysis of cheating probability with only significant predictor variable(s) included

| Predictor | β       | SE (β) | Wald’s statistic | p value     | Odds ratio |
| --------- | ------- | ------ | ---------------- | ----------- | ---------- |
| Intercept | −5.2923 | 0.4483 | −11.8            | < 2 × 10⁻¹⁶ | NA         |
| Delivery  | 2.5394  | 0.5172 | 4.91             | 9.11 × 10⁻⁷ | 12.67      |

We found a significant positive relationship between cheating and distance delivery; the regression coefficient (2.5394) was greater than zero, indicating a positive relationship, and Wald’s statistic (4.91) was significantly different from 0 (p < 0.001), indicating that this positive relationship is significant. The odds ratio revealed that the odds of plagiarism in a distance course were 12.67 times greater than the odds of plagiarism in a face-to-face course (Figs. 2 and 3). Additional goodness-of-fit tests for the model expressed by Eq. 1 returned further support for model fit (Hosmer and Lemeshow = 0.58, Cox and Snell = 0.52, Nagelkerke R² = 0.58), consistently indicating that over 50% of the variance is explained by this model compared to the null model.
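Back-transformed to the probability scale, Eq. 1 reproduces the raw cheating rates reported in Table 2; a minimal check in Python using only the published coefficients (the function name is ours):

```python
import math

b0, b1 = -5.2923, 2.5394   # intercept and delivery coefficient (Table 4)

def p_cheat(delivery):
    """Invert the predicted logit; delivery: 0 = face to face, 1 = distance."""
    logit = b0 + b1 * delivery
    return 1 / (1 + math.exp(-logit))

p_f2f = p_cheat(0)         # ~0.005, i.e., the observed 5/999
p_dist = p_cheat(1)        # ~0.060, i.e., the observed 16/267
odds_ratio = math.exp(b1)  # ~12.67, the odds ratio in Table 4
```

Exponentiating the delivery coefficient is what yields the reported odds ratio, since the model is linear on the log-odds scale.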
Fig. 2

Percent plagiarism by (a) delivery method overall and (b) delivery method per course. Plagiarism per course iteration was averaged across course delivery and/or level. Error bars represent one standard error

Fig. 3

Number of cheating incidents on insect collection projects in face-to-face and distance courses. Collections submitted in face-to-face and distance courses were determined to be unplagiarized (light grey), recycled (dark grey), or purchased (black)

Discussion

Collections have long been fundamental to natural history research and education (Suarez and Tsutsui 2004). More recently, they have begun to be incorporated into distance education curricula, where they serve as ex situ laboratories. Little research has focused on the educational benefits or challenges of using collections in formal educational settings, and to our knowledge, no studies to date have addressed their use in distance education. We assessed learning in face-to-face and distance entomology classes through pre- and post-course surveys and evaluated the prevalence of academic dishonesty across delivery modes. We found that end-of-course knowledge was equivalent in face-to-face and distance classes, suggesting that mastery of the material was not affected by delivery mode. In contrast, academic dishonesty was more likely in distance classes than in face-to-face classes. These results suggest that a distance-delivered collection-based course can provide a natural history education on par with a face-to-face class, but that cheating is more of a problem in the distance environment.

Learning Outcomes Are Equivalent

Our survey of nearly 200 students in an advanced entomology class revealed that face-to-face and distance students performed equally well on an ungraded end-of-semester insect identification assessment. The lack of difference across delivery mode suggests that distance delivery of this course material, including the focus on collection-based learning, is as effective as in face-to-face classes. This result echoes a growing body of literature that finds distance learning to be as effective as face-to-face learning (Wu 2015).

We initially focused on assessing delivery-based differences in learning gains to account for differences in student knowledge levels upon entering the course; gains were significantly higher in face-to-face classes as compared with distance classes. This difference was driven largely by pre-course survey scores (Table 4, Fig. 1). Specifically, distance students scored higher than face-to-face students on the pre-course survey, and because they achieved equivalent scores on the post-course survey, their learning gains were lower than those of face-to-face students. Why did students taking the course by distance have higher scores upon entry? One possibility is that distance students in this advanced course may start the class better prepared than face-to-face students because they anticipate the rigor of the class. The sequence of courses required for most distance students results in completion of the prerequisite introductory course during the semester immediately prior to this class. Face-to-face students, on the other hand, may comprise a more diverse group, with some having taken prerequisite courses long ago. Differences in pre-course scores may also relate to the higher rate of student course withdrawals from the distance sections, where poorly performing students tend to drop the class rather than complete it with a low or failing grade. This results in the removal of the lowest scores (pre- and post-course) from the dataset. These factors may result in significant differences within the student populations grouped together here; further study of student demographics could test these theories.

Cheating Is More Prevalent in Distance Classes

Concerns about academic dishonesty transcend delivery mode, but cheating rates are widely thought to be higher in distance education than in traditional, face-to-face classes (Bell and Federman 2013). Written plagiarism has been shown to be common both in face-to-face and distance delivery (Park 2003), and text-matching software, such as Turnitin, has helped educators detect and deter this type of cheating (Heckler et al. 2013).

When collections are plagiarized, specimens are collected by someone other than the student submitting the assignment, and records are falsified to suggest that specimens were collected during the appropriate time frame, by the owner of the collection, and in a believable location. To detect specimen-based plagiarism, we developed a method of marking specimens for easy identification of “recycled material” and focused on characteristics of collections that indicated material had been purchased. We found cheating to be relatively uncommon at < 2.0% of all collections, but, troublingly, collections submitted in distance classes were more than 12 times more likely to include plagiarized material than those in face-to-face classes. Neither type of cheating (specimen recycling or purchasing) was notably more common than the other; each was detected in fewer than 1.0% of all student collections. In face-to-face classes, recycled specimens that had been previously submitted for a grade were detected more often than purchased specimens, whereas in distance classes, both forms of plagiarism were equally common. Despite clear warnings about what constituted plagiarism and data falsification in both face-to-face and distance sections of both courses, higher incidence of cheating was found consistently in distance sections of introductory and advanced courses. These results add to a growing body of research on cheating in distance education suggesting that disciplinary “loopholes” in distance instruction unintentionally create opportunities for students to cheat (Wolverton 2016; Ubell 2017). Just as it is relatively easy for students to pay for a term paper, it is similarly possible for students to purchase specimen collections from other students or a vendor.

It is unlikely that any of the cases of plagiarism reported here would have been detected if this study of cheating prevalence had not been undertaken. Without an explicit way for instructors to detect recycled content, past students may have trafficked collection materials through social networks such as fraternities, sororities, or for profit, without consequence. Students were aware of the danger of being caught in this way because instructors have, for decades, warned students against specimen-based plagiarism by (falsely) stating that collections were marked. This study, however, was the first time that collections were actually marked and checked regularly. We expect that our detection rate of marked specimens is a reasonable approximation of specimen recycling because we exhaustively marked all specimens submitted during this time. It is possible that the first years of this study underestimated cheating if unmarked collections recycled from previous semesters went undetected. Contract cheating detection rates were probably lower, however. While we were diligent about scanning for obvious indicators of purchase or data falsification, a high-quality collection that conformed to the assignment requirements could have passed scrutiny undetected.

Does Cheating in Distance Education Influence Learning Assessments?

Juxtaposing these results, namely higher rates of cheating by distance and higher pre-course followed by equivalent post-course learning outcomes, raises an uncomfortable question about the veracity of distance survey scores. In other words, are distance learning assessments compromised by increased levels of student cheating, and how does this affect our ability to compare distance and face-to-face learning outcomes? Although we attempted to decouple learning assessment from student grades in this study by offering an ungraded learning survey, we cannot be certain that students did not cheat on it. We found no direct evidence of cheating on assessments in this study but acknowledge that students may have used outside materials to improve their scores. Why would a student cheat on an ungraded survey? Perhaps to mask limited knowledge, to avoid the embarrassment of performing poorly, or simply out of habit when unobserved. If some groups of students are more likely to cheat, knowing more about how course delivery encourages or discourages performance and cheating could help improve learning outcomes. More broadly, whether cheating artificially skews distance learning metrics is an important question for understanding and improving distance education; for now, it remains to be satisfactorily addressed.

What to Do Now?

All academic programs aspire to maintain high standards of integrity, regardless of delivery method. Nevertheless, educators must increasingly recognize that academic honesty issues in distance education pose a considerable and growing challenge (Bell and Federman 2013). Addressing this problem requires both willingness on the part of instructors to acknowledge and deal with cheating in individual courses and systemic changes in institutional structures (McCabe and Pavela 2000; Park 2004). How can instructors deter cheating? Structuring courses and assignments to reduce opportunities to cheat, raising students' awareness of the importance of academic honesty, and providing clear information about consequences can all be effective deterrents (Michaels and Williams 2013). These practices have universal benefits: although they may be designed to address academic integrity issues in distance learning, they also benefit students in face-to-face classes. Institutions can develop campus-wide initiatives such as honor codes (McCabe 2002; LoSchiavo and Shatz 2011) or courses that raise awareness of academic integrity (Roberts and Hai-Jew 2009), both of which increase community commitment to campus honesty.

Cheating in any delivery format is problematic because it circumvents the learning process and hinders both the student's and the instructor's ability to gauge mastery of course material. We took the results of this study back to our own classes and redoubled our efforts to deter cheating. We explicitly and repeatedly reminded our students (distance and face to face) of what constitutes academic dishonesty and of the consequences for any student submitting a collection that includes marked specimens or specimens with falsified records. Faculty members teaching collection-based classes agreed to respond uniformly to cheating by reporting any academic dishonesty to the UF Dean of Students Office, enforcing a standardized grade penalty, and requiring completion of an Ethical Decision-making Seminar and Plagiarism Avoidance Workshop. The effectiveness of these measures remains to be seen.

Conclusion

Collection-based courses can be delivered successfully by distance as well as face to face. However, because cheating may be more prevalent in distance-delivered than in face-to-face collection-based courses, we have strong evidence pointing toward one aspect of distance delivery that can be improved. If we are to assess student learning outcomes accurately across delivery methods, we must include cheating in the discussion or remain uncertain about the validity of our assessment metrics. We hope that these results will encourage discussion about academic honesty and how to maintain high standards in distance classes. The best prevention is deterrence, and understanding how and when cheating takes place is essential to addressing the factors that limit learning. If, as our results suggest, academic dishonesty is more problematic in distance classes, then learning assessments, even ungraded ones, should be interpreted with caution.

Acknowledgements

Our sincere thanks are given to the faculty and teaching assistants in the UF Entomology and Nematology Department who supported this project, especially R. Baldwin, E. Machtinger, C.W. Miller, M. Vickers, and E.N.I. Weeks. We thank UF IFAS statistical consultant James Colee for assistance with analyses and two anonymous reviewers for their helpful suggestions. We offer additional thanks to J. Capinera for first suggesting marking specimens, and for 20 years of circulating rumors to students that specimens were marked! We also extend our thanks to the many professional entomologists who generously assisted in identifying insects in student collections.

Funding

AL and MB received no financial support for the research and authorship of this article. Publication of this article was supported by the University of Florida Open Access Fund. RAA was supported by NSF GRFP under Grant No. DGE-1315138 and DGE-1842473.

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. UF IRB approval was granted to conduct this study (IRB201700949).

References

  1. Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: tracking online education in the United States. Babson College, MA: Babson Survey Research Group.
  2. Austin, A. E. (2018). Vision and change in undergraduate biology education: unpacking a movement and sharing lessons learned. American Association for the Advancement of Science. http://visionandchange.org/finalreport. Accessed 25 October 2018.
  3. Beier, M. E., Kim, M. H., Saterbak, A., Leautaud, V., Bishnoi, S., & Gilberto, J. M. (2018). The effect of authentic project-based learning on attitudes and career aspirations in STEM. J Res Sci Teach, 00, 1–21.
  4. Bell, B., & Federman, J. (2013). E-learning in postsecondary education. Futur Child, 23(1), 165–185.
  5. Brewer, C. A., & Smith, D. (2011). Vision and change in undergraduate biology education: a call to action. American Association for the Advancement of Science. http://visionandchange.org/finalreport. Accessed 25 October 2018.
  6. Cook, J. A., Edwards, S. V., Lacey, E. A., Guralnick, R. P., Soltis, P. S., Soltis, D. E., Welch, C. K., Bell, K. C., Galbreath, K. E., Himes, C., Allen, J. M., Heath, T. A., Carnaval, A. C., Cooper, K. L., Liu, M., Hanken, J., & Ickert-Bond, S. (2014). Natural history collections as emerging resources for innovative education. BioScience, 64(8), 725–734.
  7. CWUR: Center for World University Rankings (2017). http://cwur.org/2017/subjects.php. Accessed 25 October 2018.
  8. DeJong, T., Lin, M. C., & Zacharia, Z. C. (2013). Physical and virtual laboratories in science and engineering education. Science, 340(6130), 305–308.
  9. Diekhoff, G. M., LaBeff, E. E., Clark, R. E., Williams, L. E., Francis, B., & Haines, V. J. (1996). College cheating: ten years later. Res High Educ, 37(4), 487–502.
  10. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci, 111(23), 8410–8415.
  11. Gardner, J., & Belland, B. R. (2012). A conceptual framework for organizing active learning experiences in biology instruction. J Sci Educ Technol, 21(4), 465–475.
  12. Goubeaud, K. J. (2010). How is science learning assessed at the postsecondary level? Assessment and grading practices in college biology, chemistry and physics. J Sci Educ Technol, 19(3), 237–245.
  13. Grijalva, T. C., Nowell, C., & Kerkvliet, J. (2006). Academic honesty and online courses. Coll Stud J, 40(1), 180–185.
  14. Haines, V. J., Diekhoff, G. M., LaBeff, E. E., & Clark, R. E. (1986). College cheating: immaturity, lack of commitment, and the neutralizing attitude. Res High Educ, 25(4), 342–354.
  15. Heckler, N. C., Rice, M., & Bryan, C. H. (2013). Turnitin systems. J Res Technol Educ, 45(3), 229–248.
  16. ICAI: International Center for Academic Integrity (2017). https://academicintegrity.org/statistics. Accessed 25 October 2018.
  17. Lack, K. A. (2013). Current status of research on online learning in postsecondary education. ITHAKA S+R. http://sr.ithaka.org/wp-content/uploads/2015/08/ithaka-sr-online-learning-postsecondary-education-may2012.pdf. Accessed 25 October 2018.
  18. Lanier, M. M. (2006). Academic integrity and distance learning. Journal of Criminal Justice Education, 17(2), 244–261.
  19. LoSchiavo, F. M., & Shatz, M. A. (2011). The impact of an honor code on cheating in online courses. MERLOT Journal of Online Learning and Teaching, 7(2), June 2011.
  20. McCabe, D. L. (2002). Honor codes and other contextual influences on academic integrity: a replication and extension to modified honor code settings. Res High Educ, 43(3), 357–378.
  21. McCabe, D. L., & Pavela, G. (2000). Some good news about academic integrity. Change, 32(5), 32–38.
  22. Michaels, T. B., & Williams, M. A. (2013). Student equity: discouraging cheating in online courses. Administrative Issues Journal: Education, Practice, Research, 3(2), 30–41.
  23. Nguyen, T. (2015). The effectiveness of online learning: beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching, 11(2), June 2015.
  24. Park, C. (2003). In other (people’s) words: plagiarism by university students—literature and lessons. Assess Eval High Educ, 28(5), 471–488.
  25. Park, C. (2004). Rebels without a clause: towards an institutional framework for dealing with plagiarism by students. J Furth High Educ, 28(3), 291–306.
  26. Potkonjak, V., Gardner, M., Callaghan, V., Mattila, P., Guetl, C., Petrović, V. M., & Jovanović, K. (2016). Virtual laboratories for education in science, technology, and engineering. Computers and Education, 95(C), 309–327.
  27. R Core Team (2018). R: A Language and Environment for Statistical Computing. Vienna: R Foundation for Statistical Computing. https://www.R-project.org.
  28. Roberts, C. J. J., & Hai-Jew, S. (2009). Issues of academic integrity: an online course for students addressing academic dishonesty. MERLOT Journal of Online Learning and Teaching, 5(2), 182–196.
  29. Rowe, N. C. (2004). Cheating in online student assessment: beyond plagiarism. Online Journal of Distance Learning Administration, 7(2), Summer 2004.
  30. Sandvoss, L. M., Harwood, W. S., Korkmaz, A., Bollinger, J. C., Huffman, J. C., & Huffman, J. N. (2003). Common molecules: bringing research and teaching together through an online collection. J Sci Educ Technol, 12(3), 277–284.
  31. Singh, R., & Hurley, D. (2017). The effectiveness of teaching and learning process in online education as perceived by university faculty and instructional technology professionals. Journal of Teaching and Learning with Technology, 6(1), 65–75.
  32. Stiles, B. L., Wong, N. C. W., & LaBeff, E. E. (2018). College cheating thirty years later: the role of academic entitlement. Deviant Behav, 39(7), 823–834.
  33. Suarez, A. V., & Tsutsui, N. D. (2004). The value of museum collections for research and society. BioScience, 54(1), 66–74.
  34. Ubell, R. (2017). Online cheating. Inside Higher Education. (February 6, 2017).
  35. USDE: United States Department of Education, Office of Planning, Evaluation, and Policy Development. (2010). Evaluation of evidence-based practices in online learning: a meta-analysis and review of online learning studies. Washington, D.C. https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf. Accessed 25 October 2018.
  36. Vandehey, M. A., Diekhoff, G. M., & LaBeff, E. E. (2007). College cheating: a twenty-year follow-up and the addition of an honor code. J Coll Stud Dev, 48(4), 462–480.
  37. Walker, J. (2010). Measuring plagiarism: researching what students do, not what they say they do. Stud High Educ, 35(1), 41–59.
  38. Wolverton, B. (2016). The new cheating economy. The Chronicle of Higher Education. (August 28, 2016).
  39. Wu, D. D. (2015). Online learning in postsecondary education: a review of the empirical literature (2013–2014). ITHAKA S+R. https://doi.org/10.18665/sr.221027. Accessed 25 October 2018.

Copyright information

© The Author(s) 2019

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. Entomology and Nematology Department, University of Florida, Gainesville, USA