Alpha Testing, Beta Testing, and Customized Testing

  • Shalin Hai-Jew

Abstract

Once digital learning contents have been created and provisionally “finalized,” they are ready to be put through more formal review and revision. This chapter deals with development-based assessments, commonly termed alpha testing, beta testing, and customized testing, which include both internal and external assessments. The in-house alpha (α) test typically covers legal requirements, ethical requirements, pedagogical design, accessibility, usability, informational accuracy, technological functionality, device playability, clarity of the language(s) used, metadata accuracy and completeness, research and usage monitoring, branding, and other elements. The beta (β) test, which brings in outside representative target users, tests for user interactivity, user experiential learning, user feedback for learning, social aspects in support of learning, and other features. The chapter also addresses customized tests that may be written for the unique features of specific learning resources, based on their designs, versioning, adaptations, and other aspects. After testing, revisions are made to the learning resources, though these rarely amount to a full redesign or retrofit (given the expense); unless fundamental errors were made, the revisions and edits will be piecemeal and at the margins. Finally, there is a section on how to assess the practical efficacy of the respective alpha, beta, and customized tests.
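
To make the alpha-test scope above concrete, the sketch below shows one hypothetical way an in-house reviewer might record checklist results for a single learning object, written here in Python. The check categories come from the abstract; the data structure, the class and method names, and the simple pass/fail model are illustrative assumptions rather than the chapter’s own procedure.

    # Hypothetical sketch: recording alpha-test checklist results for one learning object.
    # Check categories mirror those named in the abstract; everything else is assumed.
    from dataclasses import dataclass, field

    ALPHA_CHECKS = [
        "legal requirements", "ethical requirements", "pedagogical design",
        "accessibility", "usability", "informational accuracy",
        "technological functionality", "device playability", "language clarity",
        "metadata accuracy and completeness", "branding",
    ]

    @dataclass
    class AlphaTestRecord:
        learning_object: str
        results: dict = field(default_factory=dict)  # check name -> (passed, note)

        def record(self, check: str, passed: bool, note: str = "") -> None:
            if check not in ALPHA_CHECKS:
                raise ValueError(f"Unknown check: {check}")
            self.results[check] = (passed, note)

        def showstoppers(self) -> list:
            # Failed checks that would block release until the resource is revised.
            return [name for name, (passed, _) in self.results.items() if not passed]

    review = AlphaTestRecord("Module 3: Lab Safety")
    review.record("accessibility", False, "two images lack alt text")
    review.record("metadata accuracy and completeness", True)
    print(review.showstoppers())  # ['accessibility']
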

Keywords

Alpha testing · Beta testing · Customized testing · Pilot testing (pretesting) · Meta-assessment

Key Terms and Definitions

Alpha (α) testing

In-house testing of learning objects for whether they meet pre-defined standards (for such issues as legality, accessibility, technological functionality, and others)

Audience fidelity

The faithfulness or alignment of a test group with the actual users of the learning resources; the degree to which a particular group represents the target audience

Automated testing

The use of scripts to assess particular features of learning objects and sequences (one small illustrative check is sketched after this glossary)

Beta (β) testing

Testing of learning objects with select public audiences to test for learning efficacy and public acceptance as well as other features

Customized testing

Adaptive and unique testing specific to particular learning objects or projects

Digital preservation

Work of re-versioning digital files into formats that may be more accessible or usable over time (even in light of the “slow fires” of technological change)

Interactivity

The interactions between a user (or users) and technology systems

Metadata

Data about data

Porting

Moving contents from one technology platform to another

Showstopper

A problem in a work (or learning resource, in this case) serious enough to prevent the work from being used

Universal file format

An openly accessible file type that proprietary file formats may be converted to or from

User interface

A designed screen through which users may interact with technology systems
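
As a small, concrete illustration of the “automated testing” entry above, the following minimal Python sketch flags <img> tags that lack alternative text in an HTML-based learning object, the kind of single scripted check an alpha tester might run for accessibility. Production testing would more likely rely on dedicated accessibility tooling; this standalone parser, its class name, and the sample markup are illustrative assumptions only.

    # Minimal sketch of one automated accessibility check: find <img> tags with no alt text.
    from html.parser import HTMLParser

    class MissingAltChecker(HTMLParser):
        """Collects the positions of <img> tags that have no non-empty alt text."""

        def __init__(self):
            super().__init__()
            self.missing = []  # (line, column) of each offending <img> tag

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                alt = (dict(attrs).get("alt") or "").strip()
                if not alt:
                    self.missing.append(self.getpos())

    sample = '<p><img src="circuit.png"><img src="legend.png" alt="Circuit legend"></p>'
    checker = MissingAltChecker()
    checker.feed(sample)
    print(checker.missing)  # [(1, 3)]: one image is missing alt text
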

References

  1. Dinevski D. Open educational resources and lifelong learning. In: Proceedings of the ITI 2008 30th International Conference on Information Technology Interfaces, June 23–26, 2008, Cavtat, Croatia. 117–122, 2008.
  2. Johnson LA, and Schleyer TKL. Developing high-quality educational software. Educational Methodologies. Journal of Dental Education 67: 1209–1220, 2003.
  3. Krueger RA, and Casey MA. Focus Groups: A Practical Guide for Applied Research. 4th Ed. Los Angeles: SAGE Publications. 2009.
  4. Liu M, Jones C, and Hemstreet S. Interactive multimedia design and production processes. Journal of Research on Computing in Education 30: 254–280, 1998. https://doi.org/10.1080/08886504.1998.10782226.
  5. Merrill MD, Reigeluth CM, and Faust GW. The Instructional Quality Profile: A curriculum evaluation and design tool. In: O’Neil HF (Ed.), Procedures for Instructional Systems Development. Ch. 6. Academic Press, Inc. 165–204. 1979.
  6. Noack D. So you wanna be a beta tester? Link-up 17: 8, 2000. Nursing & Allied Health Database.
  7. White BS, and Branch RM. Systematic pilot testing as a step in the instructional design process of corporate training and development. Performance Improvement Quarterly 14: 75–94, 2001.

Additional Reading

  1. Abbott A. Methods of Discovery: Heuristics for the Social Sciences. New York: W.W. Norton & Company. 2004.
  2. Creswell JW. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. 2nd Ed. Thousand Oaks, California: SAGE Publications. 2003.
  3. Gall MD, Gall JP, and Borg WR. Educational Research: An Introduction. 7th Ed. Boston: Pearson Education Inc. 2003.
  4. Glaser BG, and Strauss AL. The Discovery of Grounded Theory: Strategies for Qualitative Research. New York: Aldine de Gruyter. 1967.
  5. Hai-Jew S. Techniques for Coding Imagery and Multimedia: Emerging Research and Opportunities. Advances in Knowledge Acquisition, Transfer, and Management Book Series. Hershey, Pennsylvania: IGI Global. 2018.
  6. Krueger RA, and Casey MA. Focus Groups: A Practical Guide for Applied Research. 4th Ed. Los Angeles: SAGE Publications. 2009.
  7. Miles MB, Huberman AM, and Saldaña J. Qualitative Data Analysis: A Methods Sourcebook. 3rd Ed. Los Angeles: SAGE Publications. 2014.
  8. Moustakas C. Heuristic Research: Design, Methodology, and Applications. Newbury Park, California: SAGE Publications. 1990.
  9. Norman GR, and Streiner DL. PDQ Statistics. 2nd Ed. Hamilton, Ontario: BC Decker Inc. 1999.
  10. Ruel E, Wagner WE III, and Gillespie BJ. The Practice of Survey Research: Theory and Applications. Los Angeles: SAGE Publications. 2016.
  11. Saldaña J. The Coding Manual for Qualitative Researchers. 2nd Ed. Los Angeles: SAGE Publications. 2013.
  12. Taylor SJ, and Bogdan R. Introduction to Qualitative Research Methods: A Guidebook and Resource. 3rd Ed. New York: John Wiley & Sons, Inc. 1998.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Shalin Hai-Jew
    1. Information Technology Services (ITS), Kansas State University, Manhattan, KS, USA
