Abstract
jsPsych, a JavaScript library developed by de Leeuw (2015), is widely used for conducting Web-based experiments, and its functionality can be extended with plugins. This article introduces a new jsPsych plugin that enables experimenters to set different onset times for geometric figures, images, sounds, and moving objects, and to present them synchronized with the refresh of the display. Moreover, this study evaluated the resulting stimulus onset asynchronies (SOAs) using visual and audio stimuli. The results showed that (i) the deviations from the intended SOAs between two visual stimuli were less than 10 ms, (ii) the variability across browser–computer combinations was reduced compared with the no-plugin condition, and (iii) the variability of the SOAs between visual and audio stimuli was relatively large (about 50 ms). This study concludes that, although the use of audio stimuli is somewhat limited, the new plugin provides experimenters with accurate and useful methods for conducting psychophysical experiments online. The latest version of the plugin can be downloaded freely from https://jspsychophysics.hes.kyushu-u.ac.jp under the MIT license.
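To make the workflow concrete, the sketch below shows how a trial using the plugin might set independent, refresh-synchronized onset times for a visual and an auditory stimulus, yielding a fixed SOA between them. It is a minimal sketch assuming jsPsych 6 (current at the time of publication) and the plugin file have already been loaded via script tags; the property names follow the plugin's documentation, while the positions, timings, and sound file name are illustrative placeholders.

```javascript
// A white circle that appears 500 ms after the trial starts.
const circle_object = {
  obj_type: 'circle',     // kind of stimulus object handled by the plugin
  startX: 250,            // horizontal position on the canvas, in pixels
  startY: 200,            // vertical position on the canvas, in pixels
  radius: 40,
  fill_color: 'white',
  show_start_time: 500    // onset in ms from trial start, synced to the display refresh
};

// A tone whose onset lags the circle by a 300-ms SOA.
const sound_object = {
  obj_type: 'sound',
  file: 'tone.wav',       // placeholder file name
  show_start_time: 800    // 500 ms (circle onset) + 300 ms SOA
};

// The trial presents both stimuli and records a keyboard response.
const trial = {
  type: 'psychophysics',
  stimuli: [circle_object, sound_object],
  response_type: 'key',
  choices: ['f', 'j']
};

jsPsych.init({ timeline: [trial] });
```

Because each stimulus carries its own show_start_time, the SOA is specified declaratively rather than by chaining separate trials, which is what allows the plugin to align onsets with the display refresh.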
Notes
1. The issue related to 32-bit Chrome appeared to be resolved once the browser was updated (at least as of Chrome v. 83.0.4102.61).
References
Barnhoorn, J. S., Haasnoot, E., Bocanegra, B. R., & van Steenbergen, H. (2015). QRTEngine: An easy solution for running online reaction time experiments using Qualtrics. Behavior Research Methods, 47(4), 918–929. https://doi.org/10.3758/s13428-014-0530-7
Bazilinskyy, P., & de Winter, J. C. F. (2018). Crowdsourced measurement of reaction times to audiovisual stimuli with various degrees of asynchrony. Human Factors, 60(8), 1192–1206. https://doi.org/10.1177/0018720818787126
Bridges, D., Pitiot, A., MacAskill, M. R., & Peirce, J. W. (2020). The timing mega-study: comparing a range of experiment generators, both lab-based and online. PsyArXiv. https://doi.org/10.31234/osf.io/d6nu5
Chetverikov, A., & Upravitelev, P. (2016). Online versus offline: The Web as a medium for response time data collection. Behavior Research Methods, 48(3), 1086–1099. https://doi.org/10.3758/s13428-015-0632-x
Crump, M. J. C., McDonnell, J. V., & Gureckis, T. M. (2013). Evaluating Amazon’s Mechanical Turk as a tool for experimental behavioral research. PLoS ONE, 8(3), e57410. https://doi.org/10.1371/journal.pone.0057410
de Leeuw, J. R. (2015). jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods, 47(1), 1–12. https://doi.org/10.3758/s13428-014-0458-y
de Leeuw, J. R., & Motz, B. A. (2016). Psychophysics in a Web browser? Comparing response times collected with JavaScript and Psychophysics Toolbox in a visual search task. Behavior Research Methods, 48(1), 1–12. https://doi.org/10.3758/s13428-015-0567-2
Garaizar, P., & Reips, U. D. (2019). Best practices: Two Web-browser-based methods for stimulus presentation in behavioral experiments with high-resolution timing requirements. Behavior Research Methods, 51(3), 1441–1453. https://doi.org/10.3758/s13428-018-1126-4
Pauszek, J. R., Sztybel, P., & Gibson, B. S. (2017). Evaluating Amazon’s Mechanical Turk for psychological research on the symbolic control of attention. Behavior Research Methods, 49(6), 1969–1983. https://doi.org/10.3758/s13428-016-0847-5
Pinet, S., Zielinski, C., Mathôt, S., Dufau, S., Alario, F.-X., & Longcamp, M. (2017). Measuring sequences of keystrokes with jsPsych: Reliability of response times and interkeystroke intervals. Behavior Research Methods, 49(3), 1163–1176. https://doi.org/10.3758/s13428-016-0776-3
Pronk, T., Wiers, R. W., Molenkamp, B., & Murre, J. (2019). Mental chronometry in the pocket? Timing accuracy of web applications on touchscreen and keyboard devices. Behavior Research Methods. https://doi.org/10.3758/s13428-019-01321-2
Reimers, S., & Stewart, N. (2015). Presentation and response timing accuracy in Adobe Flash and HTML5/JavaScript Web experiments. Behavior Research Methods, 47(2), 309–327. https://doi.org/10.3758/s13428-014-0471-1
Reimers, S., & Stewart, N. (2016). Auditory presentation and synchronization in Adobe Flash and HTML5/JavaScript Web experiments. Behavior Research Methods, 48(3), 897–908. https://doi.org/10.3758/s13428-016-0758-5
Reips, U.-D., & Neuhaus, C. (2002). WEXTOR: A Web-based tool for generating and visualizing experimental designs and procedures. Behavior Research Methods, Instruments, & Computers, 34(2), 234–240. https://doi.org/10.3758/BF03195449
Richter, J., & Gast, A. (2017). Distributed practice can boost evaluative conditioning by increasing memory for the stimulus pairs. Acta Psychologica, 179, 1–13. https://doi.org/10.1016/j.actpsy.2017.06.007
Sasaki, K., & Yamada, Y. (2019). Crowdsourcing visual perception experiments: a case of contrast threshold. PeerJ, 7, e8339. https://doi.org/10.7717/peerj.8339
Schubert, T. W., Murteira, C., Collins, E. C., & Lopes, D. (2013). ScriptingRT: A software library for collecting response latencies in online studies of cognition. PLoS ONE, 8(6), e67769. https://doi.org/10.1371/journal.pone.0067769
Semmelmann, K., & Weigelt, S. (2017). Online psychophysics: reaction time effects in cognitive experiments. Behavior Research Methods, 49(4), 1241–1260. https://doi.org/10.3758/s13428-016-0783-4
Slote, J., & Strand, J. F. (2016). Conducting spoken word recognition research online: Validation and a new timing method. Behavior Research Methods, 48(2), 553–566. https://doi.org/10.3758/s13428-015-0599-7
Stewart, N., Chandler, J., & Paolacci, G. (2017). Crowdsourcing samples in cognitive science. Trends in Cognitive Sciences, 21(10), 736–748. https://doi.org/10.1016/j.tics.2017.06.007
van Steenbergen, H., Band, G. P. H., & Hommel, B. (2015). Does conflict help or hurt cognitive control? Initial evidence for an inverted U-shape relationship between perceived task difficulty and conflict adaptation. Frontiers in Psychology, 6. https://doi.org/10.3389/fpsyg.2015.00974
von Bastian, C. C., Locher, A., & Ruflin, M. (2013). Tatool: A Java-based open-source programming framework for psychological studies. Behavior Research Methods, 45(1), 108–115. https://doi.org/10.3758/s13428-012-0224-y
Woods, A. T., Velasco, C., Levitan, C. A., Wan, X., & Spence, C. (2015). Conducting perception research over the internet: a tutorial review. PeerJ, 3, e1058. https://doi.org/10.7717/peerj.1058
Acknowledgements
I thank Hiroyuki Mitsudo and Kentaro Yamamoto for their valuable comments on an earlier version of this manuscript.
Open practices statement
The data and materials for all experiments are available on the Open Science Framework (https://osf.io/pj4sb/); none of the experiments was preregistered.
Cite this article
Kuroki, D. A new jsPsych plugin for psychophysics, providing accurate display duration and stimulus onset asynchrony. Behav Res 53, 301–310 (2021). https://doi.org/10.3758/s13428-020-01445-w
Keywords
- JavaScript
- Web
- Online experiments
- Psychophysics