
Behavior Research Methods, Volume 51, Issue 2, pp 727–746

Argus: An open-source and flexible software application for automated quantification of behavior during social interaction in adult zebrafish

  • Soaleha Shams
  • Shahid Amlani
  • Matthew Scicluna
  • Robert Gerlai

Abstract

Zebrafish show great potential for behavioral neuroscience. Promising lines of research, however, require the development and validation of software tools that will allow automated and cost-effective behavioral analysis. Building on our previous work with the RealFishTracker (in-house-developed tracking system), we present Argus, a data extraction and analysis tool built in the open-source R language for behavioral researchers without any expertise in R. Argus includes a new, user-friendly, and efficient graphical user interface, instead of a command-line interface, and offers simplicity and flexibility in measuring complex zebrafish behavior through customizable parameters. In this article, we compare Argus with Noldus EthoVision and Noldus The Observer, to validate this new system. All three software applications were originally designed to quantify the behavior of a single subject. We first performed an analysis of the movement of individual fish and compared the performance of the three software applications. Next we computed and quantified the behavioral variables that characterize dyadic interactions between zebrafish. We found that Argus and EthoVision extract similar absolute values and patterns of changes in these values for several behavioral measures, including speed, freezing, erratic movement, and interindividual distance. In contrast, the manual coding of behavior in The Observer showed weaker correlations with the two tracking methods (EthoVision and Argus). Thus, Argus is a novel, cost-effective, and customizable method for the analysis of adult zebrafish behavior that may be utilized for the behavioral quantification of both single and dyadic interacting subjects, but further sophistication will be needed for the proper identification of complex motor patterns, measures that a human observer can easily detect.

Keywords

Zebrafish · Social behavior · Anxiety · Dyads · R programming language

The zebrafish displays a repertoire of complex behaviors and has recently emerged as an excellent vertebrate model for behavioral neuroscience studies (Gerlai, 2010). Although zebrafish are gaining in popularity among researchers, the utilization of zebrafish models remains limited, due to a scarcity of well-established behavioral paradigms (Gerlai, 2015; Sison, Cawker, Buske, & Gerlai, 2006). The progress is also hindered by a lack of valid and reliable reagents (Carter, Cortes-Campos, Chen, McCammon, & Sive, 2017; Nilsen et al., 2004; Prykhozhij, Steele, Razaghi, & Berman, 2017), equipment (Creton, 2009; Lin et al., 2015; Makhankov, Rinner, & Neuhauss, 2004), software (Buske & Gerlai, 2014; Ladu, Butail, Macri, & Porfiri, 2014; Nema, Hasan, Bhargava, & Bhargava, 2016), and other tools and techniques (Breacker, Barber, Norton, McDearmid, & Tilley, 2017; Estepa & Coll, 2015; Field, Kelley, Martell, Goldstein, & Serluca, 2009; Mwaffo, Butail, di Bernardo, & Porfiri, 2015; Wang et al., 2016) that are specific for zebrafish. Unlike the study of rodent models, which benefits from commercially available antibodies, mazes, software, and so forth, that can be used in behavioral research, studying zebrafish often requires researchers to develop and validate novel custom-built instruments and techniques. Faced with this common problem, and building on previous work (Buske & Gerlai, 2014), in this article we outline, as a proof of concept, the development and validation of Argus, a software application designed to aid in the quantification of behavior, including measures of social interaction in zebrafish.

Social interaction is a vital part of zebrafish behavior and is utilized for pharmacological screening and CNS disease modeling (Khan et al., 2017; Meshalkina et al., 2018; Schaefer et al., 2015; Shams & Gerlai, 2016). Social behavior, including shoaling, schooling, courtship, aggression, and the dominance hierarchy, can be easily induced and identified in the laboratory in adult zebrafish (Miller & Gerlai, 2012; Qin, Wong, Seguin, & Gerlai, 2014; Saverino & Gerlai, 2008; Seguin & Gerlai, 2017; Wright, Ward, Croft, & Krause, 2006). Shoaling is the tendency to form and remain in a group (Miller & Gerlai, 2007) and is particularly sensitive to various psychotropic compounds and environmental factors, exhibiting both drastic and subtle changes that are context-dependent (Miller & Gerlai, 2007, 2011; Moretz, Martins, & Robison, 2007; Schroeder, Jones, Young, & Sneddon, 2014; Shams, Amlani, Buske, Chatterjee, & Gerlai, 2018; Shams & Gerlai, 2016). The control and manipulation of these features offers vast potential for modeling various human neuropsychiatric diseases (Mathur & Guo, 2010; Norton, 2013), insofar as our capacity to analyze the complexity of these social behaviors and the availability of valid analytical tools does not lag behind.

In addition to displaying validity and reliability, ideal tools for behavioral analysis should also be automated, simple, and cost-effective. The need for automation has led to the development of various proprietary and open-source software applications, including Noldus EthoVision, Noldus The Observer, Viewpoint Zebralab, TSE systems Visiotracker, Stoelting ANY-maze, Fish-Tracker, JWatcher, Image Tool, ImageJ, and ZebraTrack. Although proprietary software involves the obvious challenge of cost, the validation of novel in-house software and comparison among the various applications tends to be laborious (Blaser & Gerlai, 2006; Jhuang et al., 2010). To address these issues, our laboratory previously developed a custom software application, the “RealFishTracker” (RFT), that tracks moving objects (in our case, one or multiple zebrafish) and can work in the Windows, Mac, and Linux operating systems (Buske & Gerlai, 2014). The underlying principles and functionality of this application are similar to those of Noldus EthoVision, commercially available software (Noldus Info Tech, Wageningen, The Netherlands). Using a digital video file (.mpg, .avi, .wmv, etc.), RFT tracks a moving object by sampling the frames in the video and recording the precise location of an object in xy coordinates over time. From these location coordinates, various behavioral parameters can be generated—for example, speed, turn angle, and the time spent at specific locations (whole tank, center of the tank, perimeter, etc.).
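The step from xy coordinates to path parameters can be sketched as follows. This is a minimal Python illustration of the general principle, not RFT's actual code (RFT is a standalone application); the sampling rate and units are assumptions.

```python
import numpy as np

def path_parameters(x, y, fps=30.0):
    """Compute swim speed and turn angle from a track of xy coordinates.

    x, y : arrays of positions (e.g., in cm) sampled at `fps` frames/s.
    Returns per-step speeds (units/s) and turn angles (degrees).
    """
    dx, dy = np.diff(x), np.diff(y)
    speed = np.hypot(dx, dy) * fps          # distance per frame -> per second
    heading = np.arctan2(dy, dx)            # direction of each step
    turn = np.degrees(np.diff(heading))
    turn = (turn + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    return speed, turn

# A fish moving 1 cm per frame along the x-axis at 30 frames/s swims 30 cm/s:
speed, turn = path_parameters(np.array([0.0, 1.0, 2.0, 3.0]),
                              np.array([0.0, 0.0, 0.0, 0.0]))
```

Time spent at specific locations follows the same pattern: each xy sample is tested against a user-defined region, and the matching samples are counted and divided by the sampling rate.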

This application has been successfully used, in our laboratory and by others (Buske & Gerlai, 2014; Felix, Antunes, Coimbra, & Valentim, 2017; Mahabir, Chatterjee, Buske, & Gerlai, 2013; Saif, Chatterjee, Buske, & Gerlai, 2013), to quantify individual and social behaviors. Subsequent data extraction for a single moving object is built into RFT, but for multiple moving objects, the RFT application allows only tracking, not data extraction—that is, complex behavior quantification. Previously we have used R (open-source programming language) to extract the tracking parameters of multiple fish tested at once (Buske & Gerlai, 2014). For example, on the basis of the location coordinates of individual fish, we computed interindividual distances (IID), or averages of distances between all possible pairs in a shoal at a given time (Buske & Gerlai, 2014; Mahabir et al., 2013; Shams et al., 2018), using scripts written in the R environment. Although it is flexible, this process requires data processing and programming skills in R (Buske & Gerlai, 2014). Building on this work, we now present Argus, an application that can analyze the xy coordinates output from video-tracking software applications, such as RFT, for users with little or no knowledge of R. Although the scripts are written in the R environment, the added graphical user interface (GUI) allows users to execute commands using icons and bar menus instead of a command-line interface (Fig. 1). The GUI in Argus is a more efficient and user-friendly system that allows a user to set parameters and manipulate data without any expertise in R. For users with prior knowledge of R, Argus also allows further flexibility to manipulate the data and customize parameters based on the tracks of individual fish (see the Argus User Manual in the Appendix). 
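As a sketch of how such an IID measure can be computed from per-fish location coordinates (an illustration of the general idea in Python, not Argus's actual R script):

```python
import numpy as np
from itertools import combinations

def mean_interindividual_distance(positions):
    """Mean distance over all possible pairs of fish at one time point.

    positions: list of (x, y) coordinates, one per fish in the shoal.
    """
    dists = [np.hypot(x1 - x2, y1 - y2)
             for (x1, y1), (x2, y2) in combinations(positions, 2)]
    return float(np.mean(dists))

# Three fish at the corners of a 3-4-5 right triangle:
# pairwise distances are 3, 4, and 5, so the mean IID is 4.0.
iid = mean_interindividual_distance([(0, 0), (3, 0), (3, 4)])
```

Repeating this computation at each sampled time point yields the IID trajectory of the shoal over the trial.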
In the present study, Argus scripts analyzed the outputs from RFT and EthoVision, but output text files (.txt) from other video-tracking systems could be used with relatively simple data-frame changes. Our present article focuses on behavioral neuroscience applications—that is, measuring and quantifying how zebrafish move in space and time and how they interact with each other. However, Argus may be utilized in numerous other contexts—that is, anywhere where x-/y-coordinate-like data may need to be analyzed. Argus may thus find applications, for example, in many subfields of biology, physics, and perhaps even economics, in which the physical or conceptual movement of real or abstract objects/individuals is measured.
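By way of illustration, adapting a plain-text tracker export typically amounts to reading the delimited file and renaming columns. The header names below are hypothetical; real RFT or EthoVision exports differ, and only the renaming step would change.

```python
import io
import pandas as pd

# Hypothetical tab-separated tracker output with frame, x, y columns.
raw = io.StringIO(
    "frame\tx\ty\n"
    "0\t10.0\t5.0\n"
    "1\t10.5\t5.2\n"
    "2\t11.0\t5.4\n"
)

# Read the export and rename the coordinate columns to the names the
# downstream analysis expects ("X" and "Y" here are assumed names).
track = pd.read_csv(raw, sep="\t").rename(columns={"x": "X", "y": "Y"})
```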
Fig. 1

Argus graphical user interface (GUI), highlighting its user-friendly features and the flexibility of setting custom-built parameters of behavioral variables.

The goals of this article are to evaluate and validate the functionality of Argus for quantifying zebrafish behavior, by comparing it to two other software applications commonly employed in behavioral research. EthoVision (Noldus Info Tech, Wageningen, The Netherlands) is a video tracker used for the automated quantification of simple movement-based behaviors, whereas The Observer (Noldus Info Tech, Wageningen, The Netherlands) is a multi-event recorder that allows for the quantification of motor and posture patterns—for instance, elements of the ethogram of a species, as defined and coded by a human observer (Blaser & Gerlai, 2006; Teles, Dahlbom, Winberg, & Oliveira, 2013). We first compared a simple parameter of motor activity (the speed of a single fish) computed by Argus scripts to RFT’s built-in analysis function and to values extracted from EthoVision’s tracking system. Next, we assessed the three software applications (EthoVision, The Observer, and Argus) in their ability to quantify behavior during a dyadic social interaction between zebrafish, by extracting multiple parameters of zebrafish movement and comparing them within and between the software applications.

Materials and methods

General fish husbandry

A total of 92 zebrafish (Danio rerio) of the AB strain were used as the experimental fish (20 fish for Exp. 1, and 72 fish for Exp. 2). All zebrafish were maintained and tested in the vivarium at the University of Toronto Mississauga under guidelines approved by the Canadian Council on Animal Care and the Local Animal Care Committee. The experimental fish were sixth-generation descendants of progenitors bought originally from the Zebrafish International Research Center (ZIRC, Eugene, OR). The fish were kept in a high-density aquatic housing system (Aquaneering Inc., San Diego, CA, USA) that recirculated water through fluidized bed biological filtration, activated carbon filtration, and UV sterilization. Throughout the experiment, the water temperature (28–30 °C), salinity (250–350 μS), and pH (6.8–7.5) were kept constant. Fish were housed in a room with a 12:12 h light:dark cycle (lights provided by ceiling-mounted fluorescent light fixtures were turned on at 0900) and fed alternating diets of the nauplii of brine shrimp (Artemia salina) and dried flake food (1:1 mixture of spirulina and Tetramin) twice daily. Experimental testing took place when the fish were 6 months old. Testing trials were conducted between the hours of 1100 and 1600.

Experiment 1

First, we quantified a simple movement parameter, the swimming speed of a single fish, using Argus and compared the results to the data extracted by EthoVision and by the built-in function of RFT. This comparison was conducted to confirm that the Argus calculations resulted in similar values and changes in the values in a simple and often utilized measure of general activity. In this experiment, we compared the mean of swim speeds calculated for 14 1-min intervals, as well as the intra-individual temporal variances in speed within each of the 14 1-min intervals, extracted by RFT, Argus, and EthoVision.

Behavioral acquisition setup and procedure

The testing tank was a standard 37-L glass tank (50 × 25 × 30 cm, Length × Width × Depth), filled 25 cm deep with water from the home system-rack. A 50-cm-long Aquarium Spectrum Fluorescent lamp (15 W) illuminated the testing tank from above. A digital video camera (Sony HDR-XR550, Sony Corporation, Japan) was set up in front of the tank. The back side of the tank was covered with opaque white polycarbonate in order to allow good contrast for the video recording, and the other two sides consisted of blank computer screens that remained turned off.

Fish were fed about 1 h before behavioral testing. Each fish was netted from the home tank into the testing tank singly, and the behavior of the single subject was recorded for 14 min. Behavioral testing started when the researcher left the room and allowed the fish to move freely. After the trial ended, the fish was netted and returned to its home tank.

Behavioral tracking and quantification

The video recordings were transferred to an external hard drive using Picture Motion Browser (PMB, Sony Corp., Japan) and converted from Advanced Video Coding High Definition (AVCHD) format to AVI format using iSkysoft Video Converter (iSkysoft Studio, Guangdong, China). The behavioral parameters extracted were (1) swim speed average for each 1-min interval and (2) the intra-individual variance in swim speed, also calculated for each 1-min interval.
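The binning described above can be sketched as follows. This is a Python illustration under the assumption of a fixed 30 frames/s sampling rate; Argus itself performs this step in R.

```python
import numpy as np

def binned_speed_stats(speed, fps=30.0, bin_seconds=60):
    """Mean and intra-individual variance of speed per time bin.

    speed: per-frame speed samples for one fish; fps: sampling rate.
    Returns one (mean, variance) pair per complete bin.
    """
    per_bin = int(fps * bin_seconds)
    n_bins = len(speed) // per_bin
    # Drop any incomplete trailing bin, then split into rows of one bin each.
    chunks = np.reshape(speed[:n_bins * per_bin], (n_bins, per_bin))
    return chunks.mean(axis=1), chunks.var(axis=1, ddof=1)

# Two 1-min bins of constant speed: means 2 and 4 cm/s, zero variance in each.
means, variances = binned_speed_stats(
    np.r_[np.full(1800, 2.0), np.full(1800, 4.0)])
```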

For RFT, the tracking parameters were as follows: A fish was differentiated as a “dark object,” averaging 300 frames, with a confidence threshold of 30, Gaussian variance at 0.010, and a mean filter size of three pixels. Following this, the Analyze Data function was used to extract the above behavioral measures. Next, using Argus scripts and the GUI, the text file outputs from the RFT were used to extract the behavioral measures, also in 1-min bins, for each of the 20 fish in a single run. Finally, for EthoVision the tracking parameters were as follows: One subject was tracked without markers, subjects were identified as fish, sampled at 30 frames/s, and “dynamic subtraction” was used for subject identification. The values for mean speed were extracted in 1-min bins for all 20 individuals at once.

Experiment 2

To evaluate and validate the functionality of Argus during social interaction, we compared this software application to two commercially available software packages often employed in behavioral research: Noldus EthoVision and Noldus The Observer. These two applications represent two conceptually and methodologically distinct approaches. EthoVision is a video-tracking system and can quantify intensity-related or continuously varying path parameters in an automated manner. For example, unlike The Observer, it can precisely measure how fast a fish swims and where exactly the fish is, relative to a user-defined point, line, or area. The Observer, on the other hand, is based on a human interface—that is, on the ability of the human brain to detect complex motor and posture patterns. The Observer is not automated, since one has to watch a video recording and press keys designated to particular predefined motor patterns. Although this is time-consuming, The Observer allows one to quantify complex motor and posture patterns that are often not possible or rather difficult to measure using video-tracking systems.

In this second experiment, we computed variables for the social interaction of adult fish in dyads using EthoVision (automated video-tracking-based), The Observer (human observation-based), and Argus (automated video-tracking-based). To make these comparisons more meaningful, we organized and compared the extracted variables in three behavioral categories: (a) activity-related measures, including swimming speed, freezing behavior, and erratic movement; (b) location-based measures; and (c) social behavior.

Behavioral test setup and procedure

The testing tank was an 8-L glass aquarium (30 × 15 × 20 cm, Length × Width × Depth). It was filled 15 cm deep with water from the home system-rack. As compared to the setup in Experiment 1, we used this smaller tank to facilitate interaction between the paired fish and to allow the higher magnification and better video quality required to observe and quantify the interaction between the two fish. The testing tank was illuminated from above with a 50-cm-long Aquarium Spectrum Fluorescent lamp (15 W), and a digital video-camera (same as Exp. 1) was set up in front of the tank. The other three sides of the tank were covered with opaque white polycarbonate sheets to increase the contrast for video recording.

The experimental fish were fed 1 h before behavioral testing. Two experimental fish were placed in the test tank at a time and were allowed to interact. Before each trial, a white opaque divider was placed in the middle of the tank, in order to split the tank into two sides and prevent the fish from seeing and interacting with each other until the trial started. The sex, size, and body characteristics of each experimental fish were recorded. We attempted to match the sizes of the interacting fish. We did not experimentally mark the fish, to avoid stress, but in order to allow identification during and after the trial, same-sex and size-matched fish were paired only if they had noticeably distinctive and unique body characteristics (e.g., pattern, pigmentation, or thickness of stripes, size/shape of fins and belly), which we could achieve for all pairs. Once selected, the two fish were put in the testing tank, separated by the divider. After a short, 3-min acclimation period, behavioral testing started when the divider was removed and the researcher left the room and allowed the fish dyad to interact freely for 30 min.

Behavioral tracking and quantification

Prior to the behavioral analysis, the recordings from the video camera were transferred and converted in the same manner as Experiment 1. These recordings were replayed and tracked using RFT and EthoVision XT 12 and event-recorded using The Observer XT 9. For RFT, the video-tracking parameters were the same as in Experiment 1, except that now two subjects were tracked. We then used Argus to quantify various behaviors in 1-min bins. Similarly, for EthoVision the parameters remained the same as in Experiment 1, except that here, too, two subjects were tracked without markers. For data analysis, the dependent variables were extracted in 1-min bins. Finally, for The Observer, a single trained investigator, blind to the experimental conditions, identified and event-recorded the frequency and duration of behaviors over the 30-min-long trials. Each video recording was replayed and event-recorded twice in order to get individual observations for each of the two fish in the dyad. The frequency and duration of the behavioral measures were extracted in 1-min bins using The Observer.

Statistical analyses

All statistical analyses were conducted using IBM SPSS Statistics 20 for Windows. Statistical significance was accepted when the probability of the null hypothesis was no more than 5% (p ≤ .05). For Experiment 1, using values for each of the 20 fish (averaged over the length of the trial), we first calculated Pearson correlation coefficients correlating the results obtained for the entire behavior session period by the three software applications (RFT vs. EthoVision, RFT vs. Argus, EthoVision vs. Argus). For this analysis, we used the mean swimming speed and intra-individual variance in swimming speed. Subsequently, we calculated Pearson correlation coefficients between pairs of data points generated by the three software applications for each 1-min interval of the 14-min-long behavioral recording session. We did this for all possible pairs of software applications (RFT vs. EthoVision, RFT vs. Argus, EthoVision vs. Argus). We also conducted separate repeated measures analyses of variance (ANOVAs) for both the means and variances of speed, with software application (two levels) and time (14 levels) as repeated measures factors.
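The per-interval correlation step can be sketched as follows (a Python illustration of the analysis logic; the article's actual computations were done in SPSS, and the data here are synthetic):

```python
import numpy as np

def per_bin_correlations(a, b):
    """Pearson r between two applications' estimates, one value per time bin.

    a, b: arrays of shape (n_fish, n_bins), e.g., mean speed per fish per
    1-min interval as extracted by two different software applications.
    """
    return np.array([np.corrcoef(a[:, t], b[:, t])[0, 1]
                     for t in range(a.shape[1])])

# Estimates that differ only by a linear calibration still correlate at r = 1,
# which is why two trackers can disagree on absolute values (a significant
# software effect in the ANOVA) yet correlate strongly.
rng = np.random.default_rng(0)
etho = rng.random((20, 14))        # hypothetical output of one tracker
argus = 0.9 * etho + 0.5           # hypothetical offset/scale difference
rs = per_bin_correlations(etho, argus)
```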

For Experiment 2, along with swimming activity, we also analyzed more complicated behavioral responses, including freezing, erratic movement, distance from bottom, and social interaction (interindividual distances and duration of proximity; see Table 2 in the Results for the complete list). We compared the same or similar variables from the two tracking applications, Argus and EthoVision. We also compared these Argus and EthoVision variables to the relevant observer event-recorded variables quantified by The Observer. For these comparisons, we computed means for each of these variables and each of the fish dyads. We averaged our data across time (average of 30 min) to simplify our analysis, and also averaged the data from each pair of fish in order to prevent possible mismatching within a dyad. Thus, we compared 36 dyads for each of the measured variables. Using the average values, we first generated Pearson correlation coefficients between the relevant comparable behavioral variables quantified by the three software applications. We also examined the relationships (i.e., correlation structures) among the multiple variables in each of our five categories of behavior (four variables for social interaction and six other variables, listed in Table 2 below) using principal component analysis. Separate Pearson product moment correlation matrices were constructed for each behavioral category, and principal components were extracted, using varimax rotation with 25 iterations and Kaiser normalization. Principal components were kept when their eigenvalues reached 1. Finally, we plotted each of the variables across five 6-min intervals in order to examine any effect of time over the 30 min of the testing period. Thus, we conducted repeated measures ANOVAs for pairs of the same or similar variables, with software application (Argus, EthoVision, or Observer) and time (five 6-min bins) as repeated measures factors.
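The component-retention rule (keep principal components with eigenvalue of at least 1, the Kaiser criterion) can be sketched as follows; this simplified Python illustration omits the varimax rotation and SPSS-specific settings used in the article, and the data are synthetic.

```python
import numpy as np

def kaiser_components(data):
    """Principal components of a correlation matrix, retaining those with
    eigenvalue >= 1.

    data: array of shape (n_dyads, n_variables). Returns the retained
    eigenvalues and the proportion of total variance each accounts for.
    """
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)[::-1]   # descending order
    kept = eigvals[eigvals >= 1.0]
    return kept, kept / len(eigvals)

# Two near-duplicate variables plus an independent one: the correlated pair
# dominates the first component (eigenvalue near 2).
rng = np.random.default_rng(1)
v1 = rng.standard_normal(100)
data = np.column_stack([v1,
                        v1 + 0.05 * rng.standard_normal(100),
                        rng.standard_normal(100)])
kept, explained = kaiser_components(data)
```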

Results

Experiment 1

Swimming speed (mean and intra-individual variance) of each of the fish from the three software applications is depicted in Fig. 2. The results suggest that the three software applications gave comparable values and matching patterns of differences across individual fish. We compared the average speeds for each fish between pairs of the three applications, and strong positive correlations confirmed a high degree of similarity. The Pearson correlation coefficients for mean swimming speed were as follows: RFT versus EthoVision (r = .940, n = 20, p < .001), RFT versus Argus (r = .967, n = 20, p < .001), and Argus versus EthoVision (r = .929, n = 20, p < .001). The correlation coefficients for variance in swimming speed were RFT versus EthoVision (r = .878, n = 20, p < .001), RFT versus Argus (r = .943, n = 20, p < .001), and Argus versus EthoVision (r = .829, n = 20, p < .001).
Fig. 2

Visual representation of (A) mean speed and (B) intra-individual variance in speed for each of the 20 fish used in Experiment 1. The data points are individual fish’s averages (over the testing period), connected with dotted lines in order to emphasize the software used for extraction and to highlight the similarity in the ability of the three methods to detect interindividual differences.

Next we examined the temporal pattern of changes in swimming (Fig. 3). First we investigated the correlations among the results generated by the three software applications for each 1-min interval separately. Overall, the Pearson correlation coefficients for individual time bins confirmed strong similarity between the values obtained by the three software applications for mean speed (.990 > r > .754; Table 1A) and moderate to strong positive correlations in variance of speed (.987 > r > .587), except for variance of swimming speed during the first time bin (Table 1B).
Fig. 3

Mean speed (A) and intra-individual temporal variance in speed (B), shown as a function of time (1-min intervals). Means ± SEMs are shown, n = 20. The three software applications employed to extract the data are indicated by the legend. Note the high degree of similarity in the temporal trajectories obtained for the three software applications.

Table 1

Pearson correlation coefficients for the comparison of the three software applications: (A) mean swimming speeds across time bins, averaged across the 20 fish, and (B) variances in swimming speed across time bins, averaged across the 20 fish

Time Bin   RealFishTracker vs. EthoVision   RealFishTracker vs. Argus   EthoVision vs. Argus

A.
  1        r = .866, p < .001               r = .955, p < .001          r = .897, p < .001
  2        r = .754, p < .001               r = .769, p < .001          r = .896, p < .001
  3        r = .924, p < .001               r = .990, p < .001          r = .940, p < .001
  4        r = .944, p < .001               r = .984, p < .001          r = .954, p < .001
  5        r = .946, p < .001               r = .989, p < .001          r = .922, p < .001
  6        r = .884, p < .001               r = .980, p < .001          r = .881, p < .001
  7        r = .872, p < .001               r = .951, p < .001          r = .859, p < .001
  8        r = .799, p < .001               r = .962, p < .001          r = .826, p < .001
  9        r = .944, p < .001               r = .933, p < .001          r = .944, p < .001
  10       r = .903, p < .001               r = .837, p < .001          r = .812, p < .001
  11       r = .940, p < .001               r = .966, p < .001          r = .973, p < .001
  12       r = .844, p < .001               r = .964, p < .001          r = .863, p < .001
  13       r = .815, p < .001               r = .944, p < .001          r = .815, p < .001
  14       r = .865, p < .001               r = .943, p < .001          r = .828, p < .001

B.
  1        r = .352, p = .127               r = .987, p < .001          r = .3988, p = .083
  2        r = .784, p < .001               r = .891, p < .001          r = .867, p < .001
  3        r = .785, p < .001               r = .983, p < .001          r = .782, p < .001
  4        r = .896, p < .001               r = .949, p < .001          r = .784, p < .001
  5        r = .875, p < .001               r = .958, p < .001          r = .907, p < .001
  6        r = .664, p = .001               r = .950, p < .001          r = .587, p = .007
  7        r = .791, p < .001               r = .945, p < .001          r = .812, p < .001
  8        r = .715, p < .001               r = .950, p < .001          r = .716, p < .001
  9        r = .877, p < .001               r = .932, p < .001          r = .807, p < .001
  10       r = .694, p = .001               r = .874, p < .001          r = .601, p = .005
  11       r = .852, p < .001               r = .940, p < .001          r = .858, p < .001
  12       r = .721, p < .001               r = .952, p < .001          r = .706, p < .001
  13       r = .703, p = .001               r = .935, p < .001          r = .738, p < .001
  14       r = .689, p = .001               r = .871, p < .001          r = .698, p = .001

Finally, we conducted separate repeated measures ANOVAs for each pair of software applications (RFT vs. EthoVision, RFT vs. Argus, and Argus vs. EthoVision) in order to evaluate the effect of software, the effect of time, and the possible interaction between these factors for both mean speed and variance of speed. For RFT versus EthoVision, the analysis of mean speed showed significant effects of time [F(13, 247) = 2.556, p = .043] and software [F(1, 19) = 188.689, p < .001], but no interaction between these two factors [F(13, 247) = 1.789, p = .138]. On the other hand, for RFT versus Argus, the speed ANOVA confirmed a significant effect of software [F(1, 19) = 17.422, p = .001], but no effect of time [F(13, 247) = 1.852, p = .121] and no interaction between time and software [F(13, 247) = 1.789, p = .138]. Similarly, for Argus versus EthoVision, the analysis of swimming speed again showed a significant effect of software [F(1, 19) = 98.480, p < .001], but no significant effect of time [F(13, 247) = 2.419, p = .054] and no Time × Software interaction [F(13, 247) = 1.553, p = .203]. Thus, for mean speed, the lack of significant Time × Software interactions in all cases indicated that although the actual values extracted from the software were significantly different, the three software applications quantified the temporal changes in the same manner.

We repeated the same analysis for variance in speed, and in the RFT versus EthoVision comparison, the ANOVA revealed a significant effect of software [F(1, 19) = 266.295, p < .001], but no effect of time [F(13, 247) = 1.235, p = .304] and no interaction between the two factors [F(13, 247) = 1.407, p = .251]. For RFT versus Argus, the ANOVA for variance in speed showed a significant effect of software [F(1, 19) = 14.523, p = .001], but no effect of time [F(13, 247) = 1.068, p = .372] and no interaction between time and software [F(13, 247) = 1.011, p = .420]. For variance in speed for Argus versus EthoVision, again the ANOVA showed a significant effect of software [F(1, 19) = 236.118, p < .001], but no significant effect of time [F(13, 247) = 1.127, p = .350] and no Time × Software interaction [F(13, 247) = 1.439, p = .243]. Once again, in the case of variance in speed, our results showed no interaction between software and time, suggesting that all of the software quantified the temporal changes similarly. However, this interpretation was hampered by the fact that no significant temporal changes were detected in this measure.

Experiment 2

Experiment 1 established a reasonable (statistically appreciable) level of similarity in how the three software applications detected interindividual differences and temporal patterns of change in swimming speed and the intra-individual variability of swimming speed. Building on this, in Experiment 2 we analyzed a more complex set of behavioral variables that zebrafish exhibited during a dyadic social encounter. In this experiment, using these variables we compared Argus, EthoVision, and The Observer. RFT allows the tracking but not the analysis of video recordings with more than one fish. From tracking data generated by RFT, Argus allowed us to quantify and extract more complex behaviors and to compare them with tracking data extracted from EthoVision. In addition, we compared the results from the two tracking methods with those from The Observer. This comparison was somewhat arbitrary because, unlike in the case of Argus and EthoVision, the definitions of the behavioral variables did not precisely match. We quantified various location- and activity-based variables and organized them into five categories, shown in Table 2. For the comparison, we attempted to match human observer-based variables that were as similar as possible in definition to the video-tracking-based variables (Blaser & Gerlai, 2006; Kalueff et al., 2013; Seguin et al., 2018; Teles et al., 2013; Teles & Oliveira, 2016). The 28 behaviors extracted by the three software applications, and how these variables match each other across the applications, are shown in Table 2.
Table 2

List of behavioral variables computed and extracted using the three software applications from the dyadic interactions in Experiment 2

Locomotor activity
  Argus: average swim speed (cm/s); variance of swim speed (cm/s)²
  EthoVision: average swim speed (cm/s); variance of swim speed (cm/s)²
  Observer: average swimming duration (s); average swimming bouts

Freezing behavior
  Argus: average freezing duration (s); average distance travelled (cm)
  EthoVision: average immobility duration (s); average mobility time (%)
  Observer: average freezing duration (s); average freezing bouts

Erratic behavior
  Argus: frequency of erratic movements; duration of direction change (s)
  EthoVision: number of rotations made; absolute turn angle (degrees)
  Observer: frequency of erratic movement; frequency of dives and jumps

Anxiety-related behavior
  Argus: average distance to bottom (cm); average time in the perimeter (s)
  EthoVision: average distance to bottom (cm); variance of distance to bottom (cm)²
  Observer: floating bouts; sinking bouts

Social behavior
  Argus: average interindividual distance (cm); average time in proximity (s)
  EthoVision: average interindividual distance (cm); average time in proximity (s)
  Observer: none

The behavioral categories are indicated in the first column, and the unit of measurement is indicated beside each variable
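All of the tracking-based variables in Table 2 derive from the xy coordinates exported by RFT. Argus itself is implemented in R; as a minimal illustration in Python (with an invented frame rate and made-up coordinates), per-frame swim speed and interindividual distance reduce to simple geometry:

```python
import math

# Hypothetical frame-by-frame tracks for two fish: lists of (x, y) in cm.
# The coordinates and the frame rate below are invented for illustration.
FPS = 15
fish1 = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (2.0, 1.0)]
fish2 = [(5.0, 0.0), (5.0, 1.0), (4.0, 1.0), (4.0, 2.0)]

def frame_speeds(track, fps):
    """Instantaneous speed (cm/s) between consecutive frames."""
    return [math.dist(a, b) * fps for a, b in zip(track, track[1:])]

def interindividual_distance(t1, t2):
    """Frame-by-frame distance (cm) between the two fish."""
    return [math.dist(a, b) for a, b in zip(t1, t2)]

speeds1 = frame_speeds(fish1, FPS)
mean_speed = sum(speeds1) / len(speeds1)
var_speed = sum((s - mean_speed) ** 2 for s in speeds1) / len(speeds1)
iid = interindividual_distance(fish1, fish2)
mean_iid = sum(iid) / len(iid)
```

Averaging these per-frame values over the recording period yields dyad-level measures such as those compared in the tables that follow.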

Correlational analyses

We calculated Pearson correlation coefficients and built matrices showing the strengths of the correlations within and between the three software applications. To simplify the interpretation of these correlation matrices, we also performed principal component analysis (PCA), a data reduction method, separately for each category of behavior in Table 2.
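As a sketch of this analysis pipeline (in Python, with invented per-dyad values), a Pearson coefficient can be computed directly from its definition. For the simplest two-variable case, the correlation matrix [[1, r], [r, 1]] has eigenvalues 1 + r and 1 − r, so the Kaiser criterion used below (retain components with eigenvalue greater than 1) keeps a single component whenever r > 0:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-dyad means from two applications (values invented).
argus_speed = [4.1, 5.0, 3.2, 6.3, 5.5, 4.8]
etho_speed  = [4.0, 5.2, 3.0, 6.0, 5.9, 4.6]

r = pearson_r(argus_speed, etho_speed)

# For two standardized variables the correlation matrix is [[1, r], [r, 1]],
# whose eigenvalues are 1 + r and 1 - r.  The Kaiser criterion retains
# components with eigenvalue > 1.
eigenvalues = [1 + r, 1 - r]
retained = [ev for ev in eigenvalues if ev > 1]
```

The actual analyses involve six variables per category; the eigendecomposition generalizes, but a statistics package would normally be used for the full PCA.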

Locomotor activity

We extracted two variables quantifying swimming activity, mean swimming speed and the variance in swimming speed, from both Argus and EthoVision. As displayed in Fig. 4, individual differences were captured similarly by the two types of tracking software for mean swimming speed, but less so for variance in swimming speed. We compared these variables with the event-recorded behavior from The Observer that quantified swimming duration and frequency, using Pearson correlation coefficients (Table 3A). Moderate positive correlations were found within each software application. Between the software applications, the coefficients confirmed that the mean swimming speed extracted from Argus correlated strongly and positively with both the mean speed and the variance in swimming speed extracted from EthoVision, but no significant correlation was detected between the Argus-extracted variance in swimming speed and any other variable in this category. We also compared the duration of swimming events recorded in The Observer and found only moderate to weak correlations between event-recorded swimming duration and the variables extracted by EthoVision and Argus; importantly, these correlations were negative. Similarly, the frequency of swimming events recorded in The Observer showed only moderately strong negative correlations with the mean speed and variance of speed obtained using EthoVision, and weak correlations with the mean speed and variance in speed extracted by Argus. The principal component analysis extracted two factors with eigenvalues greater than 1, together accounting for 76% of the total variance in the locomotor activity variables. Principal Component 1 accounted for 55% of the variance, with high loadings from both Argus and both EthoVision variables, but not from The Observer variables. Principal Component 2 accounted for 21% of the variance, with high loadings from all six variables (Table 3B).
Fig. 4

Visual representation of the (A) mean speed and (B) intra-individual variance in speed for each of the 36 dyads used in Experiment 2. The data points are individual fish's averages (over the testing period) and are connected with dotted lines to emphasize the software used for extraction and to highlight the similarity in the patterns across methods. Note the high degree of similarity in the patterns obtained for mean swimming speed, but not for the variance in swimming speed.

Table 3

Matrix of (A) Pearson correlation coefficients and (B) extraction results from the principal component analysis for the comparison of extracted variables measuring locomotor activity from the three software applications

A. Locomotor activity (n = 36 dyads)

| | Argus: Average Swim Speed (cm/s) | Argus: Variance of Swim Speed (cm/s)² | EthoVision: Average Swim Speed (cm/s) | EthoVision: Variance of Swim Speed (cm/s)² | Observer: Swimming Duration (min) | Observer: Number of Swimming Bouts |
|---|---|---|---|---|---|---|
| Argus: Average Swim Speed (cm/s) | 1 | r = .515, p = .001 | r = .901, p < .001 | r = .824, p < .001 | r = –.348, p = .038 | r = –.261, p = .124 |
| Argus: Variance of Swim Speed (cm/s)² | | 1 | r = .170, p = .323 | r = .217, p = .204 | r = .018, p = .918 | r = .031, p = .857 |
| EthoVision: Average Swim Speed (cm/s) | | | 1 | r = .917, p < .001 | r = –.440, p = .007 | r = –.404, p = .015 |
| EthoVision: Variance of Swim Speed (cm/s)² | | | | 1 | r = –.385, p = .020 | r = –.427, p = .009 |
| Observer: Swimming Duration (min) | | | | | 1 | r = .445, p = .006 |
| Observer: Number of Swimming Bouts | | | | | | 1 |

B. Principal component loadings

| | Component 1 | Component 2 |
|---|---|---|
| Argus: Average Swim Speed | .928 | –.301 |
| Argus: Variance of Swim Speed | .742 | .385 |
| EthoVision: Average Swim Speed | .747 | –.581 |
| EthoVision: Variance of Swim Speed | .742 | –.550 |
| Observer: Swimming Duration | | .753 |
| Observer: Swimming Bouts | | .778 |

Freezing behavior

We next quantified bouts of inactivity, or freezing, and compared similar variables (the duration or frequency of inactivity) from the three software applications, as presented in Table 4A. We found moderate to high degrees of similarity within each software application. Across applications, we found moderate to strong correlations between the Argus freezing duration and The Observer variables (duration and frequency of freezing), and between the Argus distance traveled and the EthoVision mobility and immobility variables. The total distance traveled extracted from Argus had only weak negative correlations with the duration and frequency of freezing as event-recorded in The Observer. For EthoVision, the mobility variable had moderate to weak negative correlations with the freezing-related variables from The Observer, but a moderately strong negative correlation with the Argus freezing duration. The duration of immobility extracted from EthoVision was not appreciably correlated with either of The Observer variables and had a moderate correlation with the Argus freezing duration. In the PCA, two factors were identified with eigenvalues greater than 1, together accounting for 88% of the total variance. Principal Component 1 accounted for 64% of the variance, with high loadings from both Argus and both EthoVision variables, but not from The Observer variables. Principal Component 2 accounted for 24% of the variance, with high loadings from all variables except the EthoVision immobility variable, as shown in Table 4B.
Table 4

Matrix of (A) Pearson correlation coefficients and (B) extraction results from the principal component analysis for the comparison of variables measuring freezing behavior (direct and indirect measures of inactivity) from the three software applications

A. Freezing behavior (n = 36 dyads)

| | Argus: Average Freezing Duration (s) | Argus: Average Distance Traveled (cm) | EthoVision: Average Immobility Duration (s) | EthoVision: Average Mobility Time (%) | Observer: Average Freezing Duration (s) | Observer: Average Freezing Bouts |
|---|---|---|---|---|---|---|
| Argus: Average Freezing Duration (s) | 1 | r = –.621, p < .001 | r = .404, p = .014 | r = –.635, p < .001 | r = .642, p < .001 | r = .643, p < .001 |
| Argus: Average Distance Traveled (cm) | | 1 | r = –.847, p < .001 | r = .879, p < .001 | r = –.352, p = .035 | r = –.429, p = .009 |
| EthoVision: Average Immobility Duration (s) | | | 1 | r = –.947, p < .001 | r = .163, p = .343 | r = .285, p = .092 |
| EthoVision: Average Mobility Time (%) | | | | 1 | r = –.396, p = .017 | r = –.457, p = .005 |
| Observer: Average Freezing Duration (s) | | | | | 1 | r = .861, p < .001 |
| Observer: Average Freezing Bouts | | | | | | 1 |

B. Principal component loadings

| | Component 1 | Component 2 |
|---|---|---|
| Argus: Average Freezing Duration | –.457 | .724 |
| Argus: Average Distance Traveled | .901 | –.289 |
| EthoVision: Average Immobility Duration | –.973 | |
| EthoVision: Total Mobility Time | .933 | –.309 |
| Observer: Average Freezing Duration | | .951 |
| Observer: Total Freezing Bouts | | .918 |
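The tracking-based freezing measures above can be approximated by a simple threshold rule on the speed trace: the fish counts as freezing when its frame-to-frame speed stays below a cutoff for some minimum duration. The following Python sketch uses invented cutoff values and is not the actual Argus or EthoVision algorithm:

```python
FPS = 15                    # invented frame rate
SPEED_CUTOFF = 0.5          # cm/s; below this the fish counts as immobile
MIN_FRAMES = 2 * FPS        # immobility must last at least 2 s to be a bout

def freezing_bouts(speeds, cutoff=SPEED_CUTOFF, min_frames=MIN_FRAMES):
    """Return (number_of_bouts, total_duration_s) from a speed trace."""
    bouts, run = [], 0
    for s in speeds + [cutoff + 1]:      # sentinel flushes a trailing run
        if s < cutoff:
            run += 1
        else:
            if run >= min_frames:
                bouts.append(run)
            run = 0
    return len(bouts), sum(bouts) / FPS

# 3 s of immobility, a burst of swimming, then 1 s of immobility (too short)
trace = [0.1] * (3 * FPS) + [4.0] * FPS + [0.2] * FPS
n_bouts, total_s = freezing_bouts(trace)
```

The minimum-duration requirement is what separates genuine freezing from momentary pauses; in Argus such thresholds are user-configurable parameters.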

Erratic behavior

We also quantified bouts of erratic movement (e.g., time spent changing direction, rotations, frequency of erratic movements, and dives/jumps) and compared these variables from the three software applications, as listed in Table 5A. We found strong positive correlations between the rotations extracted from EthoVision and the frequencies of erratic movements extracted from Argus and event-recorded in The Observer. The frequency of dives/jumps from The Observer was not correlated significantly with the Argus variables (duration of direction change and frequency of erratic movements) or with the EthoVision-extracted turn angle. All other correlations were of moderate to weak strength. The factor analysis identified only one factor with an eigenvalue greater than 1, which accounted for 55% of the total variance (Table 5B). This principal component showed high loadings from all six variables.
Table 5

Matrix of (A) Pearson correlation coefficients and (B) extraction results from the principal component analysis for the comparison of variables computing erratic behavior (fast changes in direction) from the three software applications

A. Erratic behavior (n = 36 dyads)

| | Argus: Frequency of Erratic Movements | Argus: Duration of Direction Change (s) | EthoVision: Number of Rotations Made | EthoVision: Absolute Turn Angle (degrees) | Observer: Frequency of Erratic Movements | Observer: Dives + Jumps |
|---|---|---|---|---|---|---|
| Argus: Frequency of Erratic Movements | 1 | r = –.369, p = .027 | r = .827, p < .001 | r = –.527, p < .001 | r = .567, p < .001 | r = .086, p = .616 |
| Argus: Duration of Direction Change (s) | | 1 | r = –.419, p = .011 | r = .413, p = .012 | r = –.317, p = .060 | r = –.286, p = .091 |
| EthoVision: Number of Rotations Made | | | 1 | r = –.664, p < .001 | r = .716, p < .001 | r = .348, p = .037 |
| EthoVision: Absolute Turn Angle (degrees) | | | | 1 | r = –.482, p = .003 | r = –.191, p = .265 |
| Observer: Frequency of Erratic Movements | | | | | 1 | r = .365, p = .028 |
| Observer: Dives + Jumps | | | | | | 1 |

B. Principal component loadings

| | Component 1 |
|---|---|
| Argus: Frequency of Erratic Movements | –.599 |
| Argus: Duration of Direction Change | .817 |
| EthoVision: Total Number of Rotations Made | .934 |
| EthoVision: Absolute Turn Angle | –.764 |
| Observer: Frequency of Erratic Movements | .800 |
| Observer: Total Dives + Jumps | .435 |
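Turn-based measures such as those above can be derived from the heading of successive displacement vectors. The following Python sketch counts abrupt heading changes; the angular threshold is invented and is not the actual Argus or EthoVision definition of an erratic movement:

```python
import math

# Count frames whose absolute heading change between successive
# displacement vectors exceeds a (made-up) angular threshold.
TURN_THRESHOLD = 90.0   # degrees; sharper turns count as erratic

def headings(track):
    """Heading (degrees) of each frame-to-frame displacement."""
    return [math.degrees(math.atan2(y2 - y1, x2 - x1))
            for (x1, y1), (x2, y2) in zip(track, track[1:])]

def erratic_turns(track, threshold=TURN_THRESHOLD):
    hs = headings(track)
    count = 0
    for h1, h2 in zip(hs, hs[1:]):
        turn = abs(h2 - h1)
        turn = min(turn, 360 - turn)     # wrap the difference to [0, 180]
        if turn > threshold:
            count += 1
    return count

# Straight swim followed by an abrupt reversal: one 180-degree turn
track = [(0, 0), (1, 0), (2, 0), (1, 0)]
n_erratic = erratic_turns(track)
```

Note the wrap-around step: heading differences must be folded into [0, 180] degrees, or a turn from 170° to –170° would be miscounted as a 340° change.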

Other anxiety-related behavior

We quantified additional variables that measure location-based indicators of anxiety, such as distance from the bottom, time spent in the perimeter of the arena, and frequency of floating and sinking (Table 6A). We found a strong correlation between the two variables from Argus and moderately strong correlations between these two Argus variables and the EthoVision distance to bottom. The EthoVision-extracted variance in distance to bottom was only weakly correlated with the Argus-quantified duration in perimeter, and was not significantly correlated with any other variable. The Observer variables of floating and sinking frequency were not correlated with each other. The Observer-coded frequency of floating showed weak correlations with the two Argus variables and with the EthoVision distance to bottom, whereas The Observer-coded frequency of sinking was not related to any of the variables. The PCA yielded two factors with eigenvalues greater than 1, together accounting for 70% of the total variance. Principal Component 1 accounted for 50% of the variance, with high loadings from all variables except the sinking coded in The Observer. Principal Component 2 accounted for 20% of the variance, with high loadings from all variables except the Argus duration in perimeter, as listed in Table 6B.
Table 6

Matrix of (A) Pearson correlation coefficients and (B) extraction results from the principal component analysis for the comparison of variables measuring location-based anxiety-related behaviors from the three software applications

A. Anxiety-related behavior (n = 36 dyads)

| | Argus: Average Distance to Bottom (cm) | Argus: Average Duration in Perimeter (s) | EthoVision: Average Distance to Bottom (cm) | EthoVision: Variance in Distance to Bottom (cm)² | Observer: Floating Bouts | Observer: Sinking Bouts |
|---|---|---|---|---|---|---|
| Argus: Average Distance to Bottom (cm) | 1 | r = .772, p < .001 | r = .887, p < .001 | r = .267, p = .115 | r = .409, p = .013 | r = .230, p = .177 |
| Argus: Average Duration in Perimeter (s) | | 1 | r = .603, p < .001 | r = .357, p = .033 | r = .432, p = .009 | r = .091, p = .599 |
| EthoVision: Average Distance to Bottom (cm) | | | 1 | r = .225, p = .188 | r = .371, p = .026 | r = –.322, p = .055 |
| EthoVision: Variance in Distance to Bottom (cm)² | | | | 1 | r = .139, p = .418 | r = –.268, p = .115 |
| Observer: Floating Bouts | | | | | 1 | r = .098, p = .572 |
| Observer: Sinking Bouts | | | | | | 1 |

B. Principal component loadings

| | Component 1 | Component 2 |
|---|---|---|
| Argus: Average Distance to Bottom | .881 | .315 |
| Argus: Average Duration in Perimeter | .843 | |
| EthoVision: Average Distance to Bottom | .793 | .384 |
| EthoVision: Variance in Distance to Bottom | .254 | .573 |
| Observer: Total Floating Bouts | .726 | –.281 |
| Observer: Total Sinking Bouts | | –.882 |
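The location-based measures in this category likewise reduce to simple geometry on the tank coordinates. A Python sketch, with invented tank dimensions, frame rate, and perimeter margin:

```python
FPS = 15
TANK_W, TANK_H = 50.0, 30.0    # cm (side view: x = length, y = height)
MARGIN = 5.0                   # cm band along the walls counted as perimeter

def distance_to_bottom(track):
    """Mean height above the tank bottom (cm), assuming y = 0 at the bottom."""
    return sum(y for _, y in track) / len(track)

def perimeter_time(track, fps=FPS):
    """Seconds spent within MARGIN cm of any wall."""
    near = sum(1 for x, y in track
               if x < MARGIN or x > TANK_W - MARGIN
               or y < MARGIN or y > TANK_H - MARGIN)
    return near / fps

track = [(25.0, 2.0), (25.0, 2.0), (25.0, 15.0)]   # two frames near bottom
mean_height = distance_to_bottom(track)
near_wall_s = perimeter_time(track)
```

In practice, the tank dimensions and perimeter band would be calibrated from the video, and such geometric parameters are exactly the kind of setting Argus exposes to the user.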

Social behavior

Using Pearson correlation coefficients, we compared the interindividual distance (IID) extracted from Argus and EthoVision and the duration in proximity extracted from EthoVision and The Observer (Table 7A). The coefficients showed that the Argus IID was strongly correlated with both the EthoVision IID and the EthoVision duration in proximity. The correlation between the IID and the duration in proximity within EthoVision, in contrast, was only moderately strong. The correlations between the duration in proximity recorded in The Observer and the variables extracted from EthoVision and Argus were weaker, and the EthoVision IID was not correlated with The Observer-recorded duration in proximity. The PCA generated only one factor with an eigenvalue greater than 1, which accounted for 63% of the total variance (Table 7B). This principal component had high loadings from all four variables.
Table 7

Matrix of (A) Pearson correlation coefficients and (B) extraction results from the principal component analysis for the comparison of extracted variables measuring social behaviors from the three software applications

A. Social behavior (n = 36 dyads)

| | Argus: Average Interindividual Distance (cm) | EthoVision: Average Interindividual Distance (cm) | EthoVision: Average Duration in Proximity (s) | Observer: Average Duration in Proximity (s) |
|---|---|---|---|---|
| Argus: Average Interindividual Distance (cm) | 1 | r = .746, p < .001 | r = –.711, p < .001 | r = –.374, p = .024 |
| EthoVision: Average Interindividual Distance (cm) | | 1 | r = –.418, p = .011 | r = .288, p = .089 |
| EthoVision: Average Duration in Proximity (s) | | | 1 | r = –.448, p = .006 |
| Observer: Average Duration in Proximity (s) | | | | 1 |

B. Principal component loadings

| | Component 1 |
|---|---|
| Argus: Average Interindividual Distance | .920 |
| EthoVision: Average Interindividual Distance | .790 |
| EthoVision: Average Duration in Proximity | –.824 |
| Observer: Average Duration in Proximity | –.616 |
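Duration in proximity can be scored by counting frames on which the IID falls within one body length of the partner (the criterion stated in the Fig. 9 caption). A Python sketch with invented body length, frame rate, and positions:

```python
import math

FPS = 15
BODY_LENGTH = 4.0   # cm; an assumed adult zebrafish body length

def proximity_duration(track1, track2, fps=FPS, radius=BODY_LENGTH):
    """Seconds during which the two fish are within `radius` cm of each other."""
    close = sum(1 for a, b in zip(track1, track2)
                if math.dist(a, b) <= radius)
    return close / fps

fish1 = [(0, 0), (1, 0), (2, 0)]
fish2 = [(3, 0), (8, 0), (5, 0)]   # within 4 cm on the first and third frames
secs = proximity_duration(fish1, fish2)
```

Dividing the count of "close" frames by the frame rate converts the score into the seconds-in-proximity measure compared in Table 7.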

Analysis of variance

Locomotor activity

Next, we conducted separate ANOVAs on pairs of the same or similar variables to determine whether the three software applications measured swimming speed in the same way. These comparisons are visualized in Fig. 5, and the results from the repeated measures ANOVAs are listed in the first panel of Table 8. As listed in panel 1A of Table 8, for the comparison between the Argus and EthoVision mean speeds, both software and time had significant main effects, but there was no Software × Time interaction. These results indicate that although the absolute values generated by the two tracking systems differed, both applications detected the effect of time in the same manner (cf. Fig. 5A and B). In contrast, the Software × Time interaction was significant both for the comparison of the Argus mean speed with The Observer-recorded swim duration (panel 1B of Table 8) and for the comparison of the EthoVision mean speed with The Observer-coded swim duration (panel 1C of Table 8), implying that both tracking applications yielded speed data that changed over time differently from the human-observation-based swimming durations. For the variance of speed and the human-recorded swim frequency, the ANOVAs showed no significant effect of time (panels 1D–1F of Table 8), and the Software × Time interactions were also nonsignificant.
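The Software × Time tests reported below come from 2 (software) × 5 (time bin) fully within-subjects ANOVAs. The following Python sketch, on synthetic data, shows the sums-of-squares decomposition and the degrees of freedom behind the F(4, 140) interaction tests; a statistics package would normally be used for the full analysis:

```python
import random

# Synthetic data: 36 dyads x 2 software levels x 5 time bins.
random.seed(1)
N_SUBJ, N_A, N_B = 36, 2, 5
data = [[[random.gauss(0, 1) for _ in range(N_B)] for _ in range(N_A)]
        for _ in range(N_SUBJ)]

grand = sum(v for s in data for a in s for v in a) / (N_SUBJ * N_A * N_B)

def mean_a(i):      # marginal mean of software level i
    return sum(data[s][i][j] for s in range(N_SUBJ)
               for j in range(N_B)) / (N_SUBJ * N_B)

def mean_b(j):      # marginal mean of time bin j
    return sum(data[s][i][j] for s in range(N_SUBJ)
               for i in range(N_A)) / (N_SUBJ * N_A)

def mean_ab(i, j):  # cell mean for software i at time bin j
    return sum(data[s][i][j] for s in range(N_SUBJ)) / N_SUBJ

ss_a = N_SUBJ * N_B * sum((mean_a(i) - grand) ** 2 for i in range(N_A))
ss_b = N_SUBJ * N_A * sum((mean_b(j) - grand) ** 2 for j in range(N_B))
ss_ab = N_SUBJ * sum((mean_ab(i, j) - mean_a(i) - mean_b(j) + grand) ** 2
                     for i in range(N_A) for j in range(N_B))

# Degrees of freedom for the interaction test match Table 8: F(4, 140).
df_ab = (N_A - 1) * (N_B - 1)                   # 4
df_err = (N_SUBJ - 1) * (N_A - 1) * (N_B - 1)   # 140
```

The interaction F statistic is the interaction mean square divided by the interaction-by-subject error mean square, with exactly these degrees of freedom.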
Fig. 5

Comparison of variables measuring locomotor activity extracted using the three software applications: (A) Average swim speeds extracted by Argus and EthoVision, and swimming duration recorded in The Observer. (B) Variances in swim speeds extracted from Argus and EthoVision, and swimming frequency recorded in The Observer. Means ± SEMs are shown; n = 36 dyads for all.

Table 8

Results from the separate repeated measures ANOVAs for the comparisons between the behavioral variables extracted using the three software applications

| Panel | Comparison | Software | Time | Software × Time |
|---|---|---|---|---|
| 1A | Argus Mean Speed vs. EthoVision Mean Speed | F(1, 35) = 6.407, p = .016 | F(4, 140) = 6.401, p = .01 | F(4, 140) = 0.903, p = .439 |
| 1B | Argus Mean Speed vs. Observer Swim Duration | F(1, 35) = 253.160, p < .001 | F(4, 140) = 18.058, p < .001 | F(4, 140) = 13.950, p < .001 |
| 1C | EthoVision Mean Speed vs. Observer Swim Duration | F(1, 35) = 245.855, p < .001 | F(4, 140) = 18.103, p < .001 | F(4, 140) = 13.999, p < .001 |
| 1D | Argus Speed Variance vs. EthoVision Speed Variance | F(1, 35) = 30.760, p < .001 | F(4, 140) = 0.295, p = .792 | F(4, 140) = 0.748, p = .522 |
| 1E | Argus Speed Variance vs. Observer Swim Frequency | F(1, 35) = 58.388, p < .001 | F(4, 140) = 0.521, p = .648 | F(4, 140) = 0.294, p = .805 |
| 1F | EthoVision Speed Variance vs. Observer Swim Frequency | F(1, 35) = 14.903, p < .001 | F(4, 140) = 0.763, p = .467 | F(4, 140) = 2.326, p = .109 |
| 2A | Argus Freeze Duration vs. EthoVision Immobility | F(1, 35) = 902.864, p < .001 | F(4, 140) = 8.733, p < .001 | F(4, 140) = 9.308, p < .001 |
| 2B | Argus Freeze Duration vs. Observer Freeze Duration | F(1, 35) = 147.541, p < .001 | F(4, 140) = 4.455, p < .001 | F(4, 140) = 23.831, p < .001 |
| 2C | EthoVision Immobility vs. Observer Freeze Duration | F(1, 35) = 32.293, p < .001 | F(4, 140) = 9.253, p < .001 | F(4, 140) = 20.564, p < .001 |
| 2D | Argus Distance Traveled vs. EthoVision Mobility | F(1, 35) = 408.909, p < .001 | F(4, 140) = 6.770, p < .001 | F(4, 140) = 6.497, p = .001 |
| 2E | Argus Distance Traveled vs. Observer Freezing Bouts | F(1, 35) = 400.339, p < .001 | F(4, 140) = 6.936, p = .001 | F(4, 140) = 6.353, p = .001 |
| 2F | EthoVision Mobility vs. Observer Freezing Bouts | F(1, 35) = 192.218, p < .001 | F(4, 140) = 14.500, p < .001 | F(4, 140) = 2.439, p = .087 |
| 3A | Argus Erratic Movements vs. EthoVision Rotations | F(1, 35) = 277.167, p < .001 | F(4, 140) = 1.098, p = .345 | F(4, 140) = 1.989, p = .118 |
| 3B | Argus Erratic Movements vs. Observer Erratic Movements | F(1, 35) = 3.653, p = .064 | F(4, 140) = 0.838, p = .449 | F(4, 140) = 0.982, p = .377 |
| 3C | EthoVision Rotations vs. Observer Erratic Movements | F(1, 35) = 141.096, p < .001 | F(4, 140) = 0.840, p = .470 | F(4, 140) = 2.589, p = .066 |
| 3D | Argus Direction Change vs. EthoVision Turn Angle | F(1, 35) = 5.479, p = .025 | F(4, 140) = 14.443, p < .001 | F(4, 140) = 18.344, p < .001 |
| 3E | Argus Direction Change vs. Observer Dives/Jumps | F(1, 35) = 1658.128, p < .001 | F(4, 140) = 1.167, p = .326 | F(4, 140) = 1.697, p = .173 |
| 3F | EthoVision Turn Angle vs. Observer Dives/Jumps | F(1, 35) = 434.986, p < .001 | F(4, 140) = 17.403, p < .001 | F(4, 140) = 17.965, p < .001 |
| 4A | Argus Distance to Bottom vs. EthoVision Distance to Bottom | F(1, 35) = 6.897, p = .013 | F(4, 140) = 6.581, p = .001 | F(4, 140) = 1.304, p = .277 |
| 4B | Argus Distance to Bottom vs. Observer Floating | F(1, 35) = 400.043, p < .001 | F(4, 140) = 7.978, p < .001 | F(4, 140) = 6.862, p < .001 |
| 4C | EthoVision Distance to Bottom vs. Observer Floating | F(1, 35) = 340.033, p < .001 | F(4, 140) = 6.038, p = .001 | F(4, 140) = 4.975, p = .003 |
| 4D | Argus Perimeter Time vs. EthoVision Distance to Bottom Variance | F(1, 35) = 94.707, p < .001 | F(4, 140) = 11.441, p < .001 | F(4, 140) = 6.400, p = .001 |
| 4E | Argus Perimeter Time vs. Observer Sinking | F(1, 35) = 222.589, p < .001 | F(4, 140) = 15.669, p < .001 | F(4, 140) = 6.620, p < .001 |
| 4F | EthoVision Distance to Bottom Variance vs. Observer Sinking | F(1, 35) = 291.343, p < .001 | F(4, 140) = 3.381, p = .021 | F(4, 140) = 1.726, p = .174 |
| 5A | Argus IID vs. EthoVision IID | F(1, 35) = 42.690, p < .001 | F(4, 140) = 0.985, p = .402 | F(4, 140) = 6.209, p < .001 |
| 5B | Argus IID vs. Observer Time in Proximity | F(1, 35) = 4.879, p = .034 | F(4, 140) = 2.053, p = .121 | F(4, 140) = 1.369, p = .258 |
| 5C | EthoVision IID vs. Observer Time in Proximity | F(1, 35) = 17.506, p < .001 | F(4, 140) = 0.875, p = .441 | F(4, 140) = 2.587, p = .061 |
| 5D | Argus IID vs. EthoVision Time in Proximity | F(1, 35) = 40.447, p < .001 | F(4, 140) = 5.577, p = .002 | F(4, 140) = 1.735, p = .168 |
| 5F | EthoVision Time in Proximity vs. Observer Time in Proximity | F(1, 35) = 4.237, p = .047 | F(4, 140) = 0.437, p = .711 | F(4, 140) = 4.852, p = .005 |

Statistics are shown for the main effects of software and time and for the Software × Time interaction

Freezing behavior

Repeated measures ANOVAs confirmed a significant effect of time on all variables and also showed significant Software × Time interactions for all comparisons (panels 2A–2E in Table 8), except for the comparison of EthoVision mobility with The Observer-recorded freezing bouts (panel 2F; Fig. 6C and D). Thus, for freezing behavior, only EthoVision mobility and The Observer freezing bouts captured temporal variation in a similar fashion.
Fig. 6

Comparison of variables directly and indirectly measuring freezing behavior extracted using the three software applications: (A) Durations of freezing/immobility extracted from Argus, EthoVision, and The Observer. (B) Average distance traveled extracted from Argus, percentage of time spent in mobility extracted from EthoVision, and freezing bouts recorded in The Observer. Means ± SEMs are shown; n = 36 dyads for all.

Erratic behavior

For the erratic behavior variables, the repeated measures ANOVAs showed no significant effect of time in panels 3A–3C and 3E. In panels 3D and 3F, time had a significant effect, and the Software × Time interaction was also statistically significant. Thus, the erratic movement variables in this category either showed no significant effect of time (panels 3A–3C and 3E; Fig. 7) or were affected by time differently across the compared applications (panels 3D and 3F).
Fig. 7

Comparison of variables computing erratic behavior (fast changes in direction of swimming) extracted using the three software applications: (A) Frequencies of erratic movements extracted from Argus and The Observer, and rotations tracked in EthoVision. (B) Duration of direction changing extracted from Argus, absolute turn angle extracted from EthoVision, and frequency of dives/jumps recorded in The Observer. Means ± SEMs are shown; n = 36 dyads for all.

Other anxiety-related behaviors

Repeated measures ANOVAs showed a significant effect of time but no significant Software × Time interaction for the comparison between the Argus and EthoVision distance to bottom (panel 4A of Table 8; Fig. 8A and C) and for the comparison of the EthoVision variance in distance to bottom with The Observer-recorded sinking behavior (panel 4F in Table 8; Fig. 8C and D). All other comparisons showed a significant effect of time and a significant Software × Time interaction (panels 4B–4E in Table 8), indicating that these variables did not capture variation across time in the same manner.
Fig. 8

Comparison of variables that measured location-based anxiety-related behaviors extracted using the three software applications: (A) Distance to bottom extracted from Argus and EthoVision, and floating frequency recorded in The Observer. (B) Time spent within perimeter extracted from Argus, variance in distance to bottom extracted from EthoVision, and frequency of sinking recorded in The Observer. Means ± SEMs are shown; n = 36 dyads for all.

Social behavior

Repeated measures ANOVAs detected a significant effect of time but no significant Software × Time interaction for the comparison of the Argus IID and the EthoVision proximity time (panel 5D in Table 8; Fig. 9A and B). The other comparisons showed no significant effect of time, so their Software × Time interactions could not establish whether the applications captured temporal effects in the same manner.
Fig. 9

Comparison of social-behavior-related variables extracted using the three software applications: (A) Average interindividual distances from Argus and EthoVision. (B) Average durations in proximity, extracted from EthoVision and recorded in The Observer. Proximity was defined as being within one body length of a social stimulus. Means ± SEMs are shown; n = 36 dyads for all.

Discussion

In this study, we utilized Argus, a new R-based data extraction and quantification tool developed in our laboratory. We compared the performance of Argus with that of other behavioral recording and quantification methods, using software applications often employed in behavioral research with zebrafish and other laboratory organisms. We deliberately did not manipulate behavioral performance in a robust manner, because we wanted to investigate whether subtle changes (for example, those arising from temporal effects or simply from individual differences) would be detected similarly or differently by the three software applications. In general, we found that the two video-tracking-based quantification methods (Argus, extracting results from the RFT-recorded xy coordinates, and EthoVision) yielded similar results, whereas the observation-based method using event recording with The Observer yielded different findings.

Zebrafish genetics and morphogenesis are well established, but behavioral neuroscience studies of this species are relatively new (Gerlai, 2014; Kalueff, Stewart, & Gerlai, 2014). Zebrafish behavioral research offers benefits such as the ability to build on the previously established rodent literature and the availability of genetic manipulation techniques that are unparalleled in other model organisms (Gerlai, 2012; Lawson, 2016). An important caveat, however, is that behavioral standards and optimized protocols for quantifying and interpreting behavior are still being developed (Nema et al., 2016; Thorn, Clift, Ojo, Colwill, & Creton, 2017). The problem is further compounded by the fact that zebrafish exhibit fewer clearly observable complex motor patterns and often move very fast, necessitating automated analysis over manual event recording (Creton, 2009; Mwaffo et al., 2015). These limitations are especially obvious when studying such complex and multifaceted behaviors as anxiety, learning, and social behavior in zebrafish (Blaser, Chadwick, & McGinnis, 2010; Miller & Gerlai, 2011; Pagnussat et al., 2013; Pelkowski et al., 2011).

In particular, the ability to analyze the behavior of individual fish during social interaction can be an important tool both for basic science research and for studies modeling the social deficit symptoms seen in human neuropsychological diseases (Guo, Wagle, & Mathur, 2012; Khan et al., 2017; Meshalkina et al., 2018). Although various behavioral paradigms exist, optimal tracking and interpretation of behavior are often challenging and necessitate validation and further investigation (Mwaffo et al., 2015; Nema et al., 2016). Thus, the goal of the present study was to evaluate three different software applications and compare their functionality in the quantification of individual and social behaviors relevant for disease modeling. We compared behavioral variables extracted with Argus, EthoVision, and The Observer from adult zebrafish behavior in dyadic interactions.

To examine any differences between the calculations made by the two tracking software applications, the first experiment compared the basic movement of single fish as a proof of concept. The results from Experiment 1 provided evidence that, overall, there was a high level of similarity in how the three software applications detected individual differences among the tested fish and the pattern of changes in behavior across time. Patterns revealing individual differences in fish behavior were detectable in visual representations of the data extracted from Argus and were similar to the variables extracted from EthoVision for both swimming speed and variance of swimming speed. These similarities supported our hypothesis that the calculations used in Argus yield outputs comparable to those of the widely used, commercially available EthoVision, and motivated us to test the utility of Argus during social interaction.

Thus, in Experiment 2, we tracked fish dyads and found a high level of similarity for some, but not all, of the compared variables across the three applications (Argus, EthoVision, and The Observer). Strong correlations were found between Argus and EthoVision for variables quantifying mean swimming speed, inactivity (direct and indirect measures of immobility), erratic or increased activity (erratic movements and rotations), and location within the water column (distance to bottom, floating higher in the water column, sinking into the lower half of the water column, etc.). The social variables (IID and proximity) extracted from EthoVision were also strongly correlated with the Argus-coded IID. These results support our argument that Argus is an effective tool for the behavioral analysis of dyadic interaction in zebrafish.

The development and validation of Argus is valuable for multiple reasons. First, it mimics the usability of EthoVision, which can be helpful for researchers looking for similar but more cost-effective and open-source tools to quantify behavior (Buske & Gerlai, 2014; Felix et al., 2017). Second, owing to the open-source, license-free distribution of the R environment, Argus can be run simultaneously on multiple computers without any additional cost for software licenses. Furthermore, Argus works with the Windows, Mac, and Linux operating systems. Argus variables are flexible, such that the user can define and quantify specific parameters in the GUI and control the output for those parameters. For example, the user can define the speed that marks a movement as erratic, and can select the output to be extracted in time bins averaged over 1, 5, or 30 min, and so forth. Argus also allows custom-built variables, such that any behavior can be quantified if it can be defined by a specific speed, time, directional turn, location, or a combination of these parameters. Finally, for users with some expertise in R, even more sophisticated analyses (beyond the scope of this article) are possible. For example, the user could quantify instances of thrashing followed by freezing, or identify the occurrence or frequency of a pattern of approaching, circling, and shoaling with a partner.
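The time-bin flexibility described above can be sketched as follows (a Python illustration rather than Argus's R implementation; the frame rate and bin widths are arbitrary):

```python
FPS = 15   # invented frame rate

def bin_means(values, bin_seconds, fps=FPS):
    """Mean of `values` over consecutive bins of `bin_seconds` seconds."""
    size = int(bin_seconds * fps)
    return [sum(values[i:i + size]) / len(values[i:i + size])
            for i in range(0, len(values), size)]

# Two minutes of synthetic per-frame speeds: slow, then fast.
speeds = [1.0] * (60 * FPS) + [3.0] * (60 * FPS)
per_minute = bin_means(speeds, 60)     # one mean per 1-min bin
per_two_min = bin_means(speeds, 120)   # one mean over the whole recording
```

The same per-frame trace can thus be summarized at whatever temporal resolution the analysis requires, without re-tracking the video.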

It should be noted that Argus and EthoVision outputs were not strongly correlated with similar behaviors recorded by The Observer. With the exception of strong correlation between EthoVision-extracted rotation frequency and The Observer-recorded erratic movement frequency, all other comparisons of The Observer-recorded behavior with Argus and EthoVision yielded only moderate to weak correlations. The Observer-recorded freezing (duration and frequency) and erratic movement frequency had moderately strong correlations with comparable variables extracted from Argus. However, we are unable to decipher why a greater degree of variance exists in tracked versus event-recorded variables. The Observer is an event-recording software in that a user assigns specific behavior codes on a keyboard (for example, s = swim, j = jump, b = bite, and so forth). The frequency and duration of time for a particular behavior is calculated on the basis of when a specified key is pressed during a behavioral observation (Noldus, Trienes, Hendriksen, Jansen, & Jansen, 2000). It is possible that manual coding is less accurate for behaviors whose intensity needs to be quantified. For example, a fish may be swimming for a long duration of time (a large value recorded in The Observer), whereas its actual speed is relatively modest (a small value recorded by video-tracking software applications). On the other hand, and just as likely, perhaps the weaker correlations highlight the tracking applications’ limited capacity to precisely capture complex motor patterns that can be easily perceived and identified, and thus recorded, by a human observer (Blaser & Gerlai, 2006).

Event recorders such as The Observer have many advantages, including the ability to quantify almost any behavior, to code live or at a slowed playback speed of recorded videos, and to analyze additional information captured by the latency or sequential order of behaviors. On the other hand, some of their significant disadvantages may be prohibitive, including the inability to output precise location, directional changes, or the intensity of a behavior, as well as the dependence on a human user. Manual coding may be subjective, labor-intensive, and prone to interobserver or even intra-observer unreliability (Blaser & Gerlai, 2006; Jhuang et al., 2010). Manual coding is also limited in how many subjects may be recorded or analyzed at once. For example, in Experiment 2 we had to process the same video twice with The Observer software, in order to focus on only one fish at a time. Recording both fish at once would have been a more challenging task for the human observer and would potentially have increased errors (missed occurrences of behaviors) or decreased efficiency (slower video playback). In contrast, tracking software can process more than one subject at a time with precision. We used both EthoVision and Argus to track two fish simultaneously, and the RealFishTracker (RFT) can reliably track multiple fish (up to ten, as in Buske & Gerlai, 2011, 2012; Mahabir et al., 2013). Although we have examined only dyadic interactions in this article, Argus can analyze social and nonsocial interaction-based data with multiple fish (the same script works for shoals of up to ten fish). Thus, refining automated video-tracking-based methods is likely to make an important contribution to zebrafish behavioral research in particular. Furthermore, such analyses may be extendable to the investigation of the movement and/or interactions of individuals, particles, or objects in general.
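For instance, the interindividual distance measure generalizes naturally from a dyad to a shoal. A minimal sketch (hypothetical function, positions assumed to be in cm) is the mean pairwise distance per frame, which is the same computation whether each frame holds two fish or ten:

```python
import math
from itertools import combinations

def interindividual_distance(frames):
    """Mean pairwise distance per frame for any number of tracked fish.

    `frames` is a list of frames, each a list of (x, y) positions,
    one per fish; a dyad is simply the two-fish case.
    """
    out = []
    for positions in frames:
        pairs = list(combinations(positions, 2))
        out.append(sum(math.dist(a, b) for a, b in pairs) / len(pairs))
    return out
```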

Refining video-tracking-based methods will require further empirical studies that, for example, explore zebrafish behavior in numerous different contexts. If one is interested in zebrafish social behavior, for instance, one could investigate numerous aspects of courtship, shoaling, and aggression. Our selection and organization of behaviors was relevant for the nonaggressive social interaction of an unfamiliar dyad in a small tank, and the variables we chose to examine may not fully extrapolate to other behavioral contexts. Comparative analyses of behavior quantification methods using different software applications are needed to fully understand the similarities and differences between these methods. In summary, improved automated tracking and analysis tools are needed to identify and quantify the variety of complex behavior patterns that may emerge in different contexts. Argus has the potential to be such a tool, particularly given its low-cost, open-source distribution within the community-based R network. Although these features of Argus are not limited to zebrafish research, we suspect they will make zebrafish behavioral research even more attractive and practical.

Author note

The idea and conceptualization of this article were by S.S. and R.G. The Argus algorithms and code were written by M.S. Experimental implementation and validation were performed by S.S. and S.A., and data manipulation and analysis were performed by S.S. and S.A. The original draft was written by S.S. and edited and reviewed by S.S., S.A., and R.G. The authors are thankful to Diane Seguin, Aneesa Khan, and Aysha Khan for their technical assistance, and to the vivarium staff and volunteers at the University of Toronto Mississauga for zebrafish husbandry. We are also grateful to James McCrae for developing TheRealFishTracker (available at www.dgp.toronto.edu/~mccrae/projects/FishTracker/). This work was supported by a grant from NSERC (311637) to R.G. and by an NSERC-USRA grant (345211) to M.S.

Supplementary material

ESM 1: 13428_2018_1083_MOESM1_ESM.pdf (PDF 959 kb)

References

  1. Blaser, R., & Gerlai, R. (2006). Behavioral phenotyping in zebrafish: Comparison of three behavioral quantification methods. Behavior Research Methods, 38, 456–469. https://doi.org/10.3758/BF03192800
  2. Blaser, R. E., Chadwick, L., & McGinnis, G. C. (2010). Behavioral measures of anxiety in zebrafish (Danio rerio). Behavioural Brain Research, 208, 56–62. https://doi.org/10.1016/j.bbr.2009.11.009
  3. Breacker, C., Barber, I., Norton, W. H., McDearmid, J. R., & Tilley, C. A. (2017). A low-cost method of skin swabbing for the collection of DNA samples from small laboratory fish. Zebrafish, 14, 35–41. https://doi.org/10.1089/zeb.2016.1348
  4. Buske, C., & Gerlai, R. (2011). Shoaling develops with age in zebrafish (Danio rerio). Progress in Neuro-Psychopharmacology & Biological Psychiatry, 35, 1409–1415. https://doi.org/10.1016/j.pnpbp.2010.09.003
  5. Buske, C., & Gerlai, R. (2012). Maturation of shoaling behavior is accompanied by changes in the dopaminergic and serotoninergic systems in zebrafish. Developmental Psychobiology, 54, 28–35. https://doi.org/10.1002/dev.20571
  6. Buske, C., & Gerlai, R. (2014). Diving deeper into zebrafish development of social behavior: Analyzing high resolution data. Journal of Neuroscience Methods, 234, 66–72. https://doi.org/10.1016/j.jneumeth.2014.06.019
  7. Carter, B. S., Cortes-Campos, C., Chen, X., McCammon, J. M., & Sive, H. L. (2017). Validation of protein knockout in mutant zebrafish lines using in vitro translation assays. Zebrafish, 14, 73–76. https://doi.org/10.1089/zeb.2016.1326
  8. Creton, R. (2009). Automated analysis of behavior in zebrafish larvae. Behavioural Brain Research, 203, 127–136. https://doi.org/10.1016/j.bbr.2009.04.030
  9. Estepa, A., & Coll, J. (2015). Innate multigene family memories are implicated in the viral-survivor zebrafish phenotype. PLoS ONE, 10, e0135483. https://doi.org/10.1371/journal.pone.0135483
  10. Felix, L. M., Antunes, L. M., Coimbra, A. M., & Valentim, A. M. (2017). Behavioral alterations of zebrafish larvae after early embryonic exposure to ketamine. Psychopharmacology, 234, 549–558. https://doi.org/10.1007/s00213-016-4491-7
  11. Field, H. A., Kelley, K. A., Martell, L., Goldstein, A. M., & Serluca, F. C. (2009). Analysis of gastrointestinal physiology using a novel intestinal transit assay in zebrafish. Neurogastroenterology & Motility, 21, 304–312. https://doi.org/10.1111/j.1365-2982.2008.01234.x
  12. Gerlai, R. (2010). High-throughput behavioral screens: The first step toward finding genes involved in vertebrate brain function using zebrafish. Molecules, 15, 2609–2622. https://doi.org/10.3390/molecules15042609
  13. Gerlai, R. (2012). Using zebrafish to unravel the genetics of complex brain disorders. Current Topics in Behavioral Neuroscience, 12, 3–24. https://doi.org/10.1007/7854_2011_180
  14. Gerlai, R. (2014). Fish in behavior research: Unique tools with a great promise! Journal of Neuroscience Methods, 234, 54–58. https://doi.org/10.1016/j.jneumeth.2014.04.015
  15. Gerlai, R. (2015). Zebrafish phenomics: Behavioral screens and phenotyping of mutagenized fish. Current Opinion in Behavioral Sciences, 2, 21–27. https://doi.org/10.1016/j.cobeha.2014.07.007
  16. Guo, S., Wagle, M., & Mathur, P. (2012). Toward molecular genetic dissection of neural circuits for emotional and motivational behaviors. Developmental Neurobiology, 72, 358–365. https://doi.org/10.1002/dneu.20927
  17. Jhuang, H., Garrote, E., Mutch, J., Yu, X., Khilnani, V., Poggio, T., … Serre, T. (2010). Automated home-cage behavioural phenotyping of mice. Nature Communications, 1, 68. https://doi.org/10.1038/ncomms1064
  18. Kalueff, A. V., Gebhardt, M., Stewart, A. M., Cachat, J. M., Brimmer, M., Chawla, J. S., … Schneider, H. (2013). Toward a comprehensive catalog of zebrafish behavior 1.0 and beyond. Zebrafish, 10, 70–86. https://doi.org/10.1089/zeb.2012.0861
  19. Kalueff, A. V., Stewart, A. M., & Gerlai, R. (2014). Zebrafish as an emerging model for studying complex brain disorders. Trends in Pharmacological Sciences, 35, 63–75. https://doi.org/10.1016/j.tips.2013.12.002
  20. Khan, K. M., Collier, A. D., Meshalkina, D. A., Kysil, E. V., Khatsko, S. L., Kolesnikova, T., … Echevarria, D. J. (2017). Zebrafish models in neuropsychopharmacology and CNS drug discovery. British Journal of Pharmacology, 174, 1925–1944. https://doi.org/10.1111/bph.13754
  21. Ladu, F., Butail, S., Macri, S., & Porfiri, M. (2014). Sociality modulates the effects of ethanol in zebrafish. Alcoholism: Clinical and Experimental Research, 38, 2096–2104. https://doi.org/10.1111/acer.12432
  22. Lawson, N. D. (2016). Reverse genetics in zebrafish: Mutants, morphants, and moving forward. Trends in Cell Biology, 26, 77–79. https://doi.org/10.1016/j.tcb.2015.11.005
  23. Lin, E., Craig, C., Lamothe, M., Sarunic, M. V., Beg, M. F., & Tibbits, G. F. (2015). Construction and use of a zebrafish heart voltage and calcium optical mapping system, with integrated electrocardiogram and programmable electrical stimulation. American Journal of Physiology: Regulatory, Integrative and Comparative Physiology, 308, R755–R768. https://doi.org/10.1152/ajpregu.00001.2015
  24. Mahabir, S., Chatterjee, D., Buske, C., & Gerlai, R. (2013). Maturation of shoaling in two zebrafish strains: A behavioral and neurochemical analysis. Behavioural Brain Research, 247, 1–8. https://doi.org/10.1016/j.bbr.2013.03.013
  25. Makhankov, Y. V., Rinner, O., & Neuhauss, S. C. (2004). An inexpensive device for non-invasive electroretinography in small aquatic vertebrates. Journal of Neuroscience Methods, 135, 205–210. https://doi.org/10.1016/j.jneumeth.2003.12.015
  26. Mathur, P., & Guo, S. (2010). Use of zebrafish as a model to understand mechanisms of addiction and complex neurobehavioral phenotypes. Neurobiology of Disease, 40, 66–72. https://doi.org/10.1016/j.nbd.2010.05.016
  27. Meshalkina, D. A., Kizlyk, M. N., Kysil, E. V., Collier, A. D., Echevarria, D. J., Abreu, M. S., … Kalueff, A. V. (2018). Zebrafish models of autism spectrum disorder. Experimental Neurology, 299, 207–216. https://doi.org/10.1016/j.expneurol.2017.02.004
  28. Miller, N., & Gerlai, R. (2007). Quantification of shoaling behaviour in zebrafish (Danio rerio). Behavioural Brain Research, 184, 157–166. https://doi.org/10.1016/j.bbr.2007.07.007
  29. Miller, N., & Gerlai, R. (2012). From schooling to shoaling: Patterns of collective motion in zebrafish (Danio rerio). PLoS ONE, 7, e48865. https://doi.org/10.1371/journal.pone.0048865
  30. Miller, N. Y., & Gerlai, R. (2011). Shoaling in zebrafish: What we don’t know. Reviews in the Neurosciences, 22, 17–25. https://doi.org/10.1515/rns.2011.004
  31. Moretz, J. A., Martins, E. P., & Robison, B. D. (2007). The effects of early and adult social environment on zebrafish (Danio rerio) behavior. Environmental Biology of Fishes, 80, 91–101. https://doi.org/10.1007/s10641-006-9122-4
  32. Mwaffo, V., Butail, S., di Bernardo, M., & Porfiri, M. (2015). Measuring zebrafish turning rate. Zebrafish, 12, 250–254. https://doi.org/10.1089/zeb.2015.1081
  33. Nema, S., Hasan, W., Bhargava, A., & Bhargava, Y. (2016). A novel method for automated tracking and quantification of adult zebrafish behaviour during anxiety. Journal of Neuroscience Methods, 271, 65–75. https://doi.org/10.1016/j.jneumeth.2016.07.004
  34. Nilsen, B. M., Berg, K., Eidem, J. K., Kristiansen, S. I., Brion, F., Porcher, J. M., & Goksoyr, A. (2004). Development of quantitative vitellogenin-ELISAs for fish test species used in endocrine disruptor screening. Analytical and Bioanalytical Chemistry, 378, 621–633. https://doi.org/10.1007/s00216-003-2241-2
  35. Noldus, L. P. J. J., Trienes, R. J. H., Hendriksen, A. H. M., Jansen, H., & Jansen, R. G. (2000). The Observer Video-Pro: New software for the collection, management, and presentation of time-structured data from videotapes and digital media files. Behavior Research Methods, Instruments, & Computers, 32, 197–206. https://doi.org/10.3758/BF03200802
  36. Norton, W. (2013). Toward developmental models of psychiatric disorders in zebrafish. Frontiers in Neural Circuits, 7, 79. https://doi.org/10.3389/fncir.2013.00079
  37. Pagnussat, N., Piato, A. L., Schaefer, I. C., Blank, M., Tamborski, A. R., Guerim, L. D., … Lara, D. R. (2013). One for all and all for one: The importance of shoaling on behavioral and stress responses in zebrafish. Zebrafish, 10, 338–342. https://doi.org/10.1089/zeb.2013.0867
  38. Pelkowski, S. D., Kapoor, M., Richendrfer, H. A., Wang, X., Colwill, R. M., & Creton, R. (2011). A novel high-throughput imaging system for automated analyses of avoidance behavior in zebrafish larvae. Behavioural Brain Research, 223, 135–144. https://doi.org/10.1016/j.bbr.2011.04.033
  39. Prykhozhij, S. V., Steele, S. L., Razaghi, B., & Berman, J. N. (2017). A rapid and effective method for screening, sequencing and reporter verification of engineered frameshift mutations in zebrafish. Disease Models & Mechanisms, 10, 811–822. https://doi.org/10.1242/dmm.026765
  40. Qin, M., Wong, A., Seguin, D., & Gerlai, R. (2014). Induction of social behavior in zebrafish: Live versus computer animated fish as stimuli. Zebrafish, 11, 185–197. https://doi.org/10.1089/zeb.2013.0969
  41. Saif, M., Chatterjee, D., Buske, C., & Gerlai, R. (2013). Sight of conspecific images induces changes in neurochemistry in zebrafish. Behavioural Brain Research, 243, 294–299. https://doi.org/10.1016/j.bbr.2013.01.020
  42. Saverino, C., & Gerlai, R. (2008). The social zebrafish: Behavioral responses to conspecific, heterospecific, and computer animated fish. Behavioural Brain Research, 191, 77–87. https://doi.org/10.1016/j.bbr.2008.03.013
  43. Schaefer, I. C., Siebel, A. M., Piato, A. L., Bonan, C. D., Vianna, M. R., & Lara, D. R. (2015). The side-by-side exploratory test: A simple automated protocol for the evaluation of adult zebrafish behavior simultaneously with social interaction. Behavioural Pharmacology, 26, 691–696. https://doi.org/10.1097/fbp.0000000000000145
  44. Schroeder, P., Jones, S., Young, I. S., & Sneddon, L. U. (2014). What do zebrafish want? Impact of social grouping, dominance and gender on preference for enrichment. Lab Animal, 48, 328–337. https://doi.org/10.1177/0023677214538239
  45. Seguin, D., & Gerlai, R. (2017). Zebrafish prefer larger to smaller shoals: Analysis of quantity estimation in a genetically tractable model organism. Animal Cognition, 20, 813–821. https://doi.org/10.1007/s10071-017-1102-x
  46. Seguin, D. (2018). Effects of early embryonic ethanol exposure on adult zebrafish social behavior (PhD dissertation). University of Toronto, Toronto.
  47. Shams, S., Amlani, S., Buske, C., Chatterjee, D., & Gerlai, R. (2018). Developmental social isolation affects adult behavior, social interaction, and dopamine metabolite levels in zebrafish. Developmental Psychobiology, 60, 43–56. https://doi.org/10.1002/dev.21581
  48. Shams, S., & Gerlai, R. (2016). Pharmacological manipulation of shoaling behavior in zebrafish. Current Psychopharmacology, 5, 180–193. https://doi.org/10.2174/2211556005666160607094906
  49. Sison, M., Cawker, J., Buske, C., & Gerlai, R. (2006). Fishing for genes influencing vertebrate behavior: Zebrafish making headway. Lab Animal, 35, 33–39. https://doi.org/10.1038/laban0506-33
  50. Teles, M. C., Dahlbom, S. J., Winberg, S., & Oliveira, R. F. (2013). Social modulation of brain monoamine levels in zebrafish. Behavioural Brain Research, 253, 17–24. https://doi.org/10.1016/j.bbr.2013.07.012
  51. Teles, M. C., & Oliveira, R. F. (2016). Quantifying aggressive behavior in zebrafish. Methods in Molecular Biology, 1451, 293–305. https://doi.org/10.1007/978-1-4939-3771-4_20
  52. Thorn, R. J., Clift, D. E., Ojo, O., Colwill, R. M., & Creton, R. (2017). The loss and recovery of vertebrate vision examined in microplates. PLoS ONE, 12, e0183414. https://doi.org/10.1371/journal.pone.0183414
  53. Wang, J., Zhang, X., Shan, R., Ma, S., Tian, H., Wang, W., & Ru, S. (2016). Lipovitellin as an antigen to improve the precision of sandwich ELISA for quantifying zebrafish (Danio rerio) vitellogenin. Comparative Biochemistry and Physiology Part C: Toxicology & Pharmacology, 185–186, 87–93. https://doi.org/10.1016/j.cbpc.2016.03.007
  54. Wright, D., Ward, A. J., Croft, D. P., & Krause, J. (2006). Social organization, grouping, and domestication in fish. Zebrafish, 3, 141–155. https://doi.org/10.1089/zeb.2006.3.141

Copyright information

© Psychonomic Society, Inc. 2018

Authors and Affiliations

  1. Department of Cell & Systems Biology, University of Toronto Mississauga, Mississauga, Canada
  2. Department of Psychology, University of Toronto Mississauga, Mississauga, Canada
  3. Department of Computer Science, University of Montreal, Montreal, Canada
