
1 Introduction

Translating scientific results and recommendations about natural hazards and disasters into language easily understood by non-experts is a challenging task in the best of circumstances. During an actual natural hazard event, stress levels are high and considerable pressure is put on scientists and emergency managers to communicate a wide variety of information to each other and to many different stakeholders (Alexander 2007; Barclay et al. 2008; Haynes et al. 2008; IAVCEI Task Group on Crisis Protocols 2016; Rovins et al. 2015, p. 56).

Many practicing scientists receive no formal training in science communication (including communication with the public and with the media) (MORI and The Wellcome Trust 2001; The Royal Society 2006) or public engagement (Miller and Fahy 2009). Embedded science communication training within undergraduate degree programmes is also uncommon; specific degrees, minors, or postgraduate degrees are offered by a relatively small number of institutions, predominantly within Europe (Trench and Miller 2012). Dedicated science and risk communication training for undergraduates therefore provides a valuable opportunity to instil in the next generation of natural hazard scientists and emergency managers communication strategies and skills which, if informed by established best practices, will help them better serve a society that faces increasing risks from natural and manmade hazards.

This chapter describes a case study about an interactive, challenging role-play designed to train students how to forecast volcanic eruptions, manage the impacts from these eruptions, and communicate with the public throughout the simulated crisis. The chapter also introduces the reader to the foundations of instructional communication, education, and risk and crisis communication research and demonstrates how to evaluate communication training pedagogy with an evidence-based approach.

We argue that role-play challenges students and provides them with practical experience that they can utilise in their careers. It also improves learners' confidence in their ability to communicate and improves their overall perceptions of risk and crisis communication best practice. We believe the success of the role-play lies in the explicit practice of authentic communication tasks in a feedback-rich environment, and we hope this will encourage instructors to incorporate more authentic tasks into their curricula. We invite our readers to use and adapt this curriculum in classrooms at all levels of formal and informal education.

1.1 Why Is Volcanic Risk Communication Training Important?

Research on risk and crisis communication has a long history and spans multiple disciplines. The corporate crisis communication and public relations (e.g., Grunig and Repper 1992; Crane and Livesey 2003), health risk and crisis communication (e.g., Reynolds and Shenhar 2016), and broader risk communication (e.g., Morgan et al. 2002; Glik 2007) communities have all explored the strategies, philosophies and evaluation of these communications and how differing approaches may influence their success.

In general, these communities have advised that we should move away from the old, linear, ‘transmission’ form of communication (i.e., ‘source’ to ‘receiver’, or the Shannon-Weaver model of communication) towards a participatory approach that works with communities to establish a dialogue (e.g., Fisher 1991; Fischhoff 1995), supports diversity in the needs of the audience (McCroskey 2006), and preferably takes place in an unofficial and relaxed setting that helps to build trust between scientists and the public (Haynes et al. 2007). The Sendai Framework for Disaster Risk Reduction supports this approach, encouraging the sectors of society (i.e., public, private and academic sectors) to work together in a ‘people-centred’ approach to DRR (United Nations International Strategy for Disaster Risk Reduction 2015, see point 7, p. 10). This shift is important for the delivery of risk and crisis communications and highlights the importance of knowing, understanding and connecting with your audience.

Volcanologists play a major role in the dialogue that occurs in the long-term and short-term communication of volcanic risk. Pielke (2007) provides an excellent overview of the particular roles that experts may choose to take when science has the potential to impact policy, politics and the public. He proposes that experts (e.g., medical practitioners, engineers or scientists) can act as an ‘honest broker’ by providing clear options to the person(s) at risk and articulating the specific outcomes, while simultaneously accounting for uncertainties and incorporating the most up-to-date scientific understanding of the topic at hand.

Essentially, it is our job as scientists to provide clear information to the public on the potential risks that they face from volcanoes. However, as stated above, scientists are rarely trained in communication, so the pathways and strategies for achieving this aim are less well known. Additionally, there have been very few initiatives that blend volcanology, risk communication, and education, even though all of these research areas have much to offer to the teaching of communication in the sciences. This research hopes to bridge this gap and describes a research-informed curriculum that can be used to train future volcanologists in the best practices of volcanic risk and crisis communication.

1.2 Instructional Communication Research

Communication is one of the most commonly mentioned graduate attributes for undergraduate degrees and is also core to the geology profession (Heath 2000; Jones et al. 2010). A quick sample of several universities' graduate attribute profiles will show you that communication, in some defined form, is almost always present. Communication was a main focus (i.e., was among the primary goals and outcomes) in all of the courses (see Sect. 2) that featured the role-play, so as part of our efforts to include authentic communication training we undertook a review of instructional communication (i.e., the teaching of communication skills). Here, we share some of what the research community tells us about teaching communication.

Firstly, there is a wealth of studies that advocate for the benefits of learners undergoing some form of communication education. Morreale and Pearson (2008) state that effective communication skills are needed across many disciplines (e.g., sciences, business, engineering or architecture) and help graduates to succeed in a range of careers. Morreale and Pearson (2008) also state that communication training encourages globally, socially and culturally aware citizenship, allowing our society to make better decisions in areas of global significance such as health and medicine, crisis management, and policing.

Secondly, effective communication does not come about by simply practicing a speech in front of a mirror. A recent study by Engleberg et al. (2016) compiled the core competencies of communication to assist in building a standardised introductory course in instructional communication. The seven core competencies (listed here, taken directly from Engleberg et al. 2016) show the reader the diversity of skills needed to be an effective communicator:

  1. Monitoring and Presenting Your Self (i.e., the ability to monitor and present yourself to others within and across a variety of communication contexts);

  2. Practicing Communication Ethics (i.e., the ability to identify, evaluate, and demonstrate appropriate ethical behaviour within and across a variety of communication contexts);

  3. Adapting to Others (i.e., the ability to understand, respect, and adapt messages to a diversity of human characteristics and attitudes in order to accomplish a communication goal within and across a variety of communication contexts);

  4. Practicing Effective Listening (i.e., the ability to listen effectively and respond appropriately to the meaning of messages within and across a variety of communication contexts);

  5. Expressing Messages (i.e., the ability to select, demonstrate, and adapt appropriate forms of verbal, nonverbal, and mediated expression that support and enhance the meaning of messages within and across a variety of communication contexts);

  6. Identifying and Explaining Fundamental Communication Processes (i.e., the ability to identify and explain how specific communication processes influence the outcome of communication interactions within and across a variety of communication contexts);

  7. Creating and Analysing Message Strategies (i.e., the ability to create and analyse message strategies that generate meaning within and across a variety of communication contexts).

Thirdly, the measurement and assessment of communication competence differs from most learning in the sciences and other disciplines. Communication is a skill that is highly contextualised (see above), and success lies in the mind of the receiver(s)/audience(s), which makes it inherently difficult to judge with objective consistency. Determining whether a learner has shown excellence in communication requires observation of the student's performance across a range of situations and contexts. Though these competencies may seem difficult to assess, communication researchers have developed a series of measures that aim to capture some of the many dimensions of communication competency.

Our research aimed to characterise and measure students' confidence and perceptions of volcanic crisis communication and to determine whether the role-play was effective at improving these qualities. This study occurred at the beginning of a longitudinal programme that is exploring a working model of communication characterised by several dimensions that impact an individual's communication performance: communication confidence (discussed here), perceptions of science/crisis communication (discussed here), previous experiences with communication, and content knowledge (i.e., expertise in the topic that is being communicated).

Confidence in one's ability to communicate competently relies on having the knowledge, skills and motivation to communicate (Rubin and Morreale 1996). The knowledge to communicate competently requires learners to select the appropriate information and strategy for the right situation, while the skills lie in being able to execute these strategies (Kreps and Query 1990). The motivation to communicate arises from learners choosing to engage after weighing several internal and external factors (e.g., grade incentives; Fortney et al. 2001). Courses in public speaking have been shown to increase students' confidence in communicating (Miller 1987; Richmond et al. 1989; Rubin et al. 1997; Ellis 1995). It is worth noting that confidence does not directly translate to effective performance and that overconfidence (e.g., Kruger and Dunning 2009) and compulsive communication (Fortney et al. 2001) can be detrimental to learning and communication. Communication confidence was measured by asking students to self-report their perceived competency to communicate to different receivers and in different contexts (described in Sect. 2).

Another important construct in our study was students' perceptions of risk and crisis communication best practice. Perceptions are a set of attitudes or beliefs that an individual holds and that guide their behaviour. McCroskey (2006) proposes that there are three elements to building communication skills: desire, understanding, and experience. Understanding communication involves knowledge and awareness of the multitude of considerations and strategies that you can employ when crafting and delivering a message. A perceptions survey allows you to check for alignment between the views of the students and the views of professionals.

Our curriculum focussed on teaching students volcanic crisis communication, and so their perceptions were measured by asking students whether they agreed with a series of statements concerning best practice under these circumstances (described in Sect. 2). It should be noted, though, that holding a 'correct' perception does not mean that you will (a) execute the strategy effectively, or (b) decide to use the strategy when the opportunity arises. Holding expert-like perceptions is only one part of the tool kit for becoming an effective communicator.

1.3 Educational Research

Educational research is critical for the development and evaluation of curricula. As our understanding of ‘how we learn’ becomes more sophisticated, the strategies we use in the classroom allow for more effective learning experiences than traditional, stand-and-deliver teaching. At present, we feel that rigorous education research is an underutilised resource at all levels of volcanology education including formal and informal educational settings.

In practice, curriculum development is often content-driven rather than learning outcome-driven (i.e., it focuses on specific aspects of volcanism to cover, rather than on the skills and knowledge that an instructor hopes the students will gain from learning about volcanoes). Additionally, curriculum development is often undertaken by academics or secondary school educators who may not be aware of applied volcanology and emergency management practices. Consequently, the lessons that are developed may be theory-focussed (rather than skills-focussed) and lack the authentic challenges that accompany volcanic crises.

Authentic learning focuses on real-world, complex problems and their solutions, taught within authentic environments through activity and social interaction (Herrington and Herrington 2006; Lombardi 2007; Herrington et al. 2014). Authentic learning seeks to replicate real-world practices in the classroom, including the environment, roles, and responsibilities of professionals. Role-play is one of many examples of authentic learning; others include simulation, mentoring, debate, case studies, coaching, and reflection (e.g., Brown et al. 1989). Authentic learning offers an opportunity for students to explore communication in its fullest complexity, leading to a more fitting assessment of their communication skills.

The effectiveness of role-play and simulation for learning has been reported in a number of studies (e.g., DeNeve and Heppner 1997; van Ments 1999). Simulation is defined as a learning experience that occurs within an imaginary or virtual system or world (van Ments 1999), while 'role-play' emphasises the importance and interactivity of roles in pre-defined scenarios (Errington 1997, 2011). Simulation and role-play require more active participation from students than lecture-based teaching techniques and intend to teach practical and theoretical skills that are transferable to different future situations (Roth and Roychoudhury 1993; Lunce 2006). Research shows that role-play and simulation improve student attitudes towards learning (DeNeve and Heppner 1997; van Ments 1999; Shearer and Davidhizar 2003) and interpersonal interactions (Blake 1987; van Ments 1999; Shearer and Davidhizar 2003); generic transferable skills such as problem-solving and decision-making (Errington 1997; Barclay et al. 2011), communication (Bales 1976; van Ments 1999; Hales and Cashman 2008) and teamwork (Maddrell 1994; Harpp and Sweeney 2002); as well as discipline-specific knowledge (DeNeve and Heppner 1997; Livingstone 1999) and volcanic eruption forecasting skills (Harpp and Sweeney 2002; Hales and Cashman 2008).

1.4 Risk and Crisis Communication Best Practices

In order to teach students how to communicate about volcanic risk, we must first understand how experts communicate before, during and after volcanic events. The communication of science (more generally) can take on a multitude of formats, styles, objectives, and outcomes. Burns et al. (2003) defined science communication as the “… use of appropriate skills, media, activities, and dialogue to produce one or more of the following personal responses to science: awareness, enjoyment, interest, opinions, and understanding of science (i.e., its content, processes and social factors)”. Volcanic risk and crisis communication may include science communication that can be used to educate and promote risk-reducing behaviours to the public (Barclay et al. 2008).

We differentiate between risk and crisis communication using criteria laid out by Reynolds and Seeger (2005; Table 1): risk communication uses messages that focus on reducing the consequences of a known threat (i.e., risk is based on projections and long-term forecasts), occurs prior to an event in frequent or routine communication campaigns, and relies on technical experts and scientists to deliver the message; crisis communication uses messages that focus on information regarding a disruptive event, occurs immediately following and in response to an event, and relies on authority figures and technical experts to deliver the message. Reynolds and Seeger (2005) promote an integrated model in which the scientific community can view communication as part of an ever-evolving cycle around risk factors that must adapt and match the situation and context. This allows communicators to approach both risk and crisis communication with a set of tools (i.e., best practices) that must be carefully selected to suit the context and needs of the audience. We welcome this way of thinking, and seek to undertake communication training of students and practitioners within this framework.

Table 1 Study participants demographics

For the purposes of teaching, we wanted a concise set of best practices that incorporated scholarly work but was comprehensible to our students, allowing them to pick it up in the short time frame allocated by our curriculum. A colleague at the University of Otago developed a distinct set of rules for risk and science communication, derived from research on media coverage of the Canterbury earthquake sequence, which she called the 7Cs (taken from Bryner 2012; ideas influenced by the 10Cs of Weingart et al. 2000; Miller 2008). These best practices were explicitly given to students prior to participating in the role-play, formed part of the theoretical foundation for the perceptions survey used in this study, and are described further in Sect. 2. The 7Cs say that risk and science communication should be:

comprehensible (i.e., simple, jargon-free, clear and concise),

contextualised (i.e., acknowledges and reflects diversity of your audience),

captivating (i.e., entertaining, engaging, salient, and relevant to everyday life),

credible (i.e., open, does not overpromise, acknowledges uncertainty),

consistent (i.e., backed by evidence, confirmable, coordinated and collaborated sources of information),

courteous (i.e., compassionate, empathetic and respectful), and

addresses concerns (i.e., empowers action and response, forms a dialogue).

We hope that the literature provided in the above sections has helped to demonstrate that the communication and education research communities have much to offer to the teaching of communication skills to volcanology and hazard and disaster management students. These fields provide the underlying framework and foundation (i.e., the stage and theatre) in which the volcanologists and emergency managers (i.e., the characters) work through a crisis (i.e., the narrative) and avoid a potential disaster (i.e., the climax) in a role-play. To exemplify these theories in practice, we share a pilot study of an authentic role-play training exercise that specifically aimed to improve university-level students' communication skills during a mock volcanic crisis (described in detail below).

1.5 The Volcanic Hazards Simulation

1.5.1 Design and Development of the Volcanic Hazards Simulation Role-Play

Training exercises have long been used in the emergency management community to simulate real-world crises in order to upskill practitioners (Borodzicz and van Haperen 2002). We partnered with experts in the field (e.g., volcanologists, emergency managers and decision-makers) through action research and interviews to develop an authentic role-play and to deduce best practices in volcanic crisis communication. Additionally, we worked closely with instructors to assess the classroom setting, cultures and logistics to be sure that the role-play suited their needs and fitted into their curricula. Such a process allows for effective curriculum development geared towards learners', instructors' and industry needs, builds relationships across different sectors that support long-term, sustainable teaching practices, and ensures that the curriculum will continue to be used after the educational specialist is out of the picture.

The Volcanic Hazards Simulation role-play was designed and developed by a team of researchers from the geosciences, hazards and disaster management and education disciplines at the University of Canterbury in Christchurch, New Zealand. Emphasis in the early phases of the project was placed on developing authenticity of the roles and teams and ensuring that the simulation was successful at achieving the desired learning goals. Evaluation of the simulation indicated that students found it to be a highly challenging and engaging learning experience and self-reported improved skills (Dohaney et al. 2015). Classroom observations and interviews indicated that the students valued the authenticity and challenging nature of the role-play, although personal experiences and team dynamics (within, and between, the teams) varied depending on the students' background, preparedness, and personality (Dohaney et al. 2015). For a more detailed discussion of the design and development of the Volcanic Hazards Simulation role-play we refer the reader to Dohaney (2013) and Dohaney et al. (2015); for instructors who are interested in running the role-play in their course, an instructor manual is freely available for educational use online.

Two eruption scenarios have been built and tested. The first is a large explosive scenario based on a VEI 6 eruption of the Tongariro Volcanic Complex (Cole 1978; Hobden et al. 1999) that is modelled on the 1991 Mt. Pinatubo eruptions (e.g., Wolfe and Hoblitt 1996). The second scenario is an explosive and effusive eruption of the Auckland Volcanic Field that focuses on the science and impacts of monogenetic volcanism in an urban environment. In both cases, the scenarios were chosen because existing volcanic monitoring data were available to build our models on, and because they include all the pedagogically relevant stages: forecasting (marked by precursors that students can identify), minor eruption events, and an exciting, 'blockbuster' climax (a major eruption). In the scenario presented here (i.e., the Tongariro scenario), students are presented with real-time, streamed datasets that take the volcano from a quiescent stage, through small eruptions (i.e., 'unrest'), to a very large eruption. The initial design and timeline for the role-play was taken from Harpp and Sweeney (2002) and was subsequently improved through multiple design phases to optimise the exercise and meet the learning goals.

1.5.2 What Happens During the Volcanic Hazards Simulation?

The Volcanic Hazards Simulation is designed for 300–400 level (i.e., upper-year) undergraduate science students from geology, natural hazards, disaster risk reduction, and emergency management. The simulation takes 4–6 h and can accommodate between 15 and 40 students. Students are divided into two teams: the Geoscience team and the Emergency Management team. All students have an authentic role that they are required to research prior to participating in the simulation, such as field geologist, geodesist, public information manager, or welfare manager.

The students within the Geoscience team interpret the streamed datasets (e.g., ground deformation, gas, seismicity; see Dohaney et al. (2015) for more details) and communicate science advice to the Emergency Management team and to the 'public'. The Emergency Management team is responsible for managing the impacts that the volcanic eruption poses to communities and infrastructure. This set-up is adapted from the organisational structure of operational emergency management in New Zealand dictated by the most recent version of the national guidelines (Ministry of Civil Defence and Emergency Management 2009), and this structure is comparable to other emergency management structures used globally [e.g., the National Incident Management System (Department of Homeland Security 2008)]. It is important to note that the learning goal for the exercise is not to replicate protocols, but to introduce students to the roles and responsibilities involved in these important events and to improve their skill sets. We emphasise this distinction to the students, which allows them to free up their cognitive resources to focus on teamwork, decision-making and the communication tasks, rather than on perfecting organisational procedures. The simulation is a reasonably fast-paced environment, with events happening in quick succession to mimic the stresses of a real natural hazard crisis.

Students respond to emergency management and the public's information needs via a 'Newsfeed' data stream (i.e., a stream of prompts that replicate common views and needs during a crisis) and communicate to policy-makers and to members of the public (played by facilitators). Students need to be able to adapt both the content and style of the communication appropriately to serve the intended target audience. During the role-play, we included structured communication tasks that incorporate different communication goals, formats, contexts, and receivers (i.e., different audiences). Students complete the following structured communication tasks:

  1. Media releases (written)

  2. Volcanic impact reports (written)

  3. Team discussions: both within the team (intra-team) and between the groups (inter-team) (oral, group)

  4. On-the-spot 'dynamic' information requests (written and oral, individual and group)

  5. Media TV interviews (oral, public)

  6. Press conferences (oral, public)

It should be noted that not all students directly participate in each task; these are team tasks in which some students will choose to communicate to the class and others will not. We aimed to model authentic and effective team behaviour that requires the group to manage the incoming and outgoing communications, as well as to adhere to the appropriate responsibilities of individual roles (i.e., team leaders typically volunteered to take on more frequent public speaking tasks).

Students prepare for the role-play through several preparatory activities including a volcanic hazards mapping activity, pre-readings (with content specific to their role), an exercise instruction document (with the learning goals, rules, and flow-of-communication maps), and a science communication lecture and homework assignment that included reviewing the 7Cs (described above), which we used as crisis communication 'best practice'. We expect students to be comfortable with the basics of volcanic monitoring and emergency management, but additional introductory lectures are available for revision.

2 Methods

The current study explores the evaluation of students’ communication confidence and perceptions of crisis communication best practices. Below we discuss the study participants, data collection and data analysis procedures.

2.1 Study Participants

Participants (n = 43; Table 1) were recruited from 300- and 400-level physical volcanology and hazards management courses that hosted the Volcanic Hazards Simulation as part of their curricula. The role-play was assessed using a self- and peer-evaluation rubric that accounted for a small percentage (~1%) of students' total grade. Students were mixed cohorts of American study-abroad students and New Zealand students attending the University of Canterbury. They ranged in gender [female (13) and male (30)], nationality [New Zealand (27), United States of America (14), Netherlands (1) and India (1)], and age [19–22 (25) and >23 years old (18)].

2.2 Data Collection

Two iterations of the role-play were tested for communication perceptions and confidence: one role-play was embedded at the end of a 7-day field course (January 2012; n = 23), and the other was embedded within a lecture-based course (August 2012; n = 20). The nature of the intervention differed slightly in terms of what was covered prior to the exercise. The Field-based cohort carried out a hazards mapping exercise (studying the volcanology and hazards of Tongariro) and reviewed the best practices of science communication in a short lecture, followed by a media release critique [both assessed for a small amount (~1% of their total grade)] to encourage students to prepare for the role-play. The Lecture-based cohort received the same science communication lecture but no other activities. These differences in treatment were controlled by course design and allowed the researchers to explore whether different treatments of the student groups elicited different communication results.

We used a mixed methods approach in our investigation of the effectiveness of the role-play on science communication using pre- and post-questionnaires that included multiple choice and open-ended questions. The Field cohort was surveyed using hardcopy questionnaires two days before the role-play (Jan 28) while the Lecture cohort was surveyed up to a week prior (Aug 7–13) using email and hardcopies. Both cohorts were surveyed with hardcopy post-questionnaires immediately after the exercise to ensure a high response rate as the study relies on paired data (pre- and post- results).

The questionnaires included several components: the self-perceived communication competence (SPCC) instrument, a perceptions of crisis communication (PCC) instrument, demographics, and open-ended questions.

SPCC is a validated instrument (with a high internal consistency, Cronbach’s alpha of 0.92) that measures communication confidence and is guided by the earlier works of McCroskey (e.g., McCroskey et al. 1977; McCroskey 1982). McCroskey and McCroskey (1988) investigated communication competence through self-reported evaluation of one’s ability to communicate (i.e., communication confidence). The SPCC instrument considers several dimensions of communication: communication contexts [public, meeting, group, and dyad (or pair; one-on-one)] and receivers of the communication (strangers, acquaintances, and friends). While this measure (and others like it) is not a true characterisation of actual communication competency, it has been used in the discipline to measure gains (i.e., testing of communication competency before and after an intervention) (Fortney et al. 2001) and researchers indicate it is a good predictor of actual communication competence (McCroskey and McCroskey 1988).

The PCC survey (Table 2) was built and piloted for this study. We composed the statements with support from the risk communication literature (see Sect. 1), expert views on volcanic crisis communication, and our own experience teaching science communication. The attitudes and beliefs covered by the survey are not exhaustive, but we feel that it covers the common best practices and appropriate behaviours for communicating science during a crisis. Further research on the instrument will allow us to refine the statements and to incorporate all the important aspects of science communication. This survey was checked for content validity, but not examined with interview techniques (e.g., Adams and Wieman 2010). The questionnaire also included demographic information and open-ended questions designed to gather feedback about the student experience and science communication.

Table 2 PCC survey results for all students

2.3 Data Analysis

The SPCC consists of 12 statements (McCroskey and McCroskey 1988) asking the participant to rate their perceived ability to communicate in different situations and contexts (on a 0–100 scale); the higher the total score, the higher the participant's confidence. In our version, we changed the phrasing from "competent" to "ability" and used a 5-point scale (very strong ability, strong ability, average ability, poor ability, very poor ability), as we felt this phrasing would be more comprehensible to our students. For further information on the design and scoring of the instrument please see the publication noted above.
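To make the scoring concrete, the sketch below shows one plausible way to score our modified instrument and compute a pre/post change score. The mapping of the 5-point ability labels back onto a 0–100 range, and the example responses, are illustrative assumptions rather than the exact scoring sheet used in the study.

```python
# Illustrative scoring of the modified SPCC instrument (assumed mapping, not the
# study's exact scoring sheet). Each respondent answers 12 items on a 5-point
# ability scale; here the labels are mapped onto 0-100 so totals remain roughly
# comparable to McCroskey's original scale.

ABILITY_SCALE = {
    "very poor ability": 0,
    "poor ability": 25,
    "average ability": 50,
    "strong ability": 75,
    "very strong ability": 100,
}

def spcc_score(responses):
    """Mean 0-100 score across the 12 SPCC items (hypothetical mapping)."""
    if len(responses) != 12:
        raise ValueError("SPCC expects responses to all 12 items")
    return sum(ABILITY_SCALE[r.lower()] for r in responses) / len(responses)

def spcc_change(pre_responses, post_responses):
    """Change score: post minus pre; positive values indicate increased confidence."""
    return spcc_score(post_responses) - spcc_score(pre_responses)

# Example: a student moving from mostly 'average' to mostly 'strong' responses
pre = ["average ability"] * 10 + ["poor ability"] * 2
post = ["strong ability"] * 8 + ["average ability"] * 4
print(round(spcc_score(pre), 1), round(spcc_change(pre, post), 1))  # 45.8 20.8
```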

The PCC instrument is composed of 17 5-point Likert statements (Table 2). Experts were surveyed in a small convenience sample (n = 7) of volcanology, emergency management and geology faculty at the authors' institution to establish the expert opinion, or 'the right answer'. Responses to the statements can be collapsed to agree, neutral and disagree, to reduce the effects of participants preferring more or less conservative use of agreement/disagreement. The student responses can then be assessed as being in agreement or disagreement with the experts (Adams et al. 2006). Neutral responses are not weighted in the calculation.
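As a minimal sketch of this scoring, assuming a student's responses and the expert consensus are coded on the same collapsed scale (agree = 1, neutral = 0, disagree = −1), a per-student percent agreement could be computed as below. The function names, the Likert-to-collapsed mapping, and the denominator choice are our own illustrative assumptions, not part of the published instrument.

```python
# Illustrative scoring of the PCC instrument (the exact weighting used in the
# study may differ). 5-point Likert responses are collapsed to agree (1),
# neutral (0), disagree (-1); neutral responses count neither for nor against.

def collapse(likert_response):
    """Map a 1-5 Likert response to disagree (-1), neutral (0), or agree (1)."""
    if likert_response <= 2:
        return -1
    if likert_response == 3:
        return 0
    return 1

def percent_agreement(student_likert, expert_consensus):
    """Percent of the 17 PCC statements where the student matches the experts."""
    assert len(student_likert) == len(expert_consensus) == 17
    collapsed = [collapse(r) for r in student_likert]
    matches = sum(1 for s, e in zip(collapsed, expert_consensus)
                  if s != 0 and s == e)   # neutral student responses not counted
    return 100.0 * matches / len(expert_consensus)

# Example: agreeing with the expert consensus on 12 statements, neutral on 3,
# and disagreeing on 2 gives 100 * 12 / 17 ≈ 70.6%
```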

SPCC and PCC survey results were analysed using the open-source PAST statistics programme (Hammer 2015) to determine potential differences or associations with variables within the dataset. SPCC data are treated as interval data, and groups (i.e., subpopulations) within the dataset were compared using t-tests and one-way ANOVAs. Individual students' % agreement scores are also interval data, so typical parametric tests were carried out; however, the individual statement data (i.e., all students' responses to one statement) are ordinal [agree (1), neutral (0) and disagree (−1)] and so were treated with non-parametric tests.
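The analyses themselves were run in PAST; purely as an illustration of the same classes of tests, the sketch below reproduces them with SciPy. The arrays are randomly generated placeholders standing in for paired student scores (not the study data), and the split at 23 mimics the two cohort sizes.

```python
# Illustrative SciPy analogue of the analyses described above (PAST was used in
# the study). pre/post arrays below are placeholders, not the real student data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(70, 15, size=43)          # placeholder pre-test scores
post = pre + rng.normal(3, 10, size=43)    # placeholder post-test scores
change = post - pre

# Paired t-test on interval data (e.g., SPCC totals or % agreement scores)
t, p = stats.ttest_rel(pre, post)

# One-way ANOVA comparing change scores across subpopulations (e.g., cohorts)
f, p_anova = stats.f_oneway(change[:23], change[23:])

# Pearson correlation between pre-test scores and change scores
r, p_corr = stats.pearsonr(pre, change)

# Wilcoxon signed-rank test for paired ordinal data (e.g., one PCC statement
# coded agree = 1, neutral = 0, disagree = -1, pre vs. post)
pre_item = rng.integers(-1, 2, size=43)
post_item = rng.integers(-1, 2, size=43)
w, p_wilcoxon = stats.wilcoxon(pre_item, post_item, zero_method="zsplit")

print(f"paired t={t:.2f} (p={p:.3f}); ANOVA F={f:.2f}; r={r:.2f}; Wilcoxon p={p_wilcoxon:.3f}")
```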

Responses to an open-ended question (Table 3) in the questionnaire were transcribed and coded by the first author using qualitative software (ATLAS.ti, Friese and Ringmayr 2011). We used content analysis, defined as the process of using systematic and verifiable means of summarising qualitative data (Cohen et al. 2007). In the first pass of the responses, the researcher identified different units for analysis (individual and separate items). Codes were initially taken as verbatim quotes, to preserve, as much as possible, the student's meaning. In a second pass, the results were viewed in a network (i.e., a map that shows all the responses and allows the user to group similar phrases). The items were grouped and categorised (i.e., units of data were clustered into meaningful groups; Lincoln and Guba 1985), and like statements were assigned to code families. The code families were constructed around the act of communication: the knowledge, skills, and attitudes needed for actions (i.e., strategies) that create an appearance leading to successful outcomes when communicating. The data were reviewed in a third pass to refine and check for redundancy within and between the code families. Forty-two student surveys were evaluated, but the question allowed students to list as many items as they wanted; therefore, frequencies of mentions do not represent individual student responses.

Table 3 Results from a post-survey (n = 42): Students’ perceptions of science communication best practices

3 Results

3.1 Improvement of Students’ Communication Confidence

Figures 1 and 2 show changes in students’ self-reported competence (i.e., confidence; SPCC) with communication. In both pre- and post-surveys, most students fell within the ‘average’ confidence zone, with several students reporting low or high confidence. Altogether, the students showed a positive mean change in confidence (Fig. 1b; Paired t-Test of pre and post-scores, t = −2.07, p = 0.046). An equal number of individuals showed positive and negative shifts in confidence after participating in the exercise, but the largest observable changes were positive (i.e., changes of >10 points: 8 positive compared to 2 negative). Three ‘Low’ confidence students showed large positive changes (21, 27, and 42 points). There were no statistically significant differences between the changes achieved by the different cohorts (Unpaired t-Test for same means; t = 0.37, p = 0.71), but the Field cohort did have lower pre-test scores (average of 69 ± 16). Figure 2a shows the changes for all of the students within each SPCC category (Speaking in public, meetings, groups, or pairs; with strangers, acquaintances, or friends). Overall, the mean changes for the public (5 ± 15) and stranger (7 ± 15) categories were the highest.

Fig. 1 Students' self-reported communication competence before and after the Volcanic Hazards Simulation. a A plot showing pre-test versus post-test SPCC scores for individual students and the cohorts, whose means are not statistically different. b A table showing SPCC basic statistics. Overall, students showed positive and negative changes, but the positive changes were greater, on average

Fig. 2 a Box and whisker plots of the average change within different dimensions of the SPCC instrument (i.e., communication contexts and receivers) for all students. Note that the highest average change is shown in the public speaking and stranger dimensions, which are both emphasised through public speaking tasks within the Volcanic Hazards Simulation. b A plot showing the overall change (pre- to post-SPCC) sorted by students who did and did not explicitly participate in public speaking tasks. A comparison of the two groups did not result in a statistically significant difference

We examined the SPCC results for demographic associations with the pre-test scores and changes (Table 1; gender, age, nationality, degree programme, and year of degree programme) as well as for curriculum factors [cohort, assigned role (i.e., data-focussed vs. communication task-focussed) and team (emergency management or geoscience)]. Interesting relationships surfaced between the changes and both the pre-test scores and direct participation in the public speaking tasks. Plotting the change scores (post-score minus pre-score) versus pre-test scores showed an inverse relationship (Pearson's product-moment correlation coefficient r = −0.46; p = 0.004); students with lower pre-test scores achieved the highest changes, and those with higher pre-test scores achieved the most negative changes. Additionally, we found that the students with the greatest individual changes in confidence (Fig. 1) participated in the public speaking tasks (i.e., press conferences and media interviews) (Fig. 2b; mean change of 7.01 for those who participated versus 0.28 for those who did not), although the difference was not statistically significant (t = −1.63, p = 0.11). We would like to explore this effect in the future, with more students and better control over who does and does not participate in the public speaking tasks.

3.2 Improvement of Student Perceptions of Volcanic Crisis Communication

Figure 3 and Table 2 show the results from the pre- and post-survey (PCC) that measured students' perceptions of communicating during a volcanic crisis. On average, the students reported statistically significant positive changes (i.e., increasing agreement with experts) in perceptions (Fig. 3a, b; Paired t-Test, t = −2.07; p = 0.046), but individual students displayed both increases and decreases in agreement with experts. More students showed positive (17) or no changes (16) than negative shifts in perceptions (7) after participating in the role-play, with the largest observable changes being positive (changes of >10 points; 7 positive, 4 negative).

Fig. 3 Students' perceptions of volcanic crisis communication before and after the Volcanic Hazards Simulation. a A plot showing pre-test versus post-test PCC scores for individual students and the cohorts. There was no statistical difference between changes within the different cohorts (Paired t-Test to test for same means; t = 0.07, p = 0.95). b A table showing basic statistics of the perceptions survey. Overall, students showed positive and negative changes, but there were more students who exhibited positive changes rather than negative

The analysis of the pre-test scores revealed no significant statistical relationships for curriculum factors and most demographic factors. However, we did find a significant difference in pre-test perceptions between students in the 300-level and the 400-level of their university degree programmes (mean scores of 78 and 69%, respectively; unpaired t-test for equal means, t = 2.18, p = 0.04).

The changes achieved by students (post-test minus pre-test %) were also examined for curriculum and demographic factors. Cohort, participation in public speaking tasks, and year and type of degree programme showed no differences. Factors that did differ were: gender (male mean change = 6.7, female = −3.04), age (older students (>23 years of age) mean change = 7.15, younger students = −0.02), nationality (NZ students mean change = 4.30, US students = −1.89), assigned team (Geoscience team mean change = 8.56, EM = −2.3), and assigned role-type (data monitoring-focussed roles mean change = 9.83, communications-focussed = −0.71). However, these results should be considered with caution, as none of these differences was statistically significant and there is a high likelihood of interacting and mediating factors (e.g., we cannot isolate some of the variables from one another).

Lastly, similar to the SPCC scores, we found that the pre-test scores show an inverse relationship to the changes achieved (Pearson’s r = −0.63; p < 0.001). Additionally, as the mean changes for the cohorts and all students were similar for the perceptions survey and the SPCC instrument, we checked for correlations between changes in confidence and changes in perceptions, but only a weak correlation was found and it was not statistically significant (Pearson’s r = 0.30, p = 0.07).

Table 2 illustrates the PCC results broken down by individual statements and grouped by 'audience'. Changes between the statements within the field- and lecture-based cohorts were not statistically different, and so the combined results are shown. Overall, most statements showed positive changes (i.e., improving agreement with the experts) from pre- to post-survey. In the pre-survey, some statements showed very high agreement with the experts (>90%; statements 1, 16, 5, 6, 8, 10, and 17, bolded). Statements 7 and 14 showed statistically significant changes from pre- to post-survey (Wilcoxon signed rank test for paired, ordinal data; agree = 1, neutral = 0, disagree = −1 with experts; p < 0.05). Overall, students had positive changes within the 'skills' and 'communication with other scientists' dimensions, but some negative changes on statements within the 'communication with the public' category. This was surprising, as we were specifically aiming to improve their perceptions of communication with the public. However, a closer look shows that several of the individual statements' negative shifts were from very high values of agreement with experts, where the majority of students who agreed with experts shifted into the neutral category (i.e., were questioning their perception). It should also be noted that when 100% of the students agree with experts it can cause a 'ceiling effect', where scores cannot go any higher, which can limit the statistical analysis of these results.

3.3 Best Practices of Science Communication

A central aim of the role-play is to enhance students' communication best practices. In a post-survey, students were asked to "list the most important 'best practices' of communication that scientists should use when talking to the public" (Table 3). No significant differences in item frequencies were found between the field and lecture cohorts, and so the results from both groups are presented as a whole. Students' views are comprehensive (covering many aspects), but the frequency of items shows a focus on the strategies of communication (134 mentions; e.g., use of jargon, use of analogies, use of visual aids) rather than on how the speaker appears (20), their behaviour (21) or the outcomes of the communication (5). There were examples of potentially divergent responses within some categories. For example, in the appearance category, several students reported that it is important to appear approachable and relatable (5), while another reported that it is important to appear authoritative. Similarly, some students felt it was appropriate to show emotions (6), while another stated the opposite. The jargon category was quite popular, and students mentioned a range of recommended approaches, from "not using jargon whatsoever" to using it "appropriately".

4 Discussion

4.1 Improvement in Students’ Communication Confidence

The overall statistically significant positive changes in the SPCC results (Fig. 1) indicate that the role-play was effective in improving students' communication confidence. Figure 2 showed that the public speaking and stranger (i.e., speaking with strangers rather than someone you know) dimensions were the most positively affected, and this result aligns with the learning goals of the role-play (i.e., to improve students' crisis communication skills). Positive changes achieved by students were substantial; however, there were equal numbers of students with small negative changes, and some with no change. This indicates that the role-play may be more effective at improving confidence for some students than for others. Changes likely occur when students re-evaluate their abilities based on the performances during the role-play (their own and others') and either increase or decrease their confidence in communicating. Research has indicated that self-reported competency (i.e., confidence) is diminished when some peers are compulsive communicators (i.e., dominant and frequent talkers), meaning less frequent speakers may not assess their merit as highly in comparison to their classmates (Fortney et al. 2001). Though we did not survey for compulsive speakers, all cohorts included some frequent and dominant speakers, and that aspect could have negatively influenced some students' appraisals of their own abilities. Some scholars suggest encouraging students to focus on their own progress (i.e., self-comparison) rather than comparing their performance with others, thereby reducing social comparison effects (e.g., Luk et al. 2000).

The change in scores can also potentially be attributed to (positive or negative) feedback provided by instructors and peers during the role-play. Feedback (i.e., self-, peer- and instructor feedback) is vital for communication improvement (e.g., Maguire et al. 1996; Maguire and Pitceathly 2002), and it is likely that some of the participants received more meaningful feedback (i.e., explicit guidance on how to improve and what to consider) during the simulation than others. Additionally, some students may shy away from perceived criticism, which could result in negative self-appraisals.

It is worth noting that the SPCC scale and other communication instruments (e.g., PRCA-24; McCroskey et al. 1985) were designed for, and are typically used to record, longer interventions (over semesters rather than a single, multi-hour event). Some students in this study reported shifts of ~2–5%, a magnitude of change similar to that reported after an entire semester of a communication class (e.g., Rubin et al. 1997). We propose that even small changes may be influential to a student's communication confidence over time and that the role-play has been shown here to have effects similar to longer treatments.

Based on the divergent change results, we checked which factors may be influencing individual students' experiences in different ways. A plot of the change scores versus pre-test scores revealed an inverse relationship (Pearson's product-moment correlation coefficient r = −0.46; p = 0.004) in which students with lower pre-test scores achieved the highest changes, and those with higher pre-test scores achieved the most negative changes. This indicates that the exercise is particularly effective at improving confidence for students with mild communication apprehension. The relationship also indicates that our higher-confidence students are becoming less confident. This may be due to a lack of accurate 'benchmarks' for effective competence: students with less academic maturity/experience may overestimate their ability to communicate and, when confronted with a challenging exercise, may arrive at a more realistic assessment of their abilities relative to other students.

There were no notable differences in demographics (age, year of study, gender, nationality, etc.), in contrast to prior communication research reporting that males tend to have higher confidence in transferable skills and communication than females (Lundeberg et al. 1994; Whittle and Eaton 2001; Donovan and MacIntyre 2004) and that people from some cultures and nationalities are more confident with public speaking than others (Lundeberg et al. 2000). We did not observe these attributes in our study population; however, the total sample size was small (n = 37) and these factors may only become apparent with larger groups.

There was no statistical difference in changes between the two cohorts, or between the different roles and teams. This indicates that regardless of the learning environment, the extent of the intervention, or the assigned roles and team (i.e., the specific tasks), the effect on students' confidence was equal. However, as noted in Fig. 2, students who directly participated in the public speaking tasks (i.e., press conferences and media TV interviews) showed more positive changes, but this may be due to self-selection (i.e., students who volunteered to speak for the team may be less public-speaking averse than those who passed on the opportunity).

In the future, we may use a more equitable and structured approach to participation in the public speaking tasks (i.e., where all roles are noted and 'called on' by the facilitators or team leaders to speak), but at present we did not want to force students to participate. Such an approach may encourage students to overcome their perceived aversion to public speaking and improve their confidence. It should be noted that the treatment was not set up to specifically control for participation in the public speaking tasks, and future research will explore this variable further.

4.2 Student Perceptions of Best Practice in Volcanic Crisis Communication

Two datasets were considered to explore students' perceptions of crisis communication best practice: the PCC instrument (Table 2 and Fig. 3) and an open-ended question (Table 3). Overall, the PCC results show that students made positive perception changes (i.e., increases in percent agreement with experts; Fig. 3), with more individual positive changes than negative changes and with some students achieving large shifts of >10 points. This indicates that the role-play was effective in enhancing students' perceptions (i.e., making them more expert-like).

The data show that the 300-level students had higher pre-test scores than the 400-level students. This is separate from nationality, age, and cohort (which showed no differences), indicating that an element of academic maturity/experience is having an effect on their initial perceptions. It is not possible at this stage to differentiate specific reasons why these groups of students had different pre-test scores, and we will explore this further in future work.

Overall, several factors (curriculum and demographic) may be impacting the magnitude of change in student perceptions: gender, age, nationality, assigned team and role-type, though these differences are not statistically significant and we did not note any distinguishing effects during observations of the role-play. However, given the likelihood that these factors may be interacting, and that mediating variables (such as group socio-dynamics) might be present, causal inferences are difficult to make. A larger sample and a more controlled design could account for these factors.

However, the changes in perceptions associated with assigned role and team could potentially be due to group dynamics. The exercise is challenging, with complex social dynamics within and between the teams. The Geoscience team (predominantly data-focussed students) had higher changes than the EM team. This is surprising, as these students are more concerned with data analysis and interpretation than the EM team, which focuses on receiving science advice and on prioritising and communicating the impacts of the volcanic crisis. However, the perceptions survey is focussed on the communication of science, and not specifically on advice and actions for the public. It is likely that the Geoscience teams discussed the nuances of science communication at a deeper level than the EM team. This is an important consideration when planning your evaluation of these exercises (does your measure/instrument suit one context over another?).

Results from Table 3 show that students articulated a comprehensive view of the strategies that you should employ when communicating science, but focussed more on the mechanics of communicating (i.e., the how-to's). This indicates that our participants understand that there are many things to consider when communicating, and that they appreciate the complexity of the task. The responses are all consistent with up-to-date approaches to rhetorical communication in instructional communication texts (e.g., McCroskey 2006). The focus of mentions on the mechanics of science communication is not surprising, given the students' level of academic maturity and previous experiences (i.e., learning the initial skills before moving on to more sophisticated elements of the trade). The less frequent but somewhat divergent responses (i.e., 'appear authoritative' vs. 'appear relatable') are additional evidence that students value different approaches to best practice. The undergraduate teaching community should be assured that students need to walk before they can run, and acknowledging where they are in their communication training can help them to understand where they should aspire to be (i.e., considering more situational aspects of communication). The volcanology community can benefit from this finding in that practitioners may also hold divergent views on what constitutes best practice, and organisations would benefit from discussing the merits of specific approaches in specific circumstances. The risk and crisis communication community has a substantial body of research on the specific areas addressed by almost every individual statement in Table 2 [e.g., uncertainty (e.g., Hudson-Doyle et al. 2011) and the importance of building and establishing trust through communication (e.g., Haynes et al. 2007)], and applying a one-dimensional approach to crisis communication is not advised.

It should be noted that this perceptions survey is a pilot version and has not yet been rigorously validated. Current research on a new version indicates that some of the statements may be asking about more than one concept (e.g., "Using numbers, drawings and probabilities is a good method of communicating scientific principles to other scientists"). New results from experts indicate that they may confuse some statements in terms of what is intended by the approach versus its effectiveness; that is, some strategies or perceptions may be valid in theory but may not be helpful in practice (e.g., disclosing all of your results to show transparency, versus disclosing only the most important results to create a coherent message to the public). These ideas are somewhat in conflict with one another, creating a tension for the communicator to overcome. Additionally, our list of perception statements is not exhaustive. There is a diversity and complexity to communicating during a crisis, evident in the student responses in Table 3, that is difficult to capture in a series of closed statements. Further research into student and expert perceptions through interviewing techniques will allow us to better characterise risk and crisis communication best practice.

Further work will validate our measure of communication perceptions (i.e., further refine the instrument and comprehensively define crisis communication best practice with the help of experts and practitioners), and focus on assessment of all of the above dimensions to ascertain the relationship between the factors that lead to successful communication performance. If we know which pedagogical factors influence a student's ability to learn about crisis communication, then we can provide practical suggestions to improve the teaching of communication in the classroom. We would also like to investigate risk and crisis communication in alternative natural hazard scenarios [e.g., earthquakes (Dohaney et al. 2016) and hydroelectric dam failure] to help students diversify their approaches to risk and crisis communication. Additionally, we would like to develop volcanic scenarios over longer mock time frames (e.g., following a community engagement initiative as it progresses through stages of learning about volcanic risk) to help students understand that risk communication occurs through all stages of the 4 Rs and that cultivating relationships with communities provides the foundation for making crisis communication possible.

4.3 Implications for the Teaching of Volcanic Crisis Communication and Future Work

In this final section, we would like to share with the community some lessons learned from our use of training exercises and teaching about communication, as well as outline our future research into the measurement of communication performance.

The use of training exercises is not uncommon in the emergency management sector; however, they are less commonly used in formal education settings because of the significant time investment that goes into building an authentic scenario, organising a robust curriculum plan, and evaluating and testing whether it is effective. We believe an evidence-based approach to the building and testing of such curricula should include specialists in education and communication research. A partnership among these professionals allows content experts (i.e., volcanologists and emergency managers) to learn about the pedagogy of training exercises and the art of evaluating such complex learning activities. Input from communication researchers can further enhance the inclusion of specific communication contexts and tasks, as well as help to guide instructors and students in delving deeper into how messages are constructed and received by diverse audiences. In our case, previous research into the design of this exercise (Dohaney et al. 2015) meant that we could move away from the intricate task of 'tweaking' the exercise and look at the impact that it has had on our students' abilities to communicate. Such alliances produce powerful and engaging learning experiences that have a memorable and lasting influence on students' ongoing career development.

The results discussed above illustrate that the Volcanic Hazards Simulation has influenced our students' perceptions of, and confidence with, communicating during a mock volcanic crisis. But does this translate to transferable communication skills moving forward? What we do know is that knowledge and awareness of best practice (i.e., 'expert-like' perceptions) is often the first step towards utilising these communication behaviours and strategies (e.g., McCroskey 2006). And what about communication confidence? Do our high-confidence students actually communicate more effectively? Research by Kruger and Dunning (2009) suggests that overconfidence and ignorance are not a good thing. However, students with high confidence paired with expert-like perceptions of crisis communication best practice have the tools at their disposal, and we hope that as they move forward in their careers they can continue to practice and improve, ultimately becoming better crisis communication practitioners (should they choose to follow that career path).

5 Conclusion

Our study set out to examine whether an authentic volcanic crisis role-play could improve students' communication confidence and their perceptions of science communication. In the role-play, students challenged themselves and moved outside of their 'academic comfort zone' when required to rapidly synthesise new information and communicate it to differing stakeholders and in different formats. On average, our results indicate that the role-play does improve both confidence and perceptions for our students. In particular, the exercise is most effective for students who begin with low confidence and less expert-like perceptions of communicating science. Students with improved and high confidence in their abilities are more likely to engage in communication experiences (McCroskey et al. 1977), which leads to further improvement, so even a small number of positive shifts in confidence is a success.

However, individual students showed both positive and negative changes in confidence and perceptions. Negative appraisals of confidence may be due to peer comparison effects, and negative perception shifts may be due to students moving from agreement with experts to neutral responses (i.e., questioning their current perceptions). In future work, we will try to minimise negative experiences and increase the positive experiences for all students. There were no significant differences in students' confidence and perceptions between the cohorts, indicating that despite slightly different interventions (one more extended than the other) students achieved positive changes. This indicates that the role-play, as a standalone part of an instructor's curriculum, is flexible enough to accommodate different schedules while still reaching its outcomes.

Results from the open-ended question show that our students articulated a comprehensive range of views on the best practices of science communication, but focussed primarily on the mechanics of delivery, which is unsurprising as most students are still relatively inexperienced and are continually developing these skills. New scenarios for earthquakes will be tested to build on our findings. This approach to learning skills through authentic challenges builds confidence and resilience in undergraduate students who are likely to become part of the geologic and emergency management community.