Using Role-Play to Improve Students’ Confidence and Perceptions of Communication in a Simulated Volcanic Crisis

  • Jacqueline Dohaney
  • Erik Brogt
  • Thomas M. Wilson
  • Ben Kennedy
Open Access
Part of the Advances in Volcanology book series (VOLCAN)


Traditional teaching of volcanic science typically emphasises scientific principles and tends to omit the key roles, responsibilities, protocols, and communication needs that accompany volcanic crises. This chapter provides a foundation in instructional communication, education, and risk and crisis communication research, identifying the need for authentic tasks in higher education that challenge learners and provide opportunities to practice crisis communication in real time. We present an authentic, immersive role-play, the Volcanic Hazards Simulation, as an example of a teaching resource designed to match professional competencies. The role-play engages students in volcanic crisis concepts while simultaneously improving their confidence and perceptions of communicating science. During the role-play, students assume the authentic roles and responsibilities of professionals and communicate through interdisciplinary team discussions, media releases, and press conferences. We characterised and measured the students’ confidence and perceptions of volcanic crisis communication using a mixed methods research design to determine whether the role-play was effective at improving these qualities. Results showed a statistically significant improvement in both communication confidence and perceptions of science communication. The exercise was most effective in transforming low-confidence and low-perception students, with some negative changes measured for higher-scoring students. Additionally, students reported a comprehensive and diverse set of best practices but focussed primarily on the mechanics of science communication delivery. This curriculum is a successful example of how to improve students’ communication confidence and perceptions.


Keywords: Education · Learning and teaching · Volcanic crises · Science communication · Role-play · Risk management · Disaster risk reduction

1 Introduction

Communicating scientific results and recommendations about natural hazards and disasters into language easily understandable by non-experts is a challenging task in the best of circumstances. During an actual natural hazard event, stress levels are high and considerable pressure is put on scientists and emergency managers to communicate a wide variety of information to each other and many different stakeholders (Alexander 2007; Barclay et al. 2008; Haynes et al. 2008; IAVCEI Task Group on Crisis Protocols 2016; Rovins et al. 2015, p. 56).

Many practicing scientists receive no formal training in science communication (including communication with the public and with the media) (MORI and The Wellcome Trust 2001; The Royal Society 2006) or in public engagement (Miller and Fahy 2009). Additionally, embedded science communication training within undergraduate degree programmes is uncommon; specific degrees, minors, or postgraduate qualifications are offered at only a select few institutions, predominantly within Europe (Trench and Miller 2012). Therefore, dedicated science and risk communication training for undergraduates provides a valuable opportunity to equip the next generation of natural hazard scientists and emergency managers with communication strategies and skills which, if informed by established best practices, will help them better serve a society that faces increasing risks from natural and manmade hazards.

This chapter describes a case study about an interactive, challenging role-play designed to train students how to forecast volcanic eruptions, manage the impacts from these eruptions, and communicate with the public throughout the simulated crisis. The chapter also introduces the reader to the foundations of instructional communication, education, and risk and crisis communication research and demonstrates how to evaluate communication training pedagogy with an evidence-based approach.

We argue that role-play challenges students and provides them with practical experience that they can utilise in their careers. It also improves learners’ confidence in their ability to communicate and improves their overall perceptions of risk and crisis communication best practice. We believe the success of the role-play lies in the explicit practicing of authentic communication tasks in a feedback-rich environment, and we hope to encourage instructors to incorporate more authentic tasks into their curricula. We invite our readers to use and adapt this curriculum in classrooms at all levels of formal and informal education.

1.1 Why Is Volcanic Risk Communication Training Important?

Research on risk and crisis communication has a long, multidisciplinary history. The corporate crisis communication and public relations (e.g., Grunig and Repper 1992; Crane and Livesey 2003), health risk and crisis communication (e.g., Reynolds and Shenhar 2016), and broader risk communication (e.g., Morgan et al. 2002; Glik 2007) communities have all explored the strategies, philosophies, and evaluation of these communications and how differing approaches may influence their success.

In general, these communities have advised that we should move away from the old, linear, ‘transmission’ form of communication (i.e., ‘source’ to ‘receiver’, or the Shannon-Weaver model of communication) towards a participatory approach that works with communities to establish a dialogue (e.g., Fisher 1991; Fischhoff 1995), supports diversity in the needs of the audience (McCroskey 2006), and preferably takes place in an unofficial and relaxed setting that helps to build trust between scientists and the public (Haynes et al. 2007). The Sendai Framework for Disaster Risk Reduction supports this approach, encouraging the sectors of society (i.e., the public, private and academic sectors) to work together in a ‘people-centred’ approach to disaster risk reduction (DRR) (United Nations International Strategy for Disaster Risk Reduction 2015, see point 7, p. 10). This shift is important for the delivery of risk and crisis communications and highlights the importance of knowing, understanding and connecting with your audience.

Volcanologists play a major role in the dialogue that occurs in the long-term and short-term communication of volcanic risk. Pielke (2007) provides an excellent overview of the particular roles that experts may choose to take when science has the potential to impact policy, politics and the public. He proposes that experts (e.g., medical practitioners, engineers or scientists) can act as ‘honest brokers’ by providing clear options to the person(s) at risk and articulating the specific outcomes, while simultaneously accounting for uncertainties and incorporating the most up-to-date scientific understanding of the topic at hand.

Essentially, it is our job as scientists to provide clear information to the public on the potential risks that they face from volcanoes. However, as stated above, scientists are rarely trained in communication, so the pathways and strategies for achieving this aim are less well known. Additionally, there have been very few initiatives that have blended volcanology, risk communication, and education, yet all of these research areas have much to offer to the teaching of communication in the sciences. This research aims to bridge this gap and describes a research-informed curriculum that can be used to train future volcanologists in the best practices of volcanic risk and crisis communication.

1.2 Instructional Communication Research

Communication is one of the most commonly mentioned graduate attributes for most undergraduate degrees and is also core to the geology profession (Heath 2000; Jones et al. 2010). A quick sample of several universities’ graduate attribute profiles will show you that communication, in some defined form, is almost always present. Communication was a main focus (i.e., was among the primary goals and outcomes) in all of the courses (see Sect. 2) that featured the role-play, so as part of our efforts to include authentic communication training we undertook a review of instructional communication (i.e., the teaching of communication skills). Here, we share some of what the research community tells us about teaching communication.

Firstly, there is a wealth of studies advocating the benefits of learners undergoing some form of communication education. Morreale and Pearson (2008) state that effective communication skills are needed across many disciplines (e.g., sciences, business, engineering or architecture) and help graduates to succeed in a range of careers. Morreale and Pearson (2008) also state that communication training encourages global, socially and culturally-aware citizens, allowing our society to make better decisions in areas of global significance like health and medicine, crisis management, and policing.

Secondly, effective communication does not come about by simply practicing a speech in front of a mirror. A recent study by Engleberg et al. (2016) compiled the core competencies of communication to assist in building a standardised introductory course in instructional communication. The seven core competencies (listed here, taken directly from Engleberg et al. 2016) show the reader the diversity of skills needed to be an effective communicator:
  1. Monitoring and Presenting Your Self (i.e., the ability to monitor and present yourself to others within and across a variety of communication contexts);

  2. Practicing Communication Ethics (i.e., the ability to identify, evaluate, and demonstrate appropriate ethical behaviour within and across a variety of communication contexts);

  3. Adapting to Others (i.e., the ability to understand, respect, and adapt messages to a diversity of human characteristics and attitudes in order to accomplish a communication goal within and across a variety of communication contexts);

  4. Practicing Effective Listening (i.e., the ability to listen effectively and respond appropriately to the meaning of messages within and across a variety of communication contexts);

  5. Expressing Messages (i.e., the ability to select, demonstrate, and adapt appropriate forms of verbal, nonverbal, and mediated expression that support and enhance the meaning of messages within and across a variety of communication contexts);

  6. Identifying and Explaining Fundamental Communication Processes (i.e., the ability to identify and explain how specific communication processes influence the outcome of communication interactions within and across a variety of communication contexts);

  7. Creating and Analysing Message Strategies (i.e., the ability to create and analyse message strategies that generate meaning within and across a variety of communication contexts).


Thirdly, measuring and assessing communication competence differs from most assessment in the sciences and other disciplines. Communication is a skill that is highly contextualised (see above), and its success lies in the mind of the receiver(s)/audience(s), which makes it inherently difficult to judge with objective consistency. Determining whether a learner has shown excellence in communication requires observation of the student’s performance across a range of situations and contexts. Though these competencies may seem difficult to assess, communication researchers have developed a series of measures that aim to capture some of the many dimensions of communication competency.1

Our research aimed to characterise and measure students’ confidence and perceptions of volcanic crisis communication and to determine if the role-play was effective at improving these qualities. This study occurred at the beginning of a longitudinal programme that is exploring a working model of communication denoted by several dimensions that impact an individual’s communication performance: communication confidence (discussed here), perceptions of science/crisis communication (discussed here), previous experiences with communication, and content knowledge (i.e., expertise in the topic that is being communicated).

Confidence in one’s ability to communicate competently relies on having the knowledge, skills and motivation to communicate (Rubin and Morreale 1996). The knowledge to communicate competently requires learners to select the appropriate information and strategy for the right situation, while the skills concern the ability to execute these strategies (Kreps and Query 1990). The motivation to communicate arises from learners choosing to engage after weighing several internal and external factors (e.g., grade incentives; Fortney et al. 2001). Courses in public speaking have been shown to increase students’ confidence in communicating (Miller 1987; Richmond et al. 1989; Rubin et al. 1997; Ellis 1995). It is worth noting that confidence does not directly translate to effective performance, and that overconfidence (e.g., Kruger and Dunning 1999) and compulsive communication (Fortney et al. 2001) can be detrimental to learning and communication. Communication confidence was measured by asking students to self-report their perceived competency to communicate to different receivers and in different contexts (described in Sect. 2).

Another important construct in our study was perceptions of risk and crisis communication best practice. Perceptions are a selection of attitudes or beliefs that an individual holds and that guide their behaviour. McCroskey (2006) proposes that there are three elements to building communication skills: desire, understanding, and experience. Understanding communication involves knowledge and awareness of the multitude of considerations and strategies that you can employ when crafting and delivering a message. A perceptions survey allows you to check for alignment between the views of the students and the views of professionals.

Our curriculum focussed on teaching students volcanic crisis communication, and so their perceptions were measured by asking students whether they agreed with a series of statements concerning best practice under these circumstances (described in Sect. 2). It should be noted, though, that simply holding a ‘correct’ perception does not mean that you will (a) execute the strategy effectively, or (b) decide to use the strategy when the opportunity arises. Holding expert-like perceptions is only one part of the tool kit for becoming an effective communicator.

1.3 Educational Research

Educational research is critical for the development and evaluation of curricula. As our understanding of ‘how we learn’ becomes more sophisticated, the strategies we use in the classroom allow for more effective learning experiences than traditional, stand-and-deliver teaching. At present, we feel that rigorous education research is an underutilised resource at all levels of volcanology education including formal and informal educational settings.

In practice, curriculum development is often content-driven rather than learning outcome-driven (i.e., focuses on specific aspects of volcanism to cover, rather than on the skills and knowledge that an instructor hopes the students will gain from learning about volcanoes). Additionally, curriculum development is undertaken by academics or secondary school educators who may not be aware of applied volcanology and emergency management practices. Consequently, lessons that are developed may be theory-focussed (not skills-focussed) and lack the authentic challenges that accompany volcanic crises.

Authentic learning focuses on real-world, complex problems and their solutions, taught within authentic environments through activity and social interaction (Herrington and Herrington 2006; Lombardi 2007; Herrington et al. 2014). Authentic learning seeks to replicate real-world practices in the classroom, including the environment, roles, and responsibilities of professionals. Role-play is one of many examples of authentic learning; others include simulation, mentoring, debate, case studies, coaching, and reflection (e.g., Brown et al. 1989). Authentic learning offers an opportunity for students to explore communication in its fullest complexity, leading to a more befitting assessment of their communication skills.

The effectiveness of role-play and simulation for learning has been reported in a number of studies (e.g., DeNeve and Heppner 1997; van Ments 1999). Simulation is defined as a learning experience that occurs within an imaginary or virtual system or world (van Ments 1999), while ‘role-play’ emphasises the importance and interactivity of roles in pre-defined scenarios (Errington 1997, 2011). Simulation and role-play require more active participation from students than lecture-based teaching techniques and intend to teach practical and theoretical skills that are transferable to different future situations (Roth and Roychoudhury 1993; Lunce 2006). Research shows that role-play and simulation improve student attitudes towards learning (DeNeve and Heppner 1997; van Ments 1999; Shearer and Davidhizar 2003), interpersonal interactions (Blake 1987; van Ments 1999; Shearer and Davidhizar 2003), and generic transferable skills, including problem-solving and decision-making skills (Errington 1997; Barclay et al. 2011), communication skills (Bales 1976; van Ments 1999; Hales and Cashman 2008), and teamwork skills (Maddrell 1994; Harpp and Sweeney 2002), as well as discipline-specific knowledge (DeNeve and Heppner 1997; Livingstone 1999) and volcanic eruption forecasting skills (Harpp and Sweeney 2002; Hales and Cashman 2008).

1.4 Risk and Crisis Communication Best Practices

In order to teach students how to communicate about volcanic risk, we must first understand how experts communicate before, during and after volcanic events. The communication of science (more generally) can take on a multitude of formats, styles, objectives, and outcomes. Burns et al. (2003) defined science communication as the “… use of appropriate skills, media, activities, and dialogue to produce one or more of the following personal responses to science: awareness, enjoyment, interest, opinions, and understanding of science (i.e., its content, processes and social factors)”. Volcanic risk and crisis communication may include science communication that can be used to educate and promote risk-reducing behaviours to the public (Barclay et al. 2008).

We differentiate between risk and crisis communication using criteria laid out by Reynolds and Seeger (2005; Table 1): risk communication uses messages that focus on reducing the consequences of a known threat (i.e., risk is based on projections and long-term forecasts), occurs prior to an event in frequent or routine communication campaigns, and relies on technical experts and scientists to deliver the message; crisis communication uses messages that focus on information regarding a disruptive event, occurs immediately following, and in response to, an event,2 and relies on authority figures and technical experts to deliver the message. Reynolds and Seeger (2005) promote an integrated model in which the scientific community can view communication as part of an ever-evolving cycle around risk factors that must adapt to the situation and context. This allows communicators to approach both risk and crisis communication with a set of tools (i.e., best practices) that must be carefully selected to suit the context and needs of the audience. We welcome this way of thinking, and seek to undertake communication training of students and practitioners within this framework.
Table 1

Study participant demographics

Cohort | Age (n) | Gender (n) | Nationality (n) | Degree programme (n)
Jan 2012 (23 students) | 19–22 (18); ≥23 (5) | Female (8); Male (15) | United States (13); New Zealand (9); Netherlands (1) | BSc (13); PGDipSci(a) (9); PhD (1)
Aug 2012 (20 students) | 19–22 (7); ≥23 (13) | Female (5); Male (15) | United States (1); New Zealand (18); India (1) | BSc (11); PGDipSci (9)
All students (43 students) | 19–22 (25); ≥23 (18) | Female (13); Male (30) | United States (14); New Zealand (27); Netherlands (1); India (1) | BSc (24); PGDipSci (18); PhD (1)

Numbers here represent students who participated in the role-play. Some students did not complete all of the surveys in the study.

(a) Students in the PGDipSci programme were in the first year of their postgraduate studies, focussed on Geology and/or Hazards and Disaster Management. Some of these students later upgraded to an MSc thesis.

For the purposes of teaching, we wanted a concise set of best practices that incorporated scholarly work but was comprehensible to our students, allowing them to pick up the practices in the short time frame allocated by our curriculum. A colleague at the University of Otago developed a distinct set of rules for risk and science communication, derived from research on media coverage of the Canterbury Earthquake sequence, which she called the 7Cs (taken from Bryner 2012; ideas influenced by the 10Cs of Weingart et al. 2000; Miller 2008). These best practices were explicitly given to students prior to participating in the role-play and formed part of the theoretical foundation for the perceptions survey used in this study, described further in Sect. 2. The 7Cs state that risk and science communication should be:

comprehensible (i.e., simple, jargon-free, clear and concise),

contextualised (i.e., acknowledges and reflects diversity of your audience),

captivating (i.e., entertaining, engaging, salient, and relevant to everyday life),

credible (i.e., open, does not overpromise, acknowledges uncertainty),

consistent (i.e., backed by evidence, confirmable, coordinated and collaborated sources of information),

courteous (i.e., compassionate, empathetic and respectful), and

addresses concerns (i.e., empowers action and response, forms a dialogue).

We hope that the literature provided in the above sections has demonstrated to the reader that the communication and education research communities have much to offer to the teaching of communication skills to volcanology and hazard and disaster management students. These fields provide the underlying framework and foundation (i.e., the stage and theatre) in which the volcanologists and emergency managers (i.e., the characters) will work through a crisis (i.e., the narrative) and avoid a potential disaster (i.e., the climax) in a role-play. To exemplify these theories in practice, we share with you a pilot study of an authentic role-play training exercise that specifically aimed to improve university-level students’ communication skills during a mock volcanic crisis (described in detail, below).3

1.5 The Volcanic Hazards Simulation

1.5.1 Design and Development of the Volcanic Hazards Simulation Role-Play

For some time, training exercises have been used in the emergency management community to simulate real world crises in order to upskill practitioners (Borodzicz and van Haperen 2002). We partnered with experts in the field (e.g., volcanologists, emergency managers and decision-makers) through action research and interviews to develop an authentic role-play and to deduce best practices in volcanic crisis communication. Additionally, we worked closely with instructors to assess the classroom setting, cultures and logistics to be sure that the role-play suited their needs and fitted into their curricula. Such a process allows for effective curriculum development geared towards learners’, instructors’ and industry needs and builds relationships within different sectors that supports long-term, sustainable teaching practices, and ensures that the curriculum will continue to be used after the educational specialist is out of the picture.

The Volcanic Hazard Simulation role-play was designed and developed by a team of researchers from the geosciences, hazards and disaster management and education disciplines at the University of Canterbury in Christchurch, New Zealand. Emphasis in the early phases of the project was placed on developing authenticity of the roles and teams and ensuring that the simulation was successful at achieving the desired learning goals. Evaluation of the simulation indicated that students found the simulation to be a highly challenging and engaging learning experience and self-reported improved skills (Dohaney et al. 2015). Classroom observations and interviews indicated that the students valued the authenticity and challenging nature of the role-play although personal experiences and team dynamics (within, and between the teams) varied depending on the students’ background, preparedness, and personality (Dohaney et al. 2015). For a more detailed discussion on the design and development of the Volcanic Hazards Simulation role-play we refer the reader to Dohaney (2013) and Dohaney et al. (2015) and for instructors who are interested in running the role-play in their course, an instructor manual is freely available for educational use online.4

Two eruption scenarios have been built and tested. The first is a large explosive scenario, a VEI 6 eruption of the Tongariro Volcanic Complex (Cole 1978; Hobden et al. 1999) modelled on the 1991 Mt. Pinatubo eruptions (e.g., Wolfe and Hoblitt 1996). The second scenario is an explosive and effusive eruption of the Auckland Volcanic Field that focuses on the science and impacts of monogenetic volcanism in an urban environment. In both cases, the scenarios were chosen because existing volcanic monitoring data were available on which to build our models, and because they had all the pedagogically-relevant stages: from forecasting (denoted by precursors that students could identify), through minor eruption events, to an exciting, ‘blockbuster’ climax (a major eruption). In the scenario presented here (i.e., the Tongariro scenario), students are presented with real-time, streamed datasets that take the volcano from a quiescent stage, through small eruptions (i.e., ‘unrest’), to a very large concluding eruption. The initial design and timeline for the role-play were taken from Harpp and Sweeney (2002) and were subsequently improved through multiple design phases to optimise the exercise and meet the learning goals.

1.5.2 What Happens During the Volcanic Hazards Simulation?

The Volcanic Hazards Simulation is designed for 300- and 400-level (i.e., upper-year) undergraduate science students from geology, natural hazards, disaster risk reduction, and emergency management. The simulation takes 4–6 h and can accommodate between 15 and 40 students. Students are divided into two teams: the Geoscience team and the Emergency Management team. All students have an authentic role that they are required to research prior to participation in the simulation, such as the field geologist, geodesist, public information manager, or welfare manager.

The students within the Geoscience team interpret the streamed datasets (e.g., ground deformation, gas, seismicity; see Dohaney et al. (2015) for more details) and communicate science advice to the Emergency Management team and to the ‘public’. The Emergency Management team is responsible for managing the impacts that the volcanic eruption poses to communities and infrastructure. This set-up is adapted from the organisational structure of operational emergency management in New Zealand, dictated by the most recent version of the national guidelines (Ministry of Civil Defence and Emergency Management 2009), and this structure is comparable to other emergency management structures used globally [e.g., the National Incident Management System (Department of Homeland Security 2008)]. It is important to note that the learning goal for the exercise is not to replicate protocols, but to introduce students to the roles and responsibilities in these important events and to improve their skill sets. We emphasise this distinction to the students, which allows them to free up their cognitive resources to focus on teamwork, decision-making and the communication tasks, rather than on perfecting organisational procedures. The simulation is a reasonably fast-paced environment, with events happening in quick succession to mimic the stresses of a real natural hazard crisis.
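The chapter does not specify how the streamed datasets are delivered technically; as a purely hypothetical sketch (the record format, station names, readings, and timing interval are all invented for illustration), a facilitator script could release pre-scripted, time-stamped monitoring records to the Geoscience team at fixed intervals:

```python
# Hypothetical sketch of a facilitator-side data stream for a role-play.
# All field names, values, and timings are invented, not from the chapter.
import time

scenario = [  # (minutes into exercise, data type, reading)
    (0,  "seismicity",  "2 volcano-tectonic events/hr"),
    (30, "deformation", "uplift 5 mm at station TONG1"),
    (60, "gas",         "SO2 flux 400 t/day"),
    (90, "seismicity",  "tremor amplitude rising"),
]

def stream(records, seconds_per_step=1):
    """Yield each record after a real-time delay, mimicking a live feed."""
    for minutes, kind, reading in records:
        time.sleep(seconds_per_step)  # pace the release of information
        yield f"T+{minutes:03d} min [{kind}] {reading}"

# Dry run with no delay; in a live exercise, a longer delay paces the crisis.
for update in stream(scenario, seconds_per_step=0):
    print(update)
```

Pacing the release of data this way is what lets the exercise mimic the time pressure of a real crisis, since teams must interpret each update before the next arrives.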

Students respond to emergency management and the public’s information needs via a ‘Newsfeed’ data stream (i.e., a stream of prompts that replicate common views and needs during a crisis) and communicate to policy-makers and to members of the public (played by facilitators). Students need to be able to adapt both the content and style of their communication to serve the intended target audience. During the role-play, we included the following structured communication events or tasks, incorporating different communication goals, formats, contexts, and receivers (i.e., different audiences):

  1. Media releases (written)

  2. Volcanic impact reports (written)

  3. Team discussions: both within a team (intra-team) and between teams (inter-team) (oral, group)

  4. On-the-spot ‘dynamic’ information requests (written and oral, individual and group)

  5. Media TV interviews (oral, public)

  6. Press conferences (oral, public)


It should be noted that not all students will directly participate in each task, as these are team tasks in which some students will choose to communicate to the class, and others will not. We aimed to model authentic and effective team behaviour that requires the group to manage the incoming and outgoing communications, as well as adhering to the appropriate responsibilities of individual roles (i.e., team leaders typically volunteered to take on more frequent public speaking tasks).

Students prepare for the role-play through several preparatory activities, including: a volcanic hazards mapping activity, pre-readings (with content specific to their role), an exercise instruction document (with learning goals, rules, and flow-of-communication maps), and a science communication lecture and homework assignment, including a review of the 7Cs (described above) that we used as crisis communication ‘best practice’. We expect students to be comfortable with the basics of volcanic monitoring and emergency management, but additional introductory lectures are available for revision.

2 Methods

The current study explores the evaluation of students’ communication confidence and perceptions of crisis communication best practices. Below we discuss the study participants, data collection and data analysis procedures.

2.1 Study Participants

Participants (n = 43; Table 1) were recruited from 300- and 400-level physical volcanology and hazards management courses that hosted the Volcanic Hazards Simulation as part of their curricula. The role-play was assessed using a self- and peer-evaluation rubric that accounted for a small percentage (~1%) of their total grade. Students were mixed cohorts of American study-abroad students and New Zealand students attending the University of Canterbury. They varied in gender [female (13) and male (30)], nationality [New Zealand (27), United States of America (14), Netherlands (1) and India (1)], and age [19–22 (25) and ≥23 years old (18)].

2.2 Data Collection

Two iterations of the role-play were tested for communication perceptions and confidence: one role-play was embedded at the end of a 7-day field course (January 2012; n = 23), the other within a lecture-based course (August 2012; n = 20). The nature of the intervention differed slightly in terms of what was covered prior to the exercise. The field-based cohort carried out a hazards mapping exercise (studying the volcanology and hazards of Tongariro) and reviewed the best practices of science communication in a short lecture, followed by a media release critique [both of which were assessed for a small amount (~1% of their total grade)] to encourage students to prepare for the role-play. The lecture-based cohort received the same science communication lecture but no other activities. These differences in treatment were controlled by course design and allowed the researchers to explore whether different treatments of the student groups elicited different communication results.

We used a mixed methods approach in our investigation of the effectiveness of the role-play on science communication, using pre- and post-questionnaires that included multiple choice and open-ended questions. The Field cohort was surveyed using hardcopy questionnaires two days before the role-play (Jan 28), while the Lecture cohort was surveyed up to a week prior (Aug 7–13) using email and hardcopies. Both cohorts were surveyed with hardcopy post-questionnaires immediately after the exercise to ensure a high response rate, as the study relies on paired data (pre- and post-results).

The questionnaires included several components: the self-perceived communication competence (SPCC) instrument, a perceptions of crisis communication (PCC) instrument, demographics, and open-ended questions.

SPCC is a validated instrument (with a high internal consistency, Cronbach’s alpha of 0.92) that measures communication confidence and is guided by the earlier works of McCroskey (e.g., McCroskey et al. 1977; McCroskey 1982). McCroskey and McCroskey (1988) investigated communication competence through self-reported evaluation of one’s ability to communicate (i.e., communication confidence). The SPCC instrument considers several dimensions of communication: communication contexts [public, meeting, group, and dyad (or pair; one-on-one)] and receivers of the communication (strangers, acquaintances, and friends). While this measure (and others like it) is not a true characterisation of actual communication competency, it has been used in the discipline to measure gains (i.e., testing of communication competency before and after an intervention) (Fortney et al. 2001) and researchers indicate it is a good predictor of actual communication competence (McCroskey and McCroskey 1988).

The PCC survey (Table 2) was built and piloted for this study. We composed the statements with support from the risk communication literature (see Sect. 1), expert views on volcanic crisis communication, and our experience teaching science communication. The attitudes and beliefs covered by the survey are not exhaustive, but we feel that they cover the common best practices and appropriate behaviours when communicating science during a crisis. Further research on the instrument will allow us to refine the statements and to incorporate all the important aspects of science communication. This survey was checked for content validity, but not examined with interview techniques (e.g., Adams and Wieman 2010). The questionnaire also included demographic information and open-ended questions designed to gather feedback about the student experience and science communication.
Table 2

PCC survey results for all students

(N = 39). No significant differences were found between the Field and Lecture cohorts. The number of agree (A), neutral (N) and disagree (D) responses are shown, as well as the overall % agreement with expert (%A) responses for each statement. Overall, most statements show positive changes; few show negative changes (shaded green rows). Two statements were statistically different from pre- to post-survey (* symbol). These differences were calculated using the Wilcoxon signed rank test for different medians in paired pre- and post-survey results, where p ≤ 0.05

2.3 Data Analysis

The SPCC consists of 12 statements (McCroskey and McCroskey 1988) asking the participant to rate their perceived ability to communicate in different situations and contexts (on a 0–100 scale). The higher the total score, the higher the participant’s confidence. We changed the phrasing from “competent” to “ability” and used a 5-point scale in our version (very strong ability, strong ability, average ability, poor ability, very poor ability). We felt this phrase change would be more comprehensible to our students. For further information on the design and scoring of the instrument please see the publication noted above.
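To make the scoring concrete, the adapted instrument can be sketched as follows. This is a minimal illustration only: the mapping of our 5-point responses back onto the original 0–100 scale, and all names and values, are our assumptions for demonstration, not the published scoring procedure.

```python
# Illustrative sketch of scoring the adapted 12-item SPCC instrument.
# Assumption (hypothetical): each 5-point response is mapped onto the
# original 0-100 scale in even steps before averaging.
ABILITY_SCALE = {
    "very poor ability": 0,
    "poor ability": 25,
    "average ability": 50,
    "strong ability": 75,
    "very strong ability": 100,
}

def spcc_score(responses):
    """Average a student's 12 statement responses to a 0-100 score;
    higher scores indicate higher self-perceived competence."""
    if len(responses) != 12:
        raise ValueError("SPCC has 12 statements")
    return sum(ABILITY_SCALE[r] for r in responses) / 12

# A hypothetical student reporting mostly average ability:
student = ["average ability"] * 10 + ["strong ability"] * 2
print(round(spcc_score(student), 2))  # -> 54.17
```

Under this sketch, a paired pre/post comparison simply subtracts two such scores per student.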

The PCC instrument is composed of 17 5-point Likert statements (Table 2). Experts were surveyed in a small convenience sample (n = 7) comprising volcanology, emergency management and geology faculty at the authors’ institution to establish expert opinion, or ‘the right answer’. The responses to the statements can be collapsed to agree, neutral and disagree, to reduce the effect of participants’ differing preferences for more or less conservative use of the agreement scale. The student responses can then be assessed as being in agreement or disagreement with the experts (Adams et al. 2006). Neutral responses are not weighted in the calculation.
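The collapse-and-compare scoring described above can be sketched in code. The expert key and student responses below are invented for illustration, not our actual data:

```python
# Sketch of the PCC agreement scoring: collapse 5-point Likert
# responses to agree/neutral/disagree, then compute a student's
# % agreement with the expert consensus, skipping neutral responses.
COLLAPSE = {
    "strongly agree": "agree", "agree": "agree",
    "neutral": "neutral",
    "disagree": "disagree", "strongly disagree": "disagree",
}

def percent_agreement(student_responses, expert_key):
    """% of a student's non-neutral responses matching the expert answer."""
    scored = 0
    matches = 0
    for resp, expert in zip(student_responses, expert_key):
        collapsed = COLLAPSE[resp]
        if collapsed == "neutral":
            continue  # neutral responses are not weighted
        scored += 1
        matches += (collapsed == expert)
    return 100 * matches / scored

# Hypothetical 5-statement excerpt: 3 of 4 scored responses match.
student = ["strongly agree", "neutral", "disagree", "agree", "disagree"]
experts = ["agree", "agree", "disagree", "disagree", "disagree"]
print(percent_agreement(student, experts))  # -> 75.0
```

Note that because neutral responses are dropped from the denominator, two students with the same number of matches can receive different % agreement scores.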

SPCC and PCC survey results were analysed using the open source PAST statistics programme (Hammer 2015) to determine potential differences or associations with variables within the dataset. SPCC data are treated as interval, and groups (i.e., subpopulations) within the dataset were compared using t-tests and one-way ANOVAs. The individual students’ % agreement scores are interval data, and so typical parametric tests were carried out; however, the individual statement data (i.e., all students’ responses for one statement) are ordinal data [agree (1), neutral (0) and disagree (−1)] and so were treated with non-parametric tests.
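As an illustration of the paired parametric comparison used for the interval-scale scores, the paired t statistic can be computed directly from the per-student differences. This is a minimal sketch with invented scores; in practice the analysis was carried out in PAST:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic for pre/post interval scores.

    Computes t = mean(d) / (sd(d) / sqrt(n)) on the per-student
    differences d = post - pre, the quantity tested against a
    t distribution with n - 1 degrees of freedom.
    Ordinal data (e.g., collapsed Likert statements) would instead
    call for a non-parametric test such as the Wilcoxon signed rank.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical pre/post scores for five students:
pre = [60, 72, 55, 80, 65]
post = [62, 71, 58, 80, 69]
print(round(paired_t(pre, post), 3))  # -> 1.725
```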

Responses to an open-ended question (Table 3) in the questionnaire were transcribed and coded using qualitative software (ATLAS.ti; Friese and Ringmayr 2011) by the first author. We used content analysis, defined as the process of using systematic and verifiable means of summarising qualitative data (Cohen et al. 2007). In a first pass of the responses, the researcher identified the units for analysis (individual and separate items). Codes were initially taken as verbatim quotes to preserve, as much as possible, the student’s meaning. In a second pass, the results were viewed in a network (i.e., a map that shows all the responses and allows the user to group similar phrases). The items were grouped and categorised (i.e., units of data combined into meaningful clusters; Lincoln and Guba 1985), so that like statements could be assigned to code families. The code families were constructed around the act of communication: the knowledge, skills, and attitudes needed; the actions (i.e., strategies) taken; the appearance created; and the outcomes of successful communication. The data were reviewed in a third pass to refine and check for redundancy within and between the code families. 42 student surveys were evaluated, but the question allowed students to list as many items as they wanted; therefore, frequencies of mentions do not represent individual student responses.
Table 3

Results from a post-survey (n = 42): Students’ perceptions of science communication best practices

Question (open-ended): List the most important ‘best practices’ (or good methods) of communication that scientists should use when talking with the public.

(Categories are capitalised and bolded; “representative student quote” (n of items); added or altered words are in {})

Knowledge and skills (7)
  “Communicate frequently” (2)
  “Have a format”^a (1)
  “Know the {correct} information and facts” (1)
  “{Have} general public speaking abilities” (1)
  “{Understand} the topic” (1)
  “Refer to the experts” (1)

Strategies (134)

  Speech quality (13)
    “Speak slowly” (3)
    “Speak clearly” or “Be clear”^a (9)
    Repeat the information: “repetition” (1)

  Jargon (35)
    Explain or define jargon, where it is used: “Use some jargon, but explain it” (13)
    Use jargon appropriately: “Using appropriate jargon” (9)
    Don’t use jargon: “Not use jargon” (7)
    “Avoid {using} jargon” (6)
    Minimise use of jargon: “Minimise technical jargon” (5)

  Language and figures (35)
    “Use analogies” (13)
    Use simple terminology: “Keep things simple” (11)
    “{Use} simple explanations” (4)
    “{Use} numbers” and “statistics” (4)
    “Use examples of everyday things” (3)

  Information quantity and specificity (14)
    Be concise: “Speaking concisely” (8)
    Be specific and precise: “{Keep things} precise” (3)
    “Not going into too much detail…” (2)
    “Give as much information as possible” (1)

  Transparency and uncertainty (6)
    Explain “what is known and what is not” (3)
    “Explain what science can tell us and its limitations” (2)
    “Don’t make statements that are not certain” (2)
    “Back up your observations with data” (1)

  Content (9)
    Careful wording to avoid panic and fear: “Be careful when using words that might ‘incite’ fear” (4)
    Explain what is happening: “Explain what we know” (2)
    Explain why things are happening: “To convey the ‘why’ of the situation” (2)
    “Consider facts, not opinions” (1)

  Use of visual aids (22)
    Diagrams (7), maps (5), figures (3), graphs (2), pie charts (1), media (1), charts (1), graphics (1), and drawings (1)

Attitudes and framing (11)
  Be sensitive to the public’s concerns: “Be sensitive when correcting false statements” (4)
  “Be respectful” (3)
  “Be polite” (1)
  Be honest about the situation: “Be straight up and honest” (2)
  “Put a positive spin on things” (1)

Behaviour (21)
  Show emotions, as appropriate: “Show some emotion” (6)
  Don’t show emotions: “Not getting emotional” (1)
  Engage with the audience: “Put the audience in the scene” (6)
  Use appropriate body language: “Use good posture” (6)
  Dress and behave professionally: “Be professional” (2)

Appearance (20)
  Don’t appear condescending or patronising: “Not being patronising” (7)
  Appear confident: “Sound like you know what you’re talking about” (5)
  Appear approachable and relatable: “{Be} down to earth” (5)
  Appear calm: “Be calm” (2)
  Appear authoritative: “{Speak} with authority” (1)

Outcomes (5)
  Don’t increase panic or the public’s concerns: “Share concerns without increasing panic or public concern” (4)
  “Make the public feel safe” (1)

^a ‘Speak clearly’ and ‘be clear’ could be two different aspects, but are presented here together

3 Results

3.1 Improvement of Students’ Communication Confidence

Figures 1 and 2 show changes in students’ self-reported competence (i.e., confidence; SPCC) with communication. In both pre- and post-surveys, most students fell within the ‘average’ confidence zone, with several students reporting low or high confidence. Altogether, the students showed a positive mean change in confidence (Fig. 1b; Paired t-Test of pre and post-scores, t = −2.07, p = 0.046). An equal number of individuals showed positive and negative shifts in confidence after participating in the exercise, but the largest observable changes were positive (i.e., changes of >10 points: 8 positive compared to 2 negative). Three ‘Low’ confidence students showed large positive changes (21, 27, and 42 points). There were no statistically significant differences between the changes achieved by the different cohorts (Unpaired t-Test for same means; t = 0.37, p = 0.71), but the Field cohort did have lower pre-test scores (average of 69 ± 16). Figure 2a shows the changes for all of the students within each SPCC category (Speaking in public, meetings, groups, or pairs; with strangers, acquaintances, or friends). Overall, the mean changes for the public (5 ± 15) and stranger (7 ± 15) categories were the highest.
Fig. 1

Students’ self-reported communication competence before and after the Volcanic Hazards Simulation. a A plot showing pre-test versus post-test SPCC scores for individual students and the cohorts of which the means are not statistically different. b A table showing SPCC basic statistics. Overall, students showed positive and negative changes, but the positive changes were greater, on average

Fig. 2

a Box and whisker plots of the average change within different dimensions of the SPCC instrument (i.e., communication contexts and receivers) for all students. Note that the highest average change is shown in the public speaking and stranger dimensions that are both emphasised through public speaking tasks within the Volcanic Hazards Simulation. b A plot showing the overall change (pre- and post SPCC) sorted by students who did and did not explicitly participate in public speaking tasks. A comparison of the two groups did not result in a statistically significant difference

We examined the SPCC results for demographic associations with the pre-test scores and changes (Table 1; gender, age, nationality, degree programme, and year of degree programme), as well as curriculum factors [cohort, assigned roles (i.e., data-focussed vs. communication task-focussed) and teams (emergency management or geoscience)]. Interesting relationships surfaced between the changes and both the pre-test scores and direct participation in the public speaking tasks. Plotting the change scores (post-score minus pre-score) versus pre-test scores showed an inverse relationship (Pearson’s product-moment correlation coefficient r = −0.46; p = 0.004): students with lower pre-test scores achieved the highest changes, and those with higher pre-test scores showed the most negative changes. Additionally, we found that students with the greatest individual change in confidence (Fig. 1) participated in the public speaking tasks (i.e., press conferences and media interview) (Fig. 2b; mean change of 7.01 for students who participated in public speaking tasks versus 0.28 for those who did not), although the difference was not statistically significant (t = −1.63, p = 0.11). We would like to explore this effect in the future, with more students and better control over who participates in the public speaking tasks.
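The inverse relationship between change scores and pre-test scores is a standard Pearson product-moment correlation; it can be sketched without a statistics package. The scores below are invented to mimic the reported pattern (low pre-test, large gain), not our actual data:

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two
    equal-length sequences of interval data."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data mimicking the pattern described: students with
# lower pre-test scores show the largest gains (change = post - pre).
pre_scores = [45, 55, 65, 75, 85]
changes = [20, 12, 5, -2, -8]
print(round(pearson_r(pre_scores, changes), 3))  # -> -0.999
```

A strongly negative r, as here, is consistent with regression toward the mean, which is one reason such pre-test/change correlations warrant cautious interpretation.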

3.2 Improvement of Student Perceptions of Volcanic Crisis Communication

Figure 3 and Table 2 show the results from the pre- and post-survey (PCC) that measured students’ perceptions of communicating during a volcanic crisis. On average, students reported statistically significant positive changes (i.e., agreeing with experts) in perceptions (Fig. 3a, b; Paired t-Test, t = −2.07; p = 0.046), but individual students displayed both increases and decreases in agreement with experts. More students showed positive (17) or no changes (16) than negative shifts in perceptions (7) after participating in the role-play, with the largest observable changes being positive (changes of >10 points; 7 positive, 4 negative).
Fig. 3

Students’ perceptions of volcanic crisis communication before and after the Volcanic Hazards Simulation. a A plot showing pre-test versus post-test PCC scores for individual students and the cohorts. There was no statistical difference between changes within the different cohorts (Paired t-Test to test for same means; t = 0.07, p = 0.95). b A table showing basic statistics of the perceptions survey. Overall, students showed positive and negative changes, but there were more students who exhibited positive changes rather than negative

The analysis of the pre-test scores revealed no significant statistical relationships for curriculum factors and most demographic factors. However, we did find a significant difference in pre-test perceptions between students in the 300-level versus the 400-level of their university degree programmes (mean score of 78 and 69%, respectively; Unpaired t-test for equal means, t = 2.18 and p = 0.04).

The changes achieved by students (post-test minus pre-test %) were also examined for curriculum and demographic factors. The cohort, participation in public speaking tasks, and year and type of degree programme did not differ. Factors that did differ were: gender (male mean change = 6.7, female = −3.04), age (older students (>23 years of age) mean change = 7.15, younger students = −0.02), nationality (NZ students mean change = 4.30, US students = −1.89), assigned team (Geoscience group mean change = 8.56, EM = −2.3), and assigned role-type (data monitoring-focussed roles mean change = 9.83, communications-focussed = −0.71). However, these results should be considered with caution, as none of these factors showed statistical significance and there is a high likelihood of interacting and mediating factors (e.g., we cannot isolate some of the variables from one another).

Lastly, similar to the SPCC scores, we found that the pre-test scores show an inverse relationship to the changes achieved (Pearson’s r = −0.63; p < 0.001). Additionally, as the mean changes for the cohorts and all students were similar for the perceptions survey and the SPCC instrument, we checked for correlations between changes in confidence and changes in perceptions, but only a weak correlation was found and it was not statistically significant (Pearson’s r = 0.30, p = 0.07).

Table 2 illustrates the PCC results broken down by individual statements and grouped by ‘audience’. Changes between the statements within the field and lecture-based cohorts were not statistically different, and so the combined results are shown. Overall, most statements showed positive changes (i.e., improving the agreement with the experts) from pre- to post-survey. In the pre-survey, some statements showed very high agreement with the experts (>90%; statements 1, 16, 5, 6, 8, 10, and 17, bolded). Statements 7 and 14 showed statistically significant changes from pre- to post-survey (Wilcoxon signed rank test for ordinal data; agree = 1, neutral = 0, disagree = −1 relative to experts; paired data; p < 0.05). Overall, students had positive changes within the ‘skills’ and ‘communication with other scientists’ dimensions, but some negative changes on statements within the ‘communication with the public’ category. This was surprising, as we were specifically aiming to improve their perceptions of communication with the public. However, a closer look shows that several of the individual statements’ negative shifts were from very high values of agreement with experts, where the majority of students who agreed with experts shifted into the neutral category (i.e., were questioning their perception). It should also be noted that when 100% of the students agree with experts, a ‘ceiling effect’ occurs: scores cannot go any higher, which can limit the statistical analysis of these results.

3.3 Best Practices of Science Communication

A central aim of the role-play is to enhance students’ communication best practices. In a post-survey, students were asked to “list the most important ‘best practices’ of communication that scientists should use when talking to the public” (Table 3). No significant differences were found in item frequencies between the field and lecture cohorts, and so the results from both groups are presented as a whole. Students’ views are comprehensive (covering many aspects), but the frequency of items shows a focus on the strategies of communication (134 mentions; e.g., use of jargon, use of analogies, use of visual aids) rather than on how the speaker appears (20), their behaviour (21) and the outcomes of the communication (5). There were examples of potentially divergent responses within some categories. For example, in the appearance category, some students reported that it is important to appear approachable and relatable (5), while another reported that it is important to appear authoritative. Similarly, some students felt it was appropriate to show emotions (6), while another stated the opposite. The jargon category was quite popular, and students mentioned a range of recommended approaches, from not using jargon whatsoever to using it “appropriately”.

4 Discussion

4.1 Improvement in Students’ Communication Confidence

The overall statistically significant positive changes on the SPCC results (Fig. 1) indicate that the role-play was effective in improving students’ communication confidence. Figure 2 showed that the public speaking and stranger (i.e., speaking with strangers rather than someone you know) dimensions were the most positively affected, a result that aligns with the learning goals of the role-play (i.e., to improve students’ crisis communication skills). Positive changes achieved by students were substantial; however, there were equal numbers of students with small negative changes, and some with no change. This indicates that the role-play may be more effective in improving confidence for some students than others. Changes likely occur when students re-evaluate their abilities based on performances during the role-play (their own and others’) and either increase or decrease their confidence in communicating. Research has indicated that self-reported competency (i.e., confidence) is diminished when some peers are compulsive communicators (i.e., dominant and frequent talkers), meaning less frequent speakers may not assess their merit as highly in comparison to their classmates (Fortney et al. 2001). Though we did not survey for compulsive speakers, all cohorts included some frequent and dominant speakers, and that aspect could have negatively influenced some students’ appraisals of their own abilities. To reduce these social comparison effects, some scholars suggest encouraging students to focus on their own progress (i.e., self-comparison) rather than comparing their performance with others (e.g., Luk et al. 2000).

The change in scores can also potentially be attributed to (positive or negative) feedback provided by instructors and peers during the role-play. Feedback (i.e., self-, peer- and instructor feedback) is vital for communication improvement (e.g., Maguire et al. 1996; Maguire and Pitceathly 2002), and it is likely that some participants received more meaningful feedback (i.e., explicit guidance on how to improve and what to consider) during the simulation than others. Additionally, some students may shy away from perceived criticism, which could result in negative self-appraisals.

It is worth noting that the SPCC scale and other communication instruments (e.g., PRCA-24; McCroskey et al. 1985) were designed and are typically used to evaluate longer interventions (over semesters rather than a single multi-hour event). Some students in this study showed shifts of ~2–5%, a magnitude of change similar to that reported for an entire semester of communication class (e.g., Rubin et al. 1997). We propose that even small changes may be influential in a student’s communication confidence over time, and that the role-play has been shown here to have effects comparable to longer treatments.

Based on the divergent change results, we examined which factors may be influencing individual students’ experiences in different ways. A plot of the change scores versus pre-test scores revealed an inverse relationship (Pearson’s product-moment correlation coefficient r = −0.46; p = 0.004): students with lower pre-test scores achieved the highest changes, and those with higher pre-test scores showed the most negative changes. This indicates that the exercise is particularly effective at improving confidence for students with mild communication apprehension. It also indicates that our higher-confidence students are becoming less confident. This may be due to a lack of accurate ‘benchmarks’ for effective competence: students with less academic maturity/experience may overestimate their ability to communicate and, when confronted with a challenging exercise, arrive at a more realistic assessment of their abilities relative to other students.

There were no notable differences in demographics (age, year of study, gender, nationality, etc.), in contrast to prior communication research reporting that males tend to have higher confidence in transferable skills and communication than females (Lundeberg et al. 1994; Whittle and Eaton 2001; Donovan and MacIntyre 2004) and that people from some cultures and nationalities are more confident with public speaking than others (Lundeberg et al. 2000). We did not observe these attributes in our study population; however, the total sample size was small (n = 37) and these factors may only become apparent with larger groups.

There was no statistical difference in changes between the two cohorts, or for the different roles and teams. This indicates that regardless of the learning environment, the extent of the intervention, or the assigned roles and team (i.e., the specific tasks), the effect on students’ confidence was equal. However, as noted in Fig. 2, students who directly participated in the public speaking tasks (i.e., press conferences and media TV interviews) showed more positive changes, though this may be due to self-selection (i.e., students who volunteered to speak for the team may be less public-speaking averse than those who passed on the opportunity).

In the future, we may use a more equitable and structured approach to participation in the public speaking tasks (i.e., where all roles are noted and ‘called on’ by the facilitators or team leaders to speak), but presently we did not want to force students to participate. This approach may encourage students to overcome their perceived aversion to public speaking and improve their confidence. It should be noted that the treatment was not set up to specifically control for students participating in the public speaking tasks and future research will explore this variable further.

4.2 Student Perceptions of Best Practice in Volcanic Crisis Communication

Two datasets were considered to explore students’ perceptions of crisis communication best practice: the PCC instrument (Table 2 and Fig. 3) and an open-ended question (Table 3). Overall, the PCC results show that students had positive perception changes (i.e., increases in percent agreement with experts; Fig. 3) and more individual positive changes than negative changes, with some students achieving large shifts of >10 points. This indicates that the role-play was effective in enhancing students’ perceptions (becoming more expert-like).

The data show that the 300-level students had higher pre-test scores than 400-level students. This is separate from nationality, age, and cohort (which showed no differences), indicating that an element of academic maturity/experience is affecting their initial perceptions. It is not possible at this stage to identify specific reasons why these levels of students had different pre-test scores; we will explore this further in future work.

Overall, several curriculum and demographic factors may be affecting the magnitude of change in student perceptions: gender, age, nationality, assigned team and role-type. However, these differences are not statistically significant, and we did not observe (i.e., note during observations of the role-play) any distinguishing effects during the role-play. Given the likelihood that these factors interact, and that mediating variables (such as group socio-dynamics) might be present, causal inferences are difficult to make. A larger sample and more controlled design could account for these factors.

However, the changes in perceptions associated with assigned role and team could potentially be due to group dynamics. The exercise is challenging, with complex social dynamics within and between the teams. The Geoscience team (predominantly data-focussed students) showed higher changes than the EM team. This is surprising, as the Geoscience students are more concerned with data analysis and interpretation, whereas the EM team focuses on receiving science advice and on prioritising and communicating the impacts of the volcanic crisis. However, the perceptions survey focuses on the communication of science, not specifically on advice and actions for the public. It is likely that the Geoscience teams discussed the nuances of science communication at a deeper level than the EM team. This is an important consideration when evaluating these exercises (does your measure/instrument suit one context over another?).

Results from Table 3 show that students hold a comprehensive view of the strategies one should employ when communicating science, but focus more on the mechanics of communicating (i.e., the ‘how to’s). This indicates that our participants understand that there are many things to consider when communicating, appreciating the complexity of the task. The responses are all consistent with current approaches to rhetorical communication in instructional communication texts (e.g., McCroskey 2006). The focus on the mechanics of science communication is not surprising, given the students’ level of academic maturity and previous experience (i.e., learning the initial skills before moving on to more sophisticated elements of the trade). The less frequent but somewhat divergent responses (i.e., ‘appear authoritative’ vs. ‘appear relatable’) are additional evidence that students value different approaches to best practice. The undergraduate teaching community should be assured that students need to walk before they can run, and acknowledging where they are in their communication training can help them understand where they should aspire to be (i.e., considering more situational aspects of communication). The volcanology community can benefit from this finding: practitioners may also hold divergent views on what constitutes best practice, and organisations would benefit from discussing the merits of specific approaches in specific circumstances. The risk and crisis communication literature offers research relevant to almost every individual statement in Table 2 [e.g., on uncertainty (e.g., Hudson-Doyle et al. 2011) and on the importance of building and establishing trust through communication (e.g., Haynes et al. 2007)], and applying a one-dimensional approach to crisis communication is not advised.

It should be noted that this perceptions survey is a pilot version and has not yet been rigorously validated. Current research on a new version indicates that some of the statements may be asking about more than one concept (e.g., “Using numbers, drawings and probabilities is a good method of communicating scientific principles to other scientists”). New results from experts indicate that they may read some statements differently in terms of what is intended by the approach versus its effectiveness. That is, some strategies or perceptions may be valid in theory but unhelpful in practice (e.g., disclosing all of your results to show transparency, versus disclosing only the most important results to create a coherent message for the public). Such ideas are somewhat opposed to one another, creating a tension for the communicator to overcome. Additionally, our list of perception statements is not exhaustive. The diversity and complexity of communicating during a crisis is evident in the student responses in Table 3, but this complexity is difficult to capture in a series of closed statements. Further research into student and expert perceptions through interviewing techniques will allow us to better characterise risk and crisis communication best practice.

Further work will validate our measure of communication perceptions (i.e., further refine the instrument and comprehensively define crisis communication best practice with the help of experts and practitioners) and focus on assessment of all of the above dimensions to ascertain the relationships among factors that lead to successful communication performance. If we know which pedagogical factors influence a student’s ability to learn about crisis communication, we can provide practical suggestions to improve the teaching of communication in the classroom. We would also like to investigate risk and crisis communication in alternative natural hazards scenarios (e.g., earthquakes; Dohaney et al. 2016; and hydroelectric dam failure) to help students diversify their approaches to risk and crisis communication. Additionally, we would like to develop volcanic scenarios over longer mock time frames (e.g., following a community engagement initiative as it progresses through stages of learning about volcanic risk) to help students understand that risk communication occurs through all stages of the 4 R’s, and that cultivating relationships with communities provides the foundation for making crisis communication possible.

4.3 Implications for the Teaching of Volcanic Crisis Communication and Future Work

In this final section, we would like to share with the community some lessons learned from our use of training exercises and teaching about communication, as well as outline our future research into the measurement of communication performance.

The use of training exercises is not uncommon in the emergency management sector; however, they are less used in formal education settings because of the significant time investment required to build an authentic scenario, organise a robust curriculum plan, and evaluate whether it is effective. We believe an evidence-based approach to building and testing such curricula should include specialists in education and communication research. A partnership among these professionals allows content experts (i.e., volcanologists and emergency managers) to learn about the pedagogy of training exercises and the art of evaluating such complex learning activities. Input from communication researchers can further enhance the inclusion of specific communication contexts and tasks, as well as help guide instructors and students in delving deeper into how messages are constructed and received by diverse audiences. In our case, previous research into the design of this exercise (Dohaney et al. 2015) meant that we could move away from the intricate task of ‘tweaking’ our exercise and examine the impact it has had on our students’ abilities to communicate. Such alliances create powerful and engaging learning experiences that have a memorable and lasting influence on students’ ongoing career development.

The results discussed above illustrate that the Volcanic Hazards Simulation has influenced our students’ perceptions and confidence with communicating during a mock volcanic crisis. But does this translate into transferable communication skills moving forward? What we do know is that knowledge and awareness of best practice (i.e., ‘expert-like’ perceptions) is often the first step towards adopting these communication behaviours and strategies (e.g., McCroskey 2006). And what about communication confidence? Do our high-confidence students actually communicate more effectively? Research by Kruger and Dunning (2009) suggests that overconfidence paired with ignorance is problematic. However, students with high confidence paired with expert-like perceptions of crisis communication best practice have the tools at their disposal, and we hope that as they move forward in their careers they can continue to practice and improve, ultimately becoming better crisis communication practitioners (should they choose to follow that career path).

5 Conclusion

Our study set out to examine whether an authentic volcanic crisis role-play could improve students’ communication confidence and their perceptions of science communication. In the role-play, students challenged themselves and moved outside their ‘academic comfort zone’ when required to rapidly synthesize new information and communicate it to different stakeholders in different formats. On average, our results indicate that the role-play does improve both confidence and perceptions for our students. In particular, this exercise is most effective for students who have low confidence in, and low perceptions of, communicating science. Students with improved and high confidence in their abilities are more likely to engage in communication experiences (McCroskey et al. 1977), which leads to further improvement, so even a small number of positive shifts in confidence is a success.

However, some students showed negative as well as positive changes in confidence and perceptions. Negative appraisals of confidence may be due to peer comparison effects, and negative perception shifts may be due to students moving from agreeing with experts to neutral responses (i.e., questioning their current perceptions). In future work, we will try to minimise negative experiences and increase positive experiences for all students. There were no significant differences in students’ confidence and perceptions between the cohorts, indicating that despite slightly different interventions (one more extended than the other), students achieved positive changes. This suggests that the role-play, as a standalone part of an instructor’s curriculum, is flexible enough to accommodate different schedules while still reaching its outcomes.

Results from the open-ended question show that our students expressed a comprehensive range of views on the best practices of science communication but focussed primarily on the mechanics of delivery, which is unsurprising as most students are still relatively inexperienced and are continually developing these skills. New scenarios for earthquakes will be tested to build on our findings. This approach to learning skills through authentic challenges builds confidence and resilience in undergraduate students who are likely to become part of the geologic and emergency management community.


  1. Please see for a list of prominent examples.

  2. We should acknowledge that volcanic events can become a ‘crisis’ even before any eruptive activity occurs.

  3. The role-play discussed here does not include the risk communication practices that occur over longer time frames or in ongoing volcanic events. The learning goals for our activity were limited to volcanic forecasting, decision-making, and managing community concerns throughout a crisis. For a further explanation of our learning goals and motivations for building this scenario, please see Dohaney et al. (2015).

  4. You can find the user manual in two places: on VHUB (Dohaney et al. 2014) or on SERC.


  1. Adams WK, Wieman CE (2010) Development and validation of instruments to measure learning of expert-like thinking. Int J Sci Educ 33(9):1–19. doi: 10.1080/09500693.2010.512369
  2. Adams WK, Perkins K, Podolefsky N, Dubson M, Finkelstein N, Wieman CE (2006) New instrument for measuring student beliefs about physics and learning physics: The Colorado learning attitudes about science survey. Phys Rev Spec Top Phys Educ Res 2(1):1–14. doi: 10.1103/PhysRevSTPER.2.010101
  3. Alexander D (2007) Making research on geological hazards relevant to stakeholders’ needs. Quat Int 171:186–192
  4. Bales RF (1976) Interaction process analysis: a method for the study of small groups. University of Chicago Press, Chicago
  5. Barclay EJ, Haynes K, Mitchell T, Solana C, Teeuw R, Darnell A, Crosweller HS, Cole P, Pyle D, Lowe C, Fearnley C, Kelman I (2008) Framing volcanic risk communication within disaster risk reduction: finding ways for the social and physical sciences to work together. Geol Soc Lond Spec Pub 305(1):163–177. doi: 10.1144/SP305.14
  6. Barclay EJ, Renshaw CE, Taylor HA, Bilge AR (2011) Improving decision making skill using an online volcanic crisis simulation: impact of data presentation format. J Geosci Educ 59(2):85. doi: 10.5408/1.3543933
  7. Blake M (1987) Role play and inset. J Furth High Educ 11(3):109–119
  8. Borodzicz E, van Haperen K (2002) Individual and group learning in crisis simulations. J Conting Crisis Manag 10(3):139–147. doi: 10.1111/1468-5973.00190
  9. Brown JS, Collins A, Duguid P (1989) Situated cognition and the culture of learning. Educ Res 18(1):32–42. doi: 10.3102/0013189X018001032
  10. Bryner V (2012) Science communication vodcast assignment: 7 Cs of science communication. YouTube user: Science with Tom. Accessed 15 July 2016
  11. Burns T, O’Connor D, Stocklmayer S (2003) Science communication: a contemporary definition. Publ Underst Sci 12(2):183–202
  12. Cohen L, Manion L, Morrison K (2007) Research methods in education, 6th edn. Routledge, Taylor and Francis Group, New York. doi: 10.1111/j.1467-8527.2007.00388_4.x
  13. Cole JW (1978) Andesites of the Tongariro Volcanic Centre, North Island, New Zealand. J Volcanol Geotherm Res 3:121–153
  14. Crane A, Livesey S (2003) Are you talking to me? Stakeholder communication and the risks and rewards of dialogue. In: Andriof J, Waddock S, Rahman S, Husted B (eds) Unfolding stakeholder thinking 2: relationships, communication, reporting and performance. Greenleaf, Sheffield, pp 39–52
  15. DeNeve KM, Heppner MJ (1997) Role play simulations: the assessment of an active learning technique and comparisons with traditional lectures. Innov High Educ 21(3):231–246. doi: 10.1007/BF01243718
  16. Department of Homeland Security (2008) National incident management system, p 156. Accessed 15 July 2015
  17. Dohaney J (2013) Educational theory and practice for skill development in the geosciences. Dissertation, University of Canterbury
  18. Dohaney J, Brogt E, Kennedy B, Wilson TM, Fitzgerald R (2014) The volcanic hazards simulation. VHUB: collaborative volcano research and risk mitigation. Accessed 15 July 2016
  19. Dohaney J, Brogt E, Kennedy B, Wilson TM, Lindsay JM (2015) Training in crisis communication and volcanic eruption forecasting: design and evaluation of an authentic role-play simulation. J Appl Volcanol 4(1):1–26. doi: 10.1186/s13617-015-0030-1
  20. Dohaney J, Brogt E, Wilson TM, Hudson-Doyle E, Kennedy B, Lindsay J, Bradley B, Johnston DM, Gravley D (2016) Improving science communication through scenario-based role-plays. National project fund research report, Ako Aotearoa, National Centre for Tertiary Teaching Excellence, Wellington, New Zealand
  21. Donovan LA, Maclntyre PD (2004) Age and sex differences in willingness to communicate: communication apprehension and self-perceived competence. Commun Res Rep 21(4):420–427
  22. Ellis K (1995) Apprehension, self-perceived competency, and teacher immediacy in the laboratory-supported public speaking course: trends and relationships. Commun Educ 44:64–77
  23. Engleberg IN, Ward SM, Disbrow LM, Katt JA, Myers SA, O’Keefe P (2016) The development of a set of core communication competencies for introductory communication courses. Commun Educ 4523(July):1–18. doi: 10.1080/03634523.2016.1159316
  24. Errington EP (1997) Role-play. Higher Education Research and Development Society of Australasia Incorporated, Australia
  25. Errington EP (2011) Mission possible: using near-world scenarios to prepare graduates for the professions. Int J Teach Learn High Educ 23(1):84–91
  26. Fischhoff B (1995) Risk perception and communication unplugged: twenty years of process. Risk Anal 15(2):137–145. doi: 10.1111/j.1539-6924.1995.tb00308.x
  27. Fisher A (1991) Risk communication challenges. Risk Anal 11(2):173–179
  28. Fortney SD, Johnson DI, Long KM (2001) The impact of compulsive communicators on the self-perceived competence of classroom peers: an investigation and test of instructional strategies. Commun Educ 50(4):357–373. doi: 10.1080/03634520109379261
  29. Friese S, Ringmayr TG (2011) ATLAS.ti 6 User Guide. Accessed 15 July 2016
  30. Glik DC (2007) Risk communication for public health emergencies. Annu Rev Publ Health 28(1):33–54. doi: 10.1146/annurev.publhealth.28.021406.144123
  31. Grunig JE, Repper FC (1992) Strategic management, publics and issues. In: Grunig JE (ed) Excellence in public relations and communication management. Lawrence Erlbaum Associates, Hillsdale, NJ, pp 117–157
  32. Hales TC, Cashman KV (2008) Simulating social and political influences on hazard analysis through a classroom role playing exercise. J Geosci Educ 56(1):54–60
  33. Hammer O (2015) PAST: Paleontological Statistics instruction manual, Version 3. Natural History Museum, University of Oslo, Norway, p 243
  34. Harpp KS, Sweeney WJ (2002) Simulating a volcanic crisis in the classroom. J Geosci Educ 50(4):410–418
  35. Haynes K, Barclay J, Pidgeon N (2007) The issue of trust and its influence on risk communication during a volcanic crisis. Bull Volcanol 70(5):605–621
  36. Haynes K, Barclay J, Pidgeon N (2008) Whose reality counts? Factors affecting the perception of volcanic risk. J Volcanol Geotherm Res 172(3):259–272
  37. Heath C (2000) The technical and non-technical skills needed by Canadian-based mining companies. J Geosci Educ 48(1):5–18
  38. Herrington A, Herrington J (2006) Authentic learning environments in higher education. IGI Global, Hershey, PA, USA. doi: 10.4018/978-1-59140-594-8
  39. Herrington J, Reeves TC, Oliver R (2014) Authentic learning environments. In: Spector JM, Merrill MD, Elen J, Bishop MJ (eds) Handbook of research on educational communications and technology. Springer, New York, NY, pp 401–412. doi: 10.1007/978-1-4614-3185-5
  40. Hobden BJ, Houghton BF, Davidson JP, Weaver SD (1999) Small and short-lived magma batches at composite volcanoes: time windows at Tongariro volcano. J Geol Soc 156(5):865–868. doi: 10.1144/gsjgs.156.5.0865
  41. Hudson-Doyle EE, Johnston DM, McClure J, Paton D (2011) The communication of uncertain scientific advice during natural hazard events. N Z J Psychol 40(4):39–50
  42. IAVCEI Task Group on Crisis Protocols (2016) Toward IAVCEI guidelines on the roles and responsibilities of scientists involved in volcanic hazard evaluation, risk mitigation, and crisis response. Bull Volcanol 78:31
  43. Jones F, Ko K, Caulkins J, Tompkins D, Harris S (2010) Survey of hiring practices in geoscience industries. CWSEI Report, University of British Columbia, p 18
  44. Kreps GL, Query JL (1990) Health communication and interpersonal competence. Speech communication essays to commemorate the 75th anniversary of the Speech Communication Association, pp 293–323
  45. Kruger J, Dunning D (2009) Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Psychology 1:30–46
  46. Lincoln YS, Guba EG (1985) Naturalistic inquiry. Sage Publications
  47. Livingstone I (1999) Role-playing planning public inquiries. J Geogr High Educ 23(1):63–76. doi: 10.1080/03098269985605
  48. Lombardi MM (2007) Authentic learning for the 21st century: an overview. Educause Learning Initiative
  49. Luk CL, Wan WWN, Lai JCL (2000) Consistency in the choice of social referent. Psychol Rep 86:925–934
  50. Lunce LM (2006) Simulations: bringing the benefits of situated learning to the traditional classroom. J Appl Educ Technol 3(1):37–45
  51. Lundeberg MA, Fox PW, Puncochar J (1994) Highly confident but wrong: gender differences and similarities in confidence judgments. J Educ Psychol 86(1):114–121
  52. Lundeberg MA, Fox PW, Brown AC, Elbedour S (2000) Cultural influences on confidence: country and gender. J Educ Psychol 92(1):152–159. doi: 10.1037//0022-0663.92.1.152
  53. Maddrell AMC (1994) A scheme for the effective use of role plays for an emancipatory geography. J Geogr High Educ 18(2):155–163
  54. Maguire P, Pitceathly C (2002) Key communication skills and how to acquire them. Br Med J 325(September):697–700
  55. Maguire P, Booth K, Elliott C, Jones B (1996) Helping health professionals involved in cancer care acquire key interviewing skills: the impact of workshops. Eur J Cancer 32A(9):1486–1489
  56. McCroskey JC (1982) Communication competence and performance: a research and pedagogical perspective. Commun Educ 31(January):102–109
  57. McCroskey J (2006) An introduction to rhetorical communication: a western rhetorical perspective, 9th edn. In: Bowers K, Wheel B (eds). Englewood Cliffs, New Jersey, p 333
  58. McCroskey JC, McCroskey LL (1988) Self-report as an approach to measuring communication competence. Commun Res Rep 5(2):108–113
  59. McCroskey JC, Daly JA, Richmond VP, Falcione RL (1977) Studies of the relationship between communication apprehension and self-esteem. Hum Commun Res 3(3):269–277
  60. McCroskey JC, Beatty MJ, Kearney P, Plax TG (1985) The content validity of the PRCA-24 as a measure of communication apprehension across communication contexts. Commun Q 33(3):165–173. doi: 10.1080/01463378509369595
  61. Miller MD (1987) The relationship of communication reticence and negative expectations. Commun Educ 36:228–235
  62. Miller S (2008) So where’s the theory? On the relationship between science communication practice and research. In: Cheng D, Claessens M, Gascoigne T, Metcalfe J, Schiele B, Shi S (eds) Communicating science in social contexts. Springer Science and Business Media, Netherlands, pp 275–287
  63. Miller S, Fahy D (2009) Can science communication workshops train scientists for reflexive public engagement? The ESConet experience. Sci Commun 31(1):116–126. doi: 10.1177/1075547009339048
  64. Ministry of Civil Defence and Emergency Management (2009) The guide to the national civil defence emergency management plan, 3rd edn, 266 pp
  65. Morgan MG, Fischhoff B, Bostrom A, Atman CJ (2002) Risk communication: a mental models approach, 1st edn. Cambridge University Press, New York, NY
  66. Morreale S, Pearson J (2008) Why communication education is important: the centrality of the discipline in the 21st century. Commun Educ 57(2):224–240
  67. Pielke RA (2007) The honest broker: making sense of science in policy and politics, 1st edn. Cambridge University Press, Cambridge, UK
  68. Reynolds B, Seeger M (2005) Crisis and emergency risk communication as an integrative model. J Health Commun 10(1):43–55
  69. Reynolds BJ, Shenhar G (2016) Crisis and emergency risk. In: Koenig and Schultz’s Disaster medicine: comprehensive principles and practices, p 390
  70. Richmond VP, McCroskey JC, McCroskey LL (1989) An investigation of self-perceived communication competence and personality orientations. Commun Res Rep 6(1):28–36
  71. Roth WM, Roychoudhury A (1993) The development of science process skills in authentic contexts. J Res Sci Teach 30(2):127–152. doi: 10.1002/tea.3660300203
  72. Rovins JE, Wilson TM, Hayes J, Jensen SJ, Dohaney J, Mitchell J, Johnston DM, Davies A (2015) Risk assessment handbook. GNS science miscellaneous series 84:67 pp
  73. Rubin RB, Morreale SP (1996) Setting expectations for speech communication and listening. New Dir High Educ 96:19–29
  74. Rubin RB, Rubin AM, Jordan FF (1997) Effects of instruction on communication apprehension and communication competence. Commun Educ 46(2):104–114. doi: 10.1080/03634529709379080
  75. Shearer R, Davidhizar R (2003) Using role play to develop cultural competence. J Nurs Educ 42(6):273–276
  76. Trench B, Miller S (2012) Policies and practices in supporting scientists’ public communication through training. Sci Publ Policy 39(6):722–731
  77. United Nations International Strategy for Disaster Risk Reduction (2015) Sendai framework for disaster risk reduction 2015–2030. Geneva, Switzerland
  78. Van Ments M (1999) The effective use of role-play: practical techniques for improving learning. Kogan Page Publishers
  79. Weingart P, Engels A, Pansegrau P (2000) Risks of communication: discourses on climate change in science, politics, and the mass media. Publ Underst Sci 9(3):304. doi: 10.1088/0963-6625/9/3/304
  80. Whittle SR, Eaton DGM (2001) Attitudes towards transferable skills in medical undergraduates. Med Educ 35(2):148–153. doi: 10.1046/j.1365-2923.2001.00773.x
  81. Wolfe EW, Hoblitt RP (1996) Overview of eruptions. In: Fire and mud: eruptions and lahars of Mount Pinatubo, Philippines, vol 18, pp 3–20. doi: 10.2307/3673980

Copyright information

© The Author(s) 2017

Open Access  This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Geoscience Education Research Group, University of Canterbury, Christchurch, New Zealand
  2. Academic Services Group, University of Canterbury, Christchurch, New Zealand
  3. Department of Geological Sciences, University of Canterbury, Christchurch, New Zealand
