Abstract
It is widely held that leadership consists of behaviors that should be applied strategically and systematically to motivate individuals and teams to perform. It is also self-evident that we live in a world of automation, artificial intelligence, and expert systems. Given these two assertions, we propose that some aspects of leadership are candidates for automation. This paper briefly reviews relevant leadership literature and describes three leadership behaviors that may be automated: goal setting, performance monitoring, and performance consequences. The paper also explores how different embodiments of the artificial leader convey varying degrees of social presence, and the effect of this presence on performance and satisfaction outcomes.
1 Introduction
In the movie “The Terminator”, a computer system in the future has become self-aware and is leading an onslaught against the human race. This theme of intelligent robots has been played out consistently in books, movies, and other media. Similarly, leadership has been a topic of study and reflection for thousands of years, but it is almost always thought of in a human context – humans leading other humans. Recently, some studies have examined the relationship between, and performance of, human-agent teams, and others have examined the role of humans leading these artificial agent teams [1, 2]. The current examination is unique in that it explores whether an artificially intelligent machine is capable of providing limited leadership to a person for a specific task.
There are many different definitions of leadership. As Stogdill [3] points out in a review of leadership research, there are almost as many definitions of leadership as there are people who have tried to define it. Peter Northouse offers the following observations and definition that will serve as the working definition of leadership in this paper: “leadership is a process whereby an individual influences a group of individuals to achieve a common goal” [4]. The main difference in this study is that an artificially intelligent agent will influence a group of individuals to achieve the goal. Recent leadership research has concentrated on transformational leadership [5, 6] and a charismatic human leader creating a vision that people will want to follow. However, as important as transformational leadership is, it has some weaknesses [4, 7]. Transactional leadership is still required in a vast set of circumstances and is closely aligned with effective management [8, 9]. A transformational leader needs to create the vision and a transactional leader will ensure that the vision is implemented and both are necessary and complementary [10]. A transactional leader does not individualize the needs of subordinates or focus on their personal development. Transactional leaders exchange things of value with subordinates to advance their own and their subordinates’ agendas [11,12,13].
As intelligent systems advance and become more ubiquitous, we need to explore new dimensions of human-computer interaction based on natural communication patterns and consideration of human individual differences. Next-generation information systems will involve both the automated delivery of human-like communication and the interpretation of human verbal and non-verbal messages [14,15,16,17]. Given these assertions, the ability for a computer system to have a knowledge base on which to draw in order to deliver appropriate messages to a human user is an ambitious undertaking and is a novel conceptualization for information systems [18].
In this paper, we focus on transactional leadership and propose that this form of leadership need not be confined to human-to-human interactions. We take the position that transactional leadership can be automated. In other words, a system can be developed and/or trained to provide leadership to human counterparts in a computer-to-human interaction for a limited project. We propose to automate three leadership behaviors and explore the social presence of the automated leader as a moderating variable.
2 Research Background
At the macro-level, this research paper proposes to examine the relationships between automated leadership, social presence, task performance, and follower satisfaction. Figure 1 shows the proposed relationships between each of these constructs. The very essence of leadership is to improve performance and develop followers. Leadership theories posit that leadership consists of behaviors that should be applied strategically and systematically to motivate individuals and teams to perform [19,20,21,22,23,24,25].
2.1 Automated Leadership
There are multiple supervisory behaviors that have shown a positive impact on performance [22, 26,27,28]. We propose to apply an automated leadership style, which highlights the importance of certain behaviors, such as providing information and developing goals [29]. Research in virtual teams has shown that effective leaders in distributed teams are extremely efficient at providing regular, detailed, and prompt communication with their peers and in articulating role relationships (responsibilities) among the virtual team members [30]. Three leadership behaviors have been selected for automation: goal setting, performance monitoring, and performance consequences.
Goal Setting.
A goal is a desired state or outcome [31]. According to Locke and Latham, goals affect performance through four mechanisms. First, they serve a directive function. Second, goals have an energizing function. Third, goals affect persistence. Fourth, goals affect action indirectly by leading to the arousal, discovery, and/or use of task-relevant knowledge and strategies. Locke and Latham showed that the highest and most difficult goals produced the highest levels of effort and performance [31]. They also found that specific, difficult goals led to consistently higher performance than urging people to do their best. Atkinson [32] showed that there was an inverse, curvilinear relationship between task difficulty (measured as probability of success) and performance. The highest level of effort occurred when the task was moderately difficult. Therefore, effective leaders will set goals of appropriate difficulty to stimulate the optimal performance according to a given team’s capability. For the ad hoc nature of this leader and follower experiment, effective goal setting involves formulating specific, challenging and time-constrained objectives [33].
Performance Monitoring.
Antonakis et al. [19] noted that transactional leadership is the ability to control and monitor outcomes. Research by Larson and Callahan [34] looked at the role of monitoring on performance. They hypothesized that performance monitoring would have an independent effect upon work behavior and found that monitoring improved subjects’ work output independently of other factors. Similarly, Brewer [35] found that the quantity of work improved when monitored. Aiello and Kolb [36] examined the role of electronic performance monitoring and social context on productivity and stress. They found that individually monitored participants were vastly more productive than those monitored at the group level for a simple task. More recent research has evaluated how electronic performance monitoring systems impact emotion, performance, and satisfaction [37, 38]. Therefore, effective leaders will actively monitor the performance of individual team members and the team as a whole.
Performance Consequences.
Bass [39, 40] argued that theories of leadership primarily focused on follower goal and role clarification and the ways leaders rewarded or sanctioned follower behavior. Similarly, Larson and Callahan [34] found that monitoring along with consequences (feedback to the subjects about their performance during the task) significantly increased subjects’ work output and that this provided the largest increase in productivity. Thus, how well a leader is able to monitor performance and influence the team’s behavior is a measure of transactional leadership ability. Follower behavior can be shaped by effectively providing feedback and appropriate consequences. Consequences can be defined as either motivating/reinforcing events or as disciplining/punishing ones [41, 42]. Komaki et al. [27, 28] expanded this definition to include consequences that are neutral and informational in character. For this study, we use their definition of performance consequences, which is defined as communicating an evaluation of or indicating knowledge of another’s performance, where the indication can range from highly evaluative to neutral. This type of communication is vital for performance and compliance [27, 43]. An artificial system can operate by creating explicit structures that make clear what is required of the subordinate team members and the rewards that they will receive for following instructions. Punishments can also be clearly stated, and a computer system can then be coded to use operant conditioning on followers. Komaki provides several examples of positive, negative, and neutral consequences, listed below:
Positive

- “You have done good work; no signs of errors!”
- “Great, you have done it so quickly.”

Negative

- “You have made a great deal of errors.”
- “Oh no. You have done this all wrong.”

Neutral

- “You have over 300 open cases.”
- “He made a call yesterday for those materials.”
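As a sketch of how the consequence selection above might be coded, the following maps measured speed and accuracy to a positive, negative, or neutral (informational) statement. The thresholds, goal values, and message wording here are assumptions for illustration, not the study’s actual parameters:

```python
# Illustrative sketch of operant-style consequence selection.
# Goal values, pacing rule, and message text are hypothetical.

def select_consequence(entries_done, minutes_elapsed, accuracy,
                       goal_entries=35, total_minutes=30, goal_accuracy=0.93):
    """Map speed and accuracy to a (valence, message) pair, Komaki-style."""
    expected_so_far = goal_entries * (minutes_elapsed / total_minutes)
    on_pace = entries_done >= expected_so_far
    accurate = accuracy >= goal_accuracy

    if on_pace and accurate:
        return ("positive", "Great, you are on pace and showing no signs of errors!")
    if not on_pace and not accurate:
        return ("negative", "You have made a great deal of errors and are behind pace.")
    # Mixed performance gets a neutral, purely informational statement.
    return ("neutral",
            f"You have entered {entries_done} records at {accuracy:.0%} accuracy.")
```

The classification into three valences mirrors Komaki’s expanded definition: mixed results produce a neutral, knowledge-of-performance statement rather than praise or sanction.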
2.2 Social Presence
Social presence is the sense that one is together with another. It encompasses the idea that embodied agents have a persona that causes natural reactions from human beings. Heeter [44] said that this phenomenon relates to the apparent existence of and feedback from the other entity in the communication and that social presence is the extent to which other beings in the world appear to exist and react to the user. Biocca et al. [45] posit that social presence may be the byproduct of reading or simulating the mental states of virtual others and that social presence is related to theory of mind. They state that when users interact with agents or robots, they “read minds” and respond socially, even when they know that no mind or social other really exists. They continue that although humans know that the “other” is just ink on paper or patterns of light on a screen, the social responses are automatic [45]. Similarly, Computers as Social Actors (CASA) theory proposes that human beings interact with computers as though computers were people [46]. In multiple studies, researchers have found that participants react to interactive computer systems no differently than participants react to other people [47]. It is suggested that people fail to critically assess the computer and its limitations as an interaction partner [48] and, as a result, the norms of interaction observed between people occur no differently between a person and a computer [49]. CASA has been used in multiple studies to provide structure for experimentation. Similar studies include instances where computers have been specifically designed to praise or criticize performance [50], to display dominant or submissive cues [51, 52], to flatter participants [53], to explore the role of gender and flattery [54], or to display similar or dissimilar interaction cues with participants [51].
More recent studies have shown how individuals may form group relations with computer agents [55], how social presence affects interaction in a virtual environment [56], and that social presence factors contribute significantly to building trustworthy online exchange relationships [57].
2.3 Text-Based Automated Agents
One level of interaction for the automated leaders is simply to send a text-based message. Each of the leadership behaviors described above can be put into an agent that is “unembodied” and communicates with the follower through text messages that appear on the screen. We propose to use several levels of social presence/embodiment with text-based being the least “present”. There are several reasons to use an embodied face over only sound and text when communicating and interacting with individuals. People interacting with embodied agents tend to interpret both nonverbal cues and the absence of nonverbal cues. Embodied agents can effectively communicate an intended emotion through animated facial expressions [58]. The nonverbal interactions between individuals include significant conversational cues and facilitate communication. Incorporating nonverbal conversational elements into an automated leader may increase the engagement and satisfaction of individuals interacting with the agent [59,60,61,62,63]. We anticipate that this lowest level of presence will moderate both performance and satisfaction and should provide greater performance than no leadership at all.
2.4 Embodied Automated Agents
The next level of presence for the adaptive intelligent agent in our experiment will be a “flat”, embodied agent. The primary means the automated leader has for affecting its follower are the signals and messages it sends to the human via its rendered, embodied interface. For this paper, embodied agents refer to virtual, three-dimensional human likenesses that are displayed on computer screens. While they are often used interchangeably, it is important to note that the terms avatar and embodied agent are not synonymous. If an embodied agent is intended to interact with people through natural speech, it is often referred to as an Embodied Conversational Agent, or ECA [17]. The signals available to the agent take on three primary dimensions, which are appearance, voice, and size. The appearance can be manipulated to show different demeanors, genders, ethnicities, hair colors, clothing, hairstyles, and face structures. One study of embodied agents in a retail setting found a difference in gender preferences. Participants preferred the male embodied agent and responded negatively to the accented voice of the female agent. However, when cartoonlike agents were used, the effect was reversed and participants liked the female cartoon agent significantly more than the male cartoon [64].
Embodied Conversational Agents are becoming more effective at engaging human subjects as though they were intelligent individuals. Humans engage with virtual agents and respond to their gestures and statements. When the embodied agents react to human subjects appropriately and make appropriate responses, participants report finding the interaction satisfying. On the other hand, when the agents fail to recognize what humans are saying, and respond with requests for clarification or inappropriate responses, humans can find the interaction very frustrating [65]. It has been proposed that Embodied Conversational Agents could be used as an interface between users and computers [66]. While humans are relatively good at identifying expressed emotions from other humans whether static or dynamic, identifying emotions from synthetic faces is more problematic. Identifying static expressions was particularly difficult, with expressions such as fear being confused with surprise, and disgust and anger being confused with each other. When synthetic expressions are expressed dynamically, emotion identification improves significantly [58].
In one study on conversational engagement, conversational agents were either responsive to conversational pauses by giving head nods and shakes or not. Thirty percent of human storytellers in the responsive condition indicated they felt a connection with the conversational agent, while none of the storytellers to the non-responsive agents reported a connection. In this study, embodied conversational agent responsiveness was limited to head movement, and facial reactions were fixed. Subjects generally regarded responsive avatars as helpful or disruptive, while 75% of the subjects were indifferent towards the non-responsive avatars. Users talking to responsive agents spoke longer and said more, while individuals talking to unresponsive agents talked less and had proportionally greater disfluency rates and frequencies [67] (Fig. 2).
Agents that are photorealistic need to be completely lifelike, with natural expressions, or else individuals perceive them negatively, and a disembodied voice is actually preferred and found to be clearer. When cartoon figures are utilized, three-dimensional characters are preferred over two-dimensional characters, and whole-body animations are preferred over talking heads [61]. Emotional demeanor is an additional signal that can be manipulated as an effector by the automated leader based on its desired goals, probable outcomes, and current states. The emotional state display may be determined from the probability that desired goals will be achieved. Emotions can be expressed through animation movements and facial expressions, which may be probabilistically determined based on the agent’s expert system [61]. There are limitless possible renderings that may influence human perception and affect the agent’s operating environment. Derrick and Ligon [68] showed that these types of agents could use influence tactics such as impression management techniques to change user perceptions of the automation. Moreover, it has been shown that these perceptions change user/follower behavior, including how people speak and interact with the agent [69]. Finally, Nunamaker and colleagues review how these types of agents have been tested and deployed in various contexts [16].
2.5 Hologram-Based Automated Agents
An alternate technology that is widely deployed in interactive entertainment environments is a projection display known as Pepper’s Ghost. While often referred to in the mainstream media as a hologram, this is a form of 2D display technology that creates an illusion of depth under limited viewing conditions and angles. Technological advancements have produced impressive visualizations and immersive experiences, as evidenced by recent highly publicized “live” stage performances by celebrities who are not present (e.g. Narendra Modi), animated characters (e.g., Hatsune Miku, Madonna with the Gorillaz), and digital recreations of deceased celebrities (e.g., Tupac, Michael Jackson). These visualizations are also used at high-end amusement parks where the immersive experience is critical to the visitor’s experience, such as Disney World’s “Haunted Mansion” and “Phantom Manor” and Universal Studios’ Harry Potter “Hogwarts Express” ride. We have developed a prototype limited viewing angle pseudo-hologram (LVAH) based system from readily available 2D COTS systems to use as the automated leader with the most social presence.
Technological trusting beliefs result from social presence, or the warmth, sociability, and feeling of human contact, and can be achieved through simulating interaction with another real person [70]. Trust in technology also depends upon machine accuracy, responsivity, predictability, and dependability [71]. We propose that the more socially present leader that is created using a LVAH will moderate performance and satisfaction of the followers. We will measure the followers’ perceptions of social presence in each of the types of leadership agents and test how this moderates the outcome variables. Figure 3 shows the apparatus to create the hologram agent and Fig. 4 shows the embodied hologram agent that will be used in the study.
3 Method
We have created a task where students have to input fake alumni information into an online system. We generated 500 fake names, addresses and phone numbers, and printed them out on sheets of paper. We ran a control study with no leadership where students were told, “We are capturing addresses and contact information for recent UNO Alumni that will be used to send information, fundraise, and help build the UNO community. In front of you, there is a sheet of alumni’s information that must be input into the system. Please use the data entry screen on the computer to input this data. Work as quickly and as accurately as possible, as your performance is based on both the number and quality of the data that you have captured. After you have input each person’s contact information, please press the submit button to store it to the database. You will input data for thirty minutes and then we will ask you about your experience”. The students then input the data for thirty minutes and were thanked for their participation. Based on this control group, the average user could input approximately 26 names in 30 min (25.7) with a standard deviation of 4.9. We measured the accuracy of the data input against the gold standard of the generated names stored in the database by comparing each field entered to the actual data. Figure 5 shows the user input screen.
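The accuracy measurement just described — comparing each entered field to the generated gold-standard record — can be sketched as follows. The record schema (name/address/phone) and identifiers are hypothetical, since the paper does not specify the database layout:

```python
# Sketch of field-by-field accuracy scoring against the stored gold data.
# Record IDs and the name/address/phone schema are illustrative assumptions.

GOLD = {
    "rec-001": {"name": "Jane Doe", "address": "123 Elm St", "phone": "555-0100"},
}

def accuracy(entered, gold=GOLD):
    """Return the fraction of entered fields that exactly match the gold record."""
    total = correct = 0
    for rec_id, fields in entered.items():
        for field, value in fields.items():
            total += 1
            if gold.get(rec_id, {}).get(field) == value:
                correct += 1
    return correct / total if total else 0.0
```

For example, an entry with two of three fields matching would score roughly 0.67 under this scheme; stricter or fuzzier matching rules (e.g., case-insensitive comparison) would be easy variations.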
For the control group, the average accuracy was 86.4% with a standard deviation of 6.3%. Using this baseline data, we programmed the automated leaders to set an objective for the new followers that was two standard deviations higher than the average (i.e., 35 names input in 30 min) and one standard deviation higher for quality (i.e., 93%). For all of the leadership conditions, regardless of embodiment, the following script was used, delivered either as text (for the text-based leader) or by voice (for the embodied leaders):
“Hello. I am a new automated manager and will be your leader for this task. We are capturing addresses and contact information for recent UNO Alumni that will be used to send information, fundraise, and help build the UNO community. In front of you, there is a sheet of alumni’s information that must be input into the system. Please use the data entry screen on the computer to input this data. Work as quickly and as accurately as possible, as your performance is based on both the number and quality of the data that you have captured. After you have input each person’s contact information, please press the submit button to store it to the database. You will input data for thirty minutes and then we will ask you about your experience. I will monitor your performance. The average person can input about 30 people’s contact information in 30 min. Based on your education and personality profile, I think that 35 is a reasonable goal for you. Please don’t let me down. When you press the OK button on the screen, I will start the timer. Please ask my human assistant if you have any questions”.
The bolded sections in the text highlight where the agent is establishing itself as the leader and setting a performance objective for the follower. The non-bolded text is the same as the control group instructions. Once the user starts the experiment, the system monitors performance and provides appropriate feedback at defined intervals. The system architecture is shown in Fig. 6 below.
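The goal quoted in the script is derived from the control baseline described earlier: two standard deviations above the mean for quantity and one standard deviation above the mean for accuracy. A minimal sketch of that arithmetic, with the exact rounding rule as an assumption to match the reported 35 names and 93%:

```python
# Goal thresholds from the reported control-group baseline:
# quantity goal = mean + 2 SD, accuracy goal = mean + 1 SD.
# Truncation vs. rounding is an assumption chosen to reproduce 35 and 93.

def set_goals(mean_entries=25.7, sd_entries=4.9,
              mean_accuracy=86.4, sd_accuracy=6.3):
    quantity_goal = int(mean_entries + 2 * sd_entries)   # 35.5 -> 35 names
    accuracy_goal = round(mean_accuracy + sd_accuracy)   # 92.7 -> 93 percent
    return quantity_goal, accuracy_goal
```

Note that the script also tells followers the “average person” inputs about 30 names; the 35-name target is framed as a stretch goal personalized to the follower.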
As described earlier, the system will then communicate these objectives to followers electronically via a chat client or as an embodied conversational agent (e.g., a SPECIES agent) [14]. Moreover, the artificially intelligent leader will be able to monitor individual team members’ performance using electronic performance monitoring techniques as the users participate in the virtual context. Finally, the system is programmed to use operant conditioning on its followers, and the automated leader has specific pre-programmed statements that it will send to followers at appropriate intervals depending on their performance. There are multiple studies that evaluate leadership from an operant perspective [26,27,28,29, 41]. Transactional leadership from an operant perspective was chosen for automation because it can be limited to inducing only basic exchanges with followers. In essence, the programmed psychology of the artificial leader will be operant conditioning [72]. Figure 7 shows the experiment flow. The initial and second feedback are measured against progress toward the stated goal. All subsequent feedback is based on the performance of the prior five minutes. This allows the user to get more positive feedback if his or her performance improves over the initial baseline but is still short of the goal.
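The feedback schedule just described — the first two checks scored against goal pace, every later check scored against the participant’s own prior five-minute window — might be sketched as follows; the function names are illustrative:

```python
# Sketch of the feedback-scheduling rule: checks 1 and 2 are judged against
# progress toward the stated goal; later checks are judged against the
# participant's prior five-minute window of output.

def feedback_basis(check_number):
    """Return which baseline a given feedback check is scored against."""
    return "goal_pace" if check_number <= 2 else "prior_window"

def beat_prior_window(window_counts):
    """For later checks: did the most recent five-minute window improve
    on the one before it? Expects at least two window totals."""
    return window_counts[-1] > window_counts[-2]
```

This windowed comparison is what lets a slow starter still earn positive reinforcement for improvement even while behind the overall goal.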
The leader has a battery of possible feedback based on performance. Below are two samples of the feedback. The first is an example of feedback at 9 min where the participant has good speed, but poor accuracy and the second is an example of positive feedback for both speed and accuracy at 15 min.
Example 1.
Your speed is excellent, and you are on track to make our goal! However, I have also checked the quality of your data entries and they are not acceptable. You need to be more precise. Being fast is not good, if your data quality is so poor. Thank you for your effort, but you need to improve. Your quality is worse than most of the people that have worked on this task. Please be more accurate.
Example 2.
Thank you so much for your effort! You are really doing a fantastic job. Your speed and accuracy are in the top tier of all of the people that have worked on this task. You are doing a remarkable job and are on pace to be one of the best participants.
The experiment concludes with the leader thanking the participants and telling them that they have done an excellent job. After completing the task, the participants are given a post-survey that measures outcome and process satisfaction [73], Leader-Member Exchange (LMX) [74], the Leader Behavior Description Questionnaire [4], and the degree of social presence [45, 75] of the artificial leader.
Transactional leadership theory indicates that leadership is an exchange process based on the fulfillment of contractual obligations and is typically represented as setting objectives and monitoring and controlling outcomes [19]. The object of our study is to measure the effectiveness of an information system in providing this type of leadership. We control for natural team capability by random assignment to the various embodied artificial leaders. We will perform comparisons between the control group (no leader) and the presence of leadership with the manipulations being the embodiment. Transactional leadership has been operationalized as setting goals, performance monitoring and performance consequences. These behaviors should directly affect the team performance. Figure 8 shows the final operationalized model.
Summary of hypotheses:

Automated Leadership Will Improve Follower Performance

1. Automated Leadership will increase the number of data entries.
2. Automated Leadership will increase the accuracy of data entries.
3. Social presence will have a positive moderating effect on performance outcomes.

Follower Satisfaction

4. Automated Leadership will decrease process satisfaction.
5. Automated Leadership will increase outcome satisfaction.

Perceptions of the Leader

6. The greater the social presence of the artificial leader, the better the follower’s perception of the leader.
The automated leader obviously has several limitations, many of which are grounded in its assumptions. First, it assumes a rational follower who is largely motivated by simple reward and who exhibits predictable behavior. Its programmed psychology is Behaviorism, including classical conditioning [76] and operant conditioning [72]. In addition, the task is very narrow, with limited interaction and consequences.
4 Conclusion
As technology advances and virtual leadership becomes the norm, our view of leadership must evolve. Similarly, the ability of machines to exhibit leadership traits needs to be evaluated. Our overarching proposition is that an information system can perform equal to or better than a human at providing transactional leadership to a human follower. We propose to explore these propositions and questions by conducting experiments where leadership is clearly defined and consistently measured [77]. A saying often attributed to Charles Darwin holds that “In the long history of humankind (and animal kind, too) those who learned to collaborate and improvise most effectively have prevailed”. This new phenomenon of machine leadership is part of this next evolution, and we must understand how it impacts individual and team dynamics. Humans and machines are collaborating in new ways, and organizations are increasingly leveraging human-automation teams. Siri (Apple’s conversational assistant), Alexa (Amazon’s conversational agent), physical robots, virtual customer-service agents, and many other pseudo-intelligent agents use text clues, vocal cues, or other environmental sensors to retrieve information from the user, process it, and respond appropriately. These agents help individuals complete everyday tasks such as finding directions, getting help when ordering goods or services on a website, or learning more about a topic or idea. Humans still use automated agents mainly for simple, utilitarian tasks, but these types of assistants are increasingly able to undertake larger and more important tasks. While intelligent agents present a potential solution, it is not fully understood how humans will actually interact with digitized experts or whether humans utilize intelligent agents in ways different from traditional human-to-human collaboration.
References
Few, D.A., Bruemmer, D.J., Walton, M.C.: Dynamic leadership for human-robot teams. In: Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction, pp. 333–334. ACM, New York (2006)
Jones, H., Hinds, P.: Extreme work teams: using SWAT teams as a model for coordinating distributed robots. In: Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work, pp. 372–381. ACM, New York (2002)
Stogdill, R.M.: Handbook of Leadership: A Survey of Theory and Research. Free Press, New York (1974)
Northouse, P.G.: Leadership: Theory and Practice. Sage Publications, Thousand Oaks (2018)
Avolio, B.J.: Promoting more integrative strategies for leadership theory-building. Am. Psychol. 62, 25–33 (2007)
Ng, T.W.H.: Transformational leadership and performance outcomes: analyses of multiple mediation pathways. Leadersh. Q. 28, 385–417 (2017)
Yukl, G.: An evaluation of conceptual weaknesses in transformational and charismatic leadership theories. Leadersh. Q. 10, 285–305 (1999)
Deichmann, D., Stam, D.: Leveraging transformational and transactional leadership to cultivate the generation of organization-focused ideas. Leadersh. Q. 26, 204–219 (2015)
McCleskey, J.A.: Situational, transformational, and transactional leadership and leadership development. J. Bus. Stud. Q. 5, 117–130 (2017)
Kanji, G.K., Sã, P.M.E.: Measuring leadership excellence. Total Qual. Manag. 12, 701–718 (2001)
Kuhnert, K.W., Lewis, P.: Transactional and transformational leadership: a constructive/developmental analysis. Acad. Manag. Rev. 12, 648–657 (1987)
Kunhert, K.W.: Transforming leadership: developing people through delegation. In: Bass, B., Avolio, B. (eds.) Improving Organizational Effectiveness Through Transformational Leadership, pp. 10–25. Sage, Thousand Oaks (1994)
Tyssen, A.K., Wald, A., Spieth, P.: The challenge of transactional and transformational leadership in projects. Int. J. Proj. Manag. 32, 365–375 (2014)
Derrick, D.C., Jenkins, J., Nunamaker Jr., J.F.: Design principles for special purpose, embodied, conversational intelligence with environmental sensors (SPECIES) agents. AIS Trans. Hum.-Comput. Interact. 3, 62–81 (2011)
Elkins, A., Derrick, D.: The sound of trust: voice as a measurement of trust during interactions with embodied conversational agents. Group Decis. Negot. 22, 897–913 (2013)
Nunamaker, J.F., Briggs, R.O., Derrick, D.C., Schwabe, G.: The last research mile: achieving both rigor and relevance in information systems research. J. Manag. Inf. Syst. 32, 10–47 (2015)
Nunamaker Jr., J.F., Derrick, D.C., Elkins, A.C., Burgoon, J.K., Patton, M.W.: Embodied conversational agent (ECA) based kiosk for automated interviewing. J. Manag. Inf. Syst. 28, 17–49 (2011)
Burgoon, J.K., Derrick, D.C., Elkins, A.C.: Sociocultural intelligence and intelligent agents. IEEE Intell. Syst. 26, 84–87 (2011)
Antonakis, J., Avolio, B.J., Sivasubramaniam, N.: Context and leadership: an examination of the nine-factor full-range leadership theory using the Multifactor Leadership Questionnaire. Leadersh. Q. 14, 261–295 (2003)
Falbe, C.M., Yukl, G.: Consequences for managers of using single influence tactics and combinations of tactics. Acad. Manag. J. 35, 638–652 (1992)
Hiller, N.J., Day, D.V., Vance, R.J.: Collective enactment of leadership roles and team effectiveness: a field study. Leadersh. Q. 17, 387–397 (2006)
Komaki, J.L., Citera, M.: Beyond effective supervision: identifying key interactions between superior and subordinate. Leadersh. Q. 1, 91–105 (1990)
Mawhinney, T.C.: Effective leadership in superior-subordinate dyads: theory and data. J. Organ. Behav. Manag. 25, 37–78 (2005)
Waldman, D.A., Javidan, M., Varella, P.: Charismatic leadership at the strategic level: a new application of upper echelons theory. Leadersh. Q. 15, 355–380 (2004)
Yukl, G., Tracey, J.B.: Consequences of influence tactics used with subordinates, peers, and the boss. J. Appl. Psychol. 77, 525–535 (1992)
Goltz, S.: A review of J. L. Komaki’s leadership from an operant perspective. J. Organ. Behav. Manag. 25, 73–81 (2005)
Komaki, J.L., Minnich, M.L.R., Grotto, A.R., Weinshank, B., Kern, M.J.: Promoting critical operant-based leadership while decreasing ubiquitous directives and exhortations. J. Organ. Behav. Manag. 31, 236–261 (2011)
Komaki, J.L., Desselles, M.L., Bowman, E.D.: Definitely not a breeze: extending an operant model of effective supervision to teams. J. Appl. Psychol. 74, 522–529 (1989)
Horner, M.: Leadership theory: past, present and future. Team Perform. Manag. 3, 270–287 (1997)
Kayworth, T.R., Leidner, D.E.: Leadership effectiveness in global virtual teams. J. Manag. Inf. Syst. 18, 7–40 (2002)
Locke, E.A., Latham, G.P.: Building a practically useful theory of goal setting and task motivation: a 35-year odyssey. Am. Psychol. 57, 705–717 (2002)
Atkinson, J.: Towards experimental analysis of human motivation in terms of motives, expectancies, and incentives. In: Atkinson, J. (ed.) Motives in Fantasy, Action, and Society, pp. 288–305. Van Nostrand, Princeton (1958)
Berson, Y., Halevy, N., Shamir, B., Erez, M.: Leading from different psychological distances: a construal-level perspective on vision communication, goal setting, and follower motivation. Leadersh. Q. 26, 143–155 (2015)
Larson, J.R., Callahan, C.: Performance monitoring: how it affects work productivity. J. Appl. Psychol. 75, 530–538 (1990)
Brewer, N.: The effects of monitoring individual and group performance on the distribution of effort across tasks. J. Appl. Soc. Psychol. 25, 760–777 (1995)
Aiello, J.R., Kolb, K.J.: Electronic performance monitoring and social context: impact on productivity and stress. J. Appl. Psychol. 80, 339–353 (1995)
Kahn, D.: Impact of electronic performance monitoring on call centre employees performance (2016)
Nicolaou, N.: Electronic performance monitoring: the crossover between self-discipline and emotion management (2015)
Bass, B.M.: Leadership and Performance Beyond Expectations. Free Press, New York (1985)
Bass, B.M.: The Bass Handbook of Leadership – Theory, Research & Managerial Applications. Free Press, New York (2008)
Ashour, A.S., Johns, G.: Leader influence through operant principles: a theoretical and methodological framework. Hum. Relat. 36, 603–626 (1983)
Luthans, F., Rosenkrantz, S.A., Hennessey, H.W.: What do successful managers really do? An observation study of managerial activities. J. Appl. Behav. Sci. 21, 255–270 (1985)
Rost, K.A., Wilmer, D.R., Haas, E.J.: An operant analysis of leadership practices in mining. J. Saf. Health Environ. Res. 11, 234–241 (2015)
Heeter, C.: Being there: the subjective experience of presence. Presence: Teleoper. Virtual Environ. 1, 262–271 (1992)
Biocca, F., Harms, C., Burgoon, J.K.: Toward a more robust theory and measure of social presence: review and suggested criteria. Presence: Teleoper. Virtual Environ. 12, 456–480 (2003)
Nass, C., Steuer, J., Tauber, E.R.: Computers are social actors. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Celebrating Interdependence, pp. 72–78. ACM, Boston (1994)
Nass, C., Moon, Y., Morkes, J., Kim, E.-Y., Fogg, B.J.: Computers are social actors: a review of current research. In: Human Values and the Design of Computer Technology, pp. 137–161. Cambridge University Press, Stanford (1997)
Nass, C., Moon, Y.: Machines and mindlessness: social responses to computers. J. Soc. Issues 56, 81–103 (2000)
Hall, B., Henningsen, D.D.: Social facilitation and human-computer interaction. Comput. Hum. Behav. 24, 2965–2971 (2008)
Nass, C., Steuer, J.: Voices, boxes, and sources of messages. Hum. Commun. Res. 19, 504–527 (1993)
Moon, Y., Nass, C.: Are computers scapegoats? Attributions of responsibility in human-computer interaction. Int. J. Hum.-Comput. Stud. 49, 79–94 (1998)
Nass, C., Moon, Y., Fogg, B.J., Reeves, B., Dryer, C.: Can computer personalities be human personalities? In: Conference Companion on Human Factors in Computing Systems, pp. 228–229. ACM, Denver (1995)
Fogg, B.J., Nass, C.: Silicon sycophants: the effects of computers that flatter. Int. J. Hum.-Comput. Stud. 46, 551–561 (1997)
Lee, E.-J.: Flattery may get computers somewhere, sometimes: the moderating role of output modality, computer gender, and user gender. Int. J. Hum.-Comput. Stud. 66, 789–800 (2008)
Xu, K., Lombard, M.: Persuasive computing: feeling peer pressure from multiple computer agents. Comput. Hum. Behav. 74, 152–162 (2017)
Greenwald, S.W., Wang, Z., Funk, M., Maes, P.: Investigating social presence and communication with embodied avatars in room-scale virtual reality. In: Beck, D., Allison, C., Morgado, L., Pirker, J., Khosmood, F., Richter, J., Gütl, C. (eds.) Immersive Learning Research Network. CCIS, pp. 75–90. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-60633-0_7
Lu, B., Fan, W., Zhou, M.: Social presence, trust, and social commerce purchase intention: an empirical research. Comput. Hum. Behav. 56, 225–237 (2016)
Kätsyri, J., Sams, M.: The effect of dynamics on identifying basic emotions from synthetic and natural faces. Int. J. Hum.-Comput. Stud. 66, 233–242 (2008)
Bailenson, J.N., Swinth, K., Hoyt, C., Persky, S., Dimov, A., Blascovich, J.: The independent and interactive effects of embodied-agent appearance and behavior on self-report, cognitive, and behavioral markers of copresence in immersive virtual environments. Presence: Teleoper. Virtual Environ. 14, 379–393 (2005)
Fox, J., Ahn, S.J., Janssen, J.H., Yeykelis, L., Segovia, K.Y., Bailenson, J.N.: Avatars versus agents: a meta-analysis quantifying the effect of agency on social influence. Hum.-Comput. Interact. 30, 401–432 (2015)
Gratch, J., Rickel, J., Andre, E., Cassell, J., Petajan, E., Badler, N.: Creating interactive virtual humans: some assembly required. IEEE Intell. Syst. 17, 54–63 (2002). https://doi.org/10.1109/mis.2002.1024753
Gratch, J., Okhmatovskaia, A., Lamothe, F., Marsella, S., Morales, M., van der Werf, R.J., Morency, L.-P.: Virtual rapport. In: Gratch, J., Young, M., Aylett, R., Ballin, D., Olivier, P. (eds.) IVA 2006. LNCS, vol. 4133, pp. 14–27. Springer, Heidelberg (2006). https://doi.org/10.1007/11821830_2
Bailenson, J.N., Yee, N.: Digital chameleons: automatic assimilation of nonverbal gestures in immersive virtual environments. Psychol. Sci. 16, 814–819 (2005)
McBreen, H.M., Jack, M.A.: Evaluating humanoid synthetic agents in e-retail applications. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 31, 394–405 (2001)
Kenny, P., Parsons, T.D., Gratch, J., Rizzo, A.A.: Evaluation of Justina: a virtual patient with PTSD. In: Prendinger, H., Lester, J., Ishizuka, M. (eds.) IVA 2008. LNCS, vol. 5208, pp. 394–408. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-85483-8_40
Deng, Z., Bailenson, J., Lewis, J.P., Neumann, U.: Perceiving visual emotions with speech. In: Gratch, J., Young, M., Aylett, R., Ballin, D., Olivier, P. (eds.) IVA 2006. LNCS, vol. 4133, pp. 107–120. Springer, Heidelberg (2006). https://doi.org/10.1007/11821830_9
Derrick, D.C., Ligon, G.S.: The affective outcomes of using influence tactics in embodied conversational agents. Comput. Hum. Behav. 33, 39–48 (2014)
Pickard, M.D., Burgoon, J.K., Derrick, D.C.: Toward an objective linguistic-based measure of perceived embodied conversational agent power and likeability. Int. J. Hum.-Comput. Interact. 30, 495–516 (2014)
Hess, T., Fuller, M., Campbell, D.: Designing interfaces with social presence: using vividness and extraversion to create social recommendation agents. J. Assoc. Inf. Syst. 10, 889–919 (2009)
Merritt, S.M., Ilgen, D.R.: Not all trust is created equal: dispositional and history-based trust in human-automation interactions. Hum. Factors 50, 194–210 (2008)
Skinner, B.F.: The phylogeny and ontogeny of behavior. Science 153, 1205–1213 (1966)
Briggs, R.O., Reinig, B.A., de Vreede, G.-J.: Meeting satisfaction for technology-supported groups: an empirical validation of a goal attainment model. Small Group Res. 37, 585–611 (2006)
Graen, G.B., Uhl-Bien, M.: Relationship-based approach to leadership: development of leader-member exchange (LMX) theory of leadership over 25 years: applying a multi-level, multi-domain perspective. Leadersh. Q. 6, 219–247 (1995)
Bente, G., Rüggenberg, S., Krämer, N.C., Eschenburg, F.: Avatar-mediated networking: increasing social presence and interpersonal trust in net-based collaborations. Hum. Commun. Res. 34, 287–318 (2008)
Pavlov, I.P.: Conditioned reflexes: an investigation of the physiological activity of the cerebral cortex. Ann. Neurosci. 17, 136–141 (2010)
Pfeffer, J.: The ambiguity of leadership. Acad. Manag. Rev. 2, 104–112 (1977)
© 2018 Springer International Publishing AG, part of Springer Nature
Derrick, D.C., Elson, J.S.: Automated leadership: influence from embodied agents. In: Nah, F.F.-H., Xiao, B.S. (eds.) HCI in Business, Government, and Organizations. HCIBGO 2018. LNCS, vol. 10923. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-91716-0_5
Print ISBN: 978-3-319-91715-3
Online ISBN: 978-3-319-91716-0