Behavioral Telemetry in Games User Research

A chapter in Game User Experience Evaluation

Part of the book series: Human–Computer Interaction Series (HCIS)

Abstract

Within the past few years, the adoption of business analytics has provided powerful new tools to the interactive entertainment industry, giving rise to the field of game analytics. Where user testing was traditionally limited to samples, it is today possible to obtain behavioral telemetry data from entire populations of players, and to map second-by-second interactions in the user testing lab. In this chapter, the focus is on the behavioral side of user experience in games, rather than on user experience itself. The chapter outlines what behavioral telemetry is and its role in game user research, from an introductory, top-down perspective. It also introduces data mining as a toolbox for analyzing telemetry datasets, large or small. Finally, several case studies showcase how behavioral telemetry data can be used to evaluate game design in order to optimize the user experience.

References

  • Amsel A (1990) Arousal, suppression, and persistence: frustration theory, attention, and its disorders. Cognit Emot 4:239–268

  • Berry M, Linoff G (1999) Mastering data mining: the art and science of customer relationship management. Wiley, New York

  • Bohannon J (2010) Game-miners grapple with massive data. Science 330(6000):30–31

  • Bordens K, Abbott BB (2013) Research design and methods: a process approach. McGraw-Hill, New York

  • Canossa A, Drachen A (2009) Patterns of play: play-personas in user-centered game development. In: Proceedings of DiGRA 2009 (London, United Kingdom), DiGRA Publishers, DiGRA Digital Library: http://www.digra.org/dl/display_html?chid=http://www.digra.org/dl/db/09287.49165.pdf. Accessed 1 April 2015

  • Canossa A, Drachen A, Rau Møller Sørensen J (2011) Arrgghh!!!—blending quantitative and qualitative methods to detect player frustration. In: Proceedings of the 2011 Foundations of Digital Games Conference (Bordeaux, France), ACM Publishers

  • Davenport TH (2012) Enterprise analytics: optimize performance, process and decisions through big data. Pearson FT Press, Boston

  • Davenport TH, Harris JG (2007) Competing on analytics: the new science of winning. Harvard Business School Press, Boston

  • Demers MN (2008) Fundamentals of geographical information systems. Wiley, New York

  • DeRosa P (2007) Tracking player feedback to improve game design. Gamasutra, August 7 (2007). http://www.gamasutra.com/view/feature/1546/tracking_player_feedback_to_.php. Accessed 1 April 2015

  • Dix A, Finlay JE, Abowd GD, Beale R (2007) Human-computer interaction, 3rd edn. Prentice Hall Publishers, Upper Saddle River

  • Drachen A, Canossa A (2009) Towards gameplay analysis via gameplay metrics. In: Proceedings of the 13th MindTrek 2009 (Tampere, Finland), ACM-SIGCHI Publishers, pp 202–209

  • Drachen A, Canossa A (2011) Evaluating motion. Spatial user behavior in virtual environments. Int J Arts Technol 4(3):294–314

  • Drachen A, Schubert M (2013) Spatial game analytics and visualization. In: Proceedings of IEEE Computational Intelligence in Games, pp 1–8. doi:10.1109/CIG.2013.6633629

  • Drachen A, Yannakakis GN, Canossa A, Togelius J (2009) Player modeling using self-organization in Tomb Raider: Underworld. In: Proceedings of IEEE Computational Intelligence in Games

  • Drachen A, Nacke L, Yannakakis G, Pedersen AL (2010) Correlation between heart rate, electrodermal activity and player experience in First-Person Shooter games. In: Proceedings of the 5th ACM SIGGRAPH, ACM-SIGGRAPH Publishers, pp 49–54. doi:10.1145/1836135.1836143

  • Drachen A, Sifa R, Bauckhage C, Thurau C (2012) Guns, swords and data: clustering of player behavior in computer games in the wild. In: Proceedings of IEEE Computational Intelligence in Games, pp 163–170. doi:10.1109/CIG.2012.6374152

  • Drachen A, Canossa A, Sørensen JR (2013a) Gameplay metrics in games user research: examples from the trenches. In: Seif-El Nasr M, Drachen A, Canossa A (eds) Game analytics. Springer, London

  • Drachen A, Seif El Nasr M, Canossa A (2013b) Introduction to user analytics. May 2013, Game Developer Magazine (feature, later published on Gamasutra). http://www.gamasutra.com/view/feature/193241/intro_to_user_analytics.php

  • Drachen A, Seif El-Nasr M, Canossa A (2013c) Game analytics—the basics. In: Seif El-Nasr M, Drachen A, Canossa A (eds) Game analytics—maximizing the value of player data. Springer, London, pp 13–40 (http://www.springer.com/computer/hci/book/978-1-4471-4768-8)

  • Drachen A, Thurau C, Yannakakis G, Togelius J, Bauckhage C (2013d) Game data mining. In: Seif El-Nasr M, Drachen A, Canossa A (eds) Game analytics—maximizing the value of player data. Springer, London, pp 205–253 (http://www.springer.com/computer/hci/book/978-1-4471-4768-8)

  • Drachen A, Baskin S, Riley J, Klabjan D (2014) Going out of business: auction house behavior in the massively multi-player online game Glitch. J Entertain Comput 5:20–31. doi:10.1016/j.entcom.2014.02.001

  • Fields T, Cotton B (2011) Social game design: monetization methods and mechanics. Morgan Kaufmann Publishers, Burlington

  • Hadiji F, Sifa R, Drachen A, Thurau C (2014) Predicting player churn in the wild. In: Proceedings of the IEEE Computational Intelligence in Games, pp 131–139

  • Han J, Kamber M, Pei J (2005) Data mining: concepts and techniques, 2nd edn. Morgan Kaufmann, Amsterdam

  • Hassenzahl M, Tractinsky N (2006) User experience—a research agenda. Behav Inf Technol 25(2):91–97

  • Houghton S (2011) Balance and Flow Maps. AltDevBlogADay Newssite. http://altdevblogaday.com/2011/06/01/balance-and-flow-maps-2/. Accessed 1 April 2015

  • Isbister K, Schaffer N (2008) Game usability: advancing the player experience. Morgan Kaufmann Publishers, Massachusetts

  • Kaiser S (2013) Wooga: building a successful social game by combining metrics with emotion Social Times, July 20, 2011. http://www.adweek.com/socialtimes/wooga-building-a-successful-social-game-by-combining-metrics-with-emotion/585918

  • Kennerly D (2003) Better game design through data mining. Gamasutra, 15th August 2003. http://www.gamasutra.com/view/feature/2816/better_game_design_through_data_.php. Accessed 1 April 2015

  • Gilleade KM, Dix A (2004) Using frustration in the design of adaptive videogames. In: Proceedings of the 2004 ACM SIGCHI International Conference on Advances in computer entertainment technology, ACM, New York

  • Kim JH, Gunn DV, Phillips BC, Pagulayan RJ, Wixon D (2008) Tracking real-time user experience (TRUE): a comprehensive instrumentation solution for complex systems. In: Proceedings of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, CHI’08

  • Larose DT (2004) Discovering knowledge in data: an introduction to data mining. Wiley, Hoboken

  • Law E, Vermeeren APOS, Hassenzahl M, Blythe M (2007) Towards a UX manifesto. In: Proceedings of the 21st British HCI Group Annual Conference on HCI 2007: People and Computers XXI: HCI...but not as we know it - Volume 2 (Swinton, UK, September 3–7). British Computer Society, pp 205–206

  • Lazzaro N, Mellon L (2005) Fun meters for games. Presentation at the 2005 Austin Game Developers Conference

  • Leone M (2012) Data entry, risk management and tacos: inside Halo 4’s playtest labs. Polygon, October 24th 2012. http://www.polygon.com/2012/10/24/3538296/data-entry-risk-management-and-tacos-inside-halo-4s-playtest-labs. Accessed 1 April 2015

  • Luton W (2013) Free-to-play: making money from games you give away. New Riders, California

  • Mahlmann T, Drachen A, Canossa A, Togelius J, Yannakakis GN (2010) Predicting player behavior in Tomb Raider: Underworld. In: Proceedings of the International Conference on Computational Intelligence and Games, CIG’10

  • Mandryk RL (2008) Physiological measures for game evaluation. In: Isbister K, Schaffer N (eds) Game usability: advice from the experts for advancing the player experience. Morgan Kaufman Publishers, Burlington, pp 207–235

  • Mandryk RL, Atkins MS, Inkpen KM (2006) A continuous and objective evaluation of emotional experience with interactive play environments. In: Proceedings of CHI 2006: Novel methods: Emotions, Gestures, Events, ACM Press

  • Marsh T, Smith SP, Yang K, Shahabi C (2006) Continuous and unobtrusive capture of user-player behavior and experience to assess and inform game design and development. In: Proceedings of Fun and Games, pp 76–86

  • Medler B (2012) Play with data—an exploration of play analytics and its effect on player experiences, Ph.D. dissertation. Georgia Institute of Technology

  • Medlock MC, Wixon D, Terrano M, Romero RL, Fulton B (2002) Using the RITE method to improve products: a definition and a case study. Usability Professionals Association, Orlando

  • Mellon L (2009) Applying metrics driven development to MMO costs and risks. Versant Corporation, Redwood City

  • Minelli M, Chambers M, Dhiraj A (2013) Big data, big analytics: emerging business intelligence and analytic trends for today’s businesses. Wiley, Hoboken

  • Nacke L (2009) Affective ludology: scientific measurement of user experience in interactive entertainment. Ph.D. thesis, Blekinge Institute of Technology (BTH), Karlskrona

  • Nacke L, Drachen A (2011) Towards a framework of player experience research. In: Proceedings of the EPEX ’11 workshop (Bordeaux, France)

  • Pagulayan RJ, Keeker K (2007) Measuring pleasure and fun: playtesting. In: Wilson C (ed) Handbook of formal and informal interaction design methods. Morgan Kaufmann Publishers, Massachusetts

  • Pagulayan RJ, Keeker K, Wixon D, Romero RL, Fuller T (2003) User-centered design in games. In: Jacko J, Sears A (eds) The HCI Handbook. Lawrence Erlbaum Associates, Mahwah

  • Pruett C (2011) Hot failure: tuning gameplay with simple player metrics. Game developer magazine, September 2010. http://www.gamasutra.com/view/feature/6155/hot_failure_tuning_gameplay_with_.php?print=1. Accessed 1 April 2015

  • Ravaja N, Saari T, Turpeinen M, Laarni J, Salminen M, Kivikangas M (2006) Spatial presence and emotions during video game playing: does it matter with whom you play? Presence: Teleoperators and Virtual Environments 15(4):381–392

  • Romero R (2008) Successful instrumentation. Tracking attitudes and behaviors to improve games. Presentation at the Game Developers Conference

  • Runge J, Gao P, Garcin F, Faltings B (2014) Churn prediction for high-value players in casual social games. In: Proceedings of the Computational Intelligence in Games Conference 2014, IEEE Publishers

  • Seif El-Nasr M, Drachen A, Canossa A (eds) (2013) Game analytics—maximizing the value of player data. Springer, London, 800 pp. ISBN 978-1-4471-4769-5. http://www.springer.com/computer/hci/book/978-1-4471-4768-8

  • Seufert EB (2014) Freemium economics: leveraging analytics and user segmentation to drive revenue. Morgan Kaufmann Publishers, Waltham

  • Shaul B (2013) Monster world developer Wooga looks back at three years of success on facebook. SocialTimes, August 14, 2013. http://www.adweek.com/socialtimes/monster-world-developer-wooga-looks-back-at-three-years-of-success-on-facebook/608648. Accessed 1 April 2015

  • Sifa R, Bauckhage C, Drachen A (2014) The playtime principle: large-scale cross-games interest modeling. In: Proceedings of the IEEE Computational Intelligence in Games, pp 365–373

  • Southey F, Xiao G, Holte RC, Trommelen M, Buchanan J (2005) Semi-Automated gameplay analysis by machine learning. In: Proceedings of AIIDE

  • Thawonmas R, Iizuka K (2008) Visualization of online-game players based on their action behaviors. Int J Comput Games Technol. doi:10.1155/2008/906931

  • Thawonmas R, Kashifuji Y, Chen KT (2008) Design of MMORPG Bots Based on Behavior Analysis. In: Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology, ACE’08

  • Thompson C (2007) Halo 3: how Microsoft labs invented a new science of play. Wired Magazine

  • Thurau C, Bauckhage C (2010) Analyzing the evolution of social groups in world of Warcraft. In: Proceedings of the International Conference on Computational Intelligence and Games, IEEE, CIG’10

  • Tychsen A, Canossa A (2008) Defining personas in games using metrics. In: Proceedings of FUTURE PLAY 2008 (Toronto, Canada), ACM Publishers, pp 73–80. doi:10.1145/1496984.1496995

  • Weber BG, Mateas M, Jhala A (2011) Modeling player retention in Madden NFL 11. In: Proceedings of IAAI

  • Williams D (2014) Zombie epidemics and you, Gamasutra, 3rd February 2014, Gamasutra. http://www.gamasutra.com/blogs/DmitriWilliams/20140203/209921/Zombie_Epidemics_and_You.php. Accessed 1 April 2015

  • Williams D (2015) Prediction in the gaming industry, Part 2: prediction and gaming (Or, How to Know Your Players), Gamasutra, 2nd April 2015. http://www.gamasutra.com/blogs/DmitriWilliams/20150204/235660/Prediction_in_the_Gaming_Industry_Part_2_Prediction_and_Gaming_Or_How_to_Know_Your_Players.php. Accessed 1 April 2015

  • Wooga (2013) Monster world: keeping players engaged for more than 3 years, August 2013. http://www.wooga.com/2013/08/monster-world-keeping-players-engaged-for-more-than-3-years/. Accessed 1 April 2015

  • Yannakakis GN (2012) Game AI revisited. In: Proceedings of ACM Computing Frontiers Conference

  • Zoeller G (2011) MMO rapid content iteration. Presentation at Game Developers Conference 2011. http://gdc.gulbsoft.org/. Accessed 1 April 2015

Author information

Correspondence to Anders Drachen.

Conclusions

Behavior analysis via gameplay metrics addresses one of the major challenges in game user research: tracking and analyzing user behavior during interaction with the very complex systems that contemporary computer games represent. As a user-oriented approach, it complements existing methods in the industry, providing detailed quantitative data to supplement the qualitative and semi-quantitative data that other user research methods yield on the quality of the play experience (Lazzaro and Mellon 2005; Kim et al. 2008; Isbister and Schaffer 2008; Drachen and Canossa 2011; Seif El-Nasr et al. 2013). Alternatively, behavior analysis can be used to infer player experience, although care needs to be taken when doing so, and secondary validation is recommended (Lazzaro and Mellon 2005; Marsh et al. 2006; Nacke and Drachen 2011; Drachen and Canossa 2011).

There are many ways to work with and utilize behavioral telemetry data, during production and post-launch, and it can be challenging to generate systematic overviews of methods and principles: the choice of approach is influenced by the available resources for telemetry logging, by user research needs and, not least, by the game design. For example, traditional boxed, fire-and-forget games differ from persistent-world massively multiplayer online games and social online games, where an important focus is analysis and synthesis directed at monitoring the player population, tuning the game design on a running basis to continually optimize the play experience, and calculating business-oriented game metrics such as average revenue per user (ARPU) and churn rate (Fields and Cotton 2011). Even within the confines of a single type of game, e.g. first-person shooters, there is substantial space for developing approaches to working with gameplay metrics. On top of this come the problems of correlating behavioral data with robust and broadly applicable measures of play experience (Kim et al. 2008). Despite these challenges, a general consensus is forming in the game industry and in academia that telemetry-driven behavior analysis meshes well with other user-oriented approaches for evaluating and testing games (Kim et al. 2008; Zoeller 2011; Seif El-Nasr et al. 2013; Luton 2013; Seufert 2014). Game analytics is here to stay.
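Both of the business-oriented metrics just mentioned reduce to simple aggregates once activity and revenue telemetry are in place. As a minimal sketch in Python/pandas, assuming a hypothetical event log with one row per player per active day (file and column names are illustrative, and the churn definition used is only one of several in circulation):

    import pandas as pd

    # Hypothetical telemetry: one row per player per active day.
    # Columns (illustrative): player_id, date, revenue_usd
    events = pd.read_csv("daily_activity.csv", parse_dates=["date"])
    months = events["date"].dt.to_period("M").astype(str)

    # ARPU for a given month: total revenue over unique active players.
    march = events[months == "2015-03"]
    arpu = march["revenue_usd"].sum() / march["player_id"].nunique()

    # Monthly churn: share of players active in month t who did not
    # return in month t+1 (assumed definition; many variants exist).
    active_t = set(events.loc[months == "2015-03", "player_id"])
    active_t1 = set(events.loc[months == "2015-04", "player_id"])
    churn_rate = len(active_t - active_t1) / len(active_t)

    print(f"ARPU: {arpu:.2f} USD, churn: {churn_rate:.1%}")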

Box 1: Drill-Down Analysis

The first analyses done on gameplay telemetry are typically high-level analyses and descriptive statistics, meant to provide an overview of how the game is doing: for example, the number of players entering and leaving the game (if a live game), or the amount of time spent across different levels. These high-level views of datasets from pre-launch playtesting or post-launch live games provide an early means of detecting potential problems. However, they rarely show why specific problems occur. For example, if players are spending twice as much time on level 5 as intended, something we did not plan for is happening. To find out why, it is necessary to dive into the data and find the root causes of player behavior. This process is known as "drill-down analysis" and is a core method in data mining (Larose 2004; Han et al. 2005; Drachen et al. 2012).

Drill-down analysis is one of the most common operations carried out in game analytics, and the term is ubiquitous in game data mining contexts.

This is because the root causes of behavioral patterns are often nested at deeper levels of the behavioral data than what is apparent from high-level aggregate analyses: for example, a specific checkpoint malfunctioning for 25 % of the players, or only on some platforms. Drill-down analysis, as a data mining operation, basically means moving from summary/aggregate information to detailed data via a particular focus. For example, noticing that a group of players has suspiciously high gold income rates in an MMORPG, and working down through the summary data to investigate the raw data of these particular players, in order to figure out whether or not cheating is taking place. When performing a drill-down, we are essentially analyzing the parent attribute (e.g. gold income rate) by investigating child attributes (e.g. time spent playing, sources of gold income). At the lowest level of a drill-down analysis are the raw data.

In practice, when using business intelligence applications, drilling down is performed by selecting and querying data. This can range from simply clicking a bar in a bar chart to retrieve the underlying data, to running queries against a relational database or writing scripts for a particular drill-down path. How far a drill-down analysis can be taken depends on the granularity of the data.
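To make such a scripted drill-down path concrete, the sketch below follows the gold-income example from above: compute the parent attribute per player, flag outliers, then drill down to income by source and finally to the raw events. This is a minimal sketch; the table layout, file name and column names are hypothetical.

    import pandas as pd

    # Hypothetical raw telemetry: one row per gold-granting event.
    # Columns (illustrative): player_id, timestamp, source, gold_amount
    gold = pd.read_csv("gold_events.csv", parse_dates=["timestamp"])

    # Parent attribute: gold income per hour, per player. Playtime would
    # normally come from session telemetry; here it is approximated from
    # the span of each player's events.
    per_player = gold.groupby("player_id").agg(
        total_gold=("gold_amount", "sum"),
        first_seen=("timestamp", "min"),
        last_seen=("timestamp", "max"),
    )
    hours = (per_player["last_seen"] - per_player["first_seen"]).dt.total_seconds() / 3600
    per_player["gold_per_hour"] = per_player["total_gold"] / hours.clip(lower=1)

    # Flag suspicious players, e.g. three standard deviations above the mean.
    cutoff = per_player["gold_per_hour"].mean() + 3 * per_player["gold_per_hour"].std()
    suspects = per_player.index[per_player["gold_per_hour"] > cutoff]

    # Child attribute: income broken down by source, for the suspects only.
    by_source = (gold[gold["player_id"].isin(suspects)]
                 .groupby(["player_id", "source"])["gold_amount"].sum())

    # Lowest level: the raw events themselves, ready for manual inspection.
    raw = gold[gold["player_id"].isin(suspects)].sort_values("timestamp")
    print(by_source)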

To take an example (Fig. 7.9), consider a simple breakdown of data consisting of a few variables, e.g. average completion times for the levels of a game. At this top level, it may be noticed that one level appears to take longer to complete than the others (see graphic). To explore why, the underlying data need to be exposed, in this case a breakdown of the completion times for the individual components of the level. In this more detailed view of the data, it may be noticed that players spend a lot of time in a specific sector of the level (Main Hall). Drilling further down into the metrics data captured from the level, it may be found that the root cause is that players have trouble beating a specific type of enemy (Evil Conductors) and keep dying; the difficulty of these enemies can then be adjusted to accommodate. This walk-through maps directly onto a sequence of aggregation queries, as sketched below.
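A minimal sketch of that drill-down path, assuming hypothetical completion-time and death-event tables (the sector and enemy names come from the example above; the file and column names are illustrative):

    import pandas as pd

    # Hypothetical telemetry tables (column names are illustrative).
    times = pd.read_csv("completion_times.csv")   # player_id, level, sector, seconds
    deaths = pd.read_csv("death_events.csv")      # player_id, level, sector, enemy_type

    # Top level: average completion time per level flags the outlier level.
    per_level = times.groupby("level")["seconds"].mean().sort_values(ascending=False)
    slow_level = per_level.index[0]

    # One step down: which sector of that level consumes the time?
    per_sector = (times[times["level"] == slow_level]
                  .groupby("sector")["seconds"].mean().sort_values(ascending=False))
    slow_sector = per_sector.index[0]             # e.g. "Main Hall"

    # Bottom: deaths by enemy type in that sector point to the root cause.
    per_enemy = (deaths[(deaths["level"] == slow_level)
                        & (deaths["sector"] == slow_sector)]
                 ["enemy_type"].value_counts())
    print(per_enemy.head())                       # e.g. "Evil Conductors" on top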

If the cause of an observed pattern is not obvious from looking at the data, it can be useful to consider the actual game environment, as in the Halo cases mentioned in this chapter. This is about the closest we can get to the actual experience of playing the game, for example by mapping player telemetry directly onto the virtual environment (Drachen and Canossa 2011).
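As a minimal sketch of such a mapping, assuming hypothetical death events logged with in-game x/y coordinates (the file, column and level names are illustrative), a heat map can be binned directly from the telemetry and overlaid on the level layout:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical death events with in-game coordinates.
    # Columns (illustrative): level, x, y
    deaths = pd.read_csv("death_events.csv")
    lvl = deaths[deaths["level"] == "Level 5"]

    fig, ax = plt.subplots(figsize=(8, 6))
    # 2D histogram of death positions; a level map image could be drawn
    # underneath with ax.imshow(...) using the level's coordinate extent.
    counts, xedges, yedges, im = ax.hist2d(lvl["x"], lvl["y"], bins=64, cmap="hot")
    fig.colorbar(im, ax=ax, label="death count")
    ax.set_title("Death heat map, Level 5")
    fig.savefig("level5_deaths.png", dpi=150)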

Box 2: Managing the Allure of Numbers

For people working with game user research, behavioral telemetry is usually valuable. It provides potentially highly detailed data about player behavior, and can be obtained from very large groups of people. Quantitative data take the form of numbers, and many popular metrics calculated in the game industry and in academia are intuitively understandable, e.g. Monthly Active Users (MAU) (Luton 2013; Fields and Cotton 2011; Seif El-Nasr et al. 2013). It can therefore be tempting to rely on behavioral telemetry alone following the launch of a game, and to ignore lab- or field-based user research (see e.g. Pagulayan et al. 2003; Isbister and Schaffer 2008). However, this is dangerous, because behavioral telemetry does not permit analysis of, for example, user experience beyond what can be inferred from the telemetry trails themselves.

In game analytics, it is important to keep in mind that data can only say something about the data themselves: not about what other data should be collected, nor about factors outside the dataset that impact the results being generated. Furthermore, no measure of human behavior can include all the factors that influence that behavior. Assumptions therefore need to be made when working with behavioral telemetry in GUR, just as for any other data source on human behavior and player experience (PX) in games.

Strong knowledge of the principles of empirical research and knowledge acquisition is essential to the successful deployment of behavioral game telemetry in game user research.

Many common KPIs used in the game industry require contextual information before they can drive decision making. For example, knowing that the number of Daily Active Users (DAU) increased 20 % last week does not explain why the increase occurred, or whether it is sustainable, and it can even hide a problem: the increase may be due to a new feature that removes a key motivator for in-app purchases (IAPs), thus reducing the Life Time Value (LTV) of the player base over the same period and endangering the financial survival of the game.
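To illustrate reading KPIs in context rather than in isolation, the sketch below places DAU next to paying-player conversion and average revenue per daily active user (ARPDAU) for the same days, which is the kind of comparison that would expose the pattern described above. The logs, file names and column names are hypothetical.

    import pandas as pd

    # Hypothetical logs (column names are illustrative).
    sessions = pd.read_csv("sessions.csv", parse_dates=["date"])    # player_id, date
    purchases = pd.read_csv("purchases.csv", parse_dates=["date"])  # player_id, date, usd

    # DAU: unique players per day.
    dau = sessions.groupby("date")["player_id"].nunique()

    # Context for the same days: revenue and number of paying players.
    revenue = purchases.groupby("date")["usd"].sum().reindex(dau.index, fill_value=0.0)
    payers = purchases.groupby("date")["player_id"].nunique().reindex(dau.index, fill_value=0)

    kpis = pd.DataFrame({"dau": dau, "revenue": revenue, "payers": payers})
    kpis["conversion"] = kpis["payers"] / kpis["dau"]
    kpis["arpdau"] = kpis["revenue"] / kpis["dau"]

    # Rising DAU alongside falling conversion/ARPDAU is the pattern the
    # paragraph above warns about: growth masking a monetization problem.
    print(kpis.tail(14))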

As noted by Drachen et al. (2012), critical thinking should always be applied when evaluating player behavior: sometimes the analysis will show one thing, through a red patch on a heat map or another suspicious pattern in the data, while the problem actually rests in a minor design detail somewhere else. Small design changes in one area of a game can cause behavioral changes in an entirely different section. Just because something looks convincing does not mean it is true, which is why careful analysis is needed when suspicious or problematic patterns are detected in behavioral data. Heat maps and graphs are often intuitively understandable and travel better through organizations than two pages of text explaining a specific finding from a comprehensive user test in detail. However, data visualizations also make it easy to ignore other factors that could impact whatever is being investigated, but which are not included in the metrics-based analysis in question. Metrics analysis, data mining, and related methods all require a human element: the numbers do not make things right, human interpretation does (Han et al. 2005). The key lesson here is that game analytics is not design (Fields and Cotton 2011; Seif El-Nasr et al. 2013), but it can be an incredible help to design, providing hard evidence on how design decisions affect player behavior. Just like any other method for game user research, good research practices should inform game analytics. (© Game Analytics, used with permission, www.gameanalytics.com)

Copyright information

© 2015 Springer International Publishing Switzerland

Cite this chapter

Drachen, A. (2015). Behavioral Telemetry in Games User Research. In: Bernhaupt, R. (eds) Game User Experience Evaluation. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-15985-0_7

  • DOI: https://doi.org/10.1007/978-3-319-15985-0_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-15984-3

  • Online ISBN: 978-3-319-15985-0

  • eBook Packages: Computer Science (R0)
