
Design and development methodologies of Kkongalmon, a location-based augmented reality game using mobile geographic information

  • DongHyun Youm
  • SangHyun Seo
  • Jung-Yoon Kim
Open Access
Research
Part of the following topical collections:
  1. Real-time Image and Video Processing in Embedded Systems for Smart Surveillance Applications

Abstract

Augmented reality is a rapidly growing area of interactive design that allows virtual content to be seamlessly integrated with displays of real-world scenes. Along with the meteoric rise of smart mobile devices capable of producing compelling augmented reality environments, the field of interactive game content has been explored. Pokémon GO, released by Niantic, is a representative game implemented with location-based augmented reality (AR) technology. Although it builds on the famous and powerful Pokémon intellectual property (IP), the game is a good example of how location-based AR technology using mobile geographic information can be utilized. In the Korean game industry, however, AR-based games receive little attention. This paper aims to describe the current situation of, and future prospects for, AR technology and to identify location-based AR game development methods, problems, and improvements by investigating Kkongalmon. In addition, this paper presents efficient development methodologies, including design guidelines, tools, and interfaces, for AR applications using mobile geographic information. This will improve the understanding of user experience in mobile AR applications.

Keywords

Location-based game · Augmented reality · Mobile geographic information · Development methodology

Abbreviations

AR: Augmented reality

GPS: Global Positioning System

HMD: Head-mounted display

IP: Intellectual property

MIT: Massachusetts Institute of Technology

PDA: Personal digital assistant

RFID: Radio frequency identification

VE: Virtual environment

VR: Virtual reality

1 Introduction

As a field of virtual reality (VR), augmented reality (AR) is a technology that superimposes a computer-generated virtual world on the real world viewed by users. Compared to VR, which is composed of artificial and virtual objects only, AR gives users a greater sense of immersion and reality because it is grounded in the real world. Since the real and virtual worlds are mixed together, AR is sometimes described in the context of mixed reality. Using AR requires neither a head-mounted display (HMD) nor high-performance devices, so it imposes fewer usability restrictions. Since AR content can easily be enjoyed on smartphones, it is expected to grow strongly thanks to advantages such as mobility and personalization. According to the VR/AR installed base by Digi-Capital shown in Fig. 1, mobile AR has been growing faster than other platforms since 2018, and this growth is likely to continue. Interestingly, according to the VR/AR sector revenue by Digi-Capital shown in Fig. 2, most AR revenue is expected to come from hardware and e-commerce. Therefore, a critical challenge for stimulating AR-based games is to develop and deploy killer content that uses AR.
Fig. 1

VR/AR installed base (M). The VR/AR installed base on mobile from 2016 to 2021

Fig. 2

AR/VR sector revenue. The VR/AR sector revenue on mobile from 2017 to 2022

AR technology augments the sense of reality by superimposing virtual objects and cues upon the real world in real time. Based on Azuma's study [1], AR is neither restricted to a particular type of display technology, such as HMDs, nor limited to the sense of sight. AR can potentially be applied to all senses, augmenting smell, touch, and hearing as well. AR can also be used to augment or substitute a user's missing senses through sensory substitution, for example augmenting the sight of blind users or users with poor vision via audio cues, or augmenting the hearing of deaf users via visual cues.

The capabilities and features of mobile devices such as smartphones, tablets, and wearables are rising exponentially, combined with ubiquitous and affordable Internet access and advances in cooperative networking, computer vision, and mobile cloud computing. Although mobile devices are more computationally constrained than traditional computers, they carry a multitude of sensors that can be used in the development of more sophisticated AR applications.

Following this, Section 2.2 introduces the related works and Section 2.3 describes the global situation of AR-based game development. Section 3 summarizes AR game development methodologies based on an investigation of practical cases and identifies key considerations for the efficient development of AR games. The design guidelines, development tools, and interfaces are also presented.

2 Methods

2.1 Aim

This paper aims to describe the current situation of, and future prospects for, AR technology and to identify location-based AR game development methods, problems, and improvements by investigating Kkongalmon. In addition, this paper presents efficient development methodologies, including design guidelines, tools, and interfaces, for AR applications using mobile geographic information. This will improve the understanding of user experience in mobile AR applications.

Kkongalmon (Chungkang College of Cultural Industries and Yong-in City) is a tourism guide game that provides a location-based augmented reality user experience. It allows users to use the camera, GPS, and mobile geographic information to obtain location information on where to look for a monster.

2.2 Related research

As for previous work related to the development of augmented reality game content, this section reviews and summarizes three topics: mobile AR for context recognition, stable object-tracking algorithms, and ways to generate virtual space for information sharing.

Augmented reality and mobile game content are expected to come into widespread use over the next two years. More AR applications are being developed for mobile platforms, such as the Wikitude AR Travel Guide launched in 2008; the Massachusetts Institute of Technology (MIT) has also developed its SixthSense prototype.

Nowadays, with new advances in technology, an increasing number of AR systems and applications are being produced, which promises to revolutionize mobile AR [2, 3]. In particular, a study on context-recognition AR was published by Hong and Woo [4]. They introduced research trends on a novel computing concept that supports seamless interaction with users by collecting, managing, and utilizing context information from people's surroundings through their personal mobile devices in a ubiquitous smart space; services and/or content are augmented into the real space according to context, selectively shared, and used for interaction and collaboration with others. Lee presented the development of a stable object-tracking algorithm for mobile environments and ways to generate a virtual space for sharing information among multiple users, and discussed how these could be applied to emotional communication between users through trackable objects in a practical application [5].

Kim surveyed research trends on mobile AR, focusing on marker-based AR and natural-image-processing-based AR schemes for implementing mobile AR games [6]. AR games extend wireless games towards being more bound up with real locations and activities and take advantage of the real-world context [7]. AR-based real-world games composed of real and virtual game elements create new and exciting gaming experiences and highly motivated learning [8]. These gaming concepts are being realized with a wide range of technology, spanning handheld systems running on a personal digital assistant (PDA) or smartphone as well as AR systems comprising a backpacked laptop and a head-mounted display [9, 10]. In addition, core technologies such as the Global Positioning System (GPS), portable displays, radio frequency identification (RFID) readers, and augmented devices such as a smartphone's Bluetooth, infrared, or camera are an integral part of almost any mobile device. As a result of the rapidly evolving and increasingly pervasive smartphone, AR is expected to become ubiquitous for leisure and mobile gaming [11].

To overcome the difficulty of calibration between the camera and the tracking device, and to resolve the problem in the ARPushPush case that markers positioned close together in the user's view harm the sense of immersion, Kim suggested an enhanced indoor AR system using simplified calibration and the tracking of hand gestures and ceiling-mounted markers [12]. In an AR-based game study maximizing interaction between virtual objects and the hands, Moon presented a system in which a user can manipulate virtual objects augmented in the game space with their hands, using a marker and an HMD [13].

2.3 Global status of AR game development

Niantic is a game development company that mainly produces AR-based games. The game Niantic developed, "Pokémon GO," has gained massive popularity. The game adopts a location-based system in which monsters appear on a real map, letting users search for and collect the monsters themselves. "The Future is Wild," a British TV documentary series about science, opened an exhibition at the Futuroscope theme park in France in 2008 in which visitors can experience a tour of a future world using an AR-enabled telescope.

A successful mobile AR system should enable users to focus on its application rather than its implementation [14]. The popular Pokémon GO, for example, is a well-known mobile AR application that offers a location-based AR mobile game experience. Pokémon GO shares many features with a previous, similar mobile AR application named Ingress, and it gained huge popularity in the first days after its release, generating almost two million US dollars of revenue per day.

Niantic has launched the Pokémon GO service in Korea. Within seven days of its launch, the number of downloads reached 7,570,000, so its popularity continues in Korea as well. Owing to the widespread popularity of Pokémon GO, similar location-based monster collection games have emerged in Korea. One of them is "Catchmon," developed by Mgame, in which users catch monsters and summon them in the form of cards. Another example is "SoulCatcher AR" by Hanbit Soft, also a location-based monster collection game. Allm, a Korean game company, has developed an AR-dedicated app, "Joy Lotty," that offers AR content in conjunction with a theme park and is designed to be played at Lotte World Adventure.

Generally, theme parks provide various AR games and mini-games at each attraction as a way of relieving boredom while visitors wait their turn in long queues. Various means are used to increase usage of these game applications, such as offering a "free Magic pass" that reduces waiting time according to the player's game score, or giving a free food/drink coupon or merchandise discount coupon for finding a hidden marker around the theme park.

2.3.1 AR game composition

Mobile AR applications mostly run on mobile or wearable devices, such as smartphones, smart glasses, tablets, and laptops, due to their mobile nature. A mobile application can be categorized as a mobile AR application if it has the following composition. First, the device's various sensors, such as a camera, gyroscope, microphone, and GPS, as well as any companion device, can serve as input components. Another component is processing: it determines the type of information that will be rendered on the screen of the mobile device, which may require access to data stored locally on the device or in a remote database. Lastly, the output is projected onto the screen of the mobile device together with the current view of the user. Figure 3 shows the mobile AR game components.
Fig. 3

Mobile AR game components. A conceptual diagram of mobile AR game components
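The input-processing-output composition described above can be expressed as a short Python sketch. All names, the frame structure, and the 100 m visibility radius are illustrative assumptions for this sketch, not details of any actual game:

```python
import math
from dataclasses import dataclass

def distance_m(a, b):
    # Rough metre distance between two (lat, lon) points,
    # using an equirectangular approximation (fine at game scales).
    dx = (b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    dy = b[0] - a[0]
    return math.hypot(dx, dy) * math.pi / 180 * 6_371_000

@dataclass
class Frame:
    camera_image: object   # input: the current camera view
    gps: tuple             # input: (lat, lon) from the GPS sensor
    orientation: tuple     # input: (yaw, pitch, roll) from the gyroscope

def process(frame, monsters):
    # Processing: decide which virtual content is relevant to this frame.
    return [m for m in monsters if distance_m(frame.gps, m["pos"]) < 100]

def render(frame, overlays):
    # Output: superimpose the selected overlays on the live camera view.
    return {"background": frame.camera_image, "overlays": overlays}
```

A single game tick would then read the sensors into a `Frame`, call `process`, and hand the result to `render`.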

The use of smartphones for ubiquitous mobile AR is popular due to their high computing power and portability, but they require the user to point and hold the device to benefit from AR applications. Recently, the pace of AR game development has never slowed in the world of mobile phones, particularly smartphones. Tablet PCs and laptops are comparatively cumbersome and limited to specific operations. AR glasses are considered the best option for ubiquitous mobile AR, as the projected information is directly superimposed on the physical world, although their computing power is limited and most applications remain quite basic.

Due to specific mobile platform requirements, mobile AR suffers from additional problems such as limited computational power and energy. It is usually required to be self-contained so as to work in unknown environments. A typical mobile AR system comprises mobile computing platforms, software frameworks, detection and tracking support, display, wireless communication, and data management. The most widely adopted devices for mobile AR are also the least powerful, due to their high portability. Depending on the application, the storage and rendering capabilities of the device, and its connectivity to the Internet, parts of the system may be executed in a cloud-based architecture [15]. In addition, an important parameter is the memory and storage requirements of the mobile application. Mobile AR browsers, for instance, project virtual objects into the view of mobile users [16].

2.3.2 AR computing platforms

The high mobility of mobile AR requires it to provide services without constraining the user's location to a limited space, which in turn requires mobile platforms to be portable, small, and light. In recent years, significant progress has been seen in the miniaturization and performance of mobile computing platforms.
  A. Mobile phones

     Mobile phones have achieved great progress in all aspects, from embedded cameras and built-in sensors to powerful processors and dedicated graphics hardware. An embedded camera is suitable for video see-through mobile AR display. Many mobile AR applications [17] were built on mobile phones, making them the predominant platform for mobile AR systems because of their minimal intrusion, social acceptance, and high portability.

  B. AR glasses

     AR glasses leverage the latest advances in mobile computing and projection display to bring mobile AR to a new level. They supply a hands-free experience with less device intrusion, working in such a way that users do not have to look down at mobile devices. For example, Google Glass [18] is a wearable AR device developed by Google.

  C. Laptop computers

     Laptop computers were typically used in early mobile AR prototypes [19] as backpack computing platforms. Compared to desktop computers, laptops are more flexible to carry; however, size and weight are still hurdles to wide acceptance by most users. Since laptop computers are configured as a backpack setup, additional display devices such as head-mounted displays are required.

  D. Personal digital assistants

     PDAs became an alternative to notebook computers before the emergence of other advanced handheld PCs. Several mobile AR applications [20] used PDAs as mobile computing platforms; however, PDAs suffer from poor computational capability and the absence of floating-point support. The smaller screen also limits the viewing angle and display resolution.

  E. Tablet personal computers

     Tablet personal computers are personal mobile computer products running Windows operating systems. Their large screen size and multi-touch technology facilitate content display and interactive operations. Many mobile AR systems [21] were built on tablet PCs; however, they are expensive and too heavy for long single-handed use.
3 Results and discussion

3.1 AR game development requirements

The overall requirements of AR game development can be described by comparison against the requirements of virtual environments. These requirements comprise three basic subsystems, which represent the core components of modern AR game development.

One of the required subsystems in AR game development is the scene generator. Rendering is not currently one of the major problems in AR. Virtual environment (VE) systems have much higher requirements for realistic images because they completely replace the real world with the virtual environment. In AR, the virtual images only supplement the real world. As a result, fewer virtual objects need to be drawn, and they do not necessarily have to be realistically rendered to serve the purposes of the application; for example, text and 3D wireframe drawings might suffice for annotation applications. In ideal scenarios, photorealistic graphic objects would be seamlessly merged with the real environment.

Another requirement for the development of an AR game is the display device. The display devices used in AR may have less stringent requirements than VE systems demand, again because AR does not replace the real world. For example, monochrome displays may be adequate for some AR applications, while virtually all VE systems today use full color. Optical see-through HMDs with a small field-of-view may be satisfactory because the user can still see the real world with his peripheral vision; the see-through HMD does not shut off the user’s normal field-of-view. Furthermore, the resolution of the monitor in an optical see-through HMD might be lower than what a user would tolerate in a VE application, since the optical see-through HMD does not reduce the resolution of the real environment.

Tracking and sensing completes the subsystems that AR game development requires. While AR has lower requirements than VE in the previous two subsystems, that is not the case for tracking and sensing, where the requirements for AR are much stricter than those for VE systems. A major reason for this is the registration problem, which makes the tracking and sensing requirements higher than those of VE.

3.2 Development methodologies

Based on the AR game "Kkongalmon," jointly developed by the Chungkang College of Cultural Industries and Yong-in City, this study explored location-based AR game development methodologies such as location-based technology, gyroscope sensor technology, and sensorless technology; see Fig. 4 for the Kkongalmon game. This game is a tourism guide game that provides a location-based augmented reality user experience. It allows the player to use the camera, GPS, and mobile geographic information to obtain location information on where to look for a monster.
Fig. 4

Kkongalmon. An example image of a Kkongalmon game

Comparatively, the outdoor single-player mobile AR game ARQuake was developed [22], which enables players to kill virtual 3D monsters with physical props. Also, Cheok et al. developed a multiplayer mobile AR game for an outdoor environment. They embedded Bluetooth into objects to associate them with virtual objects. The system could recognize and interact with the virtual counterparts when players communicate with real objects, allowing players to kill virtual enemies by physically touching other players in the real world.

3.2.1 Location-based technology using mobile geographic information

For the case investigated in this study, monster collection functionality based on location-based technology using mobile geographic information is additionally adopted. Using the GPS in the user's smartphone, the user's position is determined as shown in Fig. 5; mobile geographic information is used for this. Based on the user's location, the real world is mapped to the virtual world where the monsters live. At the moment the user's real location is reflected in the virtual world, errors frequently occur due to the nature of GPS, and as a result the user's location can appear to jump to another place. Various errors can arise from the position and timing of the satellites, from refraction as the signal propagates through the atmosphere, and from signal interference. When utilizing location-based technology, how accurately these errors are corrected becomes a critical issue. To do this, the game keeps adjacent GPS values by recording the recent location values within a several-second timeframe, calculating their average, and discarding any value whose deviation is too large.
Fig. 5

Example of utilizing location-based technology. An example of utilizing the location-based technology of the Kkongalmon game
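The averaging and outlier-rejection scheme just described can be sketched as follows. The window size, deviation threshold, and class name are illustrative assumptions; the paper does not state the game's actual parameters:

```python
import math
from collections import deque

class GpsSmoother:
    """Sliding-window filter for noisy GPS fixes: average the recent
    fixes and discard any new fix that deviates too far from them."""

    def __init__(self, window=5, max_dev_m=25.0):
        self.window = deque(maxlen=window)   # recent (lat, lon) fixes
        self.max_dev_m = max_dev_m           # reject fixes farther than this

    @staticmethod
    def _dist_m(a, b):
        # Equirectangular approximation, adequate for small distances.
        lat1, lon1 = map(math.radians, a)
        lat2, lon2 = map(math.radians, b)
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return math.hypot(x, y) * 6_371_000  # Earth radius in metres

    def _average(self):
        lats = [p[0] for p in self.window]
        lons = [p[1] for p in self.window]
        return (sum(lats) / len(lats), sum(lons) / len(lons))

    def update(self, fix):
        """Feed a raw (lat, lon) fix; return the smoothed position."""
        if self.window:
            avg = self._average()
            if self._dist_m(fix, avg) > self.max_dev_m:
                return avg        # outlier: keep reporting the average
        self.window.append(fix)
        return self._average()
```

Feeding each raw fix through `update` yields a smoothed position, and a fix that deviates too far from the recent average is discarded rather than causing the player's position to jump.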

3.2.2 Gyroscope sensor-based technology

The gyroscope sensor measures the rotational rate of the device around its three axes. To let a virtual monster exist at a specific position in the real world, it is necessary to map the real-world coordinates to virtual coordinates. The real-world camera equipped with a gyroscope sensor corresponds to a camera in the 3D virtual world, so a rotation of the camera in the real world is converted into a rotation of the camera in the virtual world. Moving to another location at the same height poses no problem for the gyroscope because GPS is used at the same time.

However, movement between places with a large height difference causes problems in accurately matching the real and virtual worlds, because height cannot be controlled even when using a gyroscope sensor and GPS together. A magnetic sensor can be an alternative to a gyroscope sensor, but it is also limited in that it only identifies the horizontal directions of north, south, east, and west and cannot identify a vertical direction; see Fig. 6 for an example of the use of the gyroscope sensor.
Fig. 6

Example of using gyroscope sensor. An example of using gyroscope sensor of the Kkongalmon game
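A minimal sketch of mapping gyroscope readings onto the virtual camera is shown below. It integrates raw angular rates into Euler angles; the names and the axis mapping are illustrative assumptions, and a production engine would typically use quaternions with sensor fusion to limit drift:

```python
class VirtualCamera:
    """Mirrors device rotation onto the virtual-world camera by
    integrating gyroscope angular rates over time."""

    def __init__(self):
        self.yaw = self.pitch = self.roll = 0.0   # degrees

    def on_gyro_sample(self, wx, wy, wz, dt):
        # wx, wy, wz: angular rates (deg/s) around the device's axes,
        # integrated over the sample interval dt (seconds).
        self.pitch += wx * dt
        self.yaw   += wy * dt
        self.roll  += wz * dt

    def orientation(self):
        # Current virtual-camera orientation, applied to the 3D scene.
        return (self.yaw, self.pitch, self.roll)
```

Each gyroscope sample rotates the virtual camera by the same amount as the physical device, which is what keeps a virtual monster anchored to a real-world direction as the player turns.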

3.2.3 Sensorless technology

As large numbers of low-priced smartphones have recently entered the market, the trend is to remove sensors from smartphones to cut costs. For this reason, gyroscope and magnetic sensors are absent from low-priced smartphones, and since these sensors are essential for implementing AR technology, AR cannot be implemented on such phones in the usual way. To overcome this, an alternative way to adopt AR without sensors has been sought: allowing users to experience the feeling of residing in a virtual world through an illusion effect rather than a physical approach. Since users collecting monsters are not aware of the exact location and direction where a monster is positioned, forcibly showing a monster at a certain location in the user's camera view makes users feel that they are using AR.
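The illusion effect can be sketched as follows; the deterministic seeding and the screen margins are illustrative assumptions rather than the game's actual logic:

```python
import random

def place_monster_sensorless(screen_w, screen_h, encounter_id):
    """Sensorless fallback: with no gyroscope or magnetometer, pick a
    deterministic on-screen anchor per encounter so the monster appears
    to occupy a stable spot over the live camera feed."""
    rng = random.Random(encounter_id)        # same encounter -> same spot
    x = rng.uniform(0.25, 0.75) * screen_w   # keep away from the edges
    y = rng.uniform(0.30, 0.70) * screen_h
    return (x, y)
```

Because the anchor depends only on the encounter, the monster stays at the same on-screen spot for that encounter, which is enough to sustain the illusion without any orientation sensing.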

3.3 AR game development design guidelines

The development of mobile AR gaming has produced guidelines for the design and evaluation of future mobile AR game applications covering presentation, content, functionality, the user's experience during actions, the specific context, and the experience being invoked. The details are shown in Table 1.
Table 1

AR game design guidelines

  • Use the context for providing content: Use geolocation and object detection to provide corresponding information.

  • Deliver content relevant to the task: Clear and consistent UI; personalized, dynamically changing content. In addition, a social aspect (i.e., Facebook, Twitter, collaborative environments) improves the user's experience with the mobile AR application.

  • Provide useful interactions with the AR content: Provide information related to the product or object that empowers user interaction and experience.

  • Inform about content privacy: Inform the user how the information collected about the user's device will be used, and provide privacy awareness for the AR content.

  • Provide feedback about the infrastructure's behavior: Provide different configurations of an application regarding quality and resource requirements, and update the interaction according to users' interactions.

  • Support procedural and semantic memory: Focus the UI design on usability to make it easier for non-experienced users to interact with AR applications. Furthermore, some AR content interactions need not be extrapolations of real-world gestures (i.e., some users prefer other approaches to interacting with AR objects and content).

These key ideas allow the development of frameworks for context- and user-based immersion, which improves user experience. Context immersion can be defined as user awareness of the real context through interactions with the AR content. To get a better understanding of user experience in mobile AR applications, context dimensions such as time- and location-based tracking, object-based context immersion, and user-based context immersion are considered.

3.4 AR game development tools

AR game development tools allow applications to integrate AR capabilities into their functionality. These capabilities include having a museum exhibit tell its own story, helping the user decide which piece of furniture looks better in their living room, bringing an elephant the user just drew on a piece of paper to life, or warning the user about signs they ignored while driving.

Currently, numerous AR game development tools exist that can be used to develop applications for smartphones, tablets, or smart glasses. Table 2 lists, for each product, the company that distributes it, its distribution license, and the platforms it supports.
Table 2

AR game development tools [26]

Product | Company | License | Supported platforms

ARPA SDKs | Arpa Solutions | Commercial | Android, iOS (ARPA SDKs); Google Glass (ARPA GLASS SDK); Android, iOS, Windows PC (ARPA Unity Plugin)

ARLab SDKs | ARLab | Commercial | Android, iOS

DroidAR | | Free and commercial | Android

Metaio SDK | Metaio | Free and commercial | Android, iOS, Windows PC, Google Glass, Epson Moverio BT-200, Vuzix M-100, Unity

Vuforia SDK | Qualcomm | Free and commercial | Android, iOS, Unity

Wikitude SDK | Wikitude GmbH | Commercial | Android, iOS, Google Glass, Epson Moverio, Vuzix M-100, Optinvent ORA1, PhoneGap, Titanium, Xamarin

DroidAR is an open-source framework that provides location-based AR for Android applications; it offers functionality for gesture detection, marker detection, and support for static and animated 3D objects, loaded via the model loaders of the libGDX game development framework, that the user can interact with. Location-based tracking is also supported by the Wikitude AR SDK, which additionally offers image recognition and tracking, 3D model rendering and animations, video overlays, and image, text, button, video, and HTML augmentations. Location tracking is likewise offered by the Metaio SDK, which supports, among others, 2D image, 3D object, face, barcode and QR code scanning, continuous visual search, and gesture detection. Moreover, the ARPA SDK is complemented by the ARPA GPS SDK for geolocation-based AR functionality.

Image multi-detection and multi-tracking are supported by the ARPA SDKs, which also provide features such as real-time three-dimensional object rendering and user interaction with 3D objects for building AR applications on iOS and Android. ARLab offers object-tracking, image-tracking, and virtual-button SDKs, available for both Android and iOS; its AR Browser SDK allows users to add and remove POIs independently of the scene in real time, interact with them, and perform actions on them. The Vuforia SDK features multi-target detection, target tracking, virtual buttons, Smart Terrain, and extended tracking; Vuforia supports the detection of several kinds of targets, such as objects, images, and English text.

3.5 AR development interfaces

Creating appropriate techniques for intuitive interaction between the user and the virtual content of mobile AR game applications is one of the most important aspects of augmented reality. The main ways of interacting in AR applications are as follows: tangible AR game interfaces, hybrid AR game interfaces, emerging multimodal interfaces, and collaborative AR game interfaces.
  A. Tangible AR game interfaces

     Tangible interfaces support direct interaction with the real world by exploiting the use of real physical objects and tools. The VOMAR application developed in [23] is a classic example of the power of tangible user interfaces: it enables a person to select and rearrange the furniture in an AR living-room design application using a real, physical paddle. Paddle motions are mapped to intuitive gesture-based commands, such as "scooping up" an object to select it for movement or hitting an item to make it disappear, in order to provide the user with an intuitive experience.

  B. Hybrid AR game interfaces

     Hybrid interfaces combine different but complementary interfaces along with the possibility of interacting through a wide range of interaction devices [24]. They provide a flexible platform for unplanned, everyday interaction where it is not known in advance which type of interaction display or device will be used.

  C. Multimodal AR game interfaces

     Multimodal interfaces combine real-object input with naturally occurring forms of language and behavior such as speech, touch, natural hand gestures, or gaze. These types of interfaces have emerged more recently. An example of multimodal interaction is the work of Lee et al. [25], which makes use of gaze and blinking to interact with objects. This type of interaction is now being widely developed and is sure to be one of the preferred types of interaction for future augmented reality applications, as it offers a relatively robust, efficient, expressive, and highly mobile form of human-computer interaction that matches users' preferred interaction styles.

  D. Collaborative AR game interfaces

     Collaborative AR interfaces use multiple displays to support remote and co-located activities. Co-located sharing uses 3D interfaces to improve the physical collaborative workspace. In remote sharing, AR is able to effortlessly integrate multiple devices at multiple locations to enhance teleconferences.

4 Conclusion

Augmented reality is a rapidly growing area of interactive design that allows virtual content to be seamlessly integrated with displays of real-world scenes. Along with the meteoric rise of smart mobile devices capable of producing compelling augmented reality environments, the field of interactive game content has been explored. This study addressed the methodologies used for developing a location-based AR game using mobile geographic information by investigating one practical case, the game "Kkongalmon." The technologies necessary for implementing location-based AR games, such as the GPS sensor, gyroscope sensor, and magnetic sensor, were explored, as was a method for running the game even on a low-priced smartphone that lacks the necessary sensors. The case study shows good practice in actively coping with recent changes, such as the emergence of low-priced sensorless smartphones, at a time when many diversified location-based AR games, including Pokémon GO, are being continuously released. The design guidelines improve the understanding of user experience in mobile AR applications. In the future, further study will be needed to improve the system by enhancing the algorithm that reduces GPS error and by dealing flexibly with the user's location jumping due to location errors. In addition, since a location-based AR game can transmit geographical information to the user, the user can make positive use of this information. This presents a new marketing opportunity: unlike the unidirectional information delivery of TV and magazine ads, it offers an opportunity for two-way interaction.

Notes

Acknowledgements

This research was supported by Basic Science Research Program through the National Research Foundation of Korea(NRF) funded by the Ministry of Education (No.2016R1D1A1B03935378).

Funding

There is no funding for this research.

Availability of data and materials

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

Authors’ contributions

D-HY, S-HS, and J-YK have jointly designed the research program. D-HY and J-YK have initiated this particular study and drafted the manuscript. S-HS has done the discourse analysis. All the authors read and approved the final manuscript.

Ethics approval and consent to participate

As this is a non-experimental, non-interventional study not involving patients, patients’ tissue and/or data, an ethics approval or waiver thereof does not apply.

Consent for publication

This manuscript does not include any individual person’s data in any form (including individual details, images or videos).

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Copyright information

© The Author(s). 2019

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. Department of Computer Science, Gachon University, Seongnam-si, Republic of Korea
  2. Division of Media Software, Sungkyul University, Anyang-si, Republic of Korea
  3. Graduate School of Game, Gachon University, Seongnam-si, Republic of Korea