1 Introduction

Addressing interactive systems is no longer only a matter of web interfaces or interacting with products [1]. Consequently, it requires a holistic view of the complex ecology of users, spaces, and technologies used on a variety of platforms [2]. From interactions with products and small ubiquitous services in the world of the Internet of Things to large interactive systems, the user and their way of experiencing the interaction have been the focus of the Human-Computer Interaction (HCI) research milieu. Nevertheless, understanding user experience in a theoretically valid way has proved to be a difficult task [3]. The attributes given to this interaction range from a dynamic, contextual, and personal experience [4] to more emotional perspectives [5]. A holistic approach may provide a foundation for understanding and addressing the experience users have within a complex ecology. This complexity also requires the use of multiple methods to discover tensions and synergies between the anticipated use of an interactive system and the actual user experience. Tensions address the usual problem of the mismatch between designers' plans and ideas and how users actually experience the results of the design process [5]. Synergies, on the other hand, address how designers and staff can use the positive effects of new insight into user needs to achieve a well-functioning interactive system. Values like empathy [6] and sense-making of this gained competence can generate radical changes rather than incremental ones [7]. The context of this research effort is an academic library, and the addressed interactive system is a discovery tool used to find and read resources the library pays for. The academic library also offers an interesting context, with specific user groups and specific tasks to be performed, resulting in a unique platform for observing and understanding effects in the library when changes are made in an interactive system.
The ecology consists of different and well-defined types of stakeholders (e.g., students, academic and library staff), with services existing in a constrained arena (digital and physical space).

In fact, libraries are often used as an arena for testing and case analysis [2, 8], and academic libraries are especially relevant, since they have to stay up to date both in regard to technology and the services offered [9, 10]. When it comes to large interactive systems in libraries, a change in the type of software used has occurred during the last decades, from in-house development to purchasing off-the-shelf software or adopting Open Source software like Koha [11]. The purchasing process itself is similar among academic libraries [12, 13]. The process of acquiring large systems is usually performed without various tests with real users in advance; it is based on a Request for Information (RFI) produced by committees and visits to other institutions, finally ending in a Request for Proposal (RFP), while the user experience remains unaddressed. When an interactive system is introduced into the ecology of the library, the majority of user experience and usability testing is focused on the web interfaces, using different usability methods [14–21]. The holistic perspective on the user experience is often neglected, as the use of the system takes place in different contexts, both inside and outside the library, on a variety of platforms, and influences other library services. In this scenario, actions derived from usability testing and usability metrics (e.g., efficiency, effectiveness, satisfaction) [22] are small cosmetic or functional adjustments to the interactive system. This paper presents a more holistic approach to the user experience, where the introduction of an interactive system is followed by multifaceted testing, including a large survey, focus groups, observations, and usability testing, and shows how the library staff gained new insight that further supported the re-design of this system, taking into account both the physical and the digital space.

2 User Experience in Interactive Systems

User experience is a fluid field of research, gaining momentum from a variety of scopes, and therefore pointing out a theoretical foundation is difficult [23]. To describe the UX between users and products or services, it is possible to use attributes ranging from dynamic, contextual, and subjective [4] to more emotional perspectives [5]. The latter can be described as a dimension including affect, aesthetics, and enjoyment; additional new values are, among others, relevance and engagement [22]. As part of this complex scene, some voices argue for and others against the possibility of using metrics and values from usability to enrich UX analysis [24]. Usability metrics address values like efficiency, effectiveness, and satisfaction (ISO 9241-11), where the latter has also been suggested as a possible common value for a person's "perceptions and responses" (ISO FDIS 9241-210 definition of user experience) [24].

When using a system, the interaction can be described as fluent, cognitive, or expressive [1]. Fluent interaction is when the user does not need to reflect upon the task to be executed, cognitive interaction requires a learning process, while expressive interaction emphasizes the construction of a common ground where bonds are made [1]. The aforementioned values help address some of the less researched factors affecting experience, like the context, situation, and temporality of the act [5]. The temporality of the act can then be discussed in regard to how it is consumed in time, where "an experience" is one occasion of, for instance, using an emergency system, while the "experience" of logging into a computer at work gives a completely different experience of interaction [1]. The "perceptions and responses" [24] of the experience change over time as the user learns and embodies how to use an interactive system. Context and situation influence the learnability of the system. In regard to interactive systems, the effects of interaction and experience over a longer period of time probably have an impact, as the user learns to use the system better and better, with or without help. This situation is assumed to influence task achievement and the satisfaction of the experience [25]. Few longitudinal studies on UX have been done [22], reducing the possibility to analyze how the user experience may evolve over a longer period of time. An academic library may offer a fruitful arena for conducting this type of study.

3 User Experience in a Library Interactive System

A shift has occurred in the type of services provided by libraries, from an "institution" of knowledge with the loan of paper books and a help desk for finding information as the primary tasks, to a place where users learn about access to quality-assured information and a place to stay and work together with other students. New technologies and new forms of knowledge-based items, like e-books and online databases, have made libraries more important than before, while on the other hand requiring them to be more proactive toward rapidly changing technologies [9, 10]. The availability of services everywhere has placed the library in a dilemma: either cannibalize its existing services in the physical space by moving them into the digital one, or try to combine services from the two spaces using a UX perspective. A holistic approach that includes the two spaces may inform how this process can be addressed, and evaluating the user experience of an interactive system may generate new insight for the library staff.

A holistic perspective on UX is usually not addressed by libraries [26], and, as mentioned in the introduction, the majority of the attempts have concerned web interfaces (digital space) or library interiors, like signs and wayfinding [27] (physical space), while a good combination of the two spaces is lacking [28]. This article argues for an approach that includes both the physical and the digital space at the same time, thereby combining what users do with what users feel and expect [29].

4 The Study

4.1 Context of the Study

The context of this study is an academic library situated in a Scandinavian country. Since the beginning of the 1990s, this library has been part of a consortium consisting of 108 research libraries sharing a large Integrated Library System (ILS). This system was developed in-house but failed over the years to migrate to platforms that could sustain new requirements from library users. The process of acquiring and implementing a new front-end system in 2013 was the first step toward the migration to a new ILS. The front-end system used in this study was adopted by all the libraries in the consortium, but each one had a local plan regarding how and when to implement the system. The system had a new discovery tool and features similar to the previous one, such as a login for users where it was possible to renew book loans. Another difference between the old system and the new one was the process of ordering books and articles: in the old system the ordering form was a web page, while now a login is required. Since the system was well-known cloud-based off-the-shelf software, it could only partly be changed. Unfortunately, in this library the migration was done with little regard for the user experience perspective. In the physical space, this academic library offers different types of services, ranging from courses on information literacy or the use of library resources to the possibility of ordering books from other libraries. The discovery tool, as part of the library ecology, directly addresses some of these services, affecting how users experience library services.

4.2 Methodology

After four months of use of the new discovery tool, the planning and effort of collecting UX data started. In approximately four weeks, a survey, usability tests, and focus group interviews were conducted, while observations of the use of the physical library were gathered on two different occasions, not only during the intensive four-week period. While the author had the overall responsibility for the theoretical anchoring, seven persons from different departments within the library, such as management, web publishing, and the help desk, formed the core project group. The project generated discussions in the library about the methods used, giving a better understanding of the necessity of various efforts to gain a better understanding of user needs. Some observations of how users approached the physical library were made before and while the four weeks of tests were underway. The mobile platform, often forgotten as part of the library ecology, was in this case addressed as one of the last points in the migration plan. The combination of surveys (quantitative data) and focus groups (qualitative data) is common in sociological studies [30], but seldom found in combination with usability tests and observations.

In this study, the results from the survey and the focus groups helped inform a first re-design of the search tool and the library's main web page, where the discovery tool's search box was placed. Together, the four methods provide broad and deep insight and complement each other. The participants of the focus groups and the usability tests had to sign a consent form about the use of the collected data, while information about the use of the results was provided on the questionnaire.

4.3 User Survey

A total of 727 answers were gathered during three hours of work in the library departments of medicine, law, science, and humanities. The survey was short, with three questions, presented on paper. Library users were invited to participate in the survey when entering the library. As a motivation to participate, we announced the possibility to win an iPad mini. The questions were as follows:

  1. Are you a: (a) Researcher, (b) Student, (c) University staff, (d) Other

  2. Have you:

     (a) Not tested the new discovery tool

     (b) Tested the new discovery tool and still using it

     (c) Tested the new discovery tool, but using only the old one

     (d) Tested the new discovery tool and using it, but also using other search services

  3. If you have tested the new discovery tool, how pleased are you?

The first and second questions could be answered with a check mark, while the third used a 5-point Likert scale, from very satisfied to very unsatisfied with the discovery tool. Although the test was done during the exam period, the number of users who participated was adequate for our test. On the other hand, the library was also visited by students from other institutions and colleges in the city, since the service declaration allows anyone above eighteen years of age to be a user of this library.
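The aggregation of such Likert answers into the rounded percentages reported later is simple enough to sketch. The following minimal Python script shows one way the third question's responses could be tallied; the scale labels follow the paper, but the counts and the helper name `likert_percentages` are illustrative assumptions, not the study's raw data.

```python
from collections import Counter

def likert_percentages(answers):
    """Share of each scale point among all answers, rounded to whole percent."""
    counts = Counter(answers)
    total = len(answers)
    return {label: round(100 * n / total) for label, n in counts.items()}

# Hypothetical tally for the 5-point satisfaction question
# (illustrative counts only, not the study's data).
responses = (
    ["very satisfied"] * 120
    + ["satisfied"] * 300
    + ["middle"] * 180
    + ["unsatisfied"] * 20
)

print(likert_percentages(responses))
```

Note that rounding each share independently means the percentages need not sum to exactly 100, which is worth stating when reporting survey results.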

4.4 Usability Tests

Using data from the questionnaire, we managed to get in contact with 10 respondents. The respondents had a good distribution of age, gender, and subjects, and they were at different stages of their studies, from bachelor to Ph.D. students. The method used to gather data was to let the respondents "think aloud" while two observers made notes of what was said and done and a moderator asked the questions. Having three persons present could be a problem, but we saw gains in doing the test this way. Additional questions were also added, when needed, by the observers to clarify the users' comments. Usability tests usually have effectiveness, efficiency, and satisfaction as quantifiable values. The approach in this part of the study assesses satisfaction, based on how the users reacted when performing the tasks, and efficiency, based on whether they managed to complete them. The data analysis was based on observations of what users did and on their comments. By avoiding a statistical approach to the data, the focus was no longer on the instrumental value of the tested system [5].

The goal of this usability test was not only to test the efficiency of and satisfaction with the discovery tool, but also whether the library's main web page, where the discovery tool's search box was placed, helped the user perform the tasks in a satisfactory manner. Using two different interfaces, with five respondents each, we expected to gain fruitful results. The first interface for the discovery tool on the library web page contained various information (see Fig. 1), while the second was inspired by Google, with one search field and five small links to other services, and was customized with the findings from the survey and the focus groups.

Fig. 1.

First interface to the discovery tool on the library website

As a motivation for participants to enroll in the test, a reward of approximately $25 was announced in advance in the questionnaire from the aforementioned user survey, where the respondent could write an email address. The time spent was approximately 45 min, during which twelve questions were asked about how the respondents approached our services to execute different types of tasks. The tasks had a clear intersection with the physical library. For instance, one of the tasks was about finding an item we had no access to at all; the results should show what users usually do, but also the expectations they had regarding the library services. Other questions were a crossover between the digital and the physical library, like finding a book and choosing to get the paper version. Access to databases and the use of other library resources were also part of the tasks given to them. These questions had both effectiveness and utility in mind, while additional questions were especially focused on how the use of the discovery tool affected the use of other services, both in the physical and the digital space.

4.5 Focus Groups

The project managed to organize three focus groups with research groups from the Department of Informatics (Design of Information Systems) with four participants, the Faculty of Law (Center for European Law) with three participants, and the Humanities (Greek and Latin) with five participants. As a sign of gratitude for their time, we offered a lunch. After a short introduction about the goal of the meeting, the participants tested the system for about 15 to 20 min and could ask us about different functionality of the interactive system or other questions about the services the library had. The whole session lasted about 2 h and was concluded with the lunch. In addition to the person asking the questions, two observers took notes. The content of the notes was discussed afterward, resulting in one report. We had eight open-ended questions to ensure all the topics would be covered. Even though it was not mandatory to follow the questions, the discussions often touched almost all the topics using fewer of them. The questions dealt with issues regarding how the discovery tool was presented on the library web pages and how the information there helped the experience and expectation of using this system. Other questions were more specific about the functionality and relevance of the results, and also addressed the intersection between the physical and digital space.

4.6 Observations

Observations were made in two phases. First, in the science library, two different efforts of one hour each were made on a prior occasion [31], while the second phase was carried out in the Humanities and Social Sciences Library. During the sessions, observations were made of what users did when approaching and using the physical library, and notes and photos were taken of their actions. The last observation was done during the same period as the other tests of this study, while the first two were done in conjunction with data gathering for a master's thesis [31]. The results of the observations were more sensitive to changes in the context than those of the other methods used. The last observation was done in November, a period when students are preparing for exams and the library is often overcrowded. This type of observation may have a constrained value, emphasizing the need for triangulation with other evaluation efforts.

5 Findings

5.1 User Survey

Out of 727 participants, the majority (88 %) were students. On the second question, about their use of library resources, 39 % had not yet tested the new discovery tool, and only 11 % did not use it or used other search services. On the last question, the majority of the users of the new system were satisfied (17 % very pleased, 50 % pleased, 30 % middle), while only 3 % were partially unsatisfied and 0 % completely unsatisfied. Even though the users aware of the new system were pleased with it, the fact that 39 % had not yet tested it four months after it went live is a relevant observation. On the other hand, we observed a willingness to explore the new discovery tool among non-users. One of them said to us: "I will go right away and test it". These findings concern how the library brands and markets its services. Another relevant piece of feedback was the naming of the interactive system: although the library had chosen a new name, the majority of the users opted to use the original name of the software in their conversations. This misconception affected the library's possibility of helping users in a fruitful way, since the information and support web pages used the new name. A positive side effect was the involvement of the library staff, as they were given the possibility of direct contact with the users in another context. Asking users for information was a new experience, and some of the library staff commented: "We should do this more often", or "What I heard was that your colleague was happy and that this was good advertising for the tool". As far as we know, the staff got only positive feedback from users when they were asked to participate in the survey, and since only three hours were used to carry out the test, the overall cost was very low.

5.2 Usability Testing

As mentioned, this part of the study focused on satisfaction and efficiency. The majority of the users completed the tasks in a similar manner. For instance, in the first interface (see Fig. 1), the tabs in the search box were not used, or used very little, since the respondents opted for the first one and then dealt with the results. They also commented that there was too much information in the same search interface (see Fig. 1) and used Google's simple search box as an example of how a starting point for their tasks should look. The usability testing also revealed some assumptions the library had about user behaviors, later confirmed by the focus groups. From a holistic point of view, a relevant task in the usability test was when the respondents had to find an item the library had no information about or access to: their prompt reaction was to move their search to Google Scholar. From our observations during usability testing, the interaction with the system was not cognitive but had a fluent value [1]. In a more descriptive way, the impression we had was of users using the interactive system so often that they did not reflect much upon their actions. Some of them said, in fact, that they did not need to get help or attend courses to learn how to use it. Since some of the questions were specific tasks to be completed, the results could also be used to address usability problems of the service from a heuristic perspective [32] and to help further development of the service.

5.3 Focus Groups

The results from the focus groups were the richest in data and revealed the need for a clearly holistic approach to the use of the interactive system and the physical library. Some participants did not know about other services the library had, like the possibility of ordering books or articles from other libraries in the country or abroad. Services like library courses on how to use the interactive system effectively were also unknown to some of them. Several participants stressed that their next step was to search Google if they did not find the book or article using the library system. The discussions also revealed a practice among researchers of sharing literature resources and search strategies, which may nurture myths about how to find relevant literature.

Some of the comments also had a clear impact on the user experience, even though they could be characterized as usability issues. For instance, the discovery tool lacked images of book covers and good icons to describe different types of materials (e.g., books, articles, and multiple versions of an item; see Fig. 2). From a usability point of view, effectiveness is affected since visual recognition is lacking, while in regard to experience, the aesthetics of the icons give a poor impression.

Fig. 2.

Icons for book, articles and multiple versions of an item

5.4 Observations

All three observations showed task-focused users. Most of the users went directly to find a computer to work on, while others went to meet fellow students to work together. Very few went directly to the bookshelves. Other observations confirm [9] that students start their search for literature from the digital space. During the observations, we also realized that this type of information covers only the users outside the entrance of the library and their behavior inside; it does not give insight into why other users do not visit the library at all. Since one of the periods was during exams, the queue outside the library before opening hours was quite long, making the library very crowded for the rest of the day and causing students to sit in every corner, even outside the main door (see Fig. 3).

Fig. 3.

Students outside the library during exams period

6 Discussion

The findings support an understanding where the library users had a clear idea of the task they had to do when using the interactive system. Just as users of the physical library know they will get help to find any type of resource, they have the same expectation when using the library's interactive system. This anticipation has an impact on how the experience will unfold when the discovery system is used. To grasp this dislocated experience, a holistic approach is needed, and factors like the context, situation, and temporality of the act [5] can be addressed, giving added value when the interactive system is designed, or needs to be re-designed, as in this case. The temporality of the user experience in the library is represented by an antecedent expectation of using the system, then the experience of using it, and finally the experience afterward. Our findings also show that users do not reflect upon the use of the library's interactive system, which is in line with the description of a fluent interaction [1]. As a result, cognitive interaction is absent, reducing the possibility of learning how to use the system in a better way. We have also noted in our case the difficulty the interactive system has in helping users find their way to the physical space to solve their unsolved tasks. The consequence may be "an experience" perceived negatively, an antecedent expectation of the system not being fulfilled, resulting in users migrating to other systems, whereas an opposite experience could create loyal users.

The results from the survey describe a picture where users say they are pleased with the system, while the experience is more nuanced when we take into account the feedback from the participants of the usability tests and focus groups, who had problems finding other services the library had that could have solved their tasks. This dichotomy raises some difficult questions, like how to communicate to users in a holistic way the services the library provides, or how to visualize services that start in the digital space and end in the physical one, such as realizing the need for help when using the discovery tool and then participating in courses provided by the library. This view of experience may have the potential to impose a non-instrumental understanding of the interactive system and prevent us from characterizing the experience as based on product features [33].

For academic libraries, an issue affecting the user experience is the tension between the library's willingness to develop and adopt a user-friendly system and the complexity of providing an arena where users can learn how to use and search quality-checked literature. This tension is best illustrated by a comment from one participant of the focus groups: "If I can't find it using the library systems, the next step is Google Scholar". While the library has several different services that may help the user solve a task in an easy way, the question is how the library shall inform the users in a proper way. When 39 % of the respondents had never tested the system, the library should address how its services are branded both in the physical and the digital space.

7 Conclusion

In this paper, we have argued for a holistic view when designing for experience in an interactive system, where both the digital and the physical space have to be taken into account. This approach seems to avoid an instrumental understanding of the service, focusing more on the experience the user has when using the system. Antecedent expectations are also relevant for how users' experiences unfold and will affect future behaviors. To achieve this holistic insight, scholars need to investigate both the digital and the physical space using multifaceted testing methods. Better visibility of library services and tactful branding are necessary, but have to be addressed as part of the complex ecology that the library constitutes. A knowledge-based development of services is mandatory, where the library has to react to findings on the behavior of its primary users. This effort may help introduce a culture in the library of performing multifaceted tests to discover user needs while, at the same time, giving library staff insight and new knowledge.