1 Introduction

E-government emerged in the late 1990s with the democratisation of the internet [1]. A shared aim was to create self-service online portals and web-based services for citizens that are easy to use and that benefit the government [2].

A lack of attention to citizen experience seems to be common practice in e-government platform development. We could identify that there is little coverage of citizen and user satisfaction in the literature between 2000 and 2012.

A recurring pattern is that the development of digital solutions was very often led by policy and technical considerations rather than by user needs. There is clear evidence that what is missing is the monitoring of citizen experience in the use of government services [3].

It has become important for today’s organisations to include user research [4]. Ultimately, if a government wants to go digital, it must take into consideration its users, their needs and their context.

Even though user research is not a new concept, some countries, such as France, have not yet embedded it within their e-government service development. This becomes obvious when one sees the differences in terms of user journey, user experience and user satisfaction. Even where efforts are made, user experience seems to be “la cinquième roue du carrosse” (the fifth wheel), meaning that user research is generally seen as an optional extra. Bear in mind that in some countries decision makers have no idea what user research is, let alone its benefits.

More than a decade ago, the United Kingdom put in place a digital transformation agenda, with the aim of developing online solutions that would benefit both citizens and the government. The main objective was to reduce costs and provide a self-service online journey. Nevertheless, it was reported that there was a “lack of focus on citizen during the development” [2]. Furthermore, the development of these digital solutions was also led by policy and technical considerations rather than by user needs.

One of the major issues was that they did not involve user researchers at first. They commissioned external agencies to conduct qualitative research (focus groups, interviews and fieldwork). Once the report was ready, designers, business analysts and the IT team put a solution in place based on their understanding of the report. Departments worked in isolation, even though some overlap in processes and citizen needs could be anticipated. This led to isolated platforms/websites that were not always easy to use.

Since then, the Government Digital Service (GDS), a departmental unit within the Cabinet Office, has introduced user research and user needs into its digital strategies. Standardisation of processes and multi-disciplinary teams working to an agile methodology became a must. GDS emphasised the importance of conducting user research [5].

Scrum teams were set up and 25 exemplar services [6] were put in place, introducing user research throughout product development.

The outcome was that GDS created guidelines for best practice. Even though they are not perfect, as they lack academic grounding, the guidelines are a very good starting point for embedding users throughout e-government product/service development.

Every end-user-facing service must follow the guidelines and meet the 26 service standards, since reduced to 18 [7]. Every citizen-facing service must go through the GDS assessment at the different stages of product development, generally at the end of the Alpha and Beta phases. A lack of user research leads to failure of the assessment; in extreme cases the service is switched off.

1.1 What Is User Research, Where Does It Come from, and What Is the Role of User Researchers?

User research uses scientific (qualitative and quantitative) research methods to understand and evaluate user behaviour, needs and motivation while using a digital product.

User research provides strong evidence on how users react to, and might interact with, a product or service.

The aim of conducting user research is to meet user needs, reduce risk and generate revenue or lower cost. It can also help stakeholders make better decisions, which will result in a more successful product or service.

User research (UR) incorporates usability, human-computer interaction, social interaction, psychology, ergonomics, anthropology, etc. UR is becoming a discipline in its own right and is now integrated into the agile product development cycle [8].

User research is more than usability. User research helps to understand who the users are and what their needs are. It puts the user at the centre of the investigation. UR looks at the full user journey across multiple digital tools (apps, software or digital products). It looks at how the user interacts with the whole ecosystem (devices, apps, software, sites, social media, environment, etc.).

User research provides valuable information collected using research methods from ergonomics, HCI, human behaviour, cognitive science and human interaction. These findings are evidence-based. They help web designers, software developers, engineers, business owners and other stakeholders create better (digital) products that should, in principle, be usable by all.

1.2 Why Is User Research Essential in the Development of a Government Platform?

Citizens want to have access to everything at their fingertips, in no time, through their mobile, laptop, TV, etc.

This digital transformation affects all organisations: dematerialisation, data management, optimisation of processes, online sales, marketing, cost reduction, or staff reduction where new profiles are required.

Differentiating Client Needs from User Needs:

Client/stakeholder needs relate to the business needs (we want to reduce the number of emails, we want to communicate directly on the intranet, we want to be able to make financial transactions on our site, we want as a business to get X or Y, etc.)

User needs relate to how users will interact with the tools.

Doing User Research Will

  • Clarify business needs

  • Identify the requirements

  • Draw an account of the users by creating the persona

  • Test the concept with real users

  • Make recommendations to the UX designers

  • Check with IT what can be done technically

  • Test the prototype before development

  • Evaluate the functionalities

  • Evaluate the architecture and layout of the product

  • Evaluate content and terminology

  • Evaluate how the users behave and what their needs are

Advantages

User research enables you to work in an agile way and improve the product throughout its development.

Making changes before development is far less expensive than doing it once the product is developed.

Doing user research will limit the risk of failure. It will increase the chance of success and of meeting user expectations.

User researchers are objective; they have several research methodologies for capturing evidence that will help the business, IT and design teams make the right decisions.

User research can also intervene when you want to change the design of your site or add a functionality.

e-Government Users and Their Needs

Government services are more complex than other services, as they are constrained by policies, and there is no limit in terms of users, as they could be any citizen.

Therefore, it is essential to understand who the users are and how they will interact with the service, as well as to translate policy jargon into user-friendly language.

1.3 When to Do User Research?

Involving users throughout product development is essential; it should happen as early as possible and through every phase (Discovery, Alpha, Beta and Live).

Discovery: A short phase, in which you start researching the needs of your service’s users, find out what you should be measuring and explore technological or policy-related constraints. (4–8 weeks)

Alpha: A short phase in which you prototype solutions for your users’ needs.

You’ll be testing with a small group of users or stakeholders, and getting early feedback about the design of the service. (6–12 weeks)

Beta: You’re developing against the demands of a live environment, understanding how to build and scale while meeting user needs. You’ll also be releasing a version to test in public. (12–24 weeks)

Live: The work doesn’t stop once your service is live. You’ll be iteratively improving your service, reacting to new user needs and demands and meeting targets set during its development (until retirement).

2 Case Study: Universal Credit Digital Service

The Universal Credit Digital Service is the biggest European e-government platform and aims to be used by 10 million citizens.

Based on a new policy, the UK Government wanted to create a single platform merging 6 benefits:

  • Income-based Jobseeker’s Allowance

  • Income-related Employment and Support Allowance

  • Income Support

  • Working Tax Credit

  • Child Tax Credit

  • Housing Benefit

The objectives of the Universal Credit were to make a single monthly payment to the citizen, reduce fraud, put a cap on benefits, have monthly follow-ups with the citizen in search of employment, and make it usable for 10 million users [10].

2.1 Background UCDS

The project started 12 months before we began implementing user research. At that point, UX designers were just creating screens based on policy requirements or on research reports produced by external agencies.

The project was development- and policy-led, with little or no UX/user research.

The UX team was composed of:

  • 1 content designer

  • 2 designers to prepare the screens

  • 2 civil servants (1 part-time, not on site), both acting as user researchers without previous research qualifications or training.

The complexity of the project was that it involved so many end users. It was shaped by two large policy documents and involved interaction with other services such as HMRC, as well as many job centre agents and other back-office civil servants.

The basic architecture of the platform offered a hybrid offline and online journey. Registration was online but ID verification was offline, as claimants had to come to the job centre to show proof of identity.

User research was not recognised as an important part of product development by policy people, product owners and/or delivery managers.

It was clear that stakeholders did not see the importance of spending time with the users, despite the GDS recommendation.

Stakeholders did not realise that citizens on benefits may not be good with computers, may not have internet access or a computer, etc., and that some may fall into the Assisted Digital categories [11].

Furthermore, stakeholders did not realise that the users were not limited to claimants. Users also included every civil servant in the back office who needed to interact with the platform.

The policy was not very popular with the general public, and the press never missed an occasion to write negative comments. Therefore, poor usability and user experience could have had a negative effect on the policy, the government and, of course, the next election.

Making the product usable for all the users was a challenge but above all a necessity. Every section of the UCDS platform was broken down into features and prioritised.

Every section of the portal went through the following process:

  • Mini discovery: to understand the policy requirements, user profiles/personas, user needs, etc., the business analyst and user researcher gathered the information.

  • Screen or prototype creation: designers and content designers worked on the screens, based on input from the BA and UR.

  • User testing: user researchers tested the prototype.

  • Analysis: feedback and recommendation by the user researcher.

  • Prototype updating: content and design.

  • Retest prototype: user testing sessions until good enough to go to development.

  • Validation: of the prototype and screens, by user researchers and the BA.

  • Preparation of screens: for development, by designers.

  • Development.

Once every feature was developed, we tested the user journey on mobile devices. Traditionally, government focused on non-mobile platform services, but today it is important to provide mobile e-government services in order to increase access to public services for all citizens [12]. GDS took the direction of developing every platform on Gov.uk as mobile first [13].

2.2 Mobile and Tablet Testing

20% of consumers check their phone more than 50 times a day [14]. Over 30% of UK adults look at their phone within 5 min of waking. The average instant message (IM) user sends over 55 IMs a day. The number of 4G subscribers was expected to exceed 10 million by the end of 2015 [14]. This increased the importance of creating a mobile-responsive portal (Fig. 3).

Web analytics for UCDS, March–July 2015

The analytics in Figs. 1 and 2 show clear evidence that mobile devices became the primary choice used by citizens to access UCDS. We kept monitoring the analytics and projected that the pivot point was going to happen sooner rather than later.
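The kind of pivot-point projection described above can be sketched with a simple least-squares trend. The monthly mobile-share figures below are illustrative placeholders, not the actual UCDS analytics (those are shown in Figs. 1, 2 and 3):

```python
# Hypothetical monthly share of UCDS visits from mobile devices (percent).
# Illustrative values only - the real figures are in the web analytics.
months = [0, 1, 2, 3, 4]                      # March..July 2015, as offsets
mobile_share = [38.0, 41.5, 44.0, 47.5, 49.0]

# Fit a simple linear trend (ordinary least squares) with no libraries.
n = len(months)
mean_x = sum(months) / n
mean_y = sum(mobile_share) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(months, mobile_share))
    / sum((x - mean_x) ** 2 for x in months)
)
intercept = mean_y - slope * mean_x

# Project the "pivot point": the month offset at which mobile passes 50%.
pivot_month = (50.0 - intercept) / slope
print(f"trend: {slope:.2f} pts/month, pivot at month offset {pivot_month:.1f}")
```

With these placeholder figures, the fitted trend gains roughly 2.8 percentage points per month, placing the pivot a little after the last observed month, which is the "sooner rather than later" shape of the real projection.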

Fig. 1. Integration of user research into the product development cycle (Source: Savarit, 2015 [9])

Fig. 2. Percentage of devices used to access UCDS, March 2015–July 2015 (Source: Savarit, 2015 [15])

Fig. 3. Number of visits to UCDS per device, March 2015–July 2015 (Source: Savarit, 2015 [15])

Quote 1: General understanding of Universal Credit

User journey 1: Entitlement, register and create account

User journey 2: Create an account

Quote 2: Security questions

We decided to carry out user testing on the full end-to-end UCDS journey, on mobile devices (iPhone, Android and tablet), aiming to get the maximum amount of feedback to make the mobile journey easy and satisfactory.

2.3 Approach

Face-to-face user testing sessions were undertaken across 3 days, on the 7th July, 21st July and 4th August 2015, at the Experience Lab facilities in Holborn.

The sessions lasted between 45–60 min and were based on:

  1. Semi-structured/informal interview conducted to keep the session on track

  2. Tasks performed based on a script prepared by the researcher

  3. Use of the UX external demo on iPad 2, iPhone 5S and Samsung Galaxy S5

2.4 Participants

All participants were first-time users of the service and had to be eligible for Universal Credit.

They were recruited by a recruitment agency based on a screener that the researcher prepared. The details of participants are presented in Table 1.

Table 1. Participants information

Participants were briefed verbally at the beginning of the session: they were asked if they were happy to take part in the research, their right to withdraw was explained, and they were asked to sign the consent form. Ethical guidance was followed.

At the end of the session, participants were debriefed and given an incentive for their participation.

2.5 Method

All sessions were video recorded and projected in the viewing room. The screen capture as well as the face of the participant were recorded. Notes were taken during the sessions by the researcher as well as by the observers (designers, business analysts, etc.) who were in the viewing room following the live session. The notes were summarised in an Excel spreadsheet and the video files were reviewed to confirm what had been identified during the session. Although reviewing the video is time-consuming, it enabled us to verify that each phenomenon had occurred and was not influenced by individual perception. Thematic analysis was used to analyse the outcome of the sessions, with data coded and organised into categories. Similarities in themes could be identified across participants, across sections of the task and across user testing sessions.
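The cross-participant tally at the heart of this thematic analysis can be sketched as follows. The participant IDs and theme labels below are hypothetical stand-ins, not the study's coded data, which lived in the spreadsheet:

```python
from collections import Counter

# Hypothetical coded observations: one (participant, theme) pair per note.
coded_notes = [
    ("P1", "password complexity"), ("P1", "carousel not visible"),
    ("P2", "password complexity"), ("P2", "policy jargon"),
    ("P3", "password complexity"), ("P3", "policy jargon"),
]

# Count how many distinct participants each theme was observed with;
# set() collapses duplicate notes for the same participant/theme pair.
theme_participants = Counter(theme for _, theme in set(coded_notes))
for theme, count in theme_participants.most_common():
    print(f"{theme}: seen with {count} participant(s)")
```

Counting distinct participants per theme, rather than raw note counts, is what lets similarities "across participants" stand out from one participant repeating the same complaint.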

2.6 Materials

  • Discussion guide

  • Consent form

  • Tablet

  • iPhone

  • Samsung

  • Camera

  • Video recording

  • SUS

  • Excel document for the analysis

2.7 Findings

The sessions confirmed that users did not have a clear understanding of what Universal Credit was.

Entitlement.

To make their claims, users initially had to check whether they were entitled to apply to UCDS. At this stage, only a few postcodes were taking part in the private beta. Every claimant had to go through the postcode check, which we called the entitlement check. Once eligible to use the service, the user had to register.

Participants did not experience any issues in checking whether they were eligible to use the service. They used their postcode, and all participants managed to check their eligibility autonomously.

They then had to select whether they were making a single claim or a couple claim. All participants taking part in the research were making a single claim.
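As a minimal sketch, the private-beta entitlement check behaved roughly like the function below. The postcode areas and the normalisation rule here are assumptions for illustration, not the actual UCDS implementation:

```python
# Hypothetical outward codes included in the private-beta rollout.
BETA_POSTCODE_AREAS = {"M43", "OL6", "OL7"}  # illustrative area codes only

def is_in_private_beta(postcode: str) -> bool:
    """Return True if the postcode's outward code is in the beta rollout.

    Assumes the postcode is written with a space ("M43 7AB"); a real
    implementation would normalise unspaced input as well.
    """
    outward = postcode.strip().upper().split()[0]  # "m43 7ab" -> "M43"
    return outward in BETA_POSTCODE_AREAS

print(is_in_private_beta("m43 7ab"))   # True: area is in the beta
print(is_in_private_beta("SW1A 1AA"))  # False: area not yet rolled out
```

The check gates the whole journey: only users whose outward code is on the rollout list proceed to registration, which matches the behaviour participants experienced without difficulty.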

Create an Account

Participants were asked to create a user name, password and to answer two security questions.

Password.

At this stage, participants found it difficult to move beyond the username/password step.

The user testing on mobile devices brought even more evidence that participants were struggling to create their account. Creating an account was also more problematic on a smartphone (iPhone and Android) than on the iPad.

Most users needed between 3 and 5 attempts to get through the “create an account” process. Participants who were not computer-savvy did not manage to get through it and required the help of the moderator.

Furthermore, despite the instructions available to the users, and the colour coding that was in place, they struggled to get through the account creation process.

The findings from the user testing show that participants were not reading the instructions about the required symbols, letters, upper/lower case and numbers.

We had evidence from previous user testing on desktop that users had difficulty creating an account, due to the complexity of the password requirements. Even though we recommended simplifying the password requirements, the security service declined our recommendation.

On mobile devices, participants did not see the error symbols in red, even though they were more prominent on the smartphone than on desktop.

Therefore, we had to find other solutions to facilitate the password creation journey.
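To illustrate the kind of composition rules that made the password step difficult, here is a sketch of a validator that reports every unmet rule at once. The exact rules and the minimum length are assumptions, not the real UCDS security policy:

```python
import re

# Hypothetical composition rules of the kind the password step enforced
# (symbols, letters, upper/lower case and numbers).
RULES = {
    "at least 8 characters": lambda pw: len(pw) >= 8,
    "an upper-case letter":  lambda pw: re.search(r"[A-Z]", pw) is not None,
    "a lower-case letter":   lambda pw: re.search(r"[a-z]", pw) is not None,
    "a number":              lambda pw: re.search(r"[0-9]", pw) is not None,
    "a symbol":              lambda pw: re.search(r"[^A-Za-z0-9]", pw) is not None,
}

def unmet_rules(password: str) -> list[str]:
    """Return the rules the password fails, for inline error messages."""
    return [name for name, check in RULES.items() if not check(password)]

print(unmet_rules("claimant1"))   # fails the upper-case and symbol rules
print(unmet_rules("Claim@nt1"))   # [] - meets every rule
```

Checking all rules in one pass would allow an interface to show every unmet requirement up front, rather than only revealing errors after the screen is submitted, which was one of the problems observed.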

Security Questions.

Once the user managed to create a password successfully, two security questions were presented, enabling users to retrieve their account should they lose their login details.

We identified different types of issues:

Content Issue.

Based on the reactions from the users during the testing, it seems that two security questions were too many.

Furthermore, the content of the security questions seemed a bit outdated, as some questions did not apply to this type of user (e.g. “What was your first car?”; some users may never have owned a car).

Usability Issue.

iPhone users could not see the security questions in full, as they did not fit in the carousel, preventing users from reading them. Users did not instinctively switch their phone to horizontal mode to see the questions. Even when a few managed to turn their phone, the content still did not fit the screen. Furthermore, the position of the carousel at the bottom of the screen made it hard to notice; many people simply did not see it.

On the other devices (iPad and Samsung S5) we did not identify any issues.

Once the participants had created a username and a password, and selected the two security questions and their answers, they successfully created their account. This took certain users up to 5–10 min.

About You

This section was for users to confirm their email address. The main issue was that users did not read the content and the instructions.

Users only realised that they had made a mistake once they had submitted the “apply for UCDS” page (user journey 3).

User journey 3: About you

Quote 3: Home address and postcode

Quote 4: Information about the process

User journey 4: To-do list

Quote 5: Time to apply

Quote 6: Income other than earnings

Quote 7: Work and earnings

Quote 8: Education

Quote 9: Housing

Quote 10: Who lives with you?

Confirm My Email.

The confirmation of the email was straightforward. Users expected to get a confirmation email in their mailbox. No issues (content or usability) were identified; the layout on the mobile devices was responsive.

Home Address.

Some users mentioned that they had already entered their postcode on the entitlement screen and questioned why they were asked again. They expected their postcode to be kept in the system. No other issues were identified.

Apply For Universal Credit.

The process was straightforward and users managed to go through it autonomously. The only issue identified was that users did not know at which stage of the journey they were; they had no visibility of how far along they were in the process of making their claim. Furthermore, they remarked that they had no visibility of how long it was going to take them.

To Do List.

Once the user clicked to apply for Universal Credit, they landed on a screen with a list that we called the “to-do list”. This covered the following sub-sections:

  1. Income other than earnings

  2. Caring for someone

  3. Nationality declaration

  4. Work and earnings

  5. Your education

  6. My health

  7. Health-related benefits

  8. Savings and investments

  9. Housing

  10. Bank account details

  11. Who lives with you?

We identified mixed reactions from the users. All of them were quite overwhelmed by the length of the “to-do list”; most of them made faces or commented on it. We had previously had this comment in desktop user testing; it was even more visible on smaller devices, as the user had to scroll down. They all understood that they had to complete each section. Despite this, they believed that some of the sections were not relevant to them.

The order of the “to-do list” was not perceived as logical. Some users started with the ones that seemed easiest to complete (e.g. housing and earnings were easy, whereas savings and investments tended to be kept for the end).

Income Other Than Earnings

Users did not understand the meaning of the terminology “income other than earnings”. They asked the moderator what it meant; we asked them what they would have done if they had been completing their claim on their own. Most of them responded that they would have contacted the helpline to get more explanation. This comprehension issue was clearly stopping users from completing their initial application.

Caring for Someone

This referred to people who were caring for someone with a disability. This question was generally for people who were receiving, or eligible for, the carer’s allowance. Some users thought that this section was also for parents who had children. They had to click on the link to see what this section was about.

Nationality Declaration

No issue identified

Work and Earnings

Content issues: users had no clear understanding of what work and earnings meant (e.g. whether they had to include benefits, maintenance, etc.).

It was also not clear whether they had to enter their monthly or annual income.

There was also an issue for those on zero-hours contracts (e.g. they do not know until the end of the month how much they are going to make).

Once again, they did not read the instructions before entering the numbers.

Layout issues: the box on the screen for entering the amount was divided into two boxes, one for pounds and the other for pence. Users failed to recognise in which box they had to enter their earnings.

The error message was only activated once they had submitted the screen.

Your Education

We did not identify any usability issues for this section. However, at first, the users did not understand what it referred to (e.g. whether they needed to provide their education history).

My Health

We did not identify any issues with this section

Health Benefits

We did not identify any issues with this section

Savings and Investments

Most of the applicants did not think this section applied to them. The ones who had some savings were worried that, if they declared them, they would not qualify for their UCDS benefit. Information on the cap would be needed to reassure users.

Housing

This section was quite long for the users and several questions were asked. We did not identify any functionality or usability issues; nevertheless, the content was an issue.

Content issues: users did not understand which figure they had to put in the box. Many users were on housing benefit (housing benefit was at that time a separate benefit), which was very often paid directly to the landlord or the housing association. The users did not know whether the amount required was the full amount of their rent or only the amount not covered by the housing benefit (the difference between the rent and the housing benefit, which they were currently paying).

We found that many users whose housing benefit was paid directly to their landlord or housing association had no idea how much their rent was.

The section related to the ‘frequency’ was not clear to the users. In the UK, rent can be paid monthly, weekly, biweekly or every 4 weeks; however, users did not understand that people may have different rent payment frequencies. An instruction, such as “How often do you pay your rent?”, could have been added above the options to make it clearer to the users.

The section related to the ‘service charge’ was also unclear; many users did not know what it referred to.

The ‘free weeks’ section was also not very clear to users who were not entitled to this type of benefit.

The section asking for the address of the landlord was not clear to users who did not have a private landlord. More information and guidance were needed.

There was clear evidence that the content of this section was unclear and confusing to the users. Most of them completed the form, but we could question whether the answers were accurate.

Most of the users would have contacted the helpdesk, and some would even have gone directly to the job centre to get support.

This section overall was perceived as too long and congested on smartphones. However, on the iPad the general impression was positive, probably due to the screen dimensions.

Bank Account Details

There were no usability/content issues in this section

Who Lives With You?

We identified two key issues in this section:

Content issue: the users had some difficulty entering the childcare costs; there was no clarity about which amount and which frequency of childcare cost should be entered (e.g. weekly, monthly, etc.).

The question ‘Had your child been in prison?’ caused a lot of animosity amongst users, who clearly showed their displeasure during the session and recommended that it be removed. This is a policy requirement; nevertheless, how many claimants have children who have been in prison? A more discreet approach may need to be considered.

In addition, this section was also perceived as too long (e.g. a lot of content on a small screen) to get through on a smartphone.

Confirm Details

There were no usability issues in this section. The feedback was more positive than on desktop.

Submit Claim

There were no usability issues in this section.

SUS Results.

The System Usability Scale (SUS) provides a high-level satisfaction score of usability of a site, application or any technological item.

SUS is a simple, ten-item scale giving a global view of subjective assessments of usability.

The SUS scale is generally used after the respondent has had an opportunity to use the system being evaluated, but before any debriefing or discussion takes place.

A SUS score has a range of 0 to 100. It is not a percentage.

A score above 68 is above average and a score below 68 is below average (Table 2).
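The standard SUS scoring (Brooke's original formulation, where odd-numbered items contribute the rating minus 1, even-numbered items contribute 5 minus the rating, and the sum is multiplied by 2.5) can be sketched as follows. The example ratings are illustrative, not responses from this study:

```python
def sus_score(ratings: list[int]) -> float:
    """Compute the SUS score from ten 1-5 Likert ratings.

    Odd items (1st, 3rd, ...) contribute (rating - 1); even items
    contribute (5 - rating); the total is scaled by 2.5 to give 0-100.
    """
    assert len(ratings) == 10, "SUS uses exactly ten items"
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-indexed even = odd-numbered item
        for i, r in enumerate(ratings)
    )
    return total * 2.5

# Illustrative respondents, not study data:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0, best possible
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

This scaling is why a SUS score looks like a percentage but is not one: it is a 0–40 raw score stretched onto a 0–100 range.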

Table 2. System Usability Scale (SUS) score per device

The overall score across the three devices was 80.9, which falls in the acceptable range, between good and excellent.

We can see that the iPad had the highest score and the iPhone the lowest. Nevertheless, they all scored in the acceptable range, between good and excellent, with reference to Fig. 4.

Fig. 4. System Usability Scale (SUS) interpretation

3 Conclusion

The overall user journey on mobile devices was positive. We identified some usability issues at the beginning (account creation, registration). The password and security questions needed to be improved; otherwise this could have had a major impact on the online user journey, increasing the number of calls to the helpline or visits to the job centres.

Most of the issues identified were related to content comprehension: policy jargon that needed to be translated into more accessible terminology (e.g. housing, childcare costs, etc.). Furthermore, clearer and better signposted guidance would help and facilitate the overall user experience.

We could also identify that the general user experience was more satisfactory on the iPad than on the Samsung, and that the iPhone was the least satisfactory, due to the security questions. The screen size could also have caused some problems (e.g. the amount of content displayed on each screen).

Despite those issues, the general experience on mobile was positive. This was important, as the number of citizens using mobile devices to access the UCDS platform kept increasing according to the analytics. The user testing on mobile devices showed how users interacted with the platform, and their in-situ experience provided feedback that made it possible to improve the overall user experience.

The fact that user research was involved at every step of product development enabled us to fix issues as soon as we identified them. In this case study, we tested the whole user journey on mobile devices. The findings show that some content issues were still present, and work still needed to be done to make the portal easy to use. We could also conclude that introducing user research and user testing systematically for every feature reduced the number of issues. By involving users and taking a user-centred approach, we optimised product development, reduced high-risk issues, and constantly evaluated functionalities and content.

Further research should take place with people who could be classified as Assisted Digital (people with very little or no knowledge of computers, or people with cognitive limitations), as an e-government platform must be accessible and usable to all citizens.