
1 Introduction

With the growing demand for streamlining software project management activities by increasing the level of collaboration between individuals who work in software development teams, many agile software project management tools have been introduced. These software solutions offer a variety of features that mainly help software engineers distribute and prioritize development tasks and manage the overall progress of their projects. When deciding which software project management tool would best fulfill the needs of a given organization and help achieve its business goals, it is extremely important to look for a complete solution that gives each member the ability to efficiently plan, track and manage each iteration of the project. Usability is central to this decision: usability concerns how end users will work with the software, while agile development processes concern how software developers can flexibly deliver their assigned tasks.

Once a specific software project management tool is chosen by a team, the software analysts, designers, developers and testers are supposed to use the selected tool to track the progress of other members and collaborate to achieve the objectives of their project and thus contribute to satisfying their business goals. Therefore, although team members are assumed to have the technical expertise to use these software solutions, agile software project management tools that suffer from usability shortcomings might discourage team members from using the chosen tools and thus negatively affect the progress of their teams. For globally distributed teams, the lack of transparent and visible progress of all team members might lead to these teams’ resources being wasted. For instance, some members might start working on work items that were already completed by other members. This in turn might complicate the integration of the development work done by distributed teams and increase inconsistencies between the work items completed by different teams, leading to cost and schedule overruns.

In this study, we conduct practical research on widely adopted tools and explore the features that are in high demand in the software industry. These tools are JIRA, AgileZen, VersionOne and ZebraPlan. The tools were selected based on their support for the essential features that agile software development teams demand in practice, such as visibility of progress, communication, task prioritization and time management. Before evaluating the usability of the chosen tools, we defined comprehensive evaluation criteria covering elements chosen specifically to evaluate our participants’ confidence with the tools’ user interfaces, such as the readability of written text, the appropriateness of the presented icons and the overall user-friendliness of each tool. Further, because our participants are software engineers experienced with software development, analysis and testing tools and techniques, we paid special attention to evaluating the features related to agile management methodologies and the ability to track progress at the project and iteration levels. By highlighting the strengths and weaknesses of each tool from a usability engineering perspective, our results can help the designers of these tools identify the usability drawbacks of their systems and adjust them to fulfill the demands of their target users. To the best of our knowledge, this is the first research study that addresses the usability limitations of agile software project management tools through a systematic approach that takes the subjective views of HCI experts and software engineers into consideration while utilizing the capabilities of a widely known usability assessment tool.

2 Literature Review

While taking into account the specific characteristics of software systems and the expectations of their users, many usability evaluation frameworks have been proposed to address the usability shortcomings of software applications in domains including banking, educational and gaming contexts [16]. While there are many usability parameters that could be considered for developing comprehensive usability evaluation criteria and frameworks, which might be related to either the properties of user interfaces or the cognitive abilities of target users, we believe that these parameters should not be considered equally important for all application types; some of them could be valued over others depending on the domain requirements of target systems. In [1], for instance, the factors that increase learners’ motivation to learn, the properties of instructional material and the subjective opinions of academics were utilized to develop a usability evaluation method for e-learning software systems. Similarly, researchers in [2] based the categorization of usability evaluation metrics specifically developed for massively multi-player online role-playing games on game-specific properties.

Prior studies have also highlighted the importance of reducing the development time of software systems, increasing the level of collaboration between members of development teams and achieving optimal allocation of product resources [7-10].

For globally distributed and co-located teams, the results presented in [8, 11] have shown that most of these teams use traditional communication, monitoring and task tracking tools (e.g., wikis, e-mails and instant messaging applications) instead of utilizing the software project management tools that were specifically designed to facilitate the coordination of distributed development work. Azizyan et al. found that the ease of use of agile project management tools is the most important factor impacting the adoption of these tools [12]. Silva et al. recommended studying the factors that would strengthen these tools and thus encourage software developers to take advantage of them [11]. In agile software project management contexts, after evaluating the ease of use of a number of software project management tools in [13], researchers have correlated the number of features provided by the evaluated tools with their ease of use and found that the tools with fewer features received higher ratings. However, the literature lacks comprehensive and systematic empirical evaluations of the usability of agile software project management solutions. There is also a lack of comprehensive usability evaluation criteria that could be taken as a basis for assessing the usability of these solutions from a project management perspective.

3 Research Methodology

We utilized usability testing software called Morae to conduct a task-based usability analysis. We defined an evaluation checklist specifically developed for measuring the usability of software project management tools. We also used a pre-session questionnaire designed to help us understand our participants’ experience in the software industry and their overall familiarity with the chosen tools, and we took advantage of the System Usability Scale (SUS) questionnaire to gauge the overall usability of each tool included in our study. Finally, each participant was asked to answer a 5-point Likert scale questionnaire specifically developed to measure the usability of each tool from a software project management perspective. This questionnaire comprised the following statements:

Q1. To what extent do you agree that the use of the tool would help your development team to achieve its goals?
Q2. To what extent do you agree that the use of the tool would increase the level of collaboration between your team members?
Q3. To what extent do you agree that the usability problems that you observed would not have negative effects on the overall workflow within your team?
Q4. To what extent do you agree that the tool is easy to use?
Q5. To what extent do you agree that the tool would help in improving the productivity levels of your team members?
Q6. To what extent do you think that the tool would help distributed teams to coordinate their work?
Q7. To what extent do you agree that the tool would help in managing complex and large software systems?

Figure 1 summarizes our research methodology.

Fig. 1. Our research methodology

4 Usability Evaluation Criteria

The interface of each of the four selected agile project management tools was examined using Nielsen’s heuristics. From a project management perspective, additional heuristics were defined and mapped to project management-specific tasks. Each agile project management tool was examined by selecting its key tasks and inspecting its interface against the specified heuristics. The resulting framework groups 41 defined metrics into eight categories. The five HCI experts used these metrics to evaluate the tools and their interfaces, as shown in Table 5, which presents the overall heuristic evaluation of the four selected tools.

1. Immediate Feedback: The ability to provide appropriate feedback in different cases while performing different tasks within a reasonable time. Two metrics were defined under this category: displaying error messages at the right time (A1) and presenting appropriate feedback based on explicit user actions (A2).

2. Real-world mapping: The ability of the tool to map and reflect real-world project management and development workflows. Five metrics were defined for this category: the elimination of information irrelevant to software project management (B1), the use of software project management metaphors (B2), the use of user-oriented terms in the interface (B3), the use of agile terminology (B4) and the appropriate reflection of real-world software development workflows (B5).

3. Consistency and standardization: The ability to use standard terminology and organization to avoid confusing users with different terms and actions. Four metrics were defined under this category: the use of consistent naming within and across tools (C1), a consistent layout of interface elements (C2), consistent colors across the user interface (C3) and consistent alignment between user interface elements and the documentation/help (C4).

4. Ease of use and learnability: The degree to which a tool can be used by the intended users to achieve the required tasks effectively and with few errors/failures. Nine metrics are grouped under this category: the ease of mastering the software project management tool (D1), the ease of learning by different classes of users (e.g., project managers vs. team members) (D2), the ease of tracking the progress of other members (D3), the ease of task distribution (D4), the ease of changing the status of created tasks (e.g., from “in progress” to “completed”) (D5), the ease of locating burndown charts (D6), the ease of involving customers or clients in the development process (D7), the ease of specifying the complexity of each work item (D8) and the ease of navigating between different software project artefacts (D9).

5. Layout and organization: The ability to use a reasonable layout and logical grouping of the different project artefacts, which can help different users to navigate easily between different options and pages. Six metrics are defined under this category: the existence of reasonable grouping of project-related artefacts (E1), reasonable grouping of sprint-related artefacts (E2), reasonable grouping of team-related artefacts (E3), logical grouping of menu options (E4), logical ordering of menu options (E5) and logical depth of menu options (E6).

6. Flexibility: Providing different options that allow users to handle and perform a given task easily. We consider whether the tool allows for a flexible arrangement of teams (F1), flexible reassignment of roles within and across projects (F2), flexible adjustment of the status of tasks (e.g., in progress or completed) (F3), flexible prioritization of software requirements according to their importance (F4) and flexible management of time (F5).

7. Streamlining the experience and visibility: The ability of a tool to match real-world scenarios and communicate the context of the situation. We consider the support for user-customized profiles (G1), the appropriate support for recognition rather than recall (G2), the efficiency of task completion (G3), the availability of visually appealing user interface designs (G4), the visibility of visual interface elements (G5) and the visibility of the roles of different team members (G6).

8. Clarity: Enabling users to interact with the tool and distinguish its tasks easily without causing confusion. Under this category, we examine whether there is a clear distinction between tasks and user stories throughout the user interface (H1), a clear distinction between bugs, epics and defects (H2), a clear distinction between the roles of software-development team members (H3) and clear navigation options throughout the software project management tool (H4).

5 Experimental Evaluation: Task-Oriented Usability Inspection

Eight participants were recruited to perform 17 tasks on the four chosen agile software project management tools (JIRA, VersionOne, AgileZen and ZebraPlan). Although agile software project management tools differ in the services and features they offer, we chose tasks that we believe most software engineers working in development teams would perform. Verifying admin home pages (T1), creating a project (T2), creating a member (T3), creating a user story (T4) and creating a task (T5) were the first five tasks that we asked our participants to perform. Our participants also tried creating sprints (T6), creating releases or versions (T7), assigning team members to tasks (T8), customizing the settings of their profile pages (T9) and viewing burndown charts (T10). They were also asked to specify the length of sprints in days or weeks (T11), access project conversation rooms (T12), mark tasks as completed (T13), change the priority of a task (T14), track the progress of team members (T15), place a comment on someone’s work (T16) and log in as clients and give feedback (T17). During each session, two observers took notes and tracked the progress of the participants. The think-aloud protocol was utilized to identify usability problems and capture users’ concerns while they experimented with the tools. Further, we intentionally did not ask participants to follow a particular order while performing the required tasks, to avoid carryover effects from encountering a difficult tool before an easier one. Participants were also asked to rate the difficulty of each task using a 5-point Likert scale.

6 Results

Pre-session Questionnaire.

At the start of each session, the participants were asked to complete a short pre-session questionnaire so we could assess their familiarity with the software engineering domain and with software project management tools. The questionnaire was composed of six closed-format questions. Most of the participants had worked as software engineering practitioners in the software industry: six had worked in the industry for a couple of months and two had worked for one to three years. All the participants with industry experience had been responsible for programming and writing code, while only one participant had been responsible for testing. In addition, four participants had worked as requirements analysts, designers and project managers. Most of the participants had used software project management tools to coordinate their development work; however, most indicated that they relied on Microsoft Project and web-based tools (e.g., Google spreadsheets and wikis). Of the eight participants, two had used VersionOne, whereas only one participant indicated that she had used JIRA, AgileZen and ZebraPlan.

Task Completion.

For all 17 of the above-mentioned tasks, Table 1 illustrates the task success rates among the participants for the four agile software project management tools. For JIRA, for instance, all of the participants successfully completed four tasks out of seventeen without facing any difficulty. All participants were able to figure out how to complete the third, eighth, ninth, eleventh and twelfth tasks using VersionOne. AgileZen showed a better success distribution in comparison with JIRA and VersionOne in that six out of the 17 tasks were completed successfully by all eight participants. ZebraPlan obtained the greatest success distribution among the four project management tools as all participants were able to successfully complete 10 out of the 17 tasks.

Table 1. Task completion rates among our eight participants
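The per-tool success rates summarized in Table 1 can be derived directly from the raw pass/fail observations recorded during the sessions. A minimal sketch of that computation, using made-up data (the observations below are illustrative, not the study’s actual results):

```python
def success_rate(outcomes):
    """Fraction of participants who completed a task, as a percentage.

    outcomes: list of booleans, one per participant (True = completed).
    """
    return 100.0 * sum(outcomes) / len(outcomes)

# Eight hypothetical observations for one task on one tool:
t4_jira = [True, True, False, True, True, False, True, True]
print(success_rate(t4_jira))  # 75.0
```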

Time on Task (TT).

For the four software project management tools evaluated in this paper, Table 2 shows the average time spent by our participants to complete each task (in seconds).

Table 2. The average numbers of mouse clicks and time our participants required to complete the tasks

Number of Mouse Clicks (NMC).

We use this metric to help us determine how easily a user can accomplish a basic task the first time they use the tool. We measured navigability by calculating the number of mouse clicks required to complete each task using each of the four evaluated tools (see Table 2). Depending on the difficulty of the task and how easily a user could navigate through the user interface of each tool, we observed variations in the average numbers of mouse clicks our users needed to complete each task using the four tools.
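The per-task averages reported in Table 2 for both metrics can be computed from the per-participant session logs in the same way. A brief sketch, with hypothetical log entries (the numbers are illustrative only):

```python
def averages(log):
    """Average time-on-task and mouse clicks for one task on one tool.

    log: list of (seconds, clicks) tuples, one per participant.
    Returns (mean seconds, mean clicks).
    """
    n = len(log)
    mean_time = sum(t for t, _ in log) / n
    mean_clicks = sum(c for _, c in log) / n
    return mean_time, mean_clicks

# Eight hypothetical observations for one task:
t2_log = [(45, 6), (60, 8), (30, 5), (75, 9),
          (50, 7), (40, 6), (65, 8), (55, 7)]
print(averages(t2_log))  # (52.5, 7.0)
```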

System Usability Scale (SUS).

This questionnaire is composed of ten statements scored on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). The SUS statements cover a variety of aspects of system usability, such as learnability, reliability and validity, thus providing a high-level measurement of the usability of each tool. According to [14], the average SUS score is 68. Table 3 presents the average SUS scores for the four tools. AgileZen and ZebraPlan achieved high average SUS scores of 70.63 and 80.31, respectively, suggesting positive perceptions of the usability of these two systems. On the other hand, the average SUS scores for JIRA and VersionOne were similar (44.69 for JIRA and 42.19 for VersionOne); both fell below the average acceptable SUS score.

Table 3. Results of the SUS questionnaire
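The SUS scores in Table 3 follow the scale’s standard scoring procedure: odd-numbered (positively worded) statements contribute the rating minus 1, even-numbered (negatively worded) statements contribute 5 minus the rating, and the summed contributions (0-40) are scaled to the 0-100 range. A minimal sketch:

```python
def sus_score(responses):
    """Compute a single SUS score from ten Likert ratings (1-5).

    Odd-numbered statements are positively worded (rating - 1);
    even-numbered statements are negatively worded (5 - rating).
    The sum of contributions (0-40) is multiplied by 2.5 to give 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A neutral respondent (all 3s) yields the midpoint score:
print(sus_score([3] * 10))  # 50.0
```

Per-tool averages are then simply the mean of these individual scores across participants.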

Second Post-session Questionnaire.

We included questions to indicate whether the use of each of the four evaluated tools would help streamline development and management activities in agile teams (see Sect. 3). Using 5-point Likert scale questions, we asked our participants to rate the extent to which each of the four software project management tools would help fulfill the objectives of development teams, facilitate discussions and interactions between team members and manage large and complex software systems. Our questions also addressed the applicability of each tool to coordinating distributed development projects and reflecting the overall workflow of development teams in real-world scenarios. After mapping each answer to a score (e.g., strongly agree answers are worth 5 points, strongly disagree answers 1 point, etc.) and calculating the means of these scores, we were able to measure our participants’ degree of satisfaction with each agile management tool (see Table 4).

Table 4. Results of our second post-session questionnaire
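The answer-to-score mapping and averaging described above can be sketched as follows; the tool name and answer values are hypothetical, used only to illustrate the computation:

```python
# Map 5-point Likert answers to numeric scores, then average per
# question and per tool. The sample answers below are made up.
LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def mean_score(answers):
    """Mean numeric score for one question on one tool."""
    scores = [LIKERT[a] for a in answers]
    return sum(scores) / len(scores)

# Eight hypothetical answers to one question for one tool:
q1_sample = ["agree", "strongly agree", "agree", "agree",
             "strongly agree", "agree", "agree", "strongly agree"]
print(round(mean_score(q1_sample), 2))  # 4.38
```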

For ZebraPlan, all of our participants agreed that it is easy to use and expected it to help development teams achieve their goals. In contrast, our results show that JIRA and VersionOne were more complex to use than ZebraPlan and AgileZen. When asked about the positive effects the tools would have on the productivity of team members, the calculated means differed only slightly across the tools, although ZebraPlan and AgileZen took the first and second positions, respectively. Linking this finding with the level of complexity reported for each tool, we note that the tools observed as easiest to use were also expected to have the greatest positive effects on the productivity of software engineers.

From the results presented in Table 4, we can also observe that JIRA and VersionOne obtained the lowest average scores for the third question, indicating that most participants expected the usability problems they faced to have negative effects on the overall workflow of development work in agile teams. For managing large and complex systems, most of the participants preferred VersionOne or ZebraPlan. Our findings also show that participants found VersionOne helpful for managing communications and discussions between team members. Considering that ZebraPlan obtained the highest mean scores for five of the seven questions and was reported as the easiest to use, our results suggest that the usability of a software project management tool has a significant impact on the overall progress and coordination of development work in teams that follow agile methodologies.

Observed Usability Problems.

Generally, due to the variety of artefacts used by developers who work in agile teams, we note that the four tools differ in the ways they present and organize the functions available to each user. For instance, when adding a task, not all of the tools ask users to specify the user story to which the task belongs. Further, when specifying a type of task, some of the tools present the user with a list of options to choose from, whereas others ask the user to type in the details of each backlog item that he/she would like to add. To prioritize user stories or development tasks, some of the tools ask the user to choose whether the priority of a task is low, medium or high, while others ask users either to type in the priorities that they want to specify or to employ drag-and-drop features. Considering that numbers can also be used to specify task priorities, we note that typing in these values could lead to team members specifying them inconsistently, which might slow the overall progress of development work. To change the status of a task, we observed that some tools allow users to indicate whether a task has started, is in progress or has been completed, whereas others present checkboxes that allow users to indicate the percentage of completion of a given task as either 0 % or 100 %.

We also observed that, due to inconsistencies in the presentation and structure of functions inside some of the evaluated tools, some participants struggled to find the appropriate button and/or link to perform some of the tasks. For example, although each project has a set of sprints with user stories that might be further divided into tasks, some of the evaluated tools do not clearly organize this information according to this hierarchy or do not give users clear indications that would allow them to find the required information easily. After spending a while trying to add a backlog item, one of the participants said, “The navigation menus are not helpful, I remember I saw the button I need but I have no idea where it is” (P4). Other participants also indicated that they had to remember a great deal of information, or practice using some of the functions many times, in order to perform the required tasks efficiently. Therefore, we recommend that agile software project management tools group all the artefacts related to a specific project consistently. We also note that participants faced problems related to the visibility and placement of interface elements (e.g., some tools use grey for clickable URLs), and some participants failed to complete certain tasks because some hyperlinks did not appear clickable. In some of the evaluated tools, participants also had difficulty tracking the progress of team members or previously created tasks, either because they could not find the task boards or product backlogs or because they forgot where to find them. It is also worth mentioning that some participants queried search engines to figure out where to find burndown charts, add members to their teams and specify the start and end dates of sprints.

Heuristics Evaluation.

Table 5 presents the usability ratings reported by the five HCI experts. The evaluation is based on the criteria defined in Sect. 4. In this table, S1, S2, S3 and S4 represent JIRA, VersionOne, AgileZen and ZebraPlan, respectively.

Table 5. Ratings reported by our HCI experts based on our defined criteria (see Sect. 4)

7 Conclusion

In agile project management contexts, the involvement of human, time, financial and organization-specific factors increases the burden on usability practitioners to study the factors that could improve the workflow of agile teams in software project management tools without increasing the complexity of these tools. In this research effort, we utilize a number of qualitative and quantitative usability engineering methods to identify the major and minor drawbacks of four widely used agile software project management tools. We believe that the experimental findings and the usability evaluation framework presented in this paper can help software development companies to select their tools. We also expect our results to help designers of agile project management tools to identify the shortcomings of their solutions and improve them to suit the requirements of co-located and distributed teams that follow agile software development methodologies.