
1 Introduction

Pure research into information architecture (IA) is rare; the field borrows from external research as needed, rather than tackling research questions directly. However, as IA has become more structured and recognized, dedicated IA-related research has emerged. This research is grounded in design problems and the drive to find answers to them.

Navigation is the means by which users explore a web page or other information resource, and it is a major research theme in IA. It encompasses labeling and menu structures, as well as models of navigation behavior. Many research publications now deal with IA-related topics [1]. As research progresses, the reliance on external models and methodologies will diminish.

IA for the internet is concerned with applying the principles of architecture and library science to website design. Each website is like a public building: it is available for tourists and regulars alike to browse through at their leisure. The job of the architect is to set up the site’s framework, making it comfortable and inviting to visit, relax in, and perhaps even return to [2].

In 1976, Richard Saul Wurman coined the term “information architecture” [3]. He was a trained architect who became interested in urban environments. Wurman examined the ways in which urban environmental data interacted meaningfully with architects, urban planners, utility and transport engineers, and urban-dwelling people [3]. His initial definition of information architecture was “organizing the patterns in data, making the complex clear” [4]. He saw that the problems of gathering, organizing, and presenting information have many parallels with those faced by architects in designing buildings to serve occupants’ needs [4].

IA encompasses structuring web page content to make it easily accessible for users. When designing IA for a website, the designer is concerned with navigation, labeling, and content organization: these elements help people understand and find what they are looking for. This practice draws on library science, cognitive psychology, semiotics, cybernetics, discrete mathematics, and architecture itself.

IA is essential to the process of building websites aligned with business needs. Yet some IA stakeholders with limited knowledge of IA matters, such as business owners, web designers, information management specialists, and web programmers, lack simple methods to evaluate IA against contextually derived desired qualities.

Recognizing the need for a coherent way to represent IA, Vasconcelos et al. [5] proposed a set of enterprise-modeling primitives (later extended to IA modeling) covering informational, application, and technological concerns. Subsequently, the IA modeling framework was tested on real-world case studies. This research step confirmed the need for tools capable of supporting the architect while building the IA and quickly assessing users’ choices.

As evaluation is a mature topic in software engineering, several software evaluation approaches can be considered for their applicability to IA evaluation, and some software metrics have been adapted to the information systems (IS) context. This paper aims to advance IA evaluation by proposing and explaining its main features and by providing stakeholders with tools for assessing IA qualities to ensure their business suitability.

2 Overview

2.1 Defining IA

IA describes both the information design process and its outcomes. Hence, the relationship between architecture, information, and IA can be viewed from an architectural history perspective. Initially, architecture may not reference a built structure: it may be conceptual, that is, information. Early digital design was visual, incorporating images of buildings and cities. Gradually, similarities to physical realities such as buildings have disappeared from IA, but users still interact with abstract and multiple IA spaces using navigation techniques. Users relate to information in semantic space, screen space, and interaction space, and an information architect must address each of these spaces. The pervasiveness of contemporary computing might disrupt these navigation concepts, leaving only abstracted links between users and information. This new interaction may shape IA spaces.

IA is the term used to describe the structure of a system: how information clusters, the navigation methods, and the terminology used within the system. Effective IA enables people to step logically through a system and be confident they are reaching the required information. Most people only notice IA when it is inadequate and stops them from finding information. IA is most commonly associated with websites and intranets; however, it can be used in the context of any information structure or computer system [4].

IA is critical to information delivery and communication between clients and organizations. IA is a relatively recent phenomenon, with its own characteristics and contexts. Using IA, multiple and diverse people contribute to the information structures of organizations’ websites [6].

IA is neither an information technology (IT) implementation end in itself, nor the solution to all information problems. Rather, it is an iterative process, a team activity, part of a solution, and an approach to solving issues around storing and finding information [7].

Dillon has offered a broad definition that attempts to accommodate the diversity of approaches. Dillon defines IA as “the process of designing, implementing, and evaluating information spaces that are humanly and socially acceptable to their intended stakeholders” [8]. This is an inclusive definition, despite not referencing IA as a discrete discipline. Instead, here IA is aligned to human activities such as design or creative writing. Further, Dillon advocates a view of IA as craft rather than engineering—a distinction based on the lack of separation within IA between the design and manufacture of the resulting application [8]. As craft, IA creates as it produces, often reacting to emerging elements of its own design to drive subsequent modification. Craft-based disciplines are less amenable to formal methodological abstraction for management and instructional purposes. This can result in them shifting or being altered by outside forces. IA organizes and simplifies information, as well as the design, integration, and aggregation of information spaces and systems. IA facilitates the finding, understanding, exchange, and management of information, allowing users to navigate complex information structures [8].

From the above we can identify the most important elements of IA: users, content, content management, structure, design and build, navigation, and security (Fig. 1).

Fig. 1. The main elements of IA.

2.2 A Brief History of IA

IA started with Argus Associates, a consulting company set up in 1994 by Joseph Janes and Louis Rosenfeld, from the School of Information and Library Studies at the University of Michigan. The company was involved in a range of internet and web developments, and began to use the architecture metaphor with clients to highlight the importance of structure and organization in web design. Web Review magazine started a column entitled “Web Architect”, authored by Rosenfeld. Peter Morville, also a graduate of the School, later joined Rosenfeld and became the first employee of Argus Associates [9].

In 1996, Wurman published a book entitled Information Architects, in which he claimed to have invented the expression “information architect” in 1975. This book took an information design approach to IA [10].

The emergence of IA as a recognized discipline is commonly dated to 1998. By this time, Argus Associates had built a considerable reputation for IA expertise. O’Reilly Publishing commissioned Rosenfeld and Morville to write a book: Information Architecture for the World Wide Web (also known as the “polar bear book”, owing to the distinctive line-drawn polar bear on its cover) [9]. Rosenfeld and Morville approached the issues from a library and information science perspective.

In 2000, the American Society for Information Science and Technology organized the first in a series of IA summits. This event further catalyzed the development and visibility of IA. Argus Associates folded during the dot-com bust of 2001; however, by then IA had moved into mainstream web design. In 2002, a number of books were published that shed new light on the emerging discipline.

In Europe, IA also became the subject of conferences and workshops. IA sessions at the 2003 Online Information Conference in London were well attended. In 2004, an IA conference in Denmark attracted 150 delegates. In March 2004, the United Kingdom (UK) Online User Group ran a seminar in London. In June 2004, Information Today Inc. launched an IA conference in Paris (www.infotoday.com/iaparis/).

3 Literature Review

Samsur and Zabed reviewed definitions of website usability from the 1990s onward and examined several approaches to evaluating university websites [11]. This led them to develop a survey instrument that they used to explore students’ views of the website of the University of Dhaka (their own institution). Student responses were analyzed according to demographics, use, and website usefulness, revealing that only a small proportion of those surveyed reported always being able to find what they needed. Samsur and Zabed identified five important factors for achieving usability: interactivity and functionality; navigation, searching, and interface attractiveness; accuracy, currency, and authority of information; accessibility, understandability, learnability, and operability; and efficiency and reliability [11].

Islam and Tsuji designed and developed a questionnaire based on 23 usability criteria divided into five categories by aspects of usability. They used this instrument to evaluate selected university websites in Bangladesh from a usability perspective [12] and found that a large majority of users were dissatisfied with the usability of these websites [12]. Weaknesses were found in design, interface, and performance; the websites’ internal features were identified, and suggestions were made to enhance usability [12].

Mustafa and Al-Zoua’bi studied Jordanian university websites, using tools to measure internal website attributes not perceptible by users, such as HTML code errors, download times, and HTML page sizes [13].

In his study of website usability and search issues involving 13 Australian and two overseas universities, Alexander concluded with five action-oriented recommendations:

  • to design an IA that meets prospective students’ needs

  • to create content that meets prospective students’ needs

  • to improve search performance

  • to not assume that prospective students have relevant domain knowledge

  • to not use PDFs as the primary format for web content [14].

Ruwoldt and Spencer examined homepage screenshots to develop a questionnaire involving 68 Australian and overseas universities [15]. They sorted comments into specific content aspects, labeling and navigation, design, and branding, and concluded that IA best practice provides multiple navigation paths [15]. They made the following suggestions:

  • static links should be grouped according to audience or topic and labeled “for” and “about”

  • two or more links should be provided from the homepage to a key content page (as appropriate), with the links given different titles

  • links to key content should be emphasized visually

  • users should be allowed to choose between using a search engine and browsing a site map or index/directory [15].

In her study on web standards and navigation structures, Nichani surveyed 25 universities, mostly from Australia, the UK, and the United States (US) [16]. She concluded that website redesign projects involved considerable experimentation [16].

DeWeaver and Ellis surveyed a representative sample of nine universities in New South Wales and Queensland on 28 marketing parameters [17]. They concluded that, despite lengthy experience in web marketing, some universities still rated very low in this category [17]. DeWeaver and Ellis suggested that effective web marketing for universities requires greater integration of design and content. This relates to recognizing how visitors navigate websites [17].

Bao and Ellis conducted a pilot study with 31 institutions (21 universities with general curricula and 10 business schools) across the US, Australia, the UK, Asia, and France [18]. British and Australian institutions were found to be more compliant with web standards and usability requirements [18], while significant variation was found among other institutions in how they organized their homepage information. Scope for significant improvement was found for most institutions [18].

Powell recounted usability guidelines relating to website use:

  • learnability

  • memorability

  • efficiency

  • reliability

  • satisfaction [19].

McLaughin and Skinner identified six related but distinct components of usability:

  • checkability

  • confidence

  • control

  • ease of use

  • speed

  • understanding [20].

Aziz and Kamaludin used web evaluation to validate websites and determine how they perform. When analyzing a website, typical factors to consider include how the information is organized and presented, and how users access and navigate the information structure [21].

Morville described the interrelationship between the world and the Web [2]. He asked, “How do we rise to the new challenges of creating paths and places that bridge physical, digital, and cognitive spaces?” [2]. Viewed from this angle, information architects are at least partly responsible for creating these bridging paths. We might ask how user experience of similar paths, spaces, and usability models differs. Morville proposed guidelines to determine design and usability [2].

Rosenfeld and Morville stipulated that users, content, and context inform good IA [9]. Although conceding that the basic model was oversimplified, they did note that concepts intertwined “within a complex, adaptive information ecology” [9]. Rosenfeld and Morville also stressed the “dynamic, organic nature to both the information systems and the broader environments in which they exist” [9]; continuing with “we’re talking complex, adaptive systems with emergent qualities” [9]. These statements make a clear connection between IA and context-aware adaptive systems, as described above. However, these fields do not interact very much [9].

4 Research Design

4.1 Research Problem

One problem facing the IA community, in its drive for professional status, is the need to overcome abstraction and education problems. Overcoming them would provide IA with the legitimacy accorded to related fields within information science. In particular, there are no clear criteria for assessing and evaluating universities’ web portals.

4.2 Methodology

The purpose of this study is to build IA criteria for assessing and evaluating universities’ web portals. Thus, this study uses the Delphi technique to identify the most important questions to build these criteria, because the Delphi method is described as a group process used to solicit, collate, and direct expert responses to reach consensus [22]. The methodology behind the current study is based on exploratory research by Farrokhi, Chizari and Mirdamadi [23], who used the Delphi method when examining the development of web-based distance education in Iran’s higher education system.

Therefore, to encourage a broad range of potential priorities, this study sought input from three disparate professional areas, each with a specialized area of expertise:

  • web designers (n = 10)

  • web masters (n = 10)

  • researchers and faculty members in web design (n = 10).

Data were collected over a three-month period (January to March 2014). A letter of invitation to participate in this study was sent by e-mail to 30 potential participants around the world. They were identified through a search of technology administration websites, university websites, websites associated with web design, and a thorough literature review.

The researcher applied the Kendall coefficient to determine the consensus scale, using the following formula:

$$ S = \sum_{j=1}^{N} \left( R_{j} - \frac{\sum R_{j}}{N} \right)^{2} $$

This coefficient determines the degree of agreement among respondents on the priorities assigned to N people or things.
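The following is a minimal sketch, assuming the standard formulation of Kendall’s coefficient of concordance (W), in which m panelists each rank the same N items; the deviation sum S above feeds into W = 12S / (m^2(N^3 - N)). The panelist rankings in the example are hypothetical illustration data, not responses from this study.

```python
# Minimal sketch of Kendall's coefficient of concordance (W), assuming the
# standard formulation: m panelists each rank the same N items, R_j is the
# rank total for item j, S is the sum of squared deviations of the R_j from
# their mean, and W = 12S / (m^2 (N^3 - N)).  The rankings below are
# hypothetical illustration data, not the study's actual responses.

def kendalls_w(rankings):
    """rankings: a list of m lists, each giving ranks 1..N for the same N items."""
    m = len(rankings)             # number of panelists
    n = len(rankings[0])          # number of items being ranked
    # Rank total R_j for each item j across all panelists.
    rank_totals = [sum(panelist[j] for panelist in rankings) for j in range(n)]
    mean_total = sum(rank_totals) / n
    # S: sum of squared deviations of the rank totals from their mean.
    s = sum((r_j - mean_total) ** 2 for r_j in rank_totals)
    # W ranges from 0 (no agreement) to 1 (perfect agreement).
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Three hypothetical panelists ranking five criteria (1 = most important).
example = [
    [1, 2, 3, 4, 5],
    [2, 1, 3, 5, 4],
    [1, 3, 2, 4, 5],
]
print(round(kendalls_w(example), 3))  # values near 1 indicate strong consensus
```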

In total, six of the 30 selected people chose not to participate, or did not reply. The number of participants was thus reduced to 24. Dalkey [24] stated that for a study to be reliable, greater than 80 percent participation is needed.

This study used a series of four mailed questionnaires, a methodology that Moore and Kearsley [25] note is typical of the Delphi technique. A wide range of responses was collected using an open-ended question. These responses were then categorized to produce the items for the subsequent three rounds of the questionnaire, which required respondents to rate items on a five-point Likert-type scale (1–1.79 = strongly disagree, 1.80–2.59 = disagree, 2.60–3.39 = uncertain, 3.40–4.19 = agree, 4.20–5 = strongly agree). A panel of experts from outside the study validated the questionnaire. The four questionnaire rounds ran as follows (a brief sketch of how mean ratings map onto these agreement levels appears after the list):

  • The first round used the open-ended question: what are the most important things to consider before starting to design a website? This elicited a wide range of responses, which were categorized to produce the items for the second-round questionnaire.

  • In the second round, respondents were asked to rate the items identified in round one on a five-level Likert-type scale indicating their level of agreement (from 1 = strongly disagree to 5 = strongly agree). From this second round of responses, the list was reduced to 62 items.

  • The third round aimed at achieving consensus. Participants were asked to indicate their agreement using the Likert scale, and to provide comments if they did not agree with the summary findings. Consensus was reached on 54 of 62 items in this round. These items were divided into seven categories: users, content, content management, structure, design and build, navigation, and security.

  • A fourth round sought to reach consensus on the remaining items. This questionnaire asked respondents to indicate whether questions were the same as the modified ones from round three. Consensus was reached on 48 of the questions in this round.
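As an illustration of the rating aggregation used across these rounds, the following minimal sketch averages panelists’ five-point ratings per item and maps each mean onto the agreement ranges given above. The item names and ratings are hypothetical and serve only to demonstrate the classification.

```python
# Minimal sketch of how mean Likert ratings map onto the agreement ranges
# used in this study (1-1.79 strongly disagree ... 4.20-5 strongly agree).
# Item names and ratings are hypothetical illustration data.

from statistics import mean, stdev

AGREEMENT_RANGES = [
    (1.00, 1.79, "strongly disagree"),
    (1.80, 2.59, "disagree"),
    (2.60, 3.39, "uncertain"),
    (3.40, 4.19, "agree"),
    (4.20, 5.00, "strongly agree"),
]

def classify(mean_score):
    """Return the agreement label whose range contains the mean rating."""
    for low, high, label in AGREEMENT_RANGES:
        if low <= mean_score <= high:
            return label
    return "out of range"

# Hypothetical ratings from several panelists for two candidate criteria.
ratings = {
    "clear navigation labels": [5, 4, 4, 5, 4],
    "documenting the IA":      [3, 2, 4, 3, 2],
}

for item, scores in ratings.items():
    m, sd = mean(scores), stdev(scores)
    print(f"{item}: mean={m:.2f}, sd={sd:.2f}, category={classify(m)}")
```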

5 Analysis of Data

The collected data were treated as interval data and reported using descriptive statistics, including means and standard deviations.

6 Results

This paper proposes criteria for assessing and evaluating web pages. The first draft of these criteria was based on an extensive literature review and experts’ opinions, expressed by web designers, web masters, researchers, and faculty using the Delphi method. The final proposed criteria are displayed in Table 1.

Table 1. IA Criteria for Assessing and Evaluating Universities’ Web Portals

The list underwent several significant changes from one round to the next. In round two, panelists added many phrases and reworded others. The number of context phrases increased from five in the first round to sixteen, and user phrases increased from five to nine. All phrases in the design component fell into the uncertain response range. Panelists suggested deleting the context, documenting IA, and implementation and testing IA subjects, and proposed changing some subject headings.

The third round also brought many changes. The number of context phrases decreased to six, although most fell into the uncertain response range. User phrases decreased to four, with a recommendation to redistribute some phrases to other subjects. Panelists suggested adding new subjects, such as security, navigation, and evaluation, and proposed many phrases falling under these new themes.

The final list also saw some important changes. A large number of panelists suggested deleting the context subject and redistributing some of its phrases. New proposals addressed the structure subject and the number of its phrases. Some subject headings were changed or shortened, and the title ‘design component’ was changed to ‘design and build’. The result was the final list of seven subjects: users, content, content management, structure, design and build, navigation, and security.
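To illustrate how the seven-category criteria might be applied when assessing a single portal, the following is a minimal sketch in which category scores are averaged from per-criterion ratings. The individual criterion names are hypothetical placeholders, not the actual entries of Table 1.

```python
# Minimal sketch of how the seven criteria categories could be organised as a
# scoring checklist for a university web portal.  The individual criterion
# names are hypothetical placeholders, not the actual entries of Table 1.

CRITERIA = {
    "users":              ["target audiences identified", "user needs documented"],
    "content":            ["content is current", "content matches audience needs"],
    "content management": ["update responsibilities assigned", "review cycle defined"],
    "structure":          ["logical grouping of pages", "shallow, consistent hierarchy"],
    "design and build":   ["consistent layout and labels", "accessible templates"],
    "navigation":         ["multiple paths to key pages", "site search and site map"],
    "security":           ["secure authentication", "privacy statement published"],
}

def score_portal(ratings):
    """ratings: dict mapping a criterion to its 1-5 rating for one portal."""
    summary = {}
    for category, items in CRITERIA.items():
        scores = [ratings.get(item, 0) for item in items]
        summary[category] = sum(scores) / len(scores)
    return summary

# Hypothetical ratings for a single portal (every criterion rated 4, one rated 2).
example_ratings = {item: 4 for items in CRITERIA.values() for item in items}
example_ratings["site search and site map"] = 2
for category, avg in score_portal(example_ratings).items():
    print(f"{category}: {avg:.1f}")
```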

7 Conclusion

IA plays a vital role in organizing and simplifying information on web pages. It creates ways for people to find, understand, exchange, and manage information.

Within the framework of this study, the objective was to develop IA criteria for assessing and evaluating universities’ web portals. Thus, this paper illustrates 45 criteria and types of evidence, which are divided into seven sections: users, content, content management, structure, design and build, navigation, and security.