The Pursuit of Trust and Transparency Within the EHEA

The vision that has guided the creation of the European Higher Education Area (EHEA) is that of an integrated higher education area, with transparent and readable higher education systems, trustworthy institutions and mobile students and professors ensuring the international competitiveness of the European system of higher education (Bologna Declaration 1999). To set this vision in motion, the signatories of the Bologna Process have declared their willingness to pursue a set of common priorities in their own higher education systems. These priorities underpin commitments made to ensure trust and transparency through the adoption of a three-cycle degree structure, a common set of standards for quality assurance and the adoption of a translatable system of qualifications (see also Table 1).

Table 1 Mechanisms for trust and transparency in the EHEA
  • Quality Assurance was put at the heart of efforts to build trust and to increase the competitiveness of the EHEA. One of the key commitments made by countries within the framework of the Bologna Process is the alignment of their systems to the Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG). The ESG play a key role in the enhancement of quality in European higher education and, in their broader context, support the use of higher education reform tools that include qualifications frameworks, ECTS and the Diploma Supplement (DS). The new and revised ESG adopted in 2015 reflect the EHEA’s progress over the last 10 years and have made the “EHEA model” for quality assurance more visible.

  • Another mechanism to consolidate transparency and trust between higher education systems is the Framework for Qualifications of the EHEA (QF-EHEA) adopted by ministers in Bergen, in 2005. The national qualifications frameworks (NQF) together with the QF-EHEA serve as “translation devices” between different qualification systems and their levels.

  • The relationship between qualification frameworks and quality assurance, for instance, is crucial as together they constitute the context in which the Bologna three-cycle degree structure is being implemented and should be quality assured.

The stronger emphasis on learning outcomes and recognition practices has been made much clearer, as the new ESG refer to the QF-EHEA through standard 1.2, which sets out that qualifications should be aligned with the corresponding national qualifications framework (NQF) and, thereby, with the QF-EHEA. This change underlines the important role of QA in ensuring that the assignment of qualifications to a level in the NQF and the QF-EHEA is valid and trustworthy. In this way, external quality assurance systems validate that qualifications offered by higher education institutions are correctly assigned to a level in the NQF. This may take the form of reviewing the institutions’ internal QA systems (in the case of institutional accreditation, evaluation or audit) or take place specifically for each study programme (in the case of programme accreditation or evaluation) (Fig. 1).

Fig. 1 The quality assurance and qualifications chain

EHEA countries are also signatories of the Lisbon Recognition Convention, which has a subsidiary text on the use of NQFs in the recognition of qualifications for learning and professional purposes. The main principle of the convention is that degrees and periods of study must be recognised unless substantial differences can be proven by those in charge of recognition.

Ministers also set out the ambitious goal of achieving automatic recognition of qualifications in the EHEA by 2020 (Bucharest Communiqué 2012, p. 2 and Yerevan Communiqué 2015, p. 1). In its report to the ministers in 2015, the Pathfinder Group on Automatic Recognition (p. 7, 23) underlined the importance of quality assurance systems in line with the EHEA’s agreed Standards and Guidelines for Quality Assurance (ESG) for reaching this goal. Ministers agreed to encourage higher education institutions and quality assurance agencies to assess institutional recognition procedures in internal and external quality assurance, and to promote the European Area of Recognition (EAR) manual as a set of guidelines for recognition (Bucharest Communiqué 2012, pp. 4–5).

The European Quality Assurance Register for Higher Education (EQAR) also forms part of the trust-building path. It was established by the E4 organisations at the request of Bologna ministers (2007) to “[further] the development of the European Higher Education Area by enhancing confidence in higher education and by facilitating the mutual recognition of quality assurance decisions”. EQAR’s core function is the management of the EHEA’s official register of quality assurance (QA) agencies that substantially comply with the ESG, providing reliable information on quality assurance provision in Europe and thus enhancing trust and recognition within the EHEA.

EQAR-registered QA agencies are required, as set out in the ESG (see standard 2.6), to publish the full reports of their external quality assurance activities. Making such information easily accessible to the academic community, external partners and other interested individuals increases the transparency of external QA and can expedite the recognition of academic qualifications.

Better accessibility of external QA results would be a further helpful tool for the (automatic) recognition of qualifications, as recognition information centres (ENIC-NARICs), higher education institutions and employers need an efficient way to establish whether a higher education institution was subject to external QA in line with the ESG. The accessibility of external QA reports and decisions is expected to be enhanced with the development of a database of external QA results in 2018.

The ESG not only serve as a common framework for the development of national quality assurance systems in the EHEA but are also regarded by QA agencies as a suitable basis for work across borders (Szabo 2015). At European policy level, ministers agreed to “allow EQAR-registered agencies to perform their activities across the EHEA, while complying with national requirements” (Bucharest Communiqué 2012) and “to enable higher education institutions to use a suitable EQAR registered agency for their external quality assurance process” (Yerevan Communiqué 2015).

The most recent addition to the common EHEA framework is the European Approach for Quality Assurance of Joint Programmes, adopted by ministers at the same time as the revised ESG in 2015 (Yerevan Communiqué 2015). Its “revolutionary” aspect is that ministers agreed that the European Approach should be applied without additional national criteria. That is, joint programmes would benefit from a joint approach to quality assurance based on a common set of European standards. The European Approach further includes an agreed external quality assurance procedure to be implemented by a suitable EQAR-registered agency identified by the cooperating institutions. This should be used where programmes require external evaluation or accreditation at programme level.

Outside the EHEA framework, other transparency tools operate in tandem, e.g. benchmarking and classification systems, and national and global rankings. They differ in purpose, policy orientation and methodology from the EHEA tools, while at the same time being part of the transparency and accountability of higher education systems. These initiatives are generally set up to provide consumer-oriented information to students and parents by measuring and comparing higher education performance, and may also be used as a management and strategic tool by higher education leaders and governing authorities (Hazelkorn 2012).

The most commonly used tools to satisfy the need for information on higher education systems within the EHEA are national repositories, portals, databases or platforms. The survey of possible users of a database of quality assurance results, carried out in 2016 by EQAR, showed that respondents most often turn to national or institutional tools for quality-related information on higher education institutions and programmes. The 380 respondents named over 70 different databases, repositories or portals which they have consulted (often or very often) for such information. The analysis of these databases showed that such tools were managed by national agencies or quality assurance agencies, and that users generally consult online tools within the national system they are most familiar with. While the results of the survey showed less familiarity with international online tools, about half of the respondents have used (at least once) the World Higher Education Database (WHED) and U-Multirank, and less than 40% stated they accessed the databases of Anabin, Qrossroads, and Learning Opportunities and Qualifications in Europe.

Admittedly, the existing transparency tools outside the EHEA enhance the provision of information on higher education systems. However, they do not cover (nor are they expected to) the 48 national higher education systems, nor do they recognise the diversity of higher education systems (as they use a narrow set of indicators). The vast majority of these tools do not include information on the external quality assurance of the institutions and programmes they contain and are thus limited in responding to the full extent of users’ needs.

The Governance of the Bologna Process

The intergovernmental policy-making within the Bologna Process is based on consensual agreement among all 48 participating countries. Although participation in the process is voluntary and it is up to each member state to follow up on its commitments, the process includes a number of tools and mechanisms to monitor progress towards the agreed objectives.

Comparable to the European Union’s open method of coordination (OMC), these ministerial agreements are a form of “soft” law. This is the preferred method of policy-making as it fosters policy learning through the establishment of shared understandings of best practices, processes of national reporting, and peer review, while acknowledging the diversity in Europe’s higher education systems (Harmsen 2013).

Among the “soft” law tools used within the Bologna Process are the reports and studies by consultative members of the Bologna Process. Before each ministerial conference, governments are also asked to report on the implementation of their commitments as part of the stocktaking exercise. Showing the extent to which these commitments have been implemented, the Monitoring Working Group prepares the so-called “Bologna scorecard indicators”, “naming and shaming” countries that are still lagging behind. The other follow-up structures of the Bologna Process prepare and coordinate the action needed to advance its goals.

The Bologna Process has been successful through its “soft” law approach, as it has fostered consensual dialogue, peer learning and has created a common European “language” of higher education policy without the menace of sanctions (Zgaga 2012). The process nevertheless has seen issues of both accountability and effectiveness (Garben 2010), and its governance seems to be more effective or suitable for purposes of policy formation and less so for aspects of policy monitoring, coordination or implementation (Lažetić 2010). One further criticism is that the “naming and shaming” mechanism provides a low degree of accuracy as it is based on analysis of national action that may “allow for window dressing” (Amaral and Veiga 2012).

Indicators for Trust and Transparency in the EHEA

The common European infrastructure for transparency, trust and recognition embraces a shared understanding of principles for quality assurance and recognition (common degree structure and credit system) that form part of the EHEA’s key commitments, viewed by ministers as a way of consolidating the EHEA and a prerequisite to ensure its success (Yerevan Communiqué 2015).

While the main vision and principles of the EHEA have stayed the same throughout the past 17 years, successive ministerial conferences have added more layers to the process, making it more difficult to assess how far its objectives have been fulfilled.

Conclusions may be drawn on the extent to which trust and transparency have been achieved within the EHEA by examining the level of implementation of each of the process’s objectives. A number of indicators are proposed and considered in detail: four of the indicators below (indicators 1, 2, 4 and 5) are composite indicators based on those used in the 2015 EHEA Implementation Report. Two further indicators were added to reflect the new commitments made in the Yerevan Communiqué 2015 (indicator 6) or recent developments in the EHEA, i.e. the set-up of a database of external QA results and reports (indicator 3).

Indicator 1: Stage of Development of QA Systems in Line with the ESG

Establishing internal and external quality assurance systems in line with the ESG is one of the “key commitments” identified by the Bologna Follow-Up Group in March 2016 (see BFUG 2016). Although most EHEA countries have set up some form of external quality assurance system, there are significant differences in the approaches behind them. Currently, 24 EHEA countries fulfil the commitment that external QA is performed by agencies that demonstrably comply with the ESG (see Fig. 2, dark blue countries), evidenced through registration on EQAR. Six other countries fulfil it partially, in that only some parts of their higher education system are externally quality assured in line with the ESG (light blue countries). The remaining EHEA countries (see Fig. 2, grey countries) have yet to fully develop an external quality assurance system in line with the ESG and the key commitment.

Fig. 2 Key commitment to external quality assurance

Over the years, there has been a continuous increase of ESG-compliant agencies on the Register and, consequently, an increase in the number of higher education systems that fulfil this requirement (see Table 2). Nine years after it was established, the Register includes 46 QA agencies carrying out external QA on a regular basis in 24 countries, and in a number of other EHEA countries as part of their cross-border external QA. In the past seven years, the Register has almost doubled the number of listed QA agencies (see Table 2).

Table 2 Evolution of the registration of quality assurance agencies on EQAR and coverage of higher education systems

About a third of the existing ca. 90 European QA agencies have not yet undergone an ESG review. While some of these agencies have been established only recently, others have chosen not to undergo an ESG review (yet) although they have been operating for a considerable time. However, according to the responses to the EQAR Self-Evaluation survey (2015, p. 35), most of the non-registered QA agencies surveyed stated that they plan to (re-)apply for inclusion on the Register in the future.

Indicator 2: Allowing HEIs to Choose an EQAR-Registered QA Agency

One significant measure of trust in other countries’ quality assurance systems and agencies is whether governments enable higher education institutions to be evaluated by a quality assurance agency from another country, provided that the agency works in full compliance with the ESG.

The data collected between 2014 and 2016 on the activities of EQAR-registered agencies show that about half of these agencies carry out reviews across borders, and that the total number of cross-border external QA activities increased by 29% compared to 2014 and by 35% compared to 2015. While these activities are growing and take place in most of the EHEA member countries, only 17 countries (35% of the EHEA) have put in place legislative provisions that allow (all or some) higher education institutions to request accreditation, evaluation or audit from a suitable EQAR-registered agency (see dark blue countries in Fig. 3). Nine other countries (EE, HU, FI, FR, NL, ME, KZ, PT, TR) have also opened their systems, although additional criteria apply for foreign QA agencies to operate in the country (see Fig. 3 and Annual Update of EQAR-registered agencies, Eurydice & EQAR Survey of EHEA 2017).

Fig. 3 Mapping system openness to EQAR-registered agencies

The data collected in the past three years by EQAR shows that most cross-border external QA activities are carried out in countries that recognise the activity of EQAR-registered agencies as part of the regular quality assurance at programme and/or institutional level (e.g. KZ, BE-FL, MD, AT, RO, CY, LT). Nevertheless, cross-border QA also takes place in countries where such recognition does not exist (e.g. RU, SI, UA, UK, MK, BH, TK, FR, LU) (Fig. 4).

Fig. 4 EHEA countries where EQAR-registered QA agencies carried out cross-border QA activities in the past three years ((*) countries that recognise the activity of EQAR-registered QA agencies)

Recognising accreditation, evaluation or audit by a foreign QA agency working on the basis of the same common platform codified in the ESG would avoid the often unproductive duplication of efforts, or even review fatigue, where both a national and a foreign agency review the same programme or institution, sometimes asking the same questions, even if for different purposes (see RIQAA 2014).

EUA’s Trends report (2015) states that cross-border EQA activities are increasing owing to the growing interest of quality assurance agencies and HEIs’ international aspirations, but concludes that “the actors (institutions and agencies) are ahead of the policymakers as indicated by the lack of progress in legal frameworks allowing institutions to choose any quality assurance agency that is listed in EQAR”.

Indicator 3: Use of the European Approach for the QA of Joint Programmes

Another indicator of trust in the EHEA is the use and recognition of the European Approach for Quality Assurance of Joint Programmes, based on the principle that one single evaluation using this approach is recognised in all countries where the joint programme is provided. The pre-condition for its use is that EHEA countries allow it in their national legislation, i.e. recognise external quality assurance in line with the European Approach as sufficient to fulfil the external QA obligations.

So far, the European Approach can only be used in a few countries with obligatory programme accreditation that have made recent legal changes or where existing legal provisions already allow its use (e.g. BE-FL, DK, DE, NL). Discussions are on-going or legislative changes are being drafted in a few additional countries (e.g. HR, SI). In a few EHEA countries (AT, FI, IE, UK), higher education institutions (some or all) do not require external programme level accreditation, thus they may choose to use the European Approach in their internal QA arrangements in order to “self-accredit” their programmes without a need for legislative changes.

In total, the European Approach is, in principle, available to all institutions in 12 higher education systems and to some institutions in another 13 systems. Since 2015, only a handful of EQAR-registered agencies have declared that they actually used the European Approach. In general, joint programmes remain a relatively small phenomenon: only 1% of the EQAR-registered agencies’ programme evaluations/accreditations concern joint programmes, and their number has decreased significantly in the past years (Annual Update of EQAR-registered agencies).

Indicator 4: Self-certification of National Qualifications Frameworks

According to the Implementation Report, in 2015, 38 countries were in the “green zone” regarding the implementation of national qualifications frameworks. About half of the EHEA countries have self-certified their compatibility with the QF-EHEA, while 14 more countries were close to completion. Three countries (AD, SK and RU) remain at the first steps of implementing their national qualifications frameworks.

EUA’s Trends report (2015) revealed that higher education institutions from countries that have a national qualifications framework generally rated the impact of the NQF highest in terms of promoting transparency and comparability between degrees and across education sectors. The Trends report also showed that while some countries had self-certified their national qualifications frameworks, the institutions were not always aware of it, although the self-certification process requires that the NQF be fully used by institutions in order to be operational.

Indicator 5: Implementation of the Lisbon Recognition Convention

The Lisbon Recognition Convention (LRC) provides a legal basis for recognition in the EHEA and has helped develop the methodology of credential evaluators through the networks of national recognition information centres (ENIC-NARIC networks). The implementation of the LRC represents a measure of the degree of convergence and trust attained (Bucharest Communiqué, p. 4).

The 2015 Implementation Report (p. 78) assessed the extent to which countries have specified in their national legislation five of the main principles of the LRC: (i) whether applicants have a right to fair assessment; (ii) whether there is recognition if no substantial differences can be proven; (iii) whether the comparison of learning outcomes, rather than programme content, is encouraged; (iv) whether, in cases of negative decisions, the competent recognition authority demonstrates the existence of a substantial difference and (v) whether the applicant has the right to appeal recognition decisions.

The country responses revealed that only 11 systems included all the main principles, with 26 other systems omitting one principle, usually the requirement that the competent authority demonstrate the existence of substantial differences. The report also underlines that embedding these principles in legislation does not necessarily guarantee good recognition practices, as the higher education institution is generally the one responsible for taking the final decision on recognising foreign qualifications for academic purposes. The data submitted by countries for the 2018 Implementation Report show some further improvements in adopting the LRC principles, with a few more countries making reference to them in their national legislation and pointing out that quality assurance agencies normally examine recognition practice during their external quality assurance activities.

Indicator 6: Use and Accessibility of Published External QA Reports

Whilst there are various dimensions to the transparency of external QA, the accessibility of the published reports (on evaluation/accreditation/audit of higher education institutions and programmes) is one important aspect. The public accountability and transparency requirements in quality assurance systems are evolving, with more and more outcomes of quality assurance evaluations of higher education institutions or programmes being published, even when negative (Implementation Report, p. 18). According to the annual update of EQAR-registered agencies, over 9000 reviews are carried out each year by these agencies, resulting in the same number of quality assurance reports. In 2016, the 44 EQAR-registered agencies carried out a total of 9764 external activities, of which 6% were at institutional level, 93% at programme level and 0.3% at joint programme level, within 30 of the EHEA member countries.

The information on the external QA of higher education institutions and programmes is currently spread across many quality assurance agencies’ websites, most of them national. European databases and tools usually offer only patchy and limited information on quality assurance results and decisions (see Database of External Quality Assurance Results Report and Operational Model 2016).

The survey carried out by EQAR in 2016 with possible users of a database of quality assurance reports and results (EQAR 2016, p. 8) revealed that 42% of respondents consult decisions or reports on the external quality assurance (QA) of higher education institutions or programmes on at least a monthly basis, with a third of them doing so at least once a week (Fig. 5). There is some variation across respondent profiles and, not surprisingly, those dealing with the recognition of qualifications access reports/decisions most frequently, next to quality assurance agencies themselves.

Fig. 5 Frequency in the use of published external QA reports by group

While only 20% find that external quality assurance (QA) decisions and reports are “difficult to access”, 16% find them “easily accessible”. By far, most respondents (61%) to EQAR’s survey considered that external QA results and decisions are “somewhat accessible”.

Among the main issues encountered in searching for such information, respondents mentioned the following:

  • Identifying which agency or agencies might have carried out an evaluation, accreditation or audit of the higher education institution of interest;

  • Navigating different QA agencies’ websites, which vary in structure and user-friendliness;

  • Language barriers, as most reports are only available in the local language. The same sometimes applies to the agencies’ websites;

  • Understanding different national systems and the status and meaning of external QA decisions and reports therein.

A European database bringing together all quality assurance results within the scope of the ESG would be able to address some of the users’ concerns, e.g. making it easy to identify higher education institutions that have undergone an ESG-type review, whether that review was voluntary or part of the regular external QA, and the validity of the decision. It would not remove all difficulties, however; the language of the reports, for instance, would generally remain the same. Nevertheless, most users (76% of respondents) would find it useful to have access to a pan-European database of external quality assurance reports and decisions, as it would help them navigate the complexity of external quality assurance systems in Europe and expose the results of external QA to a wider audience.

Such a database was welcomed by the EHEA governments (members of EQAR) and has received co-financing for its implementation through the EU’s Erasmus+ funded projects. The extent of its success will largely depend on the accuracy and clarity of the information and on the participation of all registered quality assurance agencies.

Implications for Trust and Transparency in the EHEA

While the discussed indicators provide some evidence that the support mechanisms for trust and transparency are in place and that the different commitments have been realised, it remains difficult to draw a clear balance sheet of where we stand in terms of trust and transparency between higher education systems.

Generally, at policy level, the key commitments to building trust and transparency have been followed through, although unevenly among countries or even within the same higher education system. With all the limitations stemming from the nature of a voluntary process, the EHEA is taking shape: ministers have committed to a European framework to consolidate trust and recognition, and developments are visible in a considerable part of the EHEA. There is clear potential for achieving automatic recognition at system level in approximately half of the EHEA countries. For at least some of the EHEA countries, three quarters of qualifications are treated on an equal footing with national qualifications (Implementation Report 2015, p. 18). Quality assurance systems are in place in almost all higher education systems, although the use of EQAR-registered agencies and the implementation of the ESG are visible in a little over half of the EHEA.

While a number of other transparency tools exist across the EHEA, they are generally built around the idea of excellence and performance indicators that ignore the diversity of higher education systems and institutional missions and do not provide information on the results of quality assurance assessments. Thus, the path at EHEA level should be to ensure the consolidation of the current framework for trust and transparency that values its diversity, enables the use of quality assurance fit for purpose by a suitable EQAR-registered quality assurance agency and rewards activities across teaching, learning and research.

The challenge in ensuring this consolidation lies with the differential implementation of Bologna, ensuing from what sometimes appears to be a “pick and choose” or “à la carte” approach. Countries are not incentivised to follow up on all commitments because membership of the process is not conditional on the actual implementation of the agreements (Furlong 2010). Ministers thus recognised the risk that non-implementation in some countries undermines the functioning and credibility of the EHEA as a whole, as indicated in the Yerevan Communiqué (2015). There is a particular risk of deepening the gap between those that have implemented the commitments and those that experience difficulties in doing so. A dangerous consequence in this respect might be that trust concentrates in a few regions and countries with comparable or more compatible systems, instead of spanning the whole EHEA. Following up on the Yerevan Communiqué, the Bologna Follow-Up Group commissioned a dedicated advisory group to make proposals on how to tackle the problem of non-implementation of key commitments. It remains to be seen what changes to the governance of the Bologna Process ministers will consider at their upcoming ministerial conference in Paris, in May 2018.

The specific characteristics of national modes of governance play a role in the extent to which the dynamic higher education sector is shaped. The policy and institutional goals are best shaped through integrative communication processes between policy-makers and higher education institutions (Maassen et al. 2011).

Boer et al. (2016) argue that reform processes in higher education proved most successful when stakeholders were involved in the earlier stages of policy development and there was deliberate action towards reaching consensus. Where such consensus was not achieved, reform initiatives ran counter to the interests of those initiating the reform and problems emerged in the implementation. Thus, driving progress forward requires the participation of all member countries and support from all stakeholders. The actors of the Bologna Process operate in a relatively closed arena, essentially engaging a limited community of officials and experts, which may not ensure a good assimilation into national higher education systems. Consolidating stakeholder engagement and ensuring broad ownership among those who have the responsibility for the operational implementation is of particular relevance, especially in areas where there is still a low level of implementation or awareness (Amaral and Veiga 2012).

A further challenge is that higher education reforms are usually filtered through the different opportunities and constraints provided by the national and institutional context (Dobbins and Knill 2014). The 2015 Trends report draws attention to a number of issues in realising the Bologna commitments: the hasty introduction of the three-cycle structure in some countries did not always lead to meaningful curricular renewal; the use of the Diploma Supplement had in parts been disconnected from developments in learning outcomes and qualifications frameworks; and where institutions were not involved in consultations on the national qualifications system, they fell short in understanding the importance of learning outcomes and of their role within the qualifications frameworks in facilitating mobility and lifelong learning.

Going forward, it is reasonable to expect that most countries will aim to catch up in the implementation of their key commitments, so as to ensure that their higher education systems are fully trusted by their international partners, that the qualifications they offer are easily recognised and that this facilitates the mobility of students and professors within the EHEA. The Bologna Process has acted as an external catalyst in the past, even for internal change that was at times unpopular, and may still do so, at least for those who entered the process at a later stage.