Monitoring, Evaluation and Risk Management
Monitoring and evaluation (M&E) is a key component of RI investment. This includes the tracking of (i) research outputs; (ii) grant expenditure by grant holders; (iii) research productivity of grant holders; and (iv) student training and graduation rates. In some cases, research performing entities such as universities, science councils and NFs are held responsible by the funding agency for the monitoring of RI during the construction and implementation phase. Monitoring of RI also forms a key component of the annual reporting frameworks based on predetermined indicators aligned to funding activities and outcomes. Evaluation of such activities is needed in order to adapt and enhance the programmes as well as to: (i) ensure scalability of interventions; (ii) improve performance; and (iii) highlight factors (internal and external) that affect implementation. This chapter presents a number of conceptual frameworks that are used to structure M&E criteria for the successful management of RI investments.
6.1 Monitoring and Evaluation
Various conceptual frameworks are used to design and structure M&E criteria. These include: (i) the logic model; (ii) the results-chain framework; and (iii) the balanced scorecard approach.
In using the logic model, the following key variables are considered: inputs, outputs and outcomes. The model also considers the logical linkages to external influences, environmental and related programmes, as well as the situational context (problem) that motivates the introduction of the intervention (inputs and outputs) to achieve a specific impact (outcome) (Millar, Simeone and Carnevale, 2001). The logic model is often critiqued for being a linear model that aims to monitor and evaluate a multi-dimensional process. When planning to build a logic model, the following questions can be posed: (i) what is the current situation that needs to be tackled? (ii) what will it look like when the desired outcome has been achieved? (iii) what behaviours need to change for that outcome to be achieved? (iv) what knowledge or skills do people need before the behaviour changes? (v) what activities need to be performed to cause the necessary change? (vi) what resources will be required to achieve the desired outcome? (Millar et al., 2001).
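The elements of the logic model described above can be captured as a simple structured record. The sketch below is a minimal illustration; the field names and the example entries are assumptions for demonstration, not part of any formal specification.

```python
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    """Illustrative container for the elements of a logic model."""
    situation: str  # the problem motivating the intervention
    inputs: list = field(default_factory=list)      # resources invested
    activities: list = field(default_factory=list)  # what is done with the inputs
    outputs: list = field(default_factory=list)     # direct products of activities
    outcomes: list = field(default_factory=list)    # desired change / impact
    external_influences: list = field(default_factory=list)

    def is_complete(self) -> bool:
        """The chain is only useful once every link is populated."""
        return all([self.situation, self.inputs, self.activities,
                    self.outputs, self.outcomes])


model = LogicModel(
    situation="Limited access to advanced research equipment",
    inputs=["RI grant funding", "host institution staff"],
    activities=["acquire and install equipment", "train operators"],
    outputs=["operational multi-user facility"],
    outcomes=["increased research productivity and student training"],
)
print(model.is_complete())  # True
```

Walking through `is_complete` mirrors the planning questions above: an empty list flags the link in the chain that has not yet been thought through.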
The second approach is the results-chain framework, which structures evaluation around questions such as the following:
Does the programme in its current form respond to national priorities and original objectives?
What progress has been made in implementing the contractual framework?
Were the programme, systems, processes and activities put into place as originally intended?
What factors have facilitated and/or acted as barriers to implementation?
How can the implementation process of the new contract be improved?
To what extent are the strategic objectives for the programme being met?
Is the programme achieving the goals and objectives it was intended to accomplish?
Have the interventions and equipment used produced the expected effects?
Could more effects be obtained by using different equipment?
Are the programme’s activities being produced with appropriate use of resources such as budget and staff time?
Have the objectives been achieved at the lowest cost, or can better effects be obtained at the same cost?
To what extent has the infrastructure and workload changed?
Is the equipment producing satisfactory outcomes with regard to the initial goal from the beneficiary’s point of view?
Have local working relationships with and within the field system changed?
Can progress on goals and objectives be shown to be related to the programme, as opposed to other things that are going on at the same time?
Is the programme sustainable? This links to: (i) financial, (ii) human resourcing, (iii) environment, and (iv) research outputs.
What quality assurance measures have been introduced? (World Bank, 2012)
The third approach is that of the balanced scorecard. In 1992, Kaplan and Norton proposed the balanced scorecard method to evaluate and measure the financial and non-financial performance of organisations in terms of finances, customers, internal business processes, and learning and growth. The development of the balanced scorecard, therefore, claims to provide a holistic perspective of progress and performance towards achieving strategic goals that allow the organisation to function in a rapidly evolving environment. This multi-perspective method articulates links between inputs, processes and outcomes as well as focuses on the importance of managing these components in order to achieve the organisation’s strategic priorities and targets (Kaplan & Norton, 1992). The balanced scorecard has been adopted in the services, manufacturing, marketing and retailing, and public sectors (Hoque, 2014).
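The four Kaplan and Norton perspectives can be sketched as a simple data structure holding measures with targets and actuals. The measure names and figures below are illustrative assumptions only; they are not drawn from any real scorecard, and real measures would need per-measure handling (e.g. measures where lower is better).

```python
# Balanced scorecard sketch: the four Kaplan & Norton perspectives,
# each holding (target, actual) pairs for higher-is-better measures.
scorecard = {
    "financial": {"grant spend within budget (%)": (100, 92)},
    "customer": {"researcher satisfaction (1-5)": (4.0, 4.2)},
    "internal processes": {"reports processed on time (%)": (95, 88)},
    "learning and growth": {"staff trained on new systems": (25, 18)},
}


def achievement(target, actual):
    """Actual as a fraction of target, capped at 100%."""
    return min(actual / target, 1.0)


for perspective, measures in scorecard.items():
    for name, (target, actual) in measures.items():
        print(f"{perspective}: {name} -> {achievement(target, actual):.0%}")
```

The point of the structure is the "balance": progress is visible across all four perspectives at once rather than through financial measures alone.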
The choice of the most suitable M&E tool depends on its fit with the organisation’s mandate and its strategic imperatives. This means that based on the maturity of the organisation and the systems and processes that are in place, the choice of the M&E tool may differ.
6.2 Site Visits and/or Technical Audits
During a site visit and/or technical audit, the following aspects are verified:
All management plan criteria and requirements are met.
The functional capability of the equipment, in terms of yielding results that meet publication or journal standards.
The quality and quantity of outputs linked to the usage of the equipment.
The usage of the equipment by (i) postgraduate students; (ii) other researchers, both national and international; and (iii) private sector.
Have the manufacturer-specified environmental conditions for housing the equipment been met? If there are gaps in meeting any or all of the specified conditions as described in Chap. 7, then these must be stated.
Are there any challenges that relate to either the hardware or software? If hardware is reported as a challenge, then the supplier must indicate whether the replacement components are covered by the guarantees and/or warranties of the service level agreement.
Are there gaps in the skills set at the institution, in terms of optimally operating the equipment?
Steps that will be taken to address the gaps identified.
Timelines for delivery.
Available budget for the implementation of the above.
Technical audit team comprising representatives from the funding agency, including (i) staff responsible for managing the RI grants; (ii) internal auditors; (iii) financial grants management staff; and (iv) an independent research equipment expert.
Research equipment team comprising (i) senior management representatives at the research institution; (ii) the grant holder; and (iii) the supplier and/or manufacturer of the equipment.
Reaching consensus and recording the agreements, committed budgets and timeframes for implementation.
Engaging the supplier and/or manufacturer on how best to expedite the process for addressing the gaps and/or challenges. This may include defining the role of the supplier and/or manufacturer in aiding the grant holder to resolve these challenges.
Engaging senior management and the grant holder at the research institution on meeting the agreed deliverables.
In the event of a lack of commitment or adherence to the timelines and/or deliverables in the management plan, the funding agency may invoke the breach clause in the Conditions of Grant and proceed to either withdraw or recall the grant awarded to the institution, as described in Chap. 4.
6.3 Risk Management
Funding agencies need to manage risks on a daily basis, especially relating to financial controls and integrity (Bailey, 2010). These organisations need to guard against falling prey to managing risks in a haphazard and unsystematic manner. In this section, the term "risk" is used to describe event(s) that have a potentially negative impact on the funding agency's assets, activities and operations (Kwak & Keleher, 2015). The management of risks and risk events refers to the (i) continuous process of assessing risks; (ii) reducing the chances of a risk event transpiring; and (iii) putting in place measures to tackle an event should it occur (Kwak & Keleher, 2015). The mapping of potential risks and the impact of risk events against the likelihood of such events transpiring forms part of a risk register, and is an important risk management exercise (Bailey, 2010). Hence risk management must commence at the RI planning phase.
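The likelihood-against-impact mapping that underpins a risk register can be sketched in a few lines. The 1–5 scales, the scoring rule (likelihood × impact) and the example entries below are illustrative assumptions, not drawn from any real register.

```python
# Minimal risk-register sketch: each risk is scored as likelihood x impact
# (both on an assumed 1-5 scale) so the highest-exposure items surface first.
register = [
    {"risk": "currency fluctuation inflates equipment cost", "likelihood": 4, "impact": 3},
    {"risk": "supplier fails to deliver on schedule",        "likelihood": 2, "impact": 4},
    {"risk": "key operator leaves before training others",   "likelihood": 3, "impact": 5},
]

# Score each entry, then rank by exposure, highest first.
for entry in register:
    entry["score"] = entry["likelihood"] * entry["impact"]
register.sort(key=lambda e: e["score"], reverse=True)

print([(e["risk"], e["score"]) for e in register])
```

Ranking by exposure is what makes the register a planning tool: mitigation effort goes to the top of the sorted list first.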
Part of risk management relating to research equipment involves planning to minimise loss (financial and other), damage to acquired physical assets, and exposure to third-party allegations of liability. Information presented in this section makes reference to the work done by Bailey (2010) and Kwak and Keleher (2015). Six components are identified as part of the risk management process: (i) the internal environment; (ii) objective setting; (iii) event identification; (iv) risk assessment and response; (v) control activities; and (vi) communication and monitoring (Bailey, 2010).
One of the suggestions of Kwak and Keleher (2015) is to adopt enterprise risk management (ERM) as a tool to manage risks and exploit opportunities. The rationale for using ERM is that it affords organisations, particularly funding agencies, the ability to identify and assess threats or risk events in terms of the likelihood of such an event transpiring and the magnitude of impact should the risk event occur. A further suggestion is that the funding agency develop new internal policies in support of the ERM, and that for risk management processes to be effective, existing data sources must be utilised whilst simultaneously considering the incorporation of new ones. By way of recommendations, Kwak and Keleher (2015) propose that funding agencies utilise data-driven systems to collect and manage data which in turn can be utilised to assess risks; such data may include historic data on the grant holder in terms of number of grants held, size of grant values, performance and other monitoring data. Another recommendation is that investment in the introduction of new or revised risk management practices be supported by parallel investments in training and capacity development interventions. These in turn can inform tools and processes to standardise the decision-making and decision-approving process within the funding agency (Kwak & Keleher, 2015).
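A data-driven assessment in the spirit of Kwak and Keleher (2015) could feed a grant holder's historic monitoring data into a simple scoring rule. The function, field names, thresholds and weights below are hypothetical illustrations only, not a method proposed by the authors.

```python
# Hypothetical sketch of data-driven grant-holder risk assessment:
# historic monitoring data feed a coarse score. All weighting rules
# and thresholds are assumptions chosen for illustration.
def grant_holder_risk(history):
    """Return a coarse risk rating from a grant holder's track record."""
    score = 0
    if history["grants_held"] < 2:
        score += 2  # little track record with the agency
    score += history["late_reports"]  # each late report raises exposure
    if history["avg_grant_value"] > 1_000_000:
        score += 1  # larger awards carry larger exposure
    return "high" if score >= 3 else "medium" if score >= 1 else "low"


print(grant_holder_risk(
    {"grants_held": 1, "late_reports": 2, "avg_grant_value": 500_000}))
# -> high
```

Even a rule this crude shows the value of standardised historic data: two grant holders are compared on the same evidence rather than on ad hoc judgement.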
In addition, risk management must be an iterative process across the four stages of the grant lifecycle. Within each stage of the grant lifecycle, risk events have the possibility of materialising, and funding agencies need to be proactive in preparing for such threats. For a detailed risk implementation framework, refer to Annexure B. In planning insurance cover for the equipment, the following questions arise:
What will be insured?
At whose cost?
What are the options for public liability cover?
What are the options for professional liability cover?
In safeguarding the funding or investment from risks, it is imperative for the funding agency that is awarding the grant to stipulate the conditions associated with that grant award. These conditions constitute a legally binding document that is issued by the funding agency and is consented to and signed by the researcher and their research institution's designated authority.
As part of a risk management process, one of the recommendations by Kwak and Keleher (2015) is for a business unit for risk management services to be established. This unit ought to comprise (i) a policy team that drafts policy and provides technical assistance to staff at the funding agency; (ii) a management improvement team that focuses on providing assistance to grant holders on matters relating to grants; and (iii) a programme monitoring team that concentrates on monitoring and evaluation activities as well as measuring performance against KPIs. This team also focuses on standardisation of the collection and review of data (Kwak & Keleher, 2015).
In order to manage risks relating to large investments in RI, a requirement from the side of the funding agency would be to put in place a governance and management structure at the host research institution. Based on experience, it is imperative to have a two-layered governance structure. The first layer will primarily (i) have an advisory role; (ii) ensure good governance; (iii) commit to the provision of the necessary resources required to meet obligations and conditions relating to the equipment, including risks relating to currency fluctuations; and (iv) review performance and budgets. This first layer can be termed the advisory committee and may comprise, but is not limited to, representatives from (i) senior management at research institutions; (ii) the funding agency; (iii) the private sector or other donor parties if they have contributed in some form to the cost of acquiring the research equipment; (iv) the public outreach sector; (v) operations management; and (vi) independent experts.
The second layer, or operations committee, may comprise, but is not limited to, representatives from (i) the user community; (ii) the researcher to whom the equipment was awarded; (iii) staff scientists, operators, technicians, engineers and data specialists; and (iv) the finance officer. The operations committee will be responsible for (i) the day-to-day management of the facility; (ii) reporting on usage of the equipment, income and expenditure, and research outputs; (iii) developing an access and research strategy for the research equipment facility; and (iv) submitting the statutory reports required by the funding agency.
6.4 Reporting
Data received by the NRF is usually sourced from annual progress reports (APRs) that grant holders submit each year (National Research Foundation, 2018b). Institutional management must check that the information presented to the funding agency is: (i) accurate; (ii) complete; (iii) valid; (iv) reliable; and (v) transparent, in accordance with Sect. 4.2 above. This quality assurance check ensures that collated and consolidated information is accurately reported by the funding agency against both financial and non-financial indicators. The non-financial indicators within the NRF context extend firstly to outputs linked to human capital development, which in turn count (i) the number of users linked to the placement of the equipment; and (ii) the number of postgraduate students trained on using the equipment. The second non-financial indicator links to research outputs, viz. (i) number of publications; (ii) number of patents; and (iii) other research outputs (National Research Foundation, 2018b). These indicators are expanded as follows:
Human capital development
Number of postgraduate students trained: A reflection of how many Master’s and Doctoral students have obtained degrees where they utilised the research equipment.
Number of users: A reflection of usage of the equipment by the wider research community.
Staff and researcher development: A reflection of capacity development for training instrument staff and researchers, both at the home institution as well as other research institutions. This also links to the concept of succession planning.
Number of publications: A reflection of the productivity linked to the usage of the equipment.
Number of patents: A reflection of the innovative capacity linked to the usage of the equipment.
Other research outputs: A reflection of other novel areas of productivity linked to the usage of the equipment. These may extend to invited plenary talks at national and/or international meetings that link to the research equipment.
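Consolidating APR submissions into the indicators above is a simple aggregation. The record layout below is an assumption for illustration; real APR data will differ in structure and detail.

```python
# Sketch of consolidating annual-progress-report (APR) records into the
# non-financial indicators listed above. The record layout is assumed.
apr_records = [
    {"students_graduated": 3, "users": 12, "publications": 5, "patents": 0},
    {"students_graduated": 1, "users": 30, "publications": 8, "patents": 1},
]

indicators = {
    key: sum(record[key] for record in apr_records)
    for key in ("students_graduated", "users", "publications", "patents")
}
print(indicators)
# {'students_graduated': 4, 'users': 42, 'publications': 13, 'patents': 1}
```

The quality-assurance check described above happens before this step: aggregation is only as reliable as the validated records feeding it.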
Table: Outputs against research infrastructure grants awarded by the National Research Foundation from 2009 to 2017, reporting the total number of users and publications (peer-reviewed articles, conference proceedings, books, chapters) (National Research Foundation, 2018a).
Table: Priority areas supported through research conducted on equipment grants awarded by the National Research Foundation, including Farmer to Pharma (no grants awarded), Global Climate Change, and Human and Social Dynamics (National Research Foundation, 2018a).
6.5 Equipment Database
A national research equipment database serves several purposes:
Informing a funding agency of continued investment(s) in research equipment and platforms.
Advising the researcher community of what equipment is available nationally.
Facilitating access by researchers and students to multi-user equipment.
Stimulating new applications to the funding agency for research infrastructure (National Research Foundation, 2018a).
Minimising the duplication of equipment within a specific institution, region or country.
The database records the following details for each piece of equipment:
Type of equipment.
Model of the equipment.
Functional state of the equipment.
Disciplines supported by the equipment.
Geographical location of the equipment (name of the research institution, the department and the laboratory space/building the equipment occupies).
Contact details of the person in charge of the equipment who would facilitate access to various users (National Research Foundation, 2018a).
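The fields listed above translate naturally into a record schema. The sketch below adapts the list items into field names; it is an illustration, not the NRF's actual database schema, and the example values are invented.

```python
from dataclasses import dataclass


@dataclass
class EquipmentRecord:
    """Illustrative database record built from the fields listed above."""
    equipment_type: str
    model: str
    functional_state: str   # e.g. "operational", "under repair"
    disciplines: tuple      # disciplines supported by the equipment
    institution: str
    department: str
    location: str           # laboratory space / building occupied
    contact_person: str     # facilitates access for users


record = EquipmentRecord(
    equipment_type="NMR spectrometer",
    model="Illustrative-600",
    functional_state="operational",
    disciplines=("chemistry", "biochemistry"),
    institution="Example University",
    department="Chemistry",
    location="Science Block, Lab 2",
    contact_person="facility.manager@example.ac.za",
)
print(record.functional_state)  # operational
```

Keeping discipline and location fields queryable is what enables the mapping and duplication-minimisation uses described next.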
Such a database is able to map the type of research equipment available within a country and how this is distributed across the national landscape with the secondary objective of minimising the duplication of investments at institutions that are in close proximity. It serves as an analytical tool that allows funding agency staff to update content and also track the outputs, outcomes and impact relating to the investment in research equipment.
This chapter presents an overview of monitoring and evaluation aligned to the management of research infrastructure. Furthermore, the chapter makes reference to pertinent issues such as risk management, reporting, site visits and technical audits. This chapter also recommends the development and maintenance of a database that can serve as a central repository of RI grants within a specific country.
- Bailey, J. A. (2010). Strengthening control and integrity: A checklist for government managers. [online] Available at: http://www.businessofgovernment.org/sites/default/files/BaileyReport.pdf. Accessed: 31 January 2018.
- Kaplan, R. S., & Norton, D. P. (1992). The balanced scorecard: Measures that drive performance. Harvard Business Review, 70(1), 71–79.
- Kwak, Y. H., & Keleher, J. B. (2015). Risk management for grants administration: A case study of the Department of Education. [online] Available at: http://www.businessofgovernment.org/sites/default/files/Risk%20Management%20for%20Grants%20Adminnistration.pdf. Accessed: 31 January 2018.
- National Research Foundation. (2018a). Research equipment database. [online] Available at: http://eqdb.nrf.ac.za/funding. Accessed: 10 October 2018.
- National Research Foundation. (2018b). Infrastructure funding instrument: National equipment programme framework and funding guide. [online] Available at: http://www.nrf.ac.za/sites/default/files/documents/NEP%20Call%20Framework%20and%20Funding%20Guide%202019.pdf. Accessed: 21 September 2018.
- World Bank. (2012). Designing a results framework for achieving results: A how-to guide. [online] Available at: https://siteresources.worldbank.org/EXTEVACAPDEV/Resources/designing_results_framework.pdf. Accessed: 26 November 2019.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.