Introduction to MIRA, an Open Solution Approach

  • Cynthia H. Stahl
  • Alan J. Cimorelli
Part of the Risk, Systems and Decisions book series (RSD)


The Multi-criteria Integrated Resource Assessment (MIRA) open solution approach is both a framework and a process. MIRA is a transparent policy framework that embraces diversity, adversity, and discovery, with a process that allows for the policy-specific evaluation of Decision Uncertainty and the emergence of stakeholder agreement. Three major aspects make an open solution approach unique: the inclusion of Decision Uncertainty; stakeholder-directed, but structured, iterations of the Requisite Steps; and trans-disciplinary learning. Trans-disciplinary learning is key to ensuring that each iteration is informed by the previous one. These components distinguish the open solution approach both from conceptual social science approaches, which lack methodologies for comparing specific policy alternatives, and from the reductionist decision-analytic approach that is currently the dominant public policy-making paradigm. This chapter describes the Requisite Steps as applied in a MIRA approach, including terminology and guiding principles specific to MIRA.


Keywords: Trans-disciplinary learning · Adaptive management · Transparency in policy making · Multi-criteria assessment



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Cynthia H. Stahl (1)
  • Alan J. Cimorelli (2)
  1. US EPA, Philadelphia, USA
  2. US EPA (retired), Philadelphia, USA
