
Impact assessment of a support programme of science-based emerging technologies

  • Ulrich Schmoch
  • Bernd Beckert
  • Petra Schaper-Rinkel

Abstract

The impact assessment of support programmes of science-based emerging technologies requires the analysis of several dimensions of performance, as these programmes address use-inspired basic research, which is linked both to basic science and to technological application. Bibliometric analysis proves to be a useful tool for capturing these different aspects of performance. In the specific programme "Future and Emerging Technologies" (FET), interdisciplinarity turns out to be crucial for achieving excellent and creative outcomes. Furthermore, the orientation towards risky projects yields some excellent results, but few failures.
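
The bibliometric analysis referred to above can take many forms. As a purely illustrative sketch (not the method used in the paper), the following Python fragment computes a standard field-normalized citation indicator (mean normalized citation score, MNCS) for programme projects; all function names, project labels, and figures are hypothetical.

```python
# Illustrative sketch only: a field-normalized citation indicator divides a
# publication's citations by the average citation count of its field and
# publication year, then averages these ratios per project. Data are invented.

from collections import defaultdict

def field_normalized_scores(publications, world_baseline):
    """Return the mean normalized citation score (MNCS) per project.

    publications   -- list of dicts with keys 'project', 'field', 'year', 'citations'
    world_baseline -- dict mapping (field, year) -> average citations of all
                      publications in that field and year (reference standard)
    """
    per_project = defaultdict(list)
    for pub in publications:
        expected = world_baseline[(pub["field"], pub["year"])]
        per_project[pub["project"]].append(pub["citations"] / expected)
    # An MNCS above 1 indicates citation impact above the world average
    # for the respective fields and years.
    return {project: sum(scores) / len(scores)
            for project, scores in per_project.items()}

# Hypothetical example data
pubs = [
    {"project": "FET-A", "field": "physics", "year": 2008, "citations": 30},
    {"project": "FET-A", "field": "computer science", "year": 2009, "citations": 12},
    {"project": "FET-B", "field": "physics", "year": 2008, "citations": 4},
]
baseline = {("physics", 2008): 10.0, ("computer science", 2009): 8.0}
print(field_normalized_scores(pubs, baseline))  # {'FET-A': 2.25, 'FET-B': 0.4}
```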

Keywords

Impact assessment · Science-based emerging technologies · Multi-dimensional impact · Impact of interdisciplinarity · Impact of risk-orientation

Notes

Acknowledgements

Certain data included in this paper are derived from the Science Citation Index Expanded (SCIE), the Social Science Citation Index (SSCI), the Arts and Humanities Citation Index (AHCI), and the Index to Social Sciences and Humanities Proceedings (ISSHP) (all updated June 2010), prepared by Thomson Reuters (Scientific) Inc. (TR®), Philadelphia, Pennsylvania, USA. © Copyright Thomson Reuters (Scientific) 2010. All rights reserved.

Funding

Funding was provided by the European Commission (Grant No. i665083).

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2019

Authors and Affiliations

  • Ulrich Schmoch (1), corresponding author
  • Bernd Beckert (1)
  • Petra Schaper-Rinkel (2)

  1. Fraunhofer Institute for Systems and Innovation Research, Karlsruhe, Germany
  2. Austrian Institute of Technology, Vienna, Austria
