Taking scholarly books into account, part II: a comparison of 19 European countries in evaluation and funding

  • Elea Giménez-Toledo
  • Jorge Mañana-Rodríguez
  • Tim C. E. Engels
  • Raf Guns
  • Emanuel Kulczycki
  • Michael Ochsner
  • Janne Pölönen
  • Gunnar Sivertsen
  • Alesia A. Zuccala
Article

Abstract

In May 2016, an article published in Scientometrics, titled ‘Taking scholarly books into account: current developments in five European countries’, introduced a comparison of book evaluation schemes implemented in five European countries. The present article expands upon this work by including a broader and more heterogeneous set of countries (19 European countries in total) and by adding new variables for comparison. Two complementary classification models were used to point out the commonalities and differences between the countries’ evaluation schemes. First, we employed a double-axis classification to highlight the degree of ‘formalization’ of each scheme; second, we classified each country according to the presence or absence of a bibliographic database. Each country’s evaluation scheme possesses its own unique merits and details; however, this study identified four main types of book evaluation systems, leading to the following main conclusions. First, countries may be differentiated on the basis of whether or not they use a formalized evaluation system. Second, countries that do use a formalized evaluation system have a supra-institutional database, quality labels for publishers, and/or publisher rankings in place to harmonize the evaluations. Countries that do not use a formalized system tend to rely less on quantitative evaluation procedures. Each evaluation type has its advantages and disadvantages; therefore, an exchange between countries might help to generate future improvements.

Keywords

Scholarly books · Book publishers · Evaluation processes · Classification · Research evaluation · Social sciences · Humanities · Book series

MSC Classification

00–02 

JEL Classification

C00 

Notes

Acknowledgements

The authors want to thank all ENRESSH participants in the survey for their valuable contribution to this work: Croatia: Jadranka Stojanovski; Czech Republic: Jiří Kolman and Petr Kolman; France: Ioana Galleron; Israel: Judit Bar-Ilan, Saul Smiliansky and Sharon Link; Italy: Ginevra Peruginelli; Latvia: Arnis Kokorevics and Linda Sīle; Lithuania: Aldis Gedutis; Montenegro: Sanja Pekovic; Portugal: Luisa Carvalho and Ana Ramos; Serbia: Dragan Ivanovic; Slovakia: Alexandra Bitusikova; Slovenia: Andreja Istenic Starcic; and Switzerland: Sven Hug.

Funding

This article is based upon work from ENRESSH (European Network for Research Evaluation in the Social Sciences and Humanities, COST Action (CA15137)), supported by COST (European Cooperation in Science and Technology).

References

  1. Adams, J. (2009). The use of bibliometrics to measure research quality in UK higher education institutions. Archivum Immunologiae et Therapiae Experimentalis, 57(1), 19–32. https://doi.org/10.1007/s00005-009-0003-3.
  2. Camiz, S., & Gomes, G. C. (2013). Joint correspondence analysis versus multiple correspondence analysis: A solution to an undetected problem. In Classification and data mining (pp. 11–18). Berlin: Springer.
  3. Engels, T. C. E., Ossenblok, T. L. B., & Spruyt, E. H. J. (2012). Changing publication patterns in the social sciences and humanities, 2000–2009. Scientometrics, 93, 373–390.
  4. Geuna, A., & Martin, B. R. (2003). University research evaluation and funding: An international comparison. Minerva, 41(4), 277–304.
  5. Giménez-Toledo, E., Mañana-Rodríguez, J., Engels, T. C. E., Ingwersen, P., Sivertsen, G., Verleysen, F. T., et al. (2016). Taking scholarly books into account: Current developments in five European countries. Scientometrics, 107(2), 685–699. https://doi.org/10.1007/s11192-016-1886-5.
  6. Giménez-Toledo, E., Sivertsen, G., & Mañana-Rodríguez, J. (2017). Peer review as a delineation criterion in data sources for the assessment and measurement of scholarly book publishing in social sciences and humanities. In 16th International conference on scientometrics and informetrics. Wuhan.
  7. Gorraiz, J., Purnell, P. J., & Glänzel, W. (2013). Opportunities for and limitations of the Book Citation Index. Journal of the American Society for Information Science and Technology, 64(7), 1388–1398. https://doi.org/10.1002/asi.22875.
  8. Greenacre, M. (2007). Correspondence analysis in practice (2nd ed.). Boca Raton, FL: Chapman & Hall.
  9. Kousha, K., Thelwall, M., & Rezaie, S. (2011). Assessing the citation impact of books: The role of Google Books, Google Scholar, and Scopus. Journal of the American Society for Information Science and Technology, 62(11), 2147–2164.
  10. Michavila, F. (Ed.). (2012). La Universidad española en cifras. Madrid: CRUE. http://www.crue.org/Publicaciones/Documents/UEC/LA_UNIVERSIDAD_ESPANOLA_EN_CIFRAS.pdf. Accessed Sept 2017.
  11. Sello de calidad en edición académica (CEA/APQ). http://www.selloceaapq.es/. Accessed Apr 2018.
  12. Sīle, L., Guns, R., Sivertsen, G., & Engels, T. C. E. (2017). European databases and repositories for social sciences and humanities research output. Antwerp: ECOOM & ENRESSH. https://doi.org/10.6084/m9.figshare.5172322.
  13. Sivertsen, G. (2017). Unique, but still best practice? The Research Excellence Framework (REF) from an international perspective. Palgrave Communications, 3, 17078.
  14. Sivertsen, G. (2018). Why has no other European country adopted the Research Excellence Framework? London School of Economics and Political Science blog. http://blogs.lse.ac.uk/politicsandpolicy/why-has-no-other-european-country-adopted-the-research-excellence-framework/.
  15. Williams, K., & Grant, J. (2018). A comparative review of how the policy and procedures to assess research impact evolved in Australia and the UK. Research Evaluation, 27, 93–105. https://doi.org/10.1093/reseval/rvx042.
  16. Wilsdon, J., et al. (2015). The Metric Tide: Report of the independent review of the role of metrics in research assessment and management. https://doi.org/10.13140/rg.2.1.4929.1363.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2018

Authors and Affiliations

  • Elea Giménez-Toledo (1)
  • Jorge Mañana-Rodríguez (1)
  • Tim C. E. Engels (2)
  • Raf Guns (2)
  • Emanuel Kulczycki (3)
  • Michael Ochsner (4, 5)
  • Janne Pölönen (6)
  • Gunnar Sivertsen (7)
  • Alesia A. Zuccala (8)

  1. Research Group on Scholarly Books (ILIA), Institute of Philosophy (IFS), Spanish National Research Council (CSIC), Madrid, Spain
  2. Centre for R&D Monitoring (ECOOM), Faculty of Social Sciences, University of Antwerp, Antwerp, Belgium
  3. Scholarly Communication Research Group, Faculty of Social Sciences, Adam Mickiewicz University in Poznań, Poznan, Poland
  4. Swiss Center of Expertise in the Social Sciences (FORS), University of Lausanne, Lausanne, Switzerland
  5. GESS, ETH Zurich, Zurich, Switzerland
  6. Federation of Finnish Learned Societies, Helsinki, Finland
  7. Nordic Institute for Studies in Innovation, Research and Education, Oslo, Norway
  8. Department of Information Studies, University of Copenhagen, Copenhagen, Denmark