Scientometrics, Volume 102, Issue 3, pp 2131–2150

Globalization of the social sciences in Eastern Europe: genuine breakthrough or a slippery slope of the research evaluation practice?



The introduction of new research evaluation policies in most Eastern European (EE) countries was followed by substantial growth in their international scientific productivity. The article begins with a brief review of current research evaluation practice in EE countries and then explores the patterns of change in the international scientific production of 20 EE countries in the social sciences and humanities during 2004–2013. A new indicator, the Journal Diversity Index (JDI), is proposed as a possible measure of the sustainability and genuineness of the globalization of the social sciences in EE countries. The JDI represents the number of journals that account for 50% of a country's published articles, corrected for the total number of unique journals in which articles by authors from all EE countries appear. The analysis shows that EE countries with a lower JDI largely base their international scientific production on national journals covered by the Web of Science (WoS). Those countries also have a lower average citation rate per article. With the exception of Hungary and Poland, the "globalization" of EE social sciences still relies strongly on linguistic, regional, and cultural proximities. This is potentially harmful given the unstable status of EE journals in WoS. EE science policy institutions should take more responsibility for controlling the quality of national journals indexed in international databases. They should also be aware of the significant differences between the coverage policies of Thomson Reuters and Elsevier, and of the possible implications of those differences for research evaluation practice.
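The JDI definition above can be made concrete with a short computation. Below is a minimal sketch in Python, assuming that "corrected for" means dividing the 50%-coverage journal count by the total number of unique journals used by authors from all EE countries (the abstract does not spell out the exact form of the correction); the function and variable names are illustrative, not taken from the paper.

```python
from collections import Counter

def journal_diversity_index(articles, all_ee_journals):
    """Sketch of the JDI for one country.

    articles: list of journal names, one entry per article published
        by the country's authors
    all_ee_journals: set of unique journals in which authors from all
        EE countries published
    """
    counts = Counter(articles)          # articles per journal
    half = len(articles) / 2            # 50% coverage threshold
    covered, n_journals = 0, 0
    for _, c in counts.most_common():   # most-used journals first
        covered += c
        n_journals += 1
        if covered >= half:
            break
    # Assumed correction: normalize by the EE-wide journal pool
    return n_journals / len(all_ee_journals)

# Example: 6 of 10 articles appear in a single journal, so one journal
# already covers 50%; with 5 unique journals EE-wide, JDI = 1/5 = 0.2.
articles = ["J1"] * 6 + ["J2"] * 3 + ["J3"]
print(journal_diversity_index(articles, {"J1", "J2", "J3", "J4", "J5"}))
```

On this reading, a low JDI signals that a country's WoS output is concentrated in a handful of journals, which is the concentration on national journals that the analysis flags.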


Keywords: Journal Diversity Index; Social sciences; Eastern Europe; Bibliographic indicators; Science evaluation policy



Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2014

Authors and Affiliations

  1. Faculty of Philosophy, Department of Psychology, University of Novi Sad, Novi Sad, Serbia
