
Scientometrics, Volume 116, Issue 1, pp 181–202

The application of bibliometric analysis: disciplinary and user aspects

Abstract

Bibliometric analysis is used increasingly as a tool within the scientific community, which makes the interplay between those who refine bibliometric methods and the recipients of such analyses vital. Production and citation patterns reflect the working methodologies of different disciplines, both within the specialized Library and Information Science (LIS) field and in the non-specialist (non-LIS) professional fields. We extract the literature on bibliometric analyses from Web of Science across all fields of science and analyze the clustering of co-occurring keywords at an aggregate level. This reveals areas of interconnected literature with different impact on the LIS and the non-LIS communities. We classify and categorize the most-cited bibliometric articles in accordance with a modified version of the method of Derrick, Jonkers and Lewison (Derrick et al. in Proceedings, 17th international conference on science and technology indicators. STI, Montreal, 2012). The data show that cross-referencing between the LIS and the non-LIS fields is modest in publications outside their main categories of interest, i.e. discussions of various bibliometric issues or strict analyses of various topics. We also identify some fields as less well covered bibliometrically.
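The keyword-clustering step described above can be illustrated with a minimal sketch. The Python snippet below builds a weighted co-occurrence graph from a few invented keyword records and partitions it with greedy modularity optimization. Note that the paper's analysis is carried out with VOSviewer (reference 32), so the networkx-based community detection here is only a stand-in, and the "records" data are hypothetical.

from collections import Counter
from itertools import combinations

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical input: one author-keyword list per article retrieved
# from Web of Science (invented here for illustration).
records = [
    ["bibliometrics", "citation analysis", "h-index"],
    ["bibliometrics", "research evaluation", "citation analysis"],
    ["citation analysis", "h-index", "web of science"],
    ["research evaluation", "university rankings"],
]

# Count how often each keyword pair co-occurs within the same article.
pair_counts = Counter()
for keywords in records:
    for a, b in combinations(sorted(set(keywords)), 2):
        pair_counts[(a, b)] += 1

# Build a weighted keyword co-occurrence graph.
graph = nx.Graph()
for (a, b), weight in pair_counts.items():
    graph.add_edge(a, b, weight=weight)

# Partition the graph into clusters of interconnected keywords
# (greedy modularity optimization as a stand-in for VOSviewer's clustering).
clusters = greedy_modularity_communities(graph, weight="weight")
for i, cluster in enumerate(clusters, start=1):
    print(f"Cluster {i}: {sorted(cluster)}")

VOSviewer uses its own unified mapping-and-clustering technique rather than greedy modularity, but the overall pipeline shape is the same: records, co-occurrence counts, weighted graph, clusters.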

Keywords

Bibliometric analysis · Policy implications · Publication analysis · Citation analysis

Notes

Acknowledgements

The author wishes to thank Mette Bruus, Ph.D., and the anonymous referees for valuable comments and suggestions that improved the article.

Supplementary material

Supplementary material 1: 11192_2018_2765_MOESM1_ESM.xlsx (17 kb)
Supplementary material 2: 11192_2018_2765_MOESM2_ESM.xlsx (19 kb)
Supplementary material 3: 11192_2018_2765_MOESM3_ESM.xlsx (14 kb)

References

  1. Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Do metrics matter? Many researchers believe that quantitative metrics determine who gets hired and who gets promoted at their institutions. With an exclusive poll and interviews, Nature probes to what extent metrics are really used that way. Nature, 465(7300), 860–863.
  2. Bar-Ilan, J. (2008). Which h-index? A comparison of WoS, Scopus and Google Scholar. Scientometrics, 74(2), 257–271.
  3. Bollen, J., & Van de Sompel, H. (2008). Usage impact factor: The effects of sample characteristics on usage-based impact metrics. Journal of the American Society for Information Science and Technology, 59(1), 136–149.
  4. Bonnell, A. G. (2016). Tide or tsunami? The impact of metrics on scholarly research. Australian Universities' Review, 58(1), 54.
  5. Bornmann, L., & Daniel, H.-D. (2005). Does the h-index for ranking of scientists really work? Scientometrics, 65(3), 391–392.
  6. Bornmann, L., Mutz, R., & Daniel, H.-D. (2008). Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine. Journal of the American Society for Information Science and Technology, 59(5), 830–837.
  7. Bornmann, L., Stefaner, M., de Moya Anegón, F., & Mutz, R. (2014). Ranking and mapping of universities and research-focused institutions worldwide based on highly-cited papers: A visualisation of results from multi-level models. Online Information Review, 38(1), 43–58.
  8. Braun, T., Bergstrom, C. T., Frey, B. S., Osterloh, M., West, J. D., Pendlebury, D., et al. (2010). How to improve the use of metrics. Nature, 465(7300), 870–872.
  9. Cox, A., Gadd, E., Petersohn, S., & Sbaffi, L. (2017). Competencies for bibliometrics. Journal of Librarianship and Information Science. https://doi.org/10.1177/0961000617728111.
  10. Derrick, G., Jonkers, K., & Lewison, G. (2012). Characteristics of bibliometrics articles in library and information sciences (LIS) and other journals. In Proceedings, 17th international conference on science and technology indicators (pp. 449–551). Montreal: STI.
  11. Ellegaard, O., & Wallin, J. A. (2015). The bibliometric analysis of scholarly production: How great is the impact? Scientometrics, 105(3), 1809–1831.
  12. Garfield, E. (1977). Restating fundamental assumptions of citation analysis. Current Contents, 39, 5–6.
  13. Glänzel, W. (1996). The need for standards in bibliometric research and technology. Scientometrics, 35(2), 167–176.
  14. Grandjean, P., Eriksen, M. L., Ellegaard, O., & Wallin, J. A. (2011). The Matthew effect in environmental science publication: A bibliometric analysis of chemical substances in journal articles. Environmental Health, 10(1), 96.
  15. Hall, C. M. (2011). Publish and perish? Bibliometric analysis, journal ranking and the assessment of research quality in tourism. Tourism Management, 32(1), 16–27.
  16. Harvey, L. (2008). Rankings of higher education institutions: A critical review. Routledge: Taylor & Francis.
  17. Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251–261.
  18. Hicks, D., Wouters, P., Waltman, L., De Rijcke, S., & Rafols, I. (2015). The Leiden Manifesto for research metrics. Nature, 520(7548), 429.
  19. Hirsch, J. E. (2005). An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46), 16569–16572.
  20. Jonkers, K., & Derrick, G. (2012). The bibliometric bandwagon: Characteristics of bibliometric articles outside the field literature. Journal of the Association for Information Science and Technology, 63(4), 829–836.
  21. Kaur, J., Radicchi, F., & Menczer, F. (2013). Universality of scholarly impact metrics. Journal of Informetrics, 7(4), 924–932.
  22. Larivière, V. (2012). The decade of metrics? Examining the evolution of metrics within and outside LIS. Bulletin of the American Society for Information Science and Technology, 38(6), 12–17.
  23. Larivière, V., Archambault, E., Gingras, Y., & Vignola-Gagné, É. (2006). The place of serials in referencing practices: Comparing natural sciences and engineering with social sciences and humanities. Journal of the Association for Information Science and Technology, 57(8), 987–1004.
  24. Leydesdorff, L., & Bornmann, L. (2016). The operationalization of "fields" as WoS subject categories (WCs) in evaluative bibliometrics: The cases of "library and information science" and "science & technology studies". Journal of the Association for Information Science and Technology, 67(3), 707–714.
  25. Leydesdorff, L., Wouters, P., & Bornmann, L. (2016). Professional and citizen bibliometrics: Complementarities and ambivalences in the development and use of indicators—a state-of-the-art report. Scientometrics, 109, 2129–2150.
  26. Liu, X., Zhang, L., & Hong, S. (2011). Global biodiversity research during 1900–2009: A bibliometric analysis. Biodiversity and Conservation, 20(4), 807–826.
  27. Martinez-Pulgarin, D. F., Acevedo-Mendoza, W. F., Cardona-Ospina, J. A., Rodríguez-Morales, A. J., & Paniz-Mondolfi, A. E. (2016). A bibliometric analysis of global Zika research. Travel Medicine and Infectious Disease, 14(1), 55–57.
  28. McKechnie, L., & Pettigrew, K. E. (2002). Surveying the use of theory in library and information science research: A disciplinary perspective. Library Trends, 50(3), 406.
  29. Petersohn, S. (2016). Professional competencies and jurisdictional claims in evaluative bibliometrics: The educational mandate of academic librarians. Education for Information, 32(2), 165–193.
  30. Prebor, G. (2010). Analysis of the interdisciplinary nature of library and information science. Journal of Librarianship and Information Science, 42(4), 256–267.
  31. Tranfield, D., Denyer, D., & Smart, P. (2003). Towards a methodology for developing evidence-informed management knowledge by means of systematic review. British Journal of Management, 14(3), 207–222.
  32. Van Eck, N. J., & Waltman, L. (2010). Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics, 84(2), 523–538.
  33. Van Eck, N. J., & Waltman, L. (2017). Citation-based clustering of publications using CitNetExplorer and VOSviewer. Scientometrics, 111(2), 1053–1070.
  34. Van Noorden, R. (2010). A profusion of measures: Scientific performance indicators are proliferating—leading researchers to ask afresh what they are measuring and why. Richard Van Noorden surveys the rapidly evolving ecosystem. Nature, 465(7300), 864–867.
  35. Van Raan, A. F. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133–143.
  36. Wallin, J. A. (2005). Bibliometric methods: Pitfalls and possibilities. Basic & Clinical Pharmacology & Toxicology, 97(5), 261–275.
  37. Weingart, P. (2005). Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics, 62(1), 117–131.
  38. Weller, K. (2015). Social media and altmetrics: An overview of current alternative approaches to measuring scholarly impact. In I. M. Welpe, J. Wollersheim, S. Ringelhan, & M. Osterloh (Eds.), Incentives and performance (pp. 261–276). Berlin: Springer.
  39. Wilsdon, J., Allen, L., Belfiore, E., Campbell, P., Curry, S., Hill, S., & Johnson, B. (2015). Report of the independent review of the role of metrics in research assessment and management. https://doi.org/10.13140/rg.2.1.4929.1363.
  40. Wouters, P., et al. (2015). The Metric Tide: Literature review (Supplementary report I to the independent review of the role of metrics in research assessment and management). HEFCE. https://doi.org/10.13140/rg.2.1.5066.3520.

Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2018

Authors and Affiliations

University Library of Southern Denmark, Odense M, Denmark
