Indicating Studies’ Quality Based on Open Data in Digital Libraries

  • Yusra Shakeel
  • Jacob Krüger
  • Gunter Saake
  • Thomas Leich
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 339)

Abstract

Researchers publish papers to report their results and thus contribute to a steadily growing corpus of knowledge. To avoid unintentionally repeating existing research and studies, researchers must be aware of this corpus. For this purpose, they crawl digital libraries and conduct systematic literature reviews to summarize existing knowledge. However, such approaches face several issues: not all documents are available to every researcher, relevant results may be missed due to ranking algorithms, and manually assessing the quality of a document takes considerable time and effort. In this paper, we provide an overview of the publicly available information in different digital libraries for computer science. Based on these results, we derive a taxonomy that describes the connections between these kinds of information and discuss their suitability for quality assessments. Overall, we observe that bibliographic data and simple citation counts are available in almost all libraries, while some libraries provide rather unique information. Some of this information may be used to improve automated quality assessment, but only with limitations.
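The abstract's closing point, that simple citation counts can support automated quality assessment only with limitations, can be illustrated with a minimal sketch. The sketch below is not from the paper: the Record fields, the age normalization, and the example data are hypothetical, and it deliberately ignores the confounders the literature raises (self-citations, field-dependent citation cultures, negative citations).

```python
# Minimal sketch (hypothetical, not the paper's method): rank studies by an
# age-normalized citation indicator derived from bibliographic open data.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class Record:
    """Bibliographic data of the kind most digital libraries expose."""
    title: str
    year: int
    citations: int


def citations_per_year(record: Record, today: Optional[date] = None) -> float:
    """Normalize raw citation counts by publication age, so older papers
    do not dominate purely by having had more time to accumulate citations."""
    today = today or date.today()
    age = max(today.year - record.year, 1)  # avoid division by zero
    return record.citations / age


# Usage: order candidate studies by the indicator (example data is invented).
papers = [
    Record("Study A", year=2010, citations=120),
    Record("Study B", year=2017, citations=45),
]
for p in sorted(papers, key=citations_per_year, reverse=True):
    print(f"{p.title}: {citations_per_year(p):.1f} citations/year")
```

Even this normalized variant remains a proxy for visibility rather than quality, which is why such indicators can at best complement, not replace, manual quality assessment.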

Keywords

Citation counts · Quality assessment · Literature analysis · Digital libraries

Acknowledgments

This research is supported by the DAAD STIBET Matching Funds grant.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Yusra Shakeel (1, 2)
  • Jacob Krüger (1)
  • Gunter Saake (1)
  • Thomas Leich (2, 3)

  1. Otto-von-Guericke University, Magdeburg, Germany
  2. METOP GmbH, Magdeburg, Germany
  3. Harz University of Applied Sciences, Wernigerode, Germany