Evaluating the Academic Performance of Institutions within Scholarly Communities

  • Lili Lin
  • Zhuoming Xu
  • Yuanhang Zhuang
  • Jie Wei
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8839)

Abstract

Most state-of-the-art studies either conduct peer assessment or adopt bibliometric indicators for institution evaluation. However, peer assessments are labor-intensive and time-consuming, and existing bibliometric methods may produce biased evaluation results because they do not model, in a unified way, the many crucial factors that reflect the academic performance of institutions. In this paper, we therefore propose a factor graph-based institution ranking model that leverages both institutions' individual information (i.e., quantitative and qualitative information) and scholarly network information (i.e., collaborative intensity). We take the peer assessment result from the well-known U.S. News & World Report as the ground truth and conduct a case study on U.S. institution ranking in the library and information science (LIS) research field. The experimental results indicate that, compared with existing bibliometric methods, our approach is a better alternative to manual peer assessment for institution evaluation.
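The abstract describes a model in which unary information (an institution's own quantitative/qualitative indicators) is combined with pairwise network information (collaborative intensity between institutions). The paper's actual factor graph model is not reproduced here; the following is only a minimal toy sketch of that general idea, with made-up institution names, scores, and collaboration weights, where each institution's ranking score mixes its own indicator with a collaboration-weighted average of its neighbors' scores.

```python
# Toy illustration (NOT the paper's model): unary factors encode individual
# information, pairwise factors encode collaborative intensity; scores are
# propagated iteratively over the collaboration graph.

# Hypothetical individual scores (e.g., normalized publication indicators).
unary = {"A": 0.9, "B": 0.6, "C": 0.4}

# Hypothetical collaboration intensities between institution pairs.
pairwise = {("A", "B"): 0.5, ("B", "C"): 0.2}

def neighbors(inst):
    """Yield (collaborator, intensity) pairs for an institution."""
    for (u, v), w in pairwise.items():
        if u == inst:
            yield v, w
        elif v == inst:
            yield u, w

def rank(alpha=0.8, iters=20):
    """Mix each institution's own score with the intensity-weighted
    average of its collaborators' scores, then sort by final score."""
    score = dict(unary)
    for _ in range(iters):
        new = {}
        for inst in score:
            nbrs = list(neighbors(inst))
            total_w = sum(w for _, w in nbrs)
            nbr_avg = (sum(score[n] * w for n, w in nbrs) / total_w
                       if total_w else score[inst])
            new[inst] = alpha * unary[inst] + (1 - alpha) * nbr_avg
        score = new
    return sorted(score, key=score.get, reverse=True)

print(rank())
```

In this toy setting the self-weight `alpha` controls how much the network information can shift an institution's individually derived score; the paper instead learns such a combination within a factor graph framework.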

Keywords

Institution evaluation · Academic performance · Scholarly network · Factor graph

References

  1. Acuña, E., Espinosa, M., Cancino, J.: Paper-based Productivity Ranking of Chilean Forestry Institutions. Bosque 34(2), 211–219 (2013)
  2. Grant, J.B., Olden, J.D., Lawler, J.J., et al.: Academic Institutions in the United States and Canada Ranked according to Research Productivity in the Field of Conservation Biology. Conservation Biology 21(5), 1139–1144 (2007)
  3. Huang, M.: Exploring the H-index at the Institutional Level: A Practical Application in World University Rankings. Online Information Review 36(4), 534–547 (2012)
  4. Torres-Salinas, D., Moreno-Torres, J.G., Delgado-López-Cózar, E., et al.: A Methodology for Institution-Field Ranking Based on a Bidimensional Analysis: The IFQ2A Index. Scientometrics 88(3), 771–786 (2011)
  5. Kschischang, F.R., Frey, B.J., Loeliger, H.-A.: Factor Graphs and the Sum-Product Algorithm. IEEE Transactions on Information Theory 47(2), 498–519 (2001)
  6. U.S. News & World Report – Best Rankings, http://www.usnews.com/rankings
  7. Academic Ranking of World Universities, http://www.shanghairanking.cn/
  8. CWTS Leiden Ranking, http://www.leidenranking.com/
  9. SCImago Institutions Rankings, http://www.scimagoir.com/
  10. Yan, E., Sugimoto, C.R.: Institutional Interactions: Exploring Social, Cognitive, and Geographic Relationships between Institutions as Demonstrated through Citation Networks. Journal of the American Society for Information Science and Technology 62(8), 1498–1514 (2011)
  11. Bishop, C.M.: Pattern Recognition and Machine Learning, pp. 359–419. Springer, New York (2006)
  12. Järvelin, K., Kekäläinen, J.: Cumulated Gain-based Evaluation of IR Techniques. ACM Transactions on Information Systems 20(4), 422–446 (2002)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Lili Lin (1)
  • Zhuoming Xu (1)
  • Yuanhang Zhuang (1)
  • Jie Wei (1)

  1. College of Computer and Information, Hohai University, Nanjing, China