
Publication Data Integration as a Tool for Excellence-Based Research Analysis at the University of Latvia

  • Laila Niedrite
  • Darja Solodovnikova
  • Aivars Niedritis
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 767)

Abstract

The evaluation of research results can serve different purposes aligned with an institution's strategic goals, for example, deciding on the distribution of research funding or recruiting and promoting research staff. While quantitative measures such as the number of scientific papers or the number of scientific staff are commonly used for such evaluation, an institution's strategy may be set on more ambitious scientific goals, which raises the question of how the more quality-oriented aspects of research outcomes should be measured. To supply an appropriate dataset for evaluating both types of metrics, a suitable framework is needed: one that ensures that neither incomplete nor faulty data are used, that metric computation formulas are valid, and that the computed metrics are interpreted correctly. To provide such a framework, data from the various available sources should be integrated, resolving data quality issues along the way, so that an overall view of the institution's scientific activity is achieved. This paper presents a publication data integration system for excellence-based research analysis at the University of Latvia. The system integrates data from the university's existing information systems with data obtained from external sources. The paper discusses the data integration flows and the integration problems encountered, including data quality issues, and presents a data model of the integrated dataset. Based on this data model and the integrated data, examples of quality-oriented metrics and their analysis results are given.
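The abstract describes the integration flow only in prose. Purely as an illustration of the record-linkage step such a flow typically requires (pairing publication records held in the university's information systems with records retrieved from external bibliographic sources), below is a minimal Python sketch that matches records by DOI when one is present and falls back to a normalized title otherwise. All identifiers (Publication, merge_sources, the sample records) are hypothetical and not taken from the paper.

    import re
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class Publication:
        """Hypothetical minimal record; real integrated records carry more fields."""
        title: str
        doi: Optional[str]
        source: str  # e.g. "university_IS" or "external_DB"

    def normalize_title(title: str) -> str:
        # Lowercase and collapse punctuation/whitespace so that small
        # formatting differences between sources do not block a match.
        return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

    def merge_sources(internal: List[Publication],
                      external: List[Publication]) -> List[Tuple[Publication, Publication]]:
        """Pair records that describe the same paper in both sources."""
        by_doi = {p.doi: p for p in internal if p.doi}
        by_title = {normalize_title(p.title): p for p in internal}
        matches = []
        for pub in external:
            key = normalize_title(pub.title)
            if pub.doi and pub.doi in by_doi:      # exact DOI match first
                matches.append((by_doi[pub.doi], pub))
            elif key in by_title:                  # normalized-title fallback
                matches.append((by_title[key], pub))
        return matches

    # Usage: the fetched record lacks a DOI but still links via its title.
    local = [Publication("Data Quality in Research Information Systems",
                         "10.1000/x1", "university_IS")]
    fetched = [Publication("Data quality in research information systems.",
                           None, "external_DB")]
    print(merge_sources(local, fetched))

A real pipeline would add author disambiguation, conflict resolution between differing field values, and approximate matching for titles with typos; the DOI-then-title cascade above only sketches the core linkage step on which the quality-oriented metrics ultimately depend.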

Keywords

Research evaluation · Research metrics · Data integration · Data quality · Information system · Data model


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Faculty of Computing, University of Latvia, Riga, Latvia
