Analysing the Scientific Impact of Development Studies: Challenges for the Future

  • Sergio Tezanos
  • Carmen Trueba
Chapter in the EADI Global Development Series book series (EADI)

Abstract

The chapter assesses the scientific performance of Development Studies in terms of its impact factors in the two main (and competing) citation indexes: Clarivate Analytics’ Social Science Citation Index and Elsevier’s Scimago Journal Rank. We briefly explain the usefulness of journal impact factors for analysing the “quality” of research across different fields of study; we discuss the implications of the cross-disciplinary character of Development Studies; we comparatively analyse the impact factors of Development Studies journals in the Social Science Citation Index and the Scimago Journal Rank; and we explore the consequences of creating a separate “development” category in the Social Science Citation Index. Finally, we summarize the main results and offer recommendations to EADI and to national Development Studies associations on how to increase the scientific impact of the discipline.
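As illustrative background (a schematic sketch, not a formula reproduced from the chapter), the standard two-year journal impact factor underlying the Social Science Citation Index rankings can be written as

  \mathrm{JIF}_{t} = \frac{C_{t}(t-1) + C_{t}(t-2)}{N_{t-1} + N_{t-2}}

where C_{t}(y) denotes the citations received in year t to items the journal published in year y, and N_{y} the number of citable items it published in year y. The Scimago Journal Rank, by contrast, weights each citation by the prestige of the citing journal rather than counting all citations equally, which is one reason the two indexes can rank the same Development Studies journals differently.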

Notes

Acknowledgements

We would especially like to thank Ramón Gandarillas Pérez and Borja Mantecón for their careful assistance and helpful comments on this piece of research. The views expressed in this chapter, however, remain solely those of the authors, and the usual caveats apply.

Copyright information

© The Author(s) 2019

Authors and Affiliations

  • Sergio Tezanos (1)
  • Carmen Trueba (1)

  1. Universidad de Cantabria, Santander, Spain
