Scientometrics, Volume 91, Issue 3, pp 869–894

Identifying attractive research fields for new scientists

  • Leonidas Akritidis
  • Dimitrios Katsaros
  • Panayiotis Bozanis


At the beginning of a scientific career, every new scientist must confront the critical issue of choosing the subject area in which his/her future research will be conducted. Regardless of a new scholar's capabilities, a poor choice may doom an otherwise worthy effort and waste energy, time and resources. In this article we attempt to identify the research fields that are attractive to such individuals. To the best of our knowledge, this topic has not previously been addressed in the literature. We formally state the problem and propose a solution that combines the characteristics of attractive research areas with those of new scholars. Our approach is compared against a statistical model that reveals popular research areas; the comparison leads to the conclusion that not all trendy research areas are suitable for new scientists. A secondary outcome is that some scientific fields, although not particularly emerging, are nevertheless promising for scientists who are starting their careers.
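The paper's actual model is not reproduced on this page. As a rough illustration of the distinction drawn in the abstract, the following Python sketch contrasts a simple statistical popularity score (publication growth) with a toy "attractiveness" score that also weighs how many new authors enter a field. The FieldStats class, the alpha weight, and all field data are invented for illustration and are not taken from the article.

```python
# Hypothetical illustration (not the authors' actual model): contrast a
# popularity trend with an "attractiveness" score that also weighs the
# share of newcomers in a field. All data below is invented.

from dataclasses import dataclass, field


@dataclass
class FieldStats:
    name: str
    papers_per_year: list        # publication counts for consecutive years
    new_author_share: float      # fraction of authors publishing in the
                                 # field for the first time (0..1)


def popularity(f: FieldStats) -> float:
    """Statistical baseline: average year-over-year growth in output."""
    counts = f.papers_per_year
    growth = [(b - a) / a for a, b in zip(counts, counts[1:]) if a > 0]
    return sum(growth) / len(growth) if growth else 0.0


def attractiveness(f: FieldStats, alpha: float = 0.5) -> float:
    """Toy score mixing growth with the newcomer share (alpha is an
    assumed weighting, not a parameter from the paper)."""
    return alpha * popularity(f) + (1 - alpha) * f.new_author_share


fields = [
    FieldStats("A (trendy, few newcomers)", [100, 140, 200], 0.10),
    FieldStats("B (steady, many newcomers)", [120, 126, 132], 0.45),
]

for f in fields:
    print(f"{f.name}: popularity={popularity(f):.2f}, "
          f"attractiveness={attractiveness(f):.2f}")
```

Under these invented numbers the steadier field B scores higher on attractiveness than the trendier field A, mirroring the abstract's point that not every trendy area is the best fit for a scientist entering the profession.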


Keywords: Scientist · Author · Research area · Research field · Scientometrics · Attractive


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2012

Authors and Affiliations

  • Leonidas Akritidis (1)
  • Dimitrios Katsaros (1)
  • Panayiotis Bozanis (1)

  1. University of Thessaly, Volos, Greece
