
Scientometrics, Volume 117, Issue 2, pp 1237–1264

Does prestige dimension influence the interdisciplinary performance of scientific entities in knowledge flow? Evidence from the e-government field

  • Shunshun Shi
  • Wenyu Zhang
  • Shuai Zhang
  • Jie Chen

Abstract

It has long been understood that knowledge flow can be divided into knowledge integration and knowledge diffusion and can be investigated through interdisciplinary scientific research (IDR) approaches. The literature describes several quantitative approaches for measuring interdisciplinary research, but all of them belong to the popularity dimension; previous work has not addressed the evaluation of interdisciplinary research in the prestige dimension. In this study, the authors introduce an extended IDR measure that combines the P-Rank algorithm with traditional IDR approaches, thereby extending current IDR approaches from the popularity dimension to the prestige dimension. The extended measure explores the prestige of papers and accounts for the different contributions that papers of different prestige make to the knowledge flow in which they are embedded. An experiment in the e-government field demonstrates that the interdisciplinary performance of some papers is overestimated under traditional IDR approaches and becomes more reasonable under the extended IDR measure that considers the prestige dimension. We expect that the extended IDR measure can distinguish the contributions of papers of different prestige to interdisciplinary performance and thereby reevaluate their contributions to the knowledge flow in which they are embedded.
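The following is a minimal, hypothetical Python sketch of the general idea described above, not the authors' implementation: a plain PageRank over a toy citation network stands in for the P-Rank algorithm (the real P-Rank also propagates prestige through author and journal links), and each reference's contribution to a Rao-Stirling-style diversity score is weighted by the cited paper's prestige rather than counted equally. All paper identifiers, categories, and dissimilarity values are invented for illustration.

# Sketch: prestige-weighted interdisciplinarity of a paper's references.
# Hypothetical data; plain PageRank used as a stand-in for P-Rank.

from collections import defaultdict

# Hypothetical citation network: paper -> list of papers it cites.
cites = {
    "P1": ["P2", "P3"],
    "P2": ["P3"],
    "P3": [],
    "P4": ["P1", "P3"],
}

# Hypothetical subject category of each paper (one category per paper for brevity).
category = {"P1": "InfoSci", "P2": "PublicAdmin", "P3": "CompSci", "P4": "InfoSci"}

# Hypothetical pairwise category dissimilarities (symmetric, 0 on the diagonal).
dissim = {
    ("InfoSci", "PublicAdmin"): 0.6,
    ("InfoSci", "CompSci"): 0.4,
    ("PublicAdmin", "CompSci"): 0.8,
}

def d(c1, c2):
    """Symmetric lookup of category dissimilarity."""
    if c1 == c2:
        return 0.0
    return dissim.get((c1, c2), dissim.get((c2, c1), 0.0))

def prestige_scores(cites, damping=0.85, iters=50):
    """PageRank-style prestige over the citation graph (a simplified stand-in for P-Rank)."""
    papers = list(cites)
    n = len(papers)
    score = {p: 1.0 / n for p in papers}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in papers}
        for p, refs in cites.items():
            if refs:
                share = damping * score[p] / len(refs)
                for r in refs:
                    new[r] += share
            else:  # dangling paper: redistribute its score uniformly
                for q in papers:
                    new[q] += damping * score[p] / n
        score = new
    return score

def prestige_weighted_diversity(paper, cites, category, prestige):
    """Rao-Stirling diversity of a paper's references, each reference weighted
    by the prestige of the cited paper instead of counted equally."""
    weights = defaultdict(float)
    for ref in cites[paper]:
        weights[category[ref]] += prestige[ref]
    total = sum(weights.values())
    if total == 0:
        return 0.0
    p = {c: w / total for c, w in weights.items()}
    return sum(p[a] * p[b] * d(a, b) for a in p for b in p if a != b)

if __name__ == "__main__":
    prestige = prestige_scores(cites)
    for paper in cites:
        print(paper, round(prestige_weighted_diversity(paper, cites, category, prestige), 3))

Under this weighting, a paper whose references span several categories but point mostly to low-prestige papers scores lower than one whose cross-category references point to high-prestige papers, which is the kind of adjustment the prestige dimension is intended to capture.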

Keywords

Interdisciplinary research · Prestige dimension · Category citation analysis · Diversity measure · Knowledge integration · Knowledge diffusion


Acknowledgements

This work was supported by the National Natural Science Foundation of China (Nos. 51475410, 51875503, 51775496) and the Zhejiang Natural Science Foundation of China (No. LY17E050010).


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2018

Authors and Affiliations

  1. School of Information, Zhejiang University of Finance and Economics, Hangzhou, China
