Abstract
This paper presents some of the first evidence on the effect of information and communications technology (ICT) on college students’ labor market performance. Using a large, representative survey of college students in China, we examine outcomes before and after students were exposed to technology-aided instruction, compared with students who were not exposed to such instruction. The results indicate that the ICT program significantly increased students’ likelihood of obtaining a job offer in the labor market and the wage they were offered. The positive effect comes from students’ increased use of computers and the internet for job search. While most previous studies of the use of technology in education focus only on students’ academic achievement and find zero or negative effects, our study demonstrates that technology may be an effective tool for improving college students’ labor market performance, and that the potential benefits of technology might be underestimated if we focus only on test scores and ignore students’ career development.
Notes
The statistic is obtained from the CCW Research group, and it is available online at: http://www.edu.cn/xxh/fei/sj/201501/t20150129_1226140.shtml
A large strand of literature focuses on the effectiveness of technology-aided instruction on primary and secondary schools, and reaches mixed conclusions. In particular, some studies find positive effects of technology on students’ academic outcomes (Banerjee et al. 2007; Barrow et al. 2009; Muralidharan et al. 2019), while others find zero or negative effects (Goolsbee and Guryan 2006; Belo et al. 2013).
Specifically, the number of students enrolled in higher education institutions (HEIs) increased by 65 million between 1999 and 2008, and the global demand for higher education is predicted to expand from less than 100 million students in 2000 to over 250 million in 2025 (Bangkok UNESCO 2011).
According to China’s price statistics, the inflation-adjusted price of a computer declined substantially, by 60.1%, from 1995 to 2015. Statistics are the authors’ calculations from the China Price Statistical Yearbook.
Most existing studies that examine the effect of technology on college students’ academic outcomes use data from tests, and few datasets to date link students’ use of technology with their job market performance.
In particular, Malamud and Pop-Eleches (2011) find that children who won a voucher to purchase a computer had significantly lower school grades but improved computer skills, and that the voucher substantially increased children’s use of the computer for games while having only a limited impact on their use of educational software.
The ICT program in college is part of the “education technology project” in China, proposed by the Ministry of Education (MoE), which aims to improve and accelerate the adoption of educational technology. It is available online at: http://old.moe.gov.cn//publicfiles/business/htmlfiles/moe/s3342/201211/144888.html
Relevantly, in their review of studies on technology in education, Escueta et al. (2017) divide the use of technology into four categories: access to technology, computer-assisted learning, technology-based behavioral interventions, and online instruction. The project we study covers three of these categories: access to technology, computer-assisted learning, and online instruction.
A limitation of the dataset is that it only contains information on the first job students obtained in their careers and lacks information about students’ long-term labor market performance.
In particular, while the time range was the same (from the end of May to early June), the exact date and time of the survey were decided by each college. We believe that such differences will not cause severe bias in our estimates for two reasons. First, at the time of the survey, most students had already decided on their future plans: in China, most firms send job offers before April, and according to students’ responses, 95.1% of the undergraduates had post-graduation plans at the time of the survey. Second, even if the survey time is correlated with students’ job market performance, given that we use a difference-in-differences setting, the survey time will bias the estimate if and only if it is systematically correlated with respondents’ treatment status. For example, the estimate would be biased if, after the ICT program was implemented, the survey was conducted earlier in treated colleges than in untreated colleges relative to the pre-implementation period; otherwise, it will not be biased. In addition, we find no anecdotal evidence of a correlation between the ICT program and survey timing. We therefore believe that the different survey times do not pose a severe threat to our estimates.
In the estimation sample, the 36 colleges are located across 20 provinces, and the treated colleges are located across 13 provinces.
A potential concern about the survey dataset is that students may misreport—for example, their job offers or grades. While measurement error is a common concern with survey data and we cannot fully rule it out, we believe it does not pose a severe threat in our setting. First, students have little incentive to lie. They were informed that the survey was conducted by the data center at Tsinghua University, that it was only for research purposes, and that it had no bearing on their academic or labor market interests. In addition, students were not required to enter their name or ID on the questionnaire, so they had no obvious incentive or motivation to lie. Second, the data center compared statistics from the survey—for example, students’ self-reported first-job wage, occupation, and industry—with statistics for similar age cohorts from other survey data (including the China Family Panel Studies (CFPS) and the population census), and found similar mean values and distributions. Furthermore, given that we rely on a difference-in-differences estimation framework, even if some students misreport their academic performance, such measurement error will bias the estimate only if the misreporting systematically correlates with treatment status. For example, if students from the treated cohort in treated colleges are more likely to overstate their grades than those from non-treated colleges, the estimate will have an upward bias; otherwise, measurement error will not bias the estimate.
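The logic of when misreporting biases a difference-in-differences estimate can be made concrete with a standard two-way fixed-effects specification. This is an illustrative sketch with our own notation, not necessarily the exact specification used in the paper:

```latex
% Generic difference-in-differences specification (illustrative notation):
\[
  Y_{ict} = \alpha + \beta \,(\text{Treated}_c \times \text{Post}_t)
          + \gamma_c + \delta_t + X_{ict}'\theta + \varepsilon_{ict}
\]
% Y_{ict}: outcome of student i in college c and cohort t;
% gamma_c and delta_t are college and cohort fixed effects.
% If the reported outcome is Y_{ict} + e_{ict} with measurement error e_{ict},
% the estimate of beta is biased only if E[e_{ict}] differs systematically
% across the Treated x Post cells; classical error that is uncorrelated with
% treatment status leaves beta unbiased and only inflates standard errors.
```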
Given that wages are only observed for students who have already obtained a job offer, and the ICT program has a significantly positive effect on students’ likelihood of obtaining an offer, estimates of the effect on wages may suffer from sample selection. We discuss and address this issue in Section 4.3.
The reduced statistical significance of the estimates likely results from the smaller number of observations. To test this argument further, we remove the first treatment cohort to obtain a sample of similar size and replicate the regressions. As expected, and similar to the estimates in which we remove the 2012 cohort, the estimates become less significant as well.
Chinese colleges have low attrition rates. First, most Chinese colleges have high enrollment standards but relatively low graduation standards. Second, the return to a college degree is high in China, and Chinese students and their families place high value on college study. Very few college students have an incentive to quit, and most parents do not allow their children to quit college. As a result, the attrition rate is much lower in Chinese colleges than in those in the US and Europe.
In other words, we exclude students who decided to pursue a graduate degree and those who had not reported their plans at the time of the survey.
Lee (2009) provides a detailed discussion of the issue of sample selection.
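As a rough illustration of the bounding approach, the following is a generic sketch of Lee (2009) bounds in our notation, not the paper’s exact implementation:

```latex
% Generic Lee (2009) trimming bounds (illustrative notation).
% q_1, q_0: employment rates in the treated and control groups (q_1 > q_0);
% p: the excess share of treated students induced into employment by treatment:
\[
  p = \frac{q_1 - q_0}{q_1}
\]
% Trimming the top (bottom) fraction p of the treated wage distribution gives
% a lower (upper) bound on the wage effect for the always-employed:
\[
  \Delta^{LB} = E\!\left[Y \mid D=1,\, Y \le y_{1-p}\right] - E\!\left[Y \mid D=0\right],
  \qquad
  \Delta^{UB} = E\!\left[Y \mid D=1,\, Y \ge y_{p}\right] - E\!\left[Y \mid D=0\right],
\]
% where y_p denotes the p-th quantile of the treated wage distribution.
```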
4.2% is the estimated effect on the probability of obtaining a job offer, as shown in Table 3, column 2.
It is worth noting that while the Heckman selection model is also a commonly used approach to sample selection, we are unable to use it here because it requires an exogenous instrument in the first stage, which is not available in our setting.
In China, most final exams in college use a pen-and-paper format, and the CET-4 test is a national-level pen-and-paper format test.
A limitation of this question is that it does not separate the number of CVs sent via the internet vs. those sent as hard copies by mail.
A limitation of the survey question is that it does not separate computer games from other entertainment.
Escueta et al. (2017) review such studies and divide educational technology into four categories: access to technology, computer-assisted learning, technology-based behavioral interventions, and online instruction.
Since the differences are not statistically different from zero, this heterogeneity pattern across family backgrounds should be interpreted with caution.
References
Akerman A, Gaarder I, Mogstad M (2015) The skill complementarity of broadband internet. The Quarterly Journal of Economics 130(4):1781–1824
Allen IE, Seaman J (2013) Changing course: ten years of tracking online education in the United States. ERIC
Alpert WT, Couch KA, Harmon OR (2016) A randomized assessment of online learning. Am Econ Rev 106(5):378–82
Andersson B, Nfuka EN, Sumra S, Uimonen P, Pain A (2014) Evaluation of implementation of ICT in teachers’ colleges project in Tanzania. Swedish International Development Cooperation Agency (Sida)
Angrist J, Lavy V (2002) New evidence on classroom computers and pupil learning. Econ J 112(482):735–765
Angrist J, Lavy V, Hudson S, Pallais A, et al. (2014) Leveling up: early results from a randomized evaluation of post-secondary aid, Technical Report, National Bureau of Economic Research
Banerjee AV, Cole S, Duflo E, Linden L (2007) Remedying education: evidence from two randomized experiments in India. The Quarterly Journal of Economics 122(3):1235–1264
Bangkok UNESCO (2011) ICT for higher education: case studies from Asia and the Pacific
Barrow L, Markman L, Rouse CE (2009) Technology’s edge: the educational benefits of computer-aided instruction. American Economic Journal: Economic Policy 1(1):52–74
Belo R, Ferreira P, Telang R (2013) Broadband in school: impact on student performance. Manag Sci 60(2):265–282
Bettinger EP, Fox L, Loeb S, Taylor ES (2017) Virtual classrooms: how online college courses affect student success. Am Econ Rev 107(9):2855–2875
Beuermann DW, Cristia J, Cueto S, Malamud O, Cruz-Aguayo Y (2015) One laptop per child at home: short-term impacts from a randomized experiment in Peru. Am Econ J: Appl Econ 7(2):53–80
Chen S, Liu W, Song H (2019) Broadband internet, firm performance and worker welfare: evidence and mechanism. Economic Inquiry
Czernich N, Falck O, Kretschmer T, Woessmann L (2011) Broadband infrastructure and economic growth. Econ J 121(552):505–532
Dale SB, Krueger AB (2002) Estimating the payoff to attending a more selective college: an application of selection on observables and unobservables. The Quarterly Journal of Economics 117(4):1491–1527
Deming DJ, Walters CR (2017) The impact of price caps and spending cuts on US postsecondary attainment, Technical Report, National Bureau of Economic Research
Deming DJ, Walters CR, Hastings JS, Kane TJ, Staiger DO (2014) School choice, school quality, and postsecondary attainment. Am Econ Rev 104(3):991–1013
DeStefano T, Kneller R, Timmis J (2018) Broadband infrastructure, ICT use and firm performance: evidence for UK firms. Journal of Economic Behavior & Organization 155:110–139
Dynarski S (2002) The behavioral and distributional implications of aid for college. Am Econ Rev 92(2):279–285
Escueta M, Quan V, Nickow AJ, Oreopoulos P (2017) Education technology: an evidence-based review, Technical Report, National Bureau of Economic Research
Fabritz N (2013) The impact of broadband on economic activity in rural areas: evidence from German municipalities, Technical Report, Ifo Working Paper
Fairlie RW (2012) Academic achievement, technology and race: experimental evidence. Econ Educ Rev 31(5):663–679
Fairlie RW, Grunberg SH (2014) Access to technology and the transfer function of community colleges: evidence from a field experiment. Econ Inq 52(3):1040–1059
Fairlie RW, Robinson J (2013) Experimental evidence on the effects of home computers on academic achievement among school children. Am Econ J Appl Econ 5(3):211–40
Figlio D, Rush M, Yin L (2013) Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. J Labor Econ 31(4):763–784
Freeman JA, Hirsch BT (2008) College majors and the knowledge content of jobs. Econ Educ Rev 27(5):517–535
Goolsbee A, Guryan J (2006) The impact of internet subsidies in public schools. Rev Econ Stat 88(2):336–347
Hoekstra M (2009) The effect of attending the flagship state university on earnings: a discontinuity-based approach. Rev Econ Stat 91(4):717–724
Hoosen S, Butcher N (2012) Growth of information and communications technology at African universities. International Higher Education
Jackson CK, Johnson RC, Persico C (2015) The effects of school spending on educational and economic outcomes: evidence from school finance reforms. The Quarterly Journal of Economics 131(1):157–218
Jia R, Li H (2017) Access to elite education, wage premium and social mobility: the truth and illusion of China’s college entrance exam. Working paper
Kolko J (2012) Broadband and local growth. J Urban Econ 71(1):100–113
Lee DS (2009) Training, wages and sample selection: estimating sharp bounds on treatment effects. Rev Econ Stud 76(3):1071–1102
Malamud O, Pop-Eleches C (2011) Home computer use and the development of human capital. The Quarterly Journal of Economics 126(2):987–1027
Muralidharan K, Singh A, Ganimian AJ (2019) Disrupting education? Experimental evidence on technology-aided instruction in India. Am Econ Rev 109(4):1426–1460
Oreopoulos P, Petronijevic U (2018) Student coaching: how far can technology go? J Human Res 53(2):299–329
Pop-Eleches C, Urquiola M (2013) Going to a better school: effects and behavioral responses. Am Econ Rev 103(4):1289–1324
Rothstein J, Rouse CE (2011) Constrained after college: student loans and early-career occupational choices. J Public Econ 95(1-2):149–163
Sjoquist DL, Winters JV (2015) State merit aid programs and college major: a focus on STEM. J Labor Econ 33(4):973–1006
Acknowledgments
We thank David Figlio, Ofer Malamud, and other scholars and seminar participants from Fudan University, National University of Singapore, and Northwestern University for helpful comments.
Funding
This study was funded by the National Natural Science Foundation of China (71803027), Ministry of Education of the People’s Republic of China (18YJC790139) and Shanghai Chenguang Talent Program (18CG03).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Additional information
Responsible editor: Junsen Zhang
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix
Cite this article
Lu, Y., Song, H. The effect of educational technology on college students’ labor market performance. J Popul Econ 33, 1101–1126 (2020). https://doi.org/10.1007/s00148-019-00756-3