Multimodal emotion recognition algorithm based on edge network emotion element compensation and data fusion

Original Article


Emotion-recognition feature sets derived from complex networks suffer from redundant information, difficult recognition, and data loss, all of which strongly interfere with the emotional features extracted from speech or images. To address these problems, this paper studies a multimodal emotion recognition algorithm based on emotion element compensation, set in the context of streaming-media communication over an edge network. First, an edge streaming-media network is designed that shifts transmission tasks from the traditional central server to edge nodes, recasting complex network problems as edge-node and user-side problems. Second, multimodal parallel training is realized through a cooperative combination of weight equalization, mapping nonlinear inference onto an improved emotional data-fusion relationship. Third, considering the nonlinearity and uncertainty of the different types of emotional data samples in the training subset, emotion-recognition data compensation is refined into emotion element compensation, which eases qualitative analysis and optimal decision-making. Finally, simulation results show that the proposed multimodal emotion recognition algorithm improves the recognition rate by 3.5%, shortens the average response time by 5.7%, and reduces the average number of iterations per unit time by a factor of 1.35.
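The fusion and compensation steps described above can be sketched as a small decision-level scheme: each modality (e.g. speech, image) outputs class probabilities, weight equalization balances the modality contributions, and a missing modality is compensated with a weighted combination of the available ones. All function names and the specific compensation rule below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

EMOTIONS = ["angry", "happy", "neutral", "sad"]

def equalize_weights(confidences):
    """Normalize per-modality weights so they sum to 1 (weight equalization)."""
    total = sum(confidences.values())
    return {m: c / total for m, c in confidences.items()}

def compensate_missing(probs, weights):
    """Replace a missing modality's output (None) with the weighted mean
    of the modalities that did produce a prediction — a simple stand-in
    for the paper's emotion element compensation."""
    available = {m: p for m, p in probs.items() if p is not None}
    w = equalize_weights({m: weights[m] for m in available})
    fallback = sum(w[m] * available[m] for m in available)
    return {m: (p if p is not None else fallback) for m, p in probs.items()}

def fuse(probs, weights):
    """Weighted sum of per-modality probability vectors, then argmax."""
    probs = compensate_missing(probs, weights)
    w = equalize_weights(weights)
    fused = sum(w[m] * probs[m] for m in probs)
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: the image features were lost in transmission, so the speech
# prediction is propagated through the compensation step.
speech = np.array([0.1, 0.6, 0.2, 0.1])   # speech classifier output
image = None                               # image modality lost
label, fused = fuse({"speech": speech, "image": image},
                    {"speech": 0.6, "image": 0.4})
print(label)  # -> "happy"
```

With both modalities present, `compensate_missing` is a no-op and the result is the ordinary weighted soft-voting fusion.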


Emotion recognition; Edge network; Multimodal; Emotion compensation; Data fusion


Funding information

This work is supported in part by the Key Scientific Research Projects of the Henan Province Education Department (No. 18A520004), the Henan Province Science and Technology Projects (No. 182102310925), and the National Natural Science Foundation of China (No. 61802115).



Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  1. College of Computer Science, Henan University of Engineering, Zhengzhou, China
  2. State Key Laboratory of Mathematical Engineering and Advanced Computing, Zhengzhou, China
