Design of a hierarchy modular neural network and its application in multimodal emotion recognition
Fusing information from different modalities effectively is a critical issue in multimodal emotion recognition. Feature-level fusion methods cannot deal with missing or corrupted data, while decision-level fusion methods may lose the correlation information between different modalities. To address these problems, a hierarchy modular neural network (HMNN) is proposed and applied to multimodal emotion recognition. First, an HMNN is constructed to mimic the hierarchical modular architecture observed in the human brain. Each module contains several submodules, each dealing with features from a different modality. Connections are built both between submodules within the same module and between corresponding submodules of different modules. Then, a learning algorithm based on Hebbian learning, which simulates the learning mechanism of the human brain, is used to train the connection weights of the HMNN. The HMNN recognizes the label from the activity level of each module using a winner-take-all strategy. Finally, the proposed HMNN is applied to a public dataset for multimodal emotion recognition. Experimental results show that the proposed HMNN improves recognition accuracy compared with other decision-fusion methods, including the support vector machine and neural networks such as back-propagation and radial basis function networks. Furthermore, the inter-submodule connections within a module realize information integration across modalities and improve the performance of the HMNN. The experiments also demonstrate the effectiveness of the HMNN in dealing with missing or corrupted data.
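As a rough illustration of the architecture described above, the following sketch builds one module per emotion label with one submodule per modality, trains each submodule's weights with a basic Hebbian rule, and classifies by winner-take-all over module activities. All class names, the linear activity function, and the simple Δw = η·pre·post update are illustrative assumptions, not the paper's exact formulation; the sketch also omits the intra-module inter-submodule connections, and it handles a missing modality by simply skipping it.

```python
class Submodule:
    """One submodule handles the features of a single modality."""
    def __init__(self, n_features, lr=0.01):
        self.w = [0.0] * n_features  # connection weights
        self.lr = lr

    def activity(self, x):
        # Activity as a weighted sum of input features (an assumption).
        return sum(wi * xi for wi, xi in zip(self.w, x))

    def hebbian_update(self, x, post):
        # Hebbian rule: strengthen a weight when pre- and post-synaptic
        # activity co-occur (delta_w = lr * pre * post).
        self.w = [wi + self.lr * xi * post for wi, xi in zip(self.w, x)]


class Module:
    """One module per emotion label; one submodule per modality."""
    def __init__(self, modality_dims):
        self.subs = [Submodule(d) for d in modality_dims]

    def activity(self, inputs):
        # Sum submodule activities; a missing modality (None) is skipped,
        # which is how this sketch tolerates missing/corrupted data.
        return sum(s.activity(x)
                   for s, x in zip(self.subs, inputs) if x is not None)

    def train(self, inputs, post=1.0):
        for s, x in zip(self.subs, inputs):
            if x is not None:
                s.hebbian_update(x, post)


class HMNN:
    """Hierarchy of modules; recognition by winner-take-all."""
    def __init__(self, n_labels, modality_dims):
        self.modules = [Module(modality_dims) for _ in range(n_labels)]

    def fit(self, samples, labels, epochs=5):
        # Supervised Hebbian training: only the module matching the
        # sample's label is reinforced.
        for _ in range(epochs):
            for inputs, y in zip(samples, labels):
                self.modules[y].train(inputs)

    def predict(self, inputs):
        # Winner-take-all: the module with the highest activity wins.
        acts = [m.activity(inputs) for m in self.modules]
        return max(range(len(acts)), key=acts.__getitem__)
```

For example, a two-label, two-modality network can be trained on toy feature tuples and queried with one modality set to `None` to mimic a missing channel.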
Keywords: Hierarchy modular neural network (HMNN) · Inter-submodule connections · Hebbian learning rule · Multimodal emotion recognition
This work was supported by the National Natural Science Foundation of China (No. 61603009); the Beijing Natural Science Foundation (No. 4182007); the Beijing Municipal Education Commission Foundation (No. KM201910005023); the Key Project of the National Natural Science Foundation of China (No. 61533002); and the "Rixin Scientist" Foundation of Beijing University of Technology (No. 2017-RX(1)-04).
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
This article does not contain any studies with human participants or animals performed by any of the authors.