A Cognitive Model of Feature Selection and Categorization for Autonomous Systems
We describe a computational cognitive model intended to be a generalizable classifier that can provide context-based feedback to semantic perception in robotic applications. Many classifiers (including cognitive models of categorization) perform well at the task of associating features with objects. Underlying their performance is an effective selection of the features used during classification. This feature selection (FS) process is usually performed outside the boundaries of the model that learns and performs the classification task, often by a human expert. In contrast, the cognitive model we describe learns which features to use at the same time as it learns the associations between features and classes. This integration of FS and class learning in one model makes it complementary to other machine-learning (ML) techniques that automate the FS process (e.g., deep learning methods). Their integration in a cognitive architecture also provides a means for creating a dynamic context that includes disparate sources of information (e.g., environmental observations, task knowledge, commands from humans); this richer context, in turn, provides a means for making semantic perception goal-directed. We demonstrate automated FS, integrated with an instance-based learning (IBL) approach to classification, in an ACT-R model of categorization that labels facial expressions of emotion (e.g., happy, sad) from a set of relevant, irrelevant, distinct, and overlapping facial action features.
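The core idea sketched in the abstract can be illustrated with a minimal instance-based classifier that adjusts per-feature weights as a side effect of classification. This is an illustrative simplification, not the paper's ACT-R model: the class name, the weighted-overlap retrieval rule (a stand-in for ACT-R's activation-based partial matching), and the additive weight update are all assumptions introduced here.

```python
class IBLClassifier:
    """Instance-based classifier that learns which features matter
    while it learns feature-class associations (illustrative sketch)."""

    def __init__(self, features, lr=0.1):
        self.weights = {f: 1.0 for f in features}  # learned feature relevance
        self.instances = []                        # stored (features, label) episodes
        self.lr = lr

    def _retrieve(self, probe):
        # Retrieve the stored instance with the highest weighted feature
        # overlap -- a crude stand-in for activation-based retrieval.
        return max(self.instances,
                   key=lambda inst: sum(self.weights[f] for f in probe
                                        if inst[0].get(f) == probe[f]))

    def classify(self, probe):
        return self._retrieve(probe)[1]

    def learn(self, probe, label):
        # Classify first, then adjust the weights of the features that
        # drove the retrieval: reward them when the prediction was right,
        # penalize them when it was wrong. Features that only correlate
        # with wrong retrievals (irrelevant features) drift toward zero,
        # which is the implicit feature-selection effect.
        if self.instances:
            inst, predicted = self._retrieve(probe)
            delta = self.lr if predicted == label else -self.lr
            for f in probe:
                if inst.get(f) == probe[f]:
                    self.weights[f] = max(0.0, self.weights[f] + delta)
        self.instances.append((dict(probe), label))
```

For example, after training on a few labeled expressions in which a `noise` feature coincides with the wrong class, its weight falls below that of diagnostic features such as mouth or brow configuration, so later retrievals rely mainly on the relevant features.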
Keywords: Autonomous systems · Feature selection · ACT-R · Cognitive architectures · Machine learning · Classification · Categorization · Instance-based learning
This research was supported by OSD ASD (R&E) and by the Army Research Laboratory’s Robotics Collaborative Technology Alliance.