Abstract
The three areas covered in depth in this chapter are workflows and business process management (BPM), semantic parsing, and robotics. The production and consumption of knowledge should warrant as much attention as the actions or processes on the factory floor, yet workflows remain a visible gap in most knowledge management, partly because workflows and business processes intimately involve people. Shared communication is at the heart of workflow management, which is why semantic technologies are essential to the task. In semantic parsing, a lexical theory needs to handle word senses, sentences and semantics, cross-language meanings, common-sense reasoning, and learning algorithms. A formal grammar provides a set of transition rules for evaluating tokens and a lexicon of types that can build up, or generate, representative language structures. We can map the compositional and semantic aspects of our language to the categorial perspectives of Peirce’s logic and semiosis, and then convert those formalisms to distributions over broad examples provided by KBpedia’s knowledge. Peircean ideas may contribute to part-of-speech tagging, machine learning implementations, and a dedicated Peircean grammar. Cognitive robots embrace the ideas of learning, planning, and interacting with a dynamic world, making robotics a critical potential testbed for Peircean ideas. Robots, to conduct actions, must resolve or ‘ground’ their reasoning symbols into indecomposable primitives. The implication is that higher-order concepts derive in some manner from lower-level concepts, which fits KBpedia and Peirce’s universal categories. Kinesthetic robots may also be essential to refining natural language understanding.
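The abstract's picture of a formal grammar, a lexicon of types plus transition rules that combine tokens into larger structures, can be illustrated with a minimal categorial-grammar sketch in the spirit of the Lambek and CCG work cited in the references. The lexicon, type notation, and example sentence are illustrative assumptions, not material from the chapter:

```python
# Minimal categorial-grammar sketch (hypothetical lexicon and notation).
# Each word is assigned a type; forward application (X/Y, Y => X) and
# backward application (Y, X\Y => X) combine adjacent types until,
# for a grammatical sentence, a single sentence type S remains.

LEXICON = {
    "Alice": "NP",
    "sees": "(S\\NP)/NP",  # transitive verb: seeks an NP to its right, then an NP to its left
    "Bob": "NP",
}

def forward(left, right):
    """Forward application: X/Y applied to Y yields X."""
    if left.endswith("/" + right):
        return left[: -len("/" + right)].strip("()")
    return None

def backward(left, right):
    """Backward application: Y followed by X\\Y yields X."""
    if right.endswith("\\" + left):
        return right[: -len("\\" + left)].strip("()")
    return None

def parse(words):
    """Repeatedly reduce adjacent types; return the remaining type list."""
    types = [LEXICON[w] for w in words]
    reduced = True
    while reduced and len(types) > 1:
        reduced = False
        for i in range(len(types) - 1):
            combined = forward(types[i], types[i + 1]) or backward(types[i], types[i + 1])
            if combined:
                types[i : i + 2] = [combined]
                reduced = True
                break
    return types

print(parse(["Alice", "sees", "Bob"]))  # ['S'], i.e., a complete sentence
```

The derivation mirrors the abstract's phrasing: the lexicon supplies the types, and the application rules are the transition rules that build up, or generate, the sentence structure.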
Notes
- 1.
BPM may also refer to business process modeling. We retain the management sense here, noting that the modeling part only comes after thinking through the management portion.
- 2.
The exception to this observation is advanced manufacturing. Some of these businesses, inherently action oriented, have pioneered BPM’s close cousin, manufacturing process management. However, it remains an open question whether manufacturing businesses are better at KM as well.
- 3.
See, for example, the open-source Yaoqiang BPMN editor (http://bpmn.sourceforge.net/).
- 4.
The word grammar is derived from a Greek word meaning “writing,” though at one time the knowledge of Latin grammar was viewed as endowing one with magical power, from which arose our word glamor [4].
- 5.
In many areas of computational linguistics, care should be taken when comparing findings from the contributing disciplines.
- 6.
This split somewhat reflects a similar one for discriminative versus generative machine learning models. Discriminative models, also called conditional models, are a class of models used in machine learning for modeling the dependence of unobserved (target) variables y on observed variables x. Example discriminative models include support vector machines (SVM), conditional random fields (CRF), neural networks (xNN), linear regression, maximum entropy Markov models, and random forests. Generative models use algorithms to try to reconstruct how the original data was generated, often through probabilistic means. Example generative models include hidden Markov models (HMM), naive Bayes, generative adversarial networks (GANs), Gaussian mixture models, and other mixture models.
- 7.
Nivre argues that a dependency grammar is not a grammar formalism, rather a specific way to describe the syntactic structure of a sentence [49].
- 8.
- 9.
Also known as the Chomsky–Schützenberger hierarchy.
- 10.
We discussed this simple data structure in Chap. 9.
- 11.
For a sample detailed description see SLING, a frame-based semantic parser using a dependency grammar [50].
- 12.
Montague’s contributions came to an untimely end when he was violently murdered at age 40.
- 13.
This makes these grammars well suited to functional languages like Lisp.
- 14.
Tokenizers and POS taggers, plus any reference tagsets employed, should be attentive to the parts of speech that are declinable (noun, pronoun, verb, adverb). The indeclinable terms (preposition, conjunction, interjection, particles, modals) are less of a problem since only single terms are required. Declensions of tense, case, plurality, or gender are very important topics in some languages, though I do not treat them further here.
- 15.
By formal proposition I mean a sentence in the indicative mood, “for a proposition is equivalent to a sentence in the indicative mood” (1903, CP 2.315), for which Peirce was mostly concerned. Contrast this to the other moods (1893, CP 2.291) or “quasi-propositions,” see below.
- 16.
A “selective” (1903, CP 4.408) is an indeterminate individual such as indicated by selective pronouns (any, every, all, no, none, whatever, whoever, everybody, anybody, nobody) or particular selectives (some, something, somebody, a, a certain, some or other, one) (1903, CP 2.289).
- 17.
Peirce did not hold the common noun to be a universal part of speech (POS). He states, “I do not regard the common noun as an essentially necessary part of speech. Indeed, it is only fully developed as a separate part of speech in the Aryan languages and the Basque, -- possibly in some other out of the way tongues” (1904, CP 8.337).
- 18.
Such as this, that, something, anything.
- 19.
Peirce also termed this collateral experience, collateral information, and collateral acquaintance.
- 20.
Of course, it is possible to write out full syllogisms in the confines of a single sentence, but most often arguments are made over multiple sentences.
- 21.
- 22.
We are also missing a design or approach to compositionality.
- 23.
Leiß’s publication is dated 2009 but is based on a conference paper presented in 2005.
- 24.
Note that Peirce specifically excludes consideration of instinct in the scientific method and its quest for truth, since all assumptions should be open to question. Pragmatism, however, adds action and instinct to the equation.
- 25.
Note that meaning has many connotations including existential, linguistic, philosophical, psychological, semiotic, of life, and others. Our use embraces all of these senses.
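The discriminative-versus-generative split described in note 6 can be sketched with two toy classifiers trained on the same data: a naive Bayes model (generative: estimates P(y) and P(x|y) from counts and reconstructs how the data arose) and a logistic model (discriminative: fits P(y|x) directly). The dataset, learning rate, and epoch count are illustrative assumptions, not material from the chapter:

```python
import math

# Toy, hand-made dataset of (binary feature vector, label) pairs; illustrative only.
DATA = [([1, 0], 0), ([1, 1], 0), ([0, 1], 1), ([0, 0], 1), ([1, 0], 0), ([0, 1], 1)]

def naive_bayes(data):
    """Generative: estimate P(y) and P(x_j = 1 | y) by counting (with Laplace
    smoothing), then classify by argmax_y log P(y) + sum_j log P(x_j | y)."""
    labels = [y for _, y in data]
    prior = {y: labels.count(y) / len(labels) for y in set(labels)}
    cond = {}
    for y in prior:
        rows = [x for x, lab in data if lab == y]
        cond[y] = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2) for j in range(2)]
    def predict(x):
        def log_joint(y):
            s = math.log(prior[y])
            for j, xj in enumerate(x):
                s += math.log(cond[y][j] if xj else 1 - cond[y][j])
            return s
        return max(prior, key=log_joint)
    return predict

def logistic(data, lr=0.5, epochs=200):
    """Discriminative: fit P(y = 1 | x) = sigmoid(w.x + b) directly by
    stochastic gradient descent on the log loss."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            err = p - y  # gradient of the log loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, x)]
            b -= lr * err
    return lambda x: int(1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b))) > 0.5)

nb, disc = naive_bayes(DATA), logistic(DATA)
print([nb(x) for x, _ in DATA])    # naive Bayes reproduces the labels on this toy set
print([disc(x) for x, _ in DATA])  # so does the discriminative model
```

The two models make the same predictions here, but arrive at them differently: naive Bayes models the full joint distribution, while the logistic model never estimates how x was generated at all.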
References
M. Hepp, F. Leymann, J. Domingue, A. Wahler, D. Fensel, in e-Business Engineering. Semantic business process management: A vision towards using semantic web services for business process management (IEEE, Beijing, China, 2005), pp. 535–540
F. Smith, M. Proietti, Ontology-based representation and reasoning on process models: A logic programming approach, arXiv:1410.1776 [cs] (2014)
A. Kornai, Mathematical Linguistics (Springer, London, 2008)
J. Lambek, From Word to Sentence: A Computational Algebraic Approach to Grammar (Polimetrica, 2008)
M. Lange, H. Leiß, To CNF or not to CNF? An efficient yet presentable version of the CYK algorithm. Informatica Didactica 8, 1–21 (2009)
A. Copestake, D. Flickinger, C. Pollard, I.A. Sag, Minimal recursion semantics: An introduction. Res. Lang. Comput. 3, 281–332 (2005)
M. Steedman, A Very Short Introduction to CCG. Unpublished draft note (1996), p. 8
J. Bos, A survey of computational semantics: Representation, inference and knowledge in wide-coverage text understanding. Lang. Linguist. Compass 5, 336–366 (2011)
P. Liang, Learning executable semantic parsers for natural language understanding, arXiv:1603.06677 [cs] (2016)
B.H. Partee, Montague grammar, in International Encyclopedia of the Social and Behavioral Sciences, ed. by N.J. Smelser, P.B. Baltes (Pergamon/Elsevier Science, Oxford, 2001), 7 pp.
A. Wierzbicka, Semantics: Primes and Universals (Oxford University Press, Oxford, 1996)
L. Abzianidze, J. Bos, Towards Universal Semantic Tagging, arXiv:1709.10381 [cs] (2017)
S. Hassan, Measuring Semantic Relatedness Using Salient Encyclopedic Concepts. Ph.D., University of North Texas (2011)
P.P.-S. Chen, English, Chinese and ER diagrams. Data Knowl. Eng. 23, 5–16 (1997)
A. Ninio, Learning a generative syntax from transparent syntactic atoms in the linguistic input. J. Child Lang. 41, 1249–1275 (2014)
W. Nöth, Charles Sanders Peirce, pathfinder in linguistics, in Digital Encyclopedia of Charles S. Peirce (2000)
A.-V. Pietarinen, Peirce’s pragmatic theory of proper names. Trans. Charles S. Peirce Soc. 46, 341 (2010)
W. Nöth, Representation and reference according to Peirce. Int. J. Signs Semiotic Syst. 1, 28–39 (2011)
R. Hilpinen, On C.S. Peirce’s theory of the proposition: Peirce as a precursor of game-theoretical semantics. Monist 65, 182–188 (1982)
J. Sarbo, J. Farkas, A Peircean ontology of language, in International Conference on Conceptual Structures (Springer, Berlin, Heidelberg, 2001), pp. 1–14
T. Kwiatkowski, E. Choi, Y. Artzi, L. Zettlemoyer, Scaling semantic parsers with on-the-fly ontology matching, in Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing (Association for Computational Linguistics, Seattle, WA, 2013), pp. 1545–1556
S. Clark, Type-driven syntax and semantics for composing meaning vectors, in Quantum Physics and Linguistics: A Compositional, Diagrammatic Discourse (2013), pp. 359–377
J. Maillard, S. Clark, E. Grefenstette, A Type-Driven Tensor-Based Semantics for CCG (Association for Computational Linguistics, 2014), pp. 46–54
E. Grefenstette, Category-Theoretic Quantitative Compositional Distributional Models of Natural Language Semantics. Ph.D., Balliol College, University of Oxford (2013)
M. Gardner, J. Krishnamurthy, Open-vocabulary semantic parsing with both distributional statistics and formal knowledge, in Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-17). (Association for the Advancement of Artificial Intelligence, 2017), pp. 3195–3201
T.A.D. Fowler, Lambek Categorial Grammars for Practical Parsing. Ph.D., University of Toronto (2016)
P. Liang, C. Potts, Bringing machine learning and compositional semantics together. Ann. Rev. Linguist. 1, 355–376 (2015)
P. Suppes, Direct inference in English. Teach. Philos. 4, 405–418 (1981)
C.H. Brink, The Algebra of Relations. Ph.D., University of Cambridge (1978)
C.S. Peirce, Description of a notation for the logic of relatives, resulting from an amplification of the conceptions of Boole’s calculus of logic. Mem. Am. Acad. Arts Sci. 9, 317–378 (1870)
C. Brink, The algebra of relatives. Notre Dame J. Form. Logic XX, 900–908 (1979)
C. Brink, R.A. Schmidt, Subsumption computed algebraically. Comput. Math. Appl. 23, 329–342 (1992)
M. de Rijke, Extending Modal Logic. Ph.D., Universiteit van Amsterdam, Institute for Logic, Language and Computation (1993)
M. Böttner, Peirce grammar. Grammars 4, 1–19 (2001)
H. Leiß, The proper treatment of coordination in Peirce grammar, in Proceedings of FG-MoL 2005. (Edinburgh, Scotland, 2009), pp. 149–166
M. Scheutz, J. Harris, P. Schermerhorn, Systematic integration of cognitive and robotic architectures. Adv. Cogn. Syst. 2, 277–296 (2013)
P. Steiner, C.S. Peirce and Artificial Intelligence: Historical heritage and (new) theoretical stakes, in Philosophy and Theory of Artificial Intelligence (Springer, New York, 2013), pp. 265–276
L. Shapiro, The embodied cognition research programme. Philos. Compass 2, 338–346 (2007)
P. Chiasson, Logica Utens, in Digital Encyclopedia of Charles S. Peirce (2001)
C. Matuszek, E. Herbst, L. Zettlemoyer, D. Fox, Learning to parse natural language commands to a robot control system, in Experimental Robotics (Springer, Heidelberg, 2013), pp. 403–415
A. Cangelosi, Solutions and open challenges for the symbol grounding problem. Int. J. Signs Semiotic Syst. 1, 49–54 (2011)
D. Roy, Semiotic schemas: A framework for grounding language in action and perception. Artif. Intell. 167, 170–205 (2005)
M.-A. Williams, Representation = grounded information, in Trends in Artificial Intelligence, ed. by T.-B. Ho, Z.-H. Zhou (Springer Berlin Heidelberg, Berlin, Heidelberg, 2008), pp. 473–484
D. Roy, A mechanistic model of three facets of meaning, in Symbols and Embodiment: Debates on Meaning and Cognition, ed. by M.D. Vega, A.M. Glenberg, A.C. Graesser (Oxford University Press, Oxford, 2008), pp. 195–222
C.J. Stanton, The value of meaning for autonomous robots, in Proceedings of the Tenth International Conference on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems, ed. by B. Johansson, E. Sahin, C. Balkenius (Lund University, Lund, 2010), pp. 129–136
T. Taniguchi, T. Nagai, T. Nakamura, N. Iwahashi, T. Ogata, H. Asoh, Symbol emergence in robotics: A survey. Adv. Robot. 30, 706–728 (2016)
R. Gudwin, The icon grounding problem. Int. J. Signs Semiotic Syst. 1, 73–74 (2011)
G.I. Parisi, J. Tani, C. Weber, S. Wermter, Emergence of multimodal action representations from neural network self-organization. Cogn. Syst. Res. 43, 208–221 (2017)
J. Nivre, Dependency Grammar and Dependency Parsing (Växjö University, Växjö, 2005)
M. Ringgaard, R. Gupta, F.C.N. Pereira, Sling: A framework for frame semantic parsing. arXiv:1710.07032 [cs] (2017)
B. Coecke, M. Sadrzadeh, S. Clark, Mathematical foundations for a compositional distributional model of meaning. Linguist. Anal. 36, 345–384 (2010)
S. Clark, B. Coecke, M. Sadrzadeh, A compositional distributional model of meaning, in Proceedings of the Second Quantum Interaction Symposium (QI-2008) (Oxford University Press, Oxford, 2008), pp. 133–140
J.R. Hobbs, S.J. Rosenschein, Making computational sense of Montague’s intensional logic. Artif. Intell. 9, 287–306 (1977)
A.K. Joshi, Context-sensitive grammars, in Oxford International Encyclopedia of Linguistics, 2nd edn., ed. by K. Vijay-Shanker, D. Weir (Oxford University Press, Oxford, 2003), pp. 1–4
Cite this chapter
Bergman, M.K. (2018). Potential Uses in Depth. In: A Knowledge Representation Practionary. Springer, Cham. https://doi.org/10.1007/978-3-319-98092-8_16

Copyright information
© 2018 Springer Nature Switzerland AG

Print ISBN: 978-3-319-98091-1
Online ISBN: 978-3-319-98092-8