Potential Uses in Depth

Abstract

The three areas covered in depth in this chapter are workflows and business process management (BPM), semantic parsing, and robotics. The production and consumption of knowledge should warrant as much attention as the actions or processes on the factory floor, yet workflows are a visible gap in most knowledge management. One reason for the gap is that workflows and business processes intimately involve people. Shared communication is at the heart of workflow management, which is one reason why semantic technologies are essential to the task. In semantic parsing, a lexical theory needs to handle word senses, sentences and semantics, cross-language meanings, common-sense reasoning, and learning algorithms. A formal grammar provides a set of transition rules for evaluating tokens and a lexicon of types that can build up, or generate, representative language structures. We can map the compositional and semantic aspects of our language to the categorial perspectives of Peirce’s logic and semiosis, and then convert those formalisms to distributions over broad examples provided by KBpedia’s knowledge. Peircean ideas may contribute to part-of-speech tagging, machine learning implementations, and a dedicated Peircean grammar. Cognitive robots embrace the ideas of learning, planning, and interacting with a dynamic world, and robotics is a critical potential testbed for Peircean ideas. To conduct actions, robots must resolve, or ‘ground,’ their reasoning symbols into indecomposable primitives. The implication is that higher-order concepts are, in some manner, derivations of lower-level concepts, which fits with KBpedia and Peirce’s universal categories. Kinesthetic robots may also be essential to refining natural language understanding.
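
As a minimal sketch of the formal-grammar idea mentioned above (my own illustration, not code from the chapter), the toy context-free grammar below pairs a small lexicon of typed words with transition (production) rules that build up, or generate, simple sentence structures. Every symbol, rule, and word here is an invented assumption.

    import random

    # A toy context-free grammar: a lexicon of typed words plus
    # production rules that build up sentence structures.
    LEXICON = {
        "Det": ["the", "a"],
        "N": ["robot", "workflow"],
        "V": ["grounds", "parses"],
    }
    RULES = {
        "S": [["NP", "VP"]],   # a sentence is a noun phrase plus a verb phrase
        "NP": [["Det", "N"]],  # a noun phrase is a determiner plus a noun
        "VP": [["V", "NP"]],   # a verb phrase is a verb plus an object
    }

    def generate(symbol="S"):
        """Randomly expand a symbol into a list of word tokens."""
        if symbol in LEXICON:                     # terminal type: pick a word
            return [random.choice(LEXICON[symbol])]
        expansion = random.choice(RULES[symbol])  # nonterminal: pick a rule
        return [tok for part in expansion for tok in generate(part)]

    def parses(tokens, symbol="S"):
        """True if tokens can be derived from symbol (naive recognizer)."""
        if symbol in LEXICON:
            return len(tokens) == 1 and tokens[0] in LEXICON[symbol]
        for a, b in RULES[symbol]:                # all rules here are binary
            if any(parses(tokens[:i], a) and parses(tokens[i:], b)
                   for i in range(1, len(tokens))):
                return True
        return False

    sentence = generate()
    print(" ".join(sentence), "->", parses(sentence))
    # e.g. "the robot parses a workflow -> True"

The same rule set thus serves both directions named in the abstract: evaluating a stream of tokens and generating representative structures.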

Notes

  1.

    BPM may also refer to business process modeling. We retain the management sense here, noting that the modeling part only comes after thinking through the management portion.

  2.

    The exception to this observation is advanced manufacturing. Some of these businesses, inherently action oriented, have pioneered manufacturing process management, a close cousin of BPM. However, it remains an open question whether manufacturing businesses are better at KM as well.

  3.

    See, for example, the open-source Yaoqiang BPMN editor (http://bpmn.sourceforge.net/).

  4.

    The word grammar is derived from a Greek word meaning “writing,” though at one time the knowledge of Latin grammar was viewed as endowing one with magical power, from which arose our word glamor [4].

  5.

    Because computational linguistics draws on several contributing disciplines, care should be taken when comparing findings across them.

  6.

    This split somewhat reflects a similar one between discriminative and generative machine learning models. Discriminative models, also called conditional models, are a class of models used in machine learning for modeling the dependence of unobserved (target) variables y on observed variables x. Example discriminative models include support vector machines (SVM), conditional random fields (CRF), neural networks (xNN), linear regression, maximum entropy Markov models, and random forests. Generative models use algorithms to try to reconstruct how the original data was generated, often through probabilistic means. Example generative models include hidden Markov models (HMM), naive Bayes, generative adversarial networks (GANs), Gaussian mixture models, and other mixture models. (A toy contrast of the two classes is sketched below.)
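
    As a rough, hedged illustration of this split (my own sketch, not from the text), the following contrasts a generative naive Bayes classifier, which models how each class generates its features, with a discriminative logistic regression, which models p(y|x) directly. The scikit-learn classes are real; the toy data are invented.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB            # generative: models p(x|y)p(y)
        from sklearn.linear_model import LogisticRegression   # discriminative: models p(y|x)

        rng = np.random.default_rng(0)
        # Invented toy data: two Gaussian clusters, one per class.
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)

        generative = GaussianNB().fit(X, y)              # reconstructs how each class generates x
        discriminative = LogisticRegression().fit(X, y)  # learns only the decision boundary

        point = [[1.5, 1.5]]
        print("generative     p(y|x):", generative.predict_proba(point))
        print("discriminative p(y|x):", discriminative.predict_proba(point))

    Both yield class probabilities for a query point, but only the generative model could also be sampled to produce new synthetic data.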

  7.

    Nivre argues that a dependency grammar is not a grammar formalism but rather a specific way to describe the syntactic structure of a sentence [49].

  8.

    See https://en.wikipedia.org/wiki/Deterministic_context-free_grammar

  9.

    Also known as the Chomsky–Schützenberger hierarchy.

  10.

    We talked of this simple data structure in Chap. 9.

  11.

    For a sample detailed description, see SLING, a frame-based semantic parser using a dependency grammar [50].

  12.

    Montague’s contributions came to an untimely end when he was murdered at age 40.

  13.

    This makes these grammars well suited to functional languages like Lisp, as the sketch below illustrates.
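
    As a toy illustration of why (my own sketch in Python’s functional style, not an implementation from the chapter): in categorial or Montague-style grammars, parsing reduces to function application, with each word denoting either a function or an argument.

        # Montague-style composition as pure function application (illustrative only).
        # A noun phrase denotes an entity; a verb denotes a function over entities.

        john = "John"                                  # an entity
        sleeps = lambda subj: f"sleeps({subj})"        # intransitive verb: entity -> proposition
        loves = lambda obj: lambda subj: f"loves({subj}, {obj})"  # transitive verb, curried

        # "John sleeps": apply the verb to its subject.
        print(sleeps(john))         # sleeps(John)

        # "John loves Mary": apply the verb to its object, then to the subject.
        print(loves("Mary")(john))  # loves(John, Mary)

    Because sentence meaning is just nested application, languages whose basic operation is application, like Lisp, express such grammars almost directly.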

  14.

    Tokenizers and POS taggers, plus any reference tagsets employed, should be attentive to syntax that is declinable (noun, pronoun, verb, adverb). The indeclinable terms (preposition, conjunction, interjection, particles, modals) are less of a problem since only single terms are required. Declensions of tense, case, plurality, or gender are very important topics in some languages, though I do not speak further of them here.

  15.

    By formal proposition I mean a sentence in the indicative mood, “for a proposition is equivalent to a sentence in the indicative mood” (1903, CP 2.315), with which Peirce was mostly concerned. Contrast this to the other moods (1893, CP 2.291) or “quasi-propositions”; see below.

  16.

    A “selective” (1903, CP 4.408) is an indeterminate individual such as is indicated by selective pronouns (any, every, all, no, none, whatever, whoever, everybody, anybody, nobody) or particular selectives (some, something, somebody, a, a certain, some or other, one) (1903, CP 2.289). (A toy logical rendering follows below.)
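
    Read logically, the universal selectives behave like universal quantifiers and the particular selectives like existential ones. The snippet below is my own toy rendering of that correspondence over an invented finite domain; the domain and predicate are assumptions for illustration.

        # Toy rendering of selectives as quantifiers over a finite domain.
        domain = ["alice", "bob", "carol"]   # an invented universe of discourse
        is_mortal = lambda x: True           # an invented predicate

        # Universal selectives (any, every, all ...) read as universal quantification.
        every = all(is_mortal(x) for x in domain)   # "everybody is mortal"

        # Particular selectives (some, a certain, one ...) read as existential quantification.
        some = any(is_mortal(x) for x in domain)    # "somebody is mortal"

        print(every, some)   # True True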

  17.

    Peirce did not hold the common noun to be a universal part of speech (POS). He states, “I do not regard the common noun as an essentially necessary part of speech. Indeed, it is only fully developed as a separate part of speech in the Aryan languages and the Basque, -- possibly in some other out of the way tongues” (1904, CP 8.337).

  18.

    Such as this, that, something, anything.

  19.

    Peirce also termed this collateral experience, collateral information, and collateral acquaintance.

  20.

    Of course, it is possible to write out full syllogisms in the confines of a single sentence, but most often arguments are made over multiple sentences.

  21.

    One genesis of this grand synthesis is a 2010 paper by Coecke et al., “Mathematical Foundations for a Compositional Distributional Model of Meaning” [51], first unveiled in 2008 [52]. (A schematic of the composition step is sketched below.)
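
    In the Coecke et al. model, a relational word such as a transitive verb lives in a tensor product space, and a sentence meaning arises by contracting the verb tensor with its subject and object vectors. The numpy sketch below is my own schematic of that composition step with invented toy vectors, not code from the paper.

        import numpy as np

        # Toy DisCoCat-style composition: a transitive verb is an order-3 tensor
        # over (subject, sentence, object) spaces; composition contracts it with
        # the subject and object noun vectors. All vectors here are invented.
        d_noun, d_sent = 4, 2
        rng = np.random.default_rng(1)

        john = rng.random(d_noun)                     # subject noun vector
        mary = rng.random(d_noun)                     # object noun vector
        loves = rng.random((d_noun, d_sent, d_noun))  # verb tensor: N (x) S (x) N

        # meaning("John loves Mary") = contract subject and object into the verb.
        sentence = np.einsum("i,isj,j->s", john, loves, mary)
        print(sentence)   # a vector in the sentence space S

    The grammar dictates which contractions to perform, while the vectors supply distributional content, which is the synthesis the note describes.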

  22.

    We are also missing a design or approach to compositionality.

  23.

    Leiß’s publication is dated 2009 but is based on a conference paper presented in 2005.

  24.

    Note that Peirce specifically excludes consideration of instinct in the scientific method and its quest for truth, since all assumptions should be open to question. Pragmatism, however, adds action and instinct to the equation.

  25.

    Note that meaning has many connotations, including existential, linguistic, philosophical, psychological, and semiotic senses, the meaning of life, and others. Our use embraces all of these senses.

References

  1. M. Hepp, F. Leymann, J. Domingue, A. Wahler, D. Fensel, Semantic business process management: A vision towards using semantic web services for business process management, in e-Business Engineering (IEEE, Beijing, China, 2005), pp. 535–540

  2. F. Smith, M. Proietti, Ontology-based representation and reasoning on process models: A logic programming approach. arXiv:1410.1776 [cs] (2014)

  3. A. Kornai, Mathematical Linguistics (Springer, London, 2008)

  4. J. Lambek, From Word to Sentence: A Computational Algebraic Approach to Grammar (Polimetrica, 2008)

  5. M. Lange, H. Leiß, To CNF or not to CNF? An efficient yet presentable version of the CYK algorithm. Informatica Didactica 8, 1–21 (2009)

  6. A. Copestake, D. Flickinger, C. Pollard, I.A. Sag, Minimal recursion semantics: An introduction. Res. Lang. Comput. 3, 281–332 (2005)

  7. M. Steedman, A Very Short Introduction to CCG. Unpublished draft note (1996), p. 8

  8. J. Bos, A survey of computational semantics: Representation, inference and knowledge in wide-coverage text understanding. Lang. Linguist. Compass 5, 336–366 (2011)

  9. P. Liang, Learning executable semantic parsers for natural language understanding. arXiv:1603.06677 [cs] (2016)

  10. B.H. Partee, Montague grammar, in International Encyclopedia of the Social and Behavioral Sciences, ed. by N.J. Smelser, P.B. Baltes (Pergamon/Elsevier Science, Oxford, 2001), 7 pp.

  11. A. Wierzbicka, Semantics: Primes and Universals (Oxford University Press, Oxford, 1996)

  12. L. Abzianidze, J. Bos, Towards universal semantic tagging. arXiv:1709.10381 [cs] (2017)

  13. S. Hassan, Measuring Semantic Relatedness Using Salient Encyclopedic Concepts. Ph.D. thesis, University of North Texas (2011)

  14. P.P.-S. Chen, English, Chinese and ER diagrams. Data Knowl. Eng. 23, 5–16 (1997)

  15. A. Ninio, Learning a generative syntax from transparent syntactic atoms in the linguistic input. J. Child Lang. 41, 1249–1275 (2014)

  16. W. Nöth, Charles Sanders Peirce, pathfinder in linguistics, in Digital Encyclopedia of Charles S. Peirce (2000)

  17. A.-V. Pietarinen, Peirce’s pragmatic theory of proper names. Trans. Charles S. Peirce Soc. 46, 341 (2010)

  18. W. Nöth, Representation and reference according to Peirce. Int. J. Signs Semiotic Syst. 1, 28–39 (2011)

  19. R. Hilpinen, On C.S. Peirce’s theory of the proposition: Peirce as a precursor of game-theoretical semantics. Monist 65, 182–188 (1982)

  20. J. Sarbo, J. Farkas, A Peircean ontology of language, in International Conference on Conceptual Structures (Springer, Berlin/Heidelberg, 2001), pp. 1–14

  21. T. Kwiatkowski, E. Choi, Y. Artzi, L. Zettlemoyer, Scaling semantic parsers with on-the-fly ontology matching, in Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing (Association for Computational Linguistics, Seattle, WA, 2013), pp. 1545–1556

  22. S. Clark, Type-driven syntax and semantics for composing meaning vectors, in Quantum Physics and Linguistics: A Compositional, Diagrammatic Discourse (2013), pp. 359–377

  23. J. Maillard, S. Clark, E. Grefenstette, A type-driven tensor-based semantics for CCG (Association for Computational Linguistics, 2014), pp. 46–54

  24. E. Grefenstette, Category-Theoretic Quantitative Compositional Distributional Models of Natural Language Semantics. Ph.D. thesis, Balliol College, University of Oxford (2013)

  25. M. Gardner, J. Krishnamurthy, Open-vocabulary semantic parsing with both distributional statistics and formal knowledge, in Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-17) (Association for the Advancement of Artificial Intelligence, 2017), pp. 3195–3201

  26. T.A.D. Fowler, Lambek Categorial Grammars for Practical Parsing. Ph.D. thesis, University of Toronto (2016)

  27. P. Liang, C. Potts, Bringing machine learning and compositional semantics together. Annu. Rev. Linguist. 1, 355–376 (2015)

  28. P. Suppes, Direct inference in English. Teach. Philos. 4, 405–418 (1981)

  29. C.H. Brink, The Algebra of Relations. Ph.D. thesis, University of Cambridge (1978)

  30. C.S. Peirce, Description of a notation for the logic of relatives, resulting from an amplification of the conceptions of Boole’s calculus of logic. Mem. Am. Acad. Arts Sci. 9, 317–378 (1870)

  31. C. Brink, The algebra of relatives. Notre Dame J. Form. Logic 20, 900–908 (1979)

  32. C. Brink, R.A. Schmidt, Subsumption computed algebraically. Comput. Math. Appl. 23, 329–342 (1992)

  33. M. de Rijke, Extending Modal Logic. Ph.D. thesis, Universiteit van Amsterdam, Institute for Logic, Language and Computation (1993)

  34. M. Böttner, Peirce grammar. Grammars 4, 1–19 (2001)

  35. H. Leiß, The proper treatment of coordination in Peirce grammar, in Proceedings of FG-MoL 2005 (Edinburgh, Scotland, 2009), pp. 149–166

  36. M. Scheutz, J. Harris, P. Schermerhorn, Systematic integration of cognitive and robotic architectures. Adv. Cogn. Syst. 2, 277–296 (2013)

  37. P. Steiner, C.S. Peirce and artificial intelligence: Historical heritage and (new) theoretical stakes, in Philosophy and Theory of Artificial Intelligence (Springer, New York, 2013), pp. 265–276

  38. L. Shapiro, The embodied cognition research programme. Philos. Compass 2, 338–346 (2007)

  39. P. Chiasson, Logica utens, in Digital Encyclopedia of Charles S. Peirce (2001)

  40. C. Matuszek, E. Herbst, L. Zettlemoyer, D. Fox, Learning to parse natural language commands to a robot control system, in Experimental Robotics (Springer, Heidelberg, 2013), pp. 403–415

  41. A. Cangelosi, Solutions and open challenges for the symbol grounding problem. Int. J. Signs Semiotic Syst. 1, 49–54 (2011)

  42. D. Roy, Semiotic schemas: A framework for grounding language in action and perception. Artif. Intell. 167, 170–205 (2005)

  43. M.-A. Williams, Representation = grounded information, in Trends in Artificial Intelligence, ed. by T.-B. Ho, Z.-H. Zhou (Springer, Berlin/Heidelberg, 2008), pp. 473–484

  44. D. Roy, A mechanistic model of three facets of meaning, in Symbols and Embodiment: Debates on Meaning and Cognition, ed. by M. de Vega, A.M. Glenberg, A.C. Graesser (Oxford University Press, Oxford, 2008), pp. 195–222

  45. C.J. Stanton, The value of meaning for autonomous robots, in Proceedings of the Tenth International Conference on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems, ed. by B. Johansson, E. Sahin, C. Balkenius (Lund University, Lund, 2010), pp. 129–136

  46. T. Taniguchi, T. Nagai, T. Nakamura, N. Iwahashi, T. Ogata, H. Asoh, Symbol emergence in robotics: A survey. Adv. Robot. 30, 706–728 (2016)

  47. R. Gudwin, The icon grounding problem. Int. J. Signs Semiotic Syst. 1, 73–74 (2011)

  48. G.I. Parisi, J. Tani, C. Weber, S. Wermter, Emergence of multimodal action representations from neural network self-organization. Cogn. Syst. Res. 43, 208–221 (2017)

  49. J. Nivre, Dependency Grammar and Dependency Parsing (Växjö University, Växjö, 2005)

  50. M. Ringgaard, R. Gupta, F.C.N. Pereira, SLING: A framework for frame semantic parsing. arXiv:1710.07032 [cs] (2017)

  51. B. Coecke, M. Sadrzadeh, S. Clark, Mathematical foundations for a compositional distributional model of meaning. Linguist. Anal. 36, 345–384 (2010)

  52. S. Clark, B. Coecke, M. Sadrzadeh, A compositional distributional model of meaning, in Proceedings of the Second Quantum Interaction Symposium (QI-2008) (Oxford University Press, Oxford, 2008), pp. 133–140

  53. J.R. Hobbs, S.J. Rosenschein, Making computational sense of Montague’s intensional logic. Artif. Intell. 9, 287–306 (1977)

  54. A.K. Joshi, Context-sensitive grammars, in Oxford International Encyclopedia of Linguistics, 2nd edn., ed. by K. Vijay-Shanker, D. Weir (Oxford University Press, Oxford, 2003), pp. 1–4

Copyright information

© 2018 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Bergman, M.K. (2018). Potential Uses in Depth. In: A Knowledge Representation Practionary. Springer, Cham. https://doi.org/10.1007/978-3-319-98092-8_16

  • DOI: https://doi.org/10.1007/978-3-319-98092-8_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-98091-1

  • Online ISBN: 978-3-319-98092-8

  • eBook Packages: Computer Science; Computer Science (R0)
