Word, Sense, and Graph Embeddings

  • Jose Manuel Gomez-Perez
  • Ronald Denaux
  • Andres Garcia-Silva
Chapter

Abstract

Distributed word representations in the form of dense vectors, known as word embeddings, are the basic building blocks for machine-learning based natural language processing. Such embeddings play an important role in tasks such as part-of-speech tagging, chunking, named entity recognition, and semantic role labeling, as well as in downstream tasks including sentiment analysis and, more generally, text classification. However, early word embeddings were static, context-independent representations that fail to capture the multiple meanings of polysemous words. This chapter presents an overview of such traditional word embeddings, as well as of alternative approaches that have been proposed to produce sense and concept embeddings, either from disambiguated corpora or directly from knowledge graphs. As a result, this chapter serves as a conceptual framework for the rest of the book.
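The limitation mentioned above can be illustrated with a minimal sketch (not taken from the chapter): a static embedding model is essentially a lookup table from words to fixed vectors, so a polysemous word such as "bank" receives the same vector whether it appears in a financial or a geographic context. The vocabulary and vectors below are hypothetical, random values used purely for illustration.

```python
import numpy as np

# Hypothetical "pre-trained" static embedding table: one fixed vector per
# word, filled with random values for illustration only.
rng = np.random.default_rng(0)
vocab = ["the", "bank", "river", "deposit", "money"]
embeddings = {word: rng.normal(size=4) for word in vocab}

def embed(tokens):
    """Look up one fixed vector per token -- no context is used."""
    return [embeddings[t] for t in tokens]

financial = embed(["deposit", "money", "bank"])
geographic = embed(["river", "bank"])

# The vector for "bank" is identical in both contexts, even though the
# intended senses (financial institution vs. river bank) differ.
same = np.array_equal(financial[-1], geographic[-1])
print(same)  # True
```

Sense and concept embeddings, discussed later in the chapter, address this by assigning distinct vectors to distinct word senses rather than to surface forms.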

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Jose Manuel Gomez-Perez (1)
  • Ronald Denaux (1)
  • Andres Garcia-Silva (1)
  1. Expert System, Madrid, Spain
