Word, Sense, and Graph Embeddings
Distributed word representations in the form of dense vectors, known as word embeddings, are the basic building blocks for machine-learning-based natural language processing. Such embeddings play an important role in tasks such as part-of-speech tagging, chunking, named entity recognition, and semantic role labeling, as well as in downstream tasks including sentiment analysis and, more generally, text classification. However, early word embeddings were static, context-independent representations that failed to capture the multiple meanings of polysemous words. This chapter presents an overview of such traditional word embeddings, as well as of alternative approaches that have been proposed to produce sense and concept embeddings, either from disambiguated corpora or directly from knowledge graphs. As a result, this chapter serves as a conceptual framework for the rest of the book.
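The limitation of static embeddings mentioned above can be illustrated with a minimal sketch: a context-independent embedding is just a fixed lookup table, so a polysemous word like "bank" receives the same vector in every context. The vocabulary and vectors below are toy values invented for illustration, not trained embeddings.

```python
# Minimal sketch of a static (context-independent) embedding table.
# Vectors here are toy values, not trained embeddings.
embeddings = {
    "bank": [0.2, -0.5, 0.7],   # a single vector, regardless of sense
    "river": [0.1, 0.9, -0.3],
    "money": [0.8, -0.1, 0.4],
}

def embed(token: str) -> list[float]:
    """Return the single stored vector for a token."""
    return embeddings[token]

# The same vector is returned for "bank" in both contexts, so the
# financial sense and the riverside sense are conflated.
v_financial = embed("bank")   # as in "deposit money at the bank"
v_riverside = embed("bank")   # as in "sit on the river bank"
assert v_financial == v_riverside
```

Sense embeddings, discussed later in the chapter, address exactly this: distinct senses of a word are mapped to distinct vectors rather than a single shared one.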