
Narrative context-based data-to-text generation for ambient intelligence

  • Original Research
  • Published in: Journal of Ambient Intelligence and Humanized Computing

Abstract

In this paper, we propose a language generation model for the world of ambient intelligence (AmI). Many of today's devices are connected to the Internet and provide a considerable amount of information. Because language is the most effective way for humans to communicate, one approach to controlling AmI devices is a smart assistant built on language systems. One such framework for data-to-text generation is the natural language generation (NLG) model, which produces text from non-linguistic data. Previously proposed NLG models employed heuristic-based approaches to generate relatively short sentences. We find that such approaches are structurally inflexible and tend to produce text with little diversity. Moreover, in domains where numerical values are important, such as sports, finance, and weather, these values must be generated in correspondence with categorical information (e.g., hits, home runs, and strikeouts), yet in generated outputs the numbers often do not accurately match their categories. Our proposed data-to-text generation model provides both diversity and coherence of information through a narrative context and a copy mechanism. It learns the narrative context and sentence structures from a domain corpus without requiring additional annotation of the intended categories or sentential grammars. Experiments performed from various perspectives show that the proposed model generates text containing diverse and coherent information.
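The copy mechanism described above addresses the number-category mismatch by letting the decoder copy tokens directly from the input records instead of generating them from a fixed vocabulary. The following sketch is not the paper's exact architecture; it illustrates the general pointer-generator idea (in the spirit of Gu et al. 2016), where a gate `p_gen` mixes a vocabulary softmax with attention weights over source tokens, so an out-of-vocabulary number such as "3" can still receive probability mass by being copied from the data:

```python
def copy_mixture(vocab_probs, attn, src_tokens, vocab, p_gen):
    """Mix a generation distribution with a copy distribution.

    Final probability of token t over the extended vocabulary:
        P(t) = p_gen * P_vocab(t) + (1 - p_gen) * sum of attention on
               source positions holding t.
    Illustrative sketch only: a real model computes vocab_probs, attn,
    and p_gen from decoder states; here they are given directly.
    """
    # Extended vocabulary: fixed vocab plus any source-only tokens.
    ext_vocab = list(vocab) + [t for t in src_tokens if t not in vocab]
    probs = {t: 0.0 for t in ext_vocab}
    # Generation path: mass over the fixed vocabulary.
    for t, p in zip(vocab, vocab_probs):
        probs[t] += p_gen * p
    # Copy path: attention mass copies source tokens verbatim,
    # e.g. the numeric value "3" from a record ("hits", "3").
    for t, a in zip(src_tokens, attn):
        probs[t] += (1.0 - p_gen) * a
    return probs
```

For example, with attention concentrated on a numeric record value and `p_gen = 0.5`, half of the probability mass flows to copying, so the exact number from the input can appear in the output even if it is absent from the vocabulary.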



Author information


Corresponding author

Correspondence to Jungsun Jang.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Jang, J., Noh, H., Lee, Y. et al. Narrative context-based data-to-text generation for ambient intelligence. J Ambient Intell Human Comput 11, 1421–1429 (2020). https://doi.org/10.1007/s12652-019-01176-7

