
Mini-Batch Variational Inference for Time-Aware Topic Modeling

  • Conference paper

In: PRICAI 2018: Trends in Artificial Intelligence (PRICAI 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11013)


Abstract

This paper proposes a time-aware topic model and a mini-batch variational inference method for exploring chronological trends in document contents. Our contribution is twofold. First, to extract topics in a time-aware manner, our method uses two vector embeddings: an embedding of latent topics and an embedding of document timestamps. By combining these two embeddings and applying the softmax function, we obtain, for each topic, as many word probability distributions as there are document timestamps. This modeling enables us to extract remarkable topical trends. Second, to achieve memory efficiency, the variational inference is implemented as mini-batch gradient ascent maximizing the evidence lower bound. This lets us estimate parameters in much the same way as neural networks are trained, and our method was in fact implemented with a deep learning framework. The evaluation results show that using document timestamps improves test set perplexity, and that our test perplexity is comparable to that of collapsed Gibbs sampling, which is less memory-efficient than the proposed inference.
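To make the two ingredients of the abstract concrete, the following PyTorch sketch illustrates one plausible way to combine a topic embedding with a timestamp embedding, apply the softmax to obtain a per-timestamp word distribution for every topic, and maximize a mini-batch estimate of an evidence lower bound by gradient ascent. The class and parameter names (TimeAwareTopicModel, n_topics, n_timestamps, embed_dim), the additive combination of the two embeddings, and the encoder-based logistic-normal variational posterior are assumptions made for illustration, not the authors' exact parameterization.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeAwareTopicModel(nn.Module):
    """Hypothetical sketch: per-(topic, timestamp) word distributions built from
    a topic embedding combined with a timestamp embedding."""

    def __init__(self, vocab_size, n_topics, n_timestamps, embed_dim):
        super().__init__()
        # The two embeddings mentioned in the abstract.
        self.topic_emb = nn.Parameter(0.01 * torch.randn(n_topics, embed_dim))
        self.time_emb = nn.Parameter(0.01 * torch.randn(n_timestamps, embed_dim))
        # Output weights mapping a combined embedding to vocabulary logits.
        self.word_emb = nn.Parameter(0.01 * torch.randn(vocab_size, embed_dim))
        # Encoder producing variational parameters of per-document topic
        # proportions (an AEVB-style choice, assumed here for illustration).
        self.enc_mu = nn.Linear(vocab_size, n_topics)
        self.enc_logvar = nn.Linear(vocab_size, n_topics)

    def word_dists(self, t):
        # t: (B,) timestamp indices. Combine the embeddings additively and apply
        # softmax over the vocabulary: one word distribution per (topic, timestamp).
        combined = self.topic_emb.unsqueeze(0) + self.time_emb[t].unsqueeze(1)  # (B, K, D)
        return F.softmax(combined @ self.word_emb.t(), dim=-1)                  # (B, K, V)

    def elbo(self, bow, t):
        # bow: (B, V) bag-of-words counts for one mini-batch of documents.
        mu, logvar = self.enc_mu(bow), self.enc_logvar(bow)
        theta = F.softmax(mu + torch.randn_like(mu) * (0.5 * logvar).exp(), dim=-1)  # (B, K)
        p_w = torch.einsum('bk,bkv->bv', theta, self.word_dists(t))                  # (B, V)
        recon = (bow * (p_w + 1e-10).log()).sum(dim=-1)
        # KL of the logistic-normal posterior against a standard normal prior
        # (the prior choice is also an assumption of this sketch).
        kl = -0.5 * (1.0 + logvar - mu.pow(2) - logvar.exp()).sum(dim=-1)
        return (recon - kl).mean()   # mini-batch estimate of the evidence lower bound

# Mini-batch gradient ascent on the ELBO with a standard optimizer (Adam here);
# a synthetic mini-batch stands in for a real corpus loader.
model = TimeAwareTopicModel(vocab_size=1000, n_topics=20, n_timestamps=10, embed_dim=50)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
bow = torch.randint(0, 3, (64, 1000)).float()   # 64 documents as word-count vectors
t = torch.randint(0, 10, (64,))                  # their timestamp indices
for step in range(100):
    loss = -model.elbo(bow, t)   # maximizing the ELBO = minimizing its negation
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because the whole objective is a differentiable function of mini-batch tensors, any stochastic optimizer supplied by the framework can perform the gradient ascent, and only one mini-batch of documents needs to be held in memory at a time.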


Notes

  1. https://github.com/pytorch.

  2. http://taku910.github.io/mecab/.

  3. https://catalog.ldc.upenn.edu/LDC2005T16.

  4. http://dblp.uni-trier.de/xml/.

  5. https://www.kaggle.com/stackoverflow/rquestions.

  6. https://github.com/amueller/word_cloud.



Author information

Correspondence to Tomonari Masada.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Masada, T., Takasu, A. (2018). Mini-Batch Variational Inference for Time-Aware Topic Modeling. In: Geng, X., Kang, BH. (eds) PRICAI 2018: Trends in Artificial Intelligence. PRICAI 2018. Lecture Notes in Computer Science (LNAI), vol. 11013. Springer, Cham. https://doi.org/10.1007/978-3-319-97310-4_18


  • DOI: https://doi.org/10.1007/978-3-319-97310-4_18


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-97309-8

  • Online ISBN: 978-3-319-97310-4

