
Evaluating an Opinion Annotation Scheme Using a New Multi-Perspective Question and Answer Corpus

Chapter in: Computing Attitude and Affect in Text: Theory and Applications

Part of the book series: The Information Retrieval Series (INRE, volume 20)

Abstract

In recent work, Wiebe et al. (2003) propose a semantic representation for encoding the opinions and perspectives expressed at any given point in a text. This chapter evaluates that opinion annotation scheme for multi-perspective vs. fact-based question answering using a new question and answer corpus.


Bibliography

  • Cardie, C., Ng, V., Pierce, D., and Buckley, C. (2000) Examining the role of statistical and linguistic knowledge sources in a general-knowledge question-answering system. In Proceedings of the Sixth Applied Natural Language Processing Conference. 180–187.

  • Cardie, C., Wiebe, J., Wilson, T., and Litman, D. (2003) Combining low-level and summary representations of opinions for multi-perspective question answering. Working Notes of the 2003 AAAI Spring Symposium on New Directions in Question Answering.

  • Harabagiu, S., Moldovan, D., Pasca, M., Surdeanu, M., Mihalcea, R., Girju, R., Rus, V., Lacatusu, F., Morarescu, P., and Bunescu, R. (2001) Answering complex, list and context questions with LCC's question-answering server. In Voorhees, E. and Harman, D. K. (Eds.) Proceedings of the Tenth Text REtrieval Conference (TREC 2001). 355–362.

  • Moldovan, D., Harabagiu, S., Pasca, M., Mihalcea, R., Girju, R., Goodrum, R., and Rus, V. (1999) Lasso: A tool for surfing the answer net. In Voorhees, E. and Harman, D. K. (Eds.) Proceedings of the Eighth Text REtrieval Conference (TREC-8).

  • Moldovan, D., Harabagiu, S., Girju, R., Morarescu, P., Lacatusu, F., Novischi, A., Badulescu, A., and Bolohan, O. (2002) LCC tools for question answering. In Voorhees, E. and Buckland, L. P. (Eds.) Proceedings of the Eleventh Text REtrieval Conference (TREC 2002). 79–89.

  • Pasca, M. and Harabagiu, S. (2000) High performance question/answering. In Proceedings of the 38th Annual Meeting of the Association for Computational Linguistics (ACL-2000). 563–570.

  • Quirk, R., Greenbaum, S., Leech, G., and Svartvik, J. (1985) A comprehensive grammar of the English language. New York: Longman.

  • Salton, G. (Ed.) (1971) The SMART Retrieval System — Experiments in Automatic Document Processing. Englewood Cliffs, NJ: Prentice-Hall.

  • Voorhees, E. and Tice, D. (1999) The TREC-8 question answering track evaluation. In Voorhees, E. and Harman, D. K. (Eds.) Proceedings of the Eighth Text REtrieval Conference (TREC-8). 83–105.

  • Voorhees, E. (2000) The TREC-9 question answering track evaluation. In Voorhees, E. and Harman, D. K. (Eds.) Proceedings of the Ninth Text REtrieval Conference (TREC-9). 71–81.

  • Voorhees, E. (2001) Overview of the TREC 2001 question answering track. In Voorhees, E. and Harman, D. K. (Eds.) Proceedings of the Tenth Text REtrieval Conference (TREC 2001). 42–52.

  • Voorhees, E. (2002) Overview of the TREC 2002 question answering track. In Voorhees, E. and Buckland, L. P. (Eds.) Proceedings of the Eleventh Text REtrieval Conference (TREC 2002). 53–75.

  • Wiebe, J., Breck, E., Buckley, C., Cardie, C., Davis, P., Fraser, B., Litman, D., Pierce, D., Riloff, E., Wilson, T., Day, D., and Maybury, M. (2003) Recognizing and organizing opinions expressed in the world press. Working Notes of the 2003 AAAI Spring Symposium on New Directions in Question Answering.

  • Wiebe, J. (2002) Instructions for annotating opinions in newspaper articles. Technical Report TR-02-101, Department of Computer Science, University of Pittsburgh, Pittsburgh, PA.

  • Wilson, T. and Wiebe, J. (2003) Annotating opinions in the world press. In Proceedings of the 4th SIGdial Workshop on Discourse and Dialogue (SIGdial-03).

Copyright information

© 2006 Springer

About this chapter

Cite this chapter

Stoyanov, V., Cardie, C., Litman, D., Wiebe, J. (2006). Evaluating an Opinion Annotation Scheme Using a New Multi-Perspective Question and Answer Corpus. In: Shanahan, J.G., Qu, Y., Wiebe, J. (eds) Computing Attitude and Affect in Text: Theory and Applications. The Information Retrieval Series, vol 20. Springer, Dordrecht. https://doi.org/10.1007/1-4020-4102-0_8

  • DOI: https://doi.org/10.1007/1-4020-4102-0_8

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-1-4020-4026-9

  • Online ISBN: 978-1-4020-4102-0

  • eBook Packages: Computer Science, Computer Science (R0)
