
Development and Challenges of Crowdsourcing Quality of Experience Evaluation for Multimedia

  • Conference paper
Big Data Computing and Communications (BigCom 2015)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9196)


Abstract

Crowdsourcing quality of experience (QoE) evaluation for multimedia is more cost-effective and flexible than traditional in-lab evaluation, and it has gradually attracted extensive attention. In this paper, we start from the concept, characteristics, and challenges of crowdsourcing QoE evaluation for multimedia, and then summarize current research progress, including some key technologies in a crowdsourceable QoE evaluation framework. Finally, we point out the open research problems to be solved and the future trends.
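One building block common to the crowdsourceable QoE frameworks surveyed here is screening out unreliable workers before aggregating scores. The sketch below is a hypothetical illustration, not the framework described in this paper: workers rate stimuli on a 5-point mean opinion score (MOS) scale, workers who fail hidden "gold standard" checks (stimuli with a known expected score) are discarded, and per-stimulus MOS is computed from the remaining ratings. All names, data, and the tolerance threshold are invented for illustration.

```python
# Minimal sketch of a gold-standard screening step for crowdsourced
# MOS ratings. Assumptions: 5-point ratings, one expected score per
# gold stimulus, and a fixed deviation tolerance for reliability.

from collections import defaultdict

def screen_and_score(ratings, gold, tolerance=1):
    """ratings: list of (worker, stimulus, score in 1..5);
    gold: dict mapping hidden test stimuli to their expected score.
    Returns per-stimulus MOS over ratings from reliable workers."""
    # A worker is unreliable if any gold rating deviates from the
    # expected score by more than `tolerance`.
    unreliable = set()
    for worker, stim, score in ratings:
        if stim in gold and abs(score - gold[stim]) > tolerance:
            unreliable.add(worker)
    # Aggregate the surviving ratings into per-stimulus MOS,
    # excluding the gold stimuli themselves.
    sums, counts = defaultdict(float), defaultdict(int)
    for worker, stim, score in ratings:
        if worker in unreliable or stim in gold:
            continue
        sums[stim] += score
        counts[stim] += 1
    return {s: sums[s] / counts[s] for s in sums}

ratings = [
    ("w1", "clipA", 4), ("w1", "clipB", 2), ("w1", "gold_hi", 5),
    ("w2", "clipA", 5), ("w2", "clipB", 1), ("w2", "gold_hi", 5),
    ("w3", "clipA", 1), ("w3", "clipB", 5), ("w3", "gold_hi", 1),  # fails gold
]
mos = screen_and_score(ratings, {"gold_hi": 5})  # w3 is filtered out
```

In practice, surveyed frameworks combine such content-based checks with reputation systems, consistency tests, and monetary incentive design; this sketch shows only the simplest screening idea.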

References

  1. Chen, K.-T., Wu, C.-C., Chang, Y.-C., Lei, C.-L.: A crowdsourceable QoE evaluation framework for multimedia content. In: Proceedings of the 17th ACM International Conference on Multimedia, pp. 491–500 (2009)

  2. Jain, R.: Quality of experience. IEEE Multimedia 11(1), 95–96 (2004)

  3. Wang, H., Kwong, S., Kok, C.-W.: An efficient mode decision algorithm for H.264/AVC encoding optimization. IEEE Transactions on Multimedia 9(4), 882–888 (2007)

  4. Mushtaq, M.S., Augustin, B., Mellouk, A.: Crowd-sourcing framework to assess QoE. In: 2014 IEEE International Conference on Communications (ICC), pp. 1705–1710 (2014)

  5. Anegekuh, L., Sun, L., Ifeachor, E.: A screening methodology for crowdsourcing video QoE evaluation. In: 2014 IEEE Global Communications Conference (GLOBECOM), pp. 1152–1157 (2014)

  6. Methods for subjective determination of transmission quality. ITU-T Recommendation P.800 (1996)

  7. Keimel, C., Habigt, J., Diepold, K.: Challenges in crowd-based video quality assessment. In: 2012 Fourth International Workshop on Quality of Multimedia Experience (QoMEX), pp. 13–18 (2012)

  8. Hossfeld, T., Keimel, C., Timmerer, C.: Crowdsourcing quality-of-experience assessments. IEEE Computer Society (2014)

  9. Hossfeld, T., Keimel, C., Hirth, M., Gardlo, B., Habigt, J., Diepold, K., Tran-Gia, P.: Best practices for QoE crowdtesting: QoE assessment with crowdsourcing. IEEE Transactions on Multimedia 16(2), 541–555 (2014)

  10. Hossfeld, T., et al.: Quantification of YouTube QoE via crowdsourcing. In: 2011 IEEE International Symposium on Multimedia, pp. 494–499 (2011)

  11. Schulze, T., Seedorf, S., Geiger, D., Kaufmann, N., Schader, M.: Exploring task properties in crowdsourcing: an empirical study on Mechanical Turk. In: ECIS 2011 Proceedings (2011)

  12. Faradani, S., Hartmann, B., Ipeirotis, P.G.: What's the right price? Pricing tasks for finishing on time. In: Workshops at the Twenty-Fifth AAAI Conference on Artificial Intelligence, August 2011

  13. Wu, C.C., Chen, K.T., Chang, Y.C., Lei, C.L.: Crowdsourcing multimedia QoE evaluation: a trusted framework. IEEE Transactions on Multimedia 15(5), 1121–1136 (2013)

  14. Resnick, P., Kuwabara, K., Zeckhauser, R., Friedman, E.: Reputation systems. Commun. ACM 43(12), 45–48 (2000)

  15. Shaw, A.D., Horton, J.J., Chen, D.L.: Designing incentives for inexpert human raters. In: Proc. ACM 2011 Conf. Computer Supported Cooperative Work, CSCW 2011, New York, NY, USA, pp. 275–284 (2011)

  16. David, H.A.: The Method of Paired Comparisons, 2nd edn. Hodder Arnold, London (1988). ISBN 0852642903

  17. Yen, Y.-C., Chu, C.-Y., Yeh, S.-L., Chu, H.-H., Huang, P.: Lab experiment vs. crowdsourcing: a comparative user study on skype call quality. In: AINTEC 2013, Bangkok, Thailand (2013)

  18. Hossfeld, T., Hirth, M., Korshunov, P., et al.: Survey of web-based crowdsourcing frameworks for subjective quality assessment. In: IEEE 16th International Workshop on Multimedia Signal Processing (MMSP) (2014)

  19. Nowak, S., et al.: How reliable are annotations via crowdsourcing: a study about inter-annotator agreement for multi-label image annotation. In: Proceedings of the International Conference on Multimedia Information Retrieval, New York, USA, pp. 557–566 (2010)

  20. Su, H., et al.: Crowdsourcing annotations for visual object detection. In: Workshops at the Twenty-Sixth AAAI Conference on Artificial Intelligence (2012)

  21. Wu, S.Y., et al.: Video summarization via crowdsourcing. In: CHI 2011 Extended Abstracts on Human Factors in Computing Systems, New York, USA, pp. 1531–1536 (2011)

  22. Tang, A., Boring, S.: EpicPlay: crowdsourcing sports video highlights. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, USA, pp. 1569–1572 (2012)

  23. Ma, H., Liu, Y.: Correlation based video processing in video sensor networks. In: IEEE International Conference on Wireless Networks, Communications and Mobile Computing, pp. 987–992 (2005)

  24. Wang, H., Kwong, S.: Rate-Distortion optimization of rate control for H.264 with adaptive initial quantization parameter determination. IEEE Transactions on Circuits and Systems for Video Technology 18(1), 140–144 (2008)


Author information

Corresponding author

Correspondence to Dan Tao.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Wang, Z., Tao, D., Liu, P. (2015). Development and Challenges of Crowdsourcing Quality of Experience Evaluation for Multimedia. In: Wang, Y., Xiong, H., Argamon, S., Li, X., Li, J. (eds) Big Data Computing and Communications. BigCom 2015. Lecture Notes in Computer Science, vol 9196. Springer, Cham. https://doi.org/10.1007/978-3-319-22047-5_36

  • DOI: https://doi.org/10.1007/978-3-319-22047-5_36

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-22046-8

  • Online ISBN: 978-3-319-22047-5

  • eBook Packages: Computer Science (R0)
