
The contrast effect: QoE of mixed video-qualities at the same time

  • Research Article
  • Published in: Quality and User Experience

Abstract

In desktop multi-party video-conferencing, the video streams of the participants are delivered in different qualities, but we know little about how such a composition of the screen affects the quality of experience. Do the different video streams serve as indirect quality references, so that the perceived quality of each stream depends on the other streams in the same session? How do the perceived qualities of the individual streams relate to the perceived quality of the overall session? To answer these questions, we conducted a crowdsourcing study in which we gathered over 5000 perceived quality ratings of overall sessions and individual streams. Our results show a contrast effect: high-quality streams are rated better when more low-quality streams are co-present, and vice versa. In turn, the perceived quality of an overall session can increase significantly when one low-quality stream is exchanged for a high-quality one. Comparing the means of individual and overall ratings, we further observe that users are more critical when asked to rate individual streams than when asked for an overall rating. However, the results show that while the contrast effect exists, it is not strong enough to make lowering the quality of other participants' streams a viable way of optimizing the experience.


Notes

  1. www.microworkers.com.

  2. www.mturk.com.

  3. www.ffmpeg.org.

  4. www.gstreamer.org.

  5. www.microworkers.com.


Author information

Corresponding author

Correspondence to Marwin Schmitt.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

Fig. 13 Line plot comparing the individual quality ratings from the campaigns 'both' and 'individual'

Fig. 14 Mean of ratings per Internet connection type with 95% confidence intervals

Fig. 15 Mean of ratings per Internet connection speed with 95% confidence intervals

See Figs. 13, 14 and 15.
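Figures 14 and 15 report mean ratings per Internet connection type and speed together with 95% confidence intervals. As a minimal sketch of how such per-group means and intervals could be computed (assuming a percentile bootstrap over hypothetical 5-point ratings grouped by connection type; the study's actual estimator and data are not included on this page):

import numpy as np

def bootstrap_mean_ci(ratings, n_boot=10_000, alpha=0.05, rng=None):
    # Mean rating with a percentile-bootstrap (1 - alpha) confidence interval.
    rng = np.random.default_rng(0) if rng is None else rng
    ratings = np.asarray(ratings, dtype=float)
    boot_means = np.array([
        rng.choice(ratings, size=ratings.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return ratings.mean(), lo, hi

# Hypothetical ratings on a 5-point scale, grouped by connection type
# (labels and values below are illustrative, not the study's data).
groups = {
    "DSL":   [4, 3, 4, 5, 3, 4, 4, 2, 5, 4],
    "Cable": [3, 4, 4, 4, 5, 3, 4, 4, 3, 5],
    "Fibre": [5, 4, 5, 4, 4, 5, 3, 5, 4, 4],
}

for name, ratings in groups.items():
    mean, lo, hi = bootstrap_mean_ci(ratings)
    print(f"{name}: mean = {mean:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")

The error bars in Figs. 14 and 15 correspond to intervals of this kind; the exact computation used in the paper may differ.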

About this article


Cite this article

Schmitt, M., Bulterman, D.C.A. & Cesar, P.S. The contrast effect: QoE of mixed video-qualities at the same time. Qual User Exp 3, 7 (2018). https://doi.org/10.1007/s41233-018-0020-2

