Impact of Referral Incentives on Mobile App Reviews

  • Noor Abu-El-Rub (Email author)
  • Amanda Minnich
  • Abdullah Mueen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10360)

Abstract

Product owners occasionally provide referral incentives to customers (e.g., coupons, bonus points, referral rewards). However, clever customers can post their referral codes on online review pages to maximize these incentives. While such reviews benefit both their writers and the product owners, their core motivation is monetary rather than helping potential customers. In this paper, we analyze referral reviews in the Google Play store and identify groups of users that consistently take part in writing such abusive reviews. We further explore how referral reviews help mobile apps gain popularity compared to apps that do not provide incentives. We also find an increasing trend in the number of apps targeted by abusers, which, if it continues, will turn review systems into crowd-advertising platforms rather than an unbiased source of helpful information.
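
To make the idea of referral-code abuse concrete, the following Python sketch shows one possible way to flag reviews that embed referral codes and to count how many apps each pair of reviewers jointly targets with such reviews. The regex, field names, and helper functions are illustrative assumptions for this summary, not the detection pipeline used in the paper.

# Minimal sketch (not the authors' pipeline): flag reviews that embed
# referral/promo codes and count how many distinct apps each pair of
# reviewers has jointly targeted with such reviews. The regex and the
# (user_id, app_id, review_text) record layout are assumptions.
import re
from collections import defaultdict
from itertools import combinations

# Heuristic pattern for referral-code mentions in review text,
# e.g. "use my referral code ABC123"; real codes vary by app.
REFERRAL_PATTERN = re.compile(
    r"\b(?:referral|invite|promo)\s+code\b[:\s]*([A-Z0-9]{4,12})",
    re.IGNORECASE,
)

def extract_referral_code(review_text):
    """Return the embedded code if the review advertises one, else None."""
    match = REFERRAL_PATTERN.search(review_text)
    return match.group(1).upper() if match else None

def co_review_pairs(reviews):
    """
    Given (user_id, app_id, review_text) tuples, count how many distinct
    apps each pair of users has targeted with referral reviews. Pairs with
    a high count are candidates for consistently abusive groups.
    """
    apps_per_user = defaultdict(set)
    for user_id, app_id, text in reviews:
        if extract_referral_code(text):
            apps_per_user[user_id].add(app_id)

    pair_counts = {}
    for u, v in combinations(sorted(apps_per_user), 2):
        shared = len(apps_per_user[u] & apps_per_user[v])
        if shared:
            pair_counts[(u, v)] = shared
    return pair_counts

if __name__ == "__main__":
    sample = [
        ("u1", "appA", "Great game! Use my referral code XYZ99 for bonus points."),
        ("u2", "appA", "Enter invite code XYZ77 to get free coins."),
        ("u1", "appB", "Promo code: XYZ99 gives extra rewards."),
        ("u2", "appB", "Invite code XYZ77, thank me later."),
        ("u3", "appA", "Honest review: the app crashes often."),
    ]
    print(co_review_pairs(sample))  # {('u1', 'u2'): 2}

Pairs (or larger groups) of users sharing many jointly targeted apps are natural candidates for the clique-style group analysis referred to in the keywords below.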

Keywords

Online reviews · Advertisement · Referrals · Google Play · Abuse · Cliques · Incentives

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Noor Abu-El-Rub (1, Email author)
  • Amanda Minnich (1)
  • Abdullah Mueen (1)

  1. University of New Mexico, Albuquerque, USA
