Indexing with Crowds

Synonyms

Crowdsourced indexing; Indexing techniques

Definition

Humans outperform computers at certain tasks, e.g., understanding images and videos or estimating the prices of antique items, and at tasks that demand subjective judgement, e.g., ranking flowers or butterflies in terms of beauty, wilderness, or smell. Using crowdsourcing, one can delegate these tasks to humans. However, with large numbers of tasks, costs and response times can become large. Database systems typically use a wide variety of indexing techniques to speed up the search and manipulation (e.g., update, insert, and delete) of large sets of data items. The question is: are database indexing techniques applicable when humans are in the loop for tasks that demand subjective judgement? The answer is in the affirmative. Indeed, several applications integrate database indexing with crowdsourcing to reduce the overall task cost while still improving the quality of task...
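
The sketch below illustrates the idea in its simplest form: an ordered index whose comparisons are answered by paid crowd workers rather than by a comparator. It is a minimal illustration, not the palm-tree index of [11], and the ask_crowd function is a hypothetical stand-in (simulated here with random votes) for posting a pairwise-comparison task to a crowdsourcing platform and aggregating the workers' answers. The point it makes is the one motivated above: because each comparison costs money and time, binary search over the index places a new item with O(log n) crowd questions instead of the O(n) questions a linear scan would require.

import random

def ask_crowd(item_a, item_b, num_workers=3):
    """Hypothetical crowd call: return True if a majority of workers judge
    item_a to rank before item_b (e.g., "Is photo A more beautiful than
    photo B?"). Simulated with random votes; a real system would post the
    question to a crowdsourcing platform and aggregate the answers."""
    votes = [random.random() < 0.5 for _ in range(num_workers)]
    return 2 * sum(votes) > num_workers

class CrowdIndex:
    """Ordered index over items ranked by subjective human judgement."""

    def __init__(self):
        self.items = []           # kept in crowd-defined order
        self.questions_asked = 0  # number of crowd tasks issued (proxy for cost)

    def insert(self, item):
        # Binary search for the insertion position; each probe is one crowd task.
        lo, hi = 0, len(self.items)
        while lo < hi:
            mid = (lo + hi) // 2
            self.questions_asked += 1
            if ask_crowd(item, self.items[mid]):
                hi = mid          # crowd says item ranks before items[mid]
            else:
                lo = mid + 1
        self.items.insert(lo, item)

# Indexing 8 photos by perceived beauty costs about 3 crowd questions per
# insert rather than the up-to-8 a linear scan would need.
index = CrowdIndex()
for photo in (f"photo_{i}" for i in range(8)):
    index.insert(photo)
print(index.items, index.questions_asked)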

Recommended Reading

  1. Chow C-Y, Mokbel MF, Aref WG. Casper*: query processing for location services without compromising privacy. ACM Trans Database Syst. 2009;34(4):Article 24.

  2. Das Sarma A, Parameswaran A, Garcia-Molina H, Halevy A. Crowd-powered find algorithms. In: Proceedings of the 30th International Conference on Data Engineering; 2014. p. 964–75.

  3. Davidson SB, Khanna S, Milo T, Roy S. Using the crowd for top-k and group-by queries. In: Proceedings of the 16th International Conference on Database Theory; 2013. p. 225–36.

  4. Eltabakh MY, Aref WG, Elmagarmid AK, Silva YN, Ouzzani M. Supporting real-world activities in database management systems. In: Proceedings of the 26th International Conference on Data Engineering; 2010. p. 808–11.

  5. Eltabakh MY, Aref WG, Elmagarmid AK, Silva YN, Ouzzani M. HandsOn DB: managing data dependencies involving human actions. IEEE Trans Knowl Data Eng. 2013.

  6. Finin T, Murnane W, Karandikar A, Keller N, Martineau J, Dredze M. Annotating named entities in twitter data with crowdsourcing. In: Proceedings of the NAACL HLT 2010 Workshop on Creating Speech and Language Data with Amazon’s Mechanical Turk; 2010. p. 80–8.

  7. Franklin MJ, Kossmann D, Kraska T, Ramesh S, Xin R. CrowdDB: answering queries with crowdsourcing. In: Proceedings of the ACM SIGMOD International Conference on Management of Data; 2011. p. 61–72.

  8. Guo S, Parameswaran AG, Garcia-Molina H. So who won? Dynamic max discovery with the crowd. In: Proceedings of the ACM SIGMOD International Conference on Management of Data; 2012. p. 385–96.

  9. Kazemi L, Shahabi C. Geocrowd: enabling query answering with spatial crowdsourcing. In: Proceedings of the 20th SIGSPATIAL ACM International Symposium on Advances in Geographic Information Systems; 2012. p. 189–98.

  10. Lofi C, El Maarry K, Balke W-T. Skyline queries in crowd-enabled databases. In: Proceedings of the 16th International Conference on Extending Database Technology; 2013. p. 465–76.

  11. Mahmood AR, Aref WG, Dragut EC, Basalamah S. The palm-tree index: indexing with the crowd. In: Proceedings of the 1st VLDB Workshop on Databases and Crowdsourcing; 2013.

  12. Marcus A, Wu E, Karger D, Madden S, Miller R. Human-powered sorts and joins. Proc VLDB Endowment. 2011;5(1):13–24.

  13. Marcus A, Wu E, Karger DR, Madden S, Miller RC. Crowdsourced databases: query processing with people. In: Proceedings of the 5th Biennial Conference on Innovative Data Systems Research; 2011.

  14. Marcus A, Karger D, Madden S, Miller R, Oh S. Counting with the crowd. Proc VLDB Endowment. 2012;6(2):109–20.

  15. Nowak S, Rüger S. How reliable are annotations via crowdsourcing: a study about inter-annotator agreement for multi-label image annotation. In: Proceedings of the 11th International Conference on Multimedia Information Retrieval; 2010. p. 557–66.

  16. Park H, Pang R, Parameswaran AG, Garcia-Molina H, Polyzotis N, Widom J. Deco: a system for declarative crowdsourcing. Proc VLDB Endowment. 2012;5(12):1990–3.

  17. Roy SB, Lykourentzou I, Thirumuruganathan S, Amer-Yahia S, Das G, et al. Crowds, not drones: modeling human factors in interactive crowdsourcing. In: Proceedings of the 1st VLDB Workshop on Databases and Crowdsourcing; 2013. p. 39–42.

  18. Roy SB, Lykourentzou I, Thirumuruganathan S, Amer-Yahia S, Das G. Optimization in knowledge-intensive crowdsourcing. CoRR; 2014. abs/1401.1302.

  19. To H, Ghinita G, Shahabi C. A framework for protecting worker location privacy in spatial crowdsourcing. Proc VLDB Endowment. 2014;7:919–30.

  20. Venetis P, Garcia-Molina H, Huang K, Polyzotis N. Max algorithms in crowdsourcing environments. In: Proceedings of the 21st International World Wide Web Conference; 2012. p. 989–98.

  21. Vondrick C, Patterson D, Ramanan D. Efficiently scaling up crowdsourced video annotation. Int J Comput Vis. 2013;101(1):184–204.

Author information

Correspondence to Ahmed R. Mahmood.

Copyright information

© 2018 Springer Science+Business Media, LLC, part of Springer Nature

About this entry

Cite this entry

Mahmood, A.R., Aref, W.G., Basalamah, S. (2018). Indexing with Crowds. In: Liu, L., Özsu, M.T. (eds) Encyclopedia of Database Systems. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-8265-9_80656
