Synonyms
Crowdsourced indexing; Indexing techniques
Definition
Humans are better than computers at performing certain tasks, e.g., understanding images and videos or estimating the prices of antique items, as well as at tasks that demand subjective judgement, e.g., ranking flowers or butterflies in terms of beauty, wilderness, or smell. Using crowdsourcing, one can delegate such tasks to humans. However, with large numbers of tasks, costs and response times can become large. Database systems typically use a wide variety of indexing techniques to speed up the search and the manipulation (e.g., update, insert, and delete) of large sets of data items. The question is: are database indexing techniques applicable when humans are in the loop for such tasks that demand subjective judgement? The answer is in the affirmative. Indeed, several applications integrate database indexing with crowdsourcing to reduce the overall task cost while still improving the quality of the task results.
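As a minimal illustration of the idea (a hypothetical sketch, not code from this entry), the fragment below maintains a sorted, index-like structure where the comparison operator is answered by crowd workers. Insertion uses binary search, so only O(log n) crowd comparisons are issued per item instead of asking the crowd to rank the whole collection, and each comparison is replicated across k workers with majority voting to absorb noisy answers. The ask_crowd oracle is an assumed stand-in for posting a pairwise-comparison task to a platform such as Amazon's Mechanical Turk; here it is simulated with an 80%-reliable worker over a hidden score.

```python
import random

def ask_crowd(item_a, item_b):
    """Hypothetical crowd task: ask one worker whether item_a ranks
    below item_b (e.g., 'Which flower is more beautiful?').
    Simulated here as a noisy comparison of hidden scores."""
    truth = item_a["score"] < item_b["score"]
    return truth if random.random() < 0.8 else not truth  # 80%-reliable worker

def crowd_less_than(item_a, item_b, k=3):
    """Replicate the comparison across k workers and take a majority
    vote to reduce the effect of erroneous answers."""
    votes = sum(ask_crowd(item_a, item_b) for _ in range(k))
    return votes > k // 2

def crowd_insert(sorted_items, new_item, k=3):
    """Binary-search insertion: O(log n) crowd comparisons per item."""
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if crowd_less_than(new_item, sorted_items[mid], k):
            hi = mid
        else:
            lo = mid + 1
    sorted_items.insert(lo, new_item)

if __name__ == "__main__":
    # The hidden 'score' stands in for the subjective quality being judged.
    flowers = [{"name": f"flower{i}", "score": i} for i in range(8)]
    random.shuffle(flowers)
    index = []
    for f in flowers:
        crowd_insert(index, f)
    print([f["name"] for f in index])
```

The same logarithmic-comparison principle underlies crowd-aware index structures such as the palm-tree index of Mahmood et al. (see Recommended Reading), which batches and prices comparison tasks rather than simulating them as above.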
Recommended Reading
Chow C-Y, Mokbel MF, Aref WG. Casper*: query processing for location services without compromising privacy. ACM Trans Database Syst. 2009;34(4):Article 24.
Das Sarma A, Parameswaran A, Garcia-Molina H, Halevy A. Crowd-powered find algorithms. In: Proceedings of the 30th International Conference on Data Engineering; 2014. p. 964–75.
Davidson SB, Khanna S, Milo T, Roy S. Using the crowd for top-k and group-by queries. In: Proceedings of the 16th International Conference on Database Theory; 2013. p. 225–36.
Eltabakh MY, Aref WG, Elmagarmid AK, Silva YN, Ouzzani M. Supporting real-world activities in database management systems. In: Proceedings of the 26th International Conference on Data Engineering; 2010. p. 808–11.
Eltabakh MY, Aref WG, Elmagarmid AK, Silva YN, Ouzzani M. HandsOn DB: managing data dependencies involving human actions. IEEE Trans Knowl Data Eng. 2013.
Finin T, Murnane W, Karandikar A, Keller N, Martineau J, Dredze M. Annotating named entities in twitter data with crowdsourcing. In: Proceedings of the NAACL HLT 2010 Workshop on Creating Speech and Language Data with Amazon’s Mechanical Turk; 2010. p. 80–8.
Franklin MJ, Kossmann D, Kraska T, Ramesh S, Xin R. CrowdDB: answering queries with crowdsourcing. In: Proceedings of the ACM SIGMOD International Conference on Management of Data; 2011. p. 61–72.
Guo S, Parameswaran AG, Garcia-Molina H. So who won? Dynamic max discovery with the crowd. In: Proceedings of the ACM SIGMOD International Conference on Management of Data; 2012. p. 385–96.
Kazemi L, Shahabi C. Geocrowd: enabling query answering with spatial crowdsourcing. In: Proceedings of the 20th SIGSPATIAL ACM International Symposium on Advances in Geographic Information Systems; 2012. p. 189–98.
Lofi C, El Maarry K, Balke W-T. Skyline queries in crowd-enabled databases. In: Proceedings of the 16th International Conference on Extending Database Technology; 2013. p. 465–76.
Mahmood AR, Aref WG, Dragut EC, Basalamah S. The palm-tree index: indexing with the crowd. In: Proceedings of the 1st VLDB Workshop on Databases and Crowdsourcing; 2013.
Marcus A, Wu E, Karger D, Madden S, Miller R. Human-powered sorts and joins. Proc VLDB Endowment. 2011;5(1):13–24.
Marcus A, Wu E, Karger DR, Madden S, Miller RC. Crowdsourced databases: query processing with people. In: Proceedings of the 5th Biennial Conference on Innovative Data Systems Research; 2011.
Marcus A, Karger D, Madden S, Miller R, Oh S. Counting with the crowd. Proc VLDB Endowment. 2012;6(2):109–20.
Nowak S, Rüger S. How reliable are annotations via crowdsourcing: a study about inter-annotator agreement for multi-label image annotation. In: Proceedings of the 11th International Conference on Multimedia Information Retrieval; 2010. p. 557–66.
Park H, Pang R, Parameswaran AG, Garcia-Molina H, Polyzotis N, Widom J. Deco: a system for declarative crowdsourcing. Proc VLDB Endowment. 2012;5(12):1990–3.
Roy SB, Lykourentzou I, Thirumuruganathan S, Amer-Yahia S, Das G. Crowds, not drones: modeling human factors in interactive crowdsourcing. In: Proceedings of the 1st VLDB Workshop on Databases and Crowdsourcing; 2013. p. 39–42.
Roy SB, Lykourentzou I, Thirumuruganathan S, Amer-Yahia S, Das G. Optimization in knowledge-intensive crowdsourcing. CoRR; 2014. abs/1401.1302.
To H, Ghinita G, Shahabi C. A framework for protecting worker location privacy in spatial crowdsourcing. Proc VLDB Endowment. 2014;7:919–30.
Venetis P, Garcia-Molina H, Huang K, Polyzotis N. Max algorithms in crowdsourcing environments. In: Proceedings of the 21st International World Wide Web Conference; 2012. p. 989–98.
Vondrick C, Patterson D, Ramanan D. Efficiently scaling up crowdsourced video annotation. Int J Comput Vis. 2013;101(1):184–204.