An Innovative Crowdsourcing Approach for Amazon Mechanical Turk

Volume 52, Number 4

International Journal of Computer Applications © 2012 by IJCA Journal

Year of Publication: 2012

Authors: Hanieh Javadi Khasraghi, Shahriar Mohammadi

DOI: 10.5120/8190-1556

Abstract

Web 2.0 and the evolving vision of Web 3.0 have greatly facilitated information sharing, information aggregation, interoperability, user-centered design, collaboration on the World Wide Web, and crowd-centered services. This new conception of the Web is the intuition that drives crowdsourcing, crowd servicing, and crowd computing. With the emergence of crowdsourcing, people are motivated to work over the Internet without being limited by time or geographical location; employers, in turn, can have their jobs done faster and more cheaply. This paper introduces an innovative approach for the Amazon Mechanical Turk (AMT) crowdsourcing marketplace. In the current AMT marketplace, workers, especially new ones, must qualify themselves separately for each requester that has submitted Human Intelligence Tasks (HITs), and there is no shared reputation system. Some workers cheat on tasks to maximize their income; as a result, requesters are uncertain of the quality of results, so they offer lower rewards, and consequently qualified workers leave the marketplace. Because of these shortcomings, we introduce a new approach for the AMT crowdsourcing marketplace: we propose distributing HITs among Amazon's customers and asking them to work on tasks in exchange for discounts. The distribution of HITs is based on customers' interests and skills, information that Amazon already holds in its database. With our proposed approach, HITs are completed by more qualified people, and spamming is reduced to a minimum. The approach is efficient, time-saving, and user-friendly, because workers do not need to search for HITs matching their interests.
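The matching step the abstract describes, routing each HIT to the customer whose stored interests and skills best overlap the task's requirements, can be sketched as follows. This is a minimal illustration, not the paper's implementation; all names here (Hit, Customer, distribute_hits, the flat discount rate) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Hit:
    hit_id: str
    required_interests: set  # tags describing the skills/interests the HIT needs

@dataclass
class Customer:
    name: str
    interests: set           # interests Amazon is assumed to hold in its database
    discount: float = 0.0    # discount accumulated by completing HITs

def distribute_hits(hits, customers, discount_per_hit=0.05):
    """Assign each HIT to the customer whose interests overlap it most."""
    assignments = {}
    for hit in hits:
        # pick the customer with the largest interest overlap for this HIT
        best = max(customers, key=lambda c: len(c.interests & hit.required_interests))
        if best.interests & hit.required_interests:  # only assign on a real match
            assignments[hit.hit_id] = best.name
            best.discount += discount_per_hit       # reward: a purchase discount
    return assignments

hits = [Hit("h1", {"photography", "tagging"}),
        Hit("h2", {"translation", "french"})]
customers = [Customer("alice", {"photography", "books"}),
             Customer("bob", {"french", "translation", "music"})]
print(distribute_hits(hits, customers))  # {'h1': 'alice', 'h2': 'bob'}
```

A greedy best-overlap match is the simplest possible policy; a real deployment would also need to balance load across customers and weight interests by strength, but the sketch captures the core idea of interest-driven HIT routing rewarded with discounts.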


Index Terms: Algorithms

Keywords

Crowdsourcing, Amazon Mechanical Turk (AMT), Human Intelligence Tasks (HITs), Classifying, Distributing
