Introduction

Crowdsourcing, the practice of obtaining services, ideas, or content from a large, often self-defined group, has recently gained considerable attention as a way to organize anything from innovation processes to crisis management. Private companies and non-governmental organizations (NGOs) are utilizing crowdsourcing technologies to collect multiple solutions [26] and to distribute pieces of work to crowds of workers [20]. Citizens are engaged in more participatory and open government [14], for example through collaborative policymaking [2] and participatory budgeting [18]. Governments are using wiki technologies to enhance collective intelligence [4] and to source government information [12]. Citizen science engages the public in collecting or improving research data [5, 19, 20, 11, 32] and in participating in analysis [6, 7, 2]. Recent natural catastrophes have raised interest in involving crowds of civilians in data sourcing [15, 23, 31] as well as in performing physical activities during crisis situations [21].

The development of these new types of working relationships has also been problematized for several reasons. For example, Irani and Silberman [16] have questioned crowdworker dynamics from a labor rights perspective, leading to calls for collective action by crowd workers [27]. Martin et al.'s [22] analysis of online discussions in the community of workers on Amazon Mechanical Turk shows the tensions between the dividing logic of the system and the information-sharing processes in the community. Gupta et al.'s [13] study of Indian workers shows how English and computer literacy, as well as the availability of technological resources, are taken for granted by platform designers and users, which can result in workers losing out on work or acquiring a bad reputation. Other conflicts reported by research subjects concern rejected work, slow or unfair payment, lack of transparency, and technical problems [30]. Wages in particular are a source of conflict among “web workers” [3]. Some design research also suggests means to empower crowdworkers and make them more visible [17].

Digital differentiation and inequalities within the crowd become problematic when crowdsourcing is used for democratic purposes [14]. Studies of Amazon Mechanical Turk [11], Wikipedia [25], and Twitter [9] indicate a lack of representativeness in terms of age, gender, and education. According to Menking and Erickson [24], women face marked obstacles to effective participation in Wikipedia. Cultural geographers have also pointed out the hegemonic discourses and socio-spatial relations in the geographic web [8, 31, 28, 33].

Research on conflicts within online groups tends to focus on communities such as open source development [10], professional groups, or learning processes [1]. However, there is a need for more empirical research on the conflicts and dynamics within the more fluid work relations of crowd work.

This workshop aims to bring together researchers from different disciplines in an effort to critically explore the conflicts and tensions in the field of crowd work. These can, for example, concern:

  • The negotiation of order that takes place in the social interaction in the crowd setting, such as conflicts between informal group interactions within the crowdsourcing initiative and the technical roles afforded by the platform.
  • The performance of identity across distributed contexts and cultures, for example how new types of labor roles are enacted in self-presentations and communication practices.
  • The symbolic representation of places and identities, such as the unequal representation of place in geo-mapping contexts.
  • Conflicts due to segregation, where crowd work undermines established divisions of labor and creates new ones, e.g. [16].

We invite both empirical and theoretical work, position papers, and works in progress. We encourage a mix of methods and theoretical frameworks for examining conflicts and contradictions in crowdsourcing contexts from different perspectives. In this one-day workshop, we will explore these topics through interactive mini flash presentations and brainstorming sessions. The workshop will result in a research agenda in which we, as part of the CHI community, present a framework for addressing these questions.

The workshop builds on four earlier successful workshops: “Back to the Future of Organizational Work: Crowdsourcing Digital Work Marketplaces” and “Structures for Knowledge Co-creation between Organizations and the Public” hosted at ACM CSCW 2014, “The Morphing Organization – Rethinking Groupwork Systems in the Era of Crowdwork” hosted at ACM GROUP 2014, and “Examining the Essence of the Crowds: Motivations, Roles and Identities” at ECSCW 2015.

References

  1. Aleksi Aaltonen and Jannis Kallinikos. 2012. Coordination and Learning In Wikipedia: Revisiting the Dynamics of Exploitation and Exploration. In M. Holmqvist & A. Spicer (Eds.), Managing “Human Resources” by Exploiting and Exploring People’s Potentials Research in the Sociology of Organizations (Vol. 37, pp. 161–192). Emerald Group Publishing Limited.
  2. Tanja Aitamurto and Hélène Landemore. 2015. Five design principles for crowdsourced policymaking: Assessing the case of crowdsourced off-road traffic law in Finland. Journal of Social Media for Organizations 2(1).
  3. Ben B. Bederson and Alexander J. Quinn. 2011. Web Workers Unite! Addressing Challenges of Online Laborers. Human Factors, 97–105.
  4. Eli Ben and Jim Hutchins. 2010. Intelligence after Intellipedia: Improving the push-pull balance with a social networking utility. Research Report in Information Science. Technology Directorate, Defense Technical Information Center, February 2010.
  5. William T. Causer and Valerie Wallace. 2012. Building a volunteer community: results and findings from Transcribe Bentham. Digital Humanities Quarterly 6.
  6. Seth Cooper. 2014. A framework for scientific discovery through video games. ACM and Morgan & Claypool, New York, NY, USA.
  7. Seth Cooper, Firas Khatib, Ilya Makedon, Hao Lu, Janos Barbero, David Baker, James Fogarty, Zoran Popović, and Foldit players. 2011. Analysis of social gameplay macros in the Foldit cookbook. FDG 2011, 9-14.
  8. Jeremy W. Crampton, Mark Graham, Ate Poorthuis, Taylor Shelton, Monica Stephens, Matthew W. Wilson, and Matthew Zook. 2013. Beyond the geotag: situating “big data” and leveraging the potential of the geoweb. Cartography and Geographic Information Science, 40(2), 130–139.
  9. Maeve Duggan, Nicole B. Ellison, Cliff Lampe, Amanda L. Lenhart, and Mary Madden. 2015. Demographics of Key Social Networking Platforms. Pew Research Center. Retrieved July 8, 2015, from http://www.pewinternet.org/2015/01/09/demographics-of-key-social-networking-platforms-2/
  10. Anna Filippova and Hichang Cho. 2015. Mudslinging and Manners. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW '15), 1393–1403. New York, NY, USA: ACM Press.
  11. Karën Fort, Gilles Adda, and K. Bretonnel Cohen. 2011. Amazon Mechanical Turk: Gold Mine or Coal Mine? Computational Linguistics 37(2), 413-420.
  12. Toby Fyfe and Paul Crookall. 2010. Social media and public sector policy dilemmas. Toronto: Institute of Public Administration of Canada.
  13. Neha Gupta, David Martin, Benjamin V. Hanrahan, and Jackie O’Neill. 2014. Turk-life in India. In Proceedings of the 18th International Conference on Supporting Group Work, 1-11.
  14. Karin Hansson, Kheira Belkacem and Love Ekenberg. 2014. Open government and democracy: A research review. Social Science Computer Review (December).
  15. Amanda L. Hughes, Lisa A. St. Denis, Leysia Palen, and Ken M. Anderson. 2014. Online public communications by police and fire services during 2012 Hurricane Sandy. CHI 2014, 1505-1514.
  16. Lilly Irani and M. Six Silberman. 2014. From critical design to critical infrastructure: Lessons from turkopticon. interactions 21(4), 32-35.
  17. Lilly Irani and M. Six Silberman. 2013. Turkopticon: Interrupting worker invisibility in Amazon Mechanical Turk. Proceedings of the SIGCHI Conference, 611–620.
  18. Jyldyz Kasymova. 2013. Reforming local government in developing countries: Implementation of a participatory budgeting process in Kyrgyzstan. The State University of New Jersey.
  19. Ece Kamar, Severin Hacker, and Eric Horvitz. 2012. Combining human and machine intelligence in large-scale crowdsourcing. In Proceedings of the 11th International Conference on Autonomous Agents and Multiagent Systems - Volume 1 (pp. 467-474). International Foundation for Autonomous Agents and Multiagent Systems.
  20. Aniket Kittur, Jeffrey V. Nickerson, Michael Bernstein, Elizabeth Gerber, Aaron Shaw, John Zimmerman, Matt Lease, and John Horton. 2013. The future of crowd work. Proceedings of the 2013 Conference on Computer Supported Cooperative Work. ACM.
  21. Thomas Ludwig, Christian Reuter, Tim Siebigteroth, and Volkmar Pipek. 2015. CrowdMonitor: Mobile Crowd Sensing for Assessing Physical and Digital Activities of Citizens during Emergencies. In Proceedings of the 33rd International Conference on Human Factors in Computing Systems (CHI ’15), Seoul, South Korea. ACM Press.
  22. David Martin, Benjamin V. Hanrahan, Jackie O’Neill, and Neha Gupta. 2014. Being a turker. Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW '14), 224–235.
  23. Jim McKay. 2014. How Sandy changed social media strategies in New York City. Emergency Management. Retrieved February 21, 2014, from http://www.emergencymgmt.com/disaster/Sandy-Social-Media-Strategies-New-York-City.html
  24. Amanda Menking and Ingrid Erickson. 2015. The heartwork of Wikipedia: Gendered, emotional labor in the world’s largest online encyclopedia. CHI 2015, 207-210.
  25. Felipe Ortega, Jesus M Gonzalez-Barahona and Gregorio Robles. 2008. On the inequality of contributions to Wikipedia. In Proceedings of the 41st Annual Hawaii International Conference on System Sciences (HICSS 2008). Waikoloa, HI.  
  26. Daniela Retelny, Sebastien Robaszkiewicz, Alexandra To, Walter Lasecki, Jay Patel, Negar Rahmati, Tulsee Doshi, Melissa Valentine, and Michael S. Bernstein. 2014. Expert crowdsourcing with flash teams. Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology. ACM.
  27. Niloufar Salehi, Lilly C. Irani, Michael S. Bernstein, Ali Alkhatib, Eva Ogbe, Kristy Milland, and Clickhappier. 2015. We are Dynamo: Overcoming stalling and friction in collective action for crowd workers. CHI 2015, 1621-1630.
  28. Taylor Shelton, Ate Poorthuis, Mark Graham, and Matthew Zook. 2014. Mapping the data shadows of Hurricane Sandy: Uncovering the sociospatial dimensions of “Big Data.” Geoforum, Forthcoming.
  29. Tomer Simon, Avishay Goldberg, and Bruria Adini. 2015. Socializing in emergencies — A review of the use of social media in emergency situations. J. Info. Mgmt. 35(5), 609-619.
  30. M. Six Silberman, Joel Ross, Lilly Irani, and Bill Tomlinson. 2010. Sellers’ problems in human computation markets. In Proceedings of the ACM SIGKDD Workshop on Human Computation – HCOMP ’10 (p. 18). New York, NY, USA: ACM.
  31. Robert Soden and Leysia Palen. 2014. From Crowdsourced Mapping to Community Mapping: The Post-Earthquake Work of OpenStreetMap Haiti. In 11th International Conference on the Design of Cooperative Systems.
  32. Andrea Wiggins and Kevin Crowston. 2012. Goals and tasks: Two typologies of citizen science projects. HICSS 2012, 3426-3435.
  33. Matthew Zook, Mark Graham, and Andrew Boulton. 2015. Crowd-Sourced Augmented Realities: Social Media and the Power of Digital Representation. In S. P. Mains, J. Cupples, and C. Lukinbeal (Eds.), Mediated Geographies and Geographies of Media. Springer.