Crowded: Digital Piecework and the Politics of Platform Responsibility in Precarious Times


Are you a crowdworker willing to be interviewed or take a survey? We’re looking to hear from you! Contact us at: msrcrowd@microsoft.com

We speak Urdu

We speak Kannada


Project Description

The hope that technological innovations will create good, middle-class jobs is part of government and city policies around the world, from New York to New Delhi. At the same time, machine learning and crowdsourcing are heralded as the future; as Kittur et al. (2013) put it, “Paid crowd work offers remarkable opportunities for improving productivity, social mobility, and the global economy… But it is also possible that crowd work will fail to achieve its potential, focusing on assembly-line piecework.” This project explores that tension.

Crowded: Digital Piecework and the Politics of Platform Responsibility in Precarious Times looks at crowdsourcing as a focal point for many of the issues raised by the structure of our current information economy: economic value, cultural meaning, and ethics. Crowdsourcing has become fundamental to the everyday workings of many popular internet sites and, increasingly, offline workflows. Travel sites rely on individual reviews to attract visitors; doctors rely on inexpensive transcription services posted, completed, and returned entirely online; busy professionals hire local TaskRabbits to pick up dry cleaning or put together Ikea furniture. Much of the discussion focuses, on the one hand, on the strength of supercomputer clusters to drive automation and, on the other, on the power of crowds to produce content as they work, surf, comment, or shop.

Crowdwork consists of small tasks distributed and completed online, done in minutes for pennies a pop, and ranges from matching images with product descriptions listed on commercial websites to writing content for websites and other online platforms. Yet the discussion of Internet-based microlabor tends to concentrate, for better or worse, on the fate of skilled, creative knowledge workers. When talk turns to the impact of crowdsourcing on industries ranging from animation to the news, it either celebrates how easy and efficient it is to tap a readily available pool of online labor, or complains that skilled creative workers face a restructuring of their professional environments and identities.

But what about people earning money doing low-to-no-skill, one-off tasks through digital portals? The labor pools that contribute to digital media economies come from a distributed (and arguably disjointed) workforce. These workers take on tasks for many reasons, ranging from killing time to putting food on the table. Their working conditions are so varied that it is difficult not only to speak of a singular “work environment” but also for crowdworkers to feel enough in common to build a shared identity.

Methodological Overview

This project draws on findings from a year-long, comparative ethnographic and quantitative study of the labor exchanged through crowdsourcing platforms. Specifically, we look at participants exchanging work through three major platforms:

  • Amazon.com's Mechanical Turk (AMT)
  • Microsoft’s Universal Human Relevance System (UHRS)
  • MobileWorks, a startup with a social and entrepreneurial mission

We will be conducting participant observation and in-depth interviews with the following groups:

  • Those who complete and those who assign tasks in India and the United States
  • Engineers and employers shaping the platforms

We will also be examining the following:

  • Large survey sets gathered from AMT, UHRS, and MobileWorks
  • Backend data produced by workflows on UHRS and MobileWorks
  • Worker discussion forums
  • Industry rhetoric and practices organizing the crowdsourcing labor market

Research Questions

Drawing on the sources listed above, we ask:

  • What kind of cultural practice is crowdsourcing and under what conditions does it take place?
  • Who participates in crowdsourcing and what does it mean to them?
  • How do the contemporary experiences and understandings of crowdsourcing resonate with the role of human intelligence and its relationship to automation in the history of technological innovation?
  • Tapping the crowd’s wisdom is one way to conceptualize crowdsourcing practices. However, “piecework,” the practice of paying for work by the task completed instead of by the time it takes, is another frame that could be brought to bear on this work, particularly given its historical links to gendered and regional labor patterns. If we use this framing, how does it change how we understand work, our connections to others in work environments, and the commodities produced in a digital information service economy?
  • How does knowing more about the lives of crowdsourcing platform workers, from homeless people in the United States to upwardly mobile tech workers in urban India, better equip us to consider the political and ethical stakes of the very material labor digital capitalism produces?

Crowded offers a way to think through the politics and ethics of crowdsourcing, the responsibilities of commercial platforms providing such services, and the material labor microtasks reflect.

Research Design Overview

  • ~3,000 extended surveys (~1,000 survey respondents per crowdsourcing platform), contextualized through and correlated with U.S. Census and Census of India data; responses gathered from volunteers during 2013–2014, recruited from image-tagging, classification, “spam,” translation, survey, and transcription tasks.
  • Interviews with ~80 survey respondents (40 in each country); additional, extended interviews with a subset of survey respondents (~20 in each country); and participant observation among a subset of the extended interview participants (~15 in each country) over a period of 3 months. Supplemental ethnographic interviews and observation with participants recruited through survey respondents, at locations identified as “hubs” for crowdsourcing, and through online discussion forums.
  • Textual analysis of crowdwork online discussion forums.
  • Analysis of metadata aggregated by the UHRS and MobileWorks platforms detailing the workflows of crowdwork (see the sketch after this list).
  • Open-ended, in-depth interviews with engineers designing crowdsourcing platform solutions and individuals requesting work through the platforms.
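
To make the workflow-metadata analysis concrete, the sketch below shows, in Python, how per-task log records could be aggregated into median completion times and the effective hourly earnings implied by per-task pay. The record fields (worker ID, task type, seconds to complete, per-task reward) are hypothetical stand-ins; the actual UHRS and MobileWorks schemas are internal to those platforms.

    from collections import defaultdict
    from statistics import median

    # Hypothetical per-task log records:
    # (worker_id, task_type, seconds_to_complete, reward_usd)
    records = [
        ("w1", "image_tagging", 42, 0.02),
        ("w2", "image_tagging", 55, 0.02),
        ("w1", "transcription", 310, 0.25),
    ]

    # Group task records by task type.
    by_type = defaultdict(list)
    for worker_id, task_type, seconds, reward in records:
        by_type[task_type].append((seconds, reward))

    for task_type, rows in by_type.items():
        med_seconds = median(s for s, _ in rows)
        med_reward = median(r for _, r in rows)
        # Hourly rate implied by piecework pay at the median pace.
        hourly = med_reward * 3600 / med_seconds
        print(f"{task_type}: median {med_seconds}s per task, "
              f"~${hourly:.2f} per hour")

Framing pay this way, as an implied hourly wage rather than a per-task price, is one concrete way the “piecework” lens described above could be brought to the platforms’ own data.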


Research Team