Can crowd work (or citizen science) become a career option?

From Carnegie Mellon University

Research could ensure that crowd work becomes a career option, not a dead end. Carnegie Mellon scientists and other crowd work researchers issue a call to action.

PITTSBURGH—Crowdsourcing is an effective way to mobilize people to accomplish tasks on a global scale, but some researchers fear that crowd work for pay could easily become the high-tech equivalent of a sweatshop. Trivial work for rock-bottom pay isn't inevitable, however, and they've outlined a research agenda to make crowd work both intellectually and monetarily rewarding.

Leading researchers in crowd work from Carnegie Mellon University and other institutions will present their plan, hashed out in a special workshop last spring, at the Association for Computing Machinery’s Conference on Computer Supported Cooperative Work and Social Computing, CSCW 2013, Feb. 27 in San Antonio, Texas.

Finding ways to enhance collaboration, incorporate artificial intelligence and create ways for workers to build reputations are among the research challenges ahead.

“When my baby daughter was born, I asked myself, ‘Would I be proud to see her grow up to be a crowd worker?’” said Aniket Kittur, assistant professor in Carnegie Mellon’s Human-Computer Interaction Institute.

Co-authors with Kittur of the research strategy include Jeffrey Nickerson, director of the Center for Decision Technologies at Stevens Institute of Technology, and Michael Bernstein, assistant professor of computer science at Stanford University. Other leading crowd work researchers from Carnegie Mellon, Northwestern University and the University of Texas, Austin also contributed to the report, which is available for download.

The crowd work industry has expanded rapidly in recent years, with a number of vendors now offering work for people who get paid per task or who compete for prizes. A prominent vendor is Amazon Mechanical Turk, which claims more than 500,000 workers in more than 190 countries who complete tasks that, in some cases, may take only seconds to perform. Another, CrowdFlower, says it can access more than 2 million contributors worldwide. Others, such as oDesk, provide skilled labor, including web developers, designers and translators. Platforms such as Innocentive invite people to invent solutions to problems in hopes of winning a prize.

The still-young industry could grow very large because portions of almost any job — perhaps as much as 20 percent — potentially can be sent “down the wire,” Kittur said. Many crowd workers are paid substantially less than U.S. minimum wage, however, and, left to market forces, the crowd workforce could remain stigmatized and exploited.

“What if I want access to the best people in the world, but for only five minutes of their time?” Kittur said. As beneficial as that might be for some businesses, that possibility will not be realized if the crowd workplace isn’t attractive to the very best workers and thinkers, he added. The call to action by Kittur and his colleagues, also discussed in a recent post on the Follow the Crowd blog, envisions three major research steps:

Create career ladders. Research is needed into how to structure teams so that skilled workers can train novices, as well as help design jobs and catch problems. Mechanisms are needed for credentialing workers. A better understanding of worker motivations could lead to better job designs.

Improve task design through better communication. Research suggests that some quality problems in crowd work have more to do with poorly designed tasks than with unskilled workers. Artificial intelligence could be used in complex tasks to identify work products that might still need improvement and assign workers accordingly. The crowd itself also may be used to train the computer programs, helping them support a broader range of tasks. Improved instructions and feedback mechanisms likewise could improve the work product.

Enable learning. Quality assurance assessments can identify skills that workers need to polish, or need to learn in order to tackle new work tasks. Online tutoring, combined with tracking of work history, could support personalized instruction and feedback. The work platforms themselves will need mechanisms for learning what kinds of work requests attract talented workers, recognizing the patterns of learning and skill building among workers, and determining which tasks are appropriate for which types of workers.


This work was supported by the National Science Foundation, a DARPA Young Faculty Award, a Temple Fellowship, Northwestern University and the Center for the Future of Work in Carnegie Mellon’s Heinz College.

Follow the School of Computer Science on Twitter @SCSatCMU.

Contact: Byron Spice

Categories: Citizen Science


About the Author

Darlene Cavalier

Darlene Cavalier is a Professor at Arizona State University's Center for Engagement and Training, part of the School for the Future of Innovation in Society. Cavalier is the founder of SciStarter. She is also the founder of Science Cheerleader, an organization of more than 300 current and former professional cheerleaders pursuing STEM careers, and a cofounder of ECAST: Expert and Citizen Assessment of Science and Technology, a network of universities, science centers, and think tanks that produces public deliberations to enhance science policymaking. She is a founding board member of the Citizen Science Association, a senior advisor at Discover Magazine, a member of the EPA's National Advisory Council for Environmental Policy and Technology, and was appointed to the National Academy of Sciences "Designing Citizen Science to Support Science Learning" committee. She is the author of The Science of Cheerleading and co-editor of The Rightful Place of Science: Citizen Science, published by Arizona State University. Darlene holds degrees from Temple University and the University of Pennsylvania and was a high school, college and NBA cheerleader. Darlene lives in Philadelphia with her husband and four children.