THE CROWD MACHINE
TRANSCRIPT
Elena Simperl
CROWDSOURCING: PROBLEM SOLVING VIA OPEN CALLS
"the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call. This can take the form of peer-production (when the job is performed collaboratively), but is also often undertaken by sole individuals. The crucial prerequisite is the use of the open call format and the large network of potential laborers.“
[Howe, 2006]
TYPOLOGY OF CROWDSOURCING
- Macrotasks
- Microtasks
- Challenges
- Self-organized crowds
- Crowdfunding

Source: [Prpić, Shukla, Kietzmann and McCarthy, 2015]
SOCIAL MACHINES: PEOPLE DO THE CREATIVE WORK AND MACHINES DO THE ADMINISTRATION
"Real life is and must be full of all kinds of social constraint – the very processes from which society arises. Computers can help if we use them to create abstract social machines on the Web: processes in which the people do the creative work and the machine does the administration […] The stage is set for an evolutionary growth of new social engines.
[Tim Berners-Lee, Weaving the Web, 1999
]
Are social machines just crowdsourcing?
CROWDSOURCING & SOCIALITY
CROWDSOURCING CREATIVE TASKS
CROWDSOURCING & ETHICS
[Difallah et al, 2015]
CROWDSOURCING RESEARCH
- TASK DESIGN
- TASK ASSIGNMENT
- QUALITY ASSURANCE
- INCENTIVES ENGINEERING
- WORKFLOW DESIGN AND EXECUTION
- CROWD TRAINING
- SELF-ORGANIZING CROWDS
- COLLABORATIVE CROWDSOURCING
- REAL-TIME DELIVERY
- EXTENSIONS TO TECHNOLOGIES
- PROGRAMMING LANGUAGES
GAMIFYING PAID MICROTASKS
Improving paid microtasks through gamification and adaptive furtherance incentives
O. Feyisetan, E. Simperl, M. Van Kleek, N. Shadbolt
WWW 2015, 333-343
OVERVIEW
Make paid microtasks more cost-effective through gamification*
*use of game elements and mechanics in a non-game context
[Source: http://www.hideandseek.net/wp-content/uploads/2010/10/gamification_badges.jpg]
PLATFORM
Image labelling tasks published on a microtask platform: free-text labels, varying numbers of labels per image, taboo words.
- 1st setting: 'standard' tasks, including basic spam control
- 2nd setting: same requirements and rewards, but contributors were asked to carry out the task in Wordsmith
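The 'standard' setting above mentions basic spam control alongside taboo words. As a minimal sketch of what such label validation could look like (the specific checks, thresholds, and taboo list here are illustrative assumptions, not the exact mechanism used in the study):

```python
# Sketch of basic spam control for a free-text image-labelling microtask.
# The checks and thresholds are illustrative assumptions, not the paper's
# exact implementation.

def validate_label(label: str, taboo_words: set[str], seen_labels: set[str]) -> bool:
    """Accept a label only if it is non-trivial, not a taboo word,
    and not a duplicate already submitted for this image."""
    normalized = label.strip().lower()
    if len(normalized) < 2:          # reject empty / one-character spam
        return False
    if normalized in taboo_words:    # taboo words are banned outright
        return False
    if normalized in seen_labels:    # each distinct label counts once
        return False
    seen_labels.add(normalized)
    return True

# Usage: filter one worker's submissions for a single image
taboo = {"dog", "animal"}            # hypothetical taboo list for this image
seen: set[str] = set()
submitted = ["dog", "puppy", "puppy", "leash", ""]
accepted = [w for w in submitted if validate_label(w, taboo, seen)]
# accepted == ["puppy", "leash"]
```

Taboo words force contributors past the most obvious label, which is one way to obtain the varied free-text labels the task design asks for.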
RESULTS: BETTER, CHEAPER, BUT FEWER WORKERS
Metric               CrowdFlower   Wordsmith
Total workers        600           423
Total keywords       1,200         41,206
Unique keywords      111           5,708
Avg. agreement       5.72%         37.7%
Mean images/person   1             32
Max images/person    1             200
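The table reports average agreement between workers' free-text labels. One plausible way to compute such a figure (the paper's exact metric is not given here, so this mean pairwise Jaccard overlap is an assumption for illustration) is:

```python
from itertools import combinations

def avg_pairwise_agreement(worker_labels: dict[str, set[str]]) -> float:
    """Mean Jaccard overlap between every pair of workers' label sets
    for one image. One plausible reading of 'avg. agreement'; the
    study's actual metric may differ."""
    pairs = list(combinations(worker_labels.values(), 2))
    scores = [len(a & b) / len(a | b) for a, b in pairs if a | b]
    if not scores:
        return 0.0
    return sum(scores) / len(scores)

# Usage: three workers labelling the same image
labels = {
    "w1": {"puppy", "grass"},
    "w2": {"puppy", "leash"},
    "w3": {"grass", "park"},
}
print(round(avg_pairwise_agreement(labels), 3))  # -> 0.222
```

Under a metric like this, the much larger and more varied label sets from Wordsmith can still score far higher than CrowdFlower's, provided workers converge on overlapping keywords.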
RESULTS (NO INCENTIVES): COMPARABLE QUALITY, HIGHER UNIT COSTS, FEWER DROPOUTS
Metric               CrowdFlower   Wordsmith
Total workers        600           514
Total keywords       13,200        35,890
Unique keywords      1,323         4,091
Avg. agreement       6.32%         10.9%
Mean images/person   11            27
Max images/person    1             351
RESULTS (WITH INCENTIVES): INCREASED PARTICIPATION
- People come back (20 times) and play longer (43 hours vs. 3 hours without incentives)
- Targeted incentives work: 77% of players stayed vs. 27% in the randomised condition
- 19% more labels compared to the no-incentives condition
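The targeted-vs-randomised comparison above contrasts incentives matched to a player with incentives chosen at random. A minimal sketch of that distinction, assuming a hypothetical set of incentive types and a simple "offer what the player has responded to most" targeting rule (not the paper's actual adaptive mechanism):

```python
import random

# Illustrative sketch of 'targeted' vs 'randomised' furtherance incentives.
# The incentive types and the targeting rule are assumptions for
# illustration, not the study's exact mechanism.

INCENTIVES = ["badge", "leaderboard_boost", "bonus_points"]

def pick_incentive(engagement: dict[str, int], targeted: bool) -> str:
    """Targeted condition: offer the incentive type the player has
    responded to most so far. Randomised condition: pick uniformly."""
    if targeted:
        return max(engagement, key=engagement.get)
    return random.choice(INCENTIVES)

# Usage: a player who has mostly responded to badges
history = {"badge": 12, "leaderboard_boost": 3, "bonus_points": 7}
print(pick_incentive(history, targeted=True))   # -> badge
```

Even this crude targeting illustrates the design question behind the 77% vs. 27% retention gap: whether the incentive offered at a potential drop-out point reflects what that player actually values.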
CONCLUSIONS & FUTURE WORK
MAIN FINDINGS
- Task design matters as much as payment
- Gamification achieves high accuracy at lower cost with improved engagement
- Top contributors appreciate social features, but their main motivation is still task-driven
NEXT STEPS
- The effect of individual gamification elements
- The effect of task autonomy (allowing people to skip tasks)
- Best ways to retain contributors
- The effect of referrals
- Application of SAPS to GWAPs and financially motivated crowds