Crowdsourcing in Action
1. 21 June 2012
Dheeraj Chowdhury
Group Leader – Digital Media
NSW DEC – Curriculum and Learning Innovation Centre
6. Peer-Vetted Creative Production Approach
Threadless (http://www.threadless.com/)
Next Stop Design
http://nextstopdesign.com/inspiration
7. Distributed Human Intelligence Tasking
Rosetta@home (http://boinc.bakerlab.org/)
SETI@home (http://setiathome.berkeley.edu)
Amazon Mechanical Turk (https://www.mturk.com)
8. Knowledge Discovery and Management
CURLS – (http://www.curls.edu.au) (Case Study - http://innotecture.files.wordpress.com/2010/02/j21_v024_olc_pt01_moore.pdf)
9. NSW DEC
2200+ schools
60,000 teachers
How do we harness this expertise to gather links/references to relevant, high-quality teaching and learning resources for 1.2 million+ students?
26. $9,000
(total cost to set up & run CURLS including hosting for 4 years)
26,013 Resources / Links shared
11,970+ Users (20% of NSW DEC staff + others)
1,941 Schools (88% footprint in NSW DEC Schools)
177,728 Hits on shared links (1 hit every 2 minutes during school hours)
Jeff Howe, a contributing editor for Wired, coined the term “crowdsourcing” in the magazine’s June 2006 issue.
Broadcast search is useful when an empirically right answer exists and the knowledge of a single expert (or a handful of experts) somewhere in the network is needed to answer it. Opening up the problem-solving process through crowdsourcing is like casting a wide net in the hope of catching the one needle in the haystack. Examples of the Broadcast Search Approach to crowdsourcing include InnoCentive, a company that broadcasts scientific R&D problems to a base of scientists, and the Goldcorp Challenge, a past competition that made geophysical data available and tasked an online community with identifying gold deposits in a tract of land.
Peer-vetted creative production is useful when there is no empirically right answer, but rather the “right” answer is the one the market will support. In other words, when the “right” answer is a matter of consumer tastes or user preferences, this approach can help generate and vet original ideas to find a best choice. Examples of the Peer-Vetted Creative Production Approach to crowdsourcing include Threadless, a company that facilitates an ongoing t-shirt design contest; user-generated advertising contests, which task online communities with producing and selecting the next best advertisement for a company; and the Next Stop Design project, a competition to design the next best bus stop shelter for a transit system.
This approach is useful when online communities are needed to perform tasks that require human intelligence to process large batches of data. Crowdsourcing organizations using this approach need massive amounts of microlabor to work through large piles of information in systematic ways that computers cannot yet perform. Organizations broadcast these data to online communities, which function much like the nodes in distributed computing systems such as SETI@home or Rosetta@home. Examples of the Distributed Human Intelligence Tasking approach to crowdsourcing include Amazon Mechanical Turk, which lets companies hire an online community to perform human intelligence tasks, and Subvert and Profit, a company that games social media ranking systems by distributing social media voting.
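Mechanical Turk exposes this "broadcast a microtask to the crowd" model programmatically. As a minimal sketch (the boto3 MTurk client is a real AWS library, but every title, reward, and question value below is an illustrative assumption, not part of the presentation), posting a human intelligence task might look like:

```python
# Sketch: assembling a Mechanical Turk HIT (human intelligence task).
# All parameter values here are hypothetical examples.

def build_hit_params(title, description, reward_usd, question_xml):
    """Assemble the keyword arguments for MTurk's CreateHIT operation."""
    return {
        "Title": title,
        "Description": description,
        "Reward": f"{reward_usd:.2f}",       # reward is a string, in USD
        "MaxAssignments": 3,                 # ask 3 workers per item
        "AssignmentDurationInSeconds": 600,  # 10 minutes per assignment
        "LifetimeInSeconds": 86400,          # HIT stays open for 1 day
        "Question": question_xml,
    }

params = build_hit_params(
    "Tag a teaching resource",
    "Choose the subject area that best describes the linked resource.",
    0.05,
    "<QuestionForm>...</QuestionForm>",      # placeholder question XML
)

# With AWS credentials configured, the task could then be posted:
#   import boto3
#   mturk = boto3.client("mturk")
#   response = mturk.create_hit(**params)
```

Each worker response comes back as an assignment, which the requester reviews and pays for; the "distributed computing" analogy holds because each worker independently processes one small slice of the data.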
This approach is useful when knowledge already exists in the network (e.g., in written records, prior art, and other published sources) and there is a need to find and assemble that knowledge coherently in a single location. Crowdsourcing organizations task online communities with finding this knowledge and provide a framework in which individuals can assemble and manage it as it is found. This is similar to the way wikis work, except that here a crowdsourcing organization directs the process, dictating exactly what needs to be found and where it needs to be deposited. By contrast, Yochai Benkler calls undirected processes such as Wikipedia’s “commons-based peer production.” An example of the Knowledge Discovery and Management Approach to crowdsourcing is the Peer to Patent Community Patent Review project, which tasked an online community with finding and reporting prior art in the review of applications to the U.S. Patent and Trademark Office.