
Van Hyning - Reaching New Audiences with Crowdsourcing


More from the Association of Danish Museums / Organisationen Danske Museer


  1. Dr Victoria Van Hyning Zooniverse.org, University of Oxford victoria@zooniverse.org @VanHyningV Reaching New Audiences with Crowdsourcing
  2. Talk outline • Crowdsourcing • Origins of Zooniverse • Gamification and motivation • Project design principles • Humanities and transcription projects • Tate Britain partnership
  3. Citizen Science (Crowdsourcing) ‘In citizen science projects, members of the public (citizens) collaborate with professional scientists to conduct scientific research.’ G. Newman et al., ‘The future of citizen science: Emerging technologies and shifting paradigms’, Front Ecol Environ, 10, 6 (2012), 298–304
  4. 1.29 million users
  5. Why do people participate in Zooniverse projects?
  6. ? ? ? ?
  7. Do we gamify the projects to incentivize users?
  8. Principle #1: Many volunteers want to further real research: contributing to knowledge is an end in itself. Zooniverse designs for this reality.
  9. Milky Way Project http://www.milkywayproject.org/
  10. Principle #2: Be clear about how volunteers’ time and effort is useful.
  11. Principle #3: Strike a balance between an enjoyable user experience and getting useful data.
  12. Principle #4: Your community needs to hear from you!
  13. Humanities at Zooniverse
  14. Mostly text-based • No audio • No video
  15. Transcription Projects • Existing projects:  ‘Ancient Lives’: www.ancientlives.org/  ‘Old Weather’: www.oldweather.org/  ‘Notes from Nature’: www.notesfromnature.org/  ‘Operation War Diary’: www.operationwardiary.org/ • Upcoming projects:  ‘Anno.Tate’ with Tate Britain  ‘Shakespeare’s World’ with the Folger Shakespeare Library
  16. Ancient Lives • launched July 2011 • >1.5 million transcriptions of ancient Greek papyri • 250,000 unique volunteer contributors
  17. Principle #5: A successful user interface allows non-specialists to participate. A great UI allows them to learn and acquire new skills.
  18. OWD by the numbers: • over 1.2 million page views since launch • >12,000 registered users • >73,000 comments on Talk, the project discussion area • ~2,000 commenting users • ~100,000 pages ‘completed’, i.e. tagged and partially transcribed by 7 users
  19. http://wd3.herokuapp.com/pages/AWD0000h3c
  20. Anno.Tate = +
  21. Goals of Anno.Tate • to democratize access to the Tate archives • to produce word-searchable transcriptions of manuscripts for the online catalogue • to improve item-level descriptions of archival items in catalogues • my goal: to create a new workflow for crowdsourced transcription
  22. Principle #6: Granularize • break up tasks to make hard projects more accessible • less time commitment for users • mitigate user fatigue • minimize repetition of tasks
  23. How do we make hard tasks accessible?
  24. Panoptes: the next frontier
  25. How can Panoptes help you?
  26. www.zooniverse.org | victoria@zooniverse.org | @VanHyningV Thanks for listening!

Editor's notes

  • Folks at the Zooniverse frequently use the term ‘citizen science’ to describe our work, but increasingly I prefer the term Academic Crowdsourcing.
  • Zooniverse began with a single project called Galaxy Zoo, launched in July 2007 by Dr Chris Lintott, now professor of Astrophysics here in Oxford, and Dr Kevin Schawinski. The goal was to sort over one million images of galaxies from the Sloan Digital Sky Survey into two types, spiral and elliptical, a task that would have taken one person at least three years of round-the-clock effort to complete. The several-thousand-strong crowd of volunteers not only completed the task in a matter of months; each image was also classified an average of 38 times, as opposed to once, rendering excellent-quality data for each image. The success of Galaxy Zoo led to the foundation of the Zooniverse in Oxford and over thirty new projects in astrophysics, biology, climate science, and the humanities, including in the fields of music, papyrology and the history of World War I.

  • 50 projects by June of this year.
  • Do we give them money or prizes? Heck no.
  • Very infrequently, and when we have, we’ve found that some users were put off by the pressure to achieve and maintain a particular rank when rankings correlated with the amount of transcription achieved. Gamification based on bulk of activity rather than quality can also lead users to produce rushed or poor work in an effort to maintain their lead or move up the league table.
  • Researchers must relay their findings and how the crowd is helping further research.
  • So what gets people motivated? Well, as this graph showing participation in the first Milky Way Project demonstrates, people are motivated by a desire to help researchers produce real results. The sharp drop-off you see here correlates with an announcement that all the data for the project had been classified (meaning each image was shown to 10 or 15 users and a consensus was reached about the content of each image). Participants were told they could still classify images and use the interface, but they moved on to other projects instead.
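    The completion mechanic described in this note (each image is shown to 10 or 15 volunteers, after which a consensus is reached and the image counts as done) can be sketched in a few lines. This is an illustrative toy, not the actual Zooniverse code; the retirement threshold of 15 and the labels are assumptions for the example:

    ```python
    from collections import Counter

    RETIREMENT_LIMIT = 15  # assumption: the talk mentions 10-15 classifications per image

    def is_retired(classifications):
        """True once enough volunteers have classified this image."""
        return len(classifications) >= RETIREMENT_LIMIT

    def consensus(classifications):
        """Majority answer among the volunteers, with its level of agreement."""
        label, votes = Counter(classifications).most_common(1)[0]
        return label, votes / len(classifications)

    # 15 toy classifications of one image
    answers = ["bubble"] * 12 + ["no bubble"] * 3
    if is_retired(answers):
        label, agreement = consensus(answers)
        print(label, agreement)  # bubble 0.8
    ```

    Showing each image to many volunteers rather than one is what turns individual guesses into the "excellent quality data" mentioned earlier: disagreement is measured, not hidden.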
  • If their time would be better spent on an incomplete dataset on another project, tell them that. When there’s new data, that information needs to be shared too.
  • Tasks must strike a balance: sufficiently complex to capture meaningful data for the science team, but sufficiently straightforward and engaging that users don’t get tired and leave the site. [Grass example]
  • Most of our humanities projects are transcription based, but not all. I’ll walk you all through what we have built so far and what’s on the agenda for the coming year or so. It might be worth thinking about what sorts of research questions or data sets you have that might be served by our existing models or what these models don’t support and what you might be interested in.
  • but I can see applications for museums and we are exploring this space.
  • Having said that, we have several transcription projects which enjoy varying rates of volunteer effort and success. I’ve listed them in order of release here.
  • Ancient Lives was launched in 2011. It was the first Zooniverse humanities project and it’s got a dedicated community who have transcribed over a million fragments of papyri from the Oxyrhynchus collection in Oxford.

    This project might be heartening for people who think they don’t have data that volunteers and amateurs would be interested in. Many people without any background in Ancient Greek participate, which you can see from the project Talk pages. Talk is the area where users can speak amongst themselves and with experts about their findings.
  • Ancient Lives invites volunteers to do character by character transcription. The keyboards are preloaded with sample characters to make the transcription tasks easier. This is significant because it enables non-specialist participation. If you go and read the project Talk and discussion pages you’ll discover that many participants have no background in Ancient Greek. Many users explicitly say that they want to participate in order to further knowledge. It is crucial when designing a project to try to create an interface that enables and encourages non-specialist participation. While many of us wish that specialists could devote time to crowdsourcing, on the whole they can’t or don’t. We need to create projects that are accessible to non-specialists and indeed non-enthusiasts, but which render data that is useful for specialists.
  • Too many academic crowdsourcing projects rely on the hope or assumption that specialists in the field will have time to participate. They usually don’t or don’t want to.

    The second part of Principle #2 leads me to the topic of my own work—two projects I’m developing for Tate Britain and the Folger Shakespeare Library in Washington DC.
  • Operation War Diary is our second true humanities project--a transcription and tagging interface for metadata collection. Operation War Diary was launched in January 2014 and as you see, it was created in partnership with the National Archives in the UK and the Imperial War Museum in London. The goal of the project is to tag and classify the daily diaries of all British Army Infantry units on the Western Front from 1914 to 1918. The full archive consists of 1.5 million pages chronicling life on the front.
  • This is a page shown in the project interface. A first-time visitor to the site completes a brief tutorial, and there is a ‘field guide’ available too, so users can refresh their memories. Volunteers use a mixture of tagging and drop-down menus when they participate in Operation War Diary. They identify what kind of document they are looking at: different document types have different drop-down menus. They then mark dates, times, locations, unit activities and the presence of named individuals. Tags enable us to know exactly where on a page the mention of a particular event or individual occurs. As users tag the pages, they can also add geolocation data and transcribe things such as names. At any point during the classification process, users can comment by clicking on the little white word bubble in the top right of the screen. They can see what other users have said and add their own comments using Twitter-style hashtags; indeed, comments can only be 140 characters in length. Users can also write to each other directly using direct messaging.
  • Seven users tag each page. Their markings and annotations are then aggregated by an in-house consensus engine called WD3. The little white strips you see here contain terminology from the drop down menus as well as information from the free text fields. The numbers indicate how many users have submitted the same data.
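    The aggregation step described here, where the seven users' submissions for a page are merged and each distinct piece of data carries a count of how many users submitted it, can be sketched as follows. WD3 is an in-house tool whose internals are not shown in the talk, so the function and the tag shapes below are illustrative assumptions only:

    ```python
    from collections import Counter

    def aggregate_tags(per_user_tags):
        """Merge the tags submitted by the users who saw one page.

        per_user_tags: one set of (field, value) tags per user.
        Returns each distinct tag paired with the number of users
        who submitted it, most-agreed-upon first.
        """
        counts = Counter(tag for tags in per_user_tags for tag in set(tags))
        return counts.most_common()

    # Toy submissions from three of the seven users who tagged a page
    page_tags = [
        {("date", "4 Aug 1914"), ("activity", "marching")},
        {("date", "4 Aug 1914"), ("person", "Capt. Smith")},
        {("date", "4 Aug 1914"), ("activity", "marching")},
    ]
    for tag, n_users in aggregate_tags(page_tags):
        print(n_users, tag)
    ```

    The per-tag counts correspond to the numbers on the little white strips in the screenshot: higher agreement means higher confidence in that piece of data.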
  • Funded by the Heritage Lottery Fund. Trying to reach new audiences. 52 artists, 1,000 archival pieces.
  • Here are just a few examples of the kinds of material we’ll be presenting in the project. The work on the left is by Donald Rodney and the letter on the right is from the sculptor Kenneth Armitage to his future wife, the sculptor Joan Moore, dated to November/December 1939. The two married in 1940.
  • The goals of these projects do not mark a huge departure from projects like Transcribe Bentham, Ben Brumfield’s ‘From the Page’, or other wiki-powered projects. But they are predicated on a different style of working. The workflow is also very much an experiment—a series of hypotheses that I’ve made on the basis of what we know about our crowd’s behavior on existing Zooniverse projects. We haven’t started front-end development yet, so if anyone has any thoughts or advice, these would be very welcome.
  • One of the things that I’ve learned from Operation War Diary is that it’s better to offer volunteers the chance to complete smaller, shorter tasks than to ask them to transcribe whole pages. To date, most existing transcription projects, such as Old Weather and Transcribe Bentham, ask users to transcribe entire pages—a time-consuming task.
  • The interface can even accommodate wonky lines.
  • Well, in the case of the Tate project this means allowing people to do single lines at a time or to just identify images, if that’s what they want to do. Describe interface
  • Panoptes and making your own projects. Mention DHOxSS.
