Talk given at the e-science meeting on Web 2.0 in research. It focuses on a couple of case studies, drawing out what makes an effective and successful Web 2.0 service for researchers.
Walking the walk - the practical experience of Web2 in research
1. Walking the Walk
The experience of using Web 2.0
tools in active research projects
Cameron Neylon
2. 1. The long tail
2. Data is the next Intel inside
3. Users add value
4. Network effects by default
5. Some rights reserved
6. Perpetual Beta
7. Cooperate don’t control
8. Software above the level of a single device
http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html?page=5
3. What does Web 2.0 offer a
researcher in practice?
http://flickr.com/photos/heymans/480396810/
5. • Open tender and response
• Not mass participation or opinion markets
• There just aren’t that many researchers
• A good community and a well-built, well-cared-for
network are critical
9. 27 January 2009
1 February 2009
10 March 2009
http://gowers.wordpress.com/
10. • Successful project
• Small core group of participants
• Much larger group of watchers
• Concerns over embarrassment, keeping up,
mechanisms for giving credit
• Issues over management of large numbers
of very active threads
16. 113 individual measurements
(plus 71 literature values)
14 researchers in four countries
One undergraduate chemistry class
$6000 funding (for prizes and chemicals)
17. Four months
One (invited) book chapter submitted
Second paper in preparation
18. • Collaboration enabled via open data licensing
• Still relatively small numbers of people
• Are massive collaborative projects even possible?
• 90% aren’t aware of it, 9% are passive watchers
• 0.9% make occasional contributions and 0.1% are
core players - does that add up to more than one?
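The closing question on this slide is really arithmetic: applying a 0.1% core-contributor rate to a small research community may not yield even one person. A minimal sketch of that calculation (the 90/9/0.9/0.1 split is from the slide; the pool sizes are hypothetical, chosen only to illustrate the point):

```python
# The 90 / 9 / 0.9 / 0.1 participation split is taken from the slide;
# the pool sizes below are hypothetical illustrations.
def core_contributors(pool: int) -> float:
    """Expected number of core players under the slide's 0.1% figure."""
    return pool * 0.001

# A niche subfield of ~500 researchers yields less than one core contributor,
# while 0.1% of a very large pool is a workable community.
```
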
22. • Compelling and comprehensible story
• Much work gone into building a system that
enables people to make a contribution
• Responsive and appealing user experience
• Still a self selecting community but drawn
from a much wider pool
25. • “It’s going to be great but we need to put a lot of
work into getting it going...”
• “I don’t have the time to put all the stuff in...”
• They get what the advantages are but haven’t
necessarily bought into the process
• Concerns over data re-use and “scooping”
27. 1. The long tail
2. Data is the next Intel inside
3. Users add value
4. Network effects by default
5. Some rights reserved
6. Perpetual Beta
7. Cooperate don’t control
8. Software above the level of a single device
http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html?page=5
33. 4. Network effects by default
but if that’s your USP then give up now
34. 5. Some rights reserved
licensing matters, even if everyone thinks it’s boring
35. 6. Perpetual beta
is not an excuse for giving people rubbish
continual improvement
38. 7. Cooperate don’t control
All the value lies in integrating data, not in hoarding it
licensing matters, even if everyone thinks it’s boring
share-alike is bad because it breaks interoperability
44. What are the design patterns
for successful research tools?
45. 1. Define and understand your target audience
2. Solve a pressing problem they have or tell them a story
they understand and want to contribute to
3. Build the service into an existing workflow or an established
framework that the target audience understands
4. Get the licensing right and give users a sense of control
over their own data and contribution
5. Build for network effects but don’t rely on them
6. Plan to build (and resource the building of) your community
7. Build for interoperability: technical and legal
50. • Aggregating content (solving a problem)
• Works without a network; network effects follow
• Collecting comments and ratings (data)
• Straightforward pumping of data in and pulling it
back out via an API (licensing, interoperability)
• Community, community, community
• Building your own network
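The “pumping data in and pulling it back out via an API” bullet can be sketched as a pair of helpers. This is a generic illustration of the pattern, not the actual service’s API: the endpoint, paths, and field names are hypothetical.

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.example.org/v1"  # hypothetical endpoint, not a real service


def push_record(record: dict) -> urllib.request.Request:
    """Build a POST request that pumps one record in as JSON."""
    body = json.dumps(record).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/records",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def pull_records_url(tag: str) -> str:
    """Build the GET URL for pulling records back out, filtered by tag."""
    return f"{API_BASE}/records?tag=" + urllib.parse.quote(tag)
```

Keeping the two directions equally simple is the point of the slide: when data flows out as easily as it flows in, third parties can aggregate it, and network effects become possible.
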
51. If you just build it they
(probably) won’t come