The document discusses communication strategies and measuring impact, with a dialogue between Jane Hawtin and Priya Mannava. It provides links to websites for Mendeley, ResearchGate, Networked Blogs, Bitly, Feedburner, YouTube, Prezi, and NYT Labs Cascade project. It also acknowledges assistance from The Conversation and provides contact information for Reema Rattan, Section Editor for Health and Medicine at The Conversation.
15. We would like to acknowledge The Conversation for their assistance.
Reema Rattan
Section Editor, Health and Medicine
The Conversation
W: 03 9012 6665 | M: 0488 226 550
theconversation.edu.au/pages/health
Editor's Notes
Communication strategies differ for different audiences, and impact will depend on the context. We have already held meetings, training and other events to discuss findings with relevant stakeholders; however, the donor is now asking for specific metrics. How can I show that my paper is being read and cited? And how can I show that I have communicated some aspects of my research to a broader audience using social media? With around 25,000 peer-reviewed journals publishing about 2.5 million articles each year, it has become impossible to read, let alone evaluate, all the research published in your field. In general medicine alone there are over 700 journals. In an ever-expanding universe, impact factor is intended to provide an ‘objective’ measure of the quality of a journal. Impact factor is the average number of times that a recent paper in a particular journal is cited. The Lancet has an impact factor of 33.63, which means that, on average, an article published in The Lancet will be referenced about 33 or 34 times.
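The calculation behind that number is a simple ratio. A minimal sketch of the standard two-year impact factor, using hypothetical citation and article counts chosen only to reproduce a figure of 33.63 (these are not The Lancet's actual counts):

```python
def impact_factor(citations, citable_items):
    """Two-year impact factor: citations received this year to items
    published in the previous two years, divided by the number of
    citable items published in those two years."""
    return citations / citable_items

# Hypothetical counts: 3,363 citations this year to 100 citable
# items from the previous two years.
print(impact_factor(3363, 100))  # 33.63
```

Note that this is a journal-level average: a handful of highly cited papers can pull the ratio up for every article published alongside them.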
This graphic shows the relative impact of journals: the higher the impact, the bigger the space occupied. The pink boxes are biological sciences. The Lancet is here. So impact factor is one way to show donors the likelihood of your research being read and cited. There are a number of criticisms of impact factor. That graphic also shows that impact factor is discipline specific. In some disciplines, like the mathematical and physical sciences, only a small number of citations occur in the two-year period that the impact factor calculation is based on. Impact factor can, intentionally or not, be manipulated by journal editorial policies. The journals ranked by impact factor often don’t match the experts’ top ten. Impact factor is a relative measure of the influence of a journal and should be used with extreme caution when evaluating an individual article.
Individual article metrics were also a response to criticisms of impact factor. The open-access Public Library of Science (PLoS) journals have introduced metrics for individual articles, and they include some social media in those metrics. PLoS metrics include views and downloads: this particular article has been viewed 6,105 times and downloaded 1,928 times. The article has also been cited in at least 21 papers. The social impact measures include the use of social bookmarks on CiteULike, Connotea, Mendeley and Facebook. Here you can see that the article has 22 bookmarks. Analysis suggests that bookmarking correlates strongly with citations. PLoS tracks mentions in blogs through Research Blogging, Nature Blogs and Google. PLoS also has its own network of blogs, which often promote recently published research. We should say that there are a few things behind the inclusion of social media, including the UK Research Excellence Framework, which is asking academics to show the value of their research outside the campus. Another factor is that research is being reviewed by peers in social media. Some of the more comprehensive critiques come full circle and are published as responses in journals many months later.
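Article-level metrics like these are easy to combine programmatically. A minimal sketch of one article's record, where the per-service bookmark breakdown is invented for illustration and only the totals follow the figures mentioned in the talk:

```python
# Hypothetical article-level metrics record. The bookmark split
# across services is invented; only the totals match the talk.
article = {
    "views": 6105,
    "downloads": 1928,
    "citations": 21,
    "bookmarks": {"CiteULike": 9, "Connotea": 3, "Mendeley": 8, "Facebook": 2},
}

def total_bookmarks(metrics):
    """Sum bookmarks across all social bookmarking services."""
    return sum(metrics["bookmarks"].values())

print(total_bookmarks(article))  # 22
```

Keeping the per-service counts separate, as PLoS does, lets you report both an overall figure and where the social attention is coming from.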
While social media is a platform for communication between researchers, it is also useful for raising public awareness of an issue to influence the policy agenda. Like academic journals, social media platforms have proliferated: Facebook, Mendeley (a combination of Facebook and EndNote), Google+, LinkedIn, WordPress, Blogger, Tumblr, TED Talks, podcasts, YouTube, Vimeo, Flickr, Picasa, Scribd and SlideShare. Transferring content between these platforms can be automated using services like FeedBurner and NetworkedBlogs, so blog posts or website content automatically update your Facebook and Twitter. So while 2.5 million journal articles are published each year, there are 140 million tweets sent every day. And there is no social impact factor serving as a proxy indicator for the quality of a social media platform.
Most of these platforms generate their own metrics: Google Analytics, Facebook Insights, WordPress statistics, Flickr stats. These metrics can tell us the number of unique visitors. They can show how users came to your site, whether it was from another site or what terms they used to search for it, how they moved through the site and how long they spent there. They can generally also show limited demographic information about where in the world users are and what technology they are using to access your site. But are we actually measuring the impact of our research when we look at these metrics? Views don’t tell us much about what people are doing with the information.
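The distinction between raw page views and unique visitors is worth pinning down, since the two numbers are often conflated. A toy sketch, with invented access-log entries standing in for what an analytics tool would collect via cookies or IP addresses:

```python
# Toy access log: (visitor_id, page) pairs. The visitor IDs and
# page path are invented for illustration.
log = [
    ("visitor-a", "/my-paper"),
    ("visitor-b", "/my-paper"),
    ("visitor-a", "/my-paper"),  # visitor-a returns for a second view
]

page_views = len(log)                               # every hit counts
unique_visitors = len({visitor for visitor, _ in log})  # deduplicated
print(page_views, unique_visitors)  # 3 2
```

Three views here come from only two people, which is exactly why views alone say little about reach, let alone about what readers did with the information.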
Social media is about what people do with information. It’s about interactivity: users can share their thoughts and share your content. Again, there are a multitude of ways to share content: Facebook likes, LinkedIn shares, retweets, Google +1s, and bookmarking on Delicious, Digg, Pinterest, Reddit and StumbleUpon, as well as on the more academic sites CiteULike, Connotea and Mendeley. YouTube has metrics that show what it calls ‘significant moments’: when a video is shared and a significant number of views result, when there is a referral from another site or an ad, and when the video was viewed on a mobile device. This slide shows the metrics from a Bill & Melinda Gates Foundation video on vaccines. Here we can see that a significant number of views came from Facebook and the Huffington Post.
But there are some interesting projects that get a little closer to showing us how communication becomes a conversation through social media. One of these projects is ‘The Conversation’, a media channel delivering academic insights, analysis and research to the public. Researchers write opinion pieces and research briefs. ‘The Conversation’ is an initiative of Australian research institutes and universities, set up in part to pre-empt government interest in whether research is making it off campus. When this interest becomes part of the funding equation, universities will need to report on it, so ‘The Conversation’ has developed some metrics that are available to authors.
We’ll leave you with an interesting tool from The New York Times called ‘Cascade’, which maps how their news stories travel through the twittersphere. What we are showing behind us is the real-time movement of a story on medical technology that could potentially save the lives of 70,000 women in developing countries each year. Rather than just counting tweets, this tool gives us new ways to think about how our research is shared through social media: it helps identify who is interested and who is influential in our social media networks. We can trace the conversations our research has sparked and what exactly people are interested in, and we can use their interest and the questions they ask to shape our future communications and our research. Thank you.