UCount: A community-driven approach for measuring Scientific Reputation. Altmetrics Workshop / WebSci2011. Cristhian Parra, University of Trento, Italy. parra@disi.unitn.it
Context http://beta.kspaces.net/ic/ http://reseseval.org/ http://liquidjournal.org/
What is Scientific Reputation? Scientific reputation is the social evaluation (opinion) by the scientific community of a researcher or their contributions, given a certain criterion (e.g. scientific impact).
Main Goal: to understand how and why reputation is formed within and across scientific communities.
Motivation: Science is an Economy of Reputation [Whitley 2000]. Goal: improve support for decision making. Example features: readership, affiliation, bibliometrics.
Experiment #1: LiquidReputation Surveys. Dataset: Top H-Index (>200); 8 online surveys, 79 total replies: ICWE (18), BPM (20), VLDB (15), ... http://reseval.org/survey http://www.cs.ucla.edu/~palsberg/h-number.html
Correlation Results, ranked: 1. H-Index (Palsberg), 2. H-Index (Script), 3. # Publications (DBLP). Results published in ISSI 2011 and SEBD 2011.
Experiment #2: Position Contests Analysis (*) http://reclutamento.murst.it/ (**) http://intersection.dsi.cnrs.fr/intersection/resultats-cc-en.do
Results. Surveys: correlation between bibliometric indicators and reputation always falls in the range (-0.5, 0.5). Research position contests: the CNRS dataset shows the same result as the surveys; on the Italian dataset, all metrics predict outcomes with only around 50% effectiveness. Bibliometrics are not a good descriptor of real reputation.
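The correlation analysis behind these results can be sketched with Spearman's rank correlation between a bibliometric indicator and a survey-derived reputation score. This is a minimal illustration, not the study's actual code, and all the data below is invented:

```python
def rank(values):
    """1-based ranks, highest value gets rank 1 (no tie handling; fine for a sketch)."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(xs, ys):
    """Spearman's rho via the rank-difference formula (assumes no ties)."""
    n = len(xs)
    rx, ry = rank(xs), rank(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical researchers: H-Index vs. average survey reputation score
h_index = [45, 30, 60, 25, 50]
survey_score = [3.8, 4.1, 4.0, 2.9, 3.5]
rho = spearman(h_index, survey_score)  # falls inside (-0.5, 0.5) for this toy data
```

A weak rho like this, computed per indicator and per community, is the kind of evidence the slide summarizes.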
UCount Methodology UCount Sci. Excellence UCount Reviewer Score
Challenges
UCount: Eliciting Reputation. Done before: peer-review-based assessment (research position contests), surveys. UCount: community-oriented surveys, peer review feedback.
UCount Surveys. List of candidates: DBLP coauthorship graph, ICST editorial boards, Palsberg's top-H researchers. Affinity: shortest path + Jaccard. http://icst.org/UCount-Survey/ http://icst.org/icst-transactions/ http://www.cs.ucla.edu/~palsberg/h-number.html
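The affinity measure named on this slide (shortest path + Jaccard on the coauthorship graph) could look like the following sketch. The toy graph, the author names, and the way the two signals are combined into one score are illustrative assumptions, not the system's actual formula:

```python
from collections import deque

def shortest_path_len(graph, a, b):
    """BFS hop count between authors a and b; None if disconnected."""
    seen, frontier = {a}, deque([(a, 0)])
    while frontier:
        node, d = frontier.popleft()
        if node == b:
            return d
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return None

def jaccard(graph, a, b):
    """Overlap of the two authors' coauthor sets."""
    na, nb = set(graph.get(a, ())), set(graph.get(b, ()))
    union = na | nb
    return len(na & nb) / len(union) if union else 0.0

def affinity(graph, a, b):
    """Closer in the graph and more shared coauthors -> higher affinity."""
    d = shortest_path_len(graph, a, b)
    if d is None:
        return 0.0
    return jaccard(graph, a, b) + 1.0 / (1 + d)

# Toy coauthorship graph; in the real system this would come from DBLP.
graph = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice", "dave"},
    "dave": {"bob", "carol"},
}
```

High-affinity pairs are plausible candidates to ask about each other in a survey, since respondents can only meaningfully rate researchers they are close to.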
UCount Derive Reputation Functions UCount Scientific Impact UCount Reviewer Score Peer Review Feedback Surveys Results
Reverse Engineering of Reputation Combine Other Features? Minimum Distance
UCount Derive Reputation Functions UCount Scientific Impact UCount Reviewer Score Community Reputation Functions Library Peer Review Feedback Surveys Results
Reverse Engineering Approaches. Decision trees: no tree with more than 60% accuracy. Unsupervised methods: genetic algorithms applied to the CNRS dataset improved correlation by an average of 15% (running for only 5 minutes), with strong improvements for the fields Research Management and Politics. Next: apply machine learning techniques, explore other techniques (e.g. neural networks), obtain other types of features (e.g. keynotes, advisory networks). http://code.google.com/p/revengrep/ https://github.com/cdparra/melquiades/
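The genetic-algorithm idea can be sketched as evolving weights for a linear combination of bibliometric features so that the combined score agrees with a known reputation ranking. The features, target ranking, and GA parameters below are all invented for illustration:

```python
import random

FEATURES = [          # rows: researchers; columns: (h_index, citations, publications)
    (12, 400, 30),
    (25, 900, 60),
    (8, 150, 20),
    (18, 700, 45),
]
TRUE_ORDER = [1, 3, 0, 2]  # researcher indices, best reputation first

def score(weights, feats):
    return sum(w * f for w, f in zip(weights, feats))

def fitness(weights):
    """Fraction of researcher pairs the weighted score orders consistently
    with TRUE_ORDER (1.0 = perfect agreement)."""
    pos = {r: i for i, r in enumerate(TRUE_ORDER)}
    n, pairs, ok = len(FEATURES), 0, 0
    for a in range(n):
        for b in range(a + 1, n):
            pairs += 1
            better, worse = (a, b) if pos[a] < pos[b] else (b, a)
            if score(weights, FEATURES[better]) > score(weights, FEATURES[worse]):
                ok += 1
    return ok / pairs

def evolve(generations=50, pop_size=20, seed=0):
    """Keep the fitter half each generation; refill with mutated parents."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [[w + rng.gauss(0, 0.1) for w in rng.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)
```

A pairwise-agreement fitness stands in here for the correlation objective mentioned on the slide; either works as the quantity the GA maximizes.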


  • 18. Reverse Engineering Problem (2). Possible examples of combinations: a single feature with the highest correlation to reputation (e.g. H-Index for Databases, Readership for Social Informatics); a linear combination of features; a complex logic algorithm (e.g. a decision tree).
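The three combination shapes listed above can be illustrated side by side. The feature values, weights, and thresholds below are invented, and the hand-written rule merely stands in for a learned decision tree:

```python
def single_feature(f):
    """Shape 1: a single best-correlated feature, e.g. H-Index alone."""
    return f["h_index"]

def linear_mix(f):
    """Shape 2: a linear combination of features (weights are illustrative)."""
    return 0.6 * f["h_index"] + 0.3 * f["readership"] + 0.1 * f["pubs"]

def tree_rule(f):
    """Shape 3: a two-split decision rule written by hand."""
    if f["h_index"] > 20:
        return "high" if f["readership"] > 100 else "medium"
    return "medium" if f["pubs"] > 50 else "low"

researcher = {"h_index": 25, "readership": 80, "pubs": 60}
```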
  • 19. Results What has been done so far
  • 20. Where are we now?
  • 22. References: Lamont (2009). How professors think: Inside the curious world of academic judgment. Bollen et al. (2009). A principal component analysis of 39 scientific impact measures. Sabater et al. (2005). Review on computational trust and reputation models. Hirsch (2005). An index to quantify an individual's scientific research output. Castelfranchi (2002). Social trust: A cognitive approach. Priem et al. (2010). Alt-metrics: A manifesto. Mann et al. (2006). Bibliometric impact measures leveraging topic analysis. Mussi et al. (2010). Discovering Scientific Communities using Conference Network. Nazri et al. (2007). Journal Impact Factor. Bergstrom (2007). Eigenfactor: Measuring the value and prestige of scholarly journals. Bar-Ilan (2008). Informetrics at the beginning of the 21st century: A review. Jensen et al. (2009). Testing bibliometric indicators by their prediction of scientists' promotions. Kulasegarah et al. (2010). Comparison of the h-index with standard bibliometric indicators to rank influential otolaryngologists in Europe and North America. Katsaros et al. (2008). Evaluating Greek Departments of Computer Science/Engineering using Bibliometric Indices. Whitley (2000). The intellectual and social organization of the sciences. Oxford: Oxford University Press.
  • 25. Derive reputation function across communities
  • 33. Results (CNRS Contests). Correlation with reputation, ranked: 1. H2-Index, 2. G-Index, 3. # Cited Publications. None of the chosen indicators has a significant correlation.
  • 35. 47% of winners have lower H-Index
  • 36. 53% of winners have lower G-Index
  • 37. 56% of winners have lower citation count
  • 38. 50% of winners have lower # publications. Results (Italian Contests)
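The figures above suggest a simple effectiveness measure: the fraction of contests in which the winner had the strictly higher indicator value. A sketch of that measure, with invented contest data:

```python
def effectiveness(contests, metric):
    """Fraction of (winner, runner-up) contests where the winner
    has the higher metric value, i.e. the metric 'predicts' the outcome."""
    hits = sum(1 for winner, loser in contests
               if metric[winner] > metric[loser])
    return hits / len(contests)

# Invented data: H-Index values and two contests as (winner, runner-up) pairs.
h_index = {"a": 20, "b": 15, "c": 9, "d": 12}
contests = [("a", "b"), ("c", "d")]
```

An effectiveness near 0.5 means the indicator does no better than a coin flip at predicting who wins, which is what the Italian-contest results show.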
  • 41. First year in one slide :) Paper at CLEI, 2010. (**) http://project.liquidpub.org/karaku (**) http://project.liquidpub.org/resman
  • 42. Social Networking Services Mendeley CiteULike Connotea Delicious Digital libraries SRSAPI SRS Repository Data Crawling DBLP Source dumps Target DB Staging area Scopus Xplore Data Loading and Cleaning …
  • 44. Data acquisition architecture: a Data Acquisition Layer (off-line and on-demand acquisition) over an Adapter Layer, drawing on the data sources DBLP, MAS, CiteULike, Delicious, and Twitter, and feeding Storage.
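The adapter-layer idea on this slide can be sketched as one common interface with per-source adapters, where off-line acquisition preloads a dump and on-demand acquisition would query the source live. The source names come from the slide; the class and method shapes are assumptions:

```python
from abc import ABC, abstractmethod

class SourceAdapter(ABC):
    """Common interface every data source implements."""

    @abstractmethod
    def fetch(self, researcher):
        """Return raw records for one researcher."""

class DBLPAdapter(SourceAdapter):
    """Off-line mode: serve records out of a preloaded DBLP dump."""

    def __init__(self, dump=None):
        self.dump = dump or {}

    def fetch(self, researcher):
        return self.dump.get(researcher, [])

def acquire(adapters, researcher):
    """Merge the records from every configured source into one list."""
    records = []
    for adapter in adapters:
        records.extend(adapter.fetch(researcher))
    return records
```

Adding a new source (MAS, CiteULike, Delicious, Twitter) then only requires a new `SourceAdapter` subclass, leaving the acquisition and storage layers unchanged.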
  • 45. How professors think [Lamont 2009]. Correlation experiments [Jensen 2009, Kulasegarah 2010, Katsaros 2008]. There is no direct study of reputation in the research evaluation process.

Editor's notes

  1. Good afternoon everyone. My name is Cristhian Parra and today I will present the work we are pushing forward in Trento to first capture and later estimate reputation in academia
  2. The most basic definition of reputation goes as follows: reputation (in this case scientific) is the social evaluation by a group of entities (the scientific community) of a person, group of persons, organization, or artifact (here, researchers and their contributions) on a certain criterion (most frequently scientific impact). And why is this of any importance?
  3. With this title, we want to refer to the two main elements of the proposal. The first element is "understanding", which refers to the main goal: to understand the way reputation is formed within and across scientific communities. Very few people will doubt the reputation of people like Einstein in Physics, Turing in CS, or more recently Aho in CS (famous to us students for his Dragon Book). Their good reputation is safe, in a way. Few people, however, can precisely explain why this happens, or what exactly makes researchers hold such a good opinion of some of their peers. This leads us to the second element of the proposal, the fundamental problem we need to solve in order to reach the goal: reverse engineering scientific reputation. How can we derive the main aspects that shape the reputation of researchers in people's minds?
  4. Because science is basically an economy of reputation, where the reward for contributing to science is fundamentally building up your reputation. And this reputation is mainly based on your scientific impact, which is a multi-dimensional construct that cannot be adequately measured by any single indicator [9]. It might depend on features ranging from citation-based bibliometrics, to newer web-based readership or download counts, Twitter counts, or simply the reputation of your affiliation or collaborators. These features can involve both objective (e.g. bibliometrics) and subjective (e.g. affiliation) criteria, and they are highly dependent on the communities: some communities might be more or less subjective than others. Researchers will understand the criteria behind their own reputation. Researchers will also understand how this reputation varies across communities. All this understanding will help to ease the pressure of the publish-or-perish culture. In general, it will improve support for decision making in evaluation processes.
  5. Weak positive linear dependence w.r.t. H-Index (with self-citations); medium positive linear dependence w.r.t. number of publications.
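The linear dependence mentioned in this note is a plain Pearson correlation between survey-based reputation scores and a bibliometric indicator. The sketch below shows how such a coefficient is computed; the five data points are invented for illustration and are not the survey data.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equally sized samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: survey reputation scores vs. H-Index for five researchers.
reputation = [4.5, 3.0, 4.0, 2.5, 3.5]
h_index    = [220, 205, 260, 210, 230]

r = pearson(reputation, h_index)  # always falls in [-1, 1]
```

A value of |r| below roughly 0.5, as reported in the Results slide, indicates only a weak-to-medium linear relationship between the indicator and perceived reputation.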
  6. Institute for Computer Sciences, Social Informatics and Telecommunications Engineering (ICST)
  7. Measure the difference in reputation across different communities; validate the results. The challenges are basically the following. First, we need to obtain reputation information; that is, we need to know the opinion researchers have about other researchers. Second, we need to understand which features characterize researchers or their work in computer science. Examples of features are indicators such as the "total number of publications" and other information that can give an idea of the quality of a scientist's work (e.g. keynote talks, awards, grants, affiliation, etc.). Then, we need to find a way of representing and "collecting" these features; that is, we need to crawl the web, academic libraries, search engines, etc. looking for this information. Once we have all the data, the next step is to effectively "derive" and "represent" the reputation logic behind a particular ranking. Finally, the big challenge is to validate the work: to measure how much our derived reputation algorithms can actually help researchers make better decisions.
  10. Possible examples of combinations: one single feature with the highest correlation to reputation (e.g. H-Index for Databases, Readership for Social Informatics); a linear combination of features; or a complex logic algorithm (e.g. a decision tree).
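The "linear combination of features" option from this note could be sketched as below. The feature values and weights are hypothetical; in practice the weights would be fitted against the survey-based reputation data of a given community.

```python
# Hypothetical (normalized) indicator values for one researcher.
features = {"h_index": 25, "publications": 120, "readership": 340}

# Hypothetical weights; in the real methodology these would be learned
# per community from the elicited reputation opinions.
weights = {"h_index": 0.6, "publications": 0.1, "readership": 0.3}

def reputation_score(features, weights):
    """Linear combination: weighted sum of indicator values."""
    return sum(weights[k] * features[k] for k in weights)

score = reputation_score(features, weights)
```

The single-feature and decision-tree options are the degenerate and generalized cases of the same idea: one weight set to 1, or a non-linear rule replacing the weighted sum.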
  13. Now, I'm sure you are all thinking: "Why do we want to do this?" Yes, and no.
  14. Researchers will understand the criteria behind their own reputation, allowing them to know what really matters when it comes to research impact; that is, which indicators contribute most to the researchers' opinion of reputation. Researchers will also understand how this reputation varies across communities, giving important input for the always difficult problem of cross-community comparisons. This understanding will be built on data sources that include traditional but also social indicators (e.g. LiquidPub, CiteULike, Mendeley, etc.), which means that our results will naturally extend metrics beyond citations, helping to identify ways to measure scientific reputation accurately (i.e. closer to the real opinion of people). All this understanding will help to ease the pressure of the publish-or-perish culture and allow scientists to better focus on what is really important.
  15. In our case, because we want to analyze reputation in the context of science, we need to understand research evaluation: in order to come up with an opinion about a peer in science, what we do is evaluate them. In research evaluation, not only researchers are the subject of evaluation, but also their contributions (papers), the dissemination means such as journals and conferences, and the institutions. To do so, two main methods have been used: committees (such as those of peer review) and quantitative analysis (such as bibliometric indicators).