This presentation was given at a large pharmaceutical company's R&D and corporate affairs campus, going a little more in-depth than the version presented at the prior Science of Team Science Conference.
Evaluating research consortia
1. Evaluating the value of
research-by-consortium
Mark David Lim, PhD
September 4, 2014
2. FasterCures is an "action tank" driven by a singular
goal – to save lives by speeding up and
improving the medical research system.
A center of the Milken Institute, we are a nonprofit
and nonpartisan organization that works with all the
sectors of the medical research and development
ecosystem.
3. Bringing a new discovery from lab to market is a
long, expensive, and risky road
4. innovation
The Milken Institute partnered with
Sanofi to host an Innovation
Retreat in 2011.
The meeting yielded 40+ policy and
R&D recommendations, including:
• Open innovation and
cooperation among competitors
• Collaborating in the
precompetitive space
• Defining metrics of success
7. research-by-consortium
Stakeholders: academia/clinical, patient groups, government, industry
A temporary association of researchers that share
resources and effort toward a common objective.
Consortia integrate multiple types of knowledge
and data from multiple sources, and align
different interests.
8. consortiapedia.fastercures.org
Operational framework (Sci. Transl. Med., June 2014; http://bit.ly/STMConsortia):
• Mission/governance
• Financing
• Data-sharing
• Intellectual property
• and others…
Landscape database (369 consortia):
• Disease focus
• Types of tools
• Where and who
• Why
Planned release: end of 2014
• Consortium-provided content
• Cross-comparison of consortia
• Point-of-contact
9. objectives
• Share findings from analysis of
the consortia landscape
• Propose a new framework for
measuring the value of
research-by-consortia efforts
• Have an open dialogue around
the utility and feasibility of
measuring consortia value
10. Metrics: what is important to you?
- Collaboration and partnerships
- Framework of consortia
- Output, efficiency
11. Who and what
Sci. Transl. Med., June 2014
http://bit.ly/STMConsortia
12. More than half focused on a disease/condition
Examples: sharing comparator-arm data from clinical trials;
research assays and animal models;
a genomic/clinical database of T2D patients;
AgedBrainSYSBIO (age-associated pathways)
Sci. Transl. Med., June 2014
http://bit.ly/STMConsortia
14. Consortium lifespan: 5–6 years
Phases: Inception → Ramp up (~1 year) → Mid-stream (2–3 years) → Wind down (~1 year) → Closure
Project launch
Team culture
Infrastructure
Scientific challenge
Sponsor engagement
Governance
Agreements
Tool concept
Engaging tool-builders
Project plan
Project execution
Milestones
Deliverables
Licensing/IP
Dissemination
Data management
Licensing/IP
Dissemination
Royalties
15. Evaluations should be simple
Hub-and-spoke: a central source of information
(e.g., Innovative Medicines Initiative, Critical Path Initiative,
Foundation for the National Institutes of Health,
Health & Environmental Sciences Institute)
Formalized agreements and governance → transparency
Established timelines and milestones → project management
16. Evaluation = Support
Inception
Ramp up
Mid-stream
Wind down
Closure
| 1 year | 2 - 3 years | 1 year |
Financial and in-kind commitment
Monitoring & Evaluation
Steering
Committee
Board of
Directors
17. Many formal evaluations
Steering Committee Board of Directors
Sponsors Consortium Staff
Research Team
18. Many informal evaluations
Steering Committee Board of Directors
Sponsors Consortium Staff
Research Team
20. What do you value?
Efficiency
- Convening
- Executing
- Managing
- Concluding
Output
- Level of adoption
- Business strategy alignment
- Government roles
- Creating opportunities
- R&D cost/time/efficiency
21. Output: eye of the beholder
Government: public health; regulatory science; de-risk innovation; economic growth; state-of-science research guidance documents
Industry: accelerate pipeline; new therapeutic area; access resources; de-risk innovation; access intellectual capital
Academia: access resources; opportunities for publications; training opportunities; identify collaborators
Patient organizations: accelerate pipelines; advance basic research; de-risk medical product development
Consortium researchers: simplify day jobs; access resources; networking; training/education
22. Bibliometrics
• By the end of 2013, IMI projects had
delivered over 600 scientific
publications in over 300 journals.
• The citation index of papers from IMI
projects is twice the world average,
and higher than the EU average.
Data & analysis: Thomson Reuters (Custom Analytics & Engineered Solutions), 2013
23. Bibliometrics and collaboration
Pre IMI funding award Post IMI funding award
Data & analysis: Thomson Reuters (Custom Analytics & Engineered Solutions), 2013
24. Collaborations – who / what
Co-authorship – 69%
Cross-sector collaboration – 42%
Cross-project collaboration – 37%
Cross-disease collaboration – 31%
IMI researcher networks by sector
Data & analysis: Thomson Reuters (Custom Analytics & Engineered Solutions), 2013
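Collaboration shares like the ones above can be computed from co-authorship records. A minimal sketch in Python, using hypothetical publication records (the field names and data are illustrative assumptions, not the Thomson Reuters dataset):

```python
# Sketch: share of publications with cross-sector co-authorship.
# Publication records below are hypothetical illustrations, not IMI data.

pubs = [
    {"title": "paper A", "sectors": {"academia", "industry"}},   # cross-sector
    {"title": "paper B", "sectors": {"academia"}},
    {"title": "paper C", "sectors": {"industry", "government"}}, # cross-sector
    {"title": "paper D", "sectors": {"academia"}},
]

# A publication counts as cross-sector if its authors span more than one sector.
cross_sector = sum(1 for p in pubs if len(p["sectors"]) > 1)
share = cross_sector / len(pubs)
print(f"Cross-sector collaboration: {share:.0%}")  # prints "Cross-sector collaboration: 50%"
```

The same counting pattern applies to cross-project and cross-disease collaboration, given the corresponding fields on each record.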
25. Value of consortia
How will the output be used? Is the consortium on track?
- Therapeutic area: core strategy vs. opportunistic
- Platform methods/tools: clinical trials, personalized medicine, data standards/exchange, assays
- Others?
Examples of IMI consortia:
Project | Outcome | Output | Area
IMIDIA | Smaller clinical trials and personalized medicine; faster development times, reduced attrition, and predictive models | Biomarkers and personalized medicine; efficacy | Diabetes
COMPACT | Faster development times, reduced attrition, and predictive models | Efficacy | Biologicals
Safe-T | Smaller clinical trials and personalized medicine; faster development times, reduced attrition, and predictive models | Biomarkers and personalized medicine | Drug safety
26. Complexities for evaluation by output
Not all consortium outputs are publishable – licenses, databases
Publications are retrospective, and rarely a primary/secondary deliverable
Different stakeholders = different expectations on output
Bias: "sexiness" of the science
Virtual collaborations – no dedicated laboratory/workspace
Semi-committed teams – not their day jobs
Human capital – turnover, advancement
Numerous consortia, different operational models – cross-comparison?
27. What do you value?
Efficiency
- Convening
- Executing
- Managing
- Concluding
Output
- Level of adoption
- Business strategy alignment
- Government roles
- Creating opportunities
- R&D cost/time/efficiency
28. Evaluating efficiency
Tracking progress - convene to perform
Coordinating virtual teams
• Within work streams
• Across work streams
• With governing bodies
Resolving bottlenecks
• Maintaining scope
• Appropriate expertise / resources
• Communications
• Conflicts / adaptability
• Team member turnover
29. Dynamics of teamwork
Phase of
Research
Stage of
Team
Development
Phase of
Team
Adaptation
Wooten, U. Houston, Science of Team Science conference
30. Phase of
Research
Stage of
Team
Development
Phase of
Team
Adaptation
Development
- goals, mission
Conceptualization
- research question,
framework
Implementation
- launch, conduct
Translation
- application
Wooten, U. Houston, Science of Team Science conference
Hall et al, Trans Behavioral Med (2012)
31. Phase of
Research
Stage of
Team
Development
Phase of
Team
Adaptation
Assess situation
- recognition
Plan formulation
- goal setting, expectations
Plan execution
- monitoring, communication,
coordination
Team learning
- lessons learned
Wooten, U. Houston, Science of Team Science conference
Burke et al., J. Applied Psychology (2006)
32. Phase of
Research
Stage of
Team
Development
Phase of
Team
Adaptation
Forming
- tasks, strategy, team
Wooten, U. Houston, Science of Team Science conference
Tuckman & Jensen, Group and Organizational Studies (1977)
Storming
- roles and interactions
Norming
- rules, roles, expectations
Performing
- tasks, implementation
Adjourning
- finalizing
33. Tracking consortium progress via metrics
Inception
Ramp up
Mid-stream
Wind down
Closure
Collective orientation
Interpersonal relations
Goal setting
Teamwork concept
Knowledge
consideration
Role clarification
Team subgroups
Cohesion / collective efficacy
Evolved interpersonal
relations
Maintaining shared vision
Problem solving / adaptability
Knowledge accommodation
Evolved role clarification
Autonomy & interdependence
Collective knowledge
transformation
Evolved interpersonal relations
Defining accomplishments
Problem solving
Mediated information
exchange
Autonomy & interdependence
35. Convene → Integrate → Implement
Collective orientation
Interpersonal relations
Goal setting
Teamwork concept
Knowledge
consideration
Role clarification
Team subgroups
Cohesion / collective efficacy
Evolved interpersonal
relations
Maintaining shared vision
Problem solving / adaptability
Knowledge accommodation
Evolved role clarification
Autonomy & interdependence
Collective knowledge
transformation
Evolved interpersonal relations
Defining accomplishments
Problem solving
Mediated information
exchange
Autonomy & interdependence
Leveraging human capital
36. Periodic survey of team dynamics
Steering Committee Board of Directors Research Team
Consortium Staff
62%
coherence in mission
35%
contribution
Corrective action:
- Increased face-to-face interaction
- Document-sharing technology
- Conflict resolution
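Headline figures like the 62% and 35% above come from tallying periodic survey responses. A minimal sketch, assuming a hypothetical two-question agree/disagree instrument (the question names and responses are illustrative, not the consortium's actual survey):

```python
# Minimal sketch: tally a periodic team-dynamics survey into headline
# percentages. Question names and responses are hypothetical.

def agreement_rate(responses, question):
    """Share of respondents who answered 'agree' to a given question."""
    answers = [r[question] for r in responses if question in r]
    return sum(1 for a in answers if a == "agree") / len(answers)

# Hypothetical responses pooled from steering committee, staff, and research team.
responses = [
    {"mission_coherence": "agree", "contribution": "disagree"},
    {"mission_coherence": "agree", "contribution": "agree"},
    {"mission_coherence": "disagree", "contribution": "disagree"},
    {"mission_coherence": "agree", "contribution": "disagree"},
]

print(f"Coherence in mission: {agreement_rate(responses, 'mission_coherence'):.0%}")
print(f"Contribution:         {agreement_rate(responses, 'contribution'):.0%}")
```

Running the survey on a fixed cadence lets low scores trigger corrective actions like those listed above.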
37. Framework for reports
Consortium staff → sponsors: operational efficiency, alignment to strategy
38. Metrics = better communication?
• Output - Did the team deliver?
• Technical milestones – binary
• Team dynamics - Could the team have done better?
• Leverage resources and expertise
• Adaptability
Mid-term report overview
Technical progress: 4 / 5 milestones accomplished
Stage of team: Perform
Team integration across disciplines
Document sharing / development
Researcher engagement
Steering committee alignment
Interdependency defined
Conflicts resolved
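Because technical milestones are binary, the progress line of a mid-term report like the one above reduces to a simple count. A sketch with hypothetical milestone names:

```python
# Sketch of the milestone line in a mid-term report: technical milestones
# are binary (done / not done), so progress is a simple count.
# Milestone names are hypothetical.

milestones = {
    "assay validated": True,
    "animal model qualified": True,
    "data standard drafted": True,
    "pilot dataset shared": True,
    "biomarker panel locked": False,
}

done = sum(milestones.values())  # True counts as 1, False as 0
print(f"Technical progress: {done} / {len(milestones)} milestones accomplished")
# prints "Technical progress: 4 / 5 milestones accomplished"
```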
39. How was output used?
Business strategy
• Open new therapeutic approaches and research avenues
• Reduce R&D costs, time to market and development risk
• Increase the efficacy and/or safety of existing drugs
Indirect benefits
• Education and training
• Spin off companies
• New partners (patients, foundations, academic, SME)
• Increased interest in geographic investments
Others
• Implementation of standards / best practices / tools into strategy
• Informing regulatory science (policy / guidelines)
• Publication output and extent of collaboration
• Intellectual Property metrics
IMI Executive Office and other consortia
40. Design
• Consortium management
• Consortium participants
• Sponsor/stakeholder
Refine
• Other consortia – managers/participants
• Other sponsor within same sector
Pilot
• Several consortia
Optimize
• Analyze / Evaluate
• Optimize survey vehicles
• Re-pilot
41. ?
Utility
• Is there a need for these evaluations?
• Can they inform best practices?
• Do they apply to other, non-consortium partnerships?
Approach
• Is this the right approach?
• Is aligning consortia with business strategy necessary
after concept development?
• Is it generalizable?
• Are there other key elements to measure?
• Indirect effects?
• Who/how to pilot?
Implementation
• How to measure (surveys, etc.)?
• Who measures?