Pharmaceutical companies depend heavily on access to high-quality information retrieval. Insufficient gathering and selection of scientific information could steer corporate decision-making in the wrong direction.
To assess the value of external information retrieval services, a number of third-party information providers were contacted with two information research requests (within inflammatory diseases). The providers were asked to return search results together with the search methodologies used. In the first search, interaction with the providers was kept to a minimum, whereas in the second search the contact, direction, and interaction were increased.
It is concluded that information research results from different providers vary considerably. The expected increase in homogeneity among the providers' results could not be confirmed after the second search. The overall overlap of results was 38% for the first search and 33% for the second, and, surprisingly, none of the references were found by all providers.
To fully cover the area of interest and to avoid bias, it is recommended to perform exhaustive scientific literature searches. Researchers and decision-makers should accept large result sets from literature searches and promote initiatives to analyse these results in detail.
1. Biased Information Retrieval in Pharmaceutical Drug Development
ICIC - 20th October 2015
Kasper Højby Nielsen
kphn@novonordisk.com
Information scientist, team leader
Novo Nordisk A/S, Denmark
2. Background: Medicine, human biology 1993
Profession: Academic librarian 1995
-> information scientist 2006
-> information consultant 2015
I have performed hundreds if not thousands of searches!
Who am I?!
ICIC 2015: Biased Information Retrieval, 20-10-2015
5. An information scientist combines
• Field knowledge (scientific field)
• Business understanding
• Library field insight and experience
and acts as an internal information consultant:
Global Information & Analysis in Novo Nordisk
6. • I am not an IT specialist
• I am an advanced user of IT systems/databases
• I have performed a study of vendor search deliveries
Why am I invited to ICIC?
7. • Information searching
• Bias in information retrieval
• Investigation objectives
• Methods
• Results
• Conclusions and recommendations
Agenda
8. Oxford Dictionary:
Search (for somebody/something):
1. An attempt to find somebody/something, especially by looking carefully for them/it
2. An act or the activity of looking for information in a computer database or network
What is a search?
9. • IT perspective
• Information scientist perspective: Searching for data in bibliographies
A combination of queries leading to a filtered result
What is a “search”?
10. What is a “search”?
1_: ((SAFETY OR (ADVERSE ADJ EVENT$1)) OR (ADVERSE ADJ REACTION$1)) OR (ADVERSE ADJ EFFECT$1)   [1386359 docs]
2_: ((SUSAR$1 OR (SIDE ADJ EFFECT$1)) OR (DRUG ADJ REACTION$1)) OR (DRUG ADJ EFFECT$1)   [969051 docs]
3_: COMPLICATION$1   [1417042 docs]
4_: NOVOMIX$1 OR NOVOLOG$1 OR NOVORAPID$1 OR LEVEMIR$1 OR LANTUS$1 OR ASPART$1 OR GLARGINE$1   [9939 docs]
5_: (DETEMIR$1 OR ((BIPHASIC ADJ INSULIN) ADJ (LISPRO$1 OR ASPART$1))) OR (INSULIN ADJ ANALOG$4)   [6216 docs]
6_: HUMALOG$1   [512 docs]
7_: 1 OR 3 OR 2   [3210368 docs]
8_: 6 OR 5 OR 4   [12349 docs]
9_: 8 AND 7   [6151 docs]
10_: "2014".PY.   [4729501 docs]
11_: 9 AND 10   [1143 docs]
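The stepwise combination above (OR as set union, AND as set intersection) can be sketched with plain sets. This is a minimal illustration over a hypothetical mini-corpus of document IDs, not the real database or its query syntax:

```python
# Hypothetical mini-corpus: each numbered query returns a set of document IDs.
q = {
    1: {"d1", "d2", "d3"},         # safety / adverse event terms
    2: {"d2", "d4"},               # side effect / drug reaction terms
    3: {"d5"},                     # COMPLICATION$1
    4: {"d1", "d6"},               # brand names (NovoMix, Levemir, ...)
    5: {"d6", "d7"},               # detemir / insulin analogue terms
    6: {"d8"},                     # Humalog
    10: {"d1", "d2", "d6", "d9"},  # publication year 2014
}

q[7] = q[1] | q[3] | q[2]   # 7_: 1 OR 3 OR 2   (all safety-related docs)
q[8] = q[6] | q[5] | q[4]   # 8_: 6 OR 5 OR 4   (all product-related docs)
q[9] = q[8] & q[7]          # 9_: 8 AND 7       (safety AND product)
q[11] = q[9] & q[10]        # 11_: 9 AND 10     (limited to PY 2014)

print(sorted(q[11]))        # the final filtered result
```

Each later step only reuses earlier result sets, which is why host query languages let searchers refer back to queries by number.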
11. ((SAFETY OR (ADVERSE ADJ EVENT$1)) OR (ADVERSE ADJ REACTION$1)) OR (ADVERSE ADJ EFFECT$1)   [1386359 docs]
What is a search?
12. What is a good search?
13. It all depends!
What is a good search?
14. Standardisation? Building the search profile
What is a good search?
Safety: Adverse event, Adverse reaction, Adverse effect, AE, SAE, …
Drug: Insulin analogue, Modern insulin
Product: Levemir, Detemir, Lantus, Glargine, …
Children: Child, Paediatric, Teenager, Teen, …
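The concept groups above can be modelled as synonym sets: a record qualifies only if it matches at least one term from every group, i.e. the intersection across all concepts. A minimal sketch, with toy terms and a hypothetical record (not the deck's actual search profile):

```python
# Each concept is a set of synonyms; a record is relevant only if it hits
# at least one synonym from EVERY concept group.
concepts = {
    "safety": {"safety", "adverse event", "adverse reaction", "adverse effect"},
    "product": {"levemir", "detemir", "lantus", "glargine"},
    "population": {"children", "child", "paediatric", "teenager", "teen"},
}

def matches(text, concepts):
    """True if the text contains a term from every concept group."""
    text = text.lower()
    return all(any(term in text for term in group) for group in concepts.values())

record = "Adverse event profile of insulin detemir (Levemir) in paediatric patients"
print(matches(record, concepts))  # True: hits the safety, product and population groups
```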
15. What is a good search? Intersection
16. “Please give me everything about diabetes”
Results: 498,147 (PubMed as of 11 October 2015)
“Please give me everything about diabetes mellitus”
Results: 380,259 (PubMed as of 11 October 2015)
What is a good search query?
18. Many ways to perform a search
Are search results from different vendors alike – how much do they differ?
What is the impact of strengthening the interaction between customer and vendor?
Had something like this been done before? Not to my knowledge
Study to compare vendor results
19. It is hypothesised that a potential difference in response from third party providers to identical literature search requests can be avoided or at least substantially reduced by strengthening the communication and feedback between requester and providers.
Hypothesis
20. Methods - highlights
• Preparations
• Field study
• Processing of data
• Analysis
21. Methods - preparations
• Preparations of research questions
• Contact with a set of information providers – six in total
• Blinded (did not know about the project)
• The providers were paid for their services
• Novo Nordisk could use the data
• International
• Public/Private
• Same communication to all vendors
22. Methods - preparations
Literature search request no. 1
Email:
I would like a comprehensive literature search performed on the following subject – preferably within 5-10 working days. Would this be feasible?

In Systemic Lupus Erythematosus (SLE): What is the frequency of different comorbid diseases (mortality related) and do you find the occurrence related to disease severity? Gender differences? Please attach the reference list (including abstracts) as well as the search criteria used including databases searched.
23. Methods - preparations
Literature search request no. 2:
Email:
Please provide a comprehensive literature search covering the following questions within the next 5-10 working days:

What is the frequency of comorbid diseases in Rheumatoid Arthritis (RA) regarding cardiovascular events and cancer? Is there a difference in occurrence of the above between the following subpopulations: MTX-naïve patients? MTX-IR RA patients? TNF-IR patients? (IR = inadequate response.)

Please attach the reference list (including abstracts) for the recent 5-10 years as well as the search criteria used including databases searched.

Please contact me via email with any questions you may have for this query.
24. Methods - Field study
• Interaction with the information providers
• Search no. 1: Minimal interaction
• Search no. 2: Attempts to increase communication
• Email correspondence
• Providers kept blinded
• Reception of responses/results
25. Methods - Processing of data
• Re-retrieval of references (intra-provider duplicate exclusion)
• Transfer to Reference Manager (citation handling system)
• Duplicate determination (overlap)
• Relevance review
• Vendor search methodology review
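The duplicate determination step above can be sketched by normalising each reference to a comparison key, so that formatting differences between providers do not hide true overlaps. The reference data below are invented for illustration; real pipelines would also match on DOI or PMID where available:

```python
# Sketch of inter-provider duplicate determination via normalised keys.
def dedup_key(ref):
    # Collapse whitespace and case in the title; pair it with the year.
    return (" ".join(ref["title"].lower().split()), ref["year"])

provider_a = [{"title": "Comorbidity in SLE", "year": 2012}]
provider_b = [{"title": "Comorbidity in  SLE ", "year": 2012},
              {"title": "Mortality in lupus", "year": 2011}]

keys_a = {dedup_key(r) for r in provider_a}
keys_b = {dedup_key(r) for r in provider_b}
print(len(keys_a & keys_b))  # 1 shared reference despite spacing differences
```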
26. No or very little response/interaction
Vendor behaviour
27. Results
Responses from providers (search no. 1)
Provider no. | No. of references (PY = 2009-2013) | Total no. of references
1 | 64 | 123 (no PY limitation)
2 | 94 | 94 (2009-2013)
3 | 37 (2012-2013) | 37 (2012-2013)
4 | 75 | 77 (2008-2013)
5 | 67 | 109 (2004-2013)
6 | 133 | 252 (no PY limitation)
Total | 472 | 692
28. Results – Overlap search no. 1
29. Results – Overlap search no. 2
30. Most important findings:
Overlap, search no. 1: from 35% (providers 4 and 5) to 2% (providers 1 and 2)
Overlap, search no. 2: from 24% (providers 5 and 6) to 5% (providers 2 and 6)
None of the references were identified by all providers
Results
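What a pairwise overlap percentage like those above means depends on the definition used; one common choice (an assumption here, not necessarily the study's exact metric) is the Jaccard ratio, shared references divided by the union of both providers' reference sets:

```python
# Jaccard-style overlap between two providers' reference sets, in percent.
def overlap_pct(a, b):
    return 100 * len(a & b) / len(a | b)

# Toy reference IDs, invented for illustration.
provider_4 = {"r1", "r2", "r3", "r4"}
provider_5 = {"r3", "r4", "r5"}
print(round(overlap_pct(provider_4, provider_5)))  # 40 for these toy sets
```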
32. • The hypothesis was rejected: Attempts to increase interaction between customer and providers did not lead to an increase of overlap between provider results
• There is a risk of introducing information retrieval bias
Conclusions
33. • Decision makers' perception of information research
• Outsourcing?
1. Lack of understanding (business understanding / query understanding)
2. Lack of communication initiative
3. No use of advanced search methodology
Concerns
34. 1. Understand the information search request in detail
2. Establish a search strategy
3. Generate a search profile
4. Initiate the search as an iterative process
5. Combine various search methods
6. Accept large sets of search results
7. Allow time and allocate resources to filter down and analyse the search results for relevance
8. Consider whether an advanced analytical tool, e.g. text mining, could be applied to analyse large sets of data
Recommendations
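Recommendation 8 (advanced analytical tools such as text mining) can start very simply: even a standard-library term-frequency pass over abstracts helps triage a large result set before detailed manual review. The abstracts and stopword list below are invented for illustration:

```python
# Minimal text-mining sketch: rank the most frequent content words across
# a set of abstracts to get a first impression of a large result set.
from collections import Counter
import re

abstracts = [
    "Cardiovascular events in rheumatoid arthritis patients on methotrexate.",
    "Cancer incidence and cardiovascular risk in rheumatoid arthritis.",
    "Comorbidity burden in systemic lupus erythematosus.",
]

stopwords = {"in", "and", "on", "of", "the"}
counts = Counter(
    w
    for text in abstracts
    for w in re.findall(r"[a-z]+", text.lower())
    if w not in stopwords
)
print(counts.most_common(3))  # the three most frequent content words
```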
37. • Who were the vendors?
• Are the results published?
• Case: PhD literature search course
Back up slides