Search results outliers among MEDLINE platforms

Authors

Christopher Sean Burns, Robert M. Shapiro II, Tyler Nix, Jeffrey T. Huber

DOI:

https://doi.org/10.5195/jmla.2019.622

Keywords:

Information Retrieval, Search Queries, Medical Subject Headings (MeSH), MEDLINE

Abstract

Objective: Hypothetically, content in MEDLINE records is consistent across platforms. Although platforms differ in their interfaces and query syntax requirements, results should be similar when syntax is controlled for across platforms. The authors investigated how search result counts varied when searching records among five MEDLINE platforms.

Methods: We created 29 sets of search queries targeting various metadata fields and operators. Within each set, we adapted the query into 5 distinct, compatible versions to search 5 MEDLINE platforms (PubMed, ProQuest, EBSCOhost, Web of Science, and Ovid), totaling 145 final queries. The 5 versions of each query were designed to be logically and semantically equivalent and were modified only to match each platform's syntax requirements. We analyzed the result counts, comparing PubMed's MEDLINE result counts to those from the other platforms, and identified outliers by measuring result count deviations with modified z-scores centered on PubMed's MEDLINE results.
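For readers who want the outlier criterion made concrete, the following is a minimal sketch, not the authors' code, of Iglewicz and Hoaglin's modified z-score with the center shifted from the median to PubMed's result count, as the Methods describe. It is written in Python; the counts, the platform ordering, and the 3.5 cutoff are illustrative assumptions.

from statistics import median

def modified_z_scores(counts, center):
    # Median absolute deviation (MAD) taken about the chosen center;
    # the classic formulation centers on the median, but here we center
    # on PubMed's count, following the abstract.
    mad = median(abs(c - center) for c in counts)
    if mad == 0:
        return [0.0] * len(counts)  # every count agrees with the center
    return [0.6745 * (c - center) / mad for c in counts]

# Hypothetical result counts for one query set, ordered as
# PubMed, ProQuest, EBSCOhost, Web of Science, Ovid.
counts = [1042, 1790, 1038, 655, 1015]
scores = modified_z_scores(counts, center=counts[0])  # center on PubMed
flagged = [i for i, z in enumerate(scores) if abs(z) > 3.5]  # outlier indexes

With these invented counts, the ProQuest and Web of Science searches would be flagged, which mirrors the deviation pattern reported in the Results.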

Results: Web of Science and ProQuest searches were the most likely to deviate from the equivalent PubMed searches; EBSCOhost and Ovid searches were less likely to deviate. Ovid's results were the most consistent with PubMed's, but Ovid appeared to apply an indexing algorithm that produced smaller retrieval sets than the equivalent PubMed searches. Web of Science exhibited problems with exploding, or failing to explode, Medical Subject Headings (MeSH) terms.
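The exploding problem can be illustrated with hypothetical query strings. The sketch below is not taken from the study's query sets; it simply contrasts exploded and unexploded forms of one MeSH term in PubMed syntax, which explodes by default and uses :noexp to suppress explosion, and in Ovid syntax, which explodes only with an explicit exp operator.

# Illustrative query strings only; the study's actual queries are not
# reproduced here. Exploding a MeSH term automatically includes all of
# its narrower terms in the search.
queries = {
    "PubMed, exploded":     "neoplasms[mh]",
    "PubMed, not exploded": "neoplasms[mh:noexp]",
    "Ovid, exploded":       "exp Neoplasms/",
    "Ovid, not exploded":   "Neoplasms/",
}

A platform that silently substitutes one form for the other retrieves a different record set even though the queries appear equivalent.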

Conclusion: Platform-specific enhancements to interfaces affect record retrieval and challenge the expectation that a MEDLINE platform can, by default, be treated as MEDLINE itself. Substantial inconsistencies in search result counts, as demonstrated here, should raise concerns about the influence of platform-specific features on search results.

This article has been approved for the Medical Library Association’s Independent Reading Program.

Author Biographies

Christopher Sean Burns, Associate Professor, School of Information Science, University of Kentucky, Lexington, KY

Robert M. Shapiro II, Assistant Professor, School of Information Science, University of Kentucky, Lexington, KY

Tyler Nix, Informationist, Taubman Health Sciences Library, University of Michigan, Ann Arbor, MI

Jeffrey T. Huber, Professor, School of Information Science, University of Kentucky, Lexington, KY

Published

2019-07-01

Section

Original Investigation