Jill Barr-Walker, MPH, MS
doi: http://dx.doi.org/10.5195/jmla.2017.109
Received 01 July 2016; Accepted 01 August 2016
ABSTRACT
Objective
This study assessed public health workers’ evidence-based information needs, based on a review of the literature using a systematic search strategy. This study is based on a thesis project conducted as part of the author’s master’s in public health coursework and is considered a systematized review.
Methods
Four databases were searched for English-language articles published between 2005 and 2015: PubMed, Web of Science, Library Literature & Information Science Index, and Library, Information Science & Technology Abstracts (LISTA). Studies were excluded if there was no primary data collection, the population in the study was not identified as public health workers, “information” was not defined according to specific criteria, or evidence-based information and public health workers were not the major focus. Studies included in the final analysis underwent data extraction, critical appraisal using CASP and STROBE checklists, and thematic analysis.
Results
Thirty-three research studies were included in the final analysis: twenty-one used quantitative methods and twelve used qualitative methods. Critical appraisal revealed many potential biases, particularly threats to the validity of the research. Thematic analysis revealed five common themes: (1) definition of information needs, (2) current information-seeking behavior and use, (3) definition of evidence-based information, (4) barriers to information needs, and (5) public health–specific issues.
Conclusions
Recommendations are given for how librarians can increase the use of evidence-based information in public health research, practice, and policy making. Further research using rigorous methodologies and transparent reporting practices in a wider variety of settings is needed to further evaluate public health workers’ information needs.
Evidence-based information in public health (EBPH) is an emerging topic in the field of public health. There are many important components to EBPH, but this review will focus on the aspect of EBPH that is defined as making decisions on the basis of the best available scientific evidence [1]. Guidelines in the literature describe the use of EBPH and stress the importance of this practice [2–4]. EBPH is often used to create interventions, with a general recognition that this approach is essential to changing public health outcomes [5]. Similarly, there is demand for public health policies to be based on existing evidence [6], and the principles of EBPH are increasingly being taught in public health departments [5, 7, 8]. Despite the belief that these concepts are important in public health, evidence-based information remains underutilized in practice, and research plays a limited role in the formulation of policy and interventions in public health [6]. There are many possible barriers to the use of evidence-based information in public health, such as a lack of knowledge and skills regarding EBPH, lack of communication of evidence-based research findings to policy makers, lack of an organizational culture that supports EBPH, and lack of funding for EBPH resources [9, 10].
Librarians have been at the forefront of the evidence-based information movement by providing instruction and support to researchers. Librarians who serve public health workers, including medical and academic librarians, have a unique opportunity to ensure that this population utilizes evidence in their research and practice. Despite the variety of settings that employ public health workers, research on this topic has traditionally focused on those working in clinical settings [11, 12] and government departments [13, 14]. Few studies have surveyed public health workers in other occupational contexts, and no systematic reviews currently address the information needs of public health workers in academia or private organizations.
Given the importance of evidence-based information to the fields of public health and library and information science, public health workers’ knowledge of, access to, and use of this information is a topic worthy of study. Although literature reviews on this topic exist [15, 16], none have employed systematic search strategies or critical appraisal. The main aim of this study is to assess the information needs of public health workers based on a review of the literature using a systematic search strategy.
This project was originally completed as a master’s thesis as part of the author’s master’s in public health coursework at the London School of Hygiene and Tropical Medicine. Because the author conducted the review as a solo project, it should not be considered a systematic review, but rather a systematized review [17]. The potential biases that may result from this methodology are discussed in the limitations section.
Searches were carried out to address the question: what are the information needs of public health workers? Because the search question includes concepts from both library and information science and public health, the following four databases were searched: PubMed, Web of Science, Library Literature & Information Science Index, and Library, Information Science & Technology Abstracts (LISTA). Gray literature was not included in the search. The field of library and information science has a well-established network of journals, and research in this field is published and disseminated through traditional channels. Although public health researchers often produce gray literature, such as unpublished reports, this format is not widely used in librarianship. Because the research question spans both fields, the most relevant studies were expected to appear in databases of published literature.
The search was undertaken on April 1, 2015. Search terms included information, public health, librarian, evidence-based policy, and other relevant keywords based on the concepts of information needs and EBPH. A complete list of searches that were conducted can be found in supplemental Appendix A. Initial search terms were broad, and, upon review of the results, searches were narrowed to increase the proportion of relevant articles and the specificity of results. For example, a search in Web of Science for “public health” AND “information” yielded over 11,000 results, many of which were not relevant to the research question. Refining the search term to “public health information” yielded 181 articles, the majority of which were relevant. Citation checking in Web of Science was used, and hand-searching, including reference searching and key author searching, was also used.
Date and language limits were placed on the search: only English-language articles and those published between 2005 and 2015 were included. Database searching yielded 1,615 articles. After the removal of 405 duplicates, 1,210 articles were included in title and abstract screening. During the title and abstract screening, 1,078 articles were excluded, leaving 132 studies to evaluate during full-text screening.
To maintain the focus on public health workers and information needs, exclusion criteria were applied. Studies were excluded if there was no primary data collection, the population in the study was not identified as public health workers, “information” was not defined according to the criteria described below, or the major focus of the study was not evidence-based information or public health workers.
To define exclusion criteria, essential aspects of the research question were defined. A public health worker was defined as “a person educated in public health or a related discipline who is employed to improve health through a population focus” [18]. Information was defined as “any stimulus that reduces uncertainty in a decision-making process,” and an information need was defined as “the recognition of what information can reduce this uncertainty as well as unrecognized or potential information needs” [3]. All definitions were taken from a well-designed study of public health workers’ information needs that Revere et al. conducted [16]. Exclusion of studies without primary data collection was necessary to limit the scope of this review to original research.
During full-text screening, ninety-nine studies were excluded. A list of reasons for exclusion is included in Figure 1. Thirty-three studies were included in the final analysis.
Figure 1 PRISMA flow diagram [53]
Data were extracted from each study using the criteria outlined in the Critical Appraisal Skills Programme (CASP) checklist for qualitative studies [19] and the STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) checklist for quantitative studies [20]. A form was created based on these checklists that asked fourteen questions of each qualitative study and twenty-two questions of each quantitative study. Extracted data included the type of study population, sample size, outcome measures, and potential sources of bias. The data extraction checklists can be found in supplemental Appendixes B and C.
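For illustration only, the sketch below shows one way such an extraction record could be structured; the field names are hypothetical and are not the actual checklist items in Appendixes B and C.

```python
# Illustrative sketch only: a minimal data extraction record of the kind
# described above. Field names are hypothetical, not the actual CASP or
# STROBE checklist items used in this review.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ExtractionRecord:
    study_id: str                      # e.g., first author and year
    design: str                        # "quantitative" or "qualitative"
    checklist: str                     # "STROBE" (22 items) or "CASP" (14 items)
    population: str                    # type of study population
    sample_size: Optional[int] = None  # None when not reported
    outcome_measures: List[str] = field(default_factory=list)
    potential_biases: List[str] = field(default_factory=list)


# A hypothetical record for a single survey study
example = ExtractionRecord(
    study_id="Example 2010",
    design="quantitative",
    checklist="STROBE",
    population="local health department employees",
    sample_size=150,
    outcome_measures=["self-reported information sources used"],
    potential_biases=["convenience sample", "self-report"],
)
```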
Critical appraisal is the process of systematically examining research to judge its trustworthiness, value, and relevance in a particular context [21]. It is used to evaluate the quality of the studies examined, detect biases, and assess and discuss the validity, relevance, and usefulness of research evidence [22]. Data from each of the studies were critically appraised using the CASP checklist for qualitative studies [19] and the STROBE checklist for quantitative studies [20]. The questions used in the critical appraisal checklists can be found in supplemental Appendixes B and C.
A thematic synthesis was applied to the studies, whereby the findings were examined for analytical themes and compared across studies [23]. The findings were reviewed for key concepts and recurring themes, gaps in the current evidence base, and potential areas for further research.
Ethical approval was obtained from the London School of Hygiene and Tropical Medicine on February 16, 2015.
Twenty-one quantitative studies and twelve qualitative studies were included in this review. A summary of study characteristics can be found in Table 1. All quantitative studies used surveys, while ten qualitative studies used semi-structured interviews, six used focus groups or observational methods, and two used surveys. One study used a randomized controlled trial design [24]. Sample sizes ranged from 6 to 904. The median sample size for quantitative studies was 134, and the median for qualitative studies was 35. Many researchers used convenience samples, often consisting of academic or government employees of departments where the study was conducted [25–28]. There were several types of participants: health department employees, academic faculty, and clinical public health workers were the most common. On average, participants were young, typically with less than five years of total experience or time at their current jobs. In surveys of health department directors, the amount of experience was considerably higher.
Table 1 Descriptive summary of studies
Eighteen studies were conducted in the United States, with eleven occurring in one location and seven taking place either nationwide or in multiple states or regions. Seven studies were conducted in Canada, and four were conducted in Europe or Australia. Four studies were conducted in non-Western countries: India, Brazil, and Ethiopia.
There were several recurrent methodological issues in the studies. The most frequently occurring issues were low generalizability, selection bias related to response rate, small sample size, selection bias related to sampling, no data analysis, self-reporting used as a data collection tool, poor survey design, interviewer bias, and the lack of author acknowledgement of bias. A list of critical appraisal issues observed in the studies can be found in Table 2.
Table 2 Critical appraisal issues that occurred in the studies
The most common issue for critical appraisal that was observed across studies was a lack of information provided, often about sampling strategy, response rates, and data analysis. For this reason, it was not possible to complete the entire CASP or STROBE checklist for most studies.
Eight quantitative and two qualitative studies did not use data analysis [14, 26, 29–36]. Most qualitative studies used thematic analysis and coding, but 40% of quantitative studies did not use any statistical analysis methods. Quantitative studies most often used descriptive statistics and tested bivariate relationships using chi-square or t-test methods. Several studies calculated odds ratios [10, 37, 38].
Many survey studies had low response rates, and authors infrequently addressed possible reasons for this. Overall, response rates ranged from 25% to 100%, with a median of 65%. In ten studies, the authors did not acknowledge or address possible biases in their research [25, 26, 29, 31, 37, 39–43].
Five themes emerged during thematic analysis: definition of information needs, current information-seeking behavior and use, definition of evidence-based information, barriers to information needs, and public health–specific issues. Results of critical appraisal did not affect this analysis; all studies were weighted equally regardless of potential biases.
Twenty studies were devoted to examining the information needs of public health workers [11–14, 26–28, 30–33, 35–37, 42, 44–47]. Participants self-reported these needs, usually from a list of items in a survey. Information needs were defined as the types of information that public health workers need in their daily work. Several studies reported that public health workers need statistics, government reports and guidelines, and journal articles [11, 27, 31, 40, 42, 46, 48]. Staying current with the latest public health research, finding data, and finding materials for grant writing were other areas of need [11, 13, 14, 32, 33, 49]. Participants frequently expressed the need for librarians but were often uncertain about the services that librarians could provide [11, 13, 26, 28, 29, 46]. Information needs often differed according to the roles and positions of participants [12, 40]. Local health department employees, for example, were more interested in practical knowledge, while state health department employees had a greater need for guidelines and program planning [40].
Twenty studies discussed current information-seeking behavior and the use of information [11–13, 24, 26–28, 30, 32–37, 41–43, 46–48]. Data were collected by asking which information sources public health workers used in their work, usually in the form of a list. The most commonly selected information source was PubMed/MEDLINE, followed by Google and the Centers for Disease Control and Prevention (CDC) website [12, 13, 26, 27, 32, 49]. In one study, few participants were aware of or used any databases besides PubMed [32]. Journal articles were identified as one of the main sources of information across participant types [13, 26, 28, 32, 33, 35, 46, 49]. Many public health workers used colleagues as a source of information, often more frequently than they used online databases or librarians [12, 13, 27, 37, 42, 44, 48, 49]. Librarians were not a heavily used resource, and there was a prevailing lack of knowledge about the role of librarians and the services they could provide [11, 13, 26, 28, 29, 46].
Discrepancies between information needs and information use among public health workers were observed. Information needs, as defined by public health workers, included journal articles, peer-reviewed information, and help finding data. Reports of public health workers’ resource use differed from these findings: although use of PubMed was common, Google and the CDC website were consistently rated as top information sources, while online databases containing peer-reviewed journal articles were rated among the least used [12, 13, 26, 27, 32, 49]. Although highly rated as an information need, librarians were underutilized, with many participants expressing uncertainty about whether they could access librarian services at their workplaces [11, 13, 26, 28, 29, 46]. One survey study found that while collaboration with other researchers was rated as very important, file sharing and collaborative data management programs like EndNote were used by less than 10% of participants [32].
Thirteen studies examined EBPH information [2, 10, 12, 25, 29, 34, 36, 40, 43, 45, 50, 51]. These studies surveyed health department employees, directors, and policy makers to learn about their use of evidence-based decision making (EBDM). Across studies, EBDM was consistently described as an important tool that public health workers should use. In practice, however, public health workers did not use EBDM regularly [36, 38, 47]. Public health workers’ lack of knowledge about EBDM and the concept of evidence was demonstrated in several studies [25, 34, 36, 47, 50]. One study asked participants to define evidence and received diverse responses that reflected a wide range of knowledge about the topic [44]. Those with less education in public health and epidemiology felt the least confident in using EBDM and reported finding and using evidence in their work least often [10, 36, 50]. Overall, public health workers were interested in increased training for EBDM, and this finding held true across workplace types and education levels [25, 36, 39, 45, 47, 50, 51].
Eighteen studies surveyed participants about internal and external barriers to finding, accessing, and using information in their work. Internal barriers included lack of time, funding, training, staff, equipment, and subscriptions to journals [10, 11, 36–38, 42, 47, 49–51]. Public health workers reported that training was needed on how to find, access, evaluate, and synthesize information [12, 13, 25, 27, 32, 36, 45, 47, 49, 50]. Training was also needed on the processes involved in EBDM, including an explanation of what evidence consists of, how to find it, and how to use it in real-world situations [12, 13, 25, 27, 32, 36, 45, 47, 49, 50]. An inability to access information because of technical issues or lack of funding that led to the unavailability of full-text articles was also reported [10, 11, 36–38, 42, 47, 49–51]. The political climate, including local agendas that did not prioritize EBDM, was a common external barrier [10, 30, 34, 38, 39, 45, 50]. The need for organizational change and managerial support was cited as another significant barrier to using EBDM, as this lack of support influenced internal factors like funding, training, and time [10, 11, 38, 45, 47, 50, 51]. Studies conducted in low-income countries reported the unavailability of computer equipment and lack of computer literacy as considerable barriers to using evidence-based information [37, 42]. One older study in the United States also mentioned these factors [12].
The final theme addressed public health–specific issues related to finding and using evidence-based information. Public health is an interdisciplinary field, and public health workers must search multiple subject databases, many of which might not reflect new trends in the field [44, 50]. In several of the studies in this review, gray literature, statistics, and government guidelines were cited as important public health sources that public health workers frequently used [11, 12, 27, 31, 40, 42, 46, 48, 49], but traditional online databases do not contain these materials.
Public health workers in several studies reported that using evidence could conflict with their mandate of community empowerment if community members identified different priorities than those recommended in evidence-based research [49, 51]. In some research settings, especially those involving underserved populations, there was a lack of evidence for public health topics, and the lack of transferability could preclude use of an evidence base for these populations [25, 44, 47, 50].
Public health workers see EBPH as a high-priority initiative [25, 36, 39, 45, 47, 50, 51]. However, there is a discrepancy between the stated importance of this process and its actual practice. Public health workers are uncertain about some of the most basic aspects of this concept, including the definition of evidence, where to find it, and how to use it [36, 38, 47, 51]. Interestingly, few studies about the information needs of public health workers considered these needs in relation to EBDM, although this initiative is increasingly important in public health departments. A consistent message from public health workers was their desire for training in EBDM and in finding and using evidence in their work [25, 32, 36, 44, 45, 47, 51]. Although they expressed information needs covering a wide range of sources, very few public health workers were aware of the full range of services that libraries provide, including access to peer-reviewed articles and gray literature. Importantly, there was a distinct disconnect between the needs that participants identified and their awareness that librarians could assist with those needs. For example, many public health workers expressed the need to learn more about identifying and evaluating relevant articles when dealing with large amounts of information, but none acknowledged that this was a service that librarians could provide [12, 13, 27, 30, 36, 44, 45, 47, 49].
Each of these findings presents an opportunity for librarians to showcase their skills in finding, accessing, and using information. Librarians can use their evidence-based information skills to teach public health workers about EBDM. Even if librarians do not have advanced degrees in public health, their knowledge of the scientific information life cycle, including the production and use of evidence, can be leveraged in instruction. Librarians should also be aware of public health–specific research needs, including the importance of unpublished material like gray literature and statistics. While teaching public health workers to use evidence-based information, librarians should acknowledge the dearth of research on underserved populations in the public health evidence base; further research in this area should be encouraged and prioritized.
Some of the discrepancy between reported information needs and information use among public health workers may be due to a lack of awareness of library collections and services. Teaching public health workers about the importance of evaluating information found online may encourage them to use peer-reviewed journal databases rather than nonauthoritative Internet sources. Raising awareness about the services that a librarian can provide, such as creating search strategies and setting up data management plans, can encourage public health workers to consult librarians rather than asking other colleagues for help. Librarians should work closely with stakeholders, including directors of public health departments and public health faculty, to ensure buy-in and participation from the top of an organization down; it is essential that librarians and public health decision makers work together to increase awareness of the evidence-based information resources available for public health work.
This review replicates the findings of previous studies that found access, funding, and time to be major barriers to using evidence-based information [15, 16]. Additionally, several studies in this review highlighted external factors like political climate, organizational culture, and funding mandates [10, 30, 39, 45, 47, 50]. Funding mandates that require evidence-based information to be used in research can potentially increase organizational support for EBDM and should be supported.
Critical appraisal identified many potential biases in the quality of studies. Many studies used small samples or convenience samples to evaluate services to local populations [26, 28, 32]. These techniques, while common in qualitative studies, can limit the generalizability of findings to other settings. Public health directors were a potentially overrepresented group, as they were surveyed exclusively in several studies [10, 27, 30, 38, 47, 50]. Directors are often responsible for new initiatives like EBDM, but questioning them exclusively may lead to overrepresentation of their views relative to those of other public health workers.
Potential selection bias can also occur because of issues related to sampling and response rate across studies. In studies that used nationwide surveys, the sample was often drawn from member associations like the National Association of Chronic Disease Directors, a group that is very narrow in scope [10, 50]. A methodological issue specific to survey studies was the lack of attention paid to participant flow. In several cases, respondents did not complete all questions or there was a high dropout rate, but this was not reflected in the analysis of results [24, 48].
Self-report, a commonly used data collection method, relies on participants’ memory or perceived needs. Additional measurements that assessed actual use of databases, for example, would have helped address potential self-reporting concerns. Importantly, many authors did not evaluate bias in their own work. Studies that did address bias most frequently cited self-reporting [10, 35, 38, 45, 47, 48], small sample size [12, 13, 30, 32, 34, 38, 44, 45], lack of generalizability because of a limited sample [11–14, 27, 28, 32, 34, 46, 47, 49, 51], and low response rates [10, 24, 36, 45, 50].
The most striking finding of the critical appraisal analysis is that many studies omitted information about data analysis, response rates, sampling strategies, and potential biases. Ten of the 33 studies (eight quantitative and two qualitative) did not use any type of data analysis to evaluate their results. The lack of data analysis points to a possible gap in knowledge of statistical data analysis methods, especially among librarian researchers. Even a simple bivariate analysis illustrates relationships between results in a way that presenting raw data does not. One-third of the studies did not address potential bias in their work, and many did not include information about how participants were selected or how data analysis was conducted. These findings speak to a need for transparency in presenting research methods and for education in the use of rigorous methodologies. Librarians and public health workers alike may require training in research methods to understand the importance of reducing bias, improving validity, and increasing transparency in reporting research findings.
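As a purely illustrative sketch of what such a simple bivariate analysis might look like, the example below runs a chi-square test on a hypothetical two-by-two table of survey counts; the variables and numbers are invented and do not come from any study in this review.

```python
# Minimal sketch of a simple bivariate analysis on hypothetical survey data:
# does self-reported database use differ between respondents with and without
# formal public health training? All counts below are invented.
from scipy.stats import chi2_contingency

#                uses databases   does not use databases
observed = [
    [45, 30],  # respondents with public health training
    [20, 55],  # respondents without public health training
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
```

Even a test this simple makes the relationship between two survey variables explicit in a way that a table of raw counts does not.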
There are several potential problems with having a single author on a systematic review. This practice is not recommended by the Institute of Medicine because of the potential for bias in selection, final screening, coding, and analysis [52]. For this reason, the current study should be categorized as a systematized review rather than a traditional systematic review, and such potential biases should be considered limitations of the current study [17].
The search strategy used the broad term “public health” and did not specify specialties in this field, such as epidemiology or health promotion, which might have excluded relevant subject-specific studies. Four databases, two that included public health literature and two that included library and information science literature, were used in the search, which might have limited the results from the public health field. Gray literature and non-English publications were excluded, which might have further limited the results. Older studies, as far back as 2005, were included, which might have influenced some of the themes, including technological access as a barrier to using evidence-based information [27]. Rigorous coding and thematic analysis methods, including line-by-line coding, were not used, and there were not multiple coders. This may have led to bias in the identification of themes.
As noted in “Critical Appraisal,” many of the studies had methodological issues. Therefore, definitive conclusions about themes or patterns cannot be made in regard to the topic of information needs and use of evidence among public health workers. Future reviews of the evidence should take the quality of studies into account when analyzing this body of work.
Librarians can help improve understanding and use of EBPH by raising awareness of evidence-based resources among public health workers. Partnerships between librarians and public health decision makers can lead to increased use of EBPH in research in all types of public health work environments. Future research on public health workers’ information needs should focus on the use of evidence in decision making and practice, and such research should attempt to address potential biases by using rigorous methodologies and transparency in reporting results.
1 Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201.
2 Brownson RC, Fielding JE, Maylahn CM. Evidence-based decision making to improve public health practice. Front Public Health Serv Syst Res. 2013;2(2):156–63.
3 Forsetlund L, Bjørndal A. The potential for research based information in public health: identifying unrecognised information needs. BMC Public Health. 2001;1:1–9.
4 Jacobs JA, Jones E, Gabella BA, Spring B, Brownson RC. Tools for implementing an evidence-based approach in public health practice. Prev Chronic Dis. 2012;9:1103–24.
5 Baker EA, Brownson RC, Dreisinger M, McIntosh LD, Karamehic-Muratovic A. Examining the role of training in evidence-based public health: a qualitative study. Health Promot Pract. 2009 Jul;10(3):342–8.
6 Bambra C. The primacy of politics: the rise and fall of evidence-based public health policy? J Public Health. 2013 Dec;35(4):486–7.
7 Gibbert WS, Keating SM, Jacobs JA, Dodson E, Baker E, Diem G, Giles W, Gillespie KN, Grabauskas V, Shatchkute A, Brownson RC. Training the workforce in evidence-based public health: an evaluation of impact among US and international practitioners. Prev Chronic Dis. 2013 Sep 5;10:E148.
8 O’Neall M, Brownson R. Teaching evidence-based public health to public health practitioners. Ann Epidemiol. 2005 Aug;15(7):540–4.
9 Brownson RC, Allen P, Duggan K, Stamatakis KA, Erwin PC. Fostering more-effective public health by identifying administrative evidence-based practices: a review of the literature. Am J Prev Med. 2012 Sep;43(3):309–19.
10 Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners. Public Health Rep. 2010 Sep–Oct;125(5):736–42.
11 Rutland JD, Smith AM. Information needs of the ‘frontline’ public health workforce. Public Health. 2010 Nov;124(11):659–63.
12 Turner AM, Stavri Z, Revere D, Altamore R. From the ground up: information needs of nurses in a rural public health department in Oregon. J Med Libr Assoc. 2008 Oct;96(4):335–42. DOI: http://dx.doi.org/10.3163/1536-5050.96.4.008.
13 Lê ML. Information needs of public health staff in a knowledge translation setting in Canada. J Can Health Libr Assoc. 2013;34(1):3–11.
14 Twose C, Swartz P, Bunker E, Roderer NK, Oliver KB. Public health practitioners’ information access and use patterns in the Maryland (USA) public health departments of Anne Arundel and Wicomico Counties. Health Inf Libr J. 2008 Mar;25(1):13–22.
15 Ford J, Korjonen H. Information needs of public health practitioners: a review of the literature. Health Inf Libr J. 2012 Dec;29(4):260–73.
16 Revere D, Turner AM, Madhavan A, Rambo N, Bugni PF, Kimball AM, Fuller SS. Understanding the information needs of public health practitioners: a literature review to inform design of an interactive digital knowledge management system. J Biomed Inform. 2007 Aug;40(4):410–21.
17 Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Inf Libr J. 2009 Jun;26(2):91–108.
18 Institute of Medicine. Who will keep the public healthy: educating public health professionals for the 21st century. Washington, DC: Institute of Medicine, National Academies Press; 2002.
19 Critical Appraisal Skills Programme (CASP). CASP qualitative checklist [Internet]. The Programme [rev. 2013; cited 10 Apr 2015]. <http://www.casp-uk.net/#!casp-tools-checklists/c18f8>.
20 Institute of Social and Preventive Medicine at the University of Bern. STROBE statement [Internet]. [rev. 2007; cited 15 Sep 2015]. <http://www.strobe-statement.org>.
21 Burls A. What is critical appraisal? London, UK: Hayward Medical Communications; 2009.
22 Young JM, Solomon MJ. How to critically appraise an article. Nat Clin Pract Gastroenterol Hepatol. 2009 Feb;6(2):82–91.
23 Zardo P, Collie A. Type, frequency and purpose of information used to inform public health policy and program decision-making. BMC Public Health. 2015 Apr 15;15:381.
24 Thomas J, Harden A. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Med Res Methodol. 2008; 8: 45–60.
25 Eldredge JD, Carr R, Broudy D, Voorhees RE. The effect of training on question formulation among public health practitioners: results from a randomized controlled trial. J Med Libr Assoc. 2008 Oct;96(4):299–309. DOI: http://dx.doi.org/10.3163/1536-5050.96.4.005.
26 Adily A, Ward JE. Enhancing evidence-based practice in population health: staff views, barriers and strategies for change. Aust Health Rev. 2005 Nov;29(4):469–77.
27 Lê ML. Information needs of public health students. Health Inf Libr J. 2014 Dec;31(4): 274–92.
28 Turner AM, Petrochilos D, Nelson DE, Allen E, Liddy ED. Access and use of the Internet for health information seeking: a survey of local public health professionals in the northwest. J Public Health Manag Pract. 2009 Jan–Feb;15(1):67–9.
29 Wallis LC. Information-seeking behavior of faculty in one school of public health. J Med Libr Assoc. 2006 Oct;94(4):442–6.
30 Alvarez Mdo C, Franca I, Cuenca AM, Bastos FI, Ueno HM, Barros CR, Guimaraes MC. Information literacy: perceptions of Brazilian HIV/AIDS researchers. Health Inf Libr J. 2014 Mar;31(1):64–74.
31 Campbell DM, Redman S, Jorm L, Cooke M, Zwi AB, Rychetnik L. Increasing the use of evidence in health policy: practice and views of policy makers and researchers. Aust New Zealand Health Policy. 2009 Aug;6(1):21.
32 Charbonneau DH, Marks EB, Healy AM, Croatt-Moore C. Collaboration addresses information and education needs of an urban public health workforce. J Med Libr Assoc. 2007 Jul;95(3):352–4. DOI: http://dx.doi.org/10.3163/1536-5050.95.3.352.
33 De Groote SL, Shultz M, Blecic DD. Information-seeking behavior and the use of online resources: a snapshot of current health sciences faculty. J Med Libr Assoc. 2014 Jul;102(3):169–76. DOI: http://dx.doi.org/10.3163/1536-5050.102.3.006.
34 Harris JK, Allen P, Jacob RR, Elliott L, Brownson RC. Information-seeking among chronic disease prevention staff in state health departments: use of academic journals. Prev Chronic Dis. 2014 Aug 14;11:E138.
35 Larsen M, Gulis G, Pedersen KM. Use of evidence in local public health work in Denmark. Int J Public Health. 2012 Jun;57(3):477–83.
36 Maylahn C, Born C, Hammer M, Waltz EC. Strengthening epidemiologic competencies among local health professionals in New York: teaching evidence-based public health. Public Health Rep. 2008;123:35–43.
37 Andualem M, Kebede G, Kumie A. Information needs and seeking behaviour among health professionals working at public hospital and health centres in Bahir Dar, Ethiopia. BMC Health Serv Res. 2013 Dec 27;13:534–69.
38 Brownson RC, Reis RS, Allen P, Duggan K, Fields R, Stamatakis KA, Erwin PC. Understanding administrative evidence-based practices: findings from a survey of local health department leaders. Am J Prev Med. 2014 Jan;46(1):49–57.
39 Cilenti D, Brownson RC, Umble K, Erwin PC, Summers R. Information-seeking behaviors and other factors contributing to successful implementation of evidence-based practices in local health departments. J Public Health Manag Pract. 2012 Nov;18(6):571–6.
40 Kapadia-Kundu N, Sullivan TM, Safi B, Trivedi G, Velu S. Understanding health information needs and gaps in the health care system in Uttar Pradesh, India. J Health Commun. 2012;17(sup):30–45.
41 Merrill J, Rockoff M, Bakken S, Caldwell M. Barriers to information access among county health department employees. AMIA Annu Symp Proc. 2007 Oct 11:1050.
42 Raj S, Sharma VL, Singh A, Goel S. The health information seeking behaviour and needs of community health workers in Chandigarh in northern India. Health Inf Libr J. 2015 Jun;32(2):143–9.
43 Higgins JW, Strange K, Scarr J, Pennock M, Barr V, Yew A, Terpstra J. “It’s a feel. That’s what a lot of our evidence would consist of”: public health practitioners’ perspectives on evidence. Eval Health Prof. 2011 Sep;34(3):278–96.
44 Jacobs JA, Clayton PF, Dove C, Funchess T, Jones E, Perveen G, Brownson RC. A survey tool for measuring evidence-based decision making capacity in public health agencies. BMC Health Serv Res. 2012 Mar 9;12:57–69.
45 Léon G, Ouimet M, Lavis JN, Grimshaw J, Gagnon MP. Assessing availability of scientific journals, databases, and health library services in Canadian health ministries: a cross-sectional study. Implement Sci. 2013 Mar;8(1):1–13.
46 Mortensen HJ, Alexander JL, Nehrenz GM, Porter C. Infection control professionals’ information-seeking preferences. Health Inf Libr J. 2013 Mar;30(1):23–34.
47 Sosnowy CD, Weiss LJ, Maylahn CM, Pirani SJ, Katagiri NJ. Factors affecting evidence-based decision making in local health departments. Am J Prev Med. 2013 Dec;45(6):763–8.
48 LaPelle NR, Luckmann R, Simpson EH, Martin ER. Identifying strategies to improve access to credible and relevant information for public health professionals: a qualitative study. BMC Public Health. 2006 Apr 5;6:89–9.
49 Yousefi-Nooraie R, Dobbins M, Brouwers M, Wakefield P. Information seeking for making evidence-informed decisions: a social network analysis on the staff of a public health department in Canada. BMC Health Serv Res. 2012 May 16;12.
50 Dodson EA, Baker EA, Brownson RC. Use of evidence-based interventions in state health departments: a qualitative assessment of barriers and solutions. J Public Health Manag Pract. 2010 Nov–Dec;16(6):E9–E15.
51 Peirson L, Ciliska D, Dobbins M, Mowat D. Building capacity for evidence informed decision making in public health: a case study of organizational change. BMC Public Health. 2012 Feb 20;12:137.
52 Institute of Medicine. Standards for systematic reviews [Internet]. The Institute [rev. 2011; cited 17 Aug 2015]. <http://iom.nationalacademies.org/Reports/2011/Finding-What-Works-in-Health-Care-Standards-for-Systematic-Reviews/Standards.aspx>.
53 Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009 Jul 21;6(7):e1000097.
Jill Barr-Walker, MPH, MS, jill.barr-walker@ucsf.edu, ZSFG Library, University of California, San Francisco, San Francisco, CA
Articles in this journal are licensed under a Creative Commons Attribution 4.0 International License.