Annotated bibliography for MW Evidence-Based Practices DG.
Ryan, P. 2006. “EBL and Library Assessment: Two Solitudes?” Evidence Based Library and Information Practice 1(4): 77-80.
This commentary explains how EBL and library assessment are in fact different. Library assessment is undertaken to show the library's contributions to its communities, is user-based, and relies on local evidence. EBL relies on the research literature to reveal evidence that answers a problem. "It ... attempts to integrate user-reported, practitioner-observed and research-derived evidence as an explicit basis for decision-making" (p. 78). EBL requires evaluation of the literature to "guide practitioner(s) to the best answer" (p. 78).
Lisa R. Horowitz, Assessment Librarian, Library Assessment and Business Intelligence, MIT Libraries
Greenwood, H. and M. Cleeve. 2008. “Embracing change: evidence-based management in action.” Library Management 29(3): 173-184.
Case study of how "evidence-based management" (EBM) was incorporated into a single public library in the UK. More along the lines of library assessment than EBL. Good examples of how a culture of EBM was embedded and how staff were taught to gather meaningful evidence. However, this article was not useful in explaining or applying EBL.
Booth, A. 2006. “Counting what counts: performance measurement and evidence-based practice.” Performance Measurement and Metrics 7(2): 63-74.
Booth clarifies the distinction between performance measurement (which I think of alternatively as library assessment in the U.S.) and EBLIP. Both have a goal of service and operational improvements, but EBLIP looks at findings from research that aid managers in decision-making, while performance measurement relies on "audit and benchmarking data," which I interpret to mean assessment data.
Evidence that can be used for decision-making includes performance measurements (user-reported data, librarian-observed data, and automated data) as well as EBLIP's research-derived evidence. Several frames of evaluation can be used to determine whether an evidence-based practice achieved success: Is the service improved? Has the practitioner improved? Is the organization improved?
Questions are the basis of the EBL process. SPICE focuses these questions: Setting (the context for the question), Perspective (who the users/potential users are), Intervention (what is being done), Comparison (what the alternatives are), and Evaluation (how the success of the intervention is measured) (pp. 69-70).
Booth summarizes how evidence-based healthcare provides guidance for EBLIP. Many of its principles are basic tenets of library assessment, such as starting with a question, triangulation, and the use of outcome measures. But the reliance on research-derived evidence continues to be the extra piece offered by EBLIP, while performance measurement holds that evidence can also be provided by users, observations, benchmarking, etc.
Booth, A. (2010). Upon reflection: Five mirrors of evidence-based practice. Health Information & Libraries Journal, 27(3), 253-256.
This is a short commentary about a proposed change model: pre-contemplation, contemplation, preparation, action, and maintenance. After all, the goal of evidence-based practice is to bring about meaningful and lasting change in practice. The article quotes Todd's (2003) definition of evidence-based practice:
"Evidence based practice is about best practice and reflective practice, where the process of planning, action, feedback and reflection contributes to the cyclic process of purposeful decision making and action, and renewal and development."
Dr. Jason Martin
Booth, A. (2009). A bridge too far? Stepping stones for evidence based practice in an academic context. New Review of Academic Librarianship, 15(1), 3-34.
This article is an overview of the first decade of the evidence-based library and information practice (EBLIP) model. Booth notes how EBLIP has grown in use, but cautions against the use of the term "evidence-based" as a catch-all for any use of data in decision making. Booth thinks EBLIP can do great things in librarianship and, in the five years since his last review, has become more sophisticated and incorporates more types of quantitative and qualitative evidence. In order to move forward, librarians engaged in EBLIP need to become better at widely disseminating information, find more time for EBLIP, and engage better with theory. The keys to wider acceptance of EBLIP are leadership, training, and acceptance of EBLIP in a library's culture.
The article describes the five steps of the EBLIP process.
1.) Formulate a question
2.) Gather evidence from the literature
3.) Assess the evidence
4.) Assess the costs and benefits of the action plan
5.) Assess the action plan
This is also expressed as the "Five A Model": Ask, Acquire, Appraise, Apply, and Assess.
Lakos, A. (2007). Evidence-Based library management: The leadership challenge. Portal: Libraries and the Academy, 7(4), 431-450.
Lakos states that libraries see the need for using data and evidence in decision making but do not engage in the practice "systematically or effectively." Assessment, change, and evidence-based practice need to become part of the library's day-to-day activities. Libraries need to redefine themselves in the "information economy," adjust to changes in academic publishing, and learn new skills. These changes require assessment, evidence-based practice, and change. Lakos thinks leadership is the key to the profession adopting evidence-based practice. He interviews library leaders for their positions on evidence-based practice.
Lakos uses the term "evidence-based management." He uses Sutton's (2007) definition of EBM: "It just means finding the best evidence that you can, facing those facts, and acting on those facts--rather than doing what everyone else does, what you have always done, or what you thought was true."
Abbott, W. A. (2006). Persuasive evidence: improving customer service through evidence based librarianship. Evidence Based Library and Information Practice, 1(1), 58-68.
This article describes how Bond University (Australia) applied evidence-based practice to improve customer service. Three case studies are presented, each covering the methods used to collect data, how the data were analyzed and used to make a decision, what action was taken as a result, and the lessons learned.
"...librarians ...need to develop the skills and a culture to effectively carry out evidence-based practice. These include the skills to articulate questions, undertake research, appraise research findings and implement a course of action. Above all it requires librarians to develop a culture of questioning and reflecting on what we do."
Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399.
This paper outlines a critical appraisal tool and process that can be applied to library and information research. The tool provides a thorough, generic list of questions to ask when determining the validity, applicability, and appropriateness of a study.