LLAMA MAES (Measurement, Assessment, and Evaluation Section)
This Assessment Toolbox is a place to post links to helpful information on assessment, examples of surveys, and other assessment tools. Please add your own favorite links to assessment information.
General Assessment Links:
- Instruction: Sample Rubrics from the Association for the Assessment of Learning in Higher Education
- Sample size calculator
- National Center for Education Statistics Academic Libraries Peer Comparison Tool
- Microsoft Excel Statistical Options
- Survey Random Sample Calculator
- Social Science Research Methods
- Society for Technical Communication Web Usability Toolkit
- Library Assessment Conference 2008 Presentations and Posters
- Evidence-based Library and Information Practice, 4th International Conference 2007 Papers
- Open Access Journal: Evidence Based Library and Information Practice
- Assessment Tutorial from OSU (registration required but free)
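The sample size calculators linked above implement a standard survey formula. As a minimal sketch of the same calculation (a hedged illustration, not any particular tool's implementation — the population figure and margin of error below are hypothetical), Cochran's formula with a finite-population correction looks like this in Python:

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Estimate the survey sample size needed for a finite population.

    Cochran's formula: n0 = z^2 * p * (1 - p) / margin^2,
    then the finite-population correction: n = n0 / (1 + (n0 - 1) / N).
    z=1.96 corresponds to a 95% confidence level; p=0.5 is the most
    conservative (largest-sample) assumption about response variance.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Hypothetical example: a campus of 20,000 students,
# 5% margin of error at 95% confidence.
print(sample_size(20000))  # → 377
```

Note how weakly the result depends on population size once the population is large; this is why most online calculators ask primarily for the confidence level and margin of error.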
Examples of Assessment Tools:
- 3-2-1 Assessment (from East Carolina Univ.)
- Research Consultation Evaluation Form (from East Carolina Univ.)
- Faculty Survey (from East Carolina Univ.)
Date/Time: Sunday, January 26, 2014, 4:30 to 5:30
Location: Philadelphia Convention Center Room 102 A
- Lisa Hinchliffe, Professor/Coordinator for Information Literacy Services and Instruction, University of Illinois at Urbana-Champaign
- Donna Tolson, Library Strategist, University of Virginia Library
Lisa Hinchliffe will provide an overview of a culture of assessment and the organizational practices associated with such a culture. Following these opening remarks, discussion group attendees will participate in table discussions of barriers to developing a culture of assessment, reporting out their findings to the group. Then, Donna Tolson will discuss how the University of Virginia Library developed a culture of assessment, the issues it currently faces, and new priorities such as how to assess strategic directions. Her presentation will lead into table discussions about prioritizing assessment efforts. The discussion will conclude with a general Q & A.
This might be of interest to other MAES members:
Southeastern Library Assessment Conference
The Southeastern Library Assessment Conference provides an opportunity for those interested in advancing the library assessment and user experience conversation to gather together to share and discuss practical ideas and information.
Call for Proposals
The Southeastern Library Assessment Conference invites proposals for the October 21-22, 2013, conference to be held in Atlanta, Georgia.
Program proposals should be designed to fit within a 45-minute timeframe, which includes time for questions. We encourage thoughtful, timely proposals on any topic related to assessment in libraries of all types, including, but not limited to:
- Creating assessment plans
- Data-driven decision making
- Demonstrating value
- Developing a culture of assessment
- Ethnographic studies
- Getting started with assessment
- Impact on student learning, retention, progression, and/or graduation
- Learning outcomes
- Library instruction
- Reporting results to stakeholders
- Spaces and facilities
- Special collections and archives
- User experience
Proposals should include:
- Program title
- Name, institution, position title, and email address of each presenter
- Abstract of 200 words or fewer. The abstract should state clearly the relevance of the topic to library assessment and practical implications for libraries.
- Brief 2-3 sentence abstract suitable for the conference website and program
- At least three learning outcomes to be addressed during the program
Please submit proposals online by April 17, 2013. Notification of acceptance will be sent by May 13, 2013.
Please direct any questions you have to the Conference Coordinating Committee.
Susan Bailey, Emory University
John Bodnar, Emory University
Ameet Doshi, Georgia Institute of Technology
Jennifer Jones, Georgia State University
Erin Nagel, Clayton State University
Sonya Shepherd, Georgia Southern University
Thank you - W. Bede Mitchell, Dean
Georgia Southern University
It was great to reconnect with so many MAES members at the Library Assessment Conference, which wrapped up today in Charlottesville, VA. MAES was extremely well-represented in the presentation schedule as well as in the poster session. Presenters included:
Consuela Askew (Florida International) "Using a Mixed Method Approach to Assessing Roaming Services: A Case Study"
Kathy Crowe (UNC-Greensboro) "Shop Your Way to Service Excellence: Secret Shopping for Academic Libraries"
MAES Chair Rachel Besara and Kirsten Kinsley (both from Florida State) "Increasing the Impact & Value of a Graduate Level Research Methods Course by Embedding Assessment Librarians and Library Assessment"
Allyson Washburn (BYU) "Student Information Seeking Behaviors: A Case Study in Collaboration"
Ken Wise (Tennessee) "Methods for Measuring Return on Investment for Digitized Special Collections"
Lisa Horowitz (MIT) "The Assessment Needs of a Data-Driven Organization"
Jeanne Brown (UNLV) "Quest for Continuous Improvement: Applying Feedback and Data Gathered through Multiple Methods to Evaluate and Improve Use of a Library's Discovery Tool"
Scott Britton (Univ. of Miami) "Mining Library and University Data to Understand User Populations and Behavior" (this was a standing-room only presentation!)
Meg Scharf (Central Florida) "Closing the Loop: Are Libraries Communicating Assessment Results to Students?"
Bob Fox (Louisville) was on the closing panel, presenting a wrap-up perspective from research libraries
Poster presenters included: Jeanne Brown, Karen Neurohr, Cheryl Albrecht, Rachel Besara, and Kirsten Kinsley.
I'm sure that I missed some MAES members but, wow, this was an impressive showing!
Slides and other materials will be available on the conference website at: http://www.libraryassessment.org/ Additionally, as in past years, ARL will produce complete proceedings of the conference.
Jennifer Paustenbaugh, Immediate Past Chair, LLAMA-MAES
Keynote Cory Lown will provide context, strategies, and resources for creating data visualizations for effective communication. Cory is Digital Technologies Development Librarian at North Carolina State University where he designs and develops applications to improve end-user resource discovery and use of library services.
Following Cory's talk five librarians will provide brief presentations on how they applied data visualization in their libraries:
Rachel Besara, Florida State University: Using Roambi, a mobile business intelligence application
Klara Maidenberg, Ontario Council of University Libraries: Visualizing the Council's Ask a Librarian data
Este Pope, Coconino Community College: Using Prezi to present data to the District Governing Board
Jamie Hollier, Colorado State Library: Using infographics to evaluate new computer centers
Robert Dugan, University of West Florida: Using Counting Opinions to assess user experiences
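In the spirit of the lightning talks above, even a very simple visualization can make assessment counts easier to read than a table of numbers. As a minimal sketch (the service names and counts below are hypothetical, and real presentations would use a charting tool such as those the speakers describe), a scaled text bar chart in Python:

```python
def text_bar_chart(data, width=40):
    """Render a {label: count} dict as horizontal bars scaled to `width`."""
    peak = max(data.values())
    lines = []
    for label, count in data.items():
        # Scale each bar relative to the largest value.
        bar = "#" * round(count / peak * width)
        lines.append(f"{label:<12} {bar} {count}")
    return "\n".join(lines)

# Hypothetical monthly transaction counts at three service points.
questions = {"Reference": 312, "Circulation": 958, "ILL": 127}
print(text_bar_chart(questions))
```

The design point is the same one the presenters make: scaling values against a common baseline lets an audience grasp proportions at a glance, whatever the rendering medium.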
Program conducted by LLAMA MAES at ALA Annual 2010.
More than 250 people packed Washington Convention Center 145A on Monday morning to listen to presenters from ten libraries describe informal assessment techniques they used to quickly evaluate and improve services throughout their libraries. Here are summaries of the ten presentations. The PowerPoint from the session and most of the speakers’ notes are available in the LLAMA MAES group area on ALA Connect http://connect.ala.org/node/107288 and on Slideshare.
See attached presentation and handouts.
Karen Neurohr and Jennifer Paustenbaugh, Oklahoma State University, presented a combined survey and focus-group-lunch technique aimed at student scholars. The scholars were identified by the university. Since the focus groups were informal, they were termed "listening sessions." Together, the survey and listening sessions uncovered the most important topics for the library to address for this population. The activity will be repeated next year.
Louise Lowe and Judith Brook, Mercer University presented on their use of product demonstrations involving students, and the impact of the student feedback on purchasing. One trial was for a coffee vendor, with taste tests being quite popular with the students (a log was kept next to the coffee for students to weigh in on their opinions). The pattern with products was that students preferred value over bells and whistles. Decisions were based on the feedback, and a To Do list was put on a poster in the library, with items checked off when completed.
Kornelia Tancheva, Cornell University, discussed their use of unobtrusive user observations. They performed both day and night observations, simply walking through and noting student activities. Since security walks through at night as well, they felt the students did not perceive they were being studied. She cautioned against over-reliance on observations. Their follow-up will be to find out why students were doing what the staff observed them doing.
Sharon Naylor and Bruce Stoffel, Illinois State University, reported on their investigation of chat reference through focus groups. They identified a campus faculty member who became their advisor. He suggested that they continue holding focus groups until they heard nothing new, which is what they did. They found that their students saw Web 2.0 modalities as social rather than academic tools, and that students preferred a personal approach.
Jeff Gatten, California Institute of the Arts, spoke on the use and value of poster surveys for a distinct population, one that is right-brained and less linear. The poster approach was interactive: students grouped around the poster and answered three questions together. He also felt that the library got feedback it would not have received in a traditional survey, since students could report their immediate frustrations, which might otherwise fade. In fact, they tried an online survey and got no participants.
Kirsten Kinsley and Rachel Besara, Florida State University, discussed what they learned from interviewing students in their natural environments on campus. As part of a project to renovate student study spaces in the library, librarians and staff asked students about their study habits. Most of the interviews were done by the library’s undergraduate student services staff, many of whom were only slightly older than the students. Interviews were recorded using a digital voice recorder, and were later transcribed. The results challenged many of their assumptions about students, for example: what “quiet” means to students, the nuances of group study, and when students like to study and for how long. The results identified a need for more night-time services, more software for curriculum support, and more technical support.
Wanda Dole and J.B. Hill, University of Arkansas at Little Rock (UALR), presented the results of a combined method assessment of use of UALR by community users. Quantitative measures included an examination of data from the library’s integrated library system, print management system, and donor list. Qualitative measures included data collected from a survey of community user needs and expectations. They found that community users broke out into two groups: borrowers and computer users. Borrowers were more likely to be UALR graduates, use the library for school work, and encourage others to use the library. Computer users were more likely to use the library for Internet access and to live nearby. While there have been few if any financial gifts from community users, the cost of providing access is also low. Despite having no borrower’s fees and no overdue fines, the library has lost only $4,000 in non-returned materials over a two-year period. Dole and Hill concluded that offering library services to unaffiliated users has been an important contribution to the local community, even though “good will” benefits are hard to quantify.
Ameet Doshi, Georgia Tech, described the “Flip the Library” assessment project. Georgia Tech Library’s 20-person student library advisory group was tasked with looking at four areas of the library (entrance, signage, study areas, and website) and suggesting improvements. The four groups took 15-30 minutes to record each area with a flip camera, and then met as a group to debrief. “Flip the Library” allowed library staff to view the library from the student perspective to help assess completed renovations, inform new signage and way-finding efforts, and identify a graffiti problem.
Lisa Horowitz, MIT, described a five-question Zoomerang survey used to determine whether the benefits of the Humanities Library's bookmobile service (used to promote recreational reading, DVDs, and music CDs before long weekends and breaks) outweighed its costs. Staff members involved with the bookmobile were surveyed to better understand the impact of staffing the bookmobile on their regular workload. She concluded that the increased visibility for the library was worth the staff time invested and that the bookmobile was valuable to those who used it, as well as to staff. The informal assessment of 51 users provided the information needed to move forward with a decision to continue the service, but with fewer outings.
Kathy Ray, American University of Sharjah, observed undergraduate students in the library during the busiest hours to determine why students chose to use particular spaces in the library. Fifteen observations were conducted over a period of five weeks. She marked on a photocopy of the floor map where people were sitting and what they were doing. The study helped pinpoint one particularly noisy area where people liked to socialize in an area that was intended for quiet study. Furnishings in the area were reconfigured and a browsing area expanded into the space, resulting in a decrease in the noise level.
The LLAMA Measurement, Assessment, and Evaluation Section (MAES) Discussion Group will meet Sunday, January 22, 2012 from 4:00-5:30 pm at the Dallas Convention Center, Room A308. Join us in conversation as we explore best practices in creating institutional infrastructure for effective and sustainable measurement, assessment and evaluation of library services. Sarah Murphy, Coordinator of Research and Reference Services at The Ohio State University Libraries, and editor of The Quality Infrastructure: A Programmatic Approach to Measuring, Improving, and Analyzing Library Services (forthcoming from ALA Editions), will get the discussion started with a brief overview of the topic.
We hope to see you there!