ERUS Discussion Forum, ALA Annual 2010
June 27, 2010, 8:00-10:00 am, Capital Hilton, Washington, D.C.
Present: Colleen Seale (U of Florida, ERUS member); Kornelia Tancheva (Cornell, ERUS Committee chair); Jan Kemp (U of Texas, San Antonio); Jennie Gerke (U of Colorado); Stephanie Alexander (U of Colorado); Richenda Brim (Getty)
The discussion centered on various statistical packages used for tracking reference desk traffic.
At the University of Florida, staff use the Analytics module of LibAnswers, a Springshare product (the same company that developed LibGuides). It offers a drop-down menu of frequently asked questions: all staff entered questions initially, and these were then consolidated into 12-13 of the most common ones. A notes field captures question content and answers, so a knowledgebase can be built over time, and staff can record patron status, date, time, and location. The business librarian uses a separate instance of LibAnswers. Springshare is very receptive to customer feedback. For instruction statistics, Florida uses a home-grown system.
At U of Texas, San Antonio, visits are counted manually and entered into spreadsheets. They are interested in a knowledgebase, especially since the information desk and the IT help desk will be integrated, and in tracking staff participation in answering questions. Once a year they analyze the nature and content of the questions, patron status, etc.
At the U of Colorado, they looked into Libstats, but the programming language was not supported by their IT department; they found RefTracker too expensive, so they built a home-grown system in Access. It includes a check box for FAQs and can associate course numbers with a course. Except for one branch, the system is generally adopted. They plan to add a referral form, and they already have a "swamped" form that lets staff enter, e.g., "12 printer questions" at once. Reports and graphs can be run in Access or in Excel. Every month they distribute "highlights" to staff rather than all the data. They track who answers each question in order to motivate staff, and they use the data for staffing decisions. One drawback is the clean-up involved in getting the data into Excel.
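The kind of tally-and-summarize workflow Colorado describes can be sketched in a few lines. This is a hypothetical illustration, not Colorado's actual schema: the field names, question types, and `monthly_highlights` function are all assumptions made for the example.

```python
from collections import Counter
from datetime import date

# Hypothetical transaction log. A "swamped" entry such as
# "12 printer questions" is recorded as a single row with count=12.
log = [
    (date(2010, 6, 1), "printer", "undergrad", 12),
    (date(2010, 6, 1), "reference", "grad", 1),
    (date(2010, 6, 2), "directional", "faculty", 1),
    (date(2010, 6, 2), "reference", "undergrad", 3),
]

def monthly_highlights(entries):
    """Tally question counts by type, the way a monthly 'highlights'
    report might condense the raw data for staff."""
    totals = Counter()
    for _day, qtype, _status, count in entries:
        totals[qtype] += count
    return dict(totals)

print(monthly_highlights(log))
# {'printer': 12, 'reference': 4, 'directional': 1}
```

Distributing a summary like this, rather than the full dataset, matches the minutes' point that staff receive monthly "highlights" instead of all the data.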
Getty: they use hash marks. They considered Libstats but had no IT support for it. They do use LibGuides, so they will be looking into LibAnalytics.
Cornell: until very recently, they used two different systems to track reference and instruction, both created in house with the help of a student group in a computer science class. Two drawbacks: the systems were separate, and they did not allow for tracking outreach activities. Cornell very recently developed another in-house system, CountIt, which integrates all user services: reference, instruction, and outreach. Another advantage is a Quick button that allows a single click during busy times instead of entering all the information. The previous system was used to track traffic patterns, adjust staff coverage, and record the content of questions for content analysis and corresponding staff training; all of these capabilities carry over to the new system.
Discussion then centered on tracking virtual reference and reference volumes. Colorado has noticed an increase in traffic and this year started double-staffing between 10 am and 4 pm, or all the time during finals. Another discussion topic included training ci