
Discussing Discovery Services: What's Working, What's Not and What's Next?

When: 
Sunday, June 29, 2014
3:00 pm to 4:00 pm, US/Pacific

Come hear about and discuss the newest developments in discovery services in libraries with your colleagues. Discussion forum sponsored by the *new* RUSA RSS Discovery Services Committee (formerly the RSS Catalog Use Committee).

More information about this conference session

Mireille Djenno, Gwen Gregory, and Ling Wang, University of Illinois at Chicago Library, Chicago

What We Discovered About Discovery: Comparing Two Discovery Systems at One Academic Library

In the spring of 2013, librarians at the University of Illinois at Chicago (UIC) undertook a usability study of Summon and WorldCat Local. The goal of this study was twofold: to learn about the search behaviors of different patron groups and to test the ease of use of both discovery systems, with an eye toward determining which tool to retain for the library over the longer term. The results of the testing were a significant, if not overriding, factor in the decision about which tool to retain, and they yielded important insights that have allowed us to customize the tool we kept and to adapt our information literacy instruction.

We also learned a great deal about how to design and conduct usability testing of discovery tools. Our analysis of the usability tests revealed some interesting (and unexpected) results that conflicted with our initial observations and participants’ stated preferences for discovery tool features. We tested faculty, graduate students, and undergraduates and found a great deal of variation both in the usability of the tools relative to each user group and in the groups’ performance on usability tasks relative to each other.

Our discovery layer implementation is ongoing and we have begun collecting user feedback about our chosen tool by placing a feedback button within the interface. An Implementation Task Force is tasked with reviewing the data and making necessary improvements to the interface and the back end.

In our brief presentation, we will share which tool we chose and why!

Van Houlson, University of Minnesota

Alma in the Morning: The Impact of a Next-Generation Discovery System on Patrons and Staff at a Research Library

The University of Minnesota-Twin Cities recently implemented the Ex Libris Alma-based discovery service after two years of planning, and it may be a good case for the discussion forum. I would present on the positive features as well as the technical challenges for staff, and I would report on the reactions of faculty and students. There have been extraordinary examples of the system working well, as well as disappointments. This discovery system illustrates how the scope of networked resources affects search results and the expectations of users. It also illustrates the constraints discovery systems impose on other finding tools and the need for communication and outreach. Alma is an example of a system that consolidated many enterprise functions while providing innovative discovery features. Can the new replace the old? What features of the "old" did our patrons and staff still need? I will describe some of the changes made to our MNCAT Discovery based on feedback from users.

Emily Keller, University of Washington Libraries

Dumbing down or drilling down? Librarians’ perspectives on discovery tools

As users become increasingly accustomed to “simple” search tools such as Google, their expectations in other information environments, such as libraries, are changing. In response, many libraries are moving away from traditional catalogs towards more user-oriented “discovery” systems that bring together a range of information streams including books, articles, media, and other materials.

For librarians, discovery systems bring conceptual shifts in the research landscape. While many users appreciate the convenience afforded by search tools that offer many types of sources in a single search, some librarians are concerned that these tools “dumb down” search and create a false impression of comprehensiveness and precision. Other librarians are excited to be freed from teaching library-centered tools so they can focus instead on critical thinking, evaluation of sources, and other higher-order competencies.

In this lightning talk I will present findings from my exploratory research on librarian attitudes towards discovery systems. What are the points of resistance? What do librarians think is gained or lost in the shift towards these tools? What opportunities and challenges do librarians see in this movement towards discovery layers? What are the implications of librarian beliefs and attitudes for successful implementations, reference and instruction services, and organizational culture?

Rosalind Tedford, Wake Forest University

Future of Discovery: Analytics, Meet Instruction

I think that if the true promise of discovery services is to be realized, we have to get away from the idea that they will be ‘as easy to use as Google’ and from the assumption that simply having a discovery service is enough for our students to be successful in their research. Google may be easy to use, but it is not easy to make useful as a research tool. Google decides what is important to the user based on what else that user has searched for or clicked on in the past. This ‘filter bubble’ is invisible to the end user and can significantly undermine Google's value as a research tool. What discovery services have that Google does not offer is the ability to ‘tweak’ the filter bubble parameters by using facets and filters. But that significant difference between the two platforms is lost on most users unless it is taught. What librarians need to get better at is using real data about how patrons actually use the discovery layer to refine the messages we send about how to use it efficiently and effectively. Data on what is searched for, how limits and facets are used, and other metrics can help us understand search patterns and thus how best to teach the use of these invaluable products.
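
To make that concrete, here is a minimal, hypothetical sketch of the kind of analysis such data could support: it tallies query frequency and facet use from a CSV export of discovery-layer search logs. The file name (search_log.csv) and column names (query, facets_applied) are assumptions for illustration only, not any vendor's actual export format.

    # Hypothetical sketch: summarize discovery-layer search logs to inform instruction.
    # Assumes a CSV export named "search_log.csv" with a "query" column and a
    # "facets_applied" column holding a semicolon-separated list of facet names.
    # These names are illustrative assumptions, not a real vendor export format.
    import csv
    from collections import Counter

    query_counts = Counter()
    facet_counts = Counter()
    total_searches = 0
    faceted_searches = 0

    with open("search_log.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total_searches += 1
            query_counts[row["query"].strip().lower()] += 1
            facets = [x.strip() for x in row.get("facets_applied", "").split(";") if x.strip()]
            if facets:
                faceted_searches += 1
                facet_counts.update(facets)

    print(f"Total searches: {total_searches}")
    if total_searches:
        print(f"Searches applying at least one facet: {faceted_searches} "
              f"({faceted_searches / total_searches:.0%})")
    print("Most common queries:", query_counts.most_common(5))
    print("Most used facets:", facet_counts.most_common(5))

Even a simple tally like this can show, for example, what fraction of searches use any facet at all, which is a direct signal of whether the message about adjusting facets and filters is actually reaching users.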