Core Artificial Intelligence and Machine Learning in Libraries Interest Group


  • 1.  Has anyone promoted the use of AI to staff to improve communications (e.g., spelling, grammar, and overall writing quality)?

    Posted Jul 22, 2025 12:20 PM

    I am not sure if this is the appropriate group for this question, so I apologize if this would be better posted somewhere else. 

    We're launching a new initiative to promote a culture of data-informed decision making, and part of that includes encouraging the use of AI tools (such as Microsoft Copilot) for data analytics. However, these tools can also be used to streamline communications, particularly for improving/checking spelling, grammar, and so forth. 

    Have you encouraged your team to use AI in this way? How are AI tools being integrated into your communication workflows, and what has the impact been so far?

    I understand this can be a sensitive topic, so I'd love to hear how your library is approaching it. Any recommendations, cautions, or lessons learned would be greatly appreciated. 

    Thank you, 
    Bill

     



    ------------------------------
    Bill McIntire
    Public Services Director
    Dayton Metro Library
    He/Him/His
    ------------------------------


  • 2.  RE: Has anyone promoted the use of AI to staff to improve communications (e.g., spelling, grammar, and overall writing quality)?

    Posted Jul 22, 2025 11:38 PM

    Hi Bill, this is as good a place as any to ask so thanks for bringing it up!

    At my org we haven't used AI tools for communication tasks, partly because we want to be conscientious about how and when we use them, given how many ways the dice can roll in not-great directions. What's given us pause are the climate costs, the cognitive/ability costs, and the potential psychological costs, though there are plenty of other angles to look at it from.

    Climate-wise, studies keep coming out about just how much energy (and therefore water) LLMs use. The numbers hop around as the technology cycles between more energy-efficient and then more powerful (at the cost of that same efficiency), so it's just easier to ask myself: "if I can write this email without [insert various negative environmental impacts from water consumption, e-waste from rapid tech churn, x amount of carbon into the atmosphere, etc...], maybe I'll just do that instead." Does it take more work on my part? Yeah, and that's okay. Life can be boring or hard sometimes.

    Cognitive/ability-wise, it's a similar argument. This article talks about how LLMs have already begun changing how we write and talk to each other, and a report from researchers at the Max Planck Institute suggests that the language we use is getting... well... flat? Boring? Meh? Not to say that every email has to be a work of human creative art, but they also don't have to be perfectly massaged. And the more we offload our writing tasks to machines, the less practice we get and the worse we get at it, which has the makings of a vicious cycle, I think. See also "Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task" by Nataliya Kos'myna out of MIT's Media Lab. It's not great.

    Finally, the psychological and social costs make me genuinely nervous. The MIT Media Lab has a whole list of publications about life with AI, and some of the more recent ones make compelling cases for limiting exposure to and usage of AI. Aside from the one above, there's also "How AI and Human Behaviors Shape Psychosocial Effects of Chatbot Use: A Longitudinal Controlled Study," which is the one that clinches it for me. To quote (emphasis mine):

    Results showed that while voice-based chatbots initially appeared beneficial in mitigating loneliness and dependence compared with text-based chatbots, these advantages diminished at high usage levels, especially with a neutral-voice chatbot. Conversation type also shaped outcomes: personal topics slightly increased loneliness but tended to lower emotional dependence compared with open-ended conversations, whereas non-personal topics were associated with greater dependence among heavy users. Overall, higher daily usage–across all modalities and conversation types–correlated with higher loneliness, dependence, and problematic use, and lower socialization.

    There are others on that list too, and they're all worth giving a look. Not to say that there aren't some good use cases -- I know just enough Python to be dangerous, and tools like Claude and ChatGPT have let me do things in minutes that would've otherwise taken days (or not been possible at all). But again, it's still a game of weighing what I can get from it against what it takes from me/us. 



    ------------------------------
    Peter Musser
    Chair, ALA Core IG for Artificial Intelligence and Machine Learning in Libraries

    Head, Library Services
    ISKME
    He/Him/His
    ------------------------------



  • 3.  RE: Has anyone promoted the use of AI to staff to improve communications (e.g., spelling, grammar, and overall writing quality)?

    Posted Jul 28, 2025 11:32 AM

    Peter, 

    This is incredibly insightful. I appreciate the time and attention you put into the response, especially all the links. I am the chair of our quantitative assessment committee, and I plan to bring up some of the issues you raised here. Much appreciated. 



    ------------------------------
    Bill McIntire
    Public Services Director
    Dayton Metro Library
    He/Him/His
    ------------------------------



  • 4.  RE: Has anyone promoted the use of AI to staff to improve communications (e.g., spelling, grammar, and overall writing quality)?

    Posted Jul 24, 2025 09:16 AM

    Hi Bill,

    We have Google Workspace and through that subscription now have access to many paid Gemini features. Though interactions with Gemini are not used to train the AI models or for ad targeting (allegedly), we of course still emphasize that staff should avoid inputting any personally identifiable information into the tools and be mindful of patron privacy.

    That said, we're not tracking specific instances of staff usage. I know some have used it for drafting emails, social media hashtags, and creating alt text for website images. The latter has been very helpful as we work toward WCAG 2.1 compliance. Does it create the perfect alt text for each image? No, but it is an improvement and is typically more detailed than what staff might otherwise input.
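
    (For anyone curious what that could look like outside the Workspace interface, below is a rough sketch using Google's google-generativeai Python SDK. To be clear, this isn't something our staff run -- they just use Gemini in the browser -- and the model name, file name, and prompt are placeholders.)

        # Rough sketch: generate draft alt text for an image with the Gemini API.
        # Requires: pip install google-generativeai pillow
        import google.generativeai as genai
        from PIL import Image

        genai.configure(api_key="YOUR_API_KEY")  # key from Google AI Studio

        model = genai.GenerativeModel("gemini-1.5-flash")  # example model name
        image = Image.open("event-photo.jpg")  # hypothetical local image file

        prompt = (
            "Write concise, descriptive alt text (one sentence, under 125 characters) "
            "for this image, suitable for a public library website."
        )

        # The SDK accepts a mixed list of text and PIL images as the prompt.
        response = model.generate_content([prompt, image])
        print(response.text)  # a human should review and edit before publishing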

    Personally, I've used it to create department reports, reword program descriptions for brevity, and draft website announcements. I have found it dramatically reduces the time it takes me to complete some of these writing tasks that don't necessitate a more human touch or message. Sometimes you just need a quick way to relay some information succinctly, and for tasks like that it does a great job.

    Peter did an excellent job summarizing the costs of using these tools and those costs are certainly worth consideration when formulating any initiatives that include AI use.

    Best,
    Michael



    ------------------------------
    Michael Bartolomeo
    Librarian, Emerging Technologies
    South Huntington Public Library
    Huntington Station, NY
    ------------------------------