This is incredibly insightful. I appreciate the time and attention you put into the response, especially all the links. I am the chair of our quantitative assessment committee and I plan to bring up some of the issues you raised here. Much appreciated.
Original Message:
Sent: Jul 22, 2025 11:38 PM
From: Peter Musser
Subject: Has anyone promoted the use of AI to staff to improve communications (e.g., spelling, grammar, and overall writing quality)?
Hi Bill, this is as good a place as any to ask so thanks for bringing it up!
At my org we haven't used AI tools for communications, partly because we want to be conscientious about how and when we use them, given how many ways the dice can roll in not-great ways. What's given us pause are the climate costs, the cognitive/ability costs, and the potential psychological costs, though there are plenty of other angles to look at it from.
Climate-wise, studies keep coming out about just how much energy (and therefore water) LLMs use. The numbers hop around as the technology cycles between more energy-efficient and then more powerful (at the cost of that same efficiency), but for me it's easier to ask: "If I can write this email without [insert various negative environmental impacts: water consumption, e-waste from rapid hardware churn, x amount of carbon into the atmosphere, etc.], maybe I'll just do that instead." Does it take more work on my part? Yeah, and that's okay. Life can be boring or hard sometimes.
Cognitive/ability-wise, it's a similar argument. This article talks about how LLMs have already begun changing how we write and talk to each other, and a report from researchers at the Max Planck Institute suggests that the language we use is getting... well... flat? boring? meh? Not every email has to be a work of human creative art, but they also don't have to be perfectly massaged. And the more we offload our writing tasks to machines, the less practice we get, and the worse we get at it; that has the makings of a vicious cycle, I think. See also: "Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task" by Nataliya Kos'myna out of MIT's Media Lab. It's not great.
Finally, the psychological and social costs make me genuinely nervous. The MIT Media Lab has a whole list of publications about life with AI, and some of the more recent ones make compelling cases for limiting exposure to and usage of AI. Aside from the one above, there's also "How AI and Human Behaviors Shape Psychosocial Effects of Chatbot Use: A Longitudinal Controlled Study," which is the one that clinches it for me. To quote (emphasis mine):
Results showed that while voice-based chatbots initially appeared beneficial in mitigating loneliness and dependence compared with text-based chatbots, these advantages diminished at high usage levels, especially with a neutral-voice chatbot. Conversation type also shaped outcomes: personal topics slightly increased loneliness but tended to lower emotional dependence compared with open-ended conversations, whereas non-personal topics were associated with greater dependence among heavy users. Overall, higher daily usage–across all modalities and conversation types–correlated with higher loneliness, dependence, and problematic use, and lower socialization.
But there's also:
And they're all worth a look. That's not to say there aren't good use cases -- I know just enough Python to be dangerous, and tools like Claude and ChatGPT have let me do things in minutes that would've otherwise taken days (or not been possible at all). But again, it's still a game of weighing what I can get from it against what it takes from me/us.
------------------------------
Peter Musser
Chair, ALA Core IG for Artificial Intelligence and Machine Learning in Libraries
Head, Library Services
ISKME
He/Him/His
Original Message:
Sent: Jul 22, 2025 12:20 PM
From: William McIntire
Subject: Has anyone promoted the use of AI to staff to improve communications (e.g., spelling, grammar, and overall writing quality)?
I am not sure if this is the appropriate group for this question, so I apologize if this would be better posted somewhere else.
----
We're launching a new initiative to promote a culture of data-informed decision making, and part of that includes encouraging the use of AI tools (such as Microsoft Copilot) for data analytics. However, these tools can also be used to streamline communications, particularly for checking and improving spelling, grammar, and so forth.
Have you encouraged your team to use AI in this way? How are AI tools being integrated into your communication workflows, and what has the impact been so far?
I understand this can be a sensitive topic, so I'd love to hear how your library is approaching it. Any recommendations, cautions, or lessons learned would be greatly appreciated.
Thank you,
Bill
------------------------------
Bill McIntire
Public Services Director
Dayton Metro Library
He/Him/His
------------------------------