AI tools like ChatGPT are advancing by the day. Here's how librarians are using them, plus an overview of valuable AI tools and terms.
Illustration and SLJ October 2023 Cover by James Yang
That quote isn’t from a teacher or a librarian or an expert in the field, although it does come directly from the source. It was written by ChatGPT, the AI-powered chatbot that has captured imaginations—and stoked fears—over the impact machines that seemingly can think for themselves might have on the future of just about everything, including education.
Most readers would be hard-pressed to discern whether that first sentence was written by a human or AI. So perhaps it’s understandable why so many school leaders had an initial knee-jerk response.
“It was a very fearful reaction of like, ‘Oh no, everybody is going to cheat,’” says Richard Culatta, CEO of the International Society for Technology in Education. “But what has happened as the educator and librarian community has really started to explore the possibilities is that this has huge potential for education. This isn’t just a cool new website. This is a real tool.”
That response has played out at the macro level. Almost as soon as it appeared, some of the largest school districts in the country, including Los Angeles, Seattle, and New York City, banned access to ChatGPT, citing fears over the ease with which students could cheat.
Some districts have already reversed their decisions. In New York City, Department of Education chancellor David C. Banks wrote that the city’s initial response “overlooked the potential of generative AI to support students and teachers.”
“If you play around with it, you’re like, is this thing alive? It truly seems to be understanding and manipulating language,” says Maya Bialik, who teaches seventh-grade science outside of Boston.
Before moving to the classroom, Bialik was a researcher at the educational nonprofit Center for Curriculum Redesign, where she literally wrote the book on artificial intelligence in education, so she’s not surprised by the initial reaction of many educators. “Definitely at first there’s fear. A teacher even asked me, ‘Should I be afraid?’ I showed her how to use it and she became a total power user.”
AI is certainly complicated. It’s a powerful tool that is seemingly advancing by the day, and while some teachers and librarians might fear it, there’s no hiding from the impact it will have on learning.
Beyond ChatGPT: Useful AI Tools
Today, ChatGPT and AI are sometimes referred to interchangeably, similar to how AOL was once equated with the internet in its early days. But ChatGPT is a single AI program among many that are available to students and librarians. “It’s a powerful tool. But it’s critical that educators are exploring other tools and figuring out where each can be useful,” says ISTE CEO Richard Culatta. Here are a few suggestions: Canva, ChatPDF, Curipod, Grammarly, Khanmigo, LessonPlans.ai, MusicLM, NOLEJ, QuestionWell, Stable Diffusion, Winston, and Wisdolia.
Overcoming fear starts with understanding, and there’s a lot to unpack when it comes to artificial intelligence.
First, while ChatGPT is new, AI is not. The idea of machines that can think for themselves was first proposed in 1950 by the British mathematician Alan Turing. AI was popularized in science fiction, most notably in the murderous form of HAL 9000 in Stanley Kubrick’s 1968 film 2001: A Space Odyssey. But reality proved far less scary in the form of IBM’s chess-playing AI program, Deep Blue, which defeated world chess champion Garry Kasparov in 1997.
At its most basic, AI is an advanced computer system that can process vast amounts of data to solve problems. ChatGPT and others like it are a special type of AI called large language models that are trained through exposure to huge quantities of text—billions of pages collected from websites, tweets, news stories, and more—and utilize natural language processing not only to understand human words and phrases but to generate human-like responses.
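To make that “prompt in, human-like text out” loop concrete, here is a minimal sketch of how a program might call a large language model. It assumes the OpenAI Python client and an API key set in the environment; the model name and prompts are purely illustrative, not a description of any tool mentioned in this article.

```python
# Minimal sketch: send a prompt to a large language model and print the
# generated reply. Assumes the OpenAI Python client (pip install openai)
# and an OPENAI_API_KEY environment variable; any comparable
# chat-completion API follows the same request/response pattern.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # model name is illustrative
    messages=[
        {"role": "system", "content": "You are a helpful school librarian."},
        {"role": "user", "content": "Suggest three novels for a reluctant seventh-grade reader."},
    ],
)

# The model returns one or more candidate replies; print the first.
print(response.choices[0].message.content)
```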
These new generative AI programs can create almost anything an individual can think of, and text is only the tip of the iceberg. There are already AI-powered tools that make artwork, photorealistic images, video, and music, all from a simple prompt.
Culatta says he recently worked with students using a tool called Stable Diffusion, a text-to-image AI program. “Students were working on a project creating a city of the future. I said, go in and have Stable Diffusion generate the image of what you’re creating.” The results wowed kids and adults alike.
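For readers curious what a text-to-image workflow like that involves behind the scenes, here is a rough sketch using the open-source Stable Diffusion weights through the Hugging Face diffusers library. The checkpoint name, prompt, and GPU assumption are illustrative; this is not a description of the exact setup Culatta’s students used.

```python
# Rough sketch: generate a "city of the future" image from a text prompt
# with Stable Diffusion. Assumes the diffusers and torch packages are
# installed and an NVIDIA GPU is available.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative model checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a GPU; CPU works but is much slower

image = pipe("a sustainable city of the future, aerial view, concept art").images[0]
image.save("future_city.png")
```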
These tools can also make the lives of teachers and librarians a lot easier, turning tasks that usually take hours into minutes. There are programs to write lesson plans, make flashcards, and create tests. In addition to using AI tools in her classroom, Bialik has created her own, QuestionWell, which generates questions for everything from exit tickets to multiple-choice quizzes and creates multiple versions of a test for student retakes.
It’s not only small developers bringing AI to classroom technology. In July, Khan Academy announced a partnership to integrate its AI student tutor and teaching assistant, Khanmigo, with Instructure’s learning management system, Canvas. The tool is designed to offer personalized support to students and help teachers write lesson plans, create rubrics, and even provide feedback on student essays.
The possibilities are exciting—and overwhelming. And librarians can and should play an important role in figuring out how exactly this technology can enhance student learning.
Joni Gilman is the media specialist at Seckinger High School in Gwinnett County, GA. Her school is part of a cluster where AI is being incorporated into all areas of learning, kindergarten through 12th grade.
Gilman says she views AI tools as collaborative partners, helping to make her job more streamlined. She’s used ChatGPT to brainstorm lesson ideas and for help making reading suggestions.
“Students think we’ve read every book in our collection. But I can ask [AI] for an engaging summary of a book, or help creating different promotional things to grab student attention, whether it’s a social media post idea or giving book recommendations,” Gilman says.
Gilman is already imagining a library where AI could “help us create better collections when we’re making book orders. AI can give us a diversity audit, tell us where we have holes in our collection, look at usage data to find emerging topics, and make better book orders. It can help us with weeding and resource allocation.”
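As a purely hypothetical illustration of the kind of collection analysis Gilman imagines, the sketch below summarizes a catalog export and asks an LLM where the gaps are. The CSV file name, its columns, and the overall workflow are assumptions for illustration only, not features of any particular library system.

```python
# Hypothetical sketch: summarize a library catalog export by subject and
# ask an LLM to point out gaps and usage patterns. The file name and the
# "subject"/"checkouts" columns are assumed for illustration.
import pandas as pd
from openai import OpenAI

catalog = pd.read_csv("catalog_export.csv")  # assumed catalog export

# Count titles and total checkouts per subject heading.
summary = (
    catalog.groupby("subject")["checkouts"]
    .agg(titles="count", checkouts="sum")
    .sort_values("checkouts", ascending=False)
    .to_string()
)

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": (
            "Here is a summary of our school library collection by subject:\n"
            + summary
            + "\nWhere are the gaps, and which subjects look over- or under-used?"
        ),
    }],
)
print(response.choices[0].message.content)
```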
She’s quick to add that “nothing is going to replace that human interaction when talking about books with an expert.” In large part that’s because AI isn’t perfect. ChatGPT was trained only on data through September 2021, so many of its responses aren’t current. Once, Gilman says, it recommended a book that didn’t exist, an AI phenomenon known as a hallucination.
Since ChatGPT and other AI tools aren’t connected to the school’s library system, recommendations aren’t always available in the collection, but even that will change soon. Accessit, a library management system that Gilman’s school uses, is planning to give customers access to a third-party AI tool called Talpa Search, an experimental program that can help students find a book they’re looking for, even if they don’t know the name of it.
“For example, the user might search for ‘book about dragons with a red cover,’ and Talpa would then search multiple databases to find options that fit this search,” wrote Sarah Jordan, chief customer support officer at Accessit, in an email. “You can see which results are in your library that you can borrow, as well as other books which fit your search criteria.”
AI will be more than just a tool, however. Schools will also need to consider how to teach students to be critical users and prepare them to live in a world where they will be learning and perhaps, one day, working alongside AI.
School librarians are perfectly suited for this new role.
“They have experience curating content, they have a creative, physical space to host kids and teachers to try things out, and, by the nature of their job, they interact with nearly every faculty member and every student,” Culatta says. “It’s the perfectly written job description for a person to help a school explore a new tech geared to creating content.”
Much in the way librarians evolved from teaching traditional literacy to training students in media and digital literacy, the next step is AI literacy. Just like learning the skills to effectively search an online database, students will need to learn how to converse with AI to get the best research results and to distinguish between what’s real and what’s AI generated.
“There is so much out there that students will find that won’t be true or will be fake,” Gilman says. “It’s going to be key for librarians to build that bridge and help students better understand what they are finding on the internet.”
That’s easier said than done. Teaching students how to critically assess where information comes from is a crucial part of information literacy, but almost an impossibility when it comes to AI. It’s one of the reasons that make Sara Kelley-Mudie, a librarian and educational technology specialist at Beaver Country Day, an independent school in Newton, MA, anxious about the technology.
“How do we ask students to be ethical consumers of information when it’s easier and easier to just get an answer and not actually be able to trace it to its origins?” she says. “I teach students how to investigate sources, trace claims and quotes, and that’s really hard to do with generative AI, especially because it feels so authoritative.”
Although fears that AI might overthrow humanity are overdramatic at this point, there are certainly areas for concern, especially when it comes to students and issues of privacy, bias, and cheating.
In terms of bias, Bialik acknowledges it is a worry. But she also points out that human beings can be just as problematic, if not worse. Bialik holds a master’s in Mind, Brain, & Education from the Harvard Graduate School of Education and says she is consistently shocked to hear teachers make sweeping, biased statements about students.
“In psychology, you gather data and make sure it’s not tainted. But in teaching, you don’t have time for that, and there’s a lot of shortcuts that are full of bias,” she says.
“At least on systems, when it becomes codified, you can see the bias,” Culatta adds. “When a teacher is making decisions that are biased, there’s no way I can get an audit of that. When you have a system making recommendations with bias, at least in that case you can review and do something about it.”
Even privacy concerns aren’t so black and white. “It’s easy to say all data should be private,” Culatta says. “But then when you peel it back, what if a student is at risk? Or what if I can tell through an AI interaction that a student is struggling with a math concept and AI can alert a teacher about that and make sure they aren’t lost? Understanding where protections should be is critical, but it’s not binary, either completely private or completely open.”
When it comes to cheating, anxiety has been particularly high within the humanities where, for the first time, students have access to what is essentially a calculator for natural language. Some educators are already using Winston, a detection tool that determines if student work was created via ChatGPT or other AI large language models. Meanwhile, other disciplines have been dealing with this issue for years—from students using Photomath (photomath.com/en) to solve algebra problems to completing world language assignments with the help of Google Translate.
Educators have had to rethink instruction with those tools in mind, says Kelley-Mudie. Now the rest of the field will have to do the same. Or as Culatta puts it, if there’s a concern about cheating, the problem is not with the AI tool, but with the assessment.
For example, an assignment to write an essay about the American Revolution is easily completed with the help of ChatGPT. A savvier approach: have students ask ChatGPT to write multiple essays about the American Revolution, then go through each one to critique it, find mistakes, and fill in information the AI program may have overlooked.
Another solution for teachers is to put the focus on how students document their process. For a research seminar she taught last spring, Kelley-Mudie had students include acknowledgements in addition to works cited. Alongside sources, students referenced all the things that inspired or helped them, including AI tools.
“Not in a gotcha way but in a reflective way,” Kelley-Mudie says. “The final product is great, but we really should care about the steps they took to get there.”
Gilman agrees that bringing AI into the open is the right way to make it a tool for learning rather than cheating. At her school, teachers are already talking with students about how ChatGPT can be a collaborative partner to provide feedback, removing any excuse for not turning in a revised essay.
But Gilman also recognizes other areas of concern. She says equity is going to become an important issue for student access when tools like ChatGPT, which are currently free, begin to charge a fee. She also says the ability to create lifelike deep fakes has the potential to be very harmful to students from a bullying perspective. Schools will need to have policies in place to address these kinds of hazards.
Despite the risks, Gilman still says that AI and “the benefits of it, the things that it can do, are just mind-blowing.”
And what does ChatGPT have to say about it all?
“As an AI language model, I don’t have personal opinions or desires. The decision of whether to use AI models like ChatGPT in schools ultimately rests with educational institutions, policymakers, and educators. It is important to carefully consider the benefits, risks, and ethical considerations associated with AI implementation in educational settings. Open discussions, thorough evaluations, and informed decisions should guide any choices regarding the use of AI technology in schools.”
A human couldn’t have said it better.
An AI Glossary of Terms
Algorithm
Artificial intelligence (AI)
Chatbot
Computer vision
Deep learning
Generative AI
Machine learning
Natural language processing (NLP)
Neural network
Reinforcement learning
Robotics
Supervised learning
Unsupervised learning
Andrew Bauld is a freelance writer covering education.