This page is licensed under CC BY-NC-SA. It was adapted from Fact Checking AI in the AI Literacy guide by the University of Georgia Libraries under a CC BY-NC-SA license.
As you evaluate the accuracy and credibility of an AI tool and its responses, consider some of the following questions:
Does it cite its sources? Are they real or hallucinations? AI can hallucinate or fabricate information, presenting imaginary or nonsensical statements as facts.
Who is the intended audience and what is the rhetorical purpose? Test how the tool's responses change if you tweak the context for your prompt.
Recognize and monitor for bias. Use prompt engineering to probe for biases and attempt to correct for them. What AREN'T you seeing? What voices, perspectives, and accounts of the world are over- or under-represented?
Try lateral reading: Explore outside your source to verify its information. As Michael Caulfield says in Web Literacy for Student Fact Checkers, "Once you get to the source of a claim, read what other people say about the source (publication, author, etc.). The truth is in the network."
The SIFT method asks you to make "four moves" when evaluating a source: Stop; Investigate the source; Find better coverage; and Trace claims, quotes, and media back to their original context.

Learn more about the SIFT method - U Chicago LibGuide
The SIFT method was created by Mike Caulfield, and his materials on it are shared under a CC BY 4.0 license.