Does ChatGPT tell the truth?

ChatGPT can be a helpful tool, but it's not perfect. If you use the model in your classroom, it is important to recognize its limitations and teach students how to identify them. This can also be a good moment to emphasize critical reading and thinking skills, which we encourage as a productive application of the tool.

  • It might sound right but be wrong

    • Sometimes ChatGPT sounds convincing but gives you incorrect or misleading information (often called a “hallucination” in the literature).

    • It can even make up things like quotes or citations, so don't use it as your only source for research.

    • Sometimes it might say there's only one answer to a question when there's more to it, or misrepresent different sides of an argument, mistakenly giving each side equal weight.

  • It doesn’t know everything

    • ChatGPT's knowledge has a training cutoff, so for the most part it doesn't know about current events or trends.

    • ChatGPT is currently trained primarily in English, so its performance in other languages may be weaker.

    • We can’t say definitively what it does and does not know, and we don’t fully understand when it will or won’t express confidence in incorrect assertions.

  • No access to tools like calculators or the internet (mostly)

    • ChatGPT can't browse the web or access up-to-date info from the internet without plugins enabled.

    • Without internet access or plugins, it can't verify facts or reliably perform tasks like complex calculations.
