David Schultz was explaining the difference between normative and factual statements in his Ethical Public Policy class at Hamline University last Friday when he asked students to look up the sales tax for Ramsey County on ChatGPT.
One student called out one number from the chatbot. Another student called out a second number. Someone else found a third figure. The student sitting next to me used Google search instead and found yet another figure — but kept silent.
"How do we resolve this?" Schultz, a political scientist, asked.
"Look it up ourselves," a student replied.
As the new school year begins, professors at colleges and teachers in middle schools and high schools are all wrestling with what to do about ChatGPT and other AI services that now provide a shortcut not just to getting information but to writing and presenting it.
Schultz decided to run headlong into the dilemma and encourage students in his three undergraduate courses to use it in class, for papers and for tests. He invited me to visit his classes this semester to watch, and I eagerly accepted.
Nearly 30 years ago, as a tech reporter for the Associated Press, I wrote about how the nation's colleges and universities had become wired societies through the use of e-mail, foreshadowing its use in business and, really, by everyone.
Today, higher education is a similar proving ground for AI, specifically the large language models of products like ChatGPT. These chatbots can produce essays, even novels, with a few simple requests, or "prompts."
In January, three University of Minnesota law professors put some of their test questions through ChatGPT and it came back with answers that, when graded blindly against real students, achieved "low but passing grades." The professors published the results of the experiment themselves.
"In the last six months or so, higher ed has gone bonkers over artificial intelligence," Schultz told students in the first session of his American Government and Politics course. "I'm going to let ChatGPT generate a lot of what we're going to do."
In a show of hands, many of the students in the two courses indicated some familiarity with chatbots. One said that he regarded Microsoft Bing as a better tool for learning because it attributed its sources.
That led Schultz to say that students could use whatever AI chatbot they thought best. "I don't know where the road is going to take us," he said.
But he said he would also require students to verify facts that are offered up by the chatbots. "You have to look up and find a source for the information," Schultz said.
Grant Larson, a senior in political science, told me that he hadn't used ChatGPT until Schultz required it. And some of the early experience confirmed stories he'd heard that he called "a little crazy."
"You would definitely have to fact check it yourself," Larson said.
Rani Hamza, a sophomore aiming to finish Hamline in three years and then go to law school, said he's been using ChatGPT since early in the year for help when he has to write a paper or essay.
"I struggle a lot beginning a paper," he said. "I'll put in a topic [in ChatGPT] and read what it says to get a start."
Hamza dubbed the writing output of ChatGPT "good" but said he doesn't think it's trustworthy. "It gives you an idea, but it doesn't give you an exact fact," he said.
In the ethics course last Friday, Schultz asked students to type the statement "Life begins at conception" into ChatGPT. The output was no different from what you might expect, or from what Schultz teaches. The statement, the chatbot said, is controversial and broadly debated.
In the American government course, he told students to ask ChatGPT, "Who governs America?" It's the question, he said, that students themselves will have to answer in the final test of the course.
The chatbot came back with a textbook description of a representative democracy with a complex system of federal, state and local governments. That's one model, Schultz said.
The other way to understand who governs the country is by studying who benefits from government actions, who the winners and losers in America are. Schultz said he wants students to decide which of the two models they find more compelling.
I'll check back in on Schultz and his students later this fall. But he has already signaled that they won't be able to answer the main questions on his tests with a chatbot.
That's because his questions are just like the one that my editor always asks me when we discuss ideas for this column:
What do you think?