How Arizona universities are dispelling fear, shifting the conversation surrounding AI
Artificial intelligence, or AI, has become a hot topic at universities over the past year. Generative AI tools that are capable of writing articles and creating images have some professors concerned about cheating.
But some people at Arizona universities are finding ways to dispel some of that fear, educating staff and students on how they can use those tools positively.
The Arizona Board of Regents has not provided a policy or guidance to the universities regarding AI, leaving the schools to make their own decisions about acceptable use. Both Arizona State University and the University of Arizona have created new syllabus guidelines for their classes. In those, teachers can outline whether they want students to be able to use AI a lot, a little, or not at all.
Mina Johnson-Glenberg directs the Embodied Games XR Lab at ASU. XR stands for extended realities.
“We create content along those realities for, generally, STEM education so we make games for like fourth through adult-age learners,” she said.
The general philosophy at ASU is that students should graduate AI literate.
“AI is going to be around for a long time,” Johnson-Glenberg said. “It’s not going to go away. So we want our students to have some experience working with it.”
Instructors are encouraged to think of AI as a collaborator. That means never putting it in the driver’s seat by having it produce an entire assignment, such as writing a paper, but integrating it in other ways.
“And then in the final product, when you turn it in to your professor or instructor, you need to cite where you did that,” Johnson-Glenberg said. “So using proper APA style or whatever citations for where the AI was used.”
Johnson-Glenberg teaches a class that allows students to do just that, using AI as a tool to enhance the work they’re already doing. One of her students made a game to help with Parkinson's disease rehabilitation.
“She needed to make an interface like ‘What would the game look like?’ But she’s not an artist, right? So students used to turn in little stick drawn figures to show me what it would look like, but now with these text-to-image AI generators, she made this gorgeous interface,” Johnson-Glenberg said.
Another one of her students used it to code a new keyboard for people with cerebral palsy.
“He was a pretty good coder, but he used AI to help him code up some of the language and he said that it helped him go 50% faster,” she said. “Now it makes errors, right? So you have to know a little bit how to correct the errors from the AI, but the fact that he was able to use that and save time was very beneficial.”
She added that the most important thing to note about these tools is that they’re nowhere near perfect. For instance, one of her guest speakers asked an image generator for a photo of ‘frozen pears.’
“So he showed us the picture and indeed there were baskets filled with pears that weren’t quite frozen, but on top of them were like these weird heads that looked like Elsa,” Johnson-Glenberg said.
The generator picked up a character from the Disney movie Frozen instead of understanding the word as an adjective. But while some tools make obvious errors, others are more advanced.
That’s why ASU has put together student and staff committees that lead discussions on AI. Frank Liu is pursuing a Ph.D. in computer engineering. He heads up the student committee.
“We discuss different topics that the faculty gives us and our goal is to be able to help inform decision making at ASU with regards to generative AI policy,” Liu said.
His group wants to unpack fears among professors and educate them on how AI could be used positively.
“Like I used ChatGPT for grammar,” Liu said. “When I’m writing, it can fine tune and find small grammar things. I ask it for suggestions of like ‘how can I reword this sentence to make it stronger?’”
Barney Maccabe directs the Institute for Computation and Data Enabled Insight at the University of Arizona.
He said the university has created a new working group where about 150 faculty, staff and students are having an ongoing conversation about AI. One of the reasons for that is that last fall, the university’s plagiarism checkers started looking for the use of AI in student work.
“And we didn’t understand how they had come about their understanding of what constituted plagiarism or inappropriate use of the tools and they didn’t give us any information,” Maccabe said. “So in fact, the recommendation was to turn that off as quickly as possible.”
That’s because the program was turning up false positives. He said much like ASU, UA wants students to know how to use these tools because they’re already being used in the professional world.
“At the end of the day, it’s your responsibility to make sure that what you turn in, what you have written, you can stand behind and you have to be part of that creative process,” Maccabe said.
If there’s one thing all three of these AI aficionados agree on, it’s that generative AI tools will ultimately be more friend than foe.