Artificial intelligence can both help and hurt students in their education, and finding a policy that walks that fine line is a challenging task.
“Teachers are going to approach AI differently,” professor Eric Odegaard said.
Odegaard teaches English at Citrus College and heads a workgroup created by the Academic Senate to address what language professors should use in their syllabi regarding the use of AI.
The Academic Senate was formed to “ensure democratic participation of the faculty in shared governance,” according to its website. It also aims to make sure policies and procedures at Citrus College are made with the best interests of the college in mind.
“Instructors noticed students needed some guidance when it came to what’s acceptable in terms of using AI technology in the class,” Odegaard said.
Odegaard said the Academic Senate decided the best way to handle this was to create a workgroup to develop approaches to students’ use of AI.
Because AI is a relatively new piece of technology, defining parameters on student usage has proven to be a challenge.
The workgroup drafted four statements, ranging from most permissive to most prohibitive.
These were options professors could insert into their syllabi to establish their expectations for students’ use of AI.
They varied on what would be acceptable for students to use on assignments and what would be categorized as plagiarism.
The most permissive statement said that ChatGPT and other AI tools may be used whenever the professor gives explicit permission, so that their use is not misconstrued as plagiarism.
The two middle statements combined elements of both approaches, each leaning slightly in one direction.
The most prohibitive statement said tools such as ChatGPT and Grammarly may not be used at any time, and included the possible punishment of notifying the Office of Student Affairs.
However, Odegaard said no professor is mandated to say anything about AI.
“The Academic Senate does not have the power to tell individual instructors how to teach,” Odegaard said.
Stances on AI usage in the classroom vary widely across campus. Some permit and encourage the use, while others do not accept it in any form.
Professor Jack Hanna, who teaches psychology, created a separate section in his syllabus where he provided his stance on AI and on how he deems it acceptable for his Psychology 203: Research Methods in Psychology course.
“First, AI can be incredibly useful,” Hanna said. “We can use it to make our resumes, write our papers, create presentations, create exam study materials, explain difficult material to us and much more.”
Although Hanna addresses the benefits, he said the technology is still relatively new so users would be wise to proceed with caution.
He said he permits its use in his classes as long as it does not overlap with the submission of original work.
That gray area, in which AI can be used in some cases and is prohibited in others, is something Hanna said he promotes and emphasizes.
Psychology is a field that involves both original writing and scientific fact. It is in learning and retaining that factual material that AI has found a place in his course.
On the other hand, an English course is meant to promote the development of fundamental skills such as writing and comprehension, Odegaard said.
This is where he said he believes the line should be drawn.
“I go the most prohibitive route because I am teaching writing,” Odegaard said. “I think what happens with AI is that it takes away students’ ability to gain skills because it does some of that work for them.”
He uses the most prohibitive statements in his syllabus.
“Since allowing another writer to help complete any part of your work is plagiarism, then using AI tools such as ChatGPT, Grammarly, Quill, or other similar tools based on generative artificial intelligence is also considered plagiarism,” Odegaard said.
Odegaard said professor Senya Lubisich has been a prominent voice in the workgroup alongside him.
Lubisich is a history professor and the coordinator for online classes at Citrus.
“Failure to address AI and its use in your course leaves a gray area and potential for confusion,” Lubisich said.
She said she believes the gray area is a downside, and that clear lines should be drawn about whether students can use AI in their coursework.
Because Lubisich coordinates online education, she said she has a front-row seat to a whole different set of challenges regarding AI.
With online classes, professors have to find ways for students to establish a voice the professor can become accustomed to, Lubisich said.
Bots threaten to take away a student’s voice and become the sole contributor to assignments, Lubisich said, and that has changed the way professors teach, with more focus on language than ever before.
“It feels much more like policing than teaching when we are checking for AI-generated content, plagiarized content,” Lubisich said.
Citrus has made no collegewide statement about the use of AI. Odegaard said it does not need to, but Lubisich said she believes it should.
“That being said, I do think that Citrus should have guiding language tied to academic integrity about the use of AI,” Lubisich said. “We need to be clear that the overriding purpose of academic assessments is to see what the student has learned and can do.”
Because there are so many different courses and subjects throughout Citrus, having one way to use and apply AI would be difficult, which is why statements were created and suggested for use, Odegaard said.
Although AI acceptance varies greatly across the college, both Lubisich and Odegaard predict it will remain in education, and likely become a bigger question mark in Citrus’ future.