
Deter, Detect, Design, and Integrate

Posted 12:58 p.m. Monday, Feb. 20, 2023


How to encourage student learning and motivation in the wake of ChatGPT

By: Bryan Kopp, Tesia Marshik, and Lindsay Steiner 

As a follow-up to our first blog post, “AI help, the semester is starting” and CATL session, “AI did my homework”, we offer recommendations to instructors for addressing concerns (and opportunities) regarding new AI tools such as ChatGPT. We recognize that instructors may wish to engage with AI in different ways, and to different extents, so our suggestions fall into four main categories: deter, detect, design, and integrate. Ultimately, we encourage you to choose the approach(es) that best suit your teaching context.  

DETER 

One of the first questions many instructors have upon learning about new technologies like ChatGPT is “how do we stop it?” While it is understandable to have a negative reaction and/or to assume the worst about its use in educational settings (indeed, instructors had similar reactions to the introduction of calculators, spell-check, and even ballpoint pens!), it’s important to recognize that AI is here to stay. Attempts to ban access outright (as some school districts have tried) are unlikely to be effective given the ubiquitous nature of technology in our students’ lives. One of the simplest and most proactive measures you can take to deter its use in your courses is to develop a syllabus statement in which you clearly communicate your expectations. As noted in the previous CATL blog post, there are pros and cons to identifying potential sources of cheating, but it is generally better to be direct than to hope students don’t know about a tool, or to assume they already know what counts (or doesn’t count) as academic misconduct. When you clearly set your expectations from the start, you’ll be less likely to have unpleasant interactions with students later. 

DETECT 

The second question instructors often have is “how can I tell if a student’s work is AI-generated?” The short answer is that it’s not easy. Academic researchers and tech companies are working to develop ways to detect whether a sample of written text was generated by a program like ChatGPT. Turnitin has suggested it will release an add-on to its originality-checking software for educational use later this year. In the meantime, two free online detectors exist. 

However, there are many limitations regarding the use of these detectors: 

  • They report the statistical likelihood that text was AI-generated, not definitive proof. Even if a detector judges a text 90% likely to be AI-written, there remains a 10% chance it wasn’t. 
  • They are not very transparent in showing how the likelihood estimate is determined (i.e., they are not like plagiarism detectors, where you can directly compare the student’s work to some other source). 
  • Perhaps most concerning: the validity of these detectors is still relatively unknown, and some reports suggest their accuracy in identifying AI text may be as low as 26%. There have been multiple instances where both detectors falsely characterized authentic text as AI-generated and vice versa. These systems will likely continue to evolve but, without current research to support their validity, they should be viewed only as supplemental tools, not definitive solutions. 
  • Given how rapidly the technology is evolving, efforts to “catch” it currently mimic something like a cat-and-mouse game. With the design of each new app to detect AI-generated text, comes the development of another app to fool or circumvent it. 
  • Lastly, because they are not integrated into e-learning platforms (as Turnitin is integrated in Canvas), there are significant practical and ethical barriers to using stand-alone programs such as those listed above. Specifically, instructors must individually copy-and-paste or upload student papers. And if they aren’t uploading all student submissions, instructors may selectively act upon suspicion, which opens the door to bias and equity concerns about which students are more likely to be targeted for review. One potential compromise is to randomly sample students’ written work to be reviewed by such programs...but many instructors who care about academic integrity may ultimately find that unsatisfying. 
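If you do adopt the random-sampling compromise described above, the selection should be genuinely random rather than left to instructor discretion, since discretionary picks reintroduce the very bias concerns random sampling is meant to avoid. A minimal sketch in Python (the roster names and sample size are hypothetical):

```python
import random

def sample_for_review(submissions, k, seed=None):
    """Pick k submissions uniformly at random for detector review,
    so no student is singled out on suspicion alone."""
    rng = random.Random(seed)  # a fixed seed makes the draw reproducible
    return rng.sample(submissions, k)

# Hypothetical roster of submission IDs
roster = [f"student_{i:02d}" for i in range(1, 25)]
picked = sample_for_review(roster, k=5, seed=2023)
print(sorted(picked))
```

Recording the seed lets you show, if the process is ever questioned, that the sample was drawn by chance rather than by targeting particular students.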

Because of these limitations, CATL suggests that, at this time, it may be more worthwhile for instructors to devote their time and energy to designing engaging classroom assignments and assessments that make AI text generators less attractive or helpful, or to integrating such technology into their courses in ways that may benefit student learning. If, however, you decide to use AI text detectors, it is important to abide by the UWL Academic Misconduct policies and procedures.  

DESIGN 

Instructors have an opportunity to rework and redesign assignments in new and exciting ways. Assignment redesign can focus on how to support and assess learning in ways that are not easily addressed by using AI tools. Additionally, we can revise assignments with an eye towards improving student engagement and encouraging accountability in the work they complete for our courses.  

Consider the following steps when redesigning an assignment:  

  • Contextualize assignments for students so that they understand the purpose and expectations of the assignment within a course unit/the entire course. 
  • Emphasize the process work needed for an assignment. While the final product is important, asking students to submit process work such as brainstorming materials, outlines, drafts, and more can communicate the importance of students’ thinking and ideas that we may not see in a final document. 
  • Ask students to reflect on and explain their work throughout your course. Reflection can happen in discussions (in-class and online), informal reflective writing assignments, or reflective annotations within an assignment deliverable. 
  • Assign work in class using a low-tech format (e.g., handwritten responses). 
  • Integrate multimodal/multimedia aspects into assignments. 
  • Make assignments more authentic and specific. Consider developing a client project or Community-Engaged Learning course. These kinds of assignments have a “life” outside of the classroom and may provide additional motivation for students to submit authentic work. 

INTEGRATE 

A third way instructors can address AI is by integrating it into our classes. AI is already affecting higher education, and its impacts will likely extend well beyond school. How will students' post-graduate plans and future professional lives be transformed by AI? While it is too early to answer this question, we can start thinking about how AI relates to the courses we teach. Below are four strategies you can use: 

  • Talk about it. Consider devoting a portion of a class or a unit to discussing the ethical dimensions and professional implications of AI in your subject area. Questions about the limits of AI in your field, the value of human-powered reasoning, or the rewards and risks of technology more generally can engage students with topics they will encounter in the future. 
  • Promote information literacy. Many are predicting AI will be used to generate and propagate misinformation. For this reason, it is more important than ever to help students develop their critical evaluation skills. One way to do this is to give students AI-generated output on a question specific to your class. Next, have the students fact-check the responses, practicing strategies such as lateral reading (looking across several sites/resources rather than just one). 
  • Assign as process work. AI may be viewed as an assistant, if an unreliable one. Instructors can ask students to use AI to generate content as part of their research process for an assignment. Students can then work to validate or support findings with research. AI can also be used to generate templates that can help students jumpstart the writing process. Either way, the goal would be for students to go beyond the AI response and/or show they understand it. 
  • Generate samples to critique. A variation on the above themes is to give students sample AI output in response to a prompt you ask in class. Next, have the students critique it using a rubric, scoring guide, or review questions. Such practice can help students internalize evaluation criteria for the assignment and see for themselves the limits of AI.

Continue the conversation 

Want to continue exploring how AI may affect (or even improve) your students’ learning? Sign up for CATL’s new AI Community of Practice! Our first meeting is on Monday, February 27 from 1:10-2:05 pm in Union 3115. 
