Show Us Your Syllabus: Chatbot Edition


Chatbots on the Syllabus

I had the chance to speak at Hofstra University's teaching and learning conference yesterday, and I'm pretty sure someone expressed a concern about ChatGPT, the new artificial intelligence text generation tool from OpenAI, in every concurrent session I attended. Hofstra starts classes next week, so faculty there are right now updating their syllabi to address the advent of ChatGPT and similar tools. And I know Hofstra faculty aren't the only ones making last-minute changes to their syllabi in response to chatbots.

Last week in the newsletter, I shared a syllabus statement on ChatGPT posted to Twitter by Davidson College professor Chris Marsicano, and I invited you, dear readers, to share your new syllabus statements on ChatGPT with me. I know that when working through course policies, it can be incredibly helpful to see how other instructors are handling such things in their syllabi, so I now present the first installment of what will likely be a recurring newsletter feature: Show Us Your Syllabus!

Andy Van Schaack, who teaches human and organizational development at Vanderbilt University, wrote in to say that it might be time to change the definition of plagiarism. It's no longer just passing off someone else's ideas or words as your own; it's taking credit for any ideas or words that aren't yours, whether or not they came from a person. Unless, of course, you consider ChatGPT to be a "someone" whose ideas and words can be purloined. For his syllabus statement, Andy asked ChatGPT to write one for him. Here's the result, with Andy's citation:

=-=-=-=-=-=-=

Use of AI-Based Writing Tools

  1. Students may use ChatGPT or other AI-based language models to assist with generating ideas or writing prompts, but the final product must be their own original work.
  2. Students may not use ChatGPT or other AI-based language models to complete assignments or exams.
  3. Students must cite the use of ChatGPT or other AI-based language models in their work, following the style guidelines for the course.
  4. Students are responsible for understanding the limitations and potential biases of ChatGPT and other AI-based language models and for critically evaluating their output.
  5. The use of ChatGPT or other AI-based language models is not a substitute for participating in class discussions or completing course readings.
  6. The instructor reserves the right to modify this policy as necessary.


Note: The preceding section was generated by ChatGPT from the prompt, “What would be a good syllabus policy regarding the use of ChatGPT (or other AI-based language models) to use for a college-level course?”

=-=-=-=-=-=-=

I appreciate that Andy included the prompt he gave ChatGPT that yielded this statement. Including the prompt in a citation seems to be a growing best practice when citing AI generation tools. I see this a lot in the alt-text for images generated by DALL-E, for instance. Asking students to cite their own use of ChatGPT seems sensible at this moment. There might come a day when we think of such tools like we think of spellcheck and don't feel compelled to cite them, but we're not there yet, at least not in higher education.

I have questions, however, about points 1 and 2. What does it mean for a student's final product to be "their own original work"? If a student used ChatGPT to generate ideas for a paper assignment, is the resulting paper the student's "original" work? "Original" is a complicated term in academic contexts, and I'm not sure ChatGPT understands those complications. Similarly, in point 2, what does it mean to use ChatGPT to "complete" an assignment? Point 1 would imply that some use of ChatGPT is okay, so maybe point 2 just means you can't have ChatGPT write the whole thing for you? Again, ChatGPT's word choice is questionable.

Lots of questions, and I think Andy shares them. He said he's planning to have students document their various uses of ChatGPT and then compile that documentation so they can analyze it as a class. "We’ll all work to figure this out together," Andy wrote. That seems smart at this point in 2023.

Megan Kittle Autry teaches a Master's-level technical communication course in the College of Engineering at North Carolina State University. Instructors like Megan who teach writing definitely have ChatGPT on their radars this spring, and Megan has added a statement to her syllabus about the tool, as well as a whole class session devoted to it. "My challenge," Megan emailed me, "is that while I don't want students to use ChatGPT to complete assignments for my course, I know they will likely use one of these tools in their workplace (or be building them!)."

Here's Megan's new syllabus statement:

=-=-=-=-=-=-=

AI Use Policy

In this course, you will learn effective writing principles, along with how and when to use them. You will gain confidence as a writer through practice and feedback. To that end, using an AI program to complete the work for you is not permitted in this course. We will discuss how and when you may use it in your profession.

=-=-=-=-=-=-=

It's shorter than Andy's, but Andy got ChatGPT to write his, and ChatGPT is nothing if not long-winded. I like how Megan has framed her AI use policy around the goals of the course, which include learning the principles of writing and gaining confidence as a writer. To the extent that using ChatGPT short-circuits those goals, students shouldn't use ChatGPT or similar tools.

I recently asked Margaret Rubega, who teaches ornithology courses at the University of Connecticut, about the new sound ID tool in Merlin, the bird identification app from Cornell University. I use Merlin all the time when I'm birdwatching, and I like that it can clue me in to the birds around me by listening to their songs and calls. Margaret, however, isn't training amateur birdwatchers, at least not in her ornithology laboratory class. "The goal," Margaret wrote me, "is for students to gain skills... to advance their careers as professional field biologists. When the bird identification you are doing is DATA, it is important for you to be able to reach an ID for yourself... and to have a sense of how certain (which is to say, uncertain) you are about the ID." Given that goal, Merlin's sound ID tool isn't useful, in part because it doesn't convey its own uncertainty, and so Margaret forbids its use. As in Megan's technical communication course, the AI has the potential to short-circuit the learning goals of the course.

I have been wondering, however, how we might use AI tools like ChatGPT in service of particular learning goals, like the way writing and rhetoric faculty at the University of Mississippi asked their students to use an AI tool that generates counter-arguments to a given argument, or the idea I suggested in last week's newsletter of having students debate issues with ChatGPT as a way to understand argument structure or be more mindful about their word choices. As Megan indicates in her syllabus statement, her students are likely to use tools like ChatGPT in their profession, and finding ways to use these tools in support of course learning goals would be one way to prepare students for their use in future jobs.

Monica Linden, who teaches neuroscience at Brown University, shared her new syllabus statement on AI tools. This one is long, but worth reading. Like Andy, Monica had ChatGPT help write her syllabus statement.

=-=-=-=-=-=-=

USE OF ARTIFICIAL INTELLIGENCE TECHNOLOGY IN THE CLASS:

This is written specifically about ChatGPT, but applies to using any artificial intelligence technology in this class. I actually had ChatGPT write this statement, with a couple of editing prompts, to help you think about both what it can and cannot do. I copied this verbatim from ChatGPT, so the statement is not my own.

As a student in a writing class, you may be wondering how to use ChatGPT ethically and effectively.

First, it is important to remember that ChatGPT should be used as a tool for idea generation and grammar and syntax assistance. For example, you can input a topic or a writing prompt into ChatGPT and use the output to generate ideas and to understand different perspectives on the topic. You can also use ChatGPT to proofread and check your grammar, punctuation and syntax. However, it is important to note that the output should not be presented as your own work and should always be used as a tool to supplement and support your own ideas and writing.

Another way to use ChatGPT ethically is to properly cite any information or ideas generated by ChatGPT in your work. This means identifying the source of the information and giving credit to the author. This way you can use ChatGPT as a reference and not as an origin of the ideas.

It's also important for you to understand that ChatGPT should be used as a tool to assist your writing, not as a replacement for your own ideas and critical thinking. So, always try to think critically and come up with original ideas and concepts, and avoid relying too heavily on the tool.

As a student, it's important to know how to use the tool effectively. Your teacher will provide guidance on how to use the tool in a way that helps you generate ideas and improve your writing, rather than just using the tool to complete your work.

By using ChatGPT ethically and effectively, you can improve your writing skills and enhance your ability to generate ideas and communicate effectively.

Now that you see what ChatGPT wrote, you see both strengths and weaknesses in how it writes. For example, I find it to be a little repetitive and sometimes more broad than I would like. You can also probably tell that it’s not in the same voice as the rest of the syllabus. WORK IN THIS COURSE SHOULD BE IN YOUR VOICE!

=-=-=-=-=-=-=

Monica's right that ChatGPT gets a little repetitive, but then, ChatGPT needs an editor. The message, however, is consistent with the other syllabus statements above: AI tools like ChatGPT should act as aids to student thinking, not replacements for student thinking. And like the other instructors, Monica invites a conversation about these tools with her students. This is in stark contrast to much of the messaging to students around Wikipedia when it hit the scene years ago. Back then, many faculty forbade the use of Wikipedia instead of engaging in a discussion with their students about what the tool was and could do for them. Having ChatGPT draft a syllabus statement about ChatGPT, as both Monica and Andy did, is another way to communicate to students that these are tools worth exploring.

Since Monica's statement also mentioned students citing their use of ChatGPT, I asked her what that might look like. She said she's still figuring that out, but she's thinking of giving students some options for describing how they use ChatGPT:

  • I did not use ChatGPT for this assignment.
  • ChatGPT helped with grammar and readability, but the original writing was my own.
  • ChatGPT helped with idea generation, but I wrote the assignment myself.
  • A small amount of my work (just a couple of sentences) was written by ChatGPT.
  • ChatGPT wrote a significant portion of my work, but I edited it.

I like this menu, and I can also see it evolving over the semester as Monica and her students explore the tool and as the tool itself changes. Honestly, I didn't know I could get ChatGPT to check my writing's grammar and readability. I'll have to try that out.

To wrap this discussion up, I want to circle back to four main points about AI tools and teaching that I've been workshopping on my blog:

  1. We are going to have to start teaching our students how AI generation tools work.
  2. When used intentionally, AI tools can augment and enhance student learning, even towards traditional learning goals.
  3. We will need to update our learning goals for students in light of new AI tools, and that can be a good thing.
  4. New AI tools will require a rethinking of community norms and expectations around academic integrity.

I think we've touched on three of these four points. I'm not seeing explicit attention to new learning goals in light of new AI tools in the syllabus statements above, but that's not surprising at this moment, when instructors are just barely wrapping their heads around what these tools can do. I can imagine that twelve months from now, if AI writing assistants are standard practice in engineering professions, then Megan's technical communication course might have some new learning goals.

Thanks to Andy, Megan, and Monica for showing us their syllabi! I think this was a very successful first installment of "Show Us Your Syllabus." Keep an eye on future newsletters for more along these lines.

Lessons Learned from Pandemic Teaching

Back in the fall, I wrote down a number of themes I wanted to explore in the Intentional Teaching newsletter and podcast. One of my bullet points says, “Pandemic teaching is mostly over. Now what?” In this week's Intentional Teaching podcast episode, I talk with a couple of educators who have a unique perspective on this question.

Regan Gurung is associate vice provost and executive director of the Center for Teaching and Learning at Oregon State University, as well as a professor of psychology. Dwaine Plaza is a professor of sociology at Oregon State. The two of them are editing a forthcoming book titled Onward to Better: How Facing a Pandemic Will Improve Higher Education in the 21st Century. Regan and Dwaine are in the interesting position of having read about two dozen chapter submissions for the book, all about lessons learned from teaching during the COVID-19 pandemic and authored by faculty, staff, and administrators, including a healthy number of teaching center directors. Full disclosure: I am one of those contributors! I wrote a chapter on our experiences with pandemic teaching at Vanderbilt University.

I invited Regan and Dwaine onto the podcast so I could pick their brains about what they’ve learned reading and editing all those chapters. What lessons has higher education learned from such a challenging time? What lessons should higher ed learn? And how can we get ready for whatever challenge comes next? We had a great conversation, with important lessons for both faculty and administrators. You can listen to the episode here, or search for "Intentional Teaching" in your favorite podcast app.

Around the Web

This is the part of the newsletter where I link to things that I find interesting in the hopes that you do, too.

  • What's the plural of syllabus? - In honor of this week's feature, I present more discussion about the plural of syllabus than anyone asked for. Spoiler: There's good evidence for syllabuses as the correct plural.
  • Complex Instruction - In response to a query I posted on Twitter about the value of structured group work over "just turning students loose on a worksheet," David Coffey suggested I check out this episode of the podcast he co-hosts with John Golden, Teaching Like Ted Lasso. This episode features an interview with math educator Joy Osland about complex instruction, and it's full of great advice for promoting more equitable small-group interactions among our students.
  • The Perils of Lecture Class - Over in my student-facing newsletter, Intentional Learning, this week's issue deals with the perils of lecture class. Here's a quote to get you interested: "Learning is hard work. If it feels super easy, it's probably not actually changing your brain in useful ways."
  • Religion and Contemporary Issues - While I was at Hofstra yesterday, I learned that their religion department revamped their undergraduate major recently, retitling it "Religion and Contemporary Issues." It seems like a very clever approach to taking a traditional liberal arts major and retooling it to connect with other disciplines and set students up for career success.

Thanks for reading!

If you found this newsletter useful, please forward it to a colleague who might like it! If you have thoughts on any of the topics above, please reach out via email, Twitter, Mastodon, or LinkedIn.

Consider supporting Intentional Teaching through Patreon. For just $3 US per month, you get access to bonus podcast content, Patreon-only teaching resources (like a guide to facilitating class sessions on Zoom), an archive of past newsletters, and a community of intentional educators. Patreon supporters also get 20% off my book, Intentional Tech: Principles to Guide the Use of Educational Technology in College Teaching, when they order from West Virginia University Press.

