My Current Approach to Using Artificial Intelligence

As I get ready to teach my first day of two separate classes this semester, both of which center on Artificial Intelligence, I thought it would be useful (to me, at least) to take a minute to talk about my current approach to AI in general. This is something I think about fairly regularly and it’s something that has steadily changed over time as I get more and more used to incorporating the technology into my workflow.

I find generative AI extremely useful, and it shaves literally hours of work off my to-do list. However, it has to be used carefully, or you can really mess up a project without even realizing it. When I was asked to teach an ENG 100 class, I had about a week to prepare. Yes, I had technically taught ENG 100 once before, but it had been paired with a second class, and I was just filling in for someone who had to step away at the last minute. I didn't design the class, and I generally just followed along on rails for the entire semester. If I was going to teach it again, I really wanted to do it my way.

Of course, preparing a class in a week isn't exactly ideal, and I worried I wouldn't be able to manage it. That's where AI came in. (For reference, I used the paid version of ChatGPT for this project.) I started off by simply having a conversation with it. What types of papers do most people assign in freshman composition courses? What are some different approaches I could take to the class? Were its responses always accurate? It didn't matter: I was using it for a pool of ideas, and there weren't really any "right" answers. It acted much like a fairly well-informed friend might act in the same situation.

Through that conversation, I narrowed down my topic and the framework I would use to teach. (Batman and using AI to write.) But that still was miles away from where I needed to be. I had to pick papers. I had to figure out when they’d be due. I had to come up with a workload that would be sustainable for me. And then I’d have to write all of that into a syllabus.

Once again, AI stepped in to give me shortcuts on almost all of those steps. Instead of having to struggle alone, beating my head against a wall to come up with ideas, I could skim over the thoughts it generated, looking for the ones that made the most sense. When I needed to write a section of my syllabus, I asked it to tackle the first draft, and then I edited as needed. (AI is a fantastic writer for content that doesn't really matter that much: stuff where you need to convey basic facts quickly and efficiently, and you know exactly what you want those facts to be.)

After I figured out the papers I wanted to have my students write, AI helped me come up with a schedule for the entire semester, presenting me with options for what I could teach when. I knew I wanted to have us watch something Batman-related every period for around 20 minutes, and it gave me suggestions for clips that would work for each topic. Sometimes these suggestions were just . . . wrong. (It helpfully suggested we watch the first hour of Batman Begins today.) But they were all a great starting point, and I could quickly see whether they would work. If they wouldn't, AI was ready to give me alternatives.

I got everything done with time to spare, and it wasn’t nearly as stressful as it might have been.

So why shouldn’t we just use AI all the time for everything? My biggest caveat is that we need to be careful not to offload the wrong things to it. Grunt work is fine, but the actual thinking? I want that to be mine. In a composition class, my goal is to teach students how to take their thoughts and get them down in a form that lets other people see those same thoughts clearly. AI can definitely help with this, especially for weak writers. But the students still need to be coming up with the thoughts in order for actual learning to happen.

It’s easy to see an assignment (“Write a 5-page paper about a linguistics topic of your choice”), realize you can copy and paste it into ChatGPT, and watch it helpfully write something for you. You can then copy and paste that into your final paper and be done, right? But what have you done? What learning occurred? It would be like me sitting down to learn to play the piano, only to discover that it’s an electric keyboard preprogrammed with 1,000 songs. Can I claim I can now play the piano, just because I know how to hit the play button?

Nope.

The same is true for writing. It’s easy to fall into the trap of thinking that the only important part of a paper is submitting it. But that’s not why papers are assigned. If students can understand that, then hopefully they won’t be as eager to turn to generative AI for all the “answers.”

For this class, I’m going to expect much more from my students in terms of writing quality. Standard generative AI responses aren’t B or even C quality. They’re D or F. The point will be to learn how to use the tool to get more out of it.

We’ll see how it goes.
