Prior to (or instead of) using ChatGPT with your students

I have been thinking, reading, and writing a lot about OpenAI’s ChatGPT product over the last month. I’ve been writing from the perspective of instructional design/faculty development/edtech mostly in higher education, though I did dive into a bit of K-12 (which is totally out of my element).

I understand the allure of the tool and the temptation to have students use it. It is new and shiny and everyone is talking about it. It is also scary, and sometimes we can assuage our fears by taking them on directly. 

But I suggested across two other posts that educators might not want to have students work directly with ChatGPT by signing up for a free OpenAI account, for the following reasons:

  • Student data acquisition by OpenAI
    • Anytime you use a tool that requires an account, the company has an identifier it can use to tie your activity on the site to your identity
    • You need to provide personally identifiable information like an email, phone number, or Google account to create your OpenAI account
    • Their terms are quite clear about collecting and using data themselves as well as sharing or selling it to third parties
  • Labor issues
    • Using ChatGPT provides free labor to OpenAI for product development. They are clear about this in their terms and on their FAQ page.
    • I don’t want to go down the “robots are coming for our jobs” path, but many people (including the people building these tools) do envision AI having major impacts on the job market. Is it okay to ask students to help train the very thing that might take opportunities from them? It could be creating opportunities too, but shouldn’t they understand that?
    • And I didn’t mention this in the other posts, but the AI industry has horrible labor practices, exploiting global workers who train these systems. Do we want students to be part of that? Shouldn’t they at least know?
  • ChatGPT is not a stable release; it could change or go away at any point. It is estimated to cost $3 million USD every month to keep running. What happens to your assignments if it is down or gone?
    • ChatGPT has been released as a “Research Preview” and no one really knows what that means
      • It might be similar to a “Public Beta” or a “Developer’s Beta,” but both of those come with an assumption of a public release, which we do not have with ChatGPT
    • It is often down or slow because of the large number of users
    • Features are changing all the time (for instance, chat histories have disappeared and reappeared a few times already)


After suggesting this I got a good bit of pushback. “But AI is such a big deal, Autumm, and it is going to change the world, and students need to be prepared … and… digital literacy and… and… and…”

I hear you, my well-intentioned pedagogue. And yet I still have these concerns. So, here are some ideas for things you may want to do with your students prior to having them directly use the ChatGPT product with a free OpenAI account – and (I’m kind of hoping) maybe you want to have them do these things instead of using ChatGPT with a free OpenAI account.

Socially annotate OpenAI’s privacy policy and terms of service

Wouldn’t it be great if students better understood what they were getting themselves into by creating that account with OpenAI? A social annotation activity on OpenAI’s privacy policy and terms of service (TOS), using a tool like Hypothesis, can start building this understanding. I’ve done this several times out on the open web with various collaborators. TOS and privacy policies are dense technical and legal reading, so working through them as a group with inline comments really helps. If you can invite a guest annotator who has a background in law or policy, great; if not, consider assigning a reading before the annotation about what to look for in a privacy policy and how to read a TOS.

*Note – This one can be somewhat problematic if your school does not provide a social annotation tool, since students would likely need to create an account with a social annotation provider that has no agreement with your school, which could be the same problem you are trying to avoid. I do feel better about Hypothesis because they are a non-profit, but you could also get around this by copying the terms/policy into your school-supported cloud word processor (Google Docs, MS365, etc.) and just using the commenting feature.

Play the Data, Privacy, and Identity game with your students

Instead of “playing” with ChatGPT (cough, not a toy, cough) in your class, you could play the Data, Privacy, and Identity game developed by Jeannie Crowley, Ed Saber, and Kenny Graves. The game was first developed as an in-person activity; read Jeannie’s blog post for an overview, then check out the resource page where you can read instructions and print off cards. Looking for an online version? Since the team published this with a CC 4.0 license, I adapted it into an online version on a simple WordPress site using H5P that requires no login and collects no data.

Discuss big issues around AI like labor and climate

Have a discussion with students about big issues with AI that are likely to affect them. A good overview of the issues with large language models can be found in Bender, Gebru, McMillan-Major, and Shmitchell’s 2021 paper On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? A discussion of this paper will set you up to dive deeper into the issues.

Impacts of artificial intelligence on labor speak directly to the world of work that students will graduate into. This report from the US-EU Trade and Technology Council about the impact on future workforces can be a starting point. You may want to break it into sections and keep in mind that it is US/EU centric. Follow up with (or start with, depending on your context) a more global perspective. You could check out MIT Technology Review’s whole series of articles on AI Colonialism or the recent reporting from Time about OpenAI paying workers in Kenya less than $2 a day for grueling work training the model (you will need a content warning for SA and will have to figure out how to get around the paywall for the Time exclusive, but other great articles about this reporting exist, like this one from Chloe Xiang at Motherboard).

Large language models like ChatGPT take a lot of computing power to run, and all of that electricity has a carbon footprint that we are still trying to figure out how to measure. Discussing this with students helps them understand these potential impacts. Maybe start with a discussion around this MIT Tech Review article on how Hugging Face is attempting to better measure things.

Conduct a technoethical audit 

If you don’t know about all the resources on the Civics of Technology site, you are in for a treat. Here I’m specifically going to recommend their EdTech Audit resources, but the site has a great larger curriculum with all kinds of materials. I’m not sure that ChatGPT is really “EdTech,” but if you are thinking of having students use it, then you are using it as EdTech. I think the questions, handouts, and examples provided here will serve you in getting your students to analyze some of the implications from the articles and activities listed above.

Analyze your data collected from other social media platforms

Check out HestiaLabs’ Digipower Academy. They have several tools, which run in the browser and collect no data, that let you examine and better understand how social media platforms use your data for targeted advertising. It does require that you request a data export from the various platforms, but they have instructions for how to do that for each one. After the tools analyze your data, they provide you with dashboards and metrics to help you better understand why you are being targeted the way that you are (because we are all being targeted in some way). Don’t feel comfortable having students download their own data (can they really secure it)? They have sample data you can run too.

Work Through The People’s Guide to AI

What even is an “algorithm”? What is the difference between AI and Machine Learning? The People’s Guide to AI is a workbook that helps you answer these questions. It is filled with relatable descriptions, activities, prompts, and so much more! You could spend the whole term working through this thing! Written by Mimi Onuoha and Diana Nucera a.k.a. Mother Cyborg, with design and illustration by And Also Too. Licensed CC-NC-SA 4.0, this workbook is also available in print for the affordable price of just $7 USD – and you will want to write in it, so paper copies are not a bad idea.

Learning objectives

These are just some of my ideas for activities and assignments. You can come up with your own, but perhaps consider the following learning objectives (or something like them) to guide you.

Prior to creating an account with OpenAI students will:

  • Discuss the value that their personal data holds for various actors (themselves, friends/family, schools, corporations, governments)
  • Demonstrate an understanding of typical tech product cycles and compare them to non-typical ones
  • Compare how power is held by various actors (themselves, friends/family, schools, corporations, governments)
  • Analyze workforce implications of AI at home and globally  
  • Create a personal data security plan 


These are just some ideas, and I’m sure they are flawed in various ways, I’m sure they won’t work for every course, and I’m sure some folks are already doing something similar or even better. But the message I’m trying to send here is: think about the larger picture around AI, and have students think about it, before you have them sign up and start “playing” with something they don’t understand.

~~~~

Image by Kevin from Pixabay

  • *This post is especially messy as I accidentally hit publish while drafting late at night. The section Analyze your data collected from other social media platforms was added the next morning. And then later the next evening I added the Work Through the People’s Guide to AI section. I just keep thinking of things to add!
  • ** No ChatGPT was used in composing this post

7 thoughts on “Prior to (or instead of) using ChatGPT with your students”

  1. Thank you for such a deep (in resources) and wide (in perspectives) post! My colleagues and I will be returning to this often.

    I would like to add the possibility of using OpenAI’s API as a way to avoid students logging in themselves. We’re experimenting with this on Youth Voices, using a WordPress plugin, AI Mojo. Questions remain about how we will pay for this service. The templates inside the plugin allow teachers to pretty easily train GPT-3 (and 3.5 once released) to give students results that fit our pedagogical goals, and it’s easy to share and replicate these prompt templates with each other and with our students.
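    For anyone curious what that route looks like under the hood, here is a minimal sketch of the kind of server-side call a plugin like this makes on our behalf (illustrative only, not our exact setup; it assumes the pre-1.0 openai Python package, and the model name, prompt template, and function are placeholders):

        # Sketch: the instructor's API key stays on the server,
        # so students never create OpenAI accounts themselves.
        import os
        import openai

        openai.api_key = os.environ["OPENAI_API_KEY"]  # kept out of student hands

        PROMPT_TEMPLATE = (
            "You are a writing coach. Ask three questions that would help a "
            "student revise the following draft, without rewriting it:\n\n{draft}"
        )

        def coach_feedback(student_draft: str) -> str:
            """Send a templated prompt to the API and return the model's text."""
            response = openai.Completion.create(
                model="text-davinci-003",  # GPT-3 era model, illustrative
                prompt=PROMPT_TEMPLATE.format(draft=student_draft),
                max_tokens=256,
                temperature=0.7,
            )
            return response["choices"][0]["text"].strip()

    The pedagogical upside is that the prompt template, not the student, decides what the model is asked to do, and students never hand OpenAI an account of their own.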

    1. Thanks for the comment, Paul. I’m not as familiar (and I don’t think most instructors are) with the paid API product as I am with the free ChatGPT product. But I guess my question/concern here is that much has been made about how the ChatGPT product has special training (now coming to light that it was done on the backs of low-paid Kenyan workers, but I digress) that keeps it from being as biased and as incorrect as the underlying model. So, it would seem to me that the training that you speak of would be really, really important if someone were to go this route. Yes?

  2. Not to disagree but… for the email needed to access it, maybe look up “burner email account”

    And it’s not about the results, but if you are using a search tool that rhymes with “scroogle,” your privacy might be more at risk… a good read on this (and in comic form!) is https://contrachrome.com. Our actions there are always training a system. Do we balk at that?

    The suggested steps before using ChatGPT are really good, but aren’t they the approach we ought to be taking across the board?

    I fully suspect the aims of OpenAI are anything but benevolent: manipulative, self/profit-serving, and likely in line with many others.

    1. Hi Alan, thanks for coming by and for your comment. There is a lot going on with your post and I’m not sure why you decided to frame it as a disagreement – it sounds like we are agreeing on a lot. I mentioned burner emails in another post where mitigation of data harm was the topic, but this one is specifically grappling with what you could do if you didn’t have an account at all. You are right that everyone could likely have better data practices across the board. Hey, I have a great idea – how about we start with a new company most people have not been involved with yet, instead of making the same mistakes over and over again?

  3. Pingback: eLearn @ UCalgary
