17 Comments
Christy:

I'd be interested to see a piece about how AI is affecting student/teacher relationships. Since the mid-October update, Turnitin has been flagging a high percentage of student work as AI generated. This is work you can watch students write in real time with no actual AI use. Students are now receiving automatic zeros or are only able to redo work for partial points. These are high-performing students: AP/honors students. It makes sense that they have a higher writing ability. However, it's also eroding the trust they have in their teachers when their original work is automatically given a zero or partial credit. Giving them the chance to redo the assignment is stressful, as it sets them up for the same AI flags, and it takes up their already precious time redoing something when they could be working on a new assignment, studying, or participating in an activity. Turnitin boasts a high success rate that seems overly inflated to land large accounts from districts and universities; however, several universities are already opting out.

Andrew Kordik:

Hi Christy! I really appreciate your comment. And I'm totally with you on it. I considered including a line about how unreliable AI checkers are. In my experience, they are often wrong and, as such, they cannot be the determining factor in accusing a kid of academic dishonesty.

Christy:

Now to convince a large school district who paid $$ for a license. 🥴 I see both sides, but this isn't it.

Andrew Kordik:

Totally. I'm open with students about the shortcomings of AI checkers. I tested one of our recommended checkers by pasting in articles I wrote before 2022 (going back to 2016), and most came back 70%+ AI. So we discuss things like that, but we also discuss how the checkers are often right. Increasingly, though, I'm moving toward handwritten assignments in class, which works because my school recently banned cell phones from classrooms. Still, our AP exams are online, so kids do need practice typing LEQs and DBQs. No perfect solution.

Jules:

This happened to my sister in college. She had proof that she did not use AI, but her professor wasn't even interested in seeing it. Her university had no policy to offer direction to her or the professor in that situation, so it was left completely to the professor's discretion. She was told to rewrite the whole paper for partial credit -- and the grade was worth a large percentage of her overall grade for the class. The experience completely eroded my sister's trust in the professor, and it even hurt her faith in the university since there were no protections for her. She just had to accept a major reduction in her grade without "due process," so to speak.

Carey Gregg:

I was coming here to say something similar. I'd love a follow-up from a university perspective. My son is a college junior. That means his school closed midway through the second semester of his freshman year of high school. He then more or less missed his sophomore year (he had online school, of course, but lost the social interaction, etc.). Then, a semester before graduation, ChatGPT was released on a wide scale. (Point being, higher learning has been difficult at every turn for this age group.)

Now students across his large, top-ranked university are experiencing exactly what you described: being accused of using AI to do their work, with no recourse or ability to defend themselves and no university policy to help define parameters. So far, it has not affected him, but there are posts in university parent pages at least weekly looking for help for students who are experiencing this exact scenario. As you mentioned, it's harming trust between professors and students (in both directions), and between students (and paying parents!) and university systems.

I work in data security and governance, where AI is a HOT topic right now because enterprises are training their LLMs on data with no real understanding of what data they own, who has access to it, or where it resides, yet they are liable if protected data gets out. Because of that background, I'm able to offer my son *some* guidance on protecting himself. But when policies vary from professor to professor, and seemingly occasionally vary according to daily mood (subconsciously, I'm sure--professors are only human!), it seems a matter not of *if* this will happen to a student, but *when*.

Megan Pieper:

I do wonder if it will lead to more schools doing blue book essays, since they can ensure the kids are actually writing them when the work is done in class.

Eli Brock:

I’ve been in the classroom for 10 years and have taught at schools across the economic spectrum. During those years, I have taught in the humanities: literature, composition, and history. I hear your argument that AI (and LLMs) are here to stay whether we like it or not. I also hear your argument that AI will be a critical part of students’ future workplaces. However, I think the real skills that employers will be looking for are critical thinking and reasoning, the ability to sustain attention on a text, problem, or project, and the capacity to work with other humans…something that takes equal amounts of curiosity, vulnerability, and bravery.

In short, as students increasingly lose interest in doing schoolwork because they are overworked, stressed out, hungry, or you-can-fill-in-the-blank, they lose the ability to *read* texts of all kinds. When we forget how to read, we then forget how to write. Reading and writing require all kinds of problem solving and force us to practice empathy for a historical figure, a fictional character, or, when we are composing, the reader. This is the most important work that students are doing—working out how to be human. And in an age of the dehumanization of entire groups of people, artificial intelligence will rob students of opportunities to do some of that messy work before they hit adulthood. With respect, I suggest that kindness, decency, and the ability to tolerate process are not things that students will learn by incorporating AI into their homework. Instead, let’s push back against the tech bros who insist that their way is best and remember that we are humans and cannot be optimized for maximum output.

Andrew Kordik:

I totally agree. There's no doubt that critical thinking skills remain paramount. In fact, along with dozens of other soft skills like those you mentioned, critical thinking skills are arguably more important than ever. My view on AI is not that it's some educational panacea -- because it's not. I'd be happier without it, and I think most classrooms would be better without it. But as long as we don't educate kids in the uses and abuses of AI (while still offering an academic experience that centers reading, critical thinking, etc., which is what I hope is always emphasized in school), we allow a situation where our kids don't know how to protect themselves from it as individuals, or our society from it as voters. That, to me, is a world where the vices of AI run wild, trampling upon us, and one where we fail to mold it toward more democratic ends. But long story short, I appreciate your comment and couldn't agree more about the things that should always be at the center of education.

Gina S Meyer:

When things do not make sense, you always have to ask, “who benefits from this bad situation?”

If you’re wondering who benefits from uneducated students, here’s the answer:

“As citizens, these students could also be misled by a flood of AI-generated fake news and disinformation, which could hurt the trust our democracy depends on.”

Jessica Jaccar:

Thanks for this article! After reading it, I was inspired to reach out to my child's school to inquire about where they are in implementing a comprehensive AI curriculum. I was excited to hear that this is in the works, and they gave concrete examples for both teachers and students.

Megan Pieper:

I recently saw someone post about going to an open house for one of the big AI schools. The school boasted that kids there only did formal education two hours a day, and the rest of the time was used to work on passion projects. To them, this showed that kids could start working on entrepreneurship skills and lessons they were actually interested in. They also touted that their students were happier because they didn’t have to spend as much time on school subjects they weren’t interested in. In my opinion, this would leave a lot of kids uneducated on important topics, because ten hours a week is not adequate time to learn English, math, science, and history. Some of these kids might be perceived as ahead because they have great knowledge in one area, but at the same time they would be lacking education in others.

Erin:

I have read about this/these school(s). I even looked into a couple of them when my son was struggling at school. The marketing is tempting, but I also worry about the lack of community, of human interaction (the social side is a huge part of school), and of being taught different subjects by different teachers.

Emily:

All I can say is that kids should not have access to these tools. They don't understand how doing the hard work of writing papers/etc. benefits their brains and teaches them critical thinking skills. They don't have a well-developed ability to look ahead and think long-term, and AI gives them the chance to shortcut the very hard parts of life that will set them up for success down the road. They don't understand the very dire consequences of any of this, which is why it is incumbent upon the adults in this country and in the world to enact policies that heavily regulate the usage of AI. Period.

Andrew Kordik:

Yep. I've become a big fan of the idea of digital watermarking. As a classroom teacher, it's probably the regulation I'd most appreciate. It could protect us from a flood of AI-generated papers, and protect kids from our false accusations.

Erin:

I agree with you AND I also think that if we don't allow the youth to use AI tools (with a heavy focus on training and education which has to start with the educators themselves), they will be missing out on a part of our world that is here whether we like it or not. To me, part of guiding our youth includes teaching lessons on a myriad of things we wish and hope our children don't have to experience (how to be responsible on social media, the responsibility of learning to drive and the dangers of driving while intoxicated, the reality of racism in our country, etc.). I understand your point of view; I don't think we can keep AI out of schools/education or our youth's experience.

Erin:

I joined my kids' tech advisory committee, and AI has become our sole focus. I work in tech (I manage our department), and for both my organization and the district's committee, coming up with an effective policy around AI has been one of the most daunting tasks I have been a part of in a very long time. I think the reason is the unknowns, the lack of training, and fear -- we fear the unknown, which is so much of what AI is. This article was insightful and made me realize that there are so many people facing the same challenges around AI policy, training, and use.
