Episode #40
AI, the Classroom, and the Future with Kathy Aurigemma
September 28, 2023
About This Episode
Meet Kathy Aurigemma, MEd., co-founder of SWTILO and K-6 Media Specialist & Technology Teacher at Cincinnati Public Schools. Witness her profound expertise in Instructional Technology Integration as she talks Kara and Caryn through how she got started, what she’s working on now, and her visions and views of what Artificial Intelligence means for education (and life in general) in the years to come.
If you’d like to learn more about her work with SWTILO, head on over to their website: https://sites.google.com/view/swtilo
Introducing Kathy Aurigemma, MEd., a trailblazing K-6 Media Specialist & Technology Teacher at Cincinnati Public Schools. With a profound expertise in Instructional Technology Integration, Kathy’s commitment to education knows no bounds. She ignites the passion for learning in her students, guiding them to discover their voices and empowering them to advocate for themselves and their communities. Join us as we delve into Kathy’s journey as an innovative educator, where technology, dedication, and mentorship converge to shape the future leaders of tomorrow.
[00:00:15] Kara: Joining us today is Kathy Aurigemma, and she is a K through six technology teacher at Cincinnati Public Schools. But I know Kathy from, well, actually Caryn knows Kathy too, because of INFOhio and the Southwest TILO (SWTILO) group in Ohio. So, hey, Kathy, how are you?
[00:00:39] Kathy: I’m great. Thank you so much for inviting me onto the show today.
[00:00:42] I’m excited to talk about things I’m passionate about. I love this podcast and I am just thrilled that you invited me.
[00:00:50] Kara: Oh, yeah. Well, and we’re excited to talk to you because you are a great technology innovator and you have big ideas and [00:01:00] thoughts and it’s just always wonderful to chat. So I’m going to start by asking you to tell us and our listeners how you ended up in education.
[00:01:14] Kathy: So it’s one of those stories where you just know you’re meant to be where you’re supposed to be. I hope you’ve got a minute, because it’s going to take a while. I was living in Vermont, and I had gone back to school for alternative licensure in school library media, to be a school librarian, and I did that alternative licensure through the University of Vermont.
[00:01:34] My goal was simply to be the librarian in my kids’ school. And that’s a bit of a big deal.
[00:01:43] That’s not the easiest thing to pull off. But indeed, I was set to be the librarian in my kids’ school. This was 2006. At that time, my husband was working for Bombardier Capital, and he was doing a wind [00:02:00] down. We knew we were going to have to leave Vermont, but we didn’t know when.
[00:02:04] So sure enough, it turned out we’d have to be gone by July, when I was going to start my next job in August. So I end up in Ohio. I still had the same goal: I want to be the librarian in my kids’ school. I come here, though, and there is no alternative licensure program. I call over to Wright State, and they’re like, you don’t have a bachelor’s in education.
[00:02:29] There’s no way you’re going to get a school library endorsement in Ohio. And I thought, well, why would I go back? I have 24 graduate credit hours and I have a bachelor’s. Why would I ever do that? So what I did was I applied to Kent State’s library school, and I thought, you know, if I can’t get them in school, I’ll get them after school.
[00:02:47] But it just didn’t feel right. It didn’t feel right. So I made one last desperate call to Dr. Susan Bird, who was at Wright State. (Kara: That’s who was there when I was there.) That’s exactly [00:03:00] right. And I left her a tearful message. I was so choked up that she had to replay the end of it over and over to get my phone number.
[00:03:06] And I said, I think I’m about to make a huge mistake. I know I’m meant to be a school librarian, but if you tell me it’s not possible, I will put that dream aside and turn fully to what I’m going to do. And by the time I left my number again, I was choked up. So she calls me back and she says, Kathy, I’ve got good news. We’re gonna start an alternative licensure program.
[00:03:25] You’re gonna be the first person in it, and you’re gonna be the first person out of it. And so in 2009 I finished my program, I passed my Praxis by two points, and I was the school librarian in my kids’ school. And my son went to school with me for grades five and six at Berry Intermediate.
[00:03:46] So it was meant to be. I am absolutely meant to be doing what I do. And so that’s how I ended up in education. But more than anything else, I’m here to help people [00:04:00] keep it real. Technology has always been my thing. I’ve always been a librarian and a technologist, always one in the other. To me, they’re the same.
[00:04:10] So yes, agreed. Even though I’m technically a technology teacher here, I still am in the library as well.
[00:04:17] Kara: Well, and I love that you just pursued it. You know, you didn’t get the answer that you wanted the first time,
[00:04:25] so you tried, and tried again, to figure out if there was a way.
[00:04:30] Kathy: Exactly right, because you knew you wanted to do it. And the segue, too, is intelligence. I value intelligence highly, but the older I get, the more I value wisdom. I want both, and we’re going to talk about artificial intelligence here in a little bit, but wisdom
[00:04:47] is what we all need, right? Wisdom. So I think those two can be married, but we can never dismiss the value of wisdom. Yes, we need intelligence. [00:05:00] Yes, we do. But at the end of the day, I think we need to rely on wisdom as much as anything. So that’s what I did: I collected the wisdom of my community. And here’s the thing that was nearly impossible to pull off.
[00:05:13] Think about trying to get a job in a school where they’re trying to get rid of that job. How many dominoes had to fall into place for me to walk into that job? And I didn’t just walk into it. Do you see what I’m saying? I started going for that job in 2006, when I started volunteering in the library.
[00:05:30] Kara: You laid the groundwork there.
[00:05:33] Kathy: The odds were against me, but guess what? Because I’ve pulled off the impossible in the past, I don’t need to do that anymore. I don’t need to do that anymore. Right now, what I do gotta do is show up. And, you know, at this stage of where we’re at, they used to say, just close your door and teach. Mm-hmm. We’re not allowed that anymore, but I don’t think we have a choice either.
[00:05:57] Do you know something? If what they’re [00:06:00] asking of us is so unreasonable and so undoable, I can’t feel bad about not being able to do it anymore. I can’t be mad about an impossible job. Let me get this straight: you want me to differentiate instruction for kids who are going to be taking a standardized test?
[00:06:17] You know? People don’t get it. You’re never going to change minds anymore. I have to tell my children: I can’t get you to change your mind until I get you to open your mind. I can’t open your mind until I blow your mind. (Kara: I love that.) Oh, yeah. You’re going to hear more about that here in a second.
[00:06:37] Yeah. It is something I’m very passionate about. So thanks for putting up with me. I try not to talk so circularly. (Kara: I like it.) I’m known to.
[00:06:48] Kara: So no, it’s great. It’s great. And leading in with change and all the disruptions and the expectations and et cetera, et [00:07:00] cetera, et cetera. Why don’t we talk about AI?
[00:07:05] Kathy: For real. So I am a futurist and a disruptor, a change agent. You know what I’m talking about; that’s the stuff we were brought up on in school, right? It’s one thing to see it and say it. It’s another thing to do it. Because nobody likes change, right? To change your mind is to break yourself down.
[00:07:24] We know about the neuroscience of learning, right? In order for you to take in something new and synthesize it, we have to break down what you know already. You know, you’re rocking addition. Wow, look at me, dude. And then we’re going to throw multiplication at you.
[00:07:40] What? Wow. You have to trust me enough to know that I’m going to break you down, but I’m going to help you build yourself back up, right? And be the master of multiplication. You’re the master of addition. Don’t you worry, you’re going to be the master of multiplication. Trust me. I know you can do this, right?
[00:07:59] So, [00:08:00] when we’re asking, in this polarized society of ours, politically especially, for people to change their minds, nobody will anymore. And that’s why we’ve got the trigger warnings. That’s why we’ve got all of this. People are so scared, literally for their lives, that they refuse to change. They don’t want to be disrupted, right?
[00:08:20] It’s nothing but stress for them. They’ll do everything they can not to change their mind. Having said that: thinking all alike, listening to just one idea, always keeping to the line and following the status quo is what got us here. So we have systemic problems, the likes of which we’ve never seen before.
[00:08:45] Thinking the same way is never going to get us out of a problem that was created by everybody thinking the same way. So we need every neurodivergent mind, perspective, and [00:09:00] resource that we can possibly get our hands on. And guess what? Our fate literally does ride on the ability to do that. Okay. So when I’m thinking about AI, artificial intelligence, I’m thinking about
[00:09:15] enhanced intelligence. I’m thinking about the aggregate. I’m thinking about the collective intelligence. Because while we’ve been just chilling at home during the pandemic, while we’ve been asleep, in a sense, computers are never sleeping. And they have been digitizing the entire body of knowledge ever produced. And where is all this?
[00:09:43] We say these are electronic ones and zeros, but they exist. Believe me, they’re off and on. There’s a magnet in your computer that holds those. There are magnets in massive computer farms overseas. Somewhere there is a physical device that holds this [00:10:00] piece of information that we’re just transmitting right now.
[00:10:02] This isn’t some nebulous thing. This is a real thing. The data exists. It’s real. And the people who own the data own the world, right? And that is when human learning will be outstripped by machine learning.
[00:10:19] I would argue it’s going to be even faster than that. So, what do we do in the meantime? We better get on board. We better make it work for us. Because if it’s teaching itself, it’s going its own way. We have to be teaching it about us. We have to make it work for us. And so my whole theme this year, as it has always been, is: work smarter, not harder.
[00:10:44] AI is the answer for that. AI is absolutely the answer for that. And I’m going to use it, with kids and with myself, to level the playing field. I’m going to use it to distill what’s most important. I want to use it to elevate [00:11:00] conversation. I want to use it to streamline the BS that’s getting asked of me all the time.
[00:11:07] The stuff that has nothing to do with teaching. You want me to do that? Okay, I’m going to ask Google to help me with that. And so generative AI started with ChatGPT. ChatGPT is like the front-runner in the race. And now, because Microsoft and Google already all had these, they’ve had them under wraps.
[00:11:30] And so now they’re just taking them out and going for a spin with them. And they are astonishing. They are astonishing. We’re going to talk about this after, too, about getting your lights on. But so now I’m always going to ask first. Remember I said I was going to ask Susan Bird? I’m going to ask AI every single time now. Before I do anything, I’m going to ask it.
[00:11:57] And it’s going to come back with things I never even thought of. And [00:12:00] maybe it got it wrong, but that’s okay, because it’s actually a conversation. And I can say, hey, you got it wrong. What I really wanted to know was this, or let me clarify, this is what I’m looking for. And every time we have a conversation in which we come up with brilliance,
[00:12:16] that’s recorded as well and becomes another source. One thing, so there are some resources that I want to point you guys to, but definitely the Cult of Pedagogy. Yeah. That is right. Yep. Great episode about using ChatGPT to generate examples. Mhm. So kids learn best by seeing what something is not.
[00:12:41] Do you see what I’m saying? So one of those aspects of teaching kids is this: you want them to be able to identify something. But what helps them to identify it is seeing what it’s not, right? That’s what helps them understand what it is; the more they see what it’s not, the better. And I’m going to give a better example of this.
[00:12:59] Okay. [00:13:00] So we’re worried about plagiarism, right? But when you get into the English language arts classroom, and you’re trying to get kids to understand author tone and style, say you’re talking about Poe and his poetry. What are the hallmarks of his poetry? That’s one of the things you want to know, right?
[00:13:19] What is the tone, right? Now, you can ask kids, and what are they going to do? They’re going to go to school or whatever, right? They don’t give a care. But what if you asked them to tell AI to write a poem in the style of Edgar Allan Poe? What are the hallmarks? What are the hallmarks? How will we know that it truly is?
[00:13:42] Well, first of all, you’ve got to look up and know what other Edgar Allan Poe poems there are. You’ve got to look up and analyze what words he uses. Like, he uses the word... I’m going to blank on it right now, but basically, you know, you think about Lenore and nepenthe and all of [00:14:00] that. It’s about loss.
[00:14:02] And it’s about that kind of stuff. So you can say, hey, generate this and see what you come up with. Right? So: make a poem about a woman who has died. There’s a fella who’s a writer and a chemist or whatever, and he’s lost his girl, and now he’s looking everywhere in his haunted house for her. Right? So basically you’re saying,
[00:14:23] what’s up with the story that you just read? Tell me, what are the inputs here? So you’re asking them not only to identify those elements, you’re asking them to import them, to analyze them, right? And then when you get it back: wait a second, that doesn’t sound like Poe at all, does it?
[00:14:39] No. Okay, let’s try again. What do we have to go back and change so that it does become this Poe thing? And I’m telling you, it happens just like this. And there are professors who are saying, do your first draft in AI, and then we’ll talk. So you still have to put in what your thesis statement is. You still have to put in what your evidence is, [00:15:00]
[00:15:00] what your sources are. But have it generate your outline. Let’s get started now. Because I can tell you, it elevates the conversation. It’ll give you stuff you never even thought about. Oh yeah. And so when you’re taking a look at the results, again, it’s trained to just answer your question no matter what.
[00:15:23] But sometimes you’re like, hey, I didn’t even think about that. So if you listen to Cult of Pedagogy, it talks about making these iterative examples and saying, all right, I want to see an illustration of these math concepts, right? And they talk a little bit about not just what it looks like to do the calculation, but what it looks like when it’s applied.
[00:15:48] And then they’re starting to talk, too, about American civics. What is this constitutional right? What is the U.S. Second Amendment, right? That kind of thing. But then it’s like, what does it look [00:16:00] like when an American is exercising their right under the Second Amendment? I mean, that’s a whole other thing.
[00:16:07] Give me an example of what it looks like when an American is exercising their rights according to the Second Amendment. You know, that level of conversation. I was wanting to do a Pride event, and I wanted to know what Pride was, one aspect of Pride, right? So I asked it: what is Pride?
[00:16:29] Question mark, right? I want it in 250 words. It literally gave me a platform. It gave me a manifesto. And I thought, oh, look at that. I hadn’t thought about that. I didn’t think about this empowerment. I didn’t think about bringing marginalized populations together. What, what, what? I had seven other things I hadn’t even thought of.
[00:16:48] Okay, but 250 words is too many. Give it to me in 75 words. And it did. Give it to me in one sentence, and it did. And when you think about kids, you [00:17:00] know Newsela, right? Newsela. Yep. That’s what they do. But guess what? Kids can do that for themselves now. Kids can say, I need to learn about this concept.
[00:17:11] What you just showed me is way, way too high for me. Can you explain it at a third grade level? Yeah. Can you explain it at a kindergarten level? Because it will. It can explain a concept to a postdoc all the way down to a pre-K student, like instantly. Yeah. Instantly. Now, do you still have to go check it? Yes, you do.
[00:17:34] Mm hmm. Do you still have to ask: is this true? Could this be true? Those critical thinking questions are even more important than they were before. And if all I can get kids to do is think... They say, well, I can’t read, Ms. A. I know, nobody’s reading anymore. I can’t write, Ms. A. Yeah, I know. I’ve seen your papers. But guess what?
[00:17:57] That’s not a pass. I’m sorry. [00:18:00] Yeah, you still have to think. And so everybody’s using voice-to-text, which, by the way, feeds the AI every second. Everybody’s getting their stuff on YouTube, TikTok, whatever it is. So now I’m about to ask you a question that you can’t answer with an app, right? What I need to do is ask those questions that are not just rote.
[00:18:23] The assumption is you can find the information. The assumption is you will find the information. As a matter of fact, you’ll find the information, bring it to us, but then I need you to synthesize and apply. Because guess what? Our lives depend on it. I think our lives literally depend on it.
[00:18:52] Kara: And the synthesizing part, I think, is such a critical skill.
[00:18:59] Kathy: And they [00:19:00] fight you tooth and nail for it. But again, also, when we think about the ability to differentiate and the ability to speak to their curiosity, to spark their curiosity: you know, the three of us could put the same question, the same query, to ChatGPT, and we would get three different answers.
[00:19:20] Because it’s based on our data, right? So, personalizing it, the ability to have them look up something that they’re passionate about. I mean, you know you’re on the right track when you can’t get them to shut up. Yeah. Right.
[00:19:33] And we know we can make that happen. We know we can make that happen, but they don’t care about a question they haven’t asked for themselves, a book they haven’t chosen for themselves. We know how to get them to ask the questions, right? We know what a good question is. We have to get them to do the work
[00:19:49] of asking the questions. We know that. But think about the depth of knowledge between "identify this" and "show me an example of this," right? [00:20:00] Like somebody exercising that right. Yeah. Right. Yeah. On that Jennifer Gonzalez Cult of Pedagogy episode, they talked to a neurotechnologist. That’s me, I know. I said, that’s my next gig right there. And that’s where we’re at. It’s astonishing. In this day and age, we’re talking about artificial intelligence, yet we know so little about the human brain.
[00:20:24] Yeah, but now here’s my thing. When I was in Vermont getting my licensure, somebody was working at the National Institutes of Health on mapping the human genome. And they did it. It took a few years, but they did it. Look at all that has happened since then, just how medicine has been transformed by it, right?
[00:20:48] Yeah. They will map the human mind. And what will that create? What will that create? Because we think about consciousness: can they create [00:21:00] consciousness? Do we profile? Can we be looking for the traits that will make my kid more successful than your kid? You’re talking about eugenics. Yeah. Okay. So I’m going to pause, but I want you to think about this, too:
[00:21:18] the technology already exists. It’s already here. We’ve got CRISPR. It’s already here. What AI is going to do is show us the potential for it. What AI is going to show us is how far it will go. Again, we’ve got to be out front now. We’ve got to be thinking about this stuff now. And until I can blow your mind, until I can get you to even consider that what I’m talking about is real... because what do you guys do?
[00:21:46] You guys trust me. You know me. Yeah. Right. You’ve known me long enough that you can trust me. But yeah, Kathy Aurigemma just saying... most people ain’t got time for that. And I’m like, [00:22:00] you might want to make time. Yeah. But again, I’m going to say: just because I know how bad it could be doesn’t mean I stopped.
[00:22:12] As a matter of fact, I doubled down. I jumped in. And this gets me excited now, because this thing I’ve been seeing growing, this thing I’ve known about and wanted to talk about for so long: yeah, it’s here, and it’s even bigger than I thought. And it’s even faster than I thought. And another point I was going to make is that no technology was ever used only as it was designed.
[00:22:34] Hmm. Oh. Nobody ever says, I’m going to build this amazing thing that can help solve genetic problems for kids who have a congenital defect. No, no, we’re going to make designer babies out of that stuff. Right. So, yeah. I mean, that’s the other thing you’ve got to know, too: the person who was leading the CRISPR project, she stopped it.
[00:22:56] I don’t know the whole story, but from the ethics point of view, she [00:23:00] did. And you think about, too, Google’s ethicist up and quit. That was a couple of years ago. I don’t know if you remember this. Well, the DeepMind project’s been going on, and she’s not there anymore.
[00:23:10] Kara: Okay, I’m curious, and it’s in the same lane: if you had to challenge, like, Caryn and me, or another teacher, when it came to AI, what kind of a challenge would you give us?
[00:23:28] Kathy: Okay, I can tell you right now. Okay. What I want you to do with AI, first of all, is put all your lesson plans up.
[00:23:37] Just do it. Just do it. You can plug your standard in now. Mmhmm. You can say who your kids are. Literally, this is how I’m doing my lesson plans this year. They want me to have lesson plans to the end of the year. Okay. And this is what cracks me up: don’t I have the same 6th and 7th grade? You know, 4, 5, 6, 7?
[00:23:56] I have those same kids you do. I have to teach those same standards that you [00:24:00] do. Why do I have to make all this up? Why am I trying to reinvent the wheel? Again, why am I having to differentiate everything when I’m going to give them a standardized test anyway? You don’t want to hear what I’m really going to do.
[00:24:12] You don’t want to hear what I really want to do; you want to hear what you can tell your superiors. Right. And I want to show the state that I’ve done this. Don’t get me started on this, you know, but literally, if what you want to do is make me do something so that you can check a box,
[00:24:37] I’m going to check a box to get that done, because what I need to do is design personalized learning experiences for the kids that are sitting in front of me right now. And otherwise I’m never going to know where they are. I can’t teach them unless I can reach them, and I can’t reach them until I can go where they are.
[00:24:58] And you never give me a [00:25:00] ding dong second to do so. So if that’s the case, if that’s the case, the stuff you’re asking me to do, the administrative stuff? No. Right up the line, right up the line. And I’m still going to need to design instruction on the fly. Or, I should say, how about this: facilitate experiences, facilitate experiences for the students that are in front of me.
[00:25:23] They’re dynamic. You talk about neuroscience? A teacher, nobody’s working harder, right? Behavior, interests, time, content. A thousand million things popping all the time, right? And then here comes the next crew, and the next one. And you’re going to do it.
[00:25:42] Kara: I was going to say, that’s a whole new bunch of all those things.
[00:25:45] Kathy: Yeah. Yeah. Teaching is so easy, right? No. And I’m not even going to try to crack that wall. I’m over that. I don’t care. Yeah. You think you know what’s going on? Great. Knock yourself out. Yeah, [00:26:00] I’m serious. Like, you know what? The paradigm... I don’t even bother telling people about it. I’m like, yeah, you’re right.
[00:26:05] You’re right. They all sit in rows. Yeah, they really behave. They pay attention all the time. They would never disrespect the teacher. You’re right. Yeah. But no, no, the reality is awful. What I would challenge you to do is automate, or I guess what I’d say is streamline, the administrative stuff that you don’t have to do.
[00:26:21] Right, right. We can streamline that stuff so that you can design interactive experiences for kids. And so when you say challenge, I would absolutely challenge them, and this is my challenge for myself: to create the videos, like, create videos, or, how about this, the tasks, right? The learning tasks are elastic.
[00:26:43] Do you know what I’m saying? So that instead of listening to me, blah, blah, blah, about the basics of the directions, you’re going to go here. This is your playlist. Does that make sense? Your choice board. These are the things we have to do, but here’s [00:27:00] what you need to turn in to me, and this is how you’re going to turn it in to me.
[00:27:06] So when I’m doing stuff now, I’m pushing it all out through Kami, actually, and it’ll make a copy for everybody. They simply go up here, and don’t get me started on the user interface, but there’s a button they can press, which is a floppy disk icon. Don’t get me started on that. They press it, and then big confetti goes off. Confetti goes off.
[00:27:25] All right. So once I show you that, you have in effect opened an assignment, checked off on the acceptable use policy, because that’s what I need you to do, and submitted it to me, because the confetti went off. You can go on, while so-and-so can’t even find the confetti button. But, yeah, you see, that’s using AI to me: to say, hey,
[00:27:49] you have to go do this, you go do that. I want to streamline all that stuff so that I don’t have to. Yeah, because I can’t. It’s impossible to teach all these levels at the same time.
[00:27:59] Kara: That’s the [00:28:00] way it is. Thank you so much for joining us.
[00:28:04] Kathy: You’re welcome.
[00:28:05] Kara: Thank you.