We Not Me - Reed Hepler
===
[00:00:00]
Dr. Nikki Harding: Welcome to the We Not Me Inclusion and Action Podcast. I'm Nikki Harding. I am really excited for our guest today. His name is Reed Hepler, and he is joining us from Idaho. I ran into him on LinkedIn, and as I was looking through his posts, I was incredibly impressed with everything he was sharing about AI in education. I reached out and asked if I could have a conversation with him, and he [00:01:00] immediately replied that he would love to talk. I am really excited to pick your brain today, Reed. Could you tell us a little bit about yourself and what led you to your interest in AI?
Reed Hepler: My journey to AI is a little different than what most people might expect. A lot of people think, oh, I just came to it for my job, opened it up, and practiced. I initially was going to go into music in my undergrad, and then I changed my major five times, because music is not the most lucrative thing. I eventually went into history, and then finally into library science. I got my master's in library science, and in my first job, which I hold currently as the digital initiatives librarian and archivist at the College of Southern Idaho, I thought, I need to be training educators and students how to work with the library. I need to create good trainings, because the last trainings we had were about 20 years old, through no fault of the librarians. So I thought, I need to know instructional design, and I should get a master's in it. I [00:02:00] actually just barely got my master's in instructional design this May. But my first courses started in January 2023, two months after ChatGPT.
So ChatGPT was released in November 2022, and I thought, I'm never going to be one of those people. I'm not going to be a part of that. Look at me here, two and a half years later. It's basically a third or a fourth of my job. I'm coming at AI from an instructional design point of view, but really from a library and information science point of view, and a lot of my work is actually based on AI and information literacy. I just recently made a BoodleBox custom bot, and we can go into what that is later. It works with Mike Caulfield's SIFT framework to help people analyze claims they find on the internet using information literacy principles.
So that's kind of my background.
Dr. Nikki Harding: That was one of the first things I saw on your LinkedIn. I consider myself pretty good with technology, and I understand a lot about AI, but what is a BoodleBox, and what is this [00:03:00] SIFT Toolbox that I saw? I didn't understand what all of that was. I am fascinated by it, though.
Reed Hepler: BoodleBox is an AI platform. The providers changed their URL a bit ago, but I think it's boodlebox.ai now. They have APIs with ChatGPT, Claude, something like six or seven different models, and they provide all of these to educators and other users.
So we can use the paid versions, but you only contract with, and only work through, BoodleBox. BoodleBox is also completely FERPA compliant, so they have contracts with all types of states and institutions. That's what BoodleBox is. And sorry, what was the other half of your question?
Dr. Nikki Harding: It says "sense-making with the SIFT Toolbox" on BoodleBox. What is all of that?
Reed Hepler: The SIFT Toolbox is basically a custom [00:04:00] bot that was designed by Mike Caulfield, and it focuses on what's called the SIFT method of information literacy. The SIFT method is four steps that every researcher, educator, or student should take to evaluate the information resources they come across.
Dr. Nikki Harding: Yeah, we can put the links in the show notes for sure.
Reed Hepler: Here's Southern New Hampshire University's iteration of the SIFT method. The first step is to stop: take a step back, look at what you're reading, and let go of your preconceived notions about what a reliable source is. The second step is to investigate: who wrote this? Who are they affiliated with? What work have they done in this area? Are they qualified? What is the narrative of this organization or this person, and what bias might they have? The third step, the F, is find better coverage: find other resources that are talking about the same issue or the same event. What are they saying about it? Are they corroborating [00:05:00] what this claim is saying? And the T is trace all the claims to the original source. Mike Caulfield found that Claude was actually going to be really good at communicating all these ideas and helping people use an AI tool to practice information literacy.
So he went and made this custom bot; it's something like 2,000 lines of code. Here it is: this is Mike Caulfield's page that has his custom bot on it, for you to take.
Dr. Nikki Harding: Would you mind sending that to me in an email? I will put those in the show notes.
Reed Hepler: Yes.
Dr. Nikki Harding: Yeah, that would be great.
Reed Hepler: Yep.
Dr. Nikki Harding: So when you say investigate, how do people know anymore, when they are investigating, what is and is not real on the internet? I know deepfakes can look so real. Even when you're watching a video, how do you know what is real and what's made up? Especially when we're [00:06:00] looking at TikToks and social media video, how can you tell?
Reed Hepler: The first thing I tell people is, rather than looking for signs that things are fake, look at the information being communicated. Is that information accurate, and does it track with other things people are saying? If you focus on the data or the information being presented, rather than on whether it looks fake or looks real, or on the telltale signs of AI generation, you are much more likely to detect fake things. The other thing I would say, and I have to admit I'm not really one to be an AI detector or to look at AI videos; I mostly work with text, [00:07:00] sometimes with images. For images, a lot of people say lighting is a big thing. For text, as someone who does not write the way the typical person writes, and you have apparently read my posts, Dr. Harding, I would say that if it doesn't have a normal cadence, that is probably a sign it's AI generated. But I have been told by an AI detector that my own writing is AI.
So that's not reliable either. Again, focus on the data, focus on the information, rather than looking for signs that something has been AI generated. In fact, I wrote a post about this, and I'll link to it too, about AI detection and information literacy. I think it was the one about AI and elections, but the point was a comparison made in the 1950s, by a number of Christian denominations, about how you know when something is fake or real: when people look for counterfeit money, they don't study the fake money.
They study the real money, [00:08:00] and then the fakes stand out, because they're looking for the real stuff. I said, that's a bit simplistic, but that's basically what I do: I know what reliable things are going to have, and then it jumps out at you if there are differences or omissions in how things look and how things are reported.
Dr. Nikki Harding: So we need to stick with the tried and true practice of triangulating data, right? Like, if this is true here...
Reed Hepler: Yes.
Dr. Nikki Harding: I liked that; your answer was far more technical than mine would have been. So how can teachers routinely help students understand this? Because the spread of disinformation is so prevalent, and students today... I am old, so when I was in school, this wasn't a thing we were concerned about.
Email was exciting when I was in high school, which really dates me, but now kids have true things and disinformation spread at the same frequency. How can [00:09:00] teachers help them practice this digital literacy all the time? How can teachers help students understand the importance of differentiating fact from fiction?
Reed Hepler: This relates to the BoodleBox bot that I created: the SIFT method is really the best way to go. A lot of librarians, and this is going out of style, incidentally, use what's called the CRAAP test for information literacy.
It used to be: look at the currency, look at the relevancy, look at the academic background of the source. Does it look credible? Does it have reliable data, and is it easily accessible? That's not really the way we look at things anymore.
The SIFT method, stop, investigate, find better coverage, and trace claims to the original source, instead looks at the information's provenance. Where is it coming from? Is it a [00:10:00] verifiable chain? That instills a set of behaviors, things you should do, rather than a set of hallmarks you should look for in every single source you find, where if it doesn't have them, it's not useful.
That matters especially because a lot of reliable information now comes in the form of blogs, videos, and non-mainstream media. A lot of people can't afford news subscriptions, and a lot of people communicate through social media, so many of those sources don't pass that kind of checklist of valid sources. The other thing I would say, about the importance of verifying the information you see: there are a lot of people in today's world who act on [00:11:00] incomplete information, and if you look at what happens, you can see the impact and the effect that has. You just have to look at the news, or you can look at history.
You could have them research history: when have people acted on inaccurate information, and what impact did that have? One of the most absurd things I've read about, and I use this all the time, this will be a low-stakes, really fun way to do it.
Have you ever heard of the Magdeburg Unicorn before, Dr. Harding? Okay. The Magdeburg Unicorn. Google that. The Magdeburg Unicorn is one of the worst fossil reconstructions in human history, from the 17th century. It was either a group of miners or a group of monks who found this fossil of a woolly rhinoceros, and of course they don't know what the heck a woolly rhinoceros is, so they say, oh, this is a unicorn. They have no frame of reference for it, obviously, and if they say it's a unicorn, especially if it's monks saying [00:12:00] this is a unicorn,
of course it's a unicorn. This is the Magdeburg Unicorn. Obviously now this is not as accepted, but it got published basically as proof that unicorns existed; it got published as anthropological proof that the Bible was real. And that persisted for years. Now, here's the tricky part. Like I said, the account is split between whether it was monks or miners, and the only extant recordings we have of it are second- or third-hand accounts. The person who made a color drawing of it was known to be mystical, but we have other accounts showing that this really was a thing that was found, even if his interpretation wasn't.
So there are multiple layers of what can happen if you don't have information literacy at the forefront. And of course, this wasn't people trying to deceive other people; they really thought what they found was a unicorn, at least in [00:13:00] the beginning. Later, it seems, the person who made the color drawing knew full well it didn't exist, but he wanted to publish it anyway.
It's really interesting. You can go research it, and you will find academic articles saying, oh, this wasn't really real, this was never a thing, which adds another layer: did it really exist? Was it interpreted in this way?
Dr. Nikki Harding: Interesting.
Reed Hepler: It's why you need to get your facts straight, because you'll be mocked hundreds of years later on a podcast.
Dr. Nikki Harding: It's like an ancient game of telephone.
Reed Hepler: Yes. And so I always tell educators and librarians: it's our job to make sure our students don't make the Magdeburg Unicorn of AI-generated content. We want to make sure they're doing things correctly, and that we are too.
Dr. Nikki Harding: So I think maybe the important takeaway is that we all need to slow down and digest content before we share it.
Reed Hepler: Yeah.
Dr. Nikki Harding: Yeah. I was thinking about my two teenage daughters, but then I was thinking maybe I need to be self-aware and do the same thing myself.
Reed Hepler: Everybody does it. Case in point: I saw a [00:14:00] Bluesky post about an occurrence at the Library of Congress, one of their collections supposedly being taken down, and I was incensed. How could this happen? This is just ridiculous. I was really mad and almost reposted it, because it was so familiar, so common, so alike other things that had happened. Then I thought: hey, this is the only post I've seen about this. Other people I know, who would know more about this, would have posted about it before this person would have known. Let's go see. It turned out it was not true, but it was so like other things that had happened that I almost thought it was true. I almost reposted it, but I caught myself. So yeah, everybody has to take a step back and say, hey, let's really think critically about this. And like you say, with deepfakes, that risk is even greater.
A lot of people are focusing on literacy and AI: how do we know if something is AI generated, how do we know [00:15:00] if something is from an AI or not? And the assumption is that if something is generated with AI, it's not trustworthy.
I would push back on that. I would say something is either true or it's false; it's honest, or it's misleading. Obviously there are nuances, but the AI generation of something doesn't really matter in the long run. If it's true, it's true; it doesn't matter how it's communicated.
You might as well say that if something is posted on Facebook, it's not true. That's not the case. This is a bit of an oversimplification, obviously, but I think we need to look at information literacy in general and stop asking, oh, how do we do this with AI,
how do we do this with AI? Instead, let's apply the general principles we've always applied to educational technology to artificial intelligence. In another one of my posts I talked about the TeaCup model of technology integration, created three or four years ago by Dr. Curry and others, and asked: why not just apply this model to AI, to LLMs, rather than create a whole new thing?
Dr. Nikki Harding: Yeah. [00:16:00] For some reason I'm connecting this to a story I just reread about when doctors first understood the importance of handwashing. It didn't catch on, and it didn't really start saving lives, until everybody did it. The connection I'm making is that information literacy is spotty.
We can't stop the spread of disinformation until we all understand how disinformation spreads. So we all need to be more aware and more conscientious of the content we're sharing online in order for it to be really effective.
Otherwise, we're all unknowingly spreading disinformation. That's really, really valuable. So, you've written three books, is that correct?
Reed Hepler: I've written one, well, about AI. I have written three books: one on library science, one on cataloging, and I've contributed a chapter to another book. That's what I've [00:17:00] done.
Dr. Nikki Harding: All while also working full-time at a library and consulting as well. So you're not big on downtime?
Reed Hepler: I wasn't, and then in February we just had twins. In fact, that was one of the reasons, thank you, that
Dr. Nikki Harding: Oh,
Reed Hepler: I was a bit late, because I had to get them out of the house.
Dr. Nikki Harding: Congratulations!
Reed Hepler: And Maddie is awesome. I said, you don't have to do that, but she said, no, no, no, no,
let's get them out of the house so you can have some quiet. So we got them out. So now I have a little bit of downtime, and that's good. The other thing I have to say is that the College of Southern Idaho has been awesome, because with all of these textbooks and things I've been writing, they really want to support me in this.
So they let me do some of that on their time, because I'm helping them, and helping the general field of education, AI, and library science. So they've been really supportive of that.
Dr. Nikki Harding: Mm-hmm.
Reed Hepler: So yes, there is not a lot of downtime, but it is not as hectic as it may [00:18:00] seem.
Dr. Nikki Harding: So what kinds of things do you write about?
Reed Hepler: Again, obviously information literacy.
Dr. Nikki Harding: Disinformation, I am assuming.
Reed Hepler: Copyright is another one. Nobody wants to talk about copyright. If you want to have another episode about copyright, I will bore you to death with copyright information, because it's like the most boring thing ever, but I love it. And the third thing I like to write about is culture and history and music. That really hasn't come into this much, but it did come into the organization of information and my cataloging work: how we think about organizing things and putting them in the context of their history. That hasn't been as public, but those are my three big things. And of course now: information literacy is connected with AI, copyright is connected with AI. The history stuff, not so much, but, never mind, I lie: that turned into my archives writing, and that is [00:19:00] connected with AI now because of my two blog posts.
So everything is permeated with AI. But yeah: archives, information literacy, and copyright are my big things.
Dr. Nikki Harding: Awesome. So you mentioned music. Have you done a lot with the overlap between music and AI?
Reed Hepler: This is the first time I've ever told anyone this, so this is really exciting. Have you ever heard of Udio before? Udio is a music generator.
Dr. Nikki Harding: Okay.
Reed Hepler: A year and a half ago, it was like the week after Udio was released, I was at the Association for Educational Communications and Technology 2023 conference. I sat there thinking about my sister-in-law: we had been talking about how I love buffalo chicken wraps and she does not, and I thought, I'm going to make her something about that.
So I made a song about buffalo chicken wraps and how awesome they are, and now she uses it with all of her clients: here's something my brother-in-law made for me about buffalo chicken wraps. That's the main thing I've done with it; I've experimented a few times.
I also made and so [00:20:00] I, for a year ago was really into Fugue Bach, particularly. I made an AI generated fugue. ~I had AI gen, I had, I had~
Dr. Nikki Harding: Mm-hmm.
Reed Hepler: a fugue out of that theme and put it onto I-M-S-L-P, ~which is a, ~which is a music coasting website for sheet music. So, and I also recorded it, so it's called Ein Fugue. it's actually called, ~I, ~I forget what it, the old title is, but it's like ein fugue has been made with AI in German.
~I, I, ~I know German a little bit, so I used, anyway title, but if you want that link, that is something I haven't told anyone either of those stories. So that's where I've been with music. I concluded that ~AI really couldn't ~AI really couldn't do that well with music at that point. And I think that there's a difference and it's a [00:21:00] very compartmentalizing of me.
'cause why would you think this of text and not of music? I think that AI training protects generators is fair use. Completely. Absolutely. And but the caveat that putting a PDF in a prompt and saying, Hey, can you analyze this for me and make something based off of it is not training and is copyright violation music? I think there's something to be said about it's just a different type of creation and so I don't think that training on that type of on, on, on music is very used and that's very convoluted and I know that that is not consistent. and I should say probably, I guess it is. But what is the number one thing that you don't want to do? When you're wanting to generate music, you're like, make me music in the style of this. Make me music in the style of this person of that this group. You don't really do that with text. And that's really my big hangup. So I suppose that people weren't saying, I wanna make a song about what was the [00:22:00] one that I, Oklahoma in the style of Luke Bridges. Wouldn't have as much of an issue with it. ~I'll have to get back to you on that a little bit, okay?~
Dr. Nikki Harding: Well, I have a question about that. I was just having a conversation with another group about using AI for curriculum accessibility for people with disabilities. What would be your thoughts about using AI to help people with disabilities access curriculum standards?
Reed Hepler: That's intriguing.
Dr. Nikki Harding: Especially, like...
Reed Hepler: Yes. Are you talking about having an AI chatbot that would communicate that to them, or like a voice reader type of thing?
Dr. Nikki Harding: Well, okay, here's an example. My son,
Reed Hepler: Yes.
Dr. Nikki Harding: who has Down syndrome, attends a school. He's 21 now, so it's not a traditional school. They used a music generator to create songs that they sang for parents' night. An example standard would be, [00:23:00] and I wasn't a music teacher, so I don't know specific music standards, but let's say it was to understand rhythm. They would be generating rhythm, and they would be inputting certain words. They would be accessing the standard using AI, but they wouldn't necessarily be using the drums or playing the piano. Does that
Reed Hepler: Okay.
Dr. Nikki Harding: make sense?
Reed Hepler: Essentially, and here I have to bring in the instructional designer in me, you're asking: what do I think about students, or educators helping students, use AI to meet learning objectives? Yes. That is an excellent
Dr. Nikki Harding: Mm-hmm.
Reed Hepler: thing to do. Basically, what you're describing with your son is project-based learning; it's experience-based learning. That's another thing I write about, project-based learning. I love it. The best way to learn is doing it yourself. Here's this objective, here's the [00:24:00] project that I want to do.
Let's combine the two. Project-based learning is one of the best things to use AI with. I have another post, I don't know if I've posted it yet or only flagged it, but, I should say, I like formative assessments;
I just don't like grading them, and I've only taught two classes. My point is, if we can take students through an experience, have them learn the terminology and then apply it, if they can show they're applying the concepts, even if they don't get the vocabulary right, they normally have the concepts; they just might need to study the keywords a little later. There are different types of learning and different types of understanding. In my chapter for the Utah Education Network's OER on AI, it's called Navigating...
no, Navigating Benefits and Concerns when Discussing Generative AI with Faculty and Staff, which is a long, long title, I talk about using the TeaCup model specifically, and [00:25:00] I say, this is how you should use it, and this is why you should use project-based learning. That's one of the big ones. Yes, I love it. I also incorporated those ideas in a webinar I did a few months ago called Personal Learning with AI, with Steve Hargadon of Library 2.0. That was basically: how do you come up with your own learning objectives and your own learning plan, and how do you use AI to help you learn skills by yourself, without having to take a course? So yes, that is like the best use case of AI in education.
Dr. Nikki Harding: Love it. We need a paradigm shift sometimes about what is cheating versus what is accessibility. I think we're on the cusp of that in some places, maybe.
Reed Hepler: There's the caveat too, and this is the thing people don't want to hear:
Dr. Nikki Harding: Yeah.
Reed Hepler: if an AI can do your assessment by itself, you really need to rethink it. It's not the AI's fault; your assessment isn't following basic [00:26:00] instructional design principles.
Dr. Nikki Harding: Yeah.
Reed Hepler: Stop penalizing students for using technology to do things that they need done, whether that be AI or other technologies. Using technology is not cheating. I remember we had to sort data in high school, I think for statistics, and instead of doing it all by hand, I went into a spreadsheet, put in all the values, and did everything with the spreadsheet.
I said, here's the spreadsheet that I used, and the teacher penalized me because I used the spreadsheet rather than manually going through and doing it by hand. If somebody thinks that way, why would you penalize them for it? It was really, really interesting.
Dr. Nikki Harding: All right. Well, Reed, thank you so much for your time today. I really appreciate it. How can people get ahold of you? I know you do some independent consulting, so if somebody would like your help, how can they get ahold of you?
Reed Hepler: If you're looking for academic stuff, I would go to [00:27:00] [email protected]; that's my College of Southern Idaho address, and I do all my academic work through there. If you'd like to have me come consult with your organization, train people, or do videos, [email protected] is the email for that. I'm on LinkedIn, like Dr. Harding said, and my blog, reedhepler.substack.com, is also a good resource. I like to talk with people there, and I usually refer people to some of my blog posts. Those are the best ways to contact me.
Dr. Nikki Harding: I will link your Substack and all the other links you send me in the show notes on the website.
Reed Hepler: No problem.
Dr. Nikki Harding: Okay, awesome. Thank you so much. I will also put a link to the nomination form: if you know somebody who is doing amazing work in education, I would love to hear from you. Please nominate your friends and colleagues for the We Not Me Podcast.
That nomination form will also be in the show notes. I look forward to hearing from you.