Want to know how K12 schools are navigating the adoption of AI and what district-level leaders really think about GenAI EdTech tools?

In this free webinar, we discussed AI technology, literacy, training, and the responsible adoption of GenAI tools in K12. Our panel explored what is working well - and not so well - across their districts from a school leader's and practitioner's perspective.

Presented as part of our AI Launchpad: Webinar Series for Educators.

K12 District-Level Perspectives on AI

  • Amanda Bickerstaff

    Amanda is the Founder and CEO of AI for Education. A former high school science teacher and EdTech executive with over 20 years of experience in the education sector, she has a deep understanding of the challenges and opportunities that AI can offer. She is a frequent consultant, speaker, and writer on the topic of AI in education, leading workshops and professional learning across both K12 and Higher Ed. Amanda is committed to helping schools and teachers maximize their potential through the ethical and equitable adoption of AI.

    Dr. Patrick Gittisriboongul

    Dr. Gittisriboongul has been involved in information and educational technology throughout his career as an educator and professional for over 20 years. He is currently the Assistant Superintendent of Technology & Innovation for Lynwood Unified School District and previously served as the Assistant Superintendent of Innovation for the San Diego County Office of Education, where he led county-wide efforts around career technical education, educational technology, professional learning, and data science. In Lynwood, he has been responsible for setting the vision and course for Lynwood Unified’s Digital Equity Roadmap and providing leadership and expertise regarding digital learning and information technology systems within the district.

    Samantha Armstrong

    Samantha is an Instructional Technology Coach and Virtual Coordinator for the Muhlenberg School District in Pennsylvania. She brings a practitioner's perspective on AI in education. She has been working with teachers to build a foundational understanding of generative AI, ethical considerations, everyday tasks, and instructional planning tips. Next steps in her district include ways to differentiate content, process, and product using AI tools. Samantha believes that AI has the potential to change the ways we teach and learn.

    Brett Roer

    Brett Roer, the Ohio Regional Director for The AI Education Project (aiEDU), is a passionate advocate for the equitable integration of AI in education to revolutionize the educational landscape. Following 16 years as a public school leader and educator in New York City, Brett has spearheaded initiatives and cultivated national partnerships for innovative ed-tech and non-profit educational organizations. His thought leadership and expertise have made him a sought-after panelist and moderator on a variety of topics at prestigious events such as the ASU+GSV Summit and EDTECH WEEK NYC, as well as digital panels for Digital Promise, NationSwell, the LeveragED Foundation, and Outlier.org.

  • Hi, everyone. I'm Amanda. I am the co-founder and CEO of AI for Education. This is part of our AI Launchpad Webinar Series. So excited to have you all here. This is one that I've been thinking about for a long time, which is this idea of what's actually happening at the district level. And so we've talked from the practitioner level in the classroom. We've talked about instructional coaches and leads. We've talked about all different types of opportunities. But

    00:02:02.420 --> 00:02:27.390

    Amanda Bickerstaff: we haven't really thought about it from a system level. And so today we're really gonna focus on the system-level approach, and so I have 3 amazing panelists here with me. I've got Samantha, I've got Patrick, and I've got Brett. These are people that I am so lucky to have as part of my network. Patrick and I met when we did a panel together at the AI+Edu conference. Samantha is part of our Women in AI and Education community

    00:02:27.390 --> 00:02:40.120

    Amanda Bickerstaff: group. And then Brett actually is the person that helped us get our first school ever, which is pretty amazing, so I don't think we would be the same without him and that connection he made, so

    00:02:40.120 --> 00:02:46.549

    Amanda Bickerstaff: really excited to have you all here. I see some early morning people in Sydney and people from all over the US.

    00:02:46.550 --> 00:03:11.519

    Amanda Bickerstaff: And so it's just really great to see our diversity. And what I'd love to see is, you know, we wanna always set the scene in the sense that this is a community, and so use the chat and the ability to connect with each other, to ask questions, to share resources. We're not gonna do prompt engineering today, but we always try to include something practical, so we will be talking about practical strategies. And the last thing is, if you have a great resource, share it. We'll be sharing

    00:03:11.520 --> 00:03:24.829

    some, but we'd love to see that happen. And we are almost at the end of the first year of our webinar series, and so we have 2 more. So today's is K12 district-level perspectives, and I actually deleted the wrong one, so sorry, Brian Eldridge.

    00:03:24.830 --> 00:03:45.290

    Amanda Bickerstaff: He will be doing a webinar next week that's really practical, on differentiation strategies with AI, and then the last one is with Tom Vander Ark, who, randomly, Tom and I spent 3 hours together yesterday in 3 different meetings. And so he is going to be talking with me around durable skills and the learning experience, and how it's changed with AI.

    00:03:45.290 --> 00:04:02.110

    Amanda Bickerstaff: So without further ado, I'm really, really excited to have this amazing panel here with me today. I know how busy you all are, and just really appreciate your time. And we always start the same way, so I'm gonna start with Samantha. I'd love for you to give us an introduction about you and your first time using generative AI.

    00:04:02.140 --> 00:04:17.229

    Samantha Armstrong: Sure. I've been in education for a really long time. I have a master's in elementary ed and one in educational strategies. I just completed my supervisory certification in curriculum and instruction, and I was in fifth grade teaching everything in Jacksonville, Florida.

    00:04:17.230 --> 00:04:41.419

    Samantha Armstrong: Then I was teaching ELA and social studies in Pennsylvania, in the Muhlenberg School District, where I am now. This is my third year as an instructional coach, and I'm also a virtual coordinator. So I've been basically working with our teachers and presented to our assistant principals in our county, too. But the first time with ChatGPT was a little different. I came home, started to think about some different things over the summer as things were happening last year,

    00:04:41.420 --> 00:05:06.420

    Samantha Armstrong: and my husband was talking about a new job in sales, and he said, you know what, let's go ahead and check this out and just see if it works for me for outlining my approach. For, like, how do I build relationships? How do I build sales? What do I do? And we wanted to compare it to what the real world was saying and what that was saying. We did it with ChatGPT. We did it with Google Bard. We looked at a little bit of everything. And then, just for fun, we made a grocery list for things under $200, then started to do different things with, what if it was

    00:05:06.420 --> 00:05:11.050

    Samantha Armstrong: gluten-free? What if it was this? So nothing educational to start with at all.

    00:05:11.100 --> 00:05:26.820

    Amanda Bickerstaff: I always love it because it's like a Rorschach, right? Like, how do you use it? Do you come to it from your own perspective? Is it kind of a test? Is it something that you're using for your practice? Etc. So now, Brett, do you mind answering the same question, introducing yourself and the first time you used generative AI?

    00:05:27.020 --> 00:05:39.270

    Brett Roer: Yes, absolutely. Thank you, Amanda, again for having me. My name is Brett Roer. I have worked in public education in New York City for 16 years, have worked in EdTech, and have become a really big evangelist of the power of AI.

    00:05:39.560 --> 00:05:42.619

    Brett Roer: I first was really exposed to AI at the

    00:05:42.780 --> 00:05:48.570

    Brett Roer: in early January, at the DC capital region AI Education Summit. That's where they had hundreds of leaders convening,

    00:05:48.610 --> 00:06:01.699

    Brett Roer: and it was the first time I really saw how you could use this in the classroom, how you could use this to support districts, and I thought about the building leader hat I had, how I could apply this. But similar to what Sam just said, most of my first interactions prior to that were,

    00:06:01.850 --> 00:06:12.530

    Brett Roer: you know, writing children's stories for my 2 young children, making workouts and seeing if it could do it quicker than my memory, asking for playlists, for workouts or for a friend. So just

    00:06:12.610 --> 00:06:20.739

    Brett Roer: really seeing what its capabilities were on a personal level, and then realizing the amazing power it has in the world of education pretty quickly thereafter, in January.

    00:06:21.120 --> 00:06:45.190

    Amanda Bickerstaff: It's great. You actually came to it before I did. I was still kind of figuring out my life after 6 months of travel in January. So Brett knows more than I do, or at least started at an earlier stage, and has been so supportive of, I know, so many principals and leaders across the whole country, which has been really great. And the last is Patrick. So Patrick, like, almost got the latest arrival

    00:06:45.190 --> 00:07:09.459

    Amanda Bickerstaff: for a session. Dr. Philippa Hartman wins, with 3 minutes before. But Patrick is an assistant superintendent, so the fact that he is here at all, we are very, very lucky to have you. And I think that your background, as both a former technologist and also an assistant superintendent, really gives you, I think, one of the best handles on this moment in time of anyone I've talked to. So I really appreciate you being here, and would love to know a little bit more about you.

    00:07:09.680 --> 00:07:38.510

    Dr. Patrick Gittisriboongul: Awesome. Again, Amanda, thanks for having me on as well. Really appreciate being on this panel and the esteemed people you have here. Patrick Gittisriboongul, longest last name in Lynwood Unified School District. I got vowels that you can buy, I'm selling them now, if you wanna, you know, sign up for my chatbot. I got Gittisriboongul dot GPT, I'm gonna go buy that domain name. But I'm the Assistant Superintendent of Technology and Innovation with Lynwood Unified School District. Here in Southern California we have 12,000 students, 12 elementary schools, 2 middle and 2 high schools.

    00:07:38.510 --> 00:07:51.709

    Dr. Patrick Gittisriboongul: My first experience with GenAI was probably a year from today. Really working with ChatGPT, really understanding how it works, what it can do, how it can really write stuff for you, if you will, right?

    00:07:51.710 --> 00:08:16.570

    Dr. Patrick Gittisriboongul: You know, I did have some difficulty getting on, because everyone was trying to get on that ChatGPT bot. I was like, what is this tool that's supposed to displace everything, that's supposed to generate all of these papers for you and write your essays and write your emails? So it's about a year from today. Again, I do think that AI is not new. We interact with AI on a day-to-day basis, you know, whether it's through your phone, through your maps.

    00:08:16.570 --> 00:08:41.549

    Dr. Patrick Gittisriboongul: You know, you had to use Google Maps to get here today. So again, we interact with AI all the time. You know, there's AI that curates your data as well. So whether you're shopping retail, whether you're buying stuff for your kids or for your families this holiday season, you're in someone's database, because they're trying to make sure that they personalize the experience for you as a

    00:08:41.549 --> 00:08:53.140

    Dr. Patrick Gittisriboongul: customer. So AI is not new. AI is here. My experience with GenAI was actually a year ago, so ChatGPT. And, you know, I'm excited to hear about the Google stuff today that just got announced. So

    00:08:53.140 --> 00:09:18.039

    Amanda Bickerstaff: absolutely. Well, yeah, I mean, it's been a bit wild. And, you know, we always start with, like, if you've got one of these, like, 84% of us interact with AI literally every day, whether it's your face recognition and all these pieces. And I think that's an important way to start when we're talking about today, which is that while this has been, you know, even a full year in, we're still really questioning what the capabilities are, what the impact

    00:09:18.040 --> 00:09:31.800

    Amanda Bickerstaff: will be. It is by far something that has not come from magic. It is not fairy dust. It is not something that hasn't come from 60 years of content and development and thought and research.

    00:09:31.800 --> 00:09:45.709

    Amanda Bickerstaff: But what it has done is that it's become a consumer-facing technology at a very early stage, like never before. And so I think that is really why I think this is so important for us to talk at a system level, because it isn't something that you can,

    00:09:45.710 --> 00:10:04.210

    Amanda Bickerstaff: you know, put on the shelf and say, okay, we're gonna come back to this when the technology is safe and reliable. It's something that is happening at an uneven rate across your district already. So my first question, Samantha, that everyone will have a chance to answer, is: what has been the impact of generative AI on your district?

    00:10:04.910 --> 00:10:19.569

    Samantha Armstrong: Well, it kind of started with the secondary English department first, with the idea of how did these magical essays come to be in a couple of minutes when all of our Chromebooks are blocked from using it, and clearly we already know the answer to that.

    00:10:19.570 --> 00:10:38.220

    Samantha Armstrong: But one of them was really a funny anecdote, where this one eleventh grader was saying how they really, really love their college dorm room, and they've just loved the experience of college. Remember, I said they were eleventh graders, so that was a little bit easy to find without having to do anything else. But then the shift in that impact came

    00:10:38.220 --> 00:10:57.630

    Samantha Armstrong: from okay, yes, while we started some professional development that's gonna have to be flexible and work with different things, and the kids are supposedly blocked. But they're using their phones. They're using things at home. Let's use it as a starting point instead. And then how can you be ethical? How can you be responsible? How can you go from what you created? And now what?

    00:10:57.780 --> 00:11:25.910

    Samantha Armstrong: And always think more critically anyway, because we want that no matter who you are. So instead of passive learners, which, since Covid, that's been a really hard thing to get back, let's utilize that even playing field and then take it further. And once we set that idea, instead of them cheating on that essay, yes, they're not in college, but they're also having a starting point they may never have had before. So now, what can we do with that? So that's pretty much where we started with that impact, from the cheating to the "Oh, maybe we can use this."

    00:11:26.120 --> 00:11:34.800

    Amanda Bickerstaff: That's great. And I think that, you know, ELA teachers everywhere had a moment and maybe are still having a collective moment. And I think that

    00:11:34.800 --> 00:11:59.780

    Amanda Bickerstaff: what's happened is we've collapsed this idea of generative AI into "it's a cheating tool." That still is the rhetoric, even a year after. And so I'm really glad to hear that, and we'll talk much more about how you did that, because there are a lot of districts that haven't been able to make that shift from that kind of, you know, rhetoric to that action statement and that ability to start incorporating it. So I'm glad you've done that. But so Brett has a bit of a different perspective, right?

    00:11:59.780 --> 00:12:15.940

    Amanda Bickerstaff: Right? Because you're actually thinking about it from how do you connect EdTech tools that are building to pilot schools? How do you actually talk about adoption? So what's been the experience that you've had when talking to districts? Like, what is the general theme and experience of what's happening out there?

    00:12:16.310 --> 00:12:26.540

    Brett Roer: Yeah. Well, I'm really fortunate that in the past few months I've helped to spearhead an initiative in the State of Ohio with the nonprofit AI Education Project, aiEDU,

    00:12:26.570 --> 00:12:38.500

    Brett Roer: and so I've been fortunate that I've been able to attend an AI summit that was held in Butler County in Ohio, and got to speak with district leaders and county leaders and building leaders, as well as some teachers in their classrooms,

    00:12:38.790 --> 00:12:43.620

    Brett Roer: really listening to what was that first moment of trepidation? What's happening?

    00:12:43.780 --> 00:13:00.050

    Brett Roer: How are we reacting to it? And then what are the current needs for many folks in Ohio? And then also recently getting to attend an AI conference in Hawaii and working with New York City leaders in person and virtually. It's kind of, there are 2 different points right now. There are folks who

    00:13:00.060 --> 00:13:17.549

    Brett Roer: are now at the point where they're fluent and comfortable enough, and now they want to figure out how to empower the rest of their community while putting in those, you know, proper safeguards. And so they're looking for thought leaders to help them build that policy and make sure that they're really thinking it through. And there are others who recognize they have to

    00:13:17.550 --> 00:13:43.199

    Brett Roer: embrace this, but don't know where to start and are looking for those first initial steps. So that's really what I'm seeing on a national scale. And it's not by any region. There are people in the same part of the country that are just at different points, for whatever reason. So I'm looking to support them in meeting them where they're at, getting them the tools and resources they need, finding those solutions, but most importantly, just getting people comfortable, in recognizing whatever tool you start with,

    00:13:43.450 --> 00:13:49.929

    Brett Roer: just embracing that this is here, and having an open mindset about learning and sharing it with your whole district and community.

    00:13:50.230 --> 00:14:15.440

    Amanda Bickerstaff: It's great. We see a lot of head nods, because I think there's a lot to the idea that we could be in a room of 10 people, a hundred people, a thousand people, and we have the same kind of scale of, like, never used it, use it all the time, hate it, love it, you know, it's something that we can ignore, to, like, get on the bus or get run over. And I think that's what's really interesting. But I think that your point of meeting people where they are

    00:14:15.440 --> 00:14:40.689

    Amanda Bickerstaff: is so important. But it's something that's really, really hard to do at a district level, because you have so many competing priorities. So now to Patrick. So, Patrick, I know you guys are really on the forward foot on this, and you have great staff around it, but I know that you also have similar struggles. So what actually has been the impact over the last year of generative AI in Lynwood? Yeah, I would say, at an organizational level, it's had a profound impact. And what I mean by that is that

    00:14:40.740 --> 00:14:57.440

    Dr. Patrick Gittisriboongul: from every layered approach that you take, from the political landscape, whether it's your board, whether it's your parents, your teachers, and your staff, along with your students and parents. Really making sure that the organization, that everyone in the community, you know, rallies around and

    00:14:57.440 --> 00:15:22.419

    Dr. Patrick Gittisriboongul: has a policy or responsible use plan in place, is something that really is transforming organizations, right? It's allowing us to take a step back, take an inner look, and really identify those stakeholders in a way that really makes it inclusive, engages everyone in that process, right? And so we approach it 3 different ways, right? We really wanna make sure that AI is responsibly used, right? We know that we

    00:15:22.420 --> 00:15:38.970

    Dr. Patrick Gittisriboongul: have students and staff and parents that may use it for nefarious purposes, and as the IT director as well, you have to make sure you, you know, you watch for that, right? The second is making sure that the data that is put into these chatbots is protected, right? And so what keeps me up at night is if,

    00:15:38.970 --> 00:16:07.810

    Dr. Patrick Gittisriboongul: you know, a teacher, a staff member, whoever it is, is putting in, like, student data, whether it's, like, "write me an IEP for Amanda Bickerstaff, who, you know, lives here and attends this school." That keeps me up at night. And so that training, that professional development of staff and students, just to make sure that data privacy, you know, digital literacy, is at the forefront of where this AI wave is, is part of that. And the last is making it inclusive, right? So we know that

    00:16:07.810 --> 00:16:26.469

    Dr. Patrick Gittisriboongul: it's gonna create more gaps, right? The pandemic caused a lot of gaps. Digital equity caused a lot of gaps. Devices caused a lot of gaps. Tools caused a lot of gaps. This AI is gonna continue to exacerbate those gaps, right? It's gonna be kids and staff and, you know, adults and principals who know how to use AI and those that do not,

    00:16:26.470 --> 00:16:51.230

    Dr. Patrick Gittisriboongul: right? So there's gonna be some folks that, hey, I'm done with my school plan, right? I'm done with my district LEA plan or my, you know, local control accountability plan, because it's written by AI, right? And there's gonna be others that, you know, may be, like, you know, struggling and saying, I don't know if I'm really comfortable with that. So across the spectrum, right, is how we're looking at and approaching it, really looking at every stakeholder, making sure that

    00:16:51.230 --> 00:17:16.179

    Dr. Patrick Gittisriboongul: we do outreach, we do professional development, that we really plan for that. We have an AI task force that we formed this year, and so we're meeting 4 times, every other month. The first session is really around making sure that there are fundamentals. The second is really making sure we have ethics and oversight. The third is around equitable implementation. And then the last is really how you lead the transformative change, since this tool is already going to be there. We've already talked about how AI,

    00:17:16.180 --> 00:17:17.269

    yeah, is impacting

    00:17:17.660 --> 00:17:26.619

    Dr. Patrick Gittisriboongul: economies, jobs, all that kind of stuff. So how is it impacting the workforce as well? And how are our teachers preparing our students for that? So that's how we're looking at it.

    00:17:27.050 --> 00:17:37.929

    Amanda Bickerstaff: So I would say to those of you that listen to Patrick and go, oh, man, I'm at the earliest part of this piece, and this all sounds amazing, the one thing I will say is that

    00:17:37.960 --> 00:17:53.510

    Amanda Bickerstaff: you know, we are all at different stages as well, and, like, Lynwood is going to be a very early adopter with a very structured approach. But the thing that I think we all can take from what we hear from everyone here is that

    00:17:53.600 --> 00:18:14.930

    Amanda Bickerstaff: this isn't something that is going to be a one-and-done. This isn't going to be something where you can create a policy and walk away from it. And whether it's, you know, every other month, or it's a task force, or whatever it looks like, I think the idea of longitudinal support and actual integration of this across

    00:18:14.930 --> 00:18:39.719

    Amanda Bickerstaff: the system is really, really important. And, you know, I think that Patrick has identified a really wonderful way to do that. But I would also say that at this stage, considering what I see, even if you do parts of this and start incorporating it as a change management process, which is what this is, then you're on the right path. And then what we can do is, we can start, like, you can see these best practices that we're sharing here

    00:18:39.720 --> 00:19:02.880

    Amanda Bickerstaff: to keep going. And I think that sometimes what can happen is we think we have to be that great at the beginning, when in actuality just kind of starting it and being willing to keep doing it is exactly what we need right now. And then we can share people that are doing this amazing work, like we get to do today. So I'm gonna actually go forward a little bit. We're gonna skip a question, because I wanna get to the meat of the conversation, which is

    00:19:02.880 --> 00:19:15.560

    Amanda Bickerstaff: this idea of opportunities and challenges. And so I know a lot of people are struggling, especially at the system level of getting started which we just talked about, because it's really hard to understand

    00:19:15.560 --> 00:19:34.820

    Amanda Bickerstaff: what these tools will and can mean right now. So, Samantha, you have a really good handle on this. I love how you actually do a great job of interrogating, you know, EdTechs and organizations, supporting your, you know, your staff. But what do you actually think are the opportunities and the challenges of generative AI for your district?

    00:19:35.550 --> 00:20:01.679

    Samantha Armstrong: Well, I think it kind of starts with what you've all said, that idea of you gotta meet everybody where they are, and you need to give them the foundation, like, what is AI, whether you may know nothing or you're an expert user. But that's thinking educationally also, where our kids are never in the same place, so our educators are also gonna feel that way. You also have that idea of bringing people along, but when you share that foundation, then you also know everybody's why is going to be a little bit different.

    00:20:01.680 --> 00:20:16.040

    Samantha Armstrong: So you need to make sure that you're really, truly listening to your audience. So that could be the department, that could be the grade level, that could be the administrator, that could be working with somebody, just listening to the different pieces that are there. Like, I sat with our

    00:20:16.040 --> 00:20:27.450

    Samantha Armstrong: assistant superintendent and his secretary, and we worked through prompt generation, just even that beginning piece, and did that with assistant principals and everybody. And we're all at such different places that we need to honor that

    00:20:27.550 --> 00:20:43.639

    Samantha Armstrong: and understand: I'm gonna give you the foundation, and some of you are ready to fly, and some of you are just gonna stay, just like Brett was saying, with that one tool that's gonna make all the difference in the world. Whatever's gonna be best practice and meet what your kids need and what your teachers need is where you wanna go. But you need to listen first.

    00:20:43.640 --> 00:21:04.100

    Samantha Armstrong: So I think the biggest thing that I found that so positive is as we're sharing these things like we share the tools, maybe one or 2 of them, and then we just sit and explore. And I ask for people to share with me. What do you need? What are you working on? Show one of those examples, and then just give time because we always lose time. The biggest thing is

    00:21:04.100 --> 00:21:23.429

    Samantha Armstrong: that, as you said, it's not one more thing. It's not another initiative that's gonna go away. And it's very easy to be jaded with that, especially since Covid, to worry, how do I balance all this? You know, you're telling me, take care of myself, but you're also saying, take care of all of these social-emotional needs. Where do I do that and honor my kids and also

    00:21:23.490 --> 00:21:47.279

    Samantha Armstrong: bring up that student achievement? So I mean, the things that have been positive for me is just sitting with, like, the media specialist who said, how am I going to integrate my SD standards and my literacy standards for third graders only? Here we go, and, you know, just getting so excited. But again, listening first, because if I have that foundation, that's great, but that doesn't mean that's what you need. So I think starting out at that even playing field is great,

    00:21:47.480 --> 00:22:09.639

    Samantha Armstrong: and then really fine-tuning it and personalizing it to your staff, who in turn can personalize it with their kids, depending upon where you are. One of the other things that was really neat was we worked with our EL teachers, just even in the Canva Magic tools, where we were talking about things like we wanna honor the native language, and we wanna make sure also that we're bringing them along in their levels of proficiency,

    00:22:09.640 --> 00:22:21.370

    Samantha Armstrong: and how the Canva translate, in 2 seconds, was giving them things that they've always needed. It was giving them that text-to-image where you could have examples and non-examples. So it was just exciting

    00:22:21.470 --> 00:22:27.210

    Samantha Armstrong: to see the light bulb go on, and to know the things that they've always wanted to do. But they didn't have the time to do

    00:22:27.260 --> 00:22:53.459

    Samantha Armstrong: just fine-tuning with the high school English teacher. She was working on a mock trial in Greek mythology, and she said, you know, really thinking about this idea, and here's my question, but I'm not getting a lot. So, well, let's keep iterating your prompt and let's go further and further, and as she got more and more, it made so much more sense, because it isn't a Google search. It's a conversation. It's a thought partner. So I think those are the pieces that you find as you do it more,

    00:22:53.460 --> 00:23:15.789

    Samantha Armstrong: but not everybody is on the same playing field. But to wrap up with that, I apologize, the challenge, the big challenge, is that mind shift of moving from just one more thing to that idea of it's our thought partner, for ourselves and for our students, to think more critically. And if we remember everything we're doing is to be beneficial to our district, ethically and effectively,

    84

    00:23:15.840 --> 00:23:20.080

    Samantha Armstrong: then we can always come back to the true why of what we're doing and what our kids need.

    85

    00:23:21.020 --> 00:23:50.270

    Amanda Bickerstaff: That's great. And first of all, I absolutely love that you came out like shot out of a cannon with, like, all the positives, because I think sometimes it can be quite like we have all these challenges, and I'm sure you hear challenges all the time in your work. But the fact that, you know, when we talked about the prep before today, we were like, you know, let's really focus on those stories, because I think we don't hear enough of those stories. We definitely don't highlight them. But I absolutely love that this is something where you get to see those light bulb moments, because we often as leaders don't get to make things better

    86

    00:23:50.380 --> 00:24:11.930

    Amanda Bickerstaff: for teachers. Like, we definitely don't. We usually just have to give them more stuff, like, oh, here's this other thing you have to do, and that's why that thing, you know, is real, like, you know, "I'm just gonna close my door, and it'll go away," like that thing will go away, and it usually does, to be honest. But I think that this is a real opportunity, by highlighting and showing that, like, actually,

    87

    00:24:11.930 --> 00:24:35.310

    Amanda Bickerstaff: I can make it better for you. You can find that thought partnership, that brainstorming partner, that getting started, or finding the place where you'd really struggle, which I really love, that that's the place that you get so excited about. And I'm just really glad to have you here to be able to share that. So, Brett, coming from the more kind of strategic place, when you talk to, whether it's in Ohio or Hawaii or New York City,

    88

    00:24:35.320 --> 00:24:38.729

    Amanda Bickerstaff: like, what are those opportunities? And what are those challenges?

    89

    00:24:39.050 --> 00:24:46.749

    Brett Roer: Yeah. And I'm also gonna keep it pretty optimistic because I get to see it. I get to see people learn these things for the first time

    90

    00:24:46.840 --> 00:25:07.320

    Brett Roer: and make sure they leave these professional developments or sessions really excited for the next day of teaching. And we all know, right, as a former building leader, that sometimes isn't the feeling you get when you leave a professional development; you feel that one more thing, and you've just been given another task. So just to go over some of the positives I've seen. One,

    91

    00:25:07.360 --> 00:25:08.599

    Brett Roer: Sam already said it.

    92

    00:25:08.870 --> 00:25:18.719

    Brett Roer: First, you need to set the stage right. You don't know exactly what your audience will be like, but finding great professional development thought partners, so aiEDU, AI for Education,

    93

    00:25:18.900 --> 00:25:26.789

    Brett Roer: making sure people just kinda get into where we are right now, where the last year has gone, but also, you know, the historical uses of AI.

    94

    00:25:27.070 --> 00:25:40.780

    Brett Roer: Then it's really important to give someone the opportunity to create and play. So in Hawaii, Playlab did a hackathon with aiEDU and Teach For America. So here's a great example. I'm with a member of TFA's team who trains

    95

    00:25:40.800 --> 00:25:44.480

    Brett Roer: new teachers in Hawaii, and they serve, you know, a very diverse population.

    96

    00:25:44.640 --> 00:25:51.490

    Brett Roer: One of the initiatives in Hawaii right now is bringing Native American cultural heritage into the curriculum.

    97

    00:25:51.630 --> 00:26:05.859

    Brett Roer: Now we know that could live on paper and never be put into practice. So I just showed them really quickly how, using ChatGPT, you could find the standards, embed them, and all of a sudden create engaging do-nows or very

    98

    00:26:06.120 --> 00:26:23.150

    Brett Roer: small, impactful projects in every subject area in Hawaii, based on grade level. You could then raise and lower the reading level. And this person was blown away, because they've been trying to think about, how do I embed that, and in a few minutes they had a great source to start with, and that light bulb went off. So one of the things that,

    99

    00:26:23.200 --> 00:26:33.699

    Brett Roer: when I plan professional developments with folks like we're doing in Ohio, or our first big professional development for all of the county leaders there, they call them ESCs, educational service centers,

    100

    00:26:33.820 --> 00:26:44.309

    Brett Roer: we said our goal is that there is a light bulb moment for each of these people. We have to figure out how that light bulb goes off, that this will make me better at my job, and in the long term it will make me more efficient at my job.

    101

    00:26:44.380 --> 00:27:10.029

    Brett Roer: And whatever that job is in education, or role that you have, this can improve the quality of how you're engaging with whoever it is you're serving in education. So that's really where I come from, is trying to figure out: what do you need to have that aha moment that will at least keep you excited and engaged enough to continue to persevere with a new technology? Last thing I'll say about how I try to capture the positivity of the power of AI:

    102

    00:27:10.070 --> 00:27:17.329

    Brett Roer: Whenever I see that light bulb, I try to immediately ask someone to quantify in minutes how much time this might have saved them tonight or tomorrow or

    103

    00:27:17.540 --> 00:27:26.170

    Brett Roer: so on and so forth. So if a teacher was like Oh, this just saved me 20 min, and it's better than how I would have been able to scaffold or differentiate the exact lesson I was going to do tomorrow.

    104

    00:27:26.250 --> 00:27:31.299

    Brett Roer: Say, now imagine if you did that for 5 days. What would you do with that hundred minutes back, a hundred minutes

    105

    00:27:31.340 --> 00:27:32.550

    Brett Roer: next weekend?

    106

    00:27:32.660 --> 00:27:48.669

    Brett Roer: What would you do? And then extrapolate that over the course of the school year. I try to get administrators to think about that, or superintendents. That's really how I'm hoping people start to quantify and scale AI. You can do something better, save time. But then, most importantly, what are you doing with that time to reinvest in yourself or the community you serve?

    107

    00:27:48.710 --> 00:27:56.659

    Brett Roer: That's where I hope we are going. So the challenge is, how do you do that at scale? How do you reach educators across the country, at the district level, in the classrooms?

    108

    00:27:56.890 --> 00:27:58.820

    Brett Roer: That's the challenge that I'm

    109

    00:27:58.860 --> 00:28:06.620

    Brett Roer: most interested in solving: how do you scale those light bulb moments as quickly as possible, in service of the greater good in education?

    110

    00:28:07.640 --> 00:28:26.740

    Amanda Bickerstaff: And I mean, Brett, that's the whole goal of AI for Education. So we're definitely questioning that same way of, like, how do we think about this and scale? And it's a really, really big question. So I think it's really one where we'll see more and more opportunities to do that as organizations become

    111

    00:28:26.740 --> 00:28:49.249

    Amanda Bickerstaff: more, kind of, as they grow, too, which I think is really interesting. And, you know, Playlab is one of our, we actually had one of our webinars where Ian came on and shared Playlab, and I think we really like to see educational tooling that is appropriate and available, and specifically that's also safe for students. And I think that actually is a place

    112

    00:28:49.250 --> 00:29:18.339

    Amanda Bickerstaff: that we're still lacking in. And so, you know, I think that there is a real question as well of, like, how do you shift from a teacher-leader perspective into the supporting-student perspective? We have a little bit of a handle on how to do that with teachers and leaders; I think we have very, very much less of that for students. So okay, now to Patrick. So what do you think about those potentials, those possibilities, those places to be optimistic, and then those places where we have some deep challenges?

    113

    00:29:18.500 --> 00:29:33.900

    Dr. Patrick Gittisriboongul: Yeah, Amanda, so, like Sam and Brett, I do agree there's a lot more positive than there is negative. But I'm going to be the naysayer in the room. So part of the challenge with GenAI is that it's by definition biased, right? So the tool,

    114

    00:29:33.900 --> 00:29:45.629

    Dr. Patrick Gittisriboongul: whoever developed it, there's already gonna be bias built into the tool, right? So making sure that our staff, our students, our parents understand that the tool that they're gonna be using

    115

    00:29:45.630 --> 00:30:09.299

    Dr. Patrick Gittisriboongul: has bias already built in. Now, how do you then adjust the way you prompt, the way you do things, the way you, you know, articulate, how you weigh it, how you make things more efficient, right? So understanding that, and making sure that you have, you know, other ways of working around it, that's key. The second is around what, Amanda, I think you and I relate on, which is around data as well as impact.

    116

    00:30:09.370 --> 00:30:36.350

    Dr. Patrick Gittisriboongul: Right? So every organization is really trying to develop what the impact is, what the efficacy is. So now that we have all these AI tools, we have these chatbots: is it making things more efficient? Yes, check. Is it making students more productive? Yes, maybe, right? But it also helps shift that conversation, right? It's about making sure that that human aspect of that dialogue, of, you know, teachers and staff

    117

    00:30:36.350 --> 00:30:51.059

    Dr. Patrick Gittisriboongul: reaching out to folks, is, like, much more important than the AI tool itself, right? So, hey, everyone's gonna be assuming that we have AI already built in, right? The email I sent to you, the letter I wrote, the recommendation, the plan that was submitted.

    118

    00:30:51.060 --> 00:31:17.110

    Dr. Patrick Gittisriboongul: Then we're gonna create this more suspicious society, right? And we're gonna look at things and go, well, really, it looks like GenAI wrote it. Oh, this resume looks like someone took that and ran it through GenAI. Is someone's cover letter like this, right? So I think we're gonna have to start thinking about it like every sector, every organization, even in education, right? So as I'm looking at letters, resumes, and all the stuff that's being written,

    119

    00:31:17.110 --> 00:31:46.050

    Dr. Patrick Gittisriboongul: like, my radar is on, and I'm like, I think the AI wrote that, right? Let's meet this person and let's hear from this person, right? The same thing is gonna happen with staff, with students, with parents, everyone, right? My AI is gonna talk to your AI, and then what's gonna happen then, right? It's gonna have to be a dialogue. It's got to be that human aspect. It's gonna have to be the experiences that really propel and move organizations forward, as well as school districts, right? Because schools and districts and, like, public education,

    120

    00:31:46.050 --> 00:31:51.829

    Dr. Patrick Gittisriboongul: that's, like, magic, right? And the magic happens because of the interaction between

    121

    00:31:52.020 --> 00:32:04.350

    Dr. Patrick Gittisriboongul: teachers, students, parents, principals. Everyone is involved, making sure that kids thrive and really achieve, right? So I think it goes back to the basics, right?

    122

    00:32:04.350 --> 00:32:32.520

    Dr. Patrick Gittisriboongul: Great tools, great bookkeeping, great all that stuff. But what are we really using it for, right? So efficacy, impact, stuff that you love, Amanda. And I feel like Patrick and I are buddies on this. We're actually building, so Corey Lane Crouch is in the audience, we're building a responsible GenAI framework where we're asking EdTech tools to actually talk about transparency around bias and hallucinations, models, etc.

    123

    00:32:32.590 --> 00:32:53.069

    Amanda Bickerstaff: But I'm gonna ask, I'm gonna ask Dan to put into the chat the video I did today on Gemini. So Gemini is the brand new release from Google, which has a lot of hype around being better than GPT-4. And there's a 60-page paper, and I'm gonna ask Patrick: how many times was bias mentioned in that 60-page paper?

    124

    00:32:53.210 --> 00:33:01.769

    Dr. Patrick Gittisriboongul: probably like a thousand? Zero times, zero times. Like, there's a guy named Tobias.

    125

    00:33:01.940 --> 00:33:23.990

    Amanda Bickerstaff: So if you search "bias," there's one guy named Tobias. So hi, Tobias. But it has a section. So actually, before I did the video, and before, when I was thinking about doing it and before I fully read it today, it was like, oh, they talk about the training data. And when I read it, I'm like, how is it possible in this day and age? And if you are here and maybe don't know as much about how large language models work: they're trained on the Internet,

    126

    00:33:23.990 --> 00:33:45.539

    Amanda Bickerstaff: and the Internet, no one ever said the Internet was an unbiased place. No one is like, "I go to Reddit for balanced communication about both sides." This just doesn't happen. And there's explicit and implicit bias. The thing that's crazy to me is that at this stage we have a brand new model that's supposed to be best in class by Google, who fired someone when they talked about AI bias, so Timnit Gebru.

    127

    00:33:45.610 --> 00:34:08.440

    Amanda Bickerstaff: And they put out a paper today that does not once mention it. And so I think that when we talk about the challenges there are distinct challenges that we have, because these tools are not responsibly made. And they're definitely not being made for schools. And so, while there is an opportunity to embrace around efficiency, there are deep questions about reliability, about exacerbating or creating new biases.

    128

    00:34:08.440 --> 00:34:28.740

    Amanda Bickerstaff: And also the idea that it actually can create spaces in which we don't actually know what the impact will be. So I think that really is something that we think about a lot in the work that we do, and why I vibe so much with everybody on this panel, because that balance is really important, that, you know, we have to continue as leaders to

    129

    00:34:29.022 --> 00:34:47.639

    Amanda Bickerstaff: have the strongest voice in this. And so I think that this is going to be really important. I'm actually gonna ask, we're gonna go a little bit off book, so I wanna ask Samantha about, like, you talking to Khanmigo about what their tool's actually done, and how that interaction changed when you were equipped with good questions.

    130

    00:34:47.730 --> 00:35:11.499

    Samantha Armstrong: Well, I think the biggest thing is, I was talking about the idea that I am part of decision-making, but I am not the final decision maker. So that's something that's important, that when you're having those conversations, that's a wonderful thing to share. And it's a wonderful thing to have a foundation. But then, once I started to utilize a lot of those 6 questions that are in that resource from AI for Education, the conversation shifted

    131

    00:35:11.500 --> 00:35:29.680

    Samantha Armstrong: dramatically to, wait, so a lot of people, some people, no, not many people have asked those questions, and it's exactly what needs to be done. It needs to boil down to: what are you ethically doing? What are you doing deliberately to make a difference, not to see how quickly you can get this out

    132

    00:35:29.680 --> 00:35:41.009

    Samantha Armstrong: compared to somebody else, and it was a great conversation, and very well received. But really necessary to have that conversation, and just be really, really transparent as much as they can be.

    133

    00:35:41.400 --> 00:35:49.750

    Amanda Bickerstaff: Yeah. And, Dan, if you don't mind dropping in the 6 questions resource, that would be great. And, like, I think that

    134

    00:35:49.940 --> 00:35:54.650

    Amanda Bickerstaff: sometimes in this moment there's this, like, speed towards building,

    135

    00:35:54.680 --> 00:36:22.580

    Amanda Bickerstaff: and, like, not actually slowing down enough to have deep conversations about what a technology should be for a classroom, for students, for teachers. And I think we're just kind of speeding through this goal. And everyone's saying, okay, okay, like, generative AI is the thing, let's build towards that. So I think that's something that, like, we know more and more, that's why that upskilling is really important. So we have lots of really good questions from the audience. So let's spend, like, 5 more minutes, and I'm gonna do our, like,

    136

    00:36:22.680 --> 00:36:48.239

    Amanda Bickerstaff: best strategies. Okay, so what are our best strategies for implementing? So, we know we have a global audience. We have people that have never started, that are starting, those that are in, like, Samantha's point, where you have to, like, you need to manage up, you know, and that's hard. And then you've got, you know, all these different perspectives. But, like, what are your kind of number one or 2 best strategies for the people that are watching today?

    137

    00:36:50.180 --> 00:37:03.109

    Dr. Patrick Gittisriboongul: Any order, Amanda? Where do we go? Why don't we go Patrick first, and then we'll go Brett and Sam, we'll end there. Yeah, go ahead, Patrick. Yeah, I would say, start small, right? So, like with anything, always start with a pilot, you know, identify

    138

    00:37:03.150 --> 00:37:26.350

    Dr. Patrick Gittisriboongul: teachers, staff, adults, principals, whoever it is, to champion the cause, you know, and really identify some use cases, some examples of AI being used, some best practices across the board. That would be the first. I would say, listen, right, and really respond to any of the questions that come up from anyone, right, all your stakeholders that are in your organization. And then really

    139

    00:37:26.400 --> 00:37:52.460

    Dr. Patrick Gittisriboongul: make sure that you also figure out ways to measure efficacy. I'm gonna bring that back up again, because, you know, your return on your investment, your time, and, like, again, gaps exist, right? And so what does success look like for you as an organization, right? Making sure that you have, you know, your future state, your current state, and really wrapping those 2 together, so that way you're able to start showcasing, you know, this small

    140

    00:37:52.460 --> 00:38:14.699

    Dr. Patrick Gittisriboongul: pilot that you've started. And then also, again, embrace failures as well. So I think it's important to say, oh, well, this task force or this group wasn't the right fit. Let's figure out, maybe it's another group. Maybe it's the English teachers, right, that we need to work with. Maybe it's the math teachers or this school. So I would say, start small, embrace failures, and really stay focused around efficacy.

    141

    00:38:15.020 --> 00:38:34.050

    Amanda Bickerstaff: Yeah, I love that. And I think that we need to give our English teachers a hug. I think we need to give them a collective hug, because this has been a year, and so every English teacher in the world, we feel you, we care about you. We're sorry this is complicated and that you guys are the first line of defense. But I so agree, Patrick. It just,

    142

    00:38:34.050 --> 00:38:58.830

    Amanda Bickerstaff: it's just about getting started. And, like, I think it's so fun to break ChatGPT, to, like, do things that don't work, that it can be really, really fun. And, like, I'm telling you right now, starting small and even being willing to talk to students about this, like, I know we avoid talking to students about some things because of that inconvenient voice, like, we're not sure what they're gonna say. But I'm telling you right now, starting small and having those conversations with students,

    143

    00:38:58.830 --> 00:39:18.430

    Amanda Bickerstaff: you're gonna be blown away. And so I love that that is something that I know you guys incorporate, that student voice component, in the work that you do. And I think that it's such a great way to get people talking. And, like, you know, it's a weird moment. We're in a weird moment, guys, let's be weird together. So, Brett, over to you after that.

    144

    00:39:18.780 --> 00:39:22.050

    Brett Roer: great segue. Oh.

    145

    00:39:22.160 --> 00:39:23.750

    Brett Roer: first of all.

    146

    00:39:23.960 --> 00:39:33.850

    Brett Roer: couldn't agree more with Patrick, starting small, and yet I find myself working on a state, a whole state initiative, one of the first in the country. So we are, next week, having for the first time

    147

    00:39:34.120 --> 00:39:41.939

    Brett Roer: folks from each ESC come together for a virtual PD. And so as we were planning that, we actually did think about

    148

    00:39:42.180 --> 00:39:49.889

    Brett Roer: one of the long-term goals: by the end of this calendar year, by the end of this school year, is to have identified one to 2

    149

    00:39:50.160 --> 00:39:56.450

    Brett Roer: AI champions within each ESC. So when you actually take that across the whole state, you're only really talking about 50 to a hundred people.

    150

    00:39:57.020 --> 00:40:16.959

    Brett Roer: Now, though, you have someone at each county that can become a train-the-trainer, that can become that voice that continues the work forward. And now you're not thinking state by state, you're thinking county by county, and then district by district, and so on and so forth. So no matter where you're starting as a district leader, if, you know, you're part of the audience that is making that up tonight,

    151

    00:40:17.050 --> 00:40:22.460

    Brett Roer: still think about who you are trying to build capacity for, and then, most importantly,

    152

    00:40:22.610 --> 00:40:25.939

    Brett Roer: start small in terms of what you hope to achieve. So

    153

    00:40:26.000 --> 00:40:34.849

    Brett Roer: you know the audience, whether, again, you're trying to train the district level, the school level, the classroom level. Make sure in that first iteration it is fun.

    154

    00:40:34.920 --> 00:40:52.959

    Brett Roer: Do things that, you know, you're always tasked with, right: creating a positive culture, embracing learning, embracing your successes. Figure out what that looks like in the environment that you are in charge of leading. But I would definitely say, create a playful experience around AI. There's no harm in this.

    155

    00:40:52.960 --> 00:41:09.850

    Brett Roer: You can only get better, and, as I hear often, this is the worst version of AI we're ever gonna encounter, right? This is like beta compared to where we're gonna be in a few years. So embrace that this is going to keep changing. And then the last thing I would just say in terms of starting small is,

    156

    00:41:10.280 --> 00:41:15.580

    Brett Roer: really make sure you, again, think of, at the end of that session or professional development,

    157

    00:41:15.920 --> 00:41:34.470

    Brett Roer: what is something someone could do tomorrow to make their day just a little bit better? You don't have to change the world of education or what your classrooms look like; we know that's what fails in most initiatives and professional development settings. Don't go too fast with this. Make sure you're listening to your audience and capturing what's one thing that's a current pain point for you,

    158

    00:41:34.710 --> 00:41:44.269

    Brett Roer: and figure out how to use AI to make that a slightly better experience for them. That will just overall make them more apt to want to continue this learning process along with you.

    159

    00:41:44.610 --> 00:42:11.579

    Amanda Bickerstaff: Absolutely. And we see that value creation is so important. If you can show value to an educator... we are educators, what we wanna do is educate, and so sometimes we have a convincing job to do. But we did a PD in Glasgow, so our fifth country, which is super cool, and they were really nice to give us feedback on the session, and out of the feedback there were like five times that the

    160

    00:42:11.580 --> 00:42:24.269

    Amanda Bickerstaff: teachers said, I now see the value for my students; I was scared of doing this with my students, because they found value in it themselves. So I think this idea of finding that thing that's going to... if you hate rubrics,

    161

    00:42:24.530 --> 00:42:49.039

    Amanda Bickerstaff: or you don't like doing newsletters, or you need to think about setting out a plan, or you don't even have a good idea for your bulletin board if you're an elementary school teacher, these are the opportunities. But when that happens, where do we transfer that to the bigger thinking in terms of what this is gonna do at the student level? Because, again, that is where we really need to start going more and more, especially as we get into the next semester.

    162

    00:42:49.040 --> 00:43:01.199

    Amanda Bickerstaff: So, Samantha, you're gonna close us off before we go to the questions. We have so many good questions, but if you have more, please put them in the Q&A, and then we'll make sure that we get to them. So, Samantha.

    163

    00:43:01.760 --> 00:43:22.170

    Samantha Armstrong: And it does come down to that district level. So it's kind of a neat place to think, what's that one administrative task that you need? But collaboratively talk to your grade level, talk to your department: what will make the bigger difference for us? Because, again, we're not isolating ourselves, just having a conversation with the chatbot; we're actually sharing that information and working with other people. But then, from the kid point of view,

    164

    00:43:22.170 --> 00:43:32.879

    Samantha Armstrong: they love poking holes in things, they love telling you something's wrong. They love to go back and say, okay, so this is what the, and then you name the tool, provided for us.

    165

    00:43:33.020 --> 00:43:50.670

    Samantha Armstrong: But now let's change the point of view, or debate it. Let's just go ahead and say we think that, you know, self-driving cars are great, but the Department of Transportation in Pennsylvania doesn't. And then it's conversations, it's speaking, it's listening, it's critical thinking for kids that never

    166

    00:43:50.670 --> 00:44:08.090

    Samantha Armstrong: maybe knew where to start. Now they have a reason. They have the novelty and the variety. As a teacher, I can think of myself when I was trying to differentiate things: this is awesome, and then my real-time data, going through what everybody needs, and then that lasted for X number of minutes, and now I need to do that again.

    167

    00:44:08.090 --> 00:44:29.610

    Samantha Armstrong: But now I can do that with a framework that I know I can work from to really, truly meet those kids' needs. So I think, get that administrative task, get that one strategy that you know takes you way too long, that you know is gonna make a big difference in your kids' lives. Go there to start and then work towards the other pieces. But just look at it as the guidelines and the guardrails, because

    168

    00:44:29.710 --> 00:44:39.990

    Samantha Armstrong: to say it's static... it clearly isn't. It's gonna keep on moving, and we need to be flexible, but understand what the why is and what we really need to accomplish for our kids and for our staff.

    169

    00:44:40.260 --> 00:44:53.160

    Amanda Bickerstaff: Yeah, oh, man, I agree completely. And I think this is the unique place we're in, right? Because there are so many different approaches. But I think that there's a real...

    170

    00:44:53.160 --> 00:45:17.939

    Amanda Bickerstaff: use us as your... you know, whether it's Brett working at the kind of strategic level, or it's Samantha and Patrick working at the actual school and district level, there is a huge theme of: let's just get started. Find the way in, and once you find the way in, then you can really extend and expand and have a longitudinal piece. I started to think about us at AI for Education as change managers,

    171

    00:45:18.200 --> 00:45:28.720

    Amanda Bickerstaff: like I never thought that's what we were trying to do, but it feels like a change management process, and I think that's what we're talking about. A lot of times with change management, the worst thing you could possibly do would be to come in and say,

    172

    00:45:28.880 --> 00:45:29.790

    Amanda Bickerstaff: Here,

    173

    00:45:30.310 --> 00:45:54.820

    Amanda Bickerstaff: you know, suck it up, everyone, you just gotta do it, this is it, this is what it is. Especially when we have so much noise. So, if you have to head off, because I know we're coming to the 45-minute mark, I hope you feel inspired by this panel. We're all at different stages, but there are so many ways into this, and the more that you can do it in a consistent and systematic way that prioritizes and privileges

    174

    00:45:54.820 --> 00:46:19.600

    Amanda Bickerstaff: failure and experimentation and value creation and efficiency, you're in your best possible spot to really start rethinking what this is gonna look like in your education system going forward. So thank you, everybody, for joining. If you have to head off, we totally understand. If you can stay, we have so many good questions, and we have 15 minutes, so we're gonna start with

    175

    00:46:19.600 --> 00:46:47.089

    Amanda Bickerstaff: a question from Amanda. She asks how we speak to those who are hesitant to use GenAI because they fundamentally do not believe it's ethical, because of how the actual models were trained. Right? Like, these models themselves are potentially really deeply harmful in the way that they've been created. So I'm gonna go to either Brett or Patrick, who I feel like would want to answer this. So either or, and Samantha, if you also want to answer. But does anybody have a strong want to answer this?

    176

    00:46:47.360 --> 00:47:12.040

    Dr. Patrick Gittisriboongul: Yeah, I would say, you know, we conducted a speed debate on AI ethics in our first task force, just so you know. And so I would ask the question back: are you comfortable with AI being used in the healthcare industry? Are you comfortable with AI being used for personalized marketing? Whether you aren't or you are, I'm gonna keep asking those questions, because AI is in everything,

    177

    00:47:12.210 --> 00:47:15.699

    Dr. Patrick Gittisriboongul: right? So whether or not you're comfortable with it now,

    178

    00:47:15.860 --> 00:47:34.919

    Dr. Patrick Gittisriboongul: you're gonna have to be, because it's in your phone, it's on your computer, it's everywhere. Think about it: cameras that are in stadiums, cameras that are out in the community, they are AI-enabled as well. So I would say, let's have that discussion, let's have that debate, because that's where it's at. AI is here.

    179

    00:47:34.920 --> 00:48:01.119

    Amanda Bickerstaff: Yeah, so Patrick is a little bit on the get-on-the-bus-or-get-left-behind side. I mean, some of this is just straight-up pragmatic. If you search for a pair of pants on, like, Amazon, and you suddenly see pants everywhere, that is artificial intelligence, and it's been around for a hot minute. And so, if you're comfortable with Amazon serving you that,

    180

    00:48:01.120 --> 00:48:09.659

    Amanda Bickerstaff: maybe you're not, but maybe you at least accept it. This is different, but it's that same foundation. So I think that, you know, sometimes we have to call it like it is.

    181

    00:48:10.500 --> 00:48:24.390

    Brett Roer: Yeah, I just wanna echo something. Patrick said this earlier, right? The biggest challenge amongst many is the biases that are embedded into AI. So I wanna address this question, I think, from two perspectives, right?

    182

    00:48:24.430 --> 00:48:26.849

    Brett Roer: One, this is

    183

    00:48:26.920 --> 00:48:28.559

    Brett Roer: a mindset.

    184

    00:48:29.000 --> 00:48:36.750

    Brett Roer: There are inherent biases across education and in society. I was a social studies teacher in New York City,

    185

    00:48:37.160 --> 00:48:51.360

    Brett Roer: teaching a curriculum that wasn't very inclusive, to a population where none of the students could relate to the people they were learning about in history. So this is not a new issue in education. But also, for the first time... I couldn't rewrite that history book; I would have to find resources outside of it.

    186

    00:48:51.430 --> 00:48:52.550

    Brett Roer: I now

    187

    00:48:52.610 --> 00:48:59.830

    Brett Roer: can use AI to actually create inclusive stories that are representative of the populations that I serve.

    188

    00:49:00.430 --> 00:49:08.049

    Brett Roer: I know we all know this, but you can actually create translations for students. You can use colloquialisms from their native

    189

    00:49:08.240 --> 00:49:17.400

    Brett Roer: parts of the country. Right? So Spanish obviously has variations depending on where you live. You can write things for that student and ask it to use expressions from that

    190

    00:49:17.600 --> 00:49:22.670

    Brett Roer: part of the country. Is it perfect? No, but better than mine, and also

    191

    00:49:22.930 --> 00:49:39.200

    Brett Roer: highlighting the fact that you also can be part of that change by being very intentional about what you put into that AI tool. So you can state, for example... we know this, or maybe not everybody does, but if you use DALL-E and you ask for certain professions, you are going to get very skewed gender and cultural

    192

    00:49:39.430 --> 00:50:01.010

    Brett Roer: pictures that are just, you know, biased, sexist, and racist. And so you can write those things into your prompts, stating, I would like you to create a picture that looks like this, because these are members of society in these roles. And, you know, it's a small part to play, but it's a way you can show

    193

    00:50:01.100 --> 00:50:05.199

    Brett Roer: even students what the cultural biases are right now around this field:

    194

    00:50:05.430 --> 00:50:14.430

    Brett Roer: how can you make sure you can train AI, or how would you make a chatbot have those culturally relevant discussions? So anyway, I just want to say, meet people where they are.

    195

    00:50:14.520 --> 00:50:27.100

    Brett Roer: Show them. Also, the last thing I'll say is, AI does have some things that are not biased. So just, for example, grammar and fluency rules are not biased. So show someone, you can use this as a way to get your students from here to here:

    196

    00:50:27.150 --> 00:50:40.160

    Brett Roer: meet them where they are and with what they do accept as technological improvements, and then show them the other tools. But yes, go slow. You're not gonna change mindsets overnight around AI. But use that power to convince them that they play a role in where AI is going.

    197

    00:50:40.500 --> 00:51:07.300

    Amanda Bickerstaff: Yeah. And then there's another question I'll just quickly answer, about whether we have ways to get students thinking critically around these tools. One of the most important things we can do is actually show students hands-on. Like, I did a video where I uploaded my image to GPT-4 with vision, and it told me I should smile less, talked about my hair and healthiness and said I should run miles, and said that Dan was the CEO, not me,

    198

    00:51:07.300 --> 00:51:17.190

    Amanda Bickerstaff: and the kids that I showed that to, the level of absolute horror... it stopped them in their tracks, and they were like,

    199

    00:51:17.190 --> 00:51:35.520

    Amanda Bickerstaff: I need to think about this differently. And I think it's hard to talk about bias, because it's everywhere, and sometimes it's explicit, sometimes it's harmful, a lot of times it's harmful, let's just say it's mostly harmful, if not always harmful. But it's hard to see and understand. But you can actually show someone

    200

    00:51:35.520 --> 00:51:51.579

    Amanda Bickerstaff: what American biases are built into the models that have been created. And I think that is actually really interesting, because I think we've never been able to do that before, but now you can. There's a meme right now, that little thing where you make it the most Turkish or whatever, and it is so

    201

    00:51:51.590 --> 00:52:14.710

    Amanda Bickerstaff: racist, and also deeply misogynistic and sexist, like it's never a woman, that kind of thing. Something like that can really create these spaces where students really understand what this means in a deep way. Okay, so I'm gonna go to Samantha. There's a question from Rob about shifting from this idea of assessment as a final product a student submits to assessment as a process,

    202

    00:52:14.840 --> 00:52:22.149

    Amanda Bickerstaff: and how do you think about that when you're talking about instructional coaching around the integration of AI into actual assignments and supporting student learning?

    203

    00:52:22.690 --> 00:52:50.399

    Samantha Armstrong: Well, I think when you look at it as the process... especially for writing, teaching writing can be painful when it's not concrete to them. They don't understand what each step means. When you say to revise it and hand it in, that means write it more neatly, or type it again and just change a word. And so now you're saying, okay, let's just focus today on this one piece, and iterating with them: okay, what if we change the tone of this?

    204

    00:52:50.550 --> 00:53:07.079

    Samantha Armstrong: They may not have ever known what tone was before, but this is suspenseful, no, this is... So now you're also utilizing the vocabulary that they're gonna need to use in their own writing. And then you're changing, whether it's point of view, whether you're walking through those ideas, but you're showing them those things,

    205

    00:53:07.080 --> 00:53:23.529

    Samantha Armstrong: and they're actually physically seeing those things happening, and then giving them time. Okay, now for that one little piece, you're chunking this, you're working in that process way instead: here's where I started, here's where I ended up, and reflecting throughout, which we always want them to do anyway:

    206

    00:53:23.570 --> 00:53:38.810

    Samantha Armstrong: turn to a partner, talk to them: this is why I added this, this is why I changed that, I'm not sure what to do here, and have those conversations. So yes, you're using it to iterate on something, but you're also continuing to get them to push themselves further and say,

    207

    00:53:39.060 --> 00:53:42.530

    Samantha Armstrong: this is a great starting point. I never would have known where to begin.

    208

    00:53:42.810 --> 00:53:54.539

    Samantha Armstrong: now this is where I'm going to take it. And giving student voice to something that they may never have wanted to engage with, because they wanted to just sit back, they didn't want to talk about it, but now they can have that conversation and say, here's why this is mine.

    209

    00:53:54.740 --> 00:54:01.280

    Samantha Armstrong: And that's a completely different way of looking at things that maybe they never felt comfortable trying before.

    210

    00:54:01.900 --> 00:54:24.910

    Amanda Bickerstaff: Absolutely. I mean, just the safe spaces to land are really interesting with this, and even differentiating, where, you know, you could give everybody a ChatGPT pass, let's say they're 13- to 18-year-olds, with permission, with training, and you say, you get to use it once in an assignment. But if you really struggle with getting started, I'm totally okay with you using it for thesis statements, but you have to tell me your decision-making.

    211

    00:54:24.910 --> 00:54:37.320

    Amanda Bickerstaff: But if you have trouble with the actual body, then I want it to help take your really well-formed thoughts and put them into something more refined, or help if you need some support with editing.

    212

    00:54:37.320 --> 00:54:56.029

    Amanda Bickerstaff: And I think that's where we can even differentiate support and have students make choices, and help them make choices and learn to understand what they need help with, which is also really hard to do. I think we get to college and we don't actually know what we do well and what we don't do well, right? Like, we're surprised that, oh, without support, I actually really can't start a paper.

    213

    00:54:57.340 --> 00:55:20.590

    Amanda Bickerstaff: This one, this is why nothing we do is a paper. I didn't know that till I went to college, let's just say that. But I think this is really interesting in terms of how that can be really strategic and shift and reshape what learning is and what support is. So we have some more questions, and we're gonna do a little bit of rapid fire. I'll ask one of you each of these questions, and if you hate it, you can pass. Okay. So the question from Jane is very, very tactical:

    214

    00:55:20.590 --> 00:55:28.150

    Amanda Bickerstaff: what do you think about actually mandating AI literacy and a set number of PD hours on AI?

    215

    00:55:28.150 --> 00:55:29.170

    Amanda Bickerstaff: What do you think?

    216

    00:55:29.700 --> 00:55:31.799

    Amanda Bickerstaff: I'm gonna go to Patrick.

    Dr. Patrick Gittisriboongul: Agreed.

    217

    00:55:31.810 --> 00:55:36.089

    Dr. Patrick Gittisriboongul: Yes. California already mandates that now, too. So

    218

    00:55:36.200 --> 00:55:39.089

    Amanda Bickerstaff: How much do you think, if you're gonna give a number?

    219

    00:55:39.250 --> 00:55:43.049

    Let's say, a number of PD hours.

    220

    00:55:43.610 --> 00:55:45.110

    Dr. Patrick Gittisriboongul: Like four hours.

    221

    00:55:47.220 --> 00:55:50.900

    Amanda Bickerstaff: Brett, Brett got out of that one. That sounds big. What do you think, Samantha?

    222

    00:55:51.790 --> 00:56:06.899

    Samantha Armstrong: Well, I do think that we need it, and I would say at least that amount of time, because I know that when I talk to the assistant superintendent, I'm saying, let's get everybody to have that foundation, you know, let's just start there so that the conversation can begin. And it is a necessity to have just that common ground.

    223

    00:56:07.870 --> 00:56:21.889

    Amanda Bickerstaff: Absolutely. Okay. So Nick asks, and I'm gonna go to Brett: do you foresee districts implementing chatbots, similar to the popular LLMs, with students and teachers, or purchasing specific education tools that utilize AI?

    224

    00:56:22.730 --> 00:56:28.700

    Brett Roer: Yeah. So I think this is one of the more fascinating things. I'm excited to see where the next few months go, because

    225

    00:56:28.740 --> 00:56:40.029

    Brett Roer: so many ed tech companies and so many solutions have embraced AI in a good way, right? They're trying to make their solutions better, more personalized for student learning and for, you know, teacher support.

    226

    00:56:40.260 --> 00:56:54.240

    Brett Roer: However, there's also... now, again, we've named so many innovative free resources, and people can kind of make those on their own. Not everybody has that skill set right now, but the fact that those can now be, like, open-sourced and free

    227

    00:56:54.350 --> 00:57:07.419

    Brett Roer: is a really interesting point in society, especially in the educational space, where you might not need to have these large district contracts and purchase orders, because some of those things can now be created through ChatGPT very simply. And once

    228

    00:57:07.420 --> 00:57:26.769

    Brett Roer: someone does that and is able to share it in a way that's replicable, maybe TikTok-like, it's gonna scale, and teachers are gonna go to that before they go to those large district partnerships. So I'm really interested to see where that goes. Amanda, there's one thing I want to say about that last thing you just asked, about the number of hours. I don't have an answer for the number of hours, but PD cannot be talking at people about AI.

    229

    00:57:26.810 --> 00:57:35.240

    Brett Roer: The number of hours that matters is, how much time are you giving them to play with AI? Not how much time are they sitting in a conference room or the cafeteria or the auditorium

    230

    00:57:35.810 --> 00:57:40.509

    Brett Roer: having someone tell them about AI. So that's what matters to me. I hope California is doing that.

    231

    00:57:41.490 --> 00:57:46.870

    Amanda Bickerstaff: I totally agree. I mean, we will not do

    232

    00:57:47.050 --> 00:58:08.729

    Amanda Bickerstaff: a first intro, A, under 75 minutes, and B, without hands on keyboard, even if someone is willing to pay us. And of course, if you would like to pay us, we're very happy to do that. But it has to be hands on keyboard; we refuse to do it otherwise, because up to that point it has to be hands-on. Okay. Last question, rapid fire:

    233

    00:58:08.910 --> 00:58:14.380

    Amanda Bickerstaff: a lot of policies at schools are deficit-based. How can we make them asset-based?

    234

    00:58:17.410 --> 00:58:25.639

    Amanda Bickerstaff: Anybody wanna take that? How do we frame it more positively? I can always answer too, as a person who's thinking about policy all the time, depending on what you guys want.

    235

    00:58:28.050 --> 00:58:39.609

    Dr. Patrick Gittisriboongul: I think we gotta create a culture of making sure that we identify the strengths in our kids. Culture starts with the behaviors and actions of the adults in the room. So if the adults in the room don't have that mindset

    236

    00:58:39.700 --> 00:58:56.410

    Dr. Patrick Gittisriboongul: out, so that they don't need to be there. So why are they in public education, in the first place? So that's the first thing I would say so. Sorry I'm it's it's rapid fire guys. No, I think it's absolutely I mean, it's it's real like, II totally agree, like, I think, that

    237

    00:58:56.490 --> 00:59:04.670

    Amanda Bickerstaff: we need to be realistic, but also it's about a culture of change and openness. And I think that we're gonna learn a lot about where schools are

    238

    00:59:04.680 --> 00:59:29.599

    Amanda Bickerstaff: over the next couple of years, in terms of who is taking the risks, who is redefining learning, who is creating more asset-based or strength-based approaches to these tools. And I think that's just really interesting. So I'm gonna stop there, because let's stop with an area of strength, because I think this is an opportunity, and this is why we love this series. So first of all, thank you to everyone on this panel. Not only

    239

    00:59:29.600 --> 00:59:54.419

    Amanda Bickerstaff: am I so privileged to have you as part of my community and people that I consider very important in the world of education, but also as part of the AI for Education journey. Really appreciate that. And I want to say thank you to the audience as well, because I know your days are busy. But I just wanna say thank you to everyone here for sharing your insights, for being practical, and for saying, you know what, there's actually not one way, there are many ways. So find that way in, that

    240

    00:59:54.420 --> 00:59:55.440

    small step

    241

    00:59:55.440 --> 01:00:11.050

    Amanda Bickerstaff: that way to make things better for once. That's what we want. So just thank you to everyone, for here everyone at home go to bed if it's too late. If you have a day ahead of you, enjoy like, if you're ready to have dinner, do that as well. Just really appreciate everybody and hope to see you at our next webinar. Thanks, everybody.