Education, Social Cohesion, and Democracy in the Age of AI

"There are lots of reports and events about AI and AI and education–few if any have focused on what the emergence of AI means for social cohesion and preparing young people for democracy.

Educators and decision-makers alike will have to wrestle with how AI will interact with the competencies we expect of our learners, how we judge the quality of our educational investments, and how to adapt educator practice and policy. Join our conversation with educators and experts discussing these issues as well as the upcoming report, "Uncharted Waters: Education, Social Cohesion, and Democracy in the Age of AI."

  • Amanda Bickerstaff

    Amanda is the Founder and CEO of AI for Education. A former high school science teacher and EdTech executive with over 20 years of experience in the education sector, she has a deep understanding of the challenges and opportunities that AI can offer. She is a frequent consultant, speaker, and writer on the topic of AI in education, leading workshops and professional learning across both K12 and Higher Ed. Amanda is committed to helping schools and teachers maximize their potential through the ethical and equitable adoption of AI.

  • Mark Stolan

    Mark Stolan is a lifelong educator with a strong passion for helping educators maximize their potential with technology. He currently serves as the Platform Coach for New Tech Network, a non-profit organization that helps bring project-based learning to schools all across the country. In addition to his expertise on Learning Management Systems and leveraging technical tools in a PBL environment, Mark brings experience as a founding member of the Lobo School of Innovation, a New Tech Network school in San Jose, CA founded in 2016. He also got the unique opportunity to be a math teacher for 8 years and an assistant principal for 3 years at the middle school he attended as a student! Mark is a passionate believer in the ability of technology to shape the education experience for all students and is a strong advocate for integrating new technological tools into the classroom.

  • Ace Parsi

    Ace Parsi serves as the Director of Coalition Engagement at iCivics. He first got exposed to iCivics through serving on the Steering Committee and the Implementation, Pedagogy, and Curation Task Forces of the Educating for American Democracy project. Prior to this role, Ace held a number of policy, research, and school-based positions including as Director of Innovation at the National Center for Learning Disabilities, Director of Deeper Learning at the National Association of State Boards of Education, and Director of Policy at the National Service-Learning Partnership. Other organizational stops on that journey included equity-focused work at the Alliance for Excellent Education, Policy Analysis for California Education, and Fair Oaks Community School in Redwood City, California.

    Ace’s civics journey began when he and his family immigrated to the US from Iran when he was eight. His own experience as an English language learner and a free and reduced-price lunch student led him toward a passion for using education as a driver of greater equity and inclusion. He holds a Master of Public Policy from the Goldman School of Public Policy at the University of California, Berkeley. He is a proud resident of Morgantown, West Virginia, where he resides with his wife, Clare, and daughter, Ella.

  • Amanda Bickerstaff: Hello, everyone! Welcome, we're gonna be getting started in just a couple of minutes. We're glad to have you here. It's gonna be a good, strong little group, and we're really excited about having you all here, as we're focusing on something that I think is ultimately about as important a topic as we can possibly have right now. If you wanna say hello, please do; I see some familiar faces, so glad that you can join us.

    Amanda Bickerstaff: Yes, we are really excited to have you all here today. I called this presentation Democracy and AI. As you all know, if you've been with our webinars before, we're really focused on the impact of AI on education, but also on responsible AI. And when I met Ace, who's gonna be featured in just a minute, about a month ago, I was really pleased to see CivXNow, a large coalition of organizations focused on building students' civic awareness, their skill sets, and their connection to our democracy. It was really pretty amazing. There's a new report that was released just two weeks ago, so we're gonna have a bit of an overview of what that report is, and then we're going to have some time with some special guests who aren't on screen yet, including a student. We love our student voice here, so Saanvi is gonna be sharing her perspective on why this is so important from a student's point of view. As always, say hello and get involved. We're gonna be a tight little group, so say hello, let us know where you're coming from, and share resources if you have them. What's really cool about today's webinar is its focus on a key piece of content you can take back to your context. Not only do we have this report, but we also have all of these amazing resources that are really practical around what this moment in time means for our students and our schools. So what I'm gonna do is stop sharing now. I'm gonna figure out how to make sure the chat is working, 'cause I think that's probably an issue. But first I wanna introduce Ace. Ace, I always ask the same question: I'd love you to introduce yourself, but also, what was your first experience with generative AI?

    Ace Parsi: Absolutely, thanks so much, Amanda, for the time. My name is Ace Parsi. I'm the Director of Coalition Engagement here at iCivics, which is the organization founded by the late Justice Sandra Day O'Connor, who deeply believed that schools have a core mission to our constitutional democracy. And I share that belief. I'm a first-generation immigrant; I immigrated from a country whose people are fighting for democracy every day. And so, as the song goes, you don't know what you've got until it's gone.

    Ace Parsi: That's part of what we're here for: there's an inflection point for our society here. My first experience with generative AI was listening to an interview with Sam Altman, who was talking about people not understanding the concept of exponential growth, and that this was all rising in an exponential way. I thought about that because we were just coming out of Covid, and he was comparing it to Covid as another example. So I got on the site and started looking at ChatGPT, and I started looking at the comments. People were saying that this was the most important invention since fire. And I thought to myself, we're talking about this in a very playful way, and as we should; there are some things that are just really fun about it. But there are also really important implications for our democracy.

    Ace Parsi: We made a mistake 12 or 13 years ago, when we put the most powerful tool humanity had then created into middle schoolers' back pockets without any sort of training. We can't do that in this case with AI, because this has incredibly important implications for how we come together as communities, and for our democracy as a whole, its long-term health and fate. That's, I think, the level of importance of this particular conversation. And so that was my first experience with it.

    Amanda Bickerstaff: Did you have a first prompt? Did you try it out? What was your first go?

    Ace Parsi: I think it was something about writing a valentine to my wife as a country music song, and my daughter had put something in there about a rabbit librarian as an image prompt. So those were the things where I thought, oh, this is playful. But I'm glad I get to go back into my professional work and think, with others who really care about this issue, about what we do with this to make sure we take advantage of the positives without the challenges. But the librarian rabbit, I think, was probably the first thing that I saw.

    Amanda Bickerstaff: I love it. I always love everyone's first prompt. As we get started, the way this is gonna run is that we're going to have Ace run through the report and focus on some key resources you can bring back to your contexts. I know we have people from all across the spectrum here, plus those who will be watching later, so I put the link in the chat so you can follow along. Ace, it's over to you now.


    Ace Parsi: Thanks so much, Amanda.

    Ace Parsi: So I wanted to note that our coalition, the CivXNow Coalition, which some of you will be familiar with, now includes over 330 organizations, and it's a pretty diverse set. We have the Ronald Reagan Presidential Foundation and the JFK Library, the Urban League and the Cato Institute. We have pro-CRT and anti-CRT type organizations, and they all believe that schools have this mission. And we were partnering with what I call our sister coalition, the Listen First Coalition. They're having an event, the National Week of Conversation, actually right around AI Literacy Day, so hopefully that could be a part of this work as well. There are over 400 organizations there that focus on social cohesion. So our imperative was around that intersection: K-12 education, the civic mission of education, and how it plays into our democracy and overall social cohesion. I want to highlight that Mark was part of this group, and we had some really interesting, diverse thinkers on this topic.

    Ace Parsi: I will urge you, as I mentioned in the webinar we did during Civic Learning Week: the report itself is, I think, 10 pages, and there are some collateral products that are 2 pages. I encourage you not to put it through AI but to actually read it, think about it, and be part of this conversation, because this is where these sorts of conversations begin.

    Ace Parsi: Our beginning imperative was this: there are some people out in the field who will speak about AI in apocalyptic terms, and there are some people who will say this is the greatest thing since sliced bread. We think that it could be both.

    Ace Parsi: And which one will win out is not going to be accidental; it'll be intentional. It'll be based on our intention and the proactive actions and conversations that we have: we the field, we the AI-for-education field, we the CivXNow field, we the Listen First field, around how this plays out. So we wanted to embrace that there are both really positive realities for our democracy and also challenges that we have to address.

    Ace Parsi: Then in our conversation, and Mark, I'm taking us down memory lane right now, one of the first things that came out of it was that everything is civic, right? People having jobs is civic. Everything in some context, even the arts, is civics in some way. So we had to draw some boundaries around that. What we said this group and this work would focus on is K-12 civic learning, so not higher ed, not workforce and AI. And there are a lot of issue-specific questions, like LGBTQ rights, school districts, free speech, and what all of that means; what we wanted to focus on was not the issue-specific but the transferable applications.

    Ace Parsi: Note that we didn't wanna go beyond the ideas and expertise that we have, so we really focused on the humanities. We have the National Council of Teachers of English and a lot of people in the civic space and others, so those are the areas this is coming from, and our questions really related to those areas.

    Ace Parsi: One is, how can the civic learning field embrace the power of this for good, in very Spider-Man-like terms, and how do we make sure that young people are really prepared to use this tool for the betterment of social cohesion and democracy? And then, what are the applications? We are thinking about this for schools, but the Listen First Coalition is a 400-plus coalition of organizations that don't just care about schools; they care about broader social harmony, cohesion, and bridging divides.

    Ace Parsi: So we wanted to see the applications, getting back to that point that this can have both negative and positive implications: what are the actions, and what are the realities that we can look to? There's a 2-pager that Amanda will share in the chat as well that draws this out in more text. But basically, for us to be successful as this technology changes, there were some specific lenses we wanted to focus our actions through, whether those actions are practice, policy, research, or otherwise. One is that the negative challenges will come if we're ethically vague about what the technology is going to do and we just take a wait-and-see attitude. But if we can incorporate young people into the ethical conversation at the very outset, they can be part of that conversation and really shape something that will have massive implications for their lives for years to come. They need to be part of that conversation.

    Ace Parsi: We noted that there are uses of AI that can be very individualistic, and that can actually shrink our communities. So in instruction, in how we practice that sort of civic learning, we wanna make sure AI use is collaborative; there are social cohesion and civic benefits when it's collaborative and we engage students together in this work.

    Ace Parsi: We know that this is changing every single day. I think there was one meeting where, while the meeting was happening, OpenAI announced that ChatGPT could look at images. So a lot of this is changing, and our school systems are not really good with change. We need to set up processes so that our reactions can be more dynamic.

    Ace Parsi: One of the things that's a big challenge is that AI gives answers in very authoritative terms, so students can passively accept that input, and we know that can be a huge problem for misinformation and disinformation. We wanna make sure that students are interrogating the AI content; it's really important for information literacy. And students aren't just consumers of this AI output, they're also producers of it, in the same way that with social media students don't just consume information, they also produce it. And that has both positive and negative implications for our society.

    Ace Parsi: We know that there are students who can use AI to cheat or produce their work, so what does that mean for our assessments? We need to make sure that our assessments are designed in a way that draws out original insights and perspectives from students. And because AI gives that sort of disciplinary knowledge, and this is really important, we're gonna have to think about this across disciplines. You are not just a civics teacher if you teach in the social studies; you are also a civics teacher if you teach in the English language arts, the math, the sciences. All of these imperatives I'm highlighting here have implications across our different disciplines.

    Ace Parsi: And so then we talked about this: being civic-ready isn't just about knowing something. It's about the combination of knowledge, skills, dispositions, and behaviors. Knowledge of our history and our heritage; skills like information literacy, assessing reliability, critical thinking, and collaboration; dispositions like appreciation of free speech and differences, civility, and curiosity; behaviors like being an informed voter (and we're in an election season), serving on juries, and volunteering.

    Ace Parsi: In the same way I highlighted earlier that this could have both positive and challenging implications, we tried to dig deep and ask, well, what are some of those positive and challenging implications? Amanda can share the 2-pager that has those broken out in more detail, but just to give the executive summary: on knowledge, students can engage more deeply with knowledge and content and have more ready access to it.

    Ace Parsi: An example: my daughter reads this book, Nathan Hale's Hazardous Tales, and there was a historical figure from Oregon in it, and she debated them. We gave the AI the opportunity to play that role. So it's a positive thing to be able to access knowledge that way, but it can also be a tool for disinformation and misinformation. On skills, it can help prioritize these sorts of civic outcomes, but without algorithmic or digital literacy, those critical skills may not be developed, and students can just passively soak this information in. It can also serve a social cohesion purpose. One of the things that becomes particularly challenging in this time is that we don't talk with people who have differing views from our own, and you can use AI for that purpose, and that can be really good.

    Ace Parsi: But it can also, depending on the deepfakes and other things that go along with it, breed greater distrust and division. Amanda and I emailed back and forth a bunch of times setting up this presentation, and if it was just Amanda using her AI to email me, and I was using my AI to email her, that would have an implication for our relationship. And then the behaviors: it gives opportunities for young people to engage in more kinds of things, but it can also undermine that trust in each other in that same sort of way.

    Ace Parsi: So what do we do with that? One of the things we felt heartened by was that the best practices in civic learning that we know, youth voice, inquiry-based learning, not sacrificing knowledge (that historical knowledge is really important), opportunities like service learning, experiences, and extracurricular activities, all those things are still relevant. But now we have to think about them and ground them in how AI can affect those sorts of experiences. And to the extent that our North Star is those relationships, being able to ground our work in that can be the guide for us to move forward.

    Ace Parsi: I'll talk a little bit about what we can do. I am not one of those people who thinks we can just leave it to the technology companies and everything will be okay, and I don't think they think that either; one of our great partners in this whole work has been SIIA, the software industry association. We need an investment in technical assistance and professional development. We need guidance on the challenges, not only what schools can do about cheating on an assignment, but what this means for our social cohesion. We need that guidance, and we need a major investment.

    Ace Parsi: Currently, for every $50 we invest as a country in STEM learning, and I do not want to be one of those people who pooh-poohs STEM, we invest the equivalent of 50 cents in civic and history learning. That is not proportionate to the challenges we're gonna face in this particular area. So we need an investment, and we need to be able to advocate for that.

    Ace Parsi: We need support, because a lot of our educators don't have the grasp that some of you on this call have, and we need to use our leverage, not just as funding, but also to recognize teachers, students, schools, and frankly technology companies that are doing this work really effectively.

    Ace Parsi: We ended with questions. There was a little bit of a debate internally about whether we should even have this section, because things are changing so fast that some of these questions may already be answered, and we wanted this to be timeless. But I think a lot of these questions are still relevant: how are statewide standards around AI and information literacy affecting behavior?

    Ace Parsi: Standards are a good lever, so what do high-quality standards look like? How do we prepare our educators, and what are the qualities of that preparation? I don't mean that just for AI generally; I really mean it at the intersection with social cohesion and democracy. There are specific practices, and we need training around those practices.

    Ace Parsi: Has AI's ubiquity changed how teachers interact with their students and what they think students need? Because it's not just about resources; it's about how teachers conceptualize their role more broadly. What are the practices that can be implemented? And, broadly speaking, what does safety for students look like in the future? So thank you for this time. I really appreciate it, and I'm looking forward to the conversation.


    Amanda Bickerstaff: That's amazing.


    Ace Parsi: Yeah, my screen.

    Amanda Bickerstaff: First of all, for those of you doing this work already, there are a couple of things that really stick out to me, and I saw that in the chat, too. One is this idea that we are on a spectrum. We tend to talk about technology in two camps, like we're either a doomer or an optimist, but what we don't usually have is the right level of conversation to recognize that there is good and bad in all things, and that there's an intentionality that has to happen with us. We have to be intentional to ensure that the good things outweigh the bad things. And, as you said, social media and devices were a moment that we missed. I think we can all agree we really missed that moment for our kids specifically. It's impacted us all in really interesting ways, sometimes good and sometimes bad, but we've now seen significant, deep, even existential impacts of giving students and young people devices and social media. And I think this is that moment that we don't wanna miss. So I'm so grateful for the work of CivXNow and everything that was done, because we're not waiting for 5 or 10 years; we're saying, no, in the first year and change of this being a technology that's in our collective consciousness, we're gonna go right at it. And that's why I was so excited to have Ace share this. I'm actually gonna call up the rest of our special guests now.

    Amanda Bickerstaff: We have two special guests with us today: we have Mark Stolan and we have Saanvi. As you can see, it's a diverse group here, and we're gonna do a little bit of our traditional panel now that we've had that amazing foundation given by Ace, along with your 10-page paper and the couple of one-pagers around application, which I think is really great. Now we're gonna talk about what this really means. I always love that we have these supporting documents, but I think we have to have the stories, too. We need to talk to people and get those stories.

    Amanda Bickerstaff: So what we're gonna do is ask a couple of questions to everybody. We're actually gonna start with Saanvi. You get the same question that I ask everyone, which is: can you introduce yourself, and then also talk about the first time you used generative AI?

    Saanvi: Definitely. Hi, I'm Saanvi. I'm 19 years old, currently a sophomore at UC Berkeley, and I'm really excited to take part in this panel and hear about the amazing work that Ace and everyone has been doing in putting this report together. So congratulations. I guess my first time using generative AI, I probably didn't even realize that it was generative AI; I just had to do some quick research to double-check that it counted as AI. But the autofill on Google, like the email autofill, was probably the first time, and I didn't even realize that was AI. That, or even Apple autofill, was probably the first time.

    Amanda Bickerstaff: Yeah, and I think that's actually a wonderful place to start, because AI has been part of our lives for a long time, and it doesn't just mean ChatGPT. And to Ace's point, if you're suddenly not writing your emails anymore because you're just hitting the button, has that changed the way you're interacting with people? Has it made you more responsive, or has it made you less engaged, or a little bit of both? I think that's a really interesting place to go, especially since it's not just students who don't know these tools are artificial intelligence; I can tell you, from doing this work with teachers and leaders across the country, it's them, too. Okay, Mark, next: will you introduce yourself and talk about your first time with generative AI?

    Mark: Hi, thanks for having me join, everyone. My name is Mark Stolan. I am currently the Platform Coach for New Tech Network; we bring project-based learning to schools all across the country. I actually was one of the skeptics of AI. Right around the time I was asked to join Ace's CivXNow group, my daughter had come home from school and said, Daddy, have you heard of this thing called AI? My teacher said we could use it to make stories. And I thought, alright, if my first grader is telling me it's time to use AI, it's time to go use AI. I was working on another project for my work, and I needed to write an empathy interview, and the questions are always hard to write. So I told it, hey, ChatGPT, I need to make this empathy interview, ask me any questions, and I was able to produce an empathy interview that I used the following day. So pretty cool stuff.

    Amanda Bickerstaff: Very interesting, but also, man, first graders. We advocate at AI for Education for modeling best practice with even younger students, because they might be interacting with generative AI in Snapchat if they have access to those tools. So hopefully it was a modeling exercise, not just letting kids loose on ChatGPT 3.5 or 4, although I know there can be some fun to be had with some image generation with our younger people. So we're gonna start our panel, which is really great. The first question is to everybody, and we've seen this work done by CivXNow and this larger group: why do you think it's so important for us to have taken this time, collectively, to talk about the relationship between AI and civics education, but really just AI and the way that we teach our kids right now? So what do you think, Saanvi?

    Saanvi: Definitely. First, I think we should acknowledge that technology has been transformational for students. It's made education way more accessible, it's made extracurricular activities way more accessible, and, especially in the context of this report as we're talking about civic readiness, it's allowed us to share information at an unprecedented magnitude. It's been transformational for educating the broader population, not just students and young people. And particularly with young people and students, they are highly informed on issues that are very far away from them now, and it's generally improved their ability to connect with diverse viewpoints. It's also been great for education in the sense that students can learn from AI. I know a lot of my peers and I often turn to ChatGPT now if we want a quick answer to a question and don't wanna bother our teacher. It can also support teachers by lending a hand in cases where they might not be able to support every student individually; at least AI can provide clarifying information or additional resources that students can use in the classroom.

    Saanvi: So overall, AI and technology have been great for education, but I also think it's a double-edged sword when unchecked. We aren't even really sure of the extent to which AI has been harmful, and we have a few use cases we can point to where it definitely has been harmful. For example, in the context of misinformation: AI-generated misinformation exacerbates digital echo chambers and that phenomenon where young people specifically, but also individuals generally, can become entrenched in their own viewpoints without critically evaluating information or engaging in meaningful offline action. So critical thinking about civic topics might not happen as much as a result of AI's contributions to certain misinformation spirals. Also, the correlation between generally growing online consumption and exacerbated polarization has definitely contributed, especially when young people are engaging with these polarizing conversations online, to limited trust in our institutions among young people specifically. So there's growing youth civic apathy. We can even see this in the context of this year's election, where a lot of young people just don't align with the idea of, or understand the purpose of, this election; they don't align with any ideology.

    Saanvi: So I think what we're talking about today is really important, not just for understanding the harms associated with AI, but also for focusing on the positives of AI and technology, and how they can be better realized by understanding the potential for AI to be harmful. So yeah, I think that's why this is super important. I'm really excited to chat about it.

    Amanda Bickerstaff: Well, first of all, how articulate are you? Amazing. This is why I wanna take any time I have to say: if you're not talking to students, you're missing something. There are people who are gonna be watching this who have not talked to students about this moment in time. This is why you have to do it, because that was not only a well-articulated, thoughtful answer, but also something that you deeply experience as well. That's why I'm so glad you're here with us today.

    Amanda Bickerstaff: I also just want to point out that when GPT-4 with Vision was put into the world, they had a model card, and a model card is kind of the way these companies talk about the risks associated with these models. In the model card, it talks about the potential for significantly more disinformation, because of the ability not only to create an image that's never existed, but to create text that connects to that image and is also false, and they had no solve for it. There was no red-teaming fix, no way to fix it from what they did, and they still released GPT-4 with Vision into the market. So there is something to be said about the fact that these are not just things we're talking about from the perspective of students and teachers and leaders in the space; this is a known issue for those who have the power to create these tools. So now to Mark. That is a hard act to follow, I know.

    Mark: Honestly, Amanda has set me up for failure, having to follow both of those.

    Amanda Bickerstaff: After this, actually, you guys can leave; it's just gonna be Saanvi. I'm gonna hand off to you.

    Mark: That was the plan, right? That was the plan the whole time. I think the reason this conversation is so important is that our world is changing, and our educators need to be prepared to teach in that world. How many people still own the such-and-such For Dummies book, or own an encyclopedia? Those have long since gone the way of the dinosaur because of how the Internet changed things. If we are not aware of what AI can do, what AI can provide our learners and ourselves, it's gonna pass us in the fast lane, and we're gonna be the ones stuck with that encyclopedia set we were sold for $1,500 that just collects dust in the corner.

    Mark: I really think that teachers, as Ace mentioned in his presentation, are oftentimes the slowest to adapt to change. Covid was the first time that everyone suddenly had to use a learning management system, because you couldn't just put it on the whiteboard; nobody's in front of you at the whiteboard. Sometimes these big shifts caused by major changes, such as AI becoming much more prevalent in the world, force teachers to adapt and force educators to figure out: what is it that AI can do? What is it that AI can provide? And how do I make sure my students are prepared to use it in the future?

    Mark: I think there's an opportunity, but there's also a chance to do some real harm. There are a lot of schools banning AI because it's easier than trying to understand it, and if that happens, it's gonna do those students a disservice, because they're gonna have less access, less opportunity, and less practice. Ultimately they'll probably find a way to use it outside of school anyway, so what good is it to ban it and not allow them the opportunity to learn in a controlled environment?

    Amanda Bickerstaff: Yeah, and Mark, it's such a fascinating moment, right? We have this AI revolution coming right off the back of the Covid impact. I think what we saw is that there were a lot of changes that happened during Covid, but it was almost too short a period of time. I'll say I was locked down in Melbourne, Australia; I do not wish it had been longer. I was a prisoner of Australia for a couple of years, so that's not what I'm advocating for. But what we saw is that there was a rubber band effect for most schools: they went online, they started using blended and fully online structures, and then they kind of over-rotated back to pen and paper. So it's really interesting to now see this AI impact, because it's so consumer-driven. So many people are bringing it to schools regardless of whether it's a decision by a superintendent or school leader, and maybe we will see some of that connection, some of that readiness, some of that openness. And I will say that while schools are slow, it seems like the trajectory in schools is around the same adoption curve as industry, and individual use especially seems to be following that path. I would even say that for policy-wide and ethical guidance, schools are ahead of industry. So I think there is a chance for us to take a lead with reports like CivXNow's, but also the work that we're doing together. So thank you for that perspective. Okay, Ace, to you.

    Amanda Bickerstaff: So you helped build the thing, so you have many thoughts.

    Ace Parsi: Well, I mean, we all built it together. One of the points that we raised in the report was: be generous with us, because this was human beings having these discussions with our imperfect perspectives. I've lived in two societies in my life. One is an authoritarian regime, and one is a democracy. You can have an authoritarian regime without a foundation of civil discourse or any of that; you can't have a democracy without it. The whole experiment is based on a system of trust. If you don't trust each other, if you don't trust the institutions, if you don't trust that your vote counts, the whole thing is a house of cards, and it falls apart. So I think that with this new inflection point of this technology, we need to be very intentional about the context in which we're teaching it, and not be so entirely focused on the technical aspect of it, coding and those other things, but on this broader conversation.

    Ace Parsi: I was just sharing with Amanda, Mark, and Saanvi: I'm on my daughter's superintendent's advisory committee here in Monongalia County. Well, I shouldn't throw them under the bus. But I'm on the superintendent's advisory committee, and we were supposed to have different conversations that families really cared about, and I put in there that it's an election year, we should talk about democracy and what we do for social cohesion. And I got a response that this isn't an important topic.

    Ace Parsi: So I think we need to take this moment, and each of us needs to take our role in being advocates for this conversation, because this is a really important moment. Everything we rely on, our jobs, everything, is incumbent on the fact that we live in a society that trusts each other enough that we're still engaged in this sort of social cohesion and democracy. If that falls apart, our lives change fundamentally. For me, working in the role that I do, I really believe that the foundation and the strength of our democracy isn't happening with us adults in the room; it's happening with youth in classrooms today. And the decisions we make about what we prioritize, and the parts of these conversations that we need to be more explicit about, are really going to determine the outcome of that. So I think it's important for that reason. It needs to fall within a broader culture where this issue, this mindset, is important.

    Amanda Bickerstaff: Absolutely, and everyone, let's give Ace a virtual hug, 'cause I know that was a frustrating moment. We're all gonna send you good vibes right now, Ace. I do think we see that play out, though: we see this uneven readiness spectrum, and I think a lot of it comes from a lack of time and a lack of awareness of the potential impact. So actually, it's a perfect segue to the next question: from your perspectives, Saanvi, Mark, and Ace, what do we need to do to get people to listen? What do we need to do to get students, teachers, leaders, or even superintendents to listen to this important moment in time? Saanvi, are you ready to respond?

    Saanvi: Absolutely. I think right now, within this overall movement towards AI literacy and towards understanding AI more deeply in the context of civics, there's a little bit of hesitancy for young people to engage in the conversation, because sometimes their perception is that others in the movement want to eliminate social media, and young people are really fixated on the benefits of social media in their lives right now. I think there can be a bit of a narrative shift there in terms of emphasizing that this is about making platforms and technologies better and serving the needs of youth, rather than taking the alternate approach of removing those platforms, because, as we've talked about, there are so many benefits that should be realized, just in a more responsible manner.

    Saanvi: In line with that, I think we need to be creating the conditions for users across the board, not just younger users, to engage constructively and responsibly with AI. That comes down to breaking down what AI is. I don't think a lot of people understand what AI is; it kind of gets thrown around as a buzzword. I've seen a lot of tweets where people will say something and mention AI, and I'm like, I don't know if that's AI, but that's great. So that's another thing that probably needs to happen: it's definitely treated as a buzzword, and a lot of students themselves don't exactly know what counts as AI or how it works. So breaking down and demystifying technology across the board, and its relationship to you as a user and as a person existing in this society, is really important, and I think that will help create the conditions for users to engage more constructively and responsibly with technology. And as younger and younger people join these platforms, I know I first joined social media in high school, and my younger brother joined a lot younger than high school, so the age keeps lowering. I think education can be a key lever for addressing that, because schools are a consistent institution that can reach younger age groups right when they are beginning to engage with these platforms. So that means looking at AI literacy and information literacy, not just for students but also for educators, as we've talked about. And this goes beyond just the context of civics education: we shouldn't limit AI literacy to civics or the social sciences. This affects all subject matters, especially STEM, so investing in civics education, but also in general literacy education that is holistic and addresses all of these subject matters, is important as well.

    Saanvi: In line with that, I do believe that investing in civics education at its core is important. As we talked about earlier, a lot of the reason why engagement online, especially irresponsible engagement online, can be dangerous is that it might be rooted in distrust of institutions at large, and that obviously creates more polarizing conversations that are harmful. So showing students and young people that democracy works, that they're able to experience the idea of community, not just learn about it in the classroom, and that community is both online and offline: achieving that trust will be so much more realistic when students understand exactly what their role is as citizens and how that plays into their engagement on these online platforms, which are also just an extension of our democracy in a different way. I also wanna uplift one thing that was mentioned in the report: it's really important to look at how civics education can keep up with this transition from in-person to digital civic dialogue. That whole idea of keeping up, being sure that we're always one step ahead, or at least transitioning away from a reactionary approach toward one that's more preemptive and can anticipate the impacts of new technological advancements on education, will be really helpful, too.

    Amanda Bickerstaff: That's great, Saanvi. There's a lot to unpack there. I do wanna pick up one thing that I think is really interesting, which is this idea of how AI fits within the larger ecosystem that's already here, one that goes beyond social media to the fact that we have a lot of institutions facing a lot of distrust. And then what we've done is put out this new technology that can't yet be fully trusted, because new technologies, you're too young to remember dial-up, when if your mom got a phone call you got kicked off in the middle of your homework, are, by dint of being new, not very good when they first start. Even if they were the most responsibly made tools, the inconsistency, the unreliability, and the fact that they are so nascent would already make them hard to trust. What we have now is that idea times a magnitude, where it's so much a part of our lives: it's gonna be in Google Maps, it's in Snapchat, there's a fake Kylie Jenner on Meta. All of these things are happening, so we can't dissociate the idea of generative AI literacy from larger civics conversations.

    Amanda Bickerstaff: I think this is really the underlying point of the report, but also something to underline for everybody here: when you try to treat generative AI as an island unto itself, you miss out on what you can do with it as a Trojan horse for digital literacy, information literacy, and civics education, because it's such an amazing opportunity to show how these things work or don't work, the harm they could cause or the potential for good. It's actually a pretty easy way to show that. So your points are so important for those doing the work here and those listening to this call: how do you connect this so you're bringing in those viewpoints, but also ensuring you're not just talking about one thing with one goal, but actually looking at the bigger goal of creating a better society moving forward, not a worse one. So, Mark, to you again. A hard act to follow. Same question.

    Mark: Same as Saanvi. Next question. I think what comes to mind for me is the idea that AI has the ability to impact our lives beyond just the classroom, so we need to be able to understand it. I was at a conference, and there was an example of the South Korean president, who had just been elected and was not very popular with his people; for whatever reason, they thought he was cold. So he used deepfake AI to make these videos of himself doing karaoke and all these things to make him more personable and likable, and ultimately it worked. So not only are you dealing with the candidate using AI for himself, he's then publicizing all this generated content that isn't really him. So then there's this whole ethical conversation: is that okay? Are you allowed to make a deepfake of yourself for the purpose of selling yourself, or is that not okay? There's a world of possibilities that comes out of generative AI becoming part of our lives, so giving our young people an opportunity to see it in action, to see it impacting their lives, to see it in places beyond "it can write an essay for you," and being able to show both the positives and the negatives that come with that, I think, is a very important part.

    165

    00:43:21.370 --> 00:43:45.889

    Amanda Bickerstaff: Yeah, I mean, how much fun is that? Okay, I'm a nerd, everybody. I just said "how much fun" about an activity on ethics. But I really believe there's something to that, Mark: how many fascinating questions can we ask, and how many really unique conversations can we drive that have never been had before? We've never been able to have a fake version of ourselves that can do what we can do

    166

    00:43:45.950 --> 00:44:04.109

    Amanda Bickerstaff: before. And I think that is something that's really fascinating. If you can have those conversations in structured ways, and I've seen it happen over and over again, if you give a 17-year-old, a 16-year-old, an 18-year-old, or someone like Saanvi time to have these conversations, they're gonna dig in, because these are rich,

    167

    00:44:04.160 --> 00:44:14.610

    Amanda Bickerstaff: interesting, meaningful questions that actually matter to them. A lot of times media literacy is hard, or civics education is hard, because it doesn't feel connected to ourselves.

    168

    00:44:14.610 --> 00:44:39.470

    Amanda Bickerstaff: But "I wanna be an influencer. What if I was a fake? What if I was kind of me, but an augmented influencer?" That's a fascinating question: what are the ethical ways in which that would work? So I think these are really rich ways we can engage each other in discourse, which is such a key to a civics education approach in general. So we're gonna go to Ace. We're coming up on time, so what we're gonna do is have Ace talk, and then I'm gonna go rapidly

    169

    00:44:39.470 --> 00:44:45.400

    Amanda Bickerstaff: around the group on a recommendation, and then we're gonna say goodbye. But, Ace, what do you think? What is your

    170

    00:44:45.400 --> 00:45:00.705

    Ace Parsi: Yeah, I mean, I think I'm much more concerned about the South Korean president example than I am about the Kylie Jenner example, 'cause, I mean, what do you do with a fake of a fake? Sorry, that's a bit of a dig at Kylie Jenner. But

    171

    00:45:02.764 --> 00:45:18.639

    Ace Parsi: but yeah, I think, to Mark's point about his elementary-age student, we need to scaffold this. We need to be really mindful of what my daughter interacts with, of what all this will lead to, and think about not just the cool factor.

    172

    00:45:22.300 --> 00:45:46.579

    Ace Parsi: What are the ripple effects this will have for our society? Because ultimately we do not live in our individual rooms. We live in a society, and we have to always be mindful of how we teach young people to be in that society, how we embrace that responsibility within our own society. Using the word "responsibility" makes me sound like an old person, but I believe in it. I think we need to really take this on as a responsibility.

    173

    00:45:46.850 --> 00:46:13.550

    Amanda Bickerstaff: Yeah, you know what? We can be old, that's fine. I think this is a moment where people are looking for that, too. I don't think this is a moment where people aren't looking for guidelines and support; I think more than anything people are looking for a path forward that's going to lead to better things. So let's be old, let's go back to things we haven't talked about in a while, and let's lean into it. I think that's just really important. Okay, so we're gonna do a final round. What is your

    174

    00:46:13.550 --> 00:46:37.319

    Amanda Bickerstaff: best recommendation right now? You have less than a minute. What is your best recommendation for ensuring that we are meeting this moment, meaning that we are supporting our kids, our teachers, our leaders, our communities, so that we're not in a position in 15 years like the one where social media has impacted our kids in ways we weren't expecting, or maybe were, but didn't do anything about? So, Saanvi, what do you think is your biggest recommendation?

    175

    00:46:38.213 --> 00:47:02.879

    Saanvi: I think my biggest recommendation is similar to what I said earlier about shifting away from being in a place where we're reacting to the implications of technologies after the harms have occurred, to better anticipating them. And I think one way we can do that is to ensure that while we adopt recommendations like the ones included in this report, those policy recommendations are complemented by similar efforts, and even a narrative shift, within industry and within

    176

    00:47:02.880 --> 00:47:22.501

    Saanvi: schools, so that there is almost a convergence of efforts across different stakeholders to work on this. And I think part of that is also better user consultation in technological design in industry, and ensuring that responsible innovation includes conversations with stakeholders like us.

    177

    00:47:23.010 --> 00:47:24.240

    Saanvi: Yeah, that's it.

    178

    00:47:24.870 --> 00:47:38.989

    Amanda Bickerstaff: Amazing. Hear, hear. I'm going to a think tank next week, everybody, and I get to talk to people who are building these tools. So I'm gonna take you with me, Saanvi, in my pocket, and we're gonna ask questions like that and see what we can do. Next to you, Mark. What's your recommendation?

    179

    00:47:39.790 --> 00:48:07.959

    Mark: Get AI in schools in a way that is useful for teachers and useful for students. Find a way to write to your local school district and get them to invest in some AI PD for their staff, because once the teachers are using it, it'll trickle down to the students, even in smaller bites. I would bet dollars to donuts that my daughter's teacher had something about AI in a training, and that's why she was telling her students about it. So get it in the hands of the teachers, and it'll work its way down to the students for sure.

    180

    00:48:08.240 --> 00:48:35.699

    Amanda Bickerstaff: That is our business model, Mark. That is how we do it. That's how we know we impact students. We've had PDs where the response was, "I was so afraid of this. Now I'm excited. But how do I do this for students?" They went from being fearful to accepting it and seeing the value, and then immediately being like, "I'm a teacher because I wanna help students, so how do I do that?" So I think it absolutely is an unblocker. That's great. And then, Ace, you get the final word.

    181

    00:48:35.700 --> 00:49:01.729

    Ace Parsi: Yeah, I think Mark out-olded us all with the "dollars to donuts." And I met Saanvi for the first time last week; I'm shorter than Saanvi, so I can fit in your pocket. That'll work out. I think the main thing, and I'll put the link in the chat, if you have to look at one thing, is those seven criteria: to be intentional, collaborative, and communal in this. That is what I would leave you with.

    182

    00:49:02.110 --> 00:49:26.630

    Amanda Bickerstaff: That's amazing. So I just wanna say thank you so much to our panel. It is such a pleasure to be able to do this work and to advocate for the voices here, and I learn every time, so I just really appreciate everything. I also appreciate that Ace has one of the best bedside manners for meeting people on Zoom; he really does try to connect and really embodies this work. So I just appreciate Ace and Mark and Saanvi and everyone

    183

    00:49:26.630 --> 00:49:51.529

    Amanda Bickerstaff: that was part of the Civics Now group. I just want to give you a couple of ways you can get involved. We have a prompt library, of course, but there are two things we talked about today. One is AI Literacy Day. On April 19th, it is a fully free day. We have resources, there'll be live PD, to Mark's point, and we will have curriculum resources all about what AI is, starting to build that foundation of literacy. If you want to do that, I put the link in the chat. Please sign up, whether you're a school

    184

    00:49:51.530 --> 00:50:16.199

    Amanda Bickerstaff: or a district or a participating partner. We have about a hundred participating partners already, and school districts from Houston and New York to districts here in North Carolina with DPI, and out in the Midwest. So we're really excited to have that. And the last thing is, we have a summit as well, which is a pretty cool day of learning just like this, so I'm sure we'll be highlighting some great pieces there, and I think, Saanvi, you might get

    185

    00:50:16.200 --> 00:50:41.800

    Amanda Bickerstaff: a little bit of an invite to come speak at that as well. That's gonna be focused on how we start to do this in a way in which we're building not just capacity, but also starting to think about what the role needs to become: how do we make education better with this new technology? So thank you, everybody, for joining. I appreciate everyone that's here with us, as always. Hope you have a good night, good morning, or good afternoon, and have some lunch, wherever you are. Thank you, everybody, for being a part of this. Bye.

    186

    00:50:41.800 --> 00:50:43.909

    Ace Parsi: Thank you, Amanda, and thank you for your community.