Now’s the time to lean in and think creatively about how emerging technology such as generative AI can augment and enhance the very human-centered work of SEL. In this webinar, we explored AI’s implications for supporting students’ mental and emotional well-being and ability to develop healthy human relationships.

Attendees heard about various ways educators are experimenting with AI to enhance SEL, along with a discussion of what precautions need to be taken. During this webinar, participants:

  • Explored opportunities and threats when it comes to AI’s potential impact on social and emotional learning.

  • Learned practical tips for experimenting with AI to make SEL more personalized for students.

  • Set a personal “next step” goal to explore how AI can help them better support students' social and emotional development.

Presented as part of our AI Launchpad: Webinar Series for Educators.

AI + SEL & The Mental Health Crisis

  • Amanda Bickerstaff

    Amanda is the Founder and CEO of AI for Education. A former high school science teacher and EdTech executive with over 20 years of experience in the education sector, she has a deep understanding of the challenges and opportunities that AI can offer. She is a frequent consultant, speaker, and writer on the topic of AI in education, leading workshops and professional learning across both K12 and Higher Ed. Amanda is committed to helping schools and teachers maximize their potential through the ethical and equitable adoption of AI.

    Margot Toppen

    Margot is a veteran SEL program developer who embraces innovative strategies to engage learners through the body, heart, and mind. Building from her early work using dance as a medium for character education, Margot went on to launch EduMotion’s award-winning SEL Journeys product, which now finds its home in the whole-child wellness portfolio of CATCH Global Foundation. Always in pursuit of pushing boundaries to expand the definition of student success, Margot actively engages in many SEL and ed tech learning communities and values thoughtful dialog and collective impact.

    Anabel Ibarra

    Anabel Ibarra embarked on her educational journey in Irving ISD, transitioning from a High School Spanish Teacher to an Instructional Technology Specialist and ultimately assuming the role of an Administrator. Currently serving as the principal of Bowie Middle School, she is captivated by the transformative potential of technology in education. Ms. Ibarra envisions utilizing technology to enhance learning experiences and contribute to positive change in the educational landscape. With a dedicated focus on positive relationships, accountability, and shared responsibility, Ms. Ibarra is excited to contribute to Bowie Middle School’s commitment to lifelong learning. As a catalyst for professional development and curriculum enhancement, she has played a pivotal role in Bowie Middle School's achievements, including its recognition as a Capturing Kids' Hearts National Showcase School and a Solution Tree Model PLC School. Under her leadership, Bowie Middle School has also been honored as a Verizon Innovative Learning Labs Grant recipient through the Heart of America, a national education nonprofit.

    Kristie-Ann Opaleski

    Kristie Opaleski is a social emotional learning (SEL) specialist as well as a veteran high school English teacher with over 20 years of experience. Kristie consults for educational organizations on the intersection of SEL, mindfulness, and mental health. Her current role as an ELA instructional coach for grades 6-12 with Jackson School District in New Jersey focuses on educating teachers on artificial intelligence (AI) and how to harness its power in the classroom.

  • Amanda Bickerstaff: Okay, hi, everyone. I'm Amanda, the CEO and co-founder of AI for Education. I cannot believe this is the twenty-third webinar we have done this year on AI and education. It's our AI Launchpad series, and I am so excited to be here today with such an amazing panel of experts, and amazing women as well, about this important topic around AI and social emotional learning

    Amanda Bickerstaff: with a real focus on mental health. We all know that we are in a mental health crisis with our young people and also with our teachers. We have an enormous amount of social isolation, as we have the impacts of post-COVID fatigue that are a part of our lives. Unfortunately, we have new and difficult safety issues that happen. We have our teachers experiencing burnout rates like never before.

    Amanda Bickerstaff: And we have this brand new thing that's almost a year old. If you're actually here from outside of the US, it's probably already ChatGPT 3.5's birthday. And so we have this brand new thing called AI that is a real question. And I think we're gonna talk about this,

    Amanda Bickerstaff: the balance of, like, is AI a good thing

    Amanda Bickerstaff: or a bad thing, or is it somewhere in the middle when it comes to students' mental health and well-being? So, as we always say, thank you so much to the people that have already said hello! We've got, you know, Georgia, Michigan, we've got Oregon, we have so many great people. Singapore. I'm sure we always have, like, one person where it's 3 o'clock in the morning. Please go to bed, I promise this will be recorded. We're glad to have you here.

    Amanda Bickerstaff: Please get involved with your colleagues. I already see that we have such a great group; we already have people that are working in and around mental health and well-being, so please make sure to get involved. This community that exists inside of our AI for Education webinar series is one of our favorite things. So what we want you to do is really get involved. We also have a brand new set of AI prompts for our prompt library that Margot has led the charge on, and Kristie

    Amanda Bickerstaff: has submitted to as well. So I want you guys to prompt along with us and to use these tools so that you can start thinking about how you apply them in your practice. And then, finally, if you have a great resource, please share; we love to be able to do this with you all. And a final thing: we only have 3 more of these until the end of the year. How crazy is that? It's almost Christmas, everybody,

    Amanda Bickerstaff: but we have differentiation in 2 weeks, next week we have district-level perspectives on AI, and then we're gonna be talking about the durable skills of the future, so I'm really excited to have you all here. As Corey said, it's almost ChatGPT's birthday. But I'm gonna come off sharing and introduce our panel. I'm so excited to have this panel, and I wanna say thank you to our first panelist, Margot, who not only wrote a beautiful

    Amanda Bickerstaff: article around this topic, but also really has shepherded this panel today to talk about this important topic. And so I'd love for you the same way we open every webinar is. I'd love to you to tell us a little bit about yourself, and then your first time using generative AI.

    Margot Toppen: Alright. Thanks, Amanda, and hello to everyone out there in Webinar world. Really happy to be here today. And really happy to further engage in the dialogue around this conversation. So

    Margot Toppen: A couple of months ago I started exploring new opportunities, kind of a new chapter in my life. I have worked in SEL program development for more than 15 years now, and I'm very passionate about it, and I noticed that the whole

    Margot Toppen: edusphere, as I like to say, was buzzing about AI. But in my field of social emotional learning, I noticed it didn't seem like there was as much conversation going on. Of course, once I did open the lid on that conversation, I found plenty of voices to engage with and plenty of people doing really incredible work in terms of exploring how

    Margot Toppen: the field of social emotional learning and mental health education can be enhanced and augmented by new and emerging AI technologies, and also thinking about what are the risks and threats that we need to keep at the forefront of everything we do. So this has been a really exciting journey for me that

    Margot Toppen: I just keep wanting to go deeper and deeper into. But to answer your question, Amanda, about my first, like, my aha moment with realizing what

    Margot Toppen: these new generative AI tools may mean for the future of education, it was actually more like around a year ago, when ChatGPT kind of exploded into the forefront.

    Margot Toppen: And so in the past couple of years I had the opportunity to lead a major expansion and update to a pre-K through eighth grade health and wellness product portfolio, and a priority item that was due for a closer look on our roadmap was improving and expanding the assessment options provided to teachers. So in addition to the more knowledge-based assessments we already had,

    Margot Toppen: we wanted to better address the Universal Design for Learning framework and offer teachers authentic, skills-based assessment options. We had done some market research and had pretty clear ideas of how we wanted to put this together, but I decided to take a look at how AI could possibly assist in brainstorming ideas, and of course I was blown away by its power to meet student needs, just suggesting so many

    Margot Toppen: ways for students to demonstrate health-related skills and ways to adjust

    Margot Toppen: those tasks based on their learning differences and preferences. So that was really my aha moment of recognizing the incredible power to offer exponential options for differentiation. That's when I started really leaning in and paying attention to what was going on. And, Amanda, I know you're diving way deeper into that topic in a webinar you just mentioned that's coming up in a couple of weeks on differentiation.

    Amanda Bickerstaff: Yeah, I mean, I think the thing that I love about that is that sometimes we might feel like we're kind of on our own here, and that when you started looking into it and started talking about AI and SEL, you probably found the pockets where either people really didn't know how to get started and you helped them get started, or you gave them a space to have that conversation. And so I think that this is such an important one, because we do not see a lot of resources around AI and SEL

    being done yet. And I think that this is our opportunity to do so. So thank you so much for that. And then, next, to Kristie, who had a busy day today, and who is, you know, an SEL specialist and an ELA coordinator. So she is doing the work, everybody. But I'd love to know a little bit more about you, and then your first interaction with generative AI.

    Kristie-Ann Opaleski: Sure! I'm a 23-year veteran English teacher out of New Jersey. Yes, and I love English to the point where my kids are like, what is wrong with you, because, they're like, no one gets this excited about a metaphor. I'm like, I do. So I've always, you know, believed in the power of language, and then I believe very firmly in meditation. I got certified as a

    Kristie-Ann Opaleski: meditation coach as well as a breathwork facilitator, and I started helping teachers before the pandemic with that, just kind of, you know, facilitating some sessions with them and bringing it more into the classroom. I got certified as an SEL specialist, and I was doing the work at the district level for the 2 high schools,

    Kristie-Ann Opaleski: and kind of aligning it with the middle schools and the elementary schools, because SEL used to be synonymous more with elementary, and it kind of just fell off after middle school. It's like, oh, they're 14, they know everything. Yes, that's what they think. That's not true, though. So, you know, I was like, I gotta bring this in. So I started bringing that in a little bit more and had a lot of fun with that. Unfortunately, my first

    Kristie-Ann Opaleski: experience with generative AI had to be last year as an English teacher. I was reading all about it and wondering, is this the death of the essay, which it's not, by the way, and I found a student's response that was well above what he was able to do. And I was like, oh, maybe this is AI. And I had no idea. I went on to ChatGPT, created the account,

    Kristie-Ann Opaleski: and I just started like trying to paste his work in. And I was like, this isn't what I'm supposed to be doing. This isn't right. And then I was like, Oh, I gotta prompt it. So I put in the prompt that I wrote. It was for rhetorical analysis.

    Kristie-Ann Opaleski: and it didn't come up the same, but I noticed the language patterns, and I was like, oh, okay. So I had to have a very honest conversation with that student, who did, I have to say, admit to using it. And then we had a nice little work session, because I was like, I don't get it. And then he showed me how he did it, and I was like, you still take the zero, but I'm gonna give you the opportunity to resubmit with your own ideas, because you just educated me on AI. So it worked out for both of us.

    Amanda Bickerstaff: You know, I think the rhetoric around that is definitely overblown. But he taught you how to do it, and now you're on this webinar, like, how funny is the genesis of that. And I just wanted to point out also this idea that SEL kind of stopped, like it was, oh, we'll teach it up to our seventh and eighth graders, and then we're gonna leave kids once they're in high school, which is

    Amanda Bickerstaff: way more complex and requires self-regulation and metacognition in ways it never has before. So I love that this is something that you really saw as you were starting to think about the middle and secondary grades, and we're seeing that more and more commonly, which I really appreciate. And last but not least, also doing the work, so thank you for being here too, is Anabel, who is a principal. So everybody here, you know, she has a crazy job and has

    Amanda Bickerstaff: come and is willing to do this work with us. We really appreciate that, Anabel, but I would love to understand a little bit more about you. And then also that first experience with generative AI.

    Anabel Ibarra: Alright, wonderful. So I am a middle school principal in Irving, Texas. It's right west of Dallas, in the Metroplex. Our school is about 800 students, and our district is, I would say, midsize. I think we have about 31,000 students all around, and I'm one of 8 middle schools in the district. But I have been in education for 14 years now. This is my fourteenth year, and I started off as a foreign language teacher at the high school level.

    Anabel Ibarra: And then I went into instructional coaching, and it was specifically technology instructional coaching. And so it was all about incorporating different tools, new tools, and making lessons a little more interesting.

    Anabel Ibarra: From there I went into administration, and now principal. But my first experience with AI was last, let's see, March, maybe. My husband came home saying, hey, I found out about this new thing called ChatGPT, and kind of introduced me to it, and was telling me how he was using it to create these crazy songs, giving it different prompts. And I thought, how fun would it be for a math teacher to say, hey, write me a rap song in this type of style for,

    Anabel Ibarra: you know, solving

    Anabel Ibarra: 2-step equations. And so I automatically took it to the education gear. I started playing with it and just adding in different prompts, and that was my favorite thing to do. I was on maternity leave when that happened, so I got a lot of time to play with it, but when I returned to school I started using it myself. I like to say, sometimes I just word-vomit all these things, and ChatGPT polishes my thoughts and words and makes it sound so much more coherent. And so that was my first experience with it.

    Amanda Bickerstaff: I love it, because I do think it's always like a Rorschach test, like we all have different ones. And I love how your husband's like, it makes silly songs, and you're like,

    Amanda Bickerstaff: or, it can make very silly songs about math. Like we always say, take it to the place where we're like, what could it possibly do? So I absolutely love that. And I think you have a unique perspective as well, because you were supporting your staff to incorporate tools like this. So I love that that's a thing, and we'll talk about that. We

    Amanda Bickerstaff: will do a live demonstration. Everyone's gonna have a little bit of show and tell; we're gonna use our circle time together to show and tell some great practices. And I know you're gonna be showing where you brought in a technology that's directly created for mental health and well-being. So I think that's gonna be really great to see. And so I have our kind of first big question, and we always like to start with a pretty important one. And so

    Amanda Bickerstaff: we know that social emotional learning is about people, right? And it is about what we call heart work, right? And sometimes it can be hard work, but it always is heart work. And so the question that I have for you all is, where is the role of AI

    Amanda Bickerstaff: in that type of work? Is it there? Is it not? Where do you think that is right now, considering we know that students are using these tools, and are actually using these tools for mental health and well-being potentially more than for their 5-paragraph essay? So where do you think that line is,

    Amanda Bickerstaff: starting with Marco?

    Margot Toppen: Well, I think for sure that there's, you know, a lot that we're all, you know, really just figuring out right now of exactly how this can fit in. But I think it's

    Margot Toppen: absolutely imperative that we do figure it out, because the technology is not going to go away. And so I think that if we don't

    Margot Toppen: figure out how to use generative AI as a way to personalize

    Margot Toppen: how we explicitly and intentionally help kids build their social-emotional competencies, you know, then

    Margot Toppen: we're gonna miss opportunities to differentiate, to connect with them, to speak their language. And so I think we absolutely must be experimenting and using things like the prompt library that I'll be sharing later to just see, okay, what is the new twist? If these are the approaches I've been using, now I can, you know,

    Margot Toppen: plug that into a chatbot to kind of reimagine it, or think about it, or personalize it more for my specific students and their specific needs.

    Margot Toppen: Yeah, I think we'd just be completely missing the boat if we didn't start doing that right now.

    Amanda Bickerstaff: What I love, and I think this is a thing we try to crystallize a lot, is this idea of taking what you do and then seeing what's possible.

    Amanda Bickerstaff: And so, instead of necessarily replicating the effort that you would make, where can I add that differentiation, that personalization? Where can I add an SEL tip or strategy in a place where there wasn't one before, because I didn't have enough time or I couldn't think about it in a different way? Because one of the things that's so great about AI is, what you said, Anabel, and I'm gonna say thought partner instead of, you know, helping you with your words. But I do think that that's the case, though, like, how do I actually have it reframe my own thinking? And so, Margot, your point about that, I loved that.

    Amanda Bickerstaff: It's not about, let's just do it faster, but can we do it better? Can we do it more consistently? Can we do it in a way that actually meets our students' needs more directly, which I think is really the goal of what we want to see happen. So that's a beautiful answer, I really appreciate that. So, Kristie, I mean, you're working with kids and you're working with teachers directly about this kind of work. So where do you see that line? Is AI helping, or should it not be helping?

    Kristie-Ann Opaleski: Oh, no, it definitely is helping. I'm gonna echo what Margot said, the idea that we have to meet the kids where they're at, and where they're at is online, especially teens. I mean, that's where everything is. So if you don't, you know, enter their universe, you're never gonna make those connections, and that's what SEL is about. I do think that it's more of a tool or a branch, if you will, I don't know if the word is vein or ventricle, I'm sorry if anyone is a science teacher listening, of the heart muscle, I'm trying to use a metaphor. So it's just a little branch of it, but human interaction has to be at the core. It can't just be a chatbot, you know, giving advice. You need that human connection, that empathy that you can see. Whereas, you know, a device or an app can give you advice, it can give you suggestions, it can give you research, it can do a lot of good things, but it can't wrap its arms around you. It can't, you know, really look in your eyes and be like, no, that's not resonating, like there's something still there, keep talking. You know, it's only what they feed it. And again, you know, children are not as self-aware, and I'm not trying to be derogatory, they're growing into that. So if they're not realizing that, they need that human element. They need a teacher or they need a parent, they need someone there to kind of help guide them. So I believe AI is very important in SEL, but it can't be a standalone or a replacement for human contact and discussion.

    Amanda Bickerstaff: Absolutely. And I mean, at this stage there is no replacement. Like you said, sometimes you need a hug. Sometimes you just stop talking, and sometimes that actually is enough. And we don't need to keep pushing our students, we could actually say, you know what, we'll come back to this when you're ready. And that requires our ability to know our students, to see our students, to see cues and respond to cues, and to ourselves be both self-aware and proactive in those moments, and it requires absolutely human intervention and touch.

    Amanda Bickerstaff: And so I think that we're nowhere near AI replacing that part of our lives. And there are things that it's gonna replace pretty fast and is replacing, but even our conversation right now, the ability to know how engaged we are, our connections to each other because we're encouraging each other in this moment, is another example of just being there. A simple smile or, you know, encouragement that happens goes so far beyond even the deliberate actions we take in SEL. So I really appreciate that, Kristie, and I think the audience did as well, which is great. And then to you, Anabel. I know you have to take a slightly different view, because you have pastoral care, you have some politics, SEL is pretty political right now, you've got a lot of competing priorities. So what do you think about this question?

    Anabel Ibarra: Honestly, in the school setting, it's always going to come down to the heart work, right? That is the core of what we do, building relationships with our students, making sure that they know that they're cared for. And then it's also teaching them, you know, what we once called, like, character counts, and responsibility, self-awareness. And all of that can be done through different AI tools, and at different levels, whether it's staff members using it to create prompts and then using that to connect with the students, or students using it to find different ways to self-regulate. So I think there are different levels of how it can be implemented. I always preach, let's work smarter, not harder. Let's be efficient and effective. And at the core of it, that's where AI is coming in. It is just a tool. It will not replace the human, but it's about being efficient and effective.

    Amanda Bickerstaff: I mean, I think that is a huge piece, right? Let's make it a little bit more efficient, and let's actually give time for the things that matter. Can we start to lower all that noise outside the classroom and really refocus that into the heart work? I think that is really a place we can create right now with what the tools can do right now. And so I think that was a really good question. And before we go on to doing a little bit of that show and tell, I do wanna ask this. I always think about this moment in time as one where we already have a lesson that's been learned, which is social media. We already had a moment in time in which, you know, devices went into the hands of young people and social media became something that is ever present, without any stopping, without any knowledge or thought about what this really meant for young people. And right now there are 41 states that are suing Meta around their knowledge of students using these tools underage and under permission levels. And so I know that we have those questions. We have already seen that, you know, technology can have negative impacts on students' self-image, their ability to communicate, their collaboration skills, etc. So what does that mean for this moment in time, in terms of how we support students to use these tools, if we are going to use them? How can we support them so we don't have the same things happen, where it has this negative impact?

    Amanda Bickerstaff: Margot, to you?

    Margot Toppen: Yeah, I mean, I think there's a real opportunity to do a lot of work around digital literacy, digital citizenship type skill building. I think that's a natural connection point for how it connects to the social and emotional competency skill building. You know, the question of whether or not tech use is contributing to the youth mental health crisis is asked a lot, and when you ask kids, there are kids who say, I find my community, I find my tribe online, it has gotten me through some really hard times. So while for sure we know there are also lots of damaging things happening to kids' psyches because of tech and social media and constant screen connection time, there are some good things happening too. And so I think, similarly with that, as well as now with generative AI, the opportunity for us in the field of education is to help kids process: okay, these tools are going to be part of my life, they are part of my life. So that's an opportunity to use that as a starting place for that social and emotional competency building.

    Amanda Bickerstaff: That's great. And I think this is maybe the big difference here, that AI is a tool, whereas social media, I don't know if we could really call it a tool. It's a tool for some things, but maybe not the ones we want it to be. But the idea of taking the lessons we've learned, and then shifting them just slightly, like it actually is something that you need to know how to use, but you have to be aware of what the impact can be. So I really think that's an interesting point of view that can help us maybe structure the way we approach that in the classroom. So, Kristie, same question: where do we keep kids safe, and how do we help them be critical?

    Kristie-Ann Opaleski: I think that we have to once again get away from the stark dichotomy that a lot of people have, where it's like tech is bad or tech is good and it only fits in one box, because it's more nuanced than that and the lines are all blurred, you know. And again, it's a tool, and I don't think a tool is good or bad. It's neutral, and it's how you use that tool. However, with AI, it's only as good as its developers. You know, they're human, so there's biases, there's things in there that they didn't necessarily intend. And with this generation, they grew up with all of this, so they're aware of the algorithms, like they weren't surprised. I remember the first time I realized that my phone was listening to me, and then we did an experiment in class with it, and they were like, yeah, what do you think is going on when you talk into your phone? Like they were, oh, not okay with it, but more aware. And a lot of them, I have to say, at the high school level, watched The Social Dilemma on their own, and, you know, a lot of classes showed it as well. So they understand the basic idea of algorithms. And now with generative AI, they are understanding that there are issues, but that isn't a reason not to use it. Whereas, like you said, with social media, tool is not the first word that would come to mind. I mean, like you said, there are instances, like my students do an actual research plan, and a lot of them do a social media campaign, so they're raising awareness about their global issue. That's a wonderful tool. That's not how they see it at first. When I suggest how they can use social media to promote it, they're like, oh yeah, that's what influencers do. I'm like, yeah, be an influencer for good, and then they just laugh at me. Okay. So I think that to try to just say, you know, blanket, AI is bad, we can't use it, is foolish and it's short-sighted. They need it for their jobs, they use it already, they're aware of it, it's natural to them. So why would we take a tool away when it can offer such opportunity and such advancement for them, you know, academically as well as socially and emotionally, and just helping them succeed in life? And isn't that ultimately what education is about? So, you know, getting back to that core. So yeah, I think it's a large part of a larger conversation, but we have to get away from, it's just, you know, this is good or bad. We have to get away from that dichotomy.

    Amanda Bickerstaff: Yeah. And I don't know what has happened that we only have 2 options all of a sudden, like we just moved away from a gray area, or nuance, or balance, and it's either AI bad, AI good, like AI doomer versus accelerationist. And what we're missing is the thing that makes us special, our ability to actually understand complex things. If we give students and people the opportunity to actually hold some cognitive dissonance, they usually can do it, but it takes effort, right? And this idea that it's only good or bad is a false economy, to your point, but it is so ever present right now. It's like, AI is only for cheating. And it is crazy, because if you look at the data, students are not using it only for cheating. What they're using is Character.AI to create avatars and buddies they hang out with during the day, that they can talk to. That's what Gen Z is using these tools for, way more than doing their homework. And I think this is something we cannot seem to get past, but to your point, it is imperative for us to move out of that rhetoric. So, finally, to you, Anabel, what do you think?

    Anabel Ibarra: I was just thinking, there's good and bad to everything, and it's all about how you use it, how you implement it with your students, and how you teach the use of the tool, because that's what's going to make the difference, right? With any tool comes responsibility, and it's about teaching them how to do it appropriately. I wonder if, back in the day when Word or spell check came out, this was a big thing also: kids aren't gonna know how to spell anymore, they're gonna use that and they're gonna cheat, and now their essays are going to have no spelling errors. Was it a big thing then? And how did we incorporate that into just the new way of education? And so I just wonder, and I'm waiting to see more of how it gets used in the classroom and how it gets used by students, other than just that negative connotation of, students are using it to cheat.

    Amanda Bickerstaff: Which is a great segue to let's actually start looking into the ways in which we can support that. But I think, Anabel, yeah, it was like the calculator, like the pencil, all of these. I joke that back in the days of Socrates, someone chiseled their cheat sheet of answers on a stone tablet. But I do think what's really interesting is that we've had these inflection points. We've all come from different eras, so maybe we don't all, like, I don't remember what happened when spell check came in, but I do remember when Wikipedia did, and it was, no kids are ever gonna do research ever again. That was kind of what that rhetoric was, and we've seen things shift and change. And so I think that this is another example of that, and we're so at the early stages of this. When Wikipedia was a year old, no one really used it, you know, it took a while, whereas in 2 months, 100 million people used ChatGPT. So it is kind of a difference in that manner, too, just how fast this has put us into a position where we have to have an opinion, and we have to train students explicitly how to use these tools.

    Amanda Bickerstaff: So now we get to do our circle time. Our virtual circle time today starts with Margot, who has done a beautiful job, and shout out to Kelly and Dan on my team, who have done a great job of getting these new prompts up in the library. So, do you wanna share your screen and actually show people how they can start using ChatGPT and other tools to start building new and amazing ways to incorporate SEL?

    Amanda Bickerstaff: I think you're on mute, Margot.

    Margot Toppen: Sorry about that. Okay, I got so excited about sharing my screen that I forgot to unmute myself. Yeah. So for folks out there who have not yet been on the AI for Education website, and sorry if I'm looking over here, because I've got my screen share over here. On the AI for Education website, if you Google it and then look for the educator resources and go to Prompt Library, you'll land on this page, and you'll see that Amanda and her team have been building out all kinds of great ready-made prompts to drop into your chatbot of choice, to do all kinds of awesome things with generative AI. But as of literally an hour ago, there is now a social emotional learning section in the prompt library. I've been collaborating with Amanda's team to get these ready for primetime. And Kristie also contributed a self-care plan prompt that she has done with her students in her classroom, so I'm gonna let her talk about that prompt in a minute. But the rest of the prompts are ones that I worked on, and in this I've been thinking about, okay, what are some of the common SEL strategies that we see over and over again throughout all different SEL curriculums and products, etc., and what are ways that AI may be able to augment or enhance those strategies. And I also did, for anyone who's not super well versed in the field of SEL, a lot of people out there are familiar with the CASEL framework. If you're not familiar with the work of CASEL, they're sort of the world-leading organization that's done a lot of research and advocacy work for SEL, and their framework for social emotional learning is very widely adopted. Their framework includes these 5 domains of social and emotional competency: self-awareness, self-management, responsible decision-making, relationship skills, and social awareness. So, conveniently, I have a prompt related to each of those competency areas ready for you to try out here, and you can explore that. When you read about them, you'll be able to tell which one is to build self-awareness, which one is to build self-management skills. But just for fun, I thought I would do my personal favorite social emotional learning competency that I'm most passionate about working on with students, which is social awareness, 'cause I think our world just needs a whole lot more of that.

    Margot Toppen: So this is a prompt to customize the practice of classroom meetings using an AI chatbot. What you can do with anything in this prompt library is, basically, we give a sample prompt where we instruct the AI chatbot on the persona we want it to act as when it's generating ideas and answers for us, and then you customize it with the things that are bracketed off in the prompt. I am going to go ahead and actually just copy and paste this example prompt for a class meeting. To read it to you, we will be telling the chatbot: you're an expert educator and instructional designer with expertise in social and emotional learning and the CASEL framework; create a 10-minute classroom meeting agenda for my second grade classroom that includes a greeting, a prompt for sharing ideas, a community building activity, and a message of the day; focus on the topic of standing up to bullies. And down in sort of the making-the-prompt-work-for-you tips, I have a note that bullying prevention month is in October, so maybe you wanna do classroom meetings in October around that theme, and you could use this to help you. So to give an idea of what ChatGPT would do with this prompt, I'll go over to the ChatGPT tab on my browser and just paste in that sample prompt that I just read to you.

    Margot Toppen: And there we go. So it did everything I asked it to do. It gave me a 2-minute greeting, and a 2-minute prompt for sharing, for students to share ideas with thought partners. The prompt it wrote was: imagine you see a friend being bullied on the playground, what would you do to help them? Share your thoughts with a partner. And so on and so forth. So it basically just created the exact meeting agenda that I wanted it to. What you can do with this is, then, if you love it as is on the first try, you can run with it. Or, if there's something in it that you want ChatGPT to provide further guidance or ideas on, for example this question, imagine you see a friend being bullied on the playground, what would you do to help them? If you're not sure exactly what types of answers you're hoping for students to give you, you could ask the chatbot, can you give examples of the types of answers that I might be looking for from my students, and it will provide that for you. That's just an example of the ways you can kind of dig a little deeper.

    Margot Toppen: Amanda, did you want me to go any deeper on this? Does that give us a nice baseline?

    Amanda Bickerstaff: How fun is this, though? I'm sorry, I'm nerding out, because, I don't know, there's something about the fact that it came up with a pretty good question to ask, and that sharing prompt, just from that idea of standing up to bullies, and you've crafted this really thoughtful agenda. I just find that so fascinating to see, especially as I was a high school teacher, and I would have loved it. I feel like we should do this more at high schools. But it is really, really interesting, and thank you so much, Margot, for putting this down. And, as you said, if you want to come off sharing, just so we can get ready for Kristie. What's really great is that there are 6 prompts here, but they're all kind of in that CASEL framework, which is really the best in class in terms of thinking about those proficiencies and competencies that we want students to build. But, as you can see, there is such a wide variety of ways in which we can use these tools to help us get a starting place for a pretty good classroom agenda, or mindfulness practice, or cooperative learning activity that really has students work together in ways that go beyond just the normal. So I think that was really awesome, and thank you so much, Margot. I know we're gonna have so many people that are gonna love this prompt library, so thank you for getting it in. Okay, Kristie, I loved when we had our first little conversation, and also, I think, Kristie, we're probably very similar in our approach to things, so we definitely have the same sense of humor. But I loved, when we first talked, you were like, I did this with my students, and I did it around self-care. And I think that this is such an enormous piece, especially with our students transitioning out of high school: how are they getting ready for that next step, where they really are going to have to own their self-care? So I'd love for you to share what you've done with your students.

    Kristie-Ann Opaleski: Sure.

    Kristie-Ann Opaleski: So this was the lesson that I started with them. And, you know, I wanted to show them that ChatGPT could help them personally, emotionally, and be used, again, as a tool, not for cheating. And when I said we're gonna use ChatGPT for our Do Now for the whole week, we have this little mini thing, they were like, well, are you gonna make us write essays? It's like, no, we're gonna talk about ourselves, you get to pick a topic. So then they were like, alright, whatever, we'll humor her because she's way too excited about this. 'Cause I didn't know how it was gonna work. So I explained to them, you know, I came up with this prompt, and I did it live with the students: you're a mental health expert working with teens to build a healthy lifestyle; provide a list of, and here went whatever the focal point was, for, and I used the word buzzword, a specific buzzword or trend, to help my twelfth graders build healthy habits. So if you wanted to look at the actual chat that came up here, it was to model it for my students. You know, we had already, throughout the year, because obviously as an SEL specialist I talked to them about self-care, you know, during different times, especially as seniors when they were getting their college applications done, and it's the fall, and they're going to homecoming, and they're exhausted and staying up to 3 AM. And I'm like, alright, we're gonna focus on, you know, how to get better sleep or manage things better, and we'd have little, you know, powwows about that. When AI came out I was like, why can't AI do that? So cool.

    Kristie-Ann Opaleski: So I kind of gave them suggestions. You know, I started off with this table for them, so I'm giving it to you exactly how I gave it to them last year. Like, if you're wanting to eat better, and I said, I'm not saying anyone in here needs to lose weight, I don't wanna hear, like, I wanna lose 15 pounds, I said, I wanna hear about healthy habits, like maybe you need to incorporate more fruits and veggies. The other thing was, you know, we talked about all the different styles, and I said, feel free to go and Google something first. When I said there's all these different healthy lifestyle diets, they're like, what do you mean? I'm like, Atkins, Paleo, vegan, clean eating, and they're like, I don't know what Paleo is, I don't know what Mediterranean is. I'm like, look it up. They're like, can we use AI? I'm like, yes, fine, use Google, use any tool that you want. So, you know, I gave them some suggestions, same thing with movement and exercise. And again, a lot of them are athletes, so it was like, you know, you're in your off-season, what specific drills could you use, what kind of conditioning could you do? And we talked about, you know, just overall mental health. Obviously, there are certain times of year they're more stressed. You know, in the winter it's colder in New Jersey, the weather is always crazy, you have people with seasonal affective disorder. So, looking at depression, we did a whole unit on what I call ANTs, which, if you're in social emotional learning, is the automatic negative thoughts. So I was like, oh, I see an ANT crawling on you. Some of them were really like, I need to control my inner voice, it's like yelling at me. So I was like, okay. So I put in some, you know, suggestions, and then the students added things in. And they added in sleep, focused study skills, and money management, which I was like, oh my God, I'm so proud of you about money management, like I should have thought of that as the adult, but they did. And then they came up with the popular trends or the buzzwords. So, you know, I gave them the doc where they could pick anything, and I said, this is not about the output, it is about the process. So every day they had to do something. The first day they had to pick what their focus was and the trend, and I said, if it's not in the box, or you just thought of something and you don't wanna share it with the group, that's fine, do it yourself. And then Tuesday, drilling down with 2 more questions, refining it. So like, if you wanna do Paleo but you hate chicken, what do I do? I hate chicken. So again, you know, what other recipes could we use, what other things could we do? And then adding more details about what works for them. And they really liked that personalization.

    Kristie-Ann Opaleski: And then some of them were kinda like, I kinda get where they're saying AI could be a tutor, because it's a dialogue and it's teaching me, but I'm learning that maybe my question wasn't clear, or maybe I need more information. And then they did a reflection on what they thought, which, I apologize, it's not hyperlinked, which I will hyperlink, where they had to reflect on the process and what they got out of it. And then on Friday we shared. And I have to say, for doing something off the cuff and not knowing how it was gonna work, and again, they are seniors, and I'm not afraid to look dumb in front of them. I mean, come on, I'm a Gen Xer, I look dumb anyway, so I don't mind playing around with them. And they had a lot of fun, and one of the things at the end of the year that they said, 'cause I always ask for a reflection on my teaching, the class, the content, that kind of thing, they all went back to the fact that, you gave us other ways to use AI, you gave us other ways to take care of ourselves. It wasn't just the academic. You didn't tell us, well, get off your social media, set a timer. You suggested, you know, websites or influencers that are health practitioners. They're like, you met us where we needed to be. And honestly, I'm getting through this without crying. In June I was, like, bawling, even though, you know, finals were horrible and it was pouring during graduation. Reading those reflections made it worthwhile, and that really got me on the kick, like, we can use AI for this, you know, not, again, as an end-all be-all and the way to answer all of our problems, but as a tool, as another way that we can reach kids, that we can give them resources. And again, I'm in a Title I district, or, you know, on one side of town we're completely Title I, the 2 buildings that I work in. So they don't have resources, and when they realized that ChatGPT was great, they were like, are you serious, anybody can use this? I was like, yeah, there's no subscription right now, and then when the paid version came out, I said, you still have the free version. So I like the equitable focus as well. Some of my students who have monetary issues, financial issues, still could get self-care, they still could ask for resources. And again, I think, as educators, we need to embrace that, because it's not going away, it's only going to evolve more, and we want to be part of that evolution with our students.

    Amanda Bickerstaff: Thank you so much. And we have a prompt that we've also developed from this activity. One version is for a teacher, for lesson planning, and we're potentially building one for students to use on their own, that reflects the great work you have there. And I think it's so great that, you know, you are willing to, I do not think you look dumb, but I think you look vulnerable in ways that matter, like you're willing to try things and fail in productive manners with your students, and you actually foregrounded that with this experiment. And I think that really makes a big difference in terms of their commitment to trying as well, and it gives you some space to use these as teachable moments when it doesn't work, so to speak. So now Anabel is actually gonna show us a tool, and we're gonna do this piece. And if you have to leave, we totally understand, we love having you here, but we're gonna keep rolling, and then we're going to have a speed round around what we can think about for teachers themselves and their self-care. So, Anabel, do you wanna share your screen and talk about Alongside?

    154

    00:45:48.730 --> 00:45:51.770

    Amanda Bickerstaff: You are also on mute.

    155

    00:45:52.610 --> 00:46:14.960

    Anabel Ibarra: There we go. So Bowie Middle School is one of four schools in Irving ISD that piloted this program back in March. Our students started using it last school year, then came back this school year, we did the rollout again, and they're using it. So Alongside is a program that allows students to talk to the llama, as they call it, but it's a chatbot

    156

    00:46:14.960 --> 00:46:23.890

    Anabel Ibarra: where they can talk to the chat bot about anything, and uses that AI language model to respond to them. But this

    157

    00:46:23.890 --> 00:46:29.530

    Anabel Ibarra: is more in a safe space, because it was created by doctoral clinicians who have

    158

    00:46:29.650 --> 00:46:40.739

    Anabel Ibarra: taught the llama to respond with you know certain strategies that are research-based and that could actually help the students. I was trying to pull up our training

    159

    00:46:41.000 --> 00:47:03.249

    Anabel Ibarra: that we did for our teachers so that they could know, like, these are the programs that alongside is equipped to address a lot of them go back to just issues with friendships, families, peers. And so, as the students talk to the llama, it determines what it is that they're struggling with what the issue is, and gives them strategies for it. And so.

    160

    00:47:03.700 --> 00:47:19.340

    Anabel Ibarra: you know, they try to say, like, it doesn't really use a lot of AI. It's not pulling information from, you know, just anywhere on Google. But it. It is developed by the clinicians. And so a lot of the AI is put more into like the validating responses.

    161

    00:47:19.560 --> 00:47:37.909

    Anabel Ibarra: There's journals and other things that they can do on the app. But our students here at really have enjoyed being able to use the llama for the students like we said they grew up with technology. It's natural for them. Many of them actually prefer to talk

    162

    00:47:38.000 --> 00:48:04.480

    Anabel Ibarra: and chat because it's non judgmental. They don't have to worry about it leaking out when they log into the chat. It will tell them like this, information is completely private until or you know, unless you say that you're harming yourself, harming others or abuse. And so at that point we do get alerts on the administrative profile. We have the alerts area. But one thing that I wanted to show was the dashboard.

    163

    00:48:04.720 --> 00:48:09.339

    Anabel Ibarra: Like I said, our school is about 800 students. We try to use this tool as a preventative

    164

    00:48:09.510 --> 00:48:21.890

    Anabel Ibarra: measure, and to make sure that we are acting before our students are, you know, further down or spiraling. And so out of our 800 students, we've had

    165

    00:48:21.900 --> 00:48:29.399

    Anabel Ibarra: about half of them log in, and then we have another 294 students that are continuing users.

    166

    00:48:29.920 --> 00:48:44.969

    Anabel Ibarra: Our counselors here on campus like to see this dashboard here to see what are the issues that our students are talking about that helps them customize the lessons that they do when they go into the classrooms. And so overall, it's

    167

    00:48:45.760 --> 00:48:57.680

    Anabel Ibarra: to me. It's revolutionary, the way that they are using AI in a very controlled way in a safe space for students. And so I've I've really enjoyed piloting this program

    168

    00:48:58.800 --> 00:49:04.760

    Amanda Bickerstaff: And so, Anabel, Alongside right now, is it in beta? So it is free?

    169

    00:49:05.420 --> 00:49:10.280

    So Alongside right now, I think, has moved on to

    170

    00:49:10.970 --> 00:49:25.619

    Anabel Ibarra: charging. But one of the things I was trying to get to: they do have a demo app, and so anyone who wants to try it out can go to their website, alongside.care, and when they go on there they can try the app

    171

    00:49:25.730 --> 00:49:28.289

    Anabel Ibarra: and see what that conversation is like.

    172

    00:49:28.900 --> 00:49:36.490

    Amanda Bickerstaff: That's great. And, you know, we always try to support the different levels. What I would say is that this is a great example of

    173

    00:49:36.600 --> 00:49:51.429

    Amanda Bickerstaff: an AI tool that is traditional AI, in the sense that it's not primarily generative AI, but it uses generative AI in meaningful ways around that validation. And that's really great to see, because I would not give a chatbot,

    174

    00:49:51.500 --> 00:50:11.049

    Amanda Bickerstaff: like a generative AI chatbot, this much license, for all kinds of reasons. But if it's used in productive and responsible ways, it can make that experience stickier for the student, where they actually do feel heard. And as someone who has built one (in my last role we actually built a wellbeing tool and did a whole bunch of research), the research does show that

    175

    00:50:11.050 --> 00:50:33.369

    Amanda Bickerstaff: so much of the work we need to do is on the preventative side. The research says that if students have even one person within the school community who they know cares about them and who they can go to, that is enough; it only takes one. And something like this can act as a proxy to that one person. That's clear, because you have 80% of your students who use the tool having used it more than once,

    176

    00:50:33.400 --> 00:50:50.809

    Amanda Bickerstaff: and that's a really great sign. And then we see the augmentation we talked about earlier, of your staff now using these tools to go deeper and do better in terms of actually meeting students where they are. So I really love that, and thank you so much, Anabel, for being so open to showing your data, which we really appreciate.

    177

    00:50:51.070 --> 00:51:16.010

    Amanda Bickerstaff: So we're going to come to our last question, which will be our speed round. If we can come off sharing, Anabel, if you don't mind. That's perfect. And so, you know, we talked about at the very top that we often treat well-being and social-emotional learning as being for kids and students, but we know it absolutely matters for our educators, too. So, a speed round. You have 30 seconds, starting with Margot: what do you think is your best tip,

    178

    00:51:16.010 --> 00:51:25.039

    Amanda Bickerstaff: or best strategy for actually using AI with you know, for us teacher or self care or teacher, mindfulness or teacher, social-emotional learning

    179

    00:51:25.300 --> 00:51:26.260

    Amanda Bickerstaff: our care.

    180

    00:51:28.030 --> 00:51:37.709

    Margot Toppen: Yeah. Well, I think there are so many ways. Saving them time from all the things that are

    181

    00:51:38.370 --> 00:52:01.919

    Margot Toppen: making teaching not fun for them is one way, and I see a huge benefit there right off the top, of course. Then also, all those different chats, self-care plans, all those things: those would be beneficial to adults as well. So I think that's another thing. And then, lastly, for example,

    182

    00:52:01.920 --> 00:52:26.229

    Margot Toppen: what Anabel just shared. I got really excited when I learned about Alongside, and I didn't even realize that in the field of psychology, chatbots have been researched for decades. I never really thought about the fact that for people who have a lot of social anxiety, or who are maybe skeptical about therapy, it's actually a really good starting point, a really good entry point,

    183

    00:52:26.230 --> 00:52:42.009

    Margot Toppen: and, you know, for teens who might have trouble talking to adults. That gets a little off topic from what it can mean for adults, other than the fact that these tools, in the same ways they can be helpful for students, can be helpful for adults. And I think

    184

    00:52:42.090 --> 00:52:44.140

    Margot Toppen: that's just, yeah.

    185

    00:52:44.930 --> 00:52:57.730

    Amanda Bickerstaff: Absolutely. Thank you, Margot.

    Margot Toppen: SEL is a lifelong journey that we're all on. It's not something that happens in school and then you're done; we're always building our social and emotional competence throughout our lives. So, yeah.

    186

    00:52:58.220 --> 00:53:00.030

    Amanda Bickerstaff: Awesome. Thank you. Kristie-Ann?

    187

    00:53:00.060 --> 00:53:23.389

    Kristie-Ann Opaleski: I'd definitely say AI, for all the same reasons Margot said, and just in general for what it can do for you as an adult. When I did my first presentation for my district on AI for the high school teachers, it was more often than not the non-academic stuff that they wanted to go and play with. Obviously, they want to be more productive, and this is going to save them time with lesson plans and assessments,

    188

    00:53:23.390 --> 00:53:39.650

    Kristie-Ann Opaleski: but then when I said, "Oh, it can help you come up with a workout schedule. You don't like to run? Tell it what you do want to do," they all thought that was great. They were like, "This is amazing; this can help me." The only thing I would want to add is that there is now a lot of AI going into wearables,

    189

    00:53:39.870 --> 00:53:54.739

    Kristie-Ann Opaleski: and I am not against that. I actually had a meditation client ask me, though, "Isn't that against all your hippie stuff? Isn't it supposed to be just you?" And I was like, no. Honestly, when I first started meditating, I used the Muse band,

    190

    00:53:54.740 --> 00:54:19.699

    Kristie-Ann Opaleski: which helps you regulate your breathing and tells you whether you're on par with the program it's sensing, using biofeedback. Now they have AI; I just got a notice about AirTags and how they're looking to use AI. And to be perfectly honest, my mother was diagnosed with Alzheimer's last year, so we got the AirTag so we always knew where she was,

    191

    00:54:19.700 --> 00:54:34.010

    Kristie-Ann Opaleski: because one of the new features that's coming out is that you can get an AirTag that's wearable and does your heart rate as well, and I can get the data. So if my mom's heart rate is going up, I'll get an alert saying she's becoming anxious.

    192

    00:54:34.010 --> 00:54:55.209

    Kristie-Ann Opaleski: So it's like, okay, well, then maybe I need to call my dad, or maybe I need to call her, or do something. And I know some people are very nervous about privacy issues, as we should be. All of this, as Amanda has said, is very new, but I believe that in the next couple of years, the tech that's going to come out, and the awareness that we can bring to the body-mind connection,

    193

    00:54:55.210 --> 00:55:21.459

    Kristie-Ann Opaleski: for adults especially (and I think it should be geared more toward adults than children right now, because we have more reasoning power), I think that's so powerful and innovative. It's going to make such a difference if we use it in that way, and if that's how it's talked about, as opposed to the doomsday prophecies. So I think it's a wonderful tool for adult SEL, as much as it is for kids.

    194

    00:55:21.820 --> 00:55:26.420

    Amanda Bickerstaff: Absolutely, there's so much great stuff there. And finally, Anabel, you're going to take us home. Same question.

    195

    00:55:26.670 --> 00:55:41.280

    Anabel Ibarra: Okay. So honestly, I tell people, just play with it; the limits are your creativity. If we're talking about SEL for adults and what they can do with it: for me, family is a huge thing. I have five kids, from the baby,

    196

    00:55:41.280 --> 00:56:01.340

    Anabel Ibarra: she's about to be one, to a six-year-old, eight-year-old, eight-year-old, and twelve-year-old, so we're always busy. But one of our favorite things is bedtime stories, and ChatGPT is our favorite for that. I'm like, okay, you tell me one thing you want in the story, you tell me one thing, you tell me one thing, and then we put together a story and I read it to them. That's become our routine, and the kids love it. And I'm like, that's such a

    197

    00:56:01.340 --> 00:56:26.680

    Anabel Ibarra: bonding time; it just brings us together. So that's super easy. On the teaching side of it, I use it in my data chats with teachers all the time when we're creating our re-teach lessons. They're like, "Oh, we could do this," and I'll just show them: type these prompts. And for them, that takes the pressure off of having to recreate things. And so, the ability to

    198

    00:56:26.680 --> 00:56:33.099

    Anabel Ibarra: just use imagination and creativity, put it in, and see what happens. So that would be my advice.
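
    Anabel's bedtime-story routine is also an easy pattern to reproduce: each child contributes one element, and the elements get folded into a single story prompt. Typing the same request straight into ChatGPT works fine; as a minimal sketch of scripting it, assuming the current OpenAI Python SDK, an OPENAI_API_KEY in the environment, and a model name chosen only for illustration, it might look like this:

# Illustrative sketch of the "each kid picks one thing" bedtime-story prompt.
# The model name is an assumption; substitute whatever model or chat tool you use.
from openai import OpenAI

def bedtime_story(elements: list[str], listener_age: int = 6) -> str:
    prompt = (
        f"Write a short, gentle bedtime story for a {listener_age}-year-old. "
        "It must include all of these things: " + ", ".join(elements) + ". "
        "Keep it under 300 words and end calmly."
    )
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # One contribution per kid, as in Anabel's routine.
    print(bedtime_story(["a purple dragon", "a school bus", "tamales"]))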

    199

    00:56:33.350 --> 00:56:49.569

    Amanda Bickerstaff: Oh, man! Well, superheroes wear all kinds of clothes: the fact that you have five children, are a principal, and are spending this time with us, oh my goodness gracious, thank you so much. And I just want to say thank you to this wonderful panel. There's so much good here. I think that

    200

    00:56:49.570 --> 00:57:13.839

    Amanda Bickerstaff: what we've underlined is that the limits are our own at this stage, in the sense of how we can start thinking about these tools, knowing that they need to be used responsibly. And as Kristie said, let's start with adults first and then find those spaces in which they can help students, and use things like Alongside that are designed to be used with students, are built on an evidence base, and use generative AI only in small spaces.

    201

    00:57:13.840 --> 00:57:42.869

    Amanda Bickerstaff: And so I think this is such a great opportunity for us all, and the challenge we always like to give is: go try these things yourself, with your staff, and so on. I just want to say thank you so much to our panel: Margot, for pulling this together and contributing to our prompt library, and Kristie-Ann and Anabel for doing this work with us and being here. Considering you're all practitioners, I know how busy you are. I also want to thank our audience; thank you for always being a part of this and sharing your thoughts and best practices. We got such good feedback already, and I just hope that everyone has,

    00:57:43.000 --> 00:57:51.930

    Amanda Bickerstaff: whether it's a good morning, a good night, or please go to bed, thank you for joining us, and we look forward to having you here next time. Thank you, everybody, and thank you to our panel.