From Research to Practice: Insights from Stanford’s Work on AI in Education

Join AI for Education and members of the Stanford Accelerator for Learning team for an insight-rich session on what's happening with AI in education today.

We explored how their work is helping system leaders cut through the AI noise, highlighting real-world research from classrooms across the country. We also showcased practical AI literacy resources for teachers from Stanford's CRAFT initiative and shared new research on how AI is impacting academic integrity.

Attendees left with actionable insights, ready-to-use tools, and clear next steps to guide AI decision making and practice from the district to the classroom level.

In this session, we:

  • Shared findings from Stanford’s School-AI research repository, including how educators are using it to inform classroom practice.

  • Highlighted CRAFT’s frameworks and resources for integrating AI competencies across disciplines.

  • Discussed trends, challenges, and opportunities in building ethical, practical AI literacy for educators and students.

  • Reflected on the balance between research, implementation, and integrity in the evolving AI landscape.

  • Slides

    CRAFT AI Literacy Resources

    AI Hub for Education

    Stanford Research Study Repository

  • Amanda Bickerstaff

    Amanda is the Founder and CEO of AI for Education. A former high school science teacher and EdTech executive with over 20 years of experience in the education sector, she has a deep understanding of the challenges and opportunities that AI can offer. She is a frequent consultant, speaker, and writer on the topic of AI in education, leading workshops and professional learning across both K12 and Higher Ed. Amanda is committed to helping schools and teachers maximize their potential through the ethical and equitable adoption of AI.

    Chris Agnew

Chris leads the AI Hub for Education at the Stanford Accelerator for Learning, aiming to be the trusted source for education system leaders on what's working (and what's not) to benefit students and schools and to reimagine learning. Across 25 years in experiential and applied education, Chris has both taught and led organizations in K12 and postsecondary. Prior to Stanford, Chris led US higher ed strategy and credentials for Multiverse, an edtech startup using professional apprenticeships as an alternative to college and university.

    Joba Adisa

    Joba Adisa is a postdoctoral fellow in Human-Centered Artificial Intelligence at Stanford University’s Graduate School of Education. He works with Professor Victor Lee on the CRAFT (Classroom-Ready Resources About AI for Teaching) project, where his research focuses on expanding participation in AI and data science education across formal and informal learning spaces. ‘Joba examines how educators and students can apply, critique, and responsibly collaborate with AI tools for learning. He has partnered with teachers, school districts, and organizations such as Google Research to co-develop classroom resources and games that make AI education accessible to all learners. He also leads professional development workshops for educators, helping schools integrate AI and data science in meaningful and equitable ways. Before joining Stanford, ‘Joba earned his Ph.D. in Learning Sciences from Clemson University, where he developed programs introducing students in the U.S. and Africa to the foundations of AI and data science.

  • 00:00
    Amanda Bickerstaff
Hi, everyone. Welcome to our December webinar focused on the amazing CRAFT resources and the work that Stanford has been doing on AI and education since before I started AI for Education. And I'm just really excited to be here with two amazing colleagues and professionals around this work. We are just really excited to be able to highlight the work that's being done. We love AI literacy resources. What's really cool is that we're going to talk about kind of two parts of the AI in education space. We're going to talk not only about these really wonderful resources that CRAFT has, but also some of the research that really underlies it. And I think that right now, considering how fast things are going, you know, we need, of course, the practical, but we also need the evidence behind it.


    00:47

    Amanda Bickerstaff
So really excited to have Chris and Joba here with me today. And as always, please make sure to get involved. We are so unbelievably lucky to be able to have everyone be able to say hello and communicate with each other. You all have been an amazing part of our journey. Our webinar culture we just love. So feel free to say hello in the chat, where you're from, what brings you here today. And then also a couple of things: if you have resources, please share them with each other. We do. We will remove note takers, including our own. So this is meant to be a moment together. And then finally, if you have a question specifically for Joba, Chris or myself, please use the Q&A function, because that's where we're going to be able to.


    01:31

    Amanda Bickerstaff
To really dig into those questions, because it goes pretty fast, which you'll notice; we have people from all over the world here. So if you want to lift something up to us, just put that in the Q&A. And so I am just so excited to have this conversation. I'm going to start with Chris. Well, even back in the day. So one of the first things that I ever did at AI for Education was come out to the Stanford Accelerator for Learning AI and education conference in January 2024, and I was, very thankfully, asked to be part of a panel. That was one of the first things, and I will be honest, I had five minutes to talk about AI and AI literacy. And it's probably, out of all the things I've ever done, the most nervous I ever was.


    02:15

    Amanda Bickerstaff
First of all, talking in five minutes is not my sweet spot. I like. I like a moment. But also it was such an opportunity to be in front of people, having only done this work for six months, talking about the necessity of AI literacy. And since then, it's been a joy to get to meet people like Isabel Howe and Glenn Kaiman, but also Chris, who was a part of this futurist convening, which, if you want to hang out with us next week, we'll be talking about in our first webinar on that piece. And when we talked, I was like, Chris, we have to do a webinar together. We have to figure out a way to share the great work that's happening at Stanford.


    02:49

    Amanda Bickerstaff
And so what's really cool is that we have Joba here as well, who's going to kick us off, who is a postdoc and who is really doing the work of research, of thinking about curating, and not just curating and building great resources, but identifying how they work and what kind of work needs to happen. And so I'm going to hand it over to Joba, who's going to go through CRAFT, kind of what it is. And we're going to have two micro talks. So Joba is going to go first and then Chris, and then we'll have an opportunity to share together. So, Joba, do you want to take us away?


    03:19

    Joba Adisa
Okay. Thanks, Amanda. Hi, everyone. I'm Joba Adisa and I'm a postdoc here at Stanford and work on the CRAFT project with Professor Victor Lee, which is supported by the Stanford Accelerator for Learning. So CRAFT stands for, you know, Classroom-Ready Resources About AI for Teaching. And the goal is really to support K12 educators in order to be able to teach students. And we really look at, you know, how educators can teach with and about AI within the discipline. And the whole idea of CRAFT is really, like, you know, that AI shouldn't be taught as a separate subject, even though, you know, everybody should learn one or two things about AI because of how it's changing society.


    04:01

    Joba Adisa
So we've kind of, like, worked with educators all over the US. It's been US-centric in terms of the teachers we work with, just because of logistics, but in terms of how our CRAFT resources are being used, it's been, you know, global. We work with educators through co-design to actually develop resources and see how teachers can create content that actually integrates, you know, AI concepts and practices into their disciplines. So I'm going to talk about, you know, the approach we kind of, like, take in the next slide. Thanks, Amanda. Yeah, so the approach of CRAFT has really been grounded first in co-design. Like I said, we work with teachers across multiple disciplines, from STEM to art, humanities and all that. And the idea is, you know, AI cuts across every discipline. It's applied in one way or the other everywhere.


    04:53

    Joba Adisa
So we work with teachers to design some of, like, these materials and just see, you know, like, in math, it might be, you know, oh, maybe how statistics relate, you know, to AI, and that could be a different, you know, thing in art entirely. And so we look at all of that, and another thing is just the multidisciplinary approach we take, which I think I've talked about. And we also try to, you know, just go with the flow. Initially it was a lot about, you know, face ID and classification. And suddenly 2022, boom, you know, generative AI. And that has, you know, really been a big part. And the idea of CRAFT is actually to create resources where teachers could actually take the whole lesson and, you know, just teach it in their classroom.


    05:38

    Joba Adisa
But we also recognize, you know, teachers have a lot of, like, agency in terms of what's best for their class. And so with these resources, teachers could just, you know, take some part of it and just mix and match it. So I'm just going to share some of, like, you know, the work where we've seen teachers use or design CRAFT lessons in the next couple of slides. And you can look at our website, which Amanda will share later, for the, like, resources. And so this one is an ELA class, English Language Arts, where this teacher is looking at how AI can, you know, help us become better writers. And what they did with the students is to actually, you know, they taught an AP class.


    06:15

    Joba Adisa
And what they did was ask students to, you know, write an essay and then evaluate an AI-generated, you know, essay, and just trying to, like, you know, use the AI to create. So they're learning prompt engineering, but also kind of, like, assessing how AI writes compared to, like, a human, and evaluating that using AP rubrics. And that was just, you know, this teacher teaching with AI, but also teaching about AI core concepts, like, you know, whether there's bias in it and how AI models language and the like. In the next one, that's really gonna be, you know, social studies. This teacher actually used, I think, MagicSchool to actually have the students, you know, debate with a chatbot before bringing them back. And they were, like, just talking about, you know, controversial, you know, topics.


    07:03

    Joba Adisa
And this was a teacher using AI to actually support the students' communication and argumentative, you know, reasoning skills. And still, you know, with the teacher directing all that. The next one would really be physics. This is one of my favorites, just because this was a teacher that had never used a generative AI tool. And then during the co-design, they were like, oh, I don't know what to do, stuff like that. And we were like, you know what? Let's just figure it out together. And what they eventually did was actually, you know, have students think about the energy and efficiency trade-off, you know, between the brain and AI, and just doing some of, like, you know, the calculations for the physics folks. But the whole idea was actually to just see the impact of AI in society, you know, the energy AI consumes.


    07:52

    Joba Adisa
And I think the beauty of this lesson was just being able to, like, you know, integrate what they would normally learn in their physics class, from calculating power, energy and all that stuff, and still learning about AI. So this is kind of, like, the approach, you know: we take teachers from, like, different levels in their AI use and understanding, different disciplines, and we are just working together based on, you know, researchers here, our own knowledge on AI, and pedagogical expertise, and bringing all that together. There are a lot of, like, resources on the website. Yeah, this slide is just, you know, showing you how our CRAFT resources are being used globally. 54% of users recently have been from the U.S. We have 4% from Canada, 4% from the U.K., and, you know, just 10% from Indonesia, Spain, Brazil and China.


    08:43

    Joba Adisa
And the other larger chunk has just been from just so many other countries in Africa, Europe and Asia. Yeah, I think, Amanda, we can possibly move to the next slide. These have been the most visited resources on the website. The "AI or Not" one has always been one of, like, the top ones. Just a fun activity for checking AI search results. It's something users are, you know, interested in. And the physics one I showed earlier. And then the last one is one of our baby projects with Google Research that has focused on AI literacy in middle school. It's AI Quest, and we have that. And it's just been getting a lot of traction and people are really interested in it, and you can check it out. And the next slide. So, just some challenges and tensions we deal with in designing and developing resources like this.


    09:38

    Joba Adisa
I think one is accessibility. Even though we try to reach a global audience, we recognize that our resources are online, and even though we do have some, like, you know, unplugged resources, you still have to, you know, be online to access some of these, you know, Google Slides or stuff like that. And then there's just the technology. We try to keep it simple, you know, using existing tech tools. We don't create our own AI tools, and some of those change pretty quickly. So just keeping up with that. And also teacher preparedness. You know, teachers come from different walks, different levels, different ways of integrating things, and just being able to, like, speak to all of those at the same time.


    10:19

    Joba Adisa
And then there's just, like, you know, the perennial question of how much should teachers know in order to be able to teach with AI or teach about AI. And the final thing would be, you know, the challenge every one of us deals with: AI just keeps changing. And, you know, that kind of, like, changes the whole approach sometimes, where, oh, it's Gen AI today and tomorrow we don't know what. But let's just hope, you know, Gen AI keeps us on the ground for a while. And finally, I think just some, like, other things we are looking at: we've been pivoting a lot now into, you know, designing AI education in a way that really, like, promotes disciplinary practices, or, you know, just those skills you want to, like, see.


    11:01

    Joba Adisa
The idea is that, you know, there are some thinking skills, some epistemological practices, that are, like, different in a physics class compared to, like, a math class or compared to, like, a science class, and just seeing how AI can, you know, support that, or how we can tease that out, is something we're looking at. And we're also kind of, like, you know, just expecting and encouraging contributions to CRAFT from a wider audience. And finally, just, you know, even us as a unit, how we can actually use AI to support our own resource design with teachers. And I think this is going to be all. I'm going to be happy, you know, to answer questions or say more about, you know, our design approaches.


    11:40

    Joba Adisa
And our work with teachers. And, yeah, I'm just going to hang in here while, you know, I pass it to Amanda or Chris and we'll see.


    11:49

    Amanda Bickerstaff
Well, so, I mean, I think first of all, sorry everybody, for the 9,000 note taker, like, moments in the chat. So hopefully we've got all those out. But Joba, as an organization that is, like, also building a lot of resources, I do think it's really interesting to see. It feels like there are two components of CRAFT. There's kind of the machine learning AI literacy component and the more generative AI component. Have you guys thought about actually separating them, or where is the majority of your focus now? Are you going to be focusing on the larger AI literacy or the more specific generative AI literacy? Because it's something that we're thinking about a lot right now.


    12:34

    Joba Adisa
Yeah, so our focus is still AI literacy as a whole. Just because AI itself is an evolving field, we recognize that, yeah, the in thing right now is generative AI, and we try to stay current with generative AI, but then we always keep in mind that everything about AI is not about generative AI, so we don't fall into the trap of, oh, ChatGPT is AI and that's just it. Yeah. So we kind of keep that broad level, and that's kind of, like, the approach we still take. And that kind of, like, helps, just because by the time we go into, like, you know, key discipline stuff, for example, you know, math, and thinking about how AI, you know, shows up there, whether it's statistics or probabilistic thinking or decision trees.


    13:22

    Joba Adisa
    Some of those things, I mean, aren't like, you know, just directly LLMs, but then they are still like AI.


    13:29

    Amanda Bickerstaff
Yeah, well, they're actually different, though. Right. I think that's even more of a distinction. Like, decision trees are very much a part of machine learning, but not large language models. So it is, like, an interesting tension, though, that we're thinking about a lot. I think that's why we've kind of decided to go more into generative AI, just because it's easier to do, for one, I think. But I love the fact that you're trying this idea of moving toward disciplinary knowledge, because I think that's the place. I'm going to say a bold statement. I'm not sure that most educators are at a place where they're ready to do disciplinary-focused AI integration. But I do think that by putting that into the world and designing that more, when people get more ready, they'll have a place to go they can trust.


    14:17

    Amanda Bickerstaff
    So I think that's really cool. Well, so, okay, so we're going to make a transition to Chris. And so Chris has had an amazing background of all the things. But one of the things that I love about the work that you're doing right now is just really thinking about the evidence behind things. And I think that the Hub itself, the work that you all are doing, I think is really special. And so I'm excited to hear from you about. We've got these resources and this big movement for AI literacy, but how are you all thinking about the larger world of AI in education? Handing it over to you.


    14:50

    Chris Agnew
Outstanding. Amanda, thank you so much for having us. And to everybody that's giving us your lunch hour, or, actually, our global community, wherever you are, really excited to get a little time with you all. So I'm Chris Agnew. I'm the managing director of the AI Hub for Education. We're part of the Stanford Accelerator for Learning, and my goal today is to give you two things by the end of this: one, a little deeper insight into one particular study on teacher tool use of AI, and then one tool to give you some agency after this to ground yourself in what is noise and what is meaning in the world of AI right now. So we can go to the next slide.


    15:34

    Chris Agnew
So at the AI Hub for Education, we aim to be a trusted source for education system leaders on what's working and what's not to benefit students, schools and learning. So Joba and the CRAFT team do outstanding work at the classroom and teacher level, and then for the Hub, we're thinking on the systems level. So we're thinking districts and states as our core audience. We have three priorities: doing original research, building tools from that research to help education system leaders make better decisions, and then engaging the field, bringing together researchers, product builders and practitioners to helpfully improve practice. We're modeled on the National Student Support Accelerator, which is our sister lab focused on high-impact tutoring; we're building on their playbook that bridges research to practice. So we can go to the next slide.


    16:29

    Chris Agnew
So of those three priorities that I mentioned, research, tools and engagement, given the nascent stage of AI, specifically generative AI, and the very small body of research that exists out in the world right now, our first priority of those three is spotlighting the research that exists and then doing original research. So, three current projects. One: actually, at the end of last year we released some research called Tutor CoPilot, focused on human tutors that are augmented by AI. I'm going to share in a moment some more research on SchoolAI teacher tool use and some insights we learned from that. And then we are in the research planning stages with OpenAI, specifically on ChatGPT use in K12 globally, with current planning of ChatGPT use with grade 10 and 11 students in Estonia starting next month. So we can go to the next slide. So the focus.


    17:33

    Chris Agnew
So this is the first thing I want you to take away: a little deep dive and some insights on teacher tool use in a very well known tool out there, the SchoolAI platform. So we got a large data share from SchoolAI and investigated 9,000 teachers' use of their platform. These 9,000 teachers adopted the platform in fall 2024. You can see it in this crimson highlighted window of time from last year. And we analyzed their use over three months, basically the first three months of the school year, to understand what they are using and how that's changing over time. And we found three key things. 42% of people who initially adopted it became regular or power users. This means basically that they used it at least one time a week over those first three months.


    18:25

    Chris Agnew
The most common use time of this platform was mid-morning, Monday through Friday. So squarely during the school day. Notably, this was not nights and weekends, which was an early potential hypothesis, that teachers would use it the night before, planning lessons, things like that. They were using it during the school day. The third one, and I'll dive into this more in a moment: regular users began by building AI that would face students, but over time their behavior changed to use AI to augment their practice as a teacher. So let's dig into this last point with the next slide. So this is pretty interesting, we thought. So of those 42% that became regular or power users, 1% were power users. These are people that used it at least every other day for those three months. Regular users were at least weekly users.


    19:20

    Chris Agnew
So in a snapshot, these two graphs show different types of tool use. We have red, which is teachers building AI products that would face their students. We have yellow, which is teacher productivity tools. This is build me a lesson plan, things like that. And then teal, we have a teacher chatbot to ask questions, things like that. So the insight here is regular users, you can see, start out by, you know, you think AI for education, they were thinking first, okay, I'm going to build tools where my students will engage with AI. They started off with a third of their time building AI tools for students. And you can see the more they used it, the less they used it to build AI facing students.


    20:15

    Chris Agnew
And actually what really grew over time, where they saw value, was: I'm going to use this platform, SchoolAI, I'm going to use generative AI to augment my practice as a teacher. Whether that's building lessons, lesson plans, whether that's grading, whether that is looking for insights on a specific topic. So we can go to the next slide. So the second thing I want to leave you with is a tool to give you some agency in making sense of all the news and information out there. So in January we launched the Research Repository. This is on our website. It has all the research that exists relevant to K12 and AI. Right now there's over 800 papers, and we have a big update planned next week that will likely bring us to over a thousand papers in the research repository.


    21:05

    Chris Agnew
You can search or filter based on your problem of practice, your curiosity, whatever that is. We can go to the next slide. Two real key use cases that we're hearing from teachers, from district leaders, from educational consultants, from curriculum leaders are outlined right here. So you can go to the research repository, you can search for papers that you're interested in, and then you can read those papers. That's great. I assume many on the call right now are practitioners. They're in classrooms, they're leading a school, they're making decisions for a district. And so you might feel like, I don't have time or the ability to read a 50-page academic paper. So we're hearing from our users two really good use cases.


    21:54

    Chris Agnew
One is go to our research repository, filter or search for the relevant papers that you're interested in, whatever that topic is, then download the papers, and then take those papers and upload them into NotebookLM and engage with them. That way you can chat with the papers about this topic area, you can listen to a podcast, et cetera, and you can make yourself a lot smarter based on the evidence that exists in a short amount of time. So that's one use case, NotebookLM. I'd say that's the most common. Another application is to use Deep Research. This could be, you know, pick your favorite frontier model, but go to the Deep Research tool and then enter a prompt. In the red text here, I have example language, but you can prompt based on your relevant topic.


    22:46

    Chris Agnew
In Deep Research: I'm interested in research that exists on AI and middle school math outcomes. Investigate this question. You can point it to our research repository and then tell it what you want it to bring back, and then the tool will spotlight our research repository, pull the relevant papers there, and then you can make sense of it that way. So, two good use cases for you to sort through the evidence that exists and make it actionable. Next slide really quickly.


    23:18

    Amanda Bickerstaff
I love you, Chris. I want to make one slight change to your Deep Research point. You cannot constrain Deep Research to only use a repository. And so it sounds great, but it will open it up pretty widely. You're almost better off, instead of using Deep Research, using the same prompt without the Deep Research component, only because Deep Research is an agent and you cannot control the actual sources that are used. So this same prompt would be amazing, but I would just highly suggest using it with plain vanilla Claude, ChatGPT or Gemini versus the Deep Research component. But I would highly suggest doing this, though, because these are the ways to make these tools work for you. That's a repository it would take us years to get through.


    24:08

    Amanda Bickerstaff
But finding the things that you care the most about is going to be something that is really powerful. So awesome. And I will now stop talking. Chris, back to you.


    24:17

    Chris Agnew
No, that's excellent. Great. Thanks, Amanda. Next slide. So we are at the end here. So we've got some good resources at CRAFT and at the AI Hub to get more information. There was an earlier QR code to go to the research repository, but it's easily findable in search, and that's what I have for you.


    24:40

    Amanda Bickerstaff
So first of all, okay, so I always think this is so interesting. So I have a couple questions before we bring Joba back on the stage. So there are so many new papers, but it feels like the quality of Gen AI and learning papers, or even Gen AI and teaching quality impact papers, seems very limited. Maybe I'm just being nice. Am I right? Am I wrong? Am I missing things? But it just feels like, even when we're on a stage and we're talking about notable studies, we haven't quite gotten that, like, shining star, you know, learning or teacher impact study. Although I think there's the work you're doing with SchoolAI and the upcoming work with OpenAI, it feels like it's all really still new. Is that, is that a fair statement?


    25:26

    Chris Agnew
Oh, I'm glad you highlighted that. And so, short answer: yes, we are in early stages. Second piece, I would say we can't wait for the one shining star, though, too, because this is going to be iterative, and quality research is going to be narrower in scope, right, if it's answering a specific question. And so of those 800 papers, a small percentage are actually RCTs that can actually answer a question for us. And an even smaller percentage of that are RCTs of very good quality. So one shout out: right now, what's in the research repository is gauged by topic relevance, because there is so much coming out that if we were filtering for quality constantly, we would be so far behind that it just wouldn't exist.


    26:20

    Chris Agnew
So in the first quarter of 2026, so by March, we're going to be releasing a State of AI in K12 Research report. We have a team of researchers working on this right now that, in six different categories, is doing a deep dive of, okay, what do we know right now based on quality research? What is early signal that might be lower quality or just early indicators, but still something to keep an eye on? And then what is just noise? So keep an eye out for that.


    26:53

    Amanda Bickerstaff
Well, I think that'll be really helpful, because as someone that's a researcher, I see some of these studies that get very big news, like "Your Brain on ChatGPT" and "Gen AI can harm learning." And it feels like those things break through because they're very, like, ooh, this is the worst case scenario. But I do think that the report and the repository are going to be so important. And I want to go back to this as well, because I will say that even being able to do work with the tools themselves... The thing about this I find really interesting is that even though we can't completely correlate or establish a causal relationship, it does kind of show that student chatbots seem like the thing people assume is where the worth is.


    27:39

    Amanda Bickerstaff
But if you start using student chatbots, you see limitations incredibly quickly in terms of the quality of the generative AI systems underneath, the models, how they're used, the applications, even the design. What kid wants to spend 30 minutes talking to a chatbot? Sorry, I'm just going to say it. But I'm really interested in the idea, though, of next steps of something like this, to say, are these augmentation tools actually making it not just easier to teach, but am I teaching better? Am I teaching in ways that are more differentiated, or that are focused on, you know, I'm not spending all this time doing X and now I'm focused on what's important for me, which is my "why" in the classroom.


    28:22

    Amanda Bickerstaff
And so I think that these are incredibly strong signals, and, like, I really, you know, want to start to understand. I'm sure you do too, I know you do, as long as we've known you. But, like, I think that we want to see more of this, right? I would love to see MagicSchool open up their data. I would love to see, you know, Gemini and Google, and, like, there are lots of big decisions about what's out there in the world. So I hope that these are, like, moments and clarion calls for why this research is going to be so important, because these tools are integrating very quickly into schools.


    28:53

    Amanda Bickerstaff
And I actually made a bit of a hot take today at our women's group that I think next year in education, for good or bad, is going to be the year of the Gen AI tool. I think next year we're going to see an incredible increase in organizations buying tools for the classroom and for teachers. And so I'm just really... I think that this is such a beautiful, like, start for this, and I'm excited about the research that comes out of everything that you all are doing. So okay, I'm going to pull Joba back on the screen. And so I think that this is some really interesting stuff, but I think, Joba, I might put you on the spot a little bit if that's okay. So one of the biggest, noisiest things that came out is that AI is for cheating.


    29:37

    Amanda Bickerstaff
And Victor, who isn't with us today, had been doing, you guys have been doing, a cheating study for quite a while, since before generative AI. And we, you know, you guys put papers into the world, but we don't have a lot of data about what happened in 2024 or 2025. Is there anything you can share about the update to that research, about what the impact of generative AI has been on cheating? Because I know it's in peer review, but I'd love to know if there's anything you can share.


    30:07

    Joba Adisa
Oh yeah. So that's a project Victor leads with a couple of other folks. I think so far it's really just been the fact that there hasn't really been a significant uptick in cheating due to Gen AI, you know, not different from what we've, like, you know, seen in the past. And so it's not like, oh, Gen AI... Even though there's, like, the fear and all of that when it comes to, like, cheating, there isn't that, like, you know, significant increase for us to, like, make that claim that, oh, Gen AI is actually, you know, leading to, like, much cheating. And I mean, a lot of, like, you know, things contribute to that. You have students just being, you know, super aware, like, okay, I don't want to be caught plagiarizing, the implications and all that.


    30:53

    Joba Adisa
And even, you know, as educators, there's often, like, the opinion that, oh, students are going to use everything that makes their life easy. But, you know, research kind of, like, just suggests it's not always like that. I've had students, you know, who are like, I don't want to use Gen AI. I enjoy doing this work. And that kind of, like, just speaks to, you know, things like maybe theories that talk about, you know, locus of control or even, you know, effort and ownership in terms of, like, the work students do. But, you know, overall, the research kind of, like, has been contrary to, like, you know, the big claim, like, oh, Gen AI is leading to, like, more cheating among students.


    31:33

    Amanda Bickerstaff
I might push back slightly just because of, like, you know, the work that we do, but also just looking at consumer-grade cheating tools. Like, I hate to call out Quillbot, for example, being one of the top 20 Gen AI tools by usage, which is really a paraphrasing tool designed to get around AI detection. I would just say, I think, Joba, for people like us, I understand that we don't want to add to the rhetoric, but I do think that people are really questioning what has happened.


    32:02

    Amanda Bickerstaff
And I think that there is an opportunity, because what we're seeing maybe is that kids are cheating differently, which I think was established in the first two papers, where, with Gen AI, whether you look at all types of different research about search terms and where people are spending their time and clicking in terms of getting support to write their papers. But also a lot of what we hear from young people is: we know it's bad for our thinking, we know it's bad potentially for our learning and our voice, but it's really easy and we're still doing it. We hear the siren song of yes, in fact, I would say, in that situation. Since the beginning, it's been really interesting that young people have made this very interesting analogy of generative AI being like vaping, which I think is very fascinating.


    32:51

    Amanda Bickerstaff
So vaping makes it a lot easier to smoke, it's less smelly, all these things. And it's not good for you. But it's interesting, and I think that as we keep moving forward, we'd love to see this research be published, because I think those are the questions that are coming up. So I won't put you on the spot because you're not Victor, but I did want to bring that up because I think it's something that we really would love to see. But going to Chris, I mean, I think that there are some questions, especially on the research. I want to start with maybe our perfect world. For example, what research doesn't exist yet that you would love to see?


    33:31

    Amanda Bickerstaff
And one of the questions from the audience was, how can we as practitioners get involved, where we get to be part of these research pieces? I think that's a really interesting question. And then I would say, Joba, because you're doing research too, I'd love to hear your perspective as well, but I'll start with Chris.


    33:47

    Chris Agnew
Okay, I'll start with the second piece, as far as how to get involved in research. In the materials that will be shared out, my email address is on there. We are always looking for districts that want to partner for research. For the AI Hub's interests, we're looking at large data sets. So there's great work happening at the individual classroom level. That's not our focus. Our focus is, say, district or statewide data; probably the smallest unit would be at the school level. But if you're at a school that is interested in doing research on AI, whether it's operational AI or AI for teaching and learning, please get in touch with us. And then the second piece, what is the research that we would love to see? I mean...


    34:42

    Chris Agnew
There is so little right now that the list of what I would love to see is really long. To me, what I'm particularly interested in, which I think the jury is really out on right now, and if we can get greater understanding on this topic it would be high value, is this question of: can generative AI be a great support and aid in building durable skills and metacognition, or is it an eroder in that space? And I think there are arguments in both directions, and there is research planning in the works to answer these questions. This will be a very important question to answer.


    35:30

    Amanda Bickerstaff
Yeah, and I think it starts to, like, feel less like education research and more like sociological research, where you had these, like, long-term studies on development, almost like child psychology development. It feels to me, at least, like having something that looks at and follows a high schooler throughout using these tools, but also, like, what's happening to elementary school students and middle school students that are accessing these tools in different ways. I think that there's something to be said about the developmental impact that would need to be longitudinal in a way that we don't often do in pure education research, which feels more cross-functional. I don't know, I feel like there's a big space for that. Do you know of any studies that are being designed in that way? I'm sure there are.


    36:19

    Chris Agnew
I think so. I personally, and this doesn't mean they don't exist, I personally don't know of any studies being set up right now to be purposefully longitudinal, because right now there is such an urgency and recognition that major decisions are being made by districts every week, every month, and that the more insights we can bring right now, rather than promising insights 10 years from now, will help set us on a better trajectory. So this is kind of a similar theme: instead of waiting for the perfect research that will come out in a decade, can we get early signals now that just make us a little bit smarter to make a little bit better decision? And I would say the groundwork is being laid for a lot of these that could be repeated over years.


    37:09

    Chris Agnew
So one of our approaches, which was exemplified by the SchoolAI blog post that I shared earlier, is that rather than our end goal being the peer-reviewed academic paper that comes out in two or three years, we do these short three-month research sprints, then we'll publish a blog post that is not peer reviewed, but it's the early insights, here's what we know right now. And then three to four blog posts get stacked together, integrated into an academic paper, peer reviewed, and then stamped for longer-term knowledge. But the hope is getting insights out quickly and early.


    37:50

    Amanda Bickerstaff
Well, I would also say this is an opportunity for a much faster iteration process or sharing process, just because generative AI makes it easier to qualitatively code, so thinking of coding qualitative data, meaning what is this about, the sentiment analysis, than ever before. I mean, I don't know if, Joba, you've done this, I'm sure you have, but sitting down and coding enormous amounts of data. Whereas there is, and I'm not saying this is right, but there is research coming out, or, like, pure data sets, where all they're looking at is, like, AI classifiers doing that. And they're doing these huge projects very quickly where they're able to say, here are the themes; what would take a human researcher hours and hours, if not days and weeks, can be done in hours to days. And I think that's really interesting for the approach


    38:38

    Amanda Bickerstaff
you're taking as well. Joba, for you, like, if you had your magic wand, or the work that you want to be doing, what kind of research are you really interested in?


    38:47

    Joba Adisa
Oh, that's kind of like, that's a golden opportunity. Yeah, I think on the part of, like, students, I've actually been really, like, super interested in terms of, you know, what actually gives, like, the Goldilocks effect in terms of, you know, the right amount of AI, or actually where and, you know, how AI should, like, come into the learning process. And just because my own take is, you know, the tools are kind of, like, not just, you know, here to stay, but just the fact that when you think about participation in society, students really have to, like, be prepared to, like, just, you know, not just use these tools, but create and apply them. So for me it's really, what's kind of, like, the best framework or, like, setup where it actually promotes, like, learning.


    39:36

    Joba Adisa
And so I would really, like, you know, just want to, like, see my research be about effective collaboration between, you know, students and AI. In our work with teachers, I've, you know, seen a lot of ways teachers are collaborating with AI, especially early-career teachers who are like, oh, you know what? I've not, like, taught this subject before, it's my new class, and they are really, like, using AI with, like, support of districts and the like. So I just want to, like, be able to, like, see that for, like, students. Because I do agree with you. You know, there's just a risk around that developmental appropriateness. You know, I know there is the part of, you really need to, like, have, like, the content knowledge, domain expertise, when using AI in order to be able to, like, you know, even use it effectively.


    40:21

    Joba Adisa
So do you want to, like, introduce that to somebody at, you know, an early grade without them developing that knowledge? But then also, there's a way they can use it to develop that knowledge. Or maybe you don't even want to use it at that level at all. I mean, there's that option also. And I think the other thing, which we are actually working on now, is just to actually, like, see how and whether AI is actually, you know, truly supporting teachers and actually assisting with their workload. I think there are, you know, so many AI tools for teachers out there, and it's like, you know, it's gonna, like, save you a lot of work and all that. And, you know, working with teachers, I mean, we are really, like, not sure.


    40:58

    Joba Adisa
It's just like, you know, on Stanford's campus there's always this new edtech being founded that's like, okay, this is gonna, you know, save everybody's life, and we just don't know. So we're starting to, like, look at that area. Victor is leading that project also, where we are just looking at, you know, AI and teacher time efficiency, and just seeing, following some teachers out there using AI, doing experience sampling in their day-to-day life, and just trying to see whether AI, whether it's, you know, just making them efficient in terms of their workload, or even if it's taking them more time, maybe it's making the work better and all of that.


    41:40

    Amanda Bickerstaff
Yeah, yeah. I mean, I think there's so much to unpack there, Joba. I want to start with just this idea of the Goldilocks zone. And the thing that we picked up towards the end of your response was the idea of choice, about when not to use generative AI too. I think one of the things that we think a lot about is, as generative AI gets better and better at doing what we do as humans, what's going to happen when we have to teach young people, and ourselves, when it's best to say: no, I will not. Yes, I can, but I shouldn't.


    42:12

    Amanda Bickerstaff
I think that's a big question that we're thinking a lot about, of choice: if AI systems can do everything that we need them to do, is that actually helpful in some cases, or could it be harmful? And especially at different developmental stages, different stages of knowledge building and acquisition, expression. And so I think that's the thing we're thinking about a lot. But then I also... I think this is so interesting, though. There's such a big push towards efficiency as the measure. There was a Walton Foundation piece that says teachers are saving six hours of time, but I don't know any teacher that just saved a bunch of time and became a better teacher. I don't think that saving time makes you a better teacher.


    42:50

    Amanda Bickerstaff
It might make you a happier teacher, it might make you a less stressed teacher, but does it make you a better teacher? That, I think, is a very interesting kind of question. So I'm excited to hear about that research coming out. And we're gonna make a slight pivot, because there's a huge conversation in the chat around assessment practices and human skills, but, like, this rubber band effect where we're going back in some cases to pen and paper in a way that could be quite positive and quite damaging at the same time. So I'll throw this to Chris, because I know it's something that we both think about a lot, but, like, the idea of how much will things need to change? Right.


    43:32

    Amanda Bickerstaff
I think that this is a question of, like... We're kind of working through this idea that education doesn't need to change, but, like, we need to figure out how AI fits, versus, does education need to be broken apart and created anew so that there is a place for AI to work within it? I don't know. I think about this a lot. I know that's probably on your mind as well.


    43:55

    Chris Agnew
Oh, I think about it all the time. This week is my freshman-in-high-school son's study week because his finals start next week. So we've been having lots of conversations in real time about assessment with my son this week too. And I would say, I'll put it bluntly, I am disappointed to see the resurgence of blue books. I hated them when I was in college and I do not think a strong return of them is a value add in assessments. So that's maybe a hot take. But yeah, the five-paragraph essay is there to assess critical thinking and understanding of a topic. The five-paragraph essay was a vehicle for insight into students' learning, not an end in itself. And so I do think there are real questions around what does assessment look like going forward, both what is being assessed.


    44:59

    Chris Agnew
Certainly knowing stuff still matters. So it's not just because of Gen AI that we can throw out academic content and say all that we care about is durable skills, metacognitive skills. So knowing stuff still matters, yet the means by which we assess it... And specifically, and this is to build upon what I was talking about before, research around durable skills and metacognitive skills, research around the AI tools' development there, is important. But also there is this opportunity: our ability in the past to assess these durable skills, metacognitive skills, has been quite poor. It's often an "I know it when I see it" scenario. In the best case, we have rubrics to evaluate it that have limitations in scalability.


    45:47

    Chris Agnew
So there is an opportunity, and the question is, like, there's no research to hang our hat on yet, but some of these tools can give us better windows into student metacognitive abilities.


    46:01

    Amanda Bickerstaff
Yeah, I mean, I think it is. I always find it so interesting because there's this... You know, we talked about durable skills and soft skills for 20 years, right? 25 years. How long have they been around? But how little, like, how little research do we have about what those actually mean, like how best to build them, what it looks like for students from different cultures and backgrounds and needs. Like, almost all the durable skill stuff that I think of is, like, workforce-down, meaning these are what we would hire for, but very little about what this actually means to the development of other types of skills. Like, I think, even remember the time when we talked about grit, and then, like, grit became kind of, like... It sounded great.


    46:45

    Amanda Bickerstaff
    We talked about it for so long, but there was like no way to really qualify what grit was.


    46:50

    Chris Agnew
    It was kind of a know it when you see it type scenario. Totally.


    46:55

    Amanda Bickerstaff
Ah, you remember that, though. I remember the grit moment, where everyone was talking about grit, and I don't mean to be flippant about that, but I do think, again, I mean, one of the things that I get really excited about is that, especially as researchers get more AI fluent and dexterous, we should start to see different models of education research. We should start to see faster iteration cycles, different types of data collection. There are these things that I think we haven't quite nailed ourselves as researchers about what the true opportunity of these systems will be in our practices. And I think that there are some really interesting places there too, because maybe it becomes easier to start to understand: is grit real? Is it bounded?


    47:41

    Amanda Bickerstaff
Is it something that could be measured by using multimodality, by using deep learning systems, by using different types of data collection? That, I think, is really interesting. Okay, so I'm gonna... We are coming up on time. First of all, I just wanna say I love our chat. You guys have gone in amazing, multiple directions. Because it's close to the end of the year, I'm gonna go a little bit rogue, so Joba, I apologize. I feel like Chris will not be surprised because he knows me. But I would like you, if you're comfortable... I'm inspired by the women's group meeting today, because I talked about, like, my thoughts about the year.


    48:15

    Amanda Bickerstaff
Can you identify, like, the biggest, like, takeaway that you had this year about, like, something in AI and education, or just AI, that really kind of sits with you going into the future, and then maybe one prediction for next year? I know this is hard. Joba is ready, though. Like, Joba is ready. You got this.


    48:34

    Joba Adisa
Yeah. I think one thing that kind of, like, sits with me, you know, just because I work with teachers, is really kind of seeing what teachers are doing with Gen AI. I mean, I've always been a bit, like, you know, pro emerging technologies, but just really, like, seeing what teachers are doing with the training I support. Like, the past two months, I've been interviewing, you know, teachers in the school districts here who are kind of, like, integrating Gen AI fully. They have, you know, Gemini Pro, all of, like, those licenses, and, yeah, kind of, like, just seeing how it supports, you know, teachers at different levels, seeing people who have been like, okay, I'm never using, you know, Gen AI for, like, any of my things.


    49:18

    Joba Adisa
And then after, like, three weeks, you know, coming back like, you know, I actually decided to try this AI tool and it actually saved me, you know, doing this stuff I've never done before. And it was great and all of that. And then just seeing how people flip, you know, from being maybe anti-AI, or like, oh, you know, I'm doing humanities, I need to be intentional about my feedback and everything, and then the next month they're like, oh, you should try this tool, try Brisk, this...


    49:44

    Chris Agnew
    Like that.


    49:45

    Joba Adisa
I'm like, what happened? So I think that is actually gonna, like, that sits with me. I kind of, like, just see how, you know, tech offers opportunities, and it kind of, like, you know, leads me to, like, two things for, like, next year. I think we're really just gonna, like, see the need for, like, you know, webinars like this, or even, like, just activities or opportunities to actually, you know, prepare teachers for, you know, being able to, like, know when and how these tools fit, you know, into their work, and just prepare them, you know, for it. And then the other thing I think we would see an uptick in next year is really gonna be a really, I think, huge adoption of, you know, Gen AI tools. I mean, you know, just saying.


    50:31

    Joba Adisa
I mean, this is literally, you know, less than three weeks for some teachers, and it's like, oh, yeah, now I'm all, like, you know, into AI, and teachers tell me they're using Grok. I've not even, like, used Grok in all of, like, my AI stuff, and they use so many tools that, you know, I'm like, okay, I can't keep up. But okay, thank you. And yeah, just the push and also the adoption, and even just the fear level or meter dropping really low, I think. Yeah, I think that's just something I'm taking with me into, like, the next year.


    51:05

    Amanda Bickerstaff
Absolutely. And I, yeah, I definitely agree with that. I think, like, again, I think next year is going to be, like, the year of the Gen AI EdTech tool. Feels like that's a big one. Whether it's the chatbots, Grok is coming up a lot. And even talking to people about how Grok was created, people are like, what? But anyway, that's a different story. So, Chris, your biggest takeaway from the year and your biggest prediction?


    51:29

    Chris Agnew
So I'm gonna go... Since it's end of year, I'm speaking in terms of calendar year, not academic year. So my biggest takeaway, and this might feel overly simplistic, but maybe, like, zooming out, that's where the value is. So we're finishing 2025. If 2024 was the zoom ahead for EdTech tools, so MagicSchool AI, Brisk, you know, run the list, 2025 was the year of the zoom ahead for big tech tools, frontier models. So Gemini, ChatGPT. And even in the last, what would it be, five months, Google and OpenAI have made big pushes in the K12 space and been explicit, like, this is a priority for us. So that's just an observation that we have this EdTech category and this big tech or frontier model category.


    52:26

    Chris Agnew
And human behavior often shows that we use one thing and we stick to that. And even if a product builds lots of different options, we often keep it simpler. So is it going to be a choice scenario, like, choose big tech, choose EdTech, or is it going to be a mix and match? I don't know. That's what I'm wondering. Prediction: May 2026 is going to be a signal, and August and September of 2026 are going to be big as far as adoption questions. And we're going to see a lot of, like, stakes planted, because by that time there will be FERPA compliance for OpenAI and free resources. It will be the first start of a school year where Gemini will be fully native within Google Classroom.


    53:15

    Chris Agnew
    And so we're going to see a lot of districts making decisions on are they in or are they out? And there is no judgment on that question, but that's gonna, we're gonna get a lot of data there.


    53:30

    Amanda Bickerstaff
Yeah, no, I definitely agree. It feels like this budget cycle coming up is going to be a really... it feels like, for the first time, there are probably going to be dedicated budgets in significant ways, at least in the U.S., for both, hopefully, crossing fingers, guys, AI literacy training and, like, learning for students, teachers, families, the whole shebang, but also, I think, for tool use. And so I think that it's going to be a pretty fascinating couple of months at the beginning of the year to see what actually ends up happening. So I just want to say, first of all, thank you to Chris and Joba for being here and sharing your knowledge and wisdom. I think there are a lot of people in the group here that would love to do research with you, submit resources.


    54:15

    Amanda Bickerstaff
Know that a lot of times it comes down to, not necessarily that there's not a space for you to partner, but that if you reach out, it's almost better, because then it's easier for organizations like Stanford to find you. So I would just highly suggest that. I think, also, just on the second piece, for those in the chat, one of the things that we always think about, in every frame that we're thinking about, is: this is an opportunity for us to be part of the choice. All the things that we've talked about today are only possible because of the people on this call, the people in the chat and the webinar, taking a question, a moment, an idea and pushing it forward.


    54:54

    Amanda Bickerstaff
This is such an enormous opportunity for everyone to kind of take ownership and agency in this moment. There's not a perfect paper, there's not perfect research or a perfect resource or a perfect approach. But the more that we start having intentional approaches that are human-centered, that are ethical, that are maybe calculated risk-taking in meaningful ways, that are asking interesting questions... I'm gonna say my one final thing, which is, like, I think there's more opportunity for that than ever before in the history of education. And I think that people like us want to hear from you as much as, hopefully, you enjoyed hearing from us. So I just want to say thank you. If you're not joining our webinar next week, I think that, you know, I hope you all have a wonderful holiday season.


    55:35

    Amanda Bickerstaff
    And as always, please, please, like, check out the resources that we share tomorrow. And thanks again to Chris and Joba and everyone that helped get this set up. And I hope everyone has a beautiful rest of your day, afternoon or evening. Thanks, everybody.

Want to partner with AI for Education at your school or district? LEARN HOW