Four Takeaways from New Report on AI's Risks in Education
Brookings study finds AI undermines educational and social-emotional development, as well as teacher-student trust
By Greg Toppo | January 20, 2026

In just three years, artificial intelligence has revolutionized entertainment, finance, manufacturing and many other industries.
But a new report from the Brookings Institution concludes that when it comes to education, the risks of AI overshadow its benefits.
Researchers interviewed K-12 students, parents and teachers in 50 countries. Their conclusion: AI undermines young people's foundational development in a way that simply can't be offset by its productivity advantages.
"The risks we found are things like shortcutting learning so that you have less cognitive development," said Rebecca Winthrop, who heads a research center at Brookings and is an author of the report.
While AI can reduce inequality, providing access to content for an estimated 250 million young people who don't reliably have it, it can also amplify inequality, since free AI tools are the least reliable and accurate.
"It is probably the first time in ed tech history where you have to pay more to have more accurate information," said Winthrop.

The co-author of a recent book, she said researchers found that young people who spend a lot of time with AI companions are "de-skilling" when it comes to basic human interactions.
In the end, researchers admitted that AI's rapid evolution puts educators in a bind. They're operating with little rigorous, longitudinal evidence on the effects of AI when it comes to student learning and well-being. As a result, they say, "None of us, not even AI's creators, can predict its potential dangers or benefits with complete accuracy."
Here are four key findings from the report:
1. AI poses risks that undermine children's foundational development and may actually prevent them from reaping its benefits.
Using generative AI undermines young people's foundational development, researchers found.
At its core, the researchers note, AI is a set of powerful productivity tools now being harnessed most effectively by "professional adults with fully matured brains. They have already developed sophisticated metacognitive and critical thinking skills that undergird their approach to their work." They also have deep expertise in their fields and the cognitive flexibility that comes with that expertise, allowing them to use AI as a "cognitive partner."
Not so for young people, who aren't "mini-professionals." Their brains are still developing, and school should ideally help them practice critical thinking and "sustained engagement with challenging material."
For most young people, AI isn't a "cognitive partner" but a surrogate. It doesn't accelerate their development; it diminishes it via cognitive offloading. The result, researchers say: declining skills across the board.
A teacher tells researchers, "If students can just replace their actual learning and their ability to communicate what they know with something that's produced outside of them and get credit for it, what purpose do they have to actually learn?"
A student puts it a bit more bluntly: "It's easy. You don't need to (use) your brain."
2. AI can impede students' social and emotional development.
Kids don't learn in isolation. Relationships with others, in and out of school, help them develop a sense of well-being. But using AI can undermine their ability to form relationships, recover from setbacks and stay mentally healthy, observers tell researchers.
Young people use AI chatbots for everything from homework to emotional support, therapy and companionship, and that has adults worried, researchers report. Nearly one in five teachers worry about AI's influence on student well-being, even though just 7% of students mentioned chatbots' emotional harm.
The problem, they say, is that it's equally possible that kids aren't experiencing emotional dependence, or that they simply lack "the self-reflective capacity" to recognize unhealthy emotional dependence and how it affects their well-being.
3. AI is already eroding trust between students and teachers, on both sides.
Teachers tell researchers they increasingly doubt that students are producing authentic work, while students think the same about their teachers.
Researchers found a fracturing of trust between students and teachers that cuts both ways. Teachers trust students less when they suspect them of using AI to complete homework. In interviews, 16% of teachers said this erosion of trust is "a significant concern."
And students, in turn, trust teachers less when teachers use AI to create lesson plans and assignments but aren't open about it.
More broadly, this development could be undermining students' trust in educational institutions themselves. "One of AI's greatest casualties may be the trust that ensures young people have what they need in school to meet their needs and prepare them for the future," they write.
4. It鈥檚 not too late to turn things around.
Researchers say that while AI is doing damage, the wounds are "fixable" and that adults "should neither capitulate to these harms nor focus solely on limiting their repercussions."
The report offers 12 recommendations, including:
- Shifting education away from the "transactional task completion" that AI can most easily help students with.
- Co-creating AI tools with educators, students, parents and communities. The researchers suggest that schools create "student AI councils" that can help embed student voice into AI tool design "to ensure their relevance, inclusivity, and pedagogical soundness" before adoption.
- Using AI tools that "teach, not tell." Winthrop suggested, for instance, using AI to interface with a difficult digital text: "I've read this paragraph twice," she said. "I don't get it. Can you explain it to me in a different way?" Used in such a fashion, with vetted content, she said, "it can be really effective."
- Offering AI literacy that helps students, educators, and families understand its capabilities, limitations and broader implications. That includes robust professional development that equips teachers with deep knowledge to teach students about AI.
Winthrop highlighted the National Academy for AI Instruction, created last fall by the American Federation of Teachers. AFT President Randi Weingarten has said that over the next five years it will train 400,000 educators, or one in 10 U.S. teachers, in effective AI usage.
"When I talked to Randi Weingarten about why she did it, she said, 'We have to be at the table this time,'" said Winthrop. "'We were not at the table during social media.'"
Winthrop said Weingarten "got a lot of flak" for creating the academy, but added, "I think it's the right decision."