Students Need Human Relationships to Thrive. Why Bots May Stand in the Way

Freeland Fisher: The possibility that kids may use AI counselors to avoid engaging with people is very real. Five ways to support human connections.


In August, OpenAI released its latest system card for ChatGPT 4.0. It's a highly technical and fairly bleak read, detailing risks and safety concerns that generative artificial intelligence could create or amplify. At the very bottom of the document, OpenAI enumerates "societal impacts" it intends to study further. First on its list? Anthropomorphization and emotional reliance.

That's a fancy way of saying bots are increasingly capable of sounding human, putting humans increasingly at risk of bonding with them. OpenAI admits its own catch-22: These improvements create "both a compelling product experience and the potential for overreliance and dependence." In short, as the tech gets better, the social risks get worse.

Ed tech tools are not immune to this challenge. As AI floods the market, technology companies and district leaders alike must start asking the hard question: If bots are increasingly built to emulate human relationships, engineered to sound human, are they also being designed to help connect students to actual humans?

If not, AI tools risk displacing students' human connections. That poses long-term risks to students' well-being, their ability to maintain human relationships and their access to networks that open doors to opportunities.

In a new report, Anna Arsenault and I set out to analyze whether and how that question is being addressed in AI-enabled college and career guidance.

This is a domain where chatbots are especially likely to take hold. On average, high schools have one guidance counselor for every . Research suggests a mere of high school counselors' time is spent on career advising. Such scarce human resources create gaps where chatbots can help, offering students on-demand personalized advice about applying to college, graduating and launching careers.

Our report features insights from founders, CEOs and chief technology officers at over 30 technology companies that build and implement chatbots to support students as they apply to college and continue through to their careers.

Based on these interviews, OpenAI's warnings about anthropomorphization (attributing human characteristics to non-human things) ring true. For example, most college and career bots have names and are designed to mimic cheerful, upbeat personalities. Many go beyond informational support to offer students emotional and motivational assistance when counselors can't.

Are students over-relying on these bots? It鈥檚 too early to tell. But while most of the leaders we interviewed envision a system of hybrid advising that gives students access to both bots and human coaches, the majority admitted that some students gravitate toward bots in hopes of avoiding human interaction altogether.

In short, the possibility that students may start to bond with and rely on bots, rather than humans, is very real.

Luckily, a number of the leaders are taking steps to build bots that foster relationships, rather than just mimicking them. Here are five examples of efforts to ensure that authentic human connection is an outcome, rather than a casualty, of AI products:

Promoting frequent social interaction offline: Axio, an AI companion that supports students' personal growth, spun out of a student-led effort at Arizona State University. The bot has been trained to learn about the relationships in students' lives. If students tell the bot they are struggling or bored, it will suggest reaching out to specific friends or family members. Axio has also worked to curb overreliance by limiting students' time on the app.

Involving students' families and friends: Uprooted Academy is a nonprofit that operates a virtual community center where students can interact with AI-powered coaches that help them apply to college. To ensure that those digital relationships don't replace real ones, Uprooted Academy asks students to identify up to five supportive individuals in their lives when they enroll. The tool automatically updates those five people by text message every two weeks with recommendations for supporting students' college progress.

Prompting conversations, even the hard ones: CollegeVine's AI counselor and tutor, Sage, coaches high school students throughout the college application process. Through conversations with students, the bot keeps track of how they describe their interactions with advisers and teachers. When it comes time to ask for recommendation letters, Sage can coach students on whom to ask and how to address any challenges they might have faced in interacting with those adults.

Matching students and mentors: One platform recruits online volunteer mentors to coach high school students on projects related to their academic and extracurricular interests. The company recently released an AI success coach, Lubav, which helps students find the right mentors on the platform and craft messages to them.

Practicing networking through online role-playing: One chatbot is designed to help students practice asking for help. The platform includes a series of career development activities in which students rehearse interviews with the bot and draft networking and job-hunting emails, social media messages and letters asking for references.

These examples highlight AI’s potential to strengthen human connections. However, the incentives to build relationship-centered AI tools are weak. Few schools are asking for these social features or evaluating tools for their social impacts.

If things remain as they are, the more convincingly anthropomorphic bots simulate relationships across guidance, tutoring and student support, the more they could foster student isolation.

But that outcome is not inevitable. Research underscores the importance of relationships to students' well-being and access to opportunity. To live up to their mission, schools should prioritize human connection, ensuring that AI tools work for, not against, expanding students' networks. Information technology coordinators and purchasers, superintendents, principals and educators who are involved in procuring new technologies should demand evidence that AI enhances relationships and implement data systems to track that progress. Entrepreneurs taking steps to safeguard and expand connections should be rewarded for their efforts.

Otherwise, ed tech companies risk the same catch-22 as OpenAI: building artificial intelligence that gets better and better, but to the detriment of the human relationships students need to thrive.

Disclosure: Julia Freeland Fisher serves as an unpaid adviser to Backrs.
