AI vs. Identity

Northwestern researcher Elizabeth Gerber explores how AI is shifting the way young people develop self-concept.

February 16, 2026

Elizabeth Gerber wants to help young people pursue fulfilling lives and create meaningful change in the world.

In fact, that ambitious aim inspired Gerber, professor of mechanical engineering and communication studies at Northwestern University, to launch Design for America (DFA) in 2009. The extracurricular program, which now boasts dozens of collegiate chapters around the globe, challenges interdisciplinary teams of students to use design thinking to address pressing social problems, from climate change and food insecurity to medical access and early childhood education.        

DFA represented a marriage of one of Gerber’s chief personal interests – inspiring agency – with her longstanding research pursuit – how individuals leverage technology to complete work.

Now, the longtime Northwestern researcher is bringing those two worlds together in her scholarly work, exploring how adolescents are using artificial intelligence (AI) to develop their self-concept – essentially, how they view themselves, their capabilities, and their possibilities.

The perils of AI shaping self-concept

Historically, Gerber says, individuals between the ages of 15 and 25 – the period in which self-concept often takes hold – developed it through interactions with family and friends, mentors, teachers, coaches, and other influential sources, including the cultures and traditions around them.

“You start to imagine who you might be, what you might be, and reflect on who you are,” Gerber says. “It’s a process.”

But increasingly and unexpectedly, Gerber notes, young people are turning to AI for self-concept guidance. The infusion of technology into a process traditionally informed by personal relationships and environments is shaping how youth view their futures, their potential careers, and how they present themselves to the world. The trouble is … well, AI isn’t necessarily an objective ally.

In a recent study, for instance, Gerber’s team, which includes fellow Northwestern faculty member Duri Long and PhD student Aidan Fitzsimons, created scenarios around college essays – the first high-stakes instance in which many teens must articulate who they are and what they envision for their lives. Stressed out, many turn to AI for answers, whether for ease, under competitive pressure, or out of a faulty belief that the technology will deliver higher quality. AI-generated results, however, can be bland and, worse, can perpetuate shaky narratives.

When Gerber’s team prompted AI to help a gender minority – a trans student, for example – craft a college essay, the technology often churned out a trauma-centered narrative focused on overcoming societal biases. When the researchers requested the same help for a heteronormative young woman, however, talk of overcoming challenges or prejudices disappeared. For Gerber, the results demonstrated AI’s tendency to reinforce biases about self-identity.

“Our big concern is that AI is really disrupting self-concept development, which is going to be problematic for kids’ mental health,” says Gerber, who serves as co-director of Northwestern’s Center for Human Computer Interaction + Design.

Gerber offers the analogy of a medical school student pushed into medicine by parents and told to set aside passions for music, politics, or entrepreneurship. Somebody, Gerber says, has told that student what they should be, and potential alternative pursuits evaporate. AI could create a similar scenario and nudge students away from a fulfilling life, Gerber warns.

“The fear is that young people will get to a point where who they think they should be is discordant with who they’ve been told they should be,” Gerber says.

Developing an empowering solution

To be fair, Gerber says there remains plenty more to examine about AI’s role in self-concept development, something she is excited to pursue as an affiliate faculty member with Northwestern’s newly launched Institute for Adolescent Mental Health and Well-Being.

Her worry, however, remains real.

As adolescents increasingly turn to AI for creative tasks, Gerber worries the technology could hamper self-concept development, which in turn shapes agency, sense of self, and the kind of efficacy one might have in the world. AI could spur dissatisfaction in careers, relationships, and life overall, diminishing happiness and well-being.

Gerber, in fact, sees parallels between the current adoption of AI among adolescents and the early years of social media. At first, the novelty and glitz of social media platforms like Facebook, Pinterest, and Twitter largely silenced any potential alarms. Years later, however, there’s ever-swelling concern about social media’s profound impact on adolescent mental health.

“We have to be visionary, and we have to teach people, especially young people, how AI can be a positive tool,” says Gerber, who is eager to swap a dystopian view of AI for tech-charged solutions that empower and support.

To that end, Gerber and her team are in the early stages of building an AI-powered writing coach. More thought partner than answer generator, the system aims to coach students on their college essays – or, even better, on other creative projects youth might tackle to explore self-concept and envision healthier, livelier futures.

“People are going to continue using AI – we can’t turn off that open hose,” Gerber says. “But we can create a tool that allows adolescents to develop self-concept through writing and help them see a better future or a way of making change in the world.”