A high school science project that seeks to help prevent suicide
If you or someone you know may be considering suicide, contact the 988 Suicide and Crisis Lifeline by calling or texting 9-8-8, or the Crisis Text Line by texting HOME to 741741.
Text messages, Instagram posts and TikTok profiles. Parents often warn their children against sharing too much information online, worried about how all that data is used. But one Texas high schooler wants to use that digital footprint to save lives.
Siddhu Pachipala is a senior at Woodlands College Park High School, in a suburb outside Houston. He has been thinking about psychology since seventh grade, when he read Thinking, Fast and Slow by psychologist Daniel Kahneman.
Concerned about youth suicide, Pachipala saw a role for artificial intelligence in detecting risk before it's too late. In his view, it takes too long for kids to get help when they're suffering.
Early warning signs of suicide, such as persistent feelings of hopelessness, mood swings and changes in sleep patterns, are often missed by loved ones. “That’s why they’re hard for people to see,” says Pachipala.
For a local science fair, he designed an app that uses AI to scan text for signs of suicide risk. He thinks it may, one day, help replace outdated methods of diagnosis.
“Our writing patterns can reflect what we’re thinking, but the idea hasn’t really been extended this far,” he said.
The app won him national recognition, a trip to D.C. and a speech on behalf of his peers. It is one of many efforts underway to use AI to help young people with their mental health and to better identify when they are at risk.
Experts point out that this type of AI, called natural language processing, has been around since the mid-1990s. And it is not a panacea. “Machine learning is helping us get better. As we get more and more data, we can improve the system,” says Matt Nock, a professor of psychology at Harvard University who studies self-harm in young people. “But chatbots won’t be the silver bullet.”
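To make “natural language processing” concrete, here is a minimal sketch of how a text classifier of this kind might be wired together, using TF-IDF features and logistic regression in scikit-learn. The journal entries and labels below are invented for illustration; this is not Pachipala’s or Nock’s actual system.

```python
# Illustrative only: a tiny text classifier for flagging concerning language.
# The training data and labels are invented; a real system would need
# clinically validated data, calibration and human review of every flag.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical journal entries, labeled 1 (concerning) or 0 (not).
texts = [
    "I feel hopeless and I can't sleep anymore",
    "Great day hiking with friends",
    "Nothing matters and nobody would notice if I were gone",
    "Excited about the science fair next week",
]
labels = [1, 0, 1, 0]

# TF-IDF turns text into word- and phrase-frequency features; logistic
# regression learns which of them push the risk score up or down.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score a new entry. A deployed tool would route high scores to a human,
# never act on the number alone.
prob = model.predict_proba(["I've been feeling hopeless lately"])[0][1]
print(f"estimated risk score: {prob:.2f}")
```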
Colorado-based psychologist Nathaan Demers, who oversees mental health websites and apps, says personalized tools like Pachipala’s can help fill a gap. “When you walk into CVS, there’s that blood pressure cuff,” Demers said. “And maybe that’s the first time someone realizes, ‘Oh, I have high blood pressure. I had no idea.’”
He hasn’t seen Pachipala’s app but theorizes that innovations like his raise awareness of underlying mental health issues that might otherwise go unrecognized.
Building SuiSensor
Pachipala set out to design an app that anyone could download to self-assess their suicide risk. Users could then use their results to advocate for their care needs and connect with providers. After many late nights spent coding, he had SuiSensor.

[Photo: Siddhu Pachipala. Chris Ayers Photography/Society for Science]
Using sample data from a medical study based on journal entries from adults, Pachipala said, SuiSensor predicted suicide risk with 98% accuracy. Although it was only a prototype, the app could also generate a contact list of local clinics.
In the fall of his senior year of high school, Pachipala entered his research in the Regeneron Science Talent Search, an 81-year-old national science and math competition.
There, panels of judges quizzed him on his knowledge of psychology and general science with questions like: “Explain how to boil pasta. … OK, now let’s say we took that into space. What happens now?” Pachipala recalled. “You came out of those panels and you were battered and bruised, but, like, better for it.”
He placed ninth overall in the competition and took home a $50,000 prize.
The judges found that “his work suggests that the semantics in an individual’s writing can be correlated with their psychological health and risk of suicide.” While the app is currently unavailable for download, Pachipala hopes that, as a student at MIT, he will be able to continue working on it.
“I think we don’t do that enough: we try to address [suicide intervention] from an innovation perspective,” he said. “I think we’ve been stuck with the status quo for a long time.”
Current mental health applications of AI
How does his invention fit into broader efforts to use AI in mental health? Experts note that many such efforts are underway, and Matt Nock, for one, expressed concern about false alarms. He applies machine learning to electronic health records to identify people who are at risk for suicide.
“The majority of our predictions are false positives,” he said. “Is there a cost there? Does it hurt to tell someone they’re at risk of suicide when they really aren’t?”
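Nock’s point follows from base rates: when the outcome being predicted is rare, even a model with high sensitivity and specificity will flag mostly non-cases. A back-of-the-envelope sketch, with all numbers invented for illustration:

```python
# Why rare-outcome screening yields mostly false positives (invented numbers).
prevalence = 0.005    # assume 0.5% of screened records are true cases
sensitivity = 0.90    # model catches 90% of true cases
specificity = 0.95    # model correctly clears 95% of non-cases

n = 100_000
true_pos = prevalence * n * sensitivity                # 450 correct flags
false_pos = (1 - prevalence) * n * (1 - specificity)   # 4,975 false alarms

precision = true_pos / (true_pos + false_pos)
print(f"share of flags that are real: {precision:.1%}")  # about 8%
```

Under these assumed numbers, more than 90% of the people flagged would not actually be at risk, which is exactly the cost Nock is weighing.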
And data privacy expert Elizabeth Laird has concerns about the implementation of such approaches in schools in particular, due to the lack of research. She directs the Equity in Civic Technology Project at the Center for Democracy and Technology (CDT).
Acknowledging that “we have a mental health crisis and we should be doing everything we can to prevent students from harming themselves,” she remains skeptical, given the lack of “independent evidence that these tools do that.”
All this attention on AI comes as youth suicide rates (and risk) are on the rise. Although there are data lags, the Centers for Disease Control and Prevention (CDC) reports that suicide is the second leading cause of death for youth and young adults ages 10 to 24 in the United States.
Efforts like Pachipala’s join a wide range of AI-powered tools for tracking youth mental health, accessible to both clinicians and laypeople. Some schools use activity monitoring software that scans devices for warning signs of a student harming themselves or others. One concern, however, is that once these red flags are raised, the information can be used to discipline students rather than support them, “and that that discipline falls along racial lines,” Laird said.

According to a survey Laird shared, 70% of teachers whose schools use data-tracking software said it was used to discipline students. Schools can remain within the limits of student-record privacy laws but still fail to implement safeguards that protect students from unintended consequences, Laird said.
“The conversation about privacy has changed from just one of legal compliance to what is actually ethical and right,” she said. She points to survey data that shows almost 1 in 3 LGBTQ+ students report being outed, or know someone who has been outed, as a consequence of activity monitoring software.
Matt Nock, the Harvard researcher, recognizes the place of AI in crunching numbers. He uses machine learning technology similar to Pachipala’s to analyze medical records. But he emphasizes that much more experimentation is needed to verify the computational assessments.
“A lot of this work is really well-intentioned, trying to use machine learning, artificial intelligence to improve people’s mental health… but unless we do the research, we won’t know whether this is the right solution,” he said.
More students and families are turning to schools for mental health support. Software that scans young people’s words, and by extension their thoughts, is one approach to taking the pulse of youth mental health. But it cannot replace human interaction, Nock said.
“Technology will help us, we hope, become better at knowing who is at risk and knowing when,” he said. “But people want to see humans; they want to talk to humans.”