March 09, 2026 | JoAnne Deehr
Suicide remains a persistent public health challenge, affecting people of all ages, racial and ethnic groups, geographic regions, and income levels in the United States. Despite ongoing prevention efforts, more than 49,300 Americans died by suicide in 2023. National suicide rates steadily rose from 2003 until 2018 and have remained high since then, reflecting an enduring and widespread impact.
While all communities are affected by suicide, certain demographics face higher risks. Suicide rates are disproportionately high among elderly Americans, Veterans, individuals with lower incomes or less education, and those living in rural areas. Workers in certain industries, such as mining, construction, and public safety, are also at elevated risk. At the same time, emerging technologies like chatbots powered by artificial intelligence (AI) have raised new considerations related to safety, oversight, and appropriate use in mental health settings, underscoring the need for thoughtful state approaches to suicide prevention.
Policymakers are responding to these challenges in multiple ways, including establishing state suicide prevention infrastructure and regulating AI chatbot use in mental health.
Suicide Prevention Infrastructure Legislation
Suicide prevention efforts are most effective when states and territories have dedicated infrastructure — such as suicide prevention offices, coordinators, commissions, and fatality review processes — to support coordination, surveillance, and implementation of evidence-based strategies. These structures enable state and territorial health agencies to identify populations and communities at increased risk; align partners across public health, health care, and public safety; and pursue sustainable funding for suicide prevention and crisis system improvements.
ASTHO’s Suicide Prevention Offices and Committees Legal Map highlights the varied policy approaches states have taken to establish this infrastructure and identifies which states had statutory suicide prevention structures in place as of January 1, 2025.
During the 2025 legislative session, states considered at least 30 bills related to establishing suicide prevention offices, coordinators, advisory bodies, and suicide fatality reviews. Five of these bills were enacted, including Delaware’s HB 54, which establishes the state’s Office of Suicide Prevention. Delaware also enacted HB 87, which expands membership in the state’s Suicide Prevention Coalition to include someone who has experienced suicidal ideation or survived a suicide attempt and someone who has lost a loved one to suicide.
Conversely, Oklahoma enacted SB 676, repealing the section of the state’s Suicide Prevention Act that established the Oklahoma Suicide Prevention Council, which had been slated to sunset in 2020. The council was originally tasked with identifying issues and promoting strategies to prevent suicide and with providing technical assistance on best practices for identifying people at risk of suicide. The Department of Mental Health and Substance Abuse Services still serves as the lead agency for implementing the remaining tasks outlined in the Act.
Illinois and Texas enacted legislation establishing advisory bodies focused on suicide prevention among first responders. In Texas, HB 1593 creates a committee to study suicide prevention and peer support programs within fire departments and requires a report with recommendations by September 2026. In Illinois, HB 2551 reconstitutes the First Responders Suicide Prevention Task Force and expands its membership to include a member from an organization that provides mental health training and support to first responders, as well as two members representing organizations that advocate on behalf of public safety telecommunicators, such as 911 operators and dispatchers. The bill also charges the task force with developing a final report by December 2026. Both bodies are scheduled to sunset in January 2027.
Currently, Wisconsin has several types of fatality review teams that operate on a voluntary basis, with no law formally establishing or governing them. The state is considering SB 192, which would formally establish processes for reviewing fatalities, including deaths by suicide. It would also direct the Department of Health Services to establish a fatality review program composed of established local teams and authorize the department to establish state fatality review teams.
AI Chatbots
While states continue to strengthen suicide prevention infrastructure, policymakers are beginning to turn their attention to emerging mental health considerations related to AI. Since emerging in the 1950s, AI has evolved from rule-based systems to today’s machine learning and natural language processing applications, powering everything from data analysis to interactive chatbots. Recent AI advances enable chatbots to simulate human conversation so convincingly that users may forget they are interacting with a machine. However, these systems lack genuine empathy and cannot substitute for professional mental health treatment. Their tendency to be excessively agreeable creates particular dangers for people experiencing suicidal ideation, leading some states to explore regulations governing AI chatbot use in mental health and suicide prevention contexts.
At least 19 states considered legislation regulating the use of AI in mental health contexts to promote user safety. At least five bills were enacted, including California SB 243, which requires chatbot platform operators to disclose that users are interacting with AI if confusion could occur, develop protocols to prevent and respond to suicidal ideation or self-harm, and report annually on safety measures to the state Office of Suicide Prevention. The California legislature also passed AB 1064, which the Governor subsequently vetoed due to concerns that its broad restrictions on AI companion chatbots for minors could limit access to potentially beneficial tools.
Illinois and Nevada passed legislation that largely prohibits AI from providing behavioral health services. Illinois HB 1806 restricts the use of AI for therapy or psychotherapy unless it is delivered by a licensed professional who informs the patient, or the patient’s legal representative, in writing and obtains consent. The law also prohibits licensed professionals from allowing AI to make independent therapeutic decisions or interact directly with clients and permits the use of AI only for administrative or supplemental tasks under professional oversight. Nevada AB 406 similarly prohibits AI systems from providing or representing themselves as offering professional mental or behavioral health care, prohibits AI from performing the functions of a school counselor, psychologist, or social worker in public schools, and allows licensed professionals to use AI only for administrative or supportive purposes, with oversight to ensure accuracy and safety.
New York and Utah passed laws requiring mental health chatbots to clearly disclose that they are not human. Enacted as part of the state’s annual budget, New York S 3008 mandates that AI companion systems capable of simulating human-like interactions detect suicidal ideation or self-harm, provide crisis referrals, and regularly disclose that users are interacting with AI rather than a person. Utah HB 452 requires AI-driven mental health chatbots to provide clear disclosures and limits their advertising and data practices.
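To illustrate how these requirements might translate into practice, the sketch below shows, in highly simplified form, the kinds of safeguards laws such as California SB 243 and New York S 3008 contemplate: a recurring AI disclosure, a basic self-harm screening step with a referral to the 988 Suicide & Crisis Lifeline, and a log that could feed into safety reporting. The class and function names are hypothetical, and the keyword check stands in for the validated risk classifiers and clinical protocols a real chatbot platform would need.

```python
# Hypothetical sketch of chatbot safety guardrails of the kind contemplated by state laws
# such as California SB 243 and New York S 3008. Names and logic are illustrative only;
# production systems would rely on validated risk classifiers, not simple keyword matching.

from dataclasses import dataclass, field

AI_DISCLOSURE = "Reminder: you are chatting with an AI assistant, not a human."
CRISIS_REFERRAL = (
    "If you are thinking about suicide or self-harm, please call or text 988 "
    "(the Suicide & Crisis Lifeline) to reach a trained counselor."
)

# Simplistic placeholder for a trained suicidal-ideation/self-harm detection model.
SELF_HARM_TERMS = ("suicide", "kill myself", "self-harm", "end my life")


def flags_self_harm(message: str) -> bool:
    """Very rough stand-in for clinical-grade risk detection."""
    text = message.lower()
    return any(term in text for term in SELF_HARM_TERMS)


@dataclass
class CompanionChatbot:
    disclosure_interval: int = 5   # remind users every N turns that they are talking to AI
    turn_count: int = 0
    safety_log: list = field(default_factory=list)  # could support periodic safety reporting

    def respond(self, user_message: str, model_reply: str) -> str:
        """Wrap a model reply with disclosure and crisis-referral safeguards."""
        self.turn_count += 1
        parts = []

        # Periodic AI disclosure (at the start and at regular intervals thereafter).
        if self.turn_count == 1 or self.turn_count % self.disclosure_interval == 0:
            parts.append(AI_DISCLOSURE)

        # Detect possible suicidal ideation or self-harm and refer to crisis services.
        if flags_self_harm(user_message):
            self.safety_log.append({"turn": self.turn_count, "event": "crisis_referral"})
            parts.append(CRISIS_REFERRAL)

        parts.append(model_reply)
        return "\n\n".join(parts)
```

In practice, operators would pair guardrails like these with escalation protocols and the annual reporting California requires; the sketch simply shows how disclosure and referral obligations can map onto a chatbot’s message-handling loop.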
At the federal level, on December 11, 2025, the White House issued an executive order seeking to establish a national policy framework for artificial intelligence and create a “minimally burdensome” federal approach. The order also directs the Department of Justice to form an AI Litigation Task Force to identify and challenge state AI laws deemed in conflict with this federal policy, and directs the Department of Commerce to limit states’ eligibility for certain federal funds if they take a non-preferred approach. The scope and criteria of these federal actions, including their impact on state laws aimed at suicide prevention, have not been clearly defined.
Advancing suicide prevention will require states and territories to take comprehensive approaches that address both systemic gaps within state infrastructure and emerging technologies. ASTHO will continue to monitor these policy developments and provide relevant updates.
Reviewed by Alison Maffey, Vice President, Social and Behavioral Health; and Andy Baker-White, JD, MPH, Senior Director, State Health Policy.