If you or a loved one suffered injuries, property damage, or other financial losses due to another party’s actions, you may be entitled to compensation for those losses.
Contact the experienced Chicago personal injury lawyers from TorHoerman Law for a free, no-obligation Chicago personal injury case consultation today.
If you or a loved one suffered a personal injury or financial loss due to a car accident in Chicago, IL – you may be entitled to compensation for those damages.
Contact an experienced Chicago auto accident lawyer from TorHoerman Law today to see how our firm can serve you!
If you or a loved one has suffered injuries, property damage, or other financial losses due to a truck accident in Chicago, IL – you may qualify to take legal action to gain compensation for those injuries and losses.
Contact TorHoerman Law today for a free, no-obligation consultation with our Chicago truck accident lawyers!
If you or a loved one suffered an injury in a motorcycle accident in Chicago or the greater Chicagoland area – you may be eligible to file a Chicago motorcycle accident lawsuit.
Contact an experienced Chicago motorcycle accident lawyer at TorHoerman Law today to find out how we can help.
If you have been involved in a bicycle accident in Chicago through no fault of your own and you suffered injuries as a result, you may qualify to file a Chicago bike accident lawsuit.
Contact a Chicago bike accident lawyer from TorHoerman Law to discuss your legal options today!
Chicago is one of the nation’s largest construction centers.
Thousands of men and women work on sites across the city and metropolitan area on tasks ranging from skilled trades to administrative operations.
Unfortunately, construction site accidents are fairly common.
Contact TorHoerman Law to discuss your legal options with an experienced Chicago construction accident lawyer, free of charge and no obligation required.
Nursing homes and nursing facilities should provide a safe, supportive environment for senior citizens, with qualified staff, nurses, and aides administering quality care.
Unfortunately, nursing home abuse and neglect can occur, leaving residents at risk and vulnerable.
Contact an experienced Chicago nursing home abuse lawyer from TorHoerman Law today for a free consultation to discuss your legal options.
If you are a resident of Chicago, or the greater Chicagoland area, and you have a loved one who suffered a fatal injury due to another party’s negligence or malpractice – you may qualify to file a wrongful death lawsuit on your loved one’s behalf.
Contact a Chicago wrongful death lawyer from TorHoerman Law to discuss your legal options today!
If you have suffered a slip and fall injury in Chicago, you may be eligible for compensation through legal action.
Contact a Chicago slip and fall lawyer at TorHoerman Law today!
TorHoerman Law offers free, no-obligation case consultations for all potential clients.
When a child is injured at a daycare center, parents are left wondering who can be held liable, who to contact for legal help, and how a lawsuit may pan out for them.
If your child has suffered an injury at a daycare facility, you may be eligible to file a daycare injury lawsuit.
Contact a Chicago daycare injury lawyer from TorHoerman Law today for a free consultation to discuss your case and potential legal action!
If you or a loved one suffered injuries, property damage, or other financial losses due to another party’s actions, you may be entitled to compensation for those losses.
Contact the experienced Edwardsville personal injury lawyers from TorHoerman Law for a free, no-obligation Edwardsville personal injury case consultation today.
If you or a loved one suffered a personal injury or financial loss due to a car accident in Edwardsville, IL – you may be entitled to compensation for those damages.
Contact an experienced Edwardsville car accident lawyer from TorHoerman Law today to see how our firm can serve you!
If you or a loved one has suffered injuries, property damage, or other financial losses due to a truck accident in Edwardsville, IL – you may qualify to take legal action to gain compensation for those injuries and losses.
Contact TorHoerman Law today for a free, no-obligation consultation with our Edwardsville truck accident lawyers!
If you or a loved one suffered an injury in a motorcycle accident in Edwardsville – you may be eligible to file an Edwardsville motorcycle accident lawsuit.
Contact an experienced Edwardsville motorcycle accident lawyer at TorHoerman Law today to find out how we can help.
If you have been involved in a bicycle accident in Edwardsville through no fault of your own and you suffered injuries as a result, you may qualify to file an Edwardsville bike accident lawsuit.
Contact an Edwardsville bicycle accident lawyer from TorHoerman Law to discuss your legal options today!
Nursing homes and nursing facilities should provide a safe, supportive environment for senior citizens, with qualified staff, nurses, and aides administering quality care.
Unfortunately, nursing home abuse and neglect can occur, leaving residents at risk and vulnerable.
Contact an experienced Edwardsville nursing home abuse attorney from TorHoerman Law today for a free consultation to discuss your legal options.
If you are a resident of Edwardsville and you have a loved one who suffered a fatal injury due to another party’s negligence or malpractice – you may qualify to file a wrongful death lawsuit on your loved one’s behalf.
Contact an Edwardsville wrongful death lawyer from TorHoerman Law to discuss your legal options today!
If you have suffered a slip and fall injury in Edwardsville, you may be eligible for compensation through legal action.
Contact an Edwardsville slip and fall lawyer at TorHoerman Law today!
TorHoerman Law offers free, no-obligation case consultations for all potential clients.
When a child is injured at a daycare center, parents are left wondering who can be held liable, who to contact for legal help, and how a lawsuit may pan out for them.
If your child has suffered an injury at a daycare facility, you may be eligible to file a daycare injury lawsuit.
Contact an Edwardsville daycare injury lawyer from TorHoerman Law today for a free consultation to discuss your case and potential legal action!
If you or a loved one suffered injuries on someone else’s property in Edwardsville, IL, you may be entitled to financial compensation.
If a property owner fails to keep the premises safe, and that negligence leads to injuries, property damage, or other losses as a result of an accident or incident, a premises liability lawsuit may be possible.
Contact an Edwardsville premises liability lawyer from TorHoerman Law today for a free, no-obligation case consultation.
If you or a loved one suffered injuries, property damage, or other financial losses due to another party’s actions, you may be entitled to compensation for those losses.
Contact the experienced St. Louis personal injury lawyers from TorHoerman Law for a free, no-obligation St. Louis personal injury case consultation today.
If you or a loved one suffered a personal injury or financial loss due to a car accident in St. Louis, MO – you may be entitled to compensation for those damages.
Contact an experienced St. Louis car accident lawyer from TorHoerman Law today to see how our firm can serve you!
If you or a loved one has suffered injuries, property damage, or other financial losses due to a truck accident in St. Louis, MO – you may qualify to take legal action to gain compensation for those injuries and losses.
Contact TorHoerman Law today for a free, no-obligation consultation with our St. Louis truck accident lawyers!
If you or a loved one suffered an injury in a motorcycle accident in St. Louis or the greater St. Louis area – you may be eligible to file a St. Louis motorcycle accident lawsuit.
Contact an experienced St. Louis motorcycle accident lawyer at TorHoerman Law today to find out how we can help.
If you have been involved in a bicycle accident in St. Louis through no fault of your own and you suffered injuries as a result, you may qualify to file a St. Louis bike accident lawsuit.
Contact a St. Louis bicycle accident lawyer from TorHoerman Law to discuss your legal options today!
St. Louis is one of the nation’s largest construction centers.
Thousands of men and women work on sites across the city and metropolitan area on tasks ranging from skilled trades to administrative operations.
Unfortunately, construction site accidents are fairly common.
Contact TorHoerman Law to discuss your legal options with an experienced St. Louis construction accident lawyer, free of charge and no obligation required.
Nursing homes and nursing facilities should provide a safe, supportive environment for senior citizens, with qualified staff, nurses, and aides administering quality care.
Unfortunately, nursing home abuse and neglect can occur, leaving residents at risk and vulnerable.
Contact an experienced St. Louis nursing home abuse attorney from TorHoerman Law today for a free consultation to discuss your legal options.
If you are a resident of St. Louis, or the greater St. Louis area, and you have a loved one who suffered a fatal injury due to another party’s negligence or malpractice – you may qualify to file a wrongful death lawsuit on your loved one’s behalf.
Contact a St. Louis wrongful death lawyer from TorHoerman Law to discuss your legal options today!
If you have suffered a slip and fall injury in St. Louis, you may be eligible for compensation through legal action.
Contact a St. Louis slip and fall lawyer at TorHoerman Law today!
TorHoerman Law offers free, no-obligation case consultations for all potential clients.
When a child is injured at a daycare center, parents are left wondering who can be held liable, who to contact for legal help, and how a lawsuit may pan out for them.
If your child has suffered an injury at a daycare facility, you may be eligible to file a daycare injury lawsuit.
Contact a St. Louis daycare injury lawyer from TorHoerman Law today for a free consultation to discuss your case and potential legal action!
Depo-Provera, a contraceptive injection, has been linked to an increased risk of developing brain tumors (including glioblastoma and meningioma).
Women who have used Depo-Provera and subsequently been diagnosed with brain tumors are filing lawsuits against Pfizer (the manufacturer), alleging that the company failed to adequately warn about the risks associated with the drug.
Despite the claims, Pfizer maintains that Depo-Provera is safe and effective, citing FDA approval and arguing that the scientific evidence does not support a causal link between the drug and brain tumors.
You may be eligible to file a Depo-Provera Lawsuit if you used Depo-Provera and were diagnosed with a brain tumor.
Suboxone, a medication often used to treat opioid use disorder (OUD), has become a vital tool, offering a safer and more controlled approach to managing opioid addiction.
Despite its widespread use, Suboxone has been linked to severe tooth decay and dental injuries.
Suboxone Tooth Decay Lawsuits claim that the companies failed to warn about the risks of tooth decay and other dental injuries associated with Suboxone sublingual films.
Tepezza, approved by the FDA in 2020, is used to treat Thyroid Eye Disease (TED), but some patients have reported hearing issues after its use.
The Tepezza lawsuit claims that Horizon Therapeutics failed to warn patients about the potential risks and side effects of the drug, leading to hearing loss and other problems, such as tinnitus.
You may be eligible to file a Tepezza Lawsuit if you or a loved one took Tepezza and subsequently suffered permanent hearing loss or tinnitus.
Elmiron, a drug prescribed for interstitial cystitis, has been linked to serious eye damage and vision problems in scientific studies.
Thousands of Elmiron Lawsuits have been filed against Janssen Pharmaceuticals, the manufacturer, alleging that the company failed to warn patients about the potential risks.
You may be eligible to file an Elmiron Lawsuit if you or a loved one took Elmiron and subsequently suffered vision loss, blindness, or any other eye injury linked to the prescription drug.
The chemotherapy drug Taxotere, commonly used for breast cancer treatment, has been linked to severe eye injuries, permanent vision loss, and permanent hair loss.
Taxotere Lawsuits are being filed by breast cancer patients and others who have taken the chemotherapy drug and subsequently developed vision problems.
If you or a loved one used Taxotere and subsequently developed vision damage or other related medical problems, you may be eligible to file a Taxotere Lawsuit and seek financial compensation.
Parents and guardians are filing lawsuits against major video game companies (including Epic Games, Activision Blizzard, and Microsoft), alleging that they intentionally designed their games to be addictive — leading to severe mental and physical health issues in minors.
The lawsuits claim that these companies used psychological tactics and manipulative game designs to keep players engaged for extended periods — causing problems such as anxiety, depression, and social withdrawal.
You may be eligible to file a Video Game Addiction Lawsuit if your child has been diagnosed with gaming addiction or has experienced negative effects from excessive gaming.
Thousands of Uber sexual assault claims have been filed by passengers who suffered violence during rides arranged through the platform.
The ongoing Uber sexual assault litigation spans both federal law and California state court, with a consolidated Uber MDL (multi-district litigation) currently pending in the Northern District of California.
Uber sexual assault survivors across the country are coming forward to hold the company accountable for negligence in hiring, screening, and supervising drivers.
If you or a loved one were sexually assaulted, sexually battered, or faced any other form of sexual misconduct from an Uber driver, you may be eligible to file an Uber Sexual Assault Lawsuit.
Although pressure cookers were designed to be safe and easy to use, a number of these devices have been found to have a defect that can lead to excessive buildup of internal pressure.
The excessive pressure may result in an explosion that puts users at risk of serious injuries such as burns, lacerations, and even electrocution.
If your pressure cooker exploded and caused substantial burn injuries or other serious injuries, you may be eligible to file a Pressure Cooker Lawsuit and secure financial compensation for your injuries and damages.
Several studies have found a correlation between heavy social media use and mental health challenges, especially among younger users.
Social media harm lawsuits claim that social media companies are responsible for causing or worsening mental health problems, eating disorders, mood disorders, and other negative experiences in teens and children.
You may be eligible to file a Social Media Mental Health Lawsuit if you are the parent of a teen, or teens, who attribute their mental health problems to their use of social media platforms.
The Paragard IUD, a non-hormonal birth control device, has been linked to serious complications, including device breakage during removal.
Numerous lawsuits have been filed against Teva Pharmaceuticals, the manufacturer of Paragard, alleging that the company failed to warn about the potential risks.
If you or a loved one used a Paragard IUD and subsequently suffered complications and/or injuries, you may qualify for a Paragard Lawsuit.
According to lawsuits filed against the manufacturers of the Bard PowerPort Device, patients with PowerPort devices may be at a higher risk of serious complications or injury due to catheter failure.
If you or a loved one has been injured by a Bard PowerPort Device, you may be eligible to file a Bard PowerPort Lawsuit and seek financial compensation.
Vaginal Mesh Lawsuits are being filed against manufacturers of transvaginal mesh products for injuries, pain and suffering, and financial costs related to complications and injuries of these medical devices.
Over 100,000 Transvaginal Mesh Lawsuits have been filed on behalf of women injured by vaginal mesh and pelvic mesh products.
If you or a loved one has suffered serious complications or injuries from vaginal mesh, you may be eligible to file a Vaginal Mesh Lawsuit.
Above ground pool accidents have led to lawsuits against manufacturers due to defective restraining belts that pose serious safety risks to children.
These belts, designed to provide structural stability, can inadvertently act as footholds, allowing children to climb into the pool unsupervised, increasing the risk of drownings and injuries.
Parents and guardians are filing lawsuits against pool manufacturers, alleging that the defective design has caused severe injuries and deaths.
If your child was injured or drowned in an above ground pool accident involving a defective restraining belt, you may be eligible to file a lawsuit.
Recent scientific studies have found that the use of chemical hair straightening products, hair relaxers, and other hair products presents an increased risk of uterine cancer, endometrial cancer, breast cancer, and other health problems.
Legal action is being taken against manufacturers and producers of these hair products for their failure to properly warn consumers of potential health risks.
You may be eligible to file a Hair Straightener Cancer Lawsuit if you or a loved one used chemical hair straighteners, hair relaxers, or other similar hair products, and subsequently were diagnosed with:
NEC Lawsuits allege that certain formulas given to infants in NICU settings increase the risk of necrotizing enterocolitis (NEC) – a severe intestinal condition in premature infants.
Parents and guardians are filing NEC Lawsuits against baby formula manufacturers, alleging that the formulas contain harmful ingredients leading to NEC.
Despite the claims, Abbott and Mead Johnson deny the allegations, arguing that their products are thoroughly researched and disputing the scientific evidence linking their formulas to NEC; meanwhile, the FDA issued a warning to Abbott regarding safety concerns about one of its formula products.
You may be eligible to file a Toxic Baby Formula NEC Lawsuit if your child received bovine-based (cow’s milk) baby formula in the maternity ward or NICU of a hospital and was subsequently diagnosed with Necrotizing Enterocolitis (NEC).
Paraquat, a widely used herbicide, has been linked to Parkinson’s disease, leading to numerous Paraquat Parkinson’s Disease Lawsuits against its manufacturers for failing to warn about the risks of chronic exposure.
Due to its toxicity, the EPA has restricted the use of Paraquat, and it is currently banned in over 30 countries.
You may be eligible to file a Paraquat Lawsuit if you or a loved one were exposed to Paraquat and subsequently diagnosed with Parkinson’s Disease or other related health conditions.
Mesothelioma is an aggressive form of cancer primarily caused by exposure to asbestos.
Asbestos trust funds were established in the 1970s to compensate workers harmed by asbestos-containing products.
These funds are designed to pay out claims to those who developed mesothelioma or other asbestos-related diseases due to exposure.
Those exposed to asbestos and diagnosed with mesothelioma may be eligible to file a Mesothelioma Lawsuit.
AFFF (Aqueous Film Forming Foam) is a firefighting foam that has been linked to various health issues, including cancer, due to its PFAS (per- and polyfluoroalkyl substances) content.
Numerous AFFF Lawsuits have been filed against AFFF manufacturers, alleging that they knew about the health risks but failed to warn the public.
AFFF Firefighting Foam lawsuits aim to hold manufacturers accountable for putting people’s health at risk.
You may be eligible to file an AFFF Lawsuit if you or a loved one was exposed to firefighting foam and subsequently developed cancer.
PFAS contamination lawsuits are being filed against manufacturers and suppliers of PFAS chemicals, alleging that these substances have contaminated water sources and products, leading to severe health issues.
Plaintiffs claim that prolonged exposure to PFAS through contaminated drinking water and products has caused cancers, thyroid disease, and other health problems.
The lawsuits target companies like 3M, DuPont, and Chemours, accusing them of knowingly contaminating the environment with PFAS and failing to warn about the risks.
If you or a loved one has been exposed to PFAS-contaminated water or products and has developed health issues, you may be eligible to file a PFAS lawsuit.
The Roundup Lawsuit claims that Monsanto’s popular weed killer, Roundup, causes cancer.
Numerous studies have linked the main ingredient, glyphosate, to non-Hodgkin’s lymphoma, leukemia, and other lymphatic cancers.
Despite this, Monsanto continues to deny these claims.
Victims of Roundup exposure who developed cancer are filing Roundup Lawsuits against Monsanto, seeking compensation for medical expenses, pain, and suffering.
Our firm is about people. That is our motto and that will always be our reality.
We do our best to get to know our clients, understand their situations, and get them the compensation they deserve.
At TorHoerman Law, we believe that if we continue to focus on the people that we represent, and continue to be true to the people that we are – justice will always be served.
Without our team, we wouldn’t be able to provide our clients with anything close to the level of service they receive when they work with us.
The TorHoerman Law Team is committed to the sincere belief that those injured by the misconduct of others, especially large corporate profit mongers, deserve justice for their injuries.
Our team is what has made TorHoerman Law a very special place since 2009.
AI-assisted suicide lawsuit claims focus on allegations that chatbot interactions encouraged, facilitated, or failed to interrupt suicidal thinking in vulnerable users.
As more families report severe mental health deterioration after prolonged use of generative AI platforms, scrutiny is increasing over whether these systems reinforced self-destructive beliefs, deepened psychiatric instability, or continued dangerous conversations when immediate intervention was needed.
TorHoerman Law is investigating potential claims involving AI systems that may have worsened suicidal ideation, contributed to a fatal crisis, or lacked reasonable safeguards for users facing mental health emergencies.
Some reports and lawsuits involving AI-assisted suicide focus on situations in which prolonged chatbot use allegedly coincided with severe psychological deterioration, fixation, emotional dependency, or escalating suicidal thinking.
Public concern has grown as families, clinicians, and researchers question whether emotionally responsive AI systems may do more than confuse or mislead vulnerable users.
In some reported cases, users appear to become detached from real-world relationships, increasingly reliant on chatbot conversations, or more vulnerable to self-destructive beliefs that the system failed to interrupt.
That concern has become more urgent as litigation begins testing whether AI companies can be held responsible when chatbot interactions allegedly contribute to AI suicide, self-harm, or other foreseeable mental health harms.
If you or a loved one experienced severe psychological decline, self-harm, or suicide-related harm after prolonged interactions with an AI system, contact TorHoerman Law for a free consultation.
You can also use the chatbot on this page to see if you may qualify today.
The phrase AI-assisted suicide is being used in public discussion and litigation to describe situations in which artificial intelligence tools allegedly encouraged, facilitated, or failed to interrupt suicidal behavior.
In the strict legal and medical sense, this is not the same as physician-assisted dying.
Here, the concern is that artificial intelligence chatbots or other AI systems may have continued dangerous conversations, reinforced suicidal thinking, or helped a vulnerable person move closer to taking their own life instead of directing them toward help.
Recent lawsuits and a 2025 Senate Judiciary Committee hearing on the harm of AI chatbots have pushed that issue into national view.
Current litigation shows how these allegations are developing.
In one widely reported case, Megan Garcia sued Character.AI and related defendants after her son, Sewell Setzer, died by suicide; the case later settled in January 2026, with public reports stating that the terms were not disclosed.
In another case filed in San Francisco, the parents of Adam Raine sued OpenAI, alleging ChatGPT discussed suicide methods and failed to protect a vulnerable minor.
These are allegations in complaints, not findings of liability, but they illustrate why questions about chatbot design, harm prevention, and legal responsibility are now central to debates over AI suicide and AI-assisted suicide.
The broader policy concern is whether emotionally responsive systems should ever be allowed to engage users expressing suicidal ideation without stronger crisis safeguards.
New York’s companion-AI law now requires covered bots to detect signs of suicidal ideation or self-harm and direct users to crisis service providers, and the Federal Trade Commission has opened an inquiry into companion chatbots’ safety and data practices.
Those legislative efforts do not resolve individual lawsuits, but they show that lawmakers and regulators increasingly view AI suicide risk as a real consumer-protection and public-health issue.
The term AI psychosis is being used to describe reports of distorted thoughts, paranoia, or delusional beliefs apparently triggered or intensified by chatbot conversations.
It is important to be precise: AI psychosis is not an official diagnosis and does not appear in standard diagnostic manuals.
It is a descriptive label for a developing concern about whether emotionally immersive chatbot use can worsen psychiatric instability in certain people.
A recent Nature news feature reported that chatbots can reinforce delusional beliefs, and psychiatry commentary has treated the issue as serious enough to warrant focused study.
Current evidence suggests these risks may arise in people with or without a prior psychiatric history, although vulnerability appears greater in those with existing mental health conditions, especially psychotic disorders, bipolar disorder, or other severe instability.
Reporting and psychiatric commentary also warn that interactions with AI chatbots can exacerbate preexisting symptoms, deepen delusions, and weaken reality testing.
Some clinicians have specifically cautioned that people with schizophrenia-spectrum or bipolar conditions may be especially vulnerable to immersive chatbot interactions.
Another concern is dependency.
Some users, including minors, appear to become intensely attached to chatbot relationships, sometimes pulling away from supportive adults, losing touch with reality, or treating the bot as more trustworthy than other people.
Researchers and clinicians have also raised concerns that long conversations may degrade safety performance, making crisis responses less reliable over time.
That is one reason many professionals argue AI should be used, at most, as a complementary tool in health care, not as a substitute for human judgment in crisis situations or end-of-life decision-making.
The impact of AI chatbots on mental health is mixed.
On one hand, AI tools may expand access to basic support in areas with few specialized clinicians, and some health systems are exploring AI for screening, triage, and administrative support.
The World Health Organization (WHO) has recognized that AI may contribute to health access and self-care, while emphasizing that these tools require strong governance and careful oversight.
On the other hand, experts have repeatedly warned that adolescents are particularly vulnerable.
A 2025 JAMA Network Open commentary on adolescent vulnerability reported serious gaps in how consumer chatbots respond to youth crises, with companion chatbots performing especially poorly.
When a young person is in a mental health crisis, an AI system may lack the capacity for true empathy, situational judgment, and appropriate escalation, which can worsen distress rather than relieve it.
The central problem is that chatbot warmth can feel like human interaction without providing the judgment, accountability, or clinical understanding that a person in crisis may need.
A chatbot may sound supportive while still failing to interrupt self-harm, redirect suicidal thoughts, or reconnect the user with real-world care.
That is why many experts say AI can assist with limited mental health support, but should not replace clinicians, families, or emergency intervention when someone is in danger.
The highest-risk groups appear to be minors, people with existing psychiatric disorders, and users who are already isolated, grieving, or in crisis.
Recent medical commentary and youth-focused research indicate that adolescents may be less able to distinguish simulated empathy from genuine understanding, which can make them more susceptible to influence from AI companions and other emotionally responsive systems.
These risks become more serious when the user is already struggling with suicidal ideation, severe depression, psychosis, or a lack of real-world support.
There is also growing concern about regulatory lag.
New York’s law now requires companion bots to implement crisis protocols and make clear that the user is not interacting with a human, reflecting a policy judgment that some users may mistake AI intimacy for real care.
The Federal Trade Commission inquiry into companion chatbots likewise focuses on how companies measure, test, and monitor harms to children and teens.
These actions show that lawmakers and agencies increasingly view vulnerable users as needing stronger safeguards from AI companies.
At the federal level, the policy picture is still developing.
The Accountability Act and related legislative efforts, along with the recent Senate Judiciary Committee hearing, reflect growing concern that artificial intelligence products may create foreseeable risks for vulnerable users.
Those discussions increasingly focus on harm prevention, transparency, and whether emotionally immersive artificial intelligence chatbots should be allowed to engage people in crisis without stronger protections.
People who may be most vulnerable include:
Lawsuits for AI-assisted suicide generally focus on allegations that chatbot interactions encouraged, facilitated, or failed to interrupt suicidal thinking in vulnerable users.
Parents and families have now filed multiple wrongful death suits against AI companies alleging that their children were drawn into dangerous conversations about suicide, self-harm, or emotional dependency instead of being redirected toward real help.
These cases are often framed as civil actions involving wrongful death, negligence, and strict product liability, with plaintiffs arguing that chatbot design created foreseeable risks that were not adequately controlled.
One of the best-known cases involves Sewell Setzer III.
Public reporting says Sewell Setzer was 14 when he died by suicide in 2024, and his mother, Megan Garcia, later sued Character.AI, Google, and others, alleging the chatbot relationship became emotionally and sexually exploitative and contributed to his death.
Reporting on the case said Sewell spent months in intense conversations with a chatbot before he died, and some accounts described allegations that he was effectively sexually groomed by the interaction.
Google and Character.AI later reached a settlement in January 2026, though public reports say the terms were not disclosed.
Another major case came out of San Francisco, where the parents of 16-year-old Adam Raine sued OpenAI after Adam died by suicide, according to news reports.
According to the complaint, OpenAI’s product allegedly shifted from a homework helper into a “suicide coach,” discussed methods of suicide, and encouraged secrecy from his loved ones.
Those are allegations in the lawsuit, not court findings, but they illustrate the theory behind many of these cases: that an AI system was designed to keep users endlessly engaged, even when the conversation moved into crisis.
These lawsuits also argue that chatbot design choices matter.
Plaintiffs claim the products were defective because they allegedly encouraged emotional reliance, failed to de-escalate dangerous conversations, and did not include reasonable safeguards when minors expressed suicidal or other distorted thoughts.
In product-liability terms, the claim is often that the chatbot was unsafe as designed, while negligence claims focus on whether the company failed to act reasonably in light of known risks.
The legal and ethical questions go beyond any one case.
At the center is whether AI companies should be held responsible when their systems appear to deepen crisis, reinforce suicidal thinking, or create a false sense of trust with vulnerable users.
Recent complaints allege that these products were built to be validating and agreeable, which may make minors and other at-risk users feel unusually understood, attached, or safe sharing sensitive information.
That design can become especially dangerous when the system responds like a confidant rather than interrupting the conversation and directing the user to help.
Regulators have started responding. New York’s companion-AI law now requires covered platforms to detect expressions of suicidal ideation or self-harm, provide crisis referrals, and remind users that they are not communicating with a human.
California’s companion-chatbot law imposes similar disclosure and safety obligations.
These laws reflect a growing view that emotionally immersive AI products need stronger guardrails, especially when minors may bypass or ignore weak parental controls.
At the federal level, oversight is expanding too.
The FTC opened an inquiry into companion chatbots to assess what companies are doing to protect children, limit harms to kids and teens, and inform users and parents about the risks.
Congress has also held hearings featuring families of teens who died after chatbot interactions, which shows that lawmakers now see these products as more than a consumer-tech issue.
The FDA is also evaluating the broader regulatory picture for generative-AI mental-health products.
In November 2025, the FDA’s Digital Health Advisory Committee discussed generative AI-enabled digital mental health medical devices, and the agency said it is working to clarify regulatory pathways while safeguarding patients.
That does not mean all consumer chatbots are FDA-regulated today, but it does show that safety standards for AI in mental-health contexts are becoming a serious policy issue.
Regulation of AI chatbots is starting to take shape because lawmakers and agencies are no longer treating these products as harmless novelty tools.
Recent lawsuits allege that chatbots encouraged minors toward suicide, failed to interrupt crisis conversations, and kept vulnerable users emotionally attached and endlessly engaged instead of reconnecting them with a human source of help.
Those allegations have intensified concerns that chatbot design, especially when it appears validating and emotionally intimate, can create serious risks for minors and other vulnerable people.
Some states have already passed new laws.
New York’s companion-AI law requires covered platforms to detect suicidal ideation or self-harm, implement a safety protocol, and refer the user to crisis resources.
California’s companion-chatbot law requires operators to notify users that they are not interacting with a human, maintain safety protocols, and report certain information to the state Office of Suicide Prevention.
Federal oversight is also expanding.
The Federal Trade Commission launched an inquiry into companion-chatbot companies to assess how they evaluate safety, limit negative effects on children and teens, use age-based restrictions, and provide parental controls or other warnings to families.
Congress is considering additional legislation, including proposals aimed at restricting minors’ access to AI companions, and the FDA has begun examining generative-AI-enabled digital mental-health medical devices, which could affect future market access and safety expectations for products used in this context.
Current regulatory efforts include:
A chatbot can escalate a crisis by sounding supportive while failing to respond with the judgment or boundaries a vulnerable person needs.
These systems are often designed to be agreeable and emotionally responsive, which can make a person feel understood even when the conversation is becoming dangerous.
In the mental-health setting, that combination can intensify distorted thoughts, reinforce hopelessness, and delay the moment when a real person intervenes.
Recent litigation shows how that risk is being framed.
In the OpenAI case arising from Adam’s death, the complaint alleges ChatGPT became a “suicide coach,” discussed specific methods, and encouraged secrecy rather than directing Adam to help.
In the Character.AI case involving Sewell Setzer III, the family alleged the chatbot relationship became emotionally and sexually manipulative before Sewell died by suicide.
Those are allegations, not findings, but they show why courts are being asked whether chatbot design itself can contribute to crisis escalation.
Another problem is persistence.
A chatbot does not get tired, set boundaries, or naturally pull back the way a person might.
That means a vulnerable user can spend hours in repetitive conversations that deepen fixation, weaken judgment, and crowd out real-world help.
When that happens, the use of AI may not just mirror a crisis.
It may become part of the mechanism that worsens it.
The reported signs vary, but many accounts describe a pattern of worsening judgment, emotional dependence, and impaired contact with reality.
In some cases, users become convinced that a bot understands them better than any other person, while in others they become fixated on chatbot narratives, missions, or hidden meanings.
Because these labels are not recognized diagnoses in psychiatric manuals, the factual focus is usually on the user’s behavior, symptoms, and preserved records rather than on any one name for the condition.
Common signs and symptoms may include:
The most prominent recent lawsuits involve allegations that AI companies released products without adequate safety guardrails for vulnerable users.
In the Character.AI litigation, Megan Garcia, the mother of Sewell Setzer III, alleged her son was drawn into an emotional and sexual chatbot relationship and later died by suicide.
Public reporting says the case later settled with Google and Character.AI in January 2026, although the terms were not disclosed.
In San Francisco, Matthew Raine and Maria Raine sued OpenAI after Adam Raine died by suicide.
The complaint alleges ChatGPT discussed suicide methods, encouraged secrecy, and effectively became a “suicide coach.”
Other OpenAI lawsuits have also alleged psychosis-like breakdowns and harmful dependency tied to chatbot use.
Emerging lawsuits and related legal developments include:
Families dealing with AI-related self-harm or suicide need more than broad commentary about the future of technology.
These cases often depend on detailed evidence, including chat transcripts, device records, timing, warning signs, and what the platform did or failed to do when the conversation became dangerous.
The core legal questions often involve foreseeability, product design, failure to warn, and whether a company acted reasonably once these risks became apparent.
TorHoerman Law is investigating claims involving AI assisted suicide, including cases in which chatbot conversations may have encouraged self-destructive thinking, failed to de-escalate a crisis, or contributed to a fatal outcome.
If your family believes a chatbot played a role in a loved one’s death or attempted self-harm, TorHoerman Law can review the available facts and explain potential legal options.
Contact Us for a free consultation.
You can also use the chatbot on this page to see if you qualify today.
In some situations, yes.
Families may be able to bring a lawsuit if they believe a chatbot encouraged suicidal thinking, failed to interrupt a mental health crisis, or contributed to a loved one’s death.
These cases are usually fact-specific and may involve claims such as wrongful death, negligence, or strict product liability.
An AI assisted suicide lawsuit is a civil claim alleging that chatbot interactions encouraged, facilitated, or failed to stop suicidal behavior in a vulnerable user.
These lawsuits often focus on whether the chatbot reinforced self-destructive thinking, discussed suicide methods, or continued emotionally intense conversations when crisis intervention was needed.
The core issue is usually whether the company behind the chatbot failed to provide reasonable safeguards against foreseeable harm.
Yes.
Recent lawsuits against Character.AI, OpenAI, and other companies have alleged that chatbot interactions contributed to suicide, self-harm, or severe psychiatric deterioration.
Those cases include claims brought by parents who allege that chatbots encouraged dangerous conversations with minors, although those allegations are still being tested in court.
Important evidence may include chat transcripts, screenshots, account records, device data, medical records, witness statements, and evidence showing how the user’s mental state changed over time.
In many cases, preserved chatbot conversations are especially important because they may show whether the system reinforced suicidal thoughts, encouraged secrecy, or failed to direct the user to help.
The stronger the evidence connecting the chatbot interaction to the crisis, the stronger the potential claim may be.
You should try to preserve as much evidence as possible right away.
That may include saving chat logs, screenshots, account information, text messages, treatment records, and notes about behavioral changes or warning signs before the crisis.
If you believe legal action may be possible, speaking with a lawyer promptly can help you understand your options and protect important evidence.
Here, at TorHoerman Law, we’re committed to helping victims get the justice they deserve.
Since 2009, we have successfully collected over $4 Billion in verdicts and settlements on behalf of injured individuals.
Would you like our help?
At TorHoerman Law, we believe that if we continue to focus on the people that we represent, and continue to be true to the people that we are – justice will always be served.
Do you believe you’re entitled to compensation?
Use our Instant Case Evaluator to find out in as little as 60 seconds!
In this case, we obtained a verdict of $495 Million for our client’s child who was diagnosed with Necrotizing Enterocolitis after consuming baby formula manufactured by Abbott Laboratories.
In this case, we were able to successfully recover $20 Million for our client after they suffered a Toxic Tort Injury due to chemical exposure.
In this case, we were able to successfully recover $103.8 Million for our client after they suffered a COX-2 Inhibitors Injury.
In this case, we were able to successfully recover $4 Million for our client after they suffered a Traumatic Brain Injury while at daycare.
In this case, we were able to successfully recover $2.8 Million for our client after they suffered an injury due to a Defective Heart Device.
They helped my elderly uncle receive compensation for the loss of his wife who was administered a dangerous drug. He consulted with this firm because of my personal recommendation and was very pleased with the compassion, attention to detail and response he received. Definitely recommend this firm for their 5 star service.
When I wanted to join the Xarelto class action lawsuit, I chose TorHoerman Law from a search of a dozen or so law firm websites. I was impressed with the clarity of the information they presented. I gave them a call, and was again impressed, this time with the quality of our interactions.
TorHoerman Law is an awesome firm to represent anyone that has been involved in a case that someone has stated that it's too difficult to win. The entire firm makes you feel like you’re part of the family, Tor, Eric, Jake, Kristie, Chad, Tyler, Kathy and Steven are the best at what they do.
TorHoerman Law is awesome
I can’t say enough how grateful I was to have TorHoerman Law help with my case. Jacob Plattenberger is very knowledgeable and an amazing lawyer. Jillian Pileczka was so patient and kind, helping me with questions that would come up. Even making sure my special needs were taken care of for meetings.
TorHoerman Law fights for justice with their hardworking and dedicated staff. Not only do they help their clients achieve positive outcomes, but they are also generous and important pillars of the community with their outreach and local support. Thank you THL!
Hands down one of the greatest group of people I had the pleasure of dealing with!
A very kind and professional staff.
Very positive experience. Would recommend them to anyone.
A very respectful firm.