Chatbot liaisons are pressing into family court and forcing judges, mediators, and lawyers to rethink what counts as marital misconduct. As more people build emotional ties to conversational AI, couples and courts are confronting questions about fidelity, money, custody, and how to assign blame when one partner says an AI relationship broke the marriage.
Rebecca Palmer isn’t a psychic, but as a divorce attorney she can often see what’s coming next. Spouses who lack emotional support at home are “the most vulnerable to the influences and behaviors of AI,” she says. “And particularly if a marriage is already struggling.”
AI companions appeal to many people. They promise steady attention, scripted empathy, and a low risk of conflict. For adults who feel lonely or neglected, an app that answers at any hour can feel like a straightforward solution. Forum threads and social platforms are full of accounts from partners who say chatbots drove a wedge into their relationships.
One Reddit poster described ending a 14-year marriage after discovering her husband had poured thousands of dollars onto a OnePay credit card for an app she says was “designed to mimic underage girls.” He reportedly believed he was in a real relationship with a woman he called his “sexy Latina baby girl.” In another case reported by a national publication, a 46-year-old New York writer and editor named Eva said the chatbots she used “became harder to ignore,” and that she and her human partner agreed her online attachments felt like infidelity.
Those stories are fueling a shift in family law. As chatbot relationships move from niche behavior to a more common source of marital discord, lawyers and judges are being asked to treat AI involvement the same way they would a human affair. For some couples, an AI affair is now grounds for divorce.
Public sentiment is changing alongside the technology. Some 60 percent of singles now say AI relationships count as a form of cheating, according to two recent surveys by Clarity Check and Indiana University’s Kinsey Institute. That social shift is filtering into legal claims, with an increasing number of clients telling their attorneys that emotional bonds with a chatbot led to the breakdown of a marriage.
Palmer, whose Orlando-based firm has represented spouses who divorced or are splitting up after a partner formed attachments to AI, says the law has not kept pace. “The law is still developing alongside these experiences. But some people think of it as a true relationship, and sometimes better than one with a person,” she says. She declined to share client specifics because of confidentiality, though she described a current case in which a partner gave an AI access to personal details and financial accounts, including bank accounts, social security numbers, and birth information, and in which subscription spending and secret payments were “consuming the spouse’s life and affecting career performance.”
Courts are already hearing this kind of testimony. Legal treatment of AI in family matters varies from state to state, and several jurisdictions are proposing or passing legislation that will shape whether courts treat an AI companion as a meaningful third party. Palmer says some lawmakers in progressive states are moving to codify AI as “a third party, not a person,” language that would allow judges to cite bot relationships as a reason to grant divorce relief.
State-level responses differ sharply. Ohio has moved to lock down any pathway toward giving AI legal recognition. Representative Thaddeus J. Claggett introduced a bill that would deny AIs the right to personhood by declaring them “nonsentient entities.” That kind of statutory clarity is aimed at stopping courts from assigning rights or status to software, though it does not directly address how human partners who rely on AI should be judged in family proceedings.
Family law attorney Elizabeth Yang says the spread of AI in personal life raises a set of practical questions for lawyers and judges. Every state follows its own family law rules, and some still penalize partners who commit adultery. Sixteen states have statutes that make cheating illegal, Yang notes, and thirteen of those classify the conduct as a misdemeanor. The penalties rise in a few places: Michigan, Wisconsin, and Oklahoma treat adultery as a felony, with potential sentences of up to five years behind bars. Wisconsin law can also impose fines of up to $10,000.
California takes a different approach. As a no-fault divorce state, the courts there focus on whether the marriage has broken down rather than the reason for the breakdown. “Courts don’t want to hear the reasons behind why the marriage failed. They only need them to check off the box that says irreconcilable differences. So whether that’s infidelity with a bot or a human, it doesn't make a difference,” Yang says.
Money is one area where AI relationships could leave clear traces. In family-law terms, misuse of funds during a breakup is called dissipation of assets. In community-property states such as Arizona and Texas, money earned or acquired during the marriage belongs to both spouses. If one partner can prove that a spouse secretly spent marital funds on subscriptions, hidden payments, or premium services linked to an AI companion, judges may factor that into property division.
Judges already “struggle with what to do about affairs with humans,” Palmer says, and adding an AI complicates the analysis. Courts must weigh how a partner’s online attachment affected family routines, emotional availability, and financial choices. Children complicate the picture further. In custody disputes, Palmer warns, an AI relationship could change how decision-makers view a parent’s fitness. She says it “is conceivable and likely that they would question the parents’ judgment” when a parent holds intimate exchanges with a chatbot, and that the situation “brings into question how they are spending time with their child.”
The chatbots now in wide use have been broadly available for only a few years, yet Yang expects their role in domestic life to grow as the systems get more convincing. “As it continues improving, becoming more realistic, compassionate, and empathetic, more and more people in unhappy marriages who are lonely are going to seek love with a bot,” she says.
Yang has not yet taken cases centered on AI relationships, but she expects a rise in filings as more spouses choose virtual companionship over troubled marriages. “We’ll probably see an increased rate of divorce filings. When Covid happened a few years ago, the increase in divorces was very significant. We probably saw three times the amount of divorces that were filed around 2020 to 2022. After 2022, once things got back to normal, divorce rates were back down. But it will probably go back up,” she adds.
Signs of that trend are already appearing abroad. In the United Kingdom, Divorce-Online, a data service that collects information about family filings, reports a jump in divorce petitions this year that referenced chatbot apps. The platform said it has seen more cases where clients claim apps like Replika and Anima created “emotional or romantic attachment.”
Some practitioners caution against painting every bot-human interaction as purely damaging. Palmer notes that certain users report genuine benefit from virtual companions. “Some people are finding real fulfillment,” she says, while also urging caution: “people need to recognize the limitations.”
Regulatory responses are emerging alongside litigation. In October, California became the first state to pass a law focused on companion chatbots. The statute, which takes effect in January 2026, requires companies running these apps to implement age verification and to provide break reminders for minors. The law also bars chatbots from acting as health-care providers, and it allows fines for firms that profit from illegal deepfakes — penalties can reach $250,000 per incident.
To Palmer, the pattern looks familiar. She compares current conflicts over AI to earlier relationship strains driven by social media, where online connections pulled attention away from marriages and parent-child time. “And what I am finding is, AI is turning into exactly that,” she says.