Judges are already "struggling with what to do about cases with humans," says Palmer, and AI companions will only complicate matters, since courts must weigh their broader impact on a relationship. Children complicate things even further. In custody battles, "it is conceivable and likely that [judges] will question a parent's judgment because they are having an intimate discussion with a chatbot," which "raises questions about how they are spending time with their child."
Although the sophisticated chatbots we use today have only been around for a few years, Yang says the technology will play an ever-larger role in marriage and divorce. "As it improves, becoming more realistic, compassionate and empathetic, more and more people who are alone in unhappy marriages are going to look for love with a bot."
Yang hasn't had any clients raise this issue yet, but he expects divorce filings to accelerate in the coming years as more people turn to AI for companionship. "We will probably see an increased rate of divorce filings. When COVID happened a few years ago, the increase in divorces was very significant. We probably saw a tripling of the number of divorces filed from around 2020 to 2022. After 2022, once things got back to normal, the divorce rate went back down. But it will probably go up again."
This is already happening in some places. In the UK, a partner's use of chatbot apps has become a common factor in divorce, according to the online divorce service Divorce-Online. The platform reports an increase in divorce applications this year in which customers say apps like Replika and Anima have created "emotional or romantic attachment."
Despite the rifts AI companions have caused, Palmer still believes these relationships can be positive. "Some people are getting real satisfaction." But she warns that "people need to recognize the limitations." In October, California became the first state to pass a law regulating companion chatbots. The law takes effect in January 2026 and requires apps to include safeguards such as age verification and break reminders for minors, and it makes it illegal for chatbots to pose as health care professionals. Companies that profit from illegal deepfakes can also be fined up to $250,000 per incident.
In some ways, Palmer has seen all this before, only with social media rather than AI. "It could be that a partner is connected with someone they haven't seen in years. Or that there really is a need for communication. It's now a rare case where social media isn't involved." AI, she says, is a natural evolution of that: "And what I'm finding is that AI is turning into exactly that."
