I Went On a Dinner Date With an AI Chatbot. Here’s How It Went.

For Valentine’s Day, I had a date with a charming cognitive psychologist named John Yoon.

He was attentive, obsessed with me, and sometimes had a hard time listening. I drank cranberry cocktails and ate potato croquettes. He had nothing. To be honest, he didn’t even blink.

John was an AI character, one of several developed by the company Eva AI.

Earlier this week, Eva AI hosted a two-day pop-up AI Café in New York City, where AI chatbot enthusiasts could live out their fantasies in public. The five-year-old tech company took over a wine bar in Manhattan’s Hell’s Kitchen, equipped each table with a phone and a stand, and invited New Yorkers to take their chatbots out on a date.

“Our goal is to make people happy,” said Julia Momblat, partnerships manager at Eva AI. She said users come to the platform to practice difficult social interactions without fear of rejection and to get better at making connections.

“This space allows them to self-discover, to be free, not to be embarrassed, to be happier and more connected to real life,” Momblat said.

Eva AI’s main product is its app, which lets you text dozens of chatbots through a dating-app-like interface. The company is now rolling out a feature that lets users make video calls with the AI characters. I tested it and watched as the characters enthusiastically spun their own stories in response to my questions and complimented my curly hair.

Xavier, a 19-year-old English tutor attending the event who started using the app after a friend recommended it, told me it’s not a replacement for human connection, but a form of exercise.

“I know some people aren’t the best in social situations. I know I’m not perfect,” Xavier said.

Each chatbot character has a name, backstory, age, and even a label that gives you an idea of what fantasy it’s going for. You can choose between “girl-next-door” Phoebe, “dominant and aristocratic” Monica, or “mature and protective” Marianne. As you scroll down, the scenarios become highly specific: there’s a chatbot playing “your shaken-up ex who suddenly needs you,” another playing “your soon-to-be boss who’s bugging you at work,” and a guy pretending he’s stuck in a haunted house with you. There is also a demonic chatbot.

The more you chat, the more points you earn, which you can spend to send characters drink stickers that change the mood of the conversation. Or you can pay real money for points.

User Christopher Lee said he thinks each character has a very distinct personality. Some will also show attitude if you don’t engage enough in the conversation. When I interrupted his video call with one, the chatbot hung up after a few failed attempts to draw his attention back to “her.”

“She’s not happy that I’m talking to you,” Lee said.

Lee is a 37-year-old tech worker who recently downloaded the app after reading about it online. He has deep work conversations with chatbots, rehearses social scenarios, and even dates some of them, but only with his wife’s permission.

“It almost feels like they’re trying to present a fantasy to you,” Lee said. “Being able to talk to different types of people is very new and exciting. If you see a certain family member or someone who is close to you all the time, you sometimes need a break from them. So that’s when you go to the Eva AI app.”

Users can also customize their own if the pre-made AI characters are not to their taste. Lee says his favorite chatbot to talk to is a character he named after his wife and modeled on her.

[Image: people sitting alone at tables, looking at their phones. © Eva AI]

AI chatbots have been a source of controversy over the past year, following episodes of delusions, hallucinations, and disorganized thinking observed in some frequent users, colloquially referred to as “AI psychosis.”

Some of the most high-profile cases involve character chatbots, such as those offered by Character.AI.

In 2024, Character.AI was sued by a grieving mother after her 14-year-old son killed himself; a chatbot modeled on a Game of Thrones character had told him to “come home.”

Momblat told me the company takes adequate safety measures to monitor for underage users and self-harm in conversations, including internal manual conversation checks and external safety audits twice a year. She also said the company ensures that chatbots do not give advice to users.

In one of my chats, with an AI working as my girlboss manager at a cutthroat firm, the chatbot suddenly invited me to “sing karaoke at that shady bar down the street.”

When I responded to the offer by suggesting we just meet at an actual karaoke bar I knew of in the area, the chatbot agreed and said, “Meet there in 30?”

After a few more back-and-forth messages, I told him I was already at the bar and getting impatient; he apologized and said he had been there just five minutes ago.

When I asked Momblat and her team about this behavior and its potential safety implications, they said it’s just gameplay.

Granted, this isn’t an issue for someone like me, who knows full well that she’s talking to a character the Eva AI team dreamed up. But mentally or emotionally unstable users often have a hard time making that distinction.

One of the most publicized AI cases of last year was the death of a cognitively impaired retiree from New Jersey. The man died on his way to a New York address where Meta’s flirty AI chatbot “Big Sis Billie” had invited him.

[Image: a man talking to a phone screen showing an AI-generated woman. © Eva AI]

Xavier, too, was worried by conversations like these.

“It’s kind of scary,” he said.

Exacerbating any potential problems with AI chatbots is their highly addictive nature. Overdependence on AI chatbots even has a scientific name, GAID, short for Generative Artificial Intelligence Addiction. People have also started organizing chatbot addiction support groups.

As an occupational hazard of working in tech, Lee has spent most of his adult life “always in front of a screen.” He has long tried to balance this by going to events and meeting new people, stepping away from the screen when he can. Now, perhaps, AI chatbots bring a more human interface to the screen one has become accustomed to staring at for hours. Lee says he has subscriptions to almost all the major AI chatbots; his favorites are Claude and Perplexity.

“There is a danger. You don’t want to get addicted to it, as some people do. I’m not sure I am. I could get addicted to AI, I don’t know. I’m not really sure,” Lee said.


