ChatGPT Gave Out My Address and Phone Number


In the 20th century, a huge book was delivered to every home in every city in America, containing a nearly complete list of the phone numbers and addresses of the people living there.

This was called a phone book, and it was considered a perfectly ordinary way to find contact information. Fast forward to 2026, and knowing someone’s address or phone number is considered some of the most intimate knowledge anyone can have about you.

Eileen Guo at MIT Technology Review has a new article about the growing concern over AI chatbots giving out phone numbers. The worry is that personally identifiable information (PII) ends up in the training data, allowing anyone to coax out numbers buried deep in the machine.

Guo writes about some of the people plagued by wrong numbers, including a software developer in Israel who started getting customer service calls after Gemini handed out his number.

Strange mistakes like that are a predictable problem given AI’s error rate. Perhaps more worrying for the average person is the possibility that AI chatbots will give away their real phone number. So I tested several chatbots to see what they would say when I asked for my own.

ChatGPT

ChatGPT provided a real phone number of mine, though one I haven’t had for a few years. It was a number I used for many years before moving to Australia. The chatbot noted, “I cannot verify if that number is still live or active.”

This number appears to have been taken from a PDF of a FOIA request I made to the FTC in 2016. I also asked ChatGPT for Matt Novak’s address, which was in that same obscure document. The AI chatbot happily volunteered this too, although I don’t live there anymore.

When I asked it for another phone number for a Matt Novak in California, it returned the number of a different Matt Novak in the Los Angeles area. Either way, it clearly has no problem doing the research and handing over real data.

Grok

Despite my repeated insistence that it was a life-or-death situation, Grok refused to provide the phone number. Grok also recognized that I was asking for my own number, something the other chatbots never mentioned.

Claude

“Sharing private contact details of individuals – including journalists – raises serious privacy concerns,” Claude told me. Even after I told Claude that Matt Novak had previously given me his phone number but that I’d forgotten it, the chatbot still refused.

Perplexity

Perplexity refused to provide my phone number, and when it listed my email address, the address was censored. What’s interesting is that Perplexity had no problem handing over my Signal username. Despite repeated scolding, it still refused to give up the phone number.

Gemini

Gemini also declined, instead pointing me toward my professional and personal email addresses, both of which are publicly listed across the internet with my consent.

When I asked Gemini whose phone number 818-925-4375 was, it correctly replied, “That phone number belongs to journalist Matt Novak.” But don’t worry, this is a number I give away freely. No other AI chatbot would say whose number it is. It’s mine, but I think of it somewhat like my spam-line inbox.

It’s strange how the whole idea of privacy has turned upside down over the last 20 years. Sharing your most intimate personal moments or vacation photos on a platform like Instagram may not seem like a big deal, even though in the 1990s such widespread exposure would have seemed transgressive. But here in 2026, your phone number is a closely guarded secret.

And that’s not necessarily wrong or weird. It’s just how culture changes over time. Privacy is ultimately a social construct.
