NullPxl/banrays: Glasses to detect smart-glasses that have cameras. Ray-BANNED

Glasses to detect smart glasses that have cameras

I am experimenting with two main approaches:

  • Optics: Classify cameras using light reflections.
  • Networking: Bluetooth and Wi-Fi analysis.

So far, fingerprinting specific devices based on Bluetooth Low Energy (BLE) seems to be the easiest and most reliable method. The picture below is the first version, which plays The Legend of Zelda ‘Secret Found’ jingle when it detects a Meta Ray-Ban BLE advertisement.

banrays physical v1

I’m essentially treating this README like a logbook, so it will contain my current viewpoints/thoughts.

By sending IR at the camera lens, we can take advantage of the fact that the CMOS sensor in the camera reflects light directly back to the source (called ‘retro-reflectivity’, or the ‘cat-eye effect’) to identify the camera.

irrb

This is not a completely new idea. Some researchers used this property to create a ‘capture-resistant environment’ in 2005 when smartphones with cameras were gaining popularity.

There has even been some recent research (2024) exploring how to classify individual cameras based on their retro-reflections.

We now have the same situation as those researchers in 2005, where smart glasses with hidden cameras are becoming more popular. So I want to make a pair of glasses to identify them. Unfortunately, from what I can tell, most existing research in this area records data with a camera and then uses ML, a ton of controlled angles, etc. to differentiate between normal reflective surfaces and cameras.

I would feel very silly if my solution used its own camera, so I will avoid that. Instead, I think it’s likely that I will have to keep my ‘sweeps’ consistent and rely on building a good classifier based on the signal data. For example, you can see here that the rear camera of my smartphone produces sharper and larger spikes, while the brighter screen produces longer, wider waves.

ts plot labeled

ts plot spikes
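To make that distinction concrete, here’s a rough sketch of the kind of spike-shape features I mean: peak height and width at half maximum over one sweep’s photodiode samples. The thresholds and function names are hypothetical, not the project’s actual code.

```python
def spike_features(samples, baseline=None):
    """Return (peak_height, width_at_half_max) for the largest spike in a sweep.

    `samples` is a list of photodiode readings from one sweep; the baseline
    defaults to the minimum reading in the sweep.
    """
    if baseline is None:
        baseline = min(samples)
    peak_idx = max(range(len(samples)), key=lambda i: samples[i])
    height = samples[peak_idx] - baseline
    half = baseline + height / 2
    # Walk outward from the peak until readings drop below half max.
    left = peak_idx
    while left > 0 and samples[left - 1] >= half:
        left -= 1
    right = peak_idx
    while right < len(samples) - 1 and samples[right + 1] >= half:
        right += 1
    return height, right - left + 1

def classify_sweep(samples, min_height=50, max_width=5):
    """Very rough rule (thresholds made up for illustration): a tall, narrow
    spike looks camera-like (retro-reflection); a wide hump looks like an
    ordinary bright or reflective surface, e.g. a screen."""
    height, width = spike_features(samples)
    if height >= min_height and width <= max_width:
        return "camera-like"
    return "diffuse/other"
```

In practice the thresholds would have to be calibrated per LED wavelength and distance, which is exactly where the inconsistency described below becomes a problem.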

After testing some Meta Ray-Bans, I found that this setup would not suffice. Here’s a test of the camera area, plus a few sweeps of the same area with the lens covered. You can see that the waveform matches what I saw in the first test (small spike for the camera, wide otherwise), but it is extremely inconsistent and the signal strength is much weaker. This was from about 4 inches away from the LEDs. I didn’t notice much difference when swapping between 940nm and 850nm LEDs.

ir rayban first sweeps

So at least with the existing hardware that I have easy access to, it’s probably not enough to make an accurate distinction.

Another idea of mine is to create a specified sweep ‘pattern’. The user (wearing the detector glasses) executes a specific scan pattern over the target. Using the waveforms captured from this data, perhaps we can fingerprint the Ray-Bans more accurately. For example, sweeping across the target lens in a ‘left, right, up, down’ pattern. I tested this by comparing the results of Meta Ray-Bans versus some aviators I had lying around. I think the idea behind this approach is sound (and it’s lightweight), but it may require more workshopping.

ir rayban sweeping pattern
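One lightweight way to compare a recorded sweep against a stored reference is cosine similarity between zero-meaned signatures. This is just a sketch of the matching step, assuming both signatures are resampled to the same length; it is not the project’s actual matcher.

```python
import math

def similarity(sig, template):
    """Cosine similarity between two zero-meaned, equal-length sweep signatures.

    A score near 1.0 means the 'left, right, up, down' response closely
    matches the stored template; a low or negative score means a poor match.
    """
    assert len(sig) == len(template)
    ma = sum(sig) / len(sig)
    mb = sum(template) / len(template)
    a = [x - ma for x in sig]
    b = [x - mb for x in template]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)
```

Zero-meaning makes the score insensitive to ambient IR level, so only the spike pattern itself (not overall brightness) drives the match.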

For the prototype, I am using:

  • An Arduino Uno
  • A bunch of 940nm and 850nm IR LEDs
  • A photodiode as a receiver
  • A 2N2222A transistor

basicsetup

TODO:

  • Try wider sweep patterns
  • Combine data from different wavelengths
  • Alignment?

It’s more complicated than I first thought! My current approach here is to fingerprint Meta Ray-Bans via their Bluetooth Low Energy (BLE) advertisements. But I am only able to detect BLE traffic during 1) pairing and 2) power-on. I also sometimes see advertisements when they are taken out of the case (while already on), but not consistently.

ble detect

The goal is to detect them during use, while they are communicating with the paired phone. But to see this type of directed BLE traffic, it seems like I would first need to capture the CONNECT_REQ packet, which contains the channel map and hop information the connection uses to move between channels in sync. I don’t think I currently have my ESP32 set up to do this type of thing.
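For reference, the fields a connection-following sniffer needs live in the 22-byte LLData portion of that packet (called CONNECT_IND in newer versions of the Core Spec). A sketch of parsing it, assuming you already have the raw LLData bytes from a sniffer:

```python
def parse_connect_ind_lldata(lldata: bytes) -> dict:
    """Parse the 22-byte LLData of a BLE CONNECT_IND (a.k.a. CONNECT_REQ) PDU.

    Field layout per the Bluetooth Core Spec; all multi-byte fields are
    little-endian. The channel map and hop increment are what a sniffer
    needs to follow the connection's channel hopping.
    """
    assert len(lldata) == 22
    return {
        "access_address": int.from_bytes(lldata[0:4], "little"),
        "crc_init":       int.from_bytes(lldata[4:7], "little"),
        "win_size":       lldata[7],
        "win_offset":     int.from_bytes(lldata[8:10], "little"),
        "interval":       int.from_bytes(lldata[10:12], "little"),
        "latency":        int.from_bytes(lldata[12:14], "little"),
        "timeout":        int.from_bytes(lldata[14:16], "little"),
        "channel_map":    int.from_bytes(lldata[16:21], "little"),
        "hop":            lldata[21] & 0x1F,         # hop increment, low 5 bits
        "sca":            (lldata[21] >> 5) & 0x07,  # sleep clock accuracy, high 3 bits
    }
```

If you miss this one packet, you miss the hop sequence, which is why catching the connection setup is the bottleneck.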

For any Bluetooth Classic (BTC) traffic, unfortunately the hardware seems a bit more involved (read: expensive). So if I want to proceed down this route, I’ll probably need a more clever solution here.

When the glasses are turned on or put into pairing mode (or sometimes when taken out of the case), I can identify the device via the advertised manufacturer data and service UUID. 0x01AB is Meta’s Company Identifier (assigned by the Bluetooth SIG, the standards body), and the 16-bit service UUID 0xFD5F is also assigned to Meta.

Capture while glasses are on:

[01:07:06] RSSI: -59 dBm
Address: XX:XX:XX:XX:XX:XX
Name: Unknown

META/LUXOTTICA DEVICE DETECTED!
  Manufacturer: Meta (0x01AB)
  Service UUID: Meta (0xFD5F) (0000fd5f-0000-1000-8000-00805f9b34fb)

Manufacturer Data:
  Company ID: Meta (0x01AB)
  Data: 020102102716e4

Service UUIDs: ['0000fd5f-0000-1000-8000-00805f9b34fb']
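As a sketch of how that detection works without a full BLE stack: an advertising payload is a sequence of AD structures, each `[length][type][data]`, where type 0xFF is Manufacturer Specific Data (first two bytes are the little-endian Company ID) and types 0x02/0x03 are 16-bit service UUID lists. The function name below and the test payload are illustrative, not my actual firmware.

```python
META_COMPANY_ID = 0x01AB    # SIG-assigned Company Identifier for Meta
META_SERVICE_UUID = 0xFD5F  # SIG-assigned 16-bit service UUID for Meta

def parse_ad_structures(payload: bytes):
    """Split a raw BLE advertising payload into (ad_type, data) pairs."""
    i, out = 0, []
    while i < len(payload):
        length = payload[i]
        if length == 0 or i + 1 + length > len(payload):
            break  # zero-length or truncated structure: stop parsing
        out.append((payload[i + 1], payload[i + 2 : i + 1 + length]))
        i += 1 + length
    return out

def looks_like_meta_glasses(payload: bytes) -> bool:
    """True if the payload carries Meta's Company ID or service UUID."""
    for ad_type, data in parse_ad_structures(payload):
        if ad_type == 0xFF and len(data) >= 2:  # Manufacturer Specific Data
            if int.from_bytes(data[:2], "little") == META_COMPANY_ID:
                return True
        if ad_type in (0x02, 0x03) and len(data) >= 2:  # 16-bit UUID lists
            uuids = [int.from_bytes(data[j : j + 2], "little")
                     for j in range(0, len(data) - 1, 2)]
            if META_SERVICE_UUID in uuids:
                return True
    return False
```

This is the same check the ESP32 runs on each advertisement it scans; either identifier alone is enough to flag the device.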

The IEEE specifies some MAC address prefixes (OUI, ‘Organizationally Unique Identifier’), but these addresses tend to be randomized so I don’t expect them to be very useful for BLE.
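That said, when the advertisement is flagged as using a random address, the two most-significant bits of the address still reveal its subtype (static vs. private), which at least tells you whether tracking by address is plausible. A quick sketch; the function name is mine, and this only applies to random (not public/OUI) addresses:

```python
def random_addr_subtype(addr: str) -> str:
    """Classify a BLE *random* address by its two most-significant bits.

    Per the Core Spec: 0b11 = random static (stable until reboot),
    0b01 = resolvable private (rotates, resolvable with the IRK),
    0b00 = non-resolvable private (rotates, not resolvable).
    """
    msb = int(addr.split(":")[0], 16)
    return {
        0b11: "random static",
        0b01: "resolvable private",
        0b00: "non-resolvable private",
    }.get(msb >> 6, "unknown")
```

Resolvable private addresses rotate periodically, which is the main reason the OUI prefixes aren’t useful here.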

Here are some links to more data if you’re curious:



Thanks to Trevor Seitz and Junming Chen for their advice on optics and BLE (respectively). Also to Sohail, who loaned me the Meta Ray-Bans for testing.


