
Seeing isn't believing – flickering lights could reveal deepfaked videos

Asst. Prof. Abe Davis (left) and computer science grad student Peter Michael with an NCI-code-generating light

Thanks to advances in generative AI, seeing video of an event is no longer proof that it actually happened as shown. There could be new hope on the horizon, however, in the form of an authentication system that watermarks videos using fluctuations in the on-location lighting.

There are already systems that digitally watermark video footage via the camera that shoots it, but these only work if a specially adapted camera is used. What's needed is a technology that automatically marks the video recorded by any camera, by any person. A team led by Cornell University's Asst. Prof. Abe Davis has created just such a system, known as "noise-coded illumination" (NCI).

In a nutshell, NCI involves adding a coded flicker to one or more of the lights that are illuminating the subject. This flicker consists of tiny, rapid fluctuations in brightness – or video "noise" – which aren't noticeable to the human eye.
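
To give a rough sense of how such a code might work, here's a minimal sketch in Python. The code design, update rate, flicker amplitude and function names are illustrative assumptions, not details from the Cornell system: it simply modulates a light's brightness with a small, seeded pseudorandom signal, with the seed acting as the secret key.

```python
import numpy as np

def make_noise_code(n_samples: int, seed: int) -> np.ndarray:
    """Pseudorandom +/-1 code; the seed acts as the secret key."""
    rng = np.random.default_rng(seed)
    return rng.choice([-1.0, 1.0], size=n_samples)

def modulated_brightness(base_level: float, code: np.ndarray,
                         amplitude: float = 0.02) -> np.ndarray:
    """Brightness per sample: tiny fluctuations around the base level."""
    return base_level * (1.0 + amplitude * code)

# e.g. drive a lamp with 240 code samples over one second
code = make_noise_code(240, seed=1234)
levels = modulated_brightness(0.8, code)  # ~2% flicker, too subtle to see
```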

Some light sources, such as ambient room lighting and computer screens, can be directly programmed to emit the secret code. Other sources, like stand-alone photographic lamps, can be controlled via an attached chip which is "about the size of a postage stamp."

In either case, the recorded video looks normal to a casual observer. When the footage is analyzed by a computer that holds the key to the code, however, that code can be used to recover its own low-res, time-stamped version of the video.
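
One plausible way to picture the decoding step – hedged, since this is an illustrative sketch rather than the published NCI algorithm – is matched filtering: correlate each pixel's brightness over time against the secret code, window by window, to isolate the coded light's contribution and build up the low-res code video.

```python
import numpy as np

def decode_code_video(frames: np.ndarray, code: np.ndarray,
                      window: int = 24) -> np.ndarray:
    """frames: (T, H, W) grayscale video aligned sample-for-sample with
    `code` (length T). Returns a (T // window, H, W) code video."""
    T, H, W = frames.shape
    n_win = T // window
    out = np.empty((n_win, H, W))
    for i in range(n_win):
        sl = slice(i * window, (i + 1) * window)
        chunk = frames[sl] - frames[sl].mean(axis=0)  # remove the static scene
        # correlating with the known code isolates the coded light's
        # contribution at each pixel within this window
        out[i] = np.tensordot(code[sl], chunk, axes=(0, 0)) / window
    return out
```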

As long as the footage hasn't been digitally manipulated after being shot, the code-generated and main versions of the video will visually match (apart from the resolution). If manipulation has occurred, however, it will show up as obvious visual discrepancies in the code-generated version – these could include blacked-out sections of the screen, or even the complete absence of any discernible image.
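
The comparison itself could then be as simple as the following sketch, in which the metric and threshold are invented for illustration: normalize both the code-recovered video and a matching low-res copy of the main footage, then flag wherever the two disagree.

```python
import numpy as np

def flag_tampering(code_video: np.ndarray, main_video_lowres: np.ndarray,
                   threshold: float = 0.25) -> np.ndarray:
    """Both inputs: (T, H, W) arrays at the same resolution.
    Returns a boolean (T, H, W) mask of suspect regions."""
    # normalize each to zero mean / unit variance, so structure rather
    # than absolute brightness is what gets compared
    a = (code_video - code_video.mean()) / (code_video.std() + 1e-8)
    b = (main_video_lowres - main_video_lowres.mean()) / (main_video_lowres.std() + 1e-8)
    return np.abs(a - b) > threshold

# Manipulated regions show up as large disagreements – e.g. inserted
# content that never saw the coded light leaves holes in the code video.
```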

What's more, to further boost security, different lights illuminating a single subject can each be programmed to generate their own unique NCI code.

"Even if an adversary knows the technique is being used and somehow figures out the codes, their job is still a lot harder," says Davis. "Instead of faking the light for just one video, they have to fake each code video separately, and all those fakes have to agree with each other."

Noise-Coded Illumination (video)

Source: Cornell University
