Taking photos through windows is annoying because light bounces off the glass and ruins the picture. But scientists at MIT just developed new technology that could fix those pesky reflections.
Members of MIT’s Media Lab have developed tech that detects how long it takes a camera flash to bounce back off nearby objects (like, say, a window) versus objects that are farther away (your subject). The team then teased out only the signals that illuminated what they actually wanted to photograph.
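To get a feel for the time scales involved, here is a quick back-of-the-envelope calculation. The distances (a window half a meter away, a subject three meters away) are hypothetical, chosen only to illustrate how small the round-trip timing difference is:

```python
# Round-trip flash times for a hypothetical window and subject.
C = 299_792_458  # speed of light in m/s

def round_trip_ns(distance_m: float) -> float:
    """Nanoseconds for light to travel to an object and bounce back."""
    return 2 * distance_m / C * 1e9

window_delay = round_trip_ns(0.5)   # reflection off nearby glass
subject_delay = round_trip_ns(3.0)  # light returning from the subject
print(f"window: {window_delay:.2f} ns, subject: {subject_delay:.2f} ns")
# → window: 3.34 ns, subject: 20.01 ns
```

The gap between the two bounces is under 20 nanoseconds, which is why ordinary cameras can't tell them apart.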
The key here was a principle called the “Fourier transform” — a fundamental signal-processing technique that breaks a signal down into its separate frequencies. Using a modified Microsoft Kinect One camera, the MIT team took a photo and separated it into 45 frequencies, including signals from the flash’s light hitting nearby objects and from light hitting objects farther away. Because those reflections travel different distances, they arrive at the camera’s sensor at slightly different times.
From there, the team could effectively pick and choose which signals to keep. In their experiment, they kept the frequencies that produced a pleasantly lit image of a mannequin head and eliminated those that produced an ugly window flare.
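The pick-and-choose step can be sketched with a simple analogy (this is an illustration of Fourier-based filtering in general, not the team's actual pipeline): represent the near "glare" bounce and the far "subject" bounce as two tones at different, hypothetical frequencies, then use the Fourier transform to discard one and keep the other.

```python
import numpy as np

rate = 1000                   # samples per second (arbitrary)
t = np.arange(rate) / rate    # one second of samples

glare   = np.sin(2 * np.pi * 50 * t)    # hypothetical near-bounce component
subject = np.sin(2 * np.pi * 120 * t)   # hypothetical far-bounce component
mixed = glare + subject                 # what the sensor would record

spectrum = np.fft.rfft(mixed)                      # break the mix into frequencies
freqs = np.fft.rfftfreq(len(mixed), d=1 / rate)
spectrum[np.abs(freqs - 50) < 5] = 0               # zero out the glare's band
recovered = np.fft.irfft(spectrum, n=len(mixed))   # rebuild the cleaned signal

# `recovered` now closely matches the subject component alone.
print(np.max(np.abs(recovered - subject)))
```

Because the two components occupy distinct frequency bins, zeroing the glare's band removes it completely while leaving the subject untouched — the same "keep what you want, drop what you don't" idea, applied here to a toy signal instead of light.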
This cool development could improve not just photography but also ultrasound imaging and lasers that detect explosives. The craziest thing of all, though? The researchers pulled off this complicated image separation with a Kinect camera.
“For this challenging problem, everyone would think that you’d need expensive, research-grade, bulky lab equipment,” physics professor Laurent Daudet said. “This is a very elegant and inspiring line of work.” Surprising what you can do with cheap video game hardware, huh?