Researchers have figured out how to turn everyday objects into “visual microphones” by developing software that translates the tiny vibrations captured in high-speed video back into the sounds that caused them.
It works even when the motion can’t be detected by the naked eye. In fact, the team from Adobe, Microsoft and MIT was able to reconstruct words uttered on the opposite side of soundproof glass from the vibrations of a potato chip bag near the speaker.
From the vibrations. Of a potato chip bag!
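The core idea is simple to sketch, even if the researchers’ actual algorithm (which analyzes sub-pixel motion across the frame) is far more sophisticated: each video frame of the vibrating object yields one sample of a time series, recorded at the camera’s frame rate, and that series can be treated as audio. The toy example below is purely illustrative and is not the MIT team’s method; it simulates a small patch of a vibrating surface whose brightness wobbles with a 440 Hz tone, then recovers the tone from the per-frame intensity.

```python
import numpy as np

# Illustrative sketch only -- NOT the MIT phase-based algorithm.
# Assumed setup: a high-speed camera filming a small patch of a
# vibrating surface (e.g. a chip bag).
FPS = 2200        # assumed camera frame rate (samples per second)
DURATION = 0.5    # seconds of simulated footage
TONE_HZ = 440     # the sound driving the vibration

t = np.arange(int(FPS * DURATION)) / FPS

# Simulate an 8x8 pixel patch whose brightness oscillates slightly
# with the tone, plus sensor noise.
vibration = 0.5 * np.sin(2 * np.pi * TONE_HZ * t)
noise = np.random.default_rng(0).normal(0, 0.2, (t.size, 8, 8))
frames = 128 + vibration[:, None, None] + noise

# "Visual microphone" step: collapse each frame to a single sample by
# averaging pixel intensity, then remove the constant (DC) offset.
signal = frames.mean(axis=(1, 2))
signal -= signal.mean()

# Find the dominant frequency of the recovered signal with an FFT.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / FPS)
recovered = freqs[spectrum.argmax()]
print(round(recovered))  # → 440
```

Averaging the whole patch is the crudest possible motion-to-sound mapping; the real work lies in extracting far subtler, spatially localized motion signals, which is what made reconstruction through soundproof glass possible.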
See it to believe it in the video below or read the full story from the MIT news office.
This article originally appeared on Recode.net.