Researchers have figured out how to turn everyday objects into “visual microphones” by developing a software algorithm that can translate vibrations picked up on high-speed video into the sounds that caused them.
It works even when the motion can’t be detected by the naked eye. In fact, the team from Adobe, Microsoft and MIT was able to reconstruct words uttered on the opposite side of soundproof glass from the vibrations of a potato chip bag near the speaker.
From the vibrations. Of a potato chip bag!
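For the curious, here is a rough sketch of the basic idea in Python: treat the tiny frame-to-frame changes of the filmed object as a one-dimensional signal sampled at the camera's frame rate, then play that signal back as audio. This is only a toy illustration, not the researchers' actual algorithm (their method uses far more sophisticated local motion analysis); the file name `clip.mp4` and the frame rate are hypothetical placeholders.

```python
# Toy illustration of the "visual microphone" idea, NOT the MIT team's algorithm.
# Assumes OpenCV, NumPy, and SciPy; the input file and frame rate are hypothetical.
import cv2
import numpy as np
from scipy.io import wavfile

FPS = 2200  # frame rate of the (hypothetical) high-speed recording

# Load the video as grayscale frames.
cap = cv2.VideoCapture("clip.mp4")
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float64))
cap.release()

# Treat the average brightness change of the vibrating object, frame by frame,
# as a signal sampled at the video frame rate.
reference = frames[0]
signal = np.array([np.mean(f - reference) for f in frames])

# Remove slow drift, normalize, and write out as 16-bit audio.
signal -= np.convolve(signal, np.ones(51) / 51, mode="same")
signal /= np.max(np.abs(signal)) + 1e-12
wavfile.write("recovered.wav", FPS, (signal * 32767).astype(np.int16))
```

A real implementation has to recover motions far smaller than a pixel, which is why the researchers rely on a high-speed camera and careful motion processing rather than raw brightness averages like the sketch above.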
See it to believe it in the video below or read the full story from the MIT news office.
This article originally appeared on Recode.net.