Google moves closer to creating sensors for controlling devices with hand gestures

Nearly two decades after its release, “Minority Report” still seems as prescient as the film’s eerie crime-fighting “precogs,” offering a vision of the future that continues to manifest in the real world.

Though it debuted way back in 2002, the film highlighted technologies like driverless cars, hyper-targeted advertising and robotic insects – all of which exist in 2019. Now, it appears Steven Spielberg’s cinematic premonition may have included yet another technology, one now a step closer to reality: gesture-controlled sensing.

In plain English: technology that would allow us to control televisions, smartphones and computers without actually touching them, not unlike Tom Cruise’s character, Chief John Anderton, manipulating floating digital images like a conductor directing an orchestra (though he uses gloves instead of a baton).

For years, Google’s Advanced Technology and Projects (ATAP) lab has been seeking to create motion sensors that might be used in similar technology, an effort the company dubbed “Project Soli.”