Last week, the Federal Communications Commission lifted limits on a radar-enabled chipset from Google that allows users to control mobile devices simply by wiggling their fingers. The FCC ruling is a gesture as much to deregulation as to the commercial potential of the technology.
The agency now permits devices that use Google’s Project Soli silicon to operate at the higher power outputs its sensors require, enabling them to recognize hand movements that are then translated into device commands. Engineers at Google say the Soli chipset can accurately track motion at up to 10,000 frames per second.
In development since 2015, the Soli platform unites a pair of radar architectures and a beam-forming antenna array. Users can manipulate virtual buttons, knobs, dials, keys, levers and sliding arms without touching their device, Google says, making the hand a part of the tools it operates.
Ruling Ignites Spectrum Turf War
Federal limits on power output are in place to prevent unlicensed devices from interfering with other users of the same frequency spectrum. Current US limits, measured in decibel-milliwatts, are lower than those in other regions, including the European Union.
In part by citing those differences, Google first petitioned the FCC for a waiver in March 2018. Several companies and institutions, among them fellow online advertising giant Facebook, expressed concern that the motion-sensor tech could disrupt transmissions.
Facebook has skin in the game of its own through Terragraph, a partnership formed with chip designer Qualcomm. The company hopes to boost wireless capacity in high-density population centers within the same spectrum, starting later this year. That move and others like it came after the FCC repealed Net Neutrality rules in June, a step the agency said would spur investment in the nation’s telecoms infrastructure.
While Soli lacks the bandwidth that traditional sensors use for fine-grained 3D spatial resolution, it can read subtle changes in the signals that bounce back to the chipset’s onboard sensor array. This lets the platform respond to the dexterity of the human hand and process gestures as commands.
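One reason those subtle changes are readable is the short wavelength involved. The sketch below is illustrative, not Google's implementation: it assumes a 60 GHz carrier (Soli operates in that millimeter-wave region), where the roughly 5 mm wavelength means even sub-millimeter finger motion produces a measurable phase shift in the reflected signal.

```python
import numpy as np

# Illustrative physics sketch, not Google's implementation.
# Assumes a 60 GHz carrier; the exact Soli frequency plan is not public here.
C = 3e8                      # speed of light, m/s
F_CARRIER = 60e9             # assumed millimeter-wave carrier, Hz
WAVELENGTH = C / F_CARRIER   # ~5 mm

def displacement_from_phase(delta_phase_rad):
    """Radial displacement implied by a phase shift of the radar echo.

    The signal travels to the target and back, so a displacement d adds
    2*d of path length: delta_phase = 4*pi*d / wavelength.
    """
    return delta_phase_rad * WAVELENGTH / (4 * np.pi)

# A quarter-radian phase shift corresponds to roughly 0.1 mm of motion,
# which is why small finger movements register clearly at this band.
d = displacement_from_phase(0.25)
```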
Developed with German chipmaker Infineon, the Soli chipset transforms raw radar returns into signals representing tracked gestures. Machine learning algorithms estimate the probability of each candidate gesture, and tools at the user-interface layer interpret those gestures as instructions.
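The shape of that pipeline — features in, per-gesture probabilities out, one gesture chosen as the command — can be sketched generically. The gesture names, feature dimensions, and linear scoring below are invented for illustration and bear no relation to Google's actual models.

```python
import numpy as np

# Hedged sketch of a generic gesture-classification stage: the gesture
# labels and the linear scorer are illustrative, not Soli's real models.
GESTURES = ["tap", "swipe", "dial"]

def softmax(x):
    """Convert raw scores into probabilities that sum to 1."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def classify(features, weights):
    """Score each candidate gesture, return the most probable one."""
    scores = weights @ features   # one score per gesture
    probs = softmax(scores)
    best = int(np.argmax(probs))
    return GESTURES[best], float(probs[best])

# Toy run: 3 gestures scored from a 4-dimensional feature vector.
rng = np.random.default_rng(0)
weights = rng.standard_normal((3, 4))
label, p = classify(rng.standard_normal(4), weights)
```

In a real system the linear scorer would be replaced by a trained model, but the contract is the same: the UI layer receives a discrete gesture label plus a confidence it can threshold against.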
Mutual Benefits Mark Path to FCC Rule Change
After the search-engine behemoth successfully allayed Facebook's fears that the Soli platform would interfere with other on-device systems, the pair jointly informed the FCC in September, clearing the way for last week's ruling.
In it, Julius A. Knapp, chief of the agency's Office of Engineering and Technology, touted the potential benefit of the technology for disabled users, including those unable to utter voice commands. The regulator also approved Soli-equipped devices for use on airplanes, saying that concerns the tech would interfere with onboard communications were misplaced.
Spearheaded by engineers at Google’s Advanced Technology and Projects Group (ATAP), the platform marries Frequency Modulated Continuous Wave architecture with Direct Sequence Spread Spectrum architecture in a solid-state chipset. Soli’s radar sensors recognize gestures more efficiently than competing technologies, including infrared light and Radio Frequency Identification, or RFID.
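The FMCW half of that design can be illustrated with the textbook range equation: a linear chirp of bandwidth B swept over duration T mixes with its own echo to produce a beat frequency proportional to target range. The parameter values below are assumptions for illustration, not Soli's actual chirp configuration.

```python
# Textbook FMCW range estimate; B and T are assumed values, not Soli's
# real chirp parameters.
C = 3e8      # speed of light, m/s
B = 7e9      # assumed sweep bandwidth, Hz
T = 1e-4     # assumed chirp duration, s

def range_from_beat(f_beat_hz):
    """R = c * T * f_b / (2 * B) for a linear up-chirp.

    The echo from a target at range R is delayed by 2R/c, and during
    that delay the transmit frequency climbs by f_b = (B/T) * (2R/c).
    """
    return C * T * f_beat_hz / (2 * B)

# With these parameters a 10 kHz beat tone implies a target ~2.1 cm
# from the sensor - hand-interaction distances.
r = range_from_beat(1e4)
```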
The division, focused on mobility R&D, dates from Google’s ownership of handset maker Motorola, which ended in 2014 with the handset business’s sale to China’s Lenovo. At Alphabet-owned Google, ATAP engineers worked in rapid prototyping cycles over a period of 10 months to scale Soli’s sensor and antenna arrays in a hardware-neutral form factor for mobile devices and wearables.
Myriad Applications Push Fast-Track Development
Google began releasing the technology to both researchers and commercial enterprises in 2016, with developers demonstrating applications for object recognition, 3D imaging, predictive drawing, an in-car remote control that operates a cell phone, and a rudimentary security system.
A software development kit helps developers create applications with Soli’s gesture-recognition libraries. The libraries expose precision position and motion data, letting developers set gesture parameters and label the real-time signals received from radar hardware.
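The Soli SDK itself is not reproduced here; the hypothetical wrapper below only illustrates the workflow the article describes — declare gesture parameters, then label incoming real-time signal frames against them. Every class, field name, and threshold is invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch of a parameter-then-label workflow; these are NOT
# real Soli SDK types or field names.
@dataclass
class GestureParams:
    name: str
    min_speed: float     # illustrative threshold, m/s
    max_range_m: float   # illustrative threshold, meters

def label_frame(speed, range_m, params_list):
    """Return the first gesture whose parameters match the frame, else None."""
    for p in params_list:
        if speed >= p.min_speed and range_m <= p.max_range_m:
            return p.name
    return None

params = [GestureParams("flick", min_speed=0.5, max_range_m=0.3),
          GestureParams("hold", min_speed=0.0, max_range_m=0.1)]

g = label_frame(speed=0.8, range_m=0.2, params_list=params)  # "flick"
```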
Commercial prototypes, including a wireless sound system from US speaker manufacturer JBL and smartwatches from Korea’s LG, followed in 2017. And in a paper published in December, researchers at the University of St Andrews in Scotland described creating stacking, counting, movement and orientation applications with Soli.