Researchers Developing Solution for GPS-denied Environments
While the French police in “The Da Vinci Code” were able to use a GPS chip to track symbologist Robert Langdon through the Louvre Museum after a curator was murdered, Soon-Jo Chung spoils their fun by noting that real-world technology unfortunately hasn’t quite caught up with that storyline yet.
“In real life, the police wouldn’t have been able to do that, as most indoor environments are GPS-denied areas with no GPS signal,” said Chung, assistant professor of Aerospace Engineering at Illinois and the Coordinated Science Laboratory.
Chung, along with CSL Prof. Seth Hutchinson, is working to solve this problem by creating algorithms that can map an area even when GPS signals aren’t available, such as inside a building or in remote areas.
“It’s similar to what you do when you arrive in a new city,” said Hutchinson, professor of electrical and computer engineering. “You build a map of your environment incrementally and are able to figure out where you are in that map. Originally, you don’t know where you are and don’t know where anything else is either.”
Typically, GPS needs direct line-of-sight communication with GPS satellites to work, according to Chung, also a Beckman Fellow of the Center for Advanced Study.
“In a GPS-denied environment, such as on a river where there are overhanging canopies or trees, the signal becomes weak, so we want to develop an autonomous vision-based navigation algorithm that can be used without GPS or in conjunction with weak GPS signals to navigate through these unknown areas,” he said.
In recent years, the team has been awarded two grants totaling over $1 million from the Office of Naval Research to pursue this goal. The first phase of the project, the results of which were reported in the March 2015 IEEE Transactions on Automatic Control, focused on small UAVs and drones developed by the group. Most recently, the team received a three-year, $450,360 grant to test the algorithms in the real world using the Navy’s Special Operations Craft-Riverine (SOC-R) boats in riverine environments. The Navy often uses this type of boat for short-range insertion and extraction of special operations forces in GPS-denied river environments.
Through the use of multiple sensors, it’s possible to apply computer vision algorithms to video, measure acceleration, recognize changes in orientation, and build a low-level map that captures the boundaries of the area and any obstacles along the path. In riverine environments specifically, these measurements are complicated by reflections off the water, as well as overhanging trees and canopies.
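As a rough illustration of the inertial side of this sensing, a planar dead-reckoning step that turns accelerometer and gyroscope readings into a pose estimate might look like the following sketch. All function names, variables, and numbers here are illustrative assumptions, not the team’s actual code:

```python
import math

def propagate_pose(x, y, theta, v, accel, gyro, dt):
    """Dead-reckon a planar pose from one IMU sample.

    accel: forward acceleration (m/s^2) from the accelerometer
    gyro:  yaw rate (rad/s) from the gyroscope
    Illustrative only -- a real system would fuse this with vision.
    """
    theta += gyro * dt              # orientation change from the gyroscope
    v += accel * dt                 # speed change from the accelerometer
    x += v * math.cos(theta) * dt   # advance along the current heading
    y += v * math.sin(theta) * dt
    return x, y, theta, v

# Example: one second of a gentle constant turn at 100 Hz, starting at 1 m/s
x, y, th, v = 0.0, 0.0, 0.0, 1.0
for _ in range(100):
    x, y, th, v = propagate_pose(x, y, th, v, accel=0.0, gyro=0.1, dt=0.01)
```

Because small per-sample errors accumulate, dead reckoning alone drifts over time, which is why the vision measurements described next are needed to correct the estimate.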
Additionally, by leveraging computer vision techniques to create a vision-centric localization algorithm, Chung and Hutchinson have developed novel algorithms that run in real time and integrate information from a variety of sensors to produce a map and a trajectory of the vehicle, day or night. Inertial measurement units (IMUs), sensors that include gyroscopes and accelerometers, provide the additional data needed for precise simultaneous localization and mapping (SLAM). This lets users such as the Navy explore and map an area while precisely tracking the trajectory of an unmanned or manned vehicle in real time. Their vision-based navigation algorithms, which will appear in the 2015 Journal of Field Robotics, show through experimental validation that using reflected features in the novel SLAM algorithm, integrated in a robot-centric mapping framework, can greatly improve the observability and accuracy of the SLAM estimation problem.
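The intuition behind using reflected features can be sketched simply: a landmark’s reflection on the water surface is its mirror image about the water plane, so the direct and reflected sightings constrain the same landmark from two different directions, which is what improves observability. The toy Python sketch below (the geometry and all names are assumptions for illustration, not the authors’ implementation) computes both bearing vectors:

```python
import numpy as np

def landmark_measurements(cam, landmark, water_z=0.0):
    """Direct and water-reflected unit bearing vectors to one landmark.

    The reflection is modeled as the landmark mirrored about the water
    plane z = water_z. Both rays constrain the same landmark, giving the
    estimator two measurements instead of one. Illustrative only.
    """
    mirrored = landmark.copy()
    mirrored[2] = 2.0 * water_z - landmark[2]   # mirror about the water plane
    direct = landmark - cam
    reflected = mirrored - cam
    return (direct / np.linalg.norm(direct),
            reflected / np.linalg.norm(reflected))

cam = np.array([0.0, 0.0, 1.5])     # camera 1.5 m above the water
tree = np.array([10.0, 2.0, 4.0])   # a tree-top landmark on the riverbank
d, r = landmark_measurements(cam, tree)
```

In this model the direct bearing points upward toward the tree top while the reflected bearing points down toward the water, yet both depend on the same landmark position, so each reflection effectively doubles the geometric constraints on that landmark.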
The group has gathered data and tested its algorithms at the Naval Surface Warfare Center in Crane, Indiana, and plans multiple trips there this summer. The researchers are paying close attention to keeping the solution low-cost and lightweight, with a future goal of creating a self-contained system that could potentially replace the GPS navigation systems currently found in smartphones.
“The overarching goal of the project is to develop a vision-based navigation algorithm that can replace GPS navigation,” Chung said. “It’ll be a challenge and we’ll still use GPS, but since we often lose that signal in an urban canyon, dense forest or an indoor environment, we’re working to see how we can seamlessly integrate the two in a GPS-denied area.”
Hutchinson and Chung are not only aiming to improve on GPS-only navigation, but are also working to address U.S. radio astronomers’ concerns about potential radio-frequency interference from home robotic lawnmowers. A complaint filed with the FCC by the National Radio Astronomy Observatory recently received wide media coverage; it urged the commission not to grant iRobot a waiver that would allow its robotic lawnmowers to use a wireless beacon system operating in the spectrum band used by radio telescopes.
Chung believes they have a solution: Mechanical Engineering Ph.D. student Junho Yang, co-advised by Hutchinson and Chung, has been working on vision-based navigation and containment for robotic mowers, funded by John Deere. A video of the work earned a CSL Video of the Month award, and the corresponding paper will be presented at the upcoming IEEE International Conference on Robotics and Automation (ICRA) in May.