Ground penetrating radar is a valuable tool for locating underground infrastructure, but it has its limitations: it requires an expert operator and provides only an indication of the buried infrastructure as the device is moved over it.
What if you could translate that information into an augmented reality representation of the underground infrastructure identified by the radar?
That’s the ambition of a joint research project underway at the University of Vermont and the University of Tennessee at Chattanooga.
The research is being supported by two National Science Foundation grants and one from US Ignite, a non-profit organisation that helps to accelerate new wired and wireless networking advances from research to prototype to full-scale smart community and interconnected national deployments.
The researchers demonstrated their achievements at the Smart Cities Connect national conference in Kansas City on March 29.
According to Dryver Huston, a mechanical engineering professor at the University of Vermont and the project's principal investigator, when the technology is fully developed, "a person with augmented reality goggles or a specially equipped smart phone or tablet will be able to walk over the area that needs to be inspected, look into the device and see in detail what's underground [two to four metres] down."
The technology is claimed to combine information on the buried infrastructure gathered by the ground penetrating radar with spatial information from a common 3D scanning smart phone app to create a map showing the location of the infrastructure.
According to Huston, "The phone converts the grainy radar scans to clearly recognisable, nuanced three-dimensional objects using augmented reality software, commonly used for video game development. The net result is that the system knows where you are, knows what's underneath, and can show you detailed images of what's there."
The technology has been under development for three years, and the last six months have been spent verifying it at locations where excavations were scheduled.
This has enabled the researchers to compare the information presented by their technology with what is actually under the ground. According to Huston, "it checked out."
The researchers are now working on another component needed to make the technology a practical tool: edge computing. At present the massive amount of data generated is transmitted to a distant cloud computing centre for processing to turn it into the imagery that the operator sees.
The delays involved prevent the operator from seeing the image in real time as they traverse the area, so the researchers are developing technology to process the data close to the location where it is being used.
They claim to have achieved this, transferring the huge amount of data generated by the GPR and the 3D scanning software from a street near the university back to a server on the university campus in real time. This, they say, represented “a technical challenge of the highest order.”
Infrastructure protection news brought to you by PelicanCorp