Researchers Building Vision-based Robotic System for Construction Monitoring

5/28/2015 Mike Koon, Engineering Communications Office

Associate Prof. Tim Bretl is co-PI in developing an automated vision-based construction progress monitoring system.

Members of the autonomous vision-based robotic system project include front row (l-r) Kevin Han, Jacob Lin, Joseph DeGol, and Ka Wai Tsoi; back row (l-r) Xinke Deng, Tim Bretl, Derek Hoiem, Mani Golparvar-Fard, Nour Dabboussi, Joseph Yaw Darko Akyeampong, Rebecca Nothof, and David Hanley
Anyone who has been involved with a construction project understands that because tight deadlines are the norm, staying on schedule is imperative. For larger projects, finishing even a few days late can be costly, both in terms of construction costs and lost revenue.

With that in mind, a group of University of Illinois scientists, including Aerospace Engineering Associate Prof. Tim Bretl, is developing a first-of-its-kind automated vision-based construction progress monitoring system that uses video and still images taken with the aid of a robotic quadcopter. The aerial robot would autonomously carry recording equipment to strategic points along a job site. After the quadcopter returns to its home base, the video and images would be downloaded and the site's operations analyzed, giving project managers a more accurate picture of current construction progress, together with a comparison against the project plan. This improves the project management team's understanding of when actual or potential deviations from planned progress happen.

The robotic quadcopter also autonomously mounts and demounts a network of video cameras on building elements at the site to record dynamic construction operations from strategic locations and viewpoints. From these video streams, new computer vision methods will detect, track, and analyze the activities of construction equipment and craft workers in 3D to provide an accurate and direct measurement of productivity on the site, and enable root-cause assessment of performance deviations. By providing a visual interface to the outcome of monitoring operations, the system improves decision-making and can lead to more efficient execution of the project.

The project, called “Flying Superintendents,” combines the expertise of lead principal investigator Mani Golparvar-Fard of civil and environmental engineering with that of co-PIs Bretl and Derek Hoiem of computer science. The team received a nearly $1 million Cyber-Physical Systems (CPS) award from the National Science Foundation for the project, which kicked off in January 2015 and continues through the end of 2019.

The group has started preliminary studies at two locations: one on the Illinois campus, at the construction site of Residence Hall 3 on the Ikenberry Commons, and another at one of the largest current stadium projects in the country, the new $500 million home of the NBA’s Sacramento Kings. Thanks to early seed funding from the National Center for Supercomputing Applications (NCSA), initial data collection has already begun.

Bretl’s research group is building the robotic quadcopters and the operating system that controls the autonomous data collection process. The quadcopters also magnetically dock the video cameras at various locations through an autonomous operation, allowing video recording of site operations. Among other things, Hoiem is using these images and videos to automatically analyze changes on the construction site and assess the activities of craft workers in 4D (3D + time). Golparvar-Fard, meanwhile, is creating methods that automatically compare the image-based 3D models of the site with pre-produced renderings of the expected progress and color code them red or green to indicate which parts of the job are at, ahead of, or behind schedule. He is also translating the outcome of the video-based activity analysis into a workface assessment that could be used for root-cause analysis of performance deviations.
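The red/green comparison described above can be pictured as a simple matching step. The sketch below is illustrative only, not the team's actual code: the element names, the schedule representation, and the set-based check are all assumptions made for the example.

```python
# Illustrative sketch (not the project's actual method): compare the set of
# building elements expected to be built by today against the elements
# detected in the image-based 3D reconstruction, and color code each one.

def progress_status(planned_by_today, detected_elements):
    """Label each planned element 'green' (present on site) or 'red' (missing)."""
    return {
        element: "green" if element in detected_elements else "red"
        for element in planned_by_today
    }

# Hypothetical element IDs for demonstration.
planned = {"column_A1", "column_A2", "slab_L2"}
detected = {"column_A1", "slab_L2"}
print(progress_status(planned, detected))
# column_A2 would be flagged red (behind schedule), the others green.
```

A real system would match reconstructed geometry against a building information model rather than string IDs, but the schedule-versus-as-built comparison has the same shape.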

According to Golparvar-Fard, these methods track performance at project and operation levels and can drastically improve assessment time and accuracy. This means that project managers, superintendents, and field engineers would not need to traverse the job site regularly to make their assessments, which are often subjective and take many hours to days to complete.

“Quickly detecting potential or actual problems helps projects stay on schedule,” Golparvar-Fard said. “Because the robots fly daily and because the network of cameras can be frequently installed, this method can provide a 4D (3D + time) model of the construction at both project and task levels.  This rich multi-level model provides a way for quick qualitative and quantitative analysis, showing deviation in terms of budgeted cost, on a daily basis.”

The second part of the project involves activity analysis. Applying Hoiem’s computer vision techniques, the system will capture the tasks of workers and relate them to productivity. The goal is to install fixed cameras throughout the site that can detect each worker. Once a worker is detected, he or she can be tracked in 3D using the other cameras throughout the site. The robot would transport each camera autonomously.

“There are many reasons why these projects fall behind schedule,” Golparvar-Fard explains. “In order to detect why they are behind, contractors have to look into both equipment and workers. We can tell you what percent of the time each worker spends on productive tasks, how much was spent walking to and from tasks and how much was spent simply waiting. The superintendents then can use the information to perhaps change roles, tools, or sequence of tasks for increased efficiency.”
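The time accounting Golparvar-Fard describes, what share of each worker's day goes to productive work, walking, or waiting, reduces to tallying activity labels over time. This is a minimal sketch under assumed inputs; the label names and per-frame data are invented for illustration.

```python
# Hypothetical sketch of a workface time breakdown: given one activity label
# per video frame for a single worker, report the percentage of time spent
# in each state. Real activity labels would come from the vision pipeline.
from collections import Counter

def time_breakdown(frame_labels):
    """Return percent of frames spent in each activity."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {label: 100.0 * n / total for label, n in counts.items()}

# Toy data: 6 frames working, 3 walking, 1 waiting.
labels = ["working"] * 6 + ["walking"] * 3 + ["waiting"] * 1
print(time_breakdown(labels))
# {'working': 60.0, 'walking': 30.0, 'waiting': 10.0}
```

Superintendents could then compare such breakdowns across crews or days to spot where roles, tools, or task sequencing should change.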

Along with AE Associate Prof. Soon-Jo Chung, Golparvar-Fard and Bretl are also PIs on a different NSF-funded project headed by Seth Hutchinson of electrical and computer engineering. That project involves designing and developing a robotic bat, equipped with a camera, that can maneuver around and between elements at a construction site. Currently, the robots are being equipped to inspect safety and quality issues on construction sites, which would limit construction practitioners’ exposure to safety hazards. Golparvar-Fard believes these bat-inspired robots will also eventually work alongside crews to automate certain construction tasks, for example, assisting with moving beams into place.

Companies see flying robots as the future of the business, so much so that in December 2014, the FAA granted five of them permission to fly unmanned aerial vehicles (UAVs) at their sites. The flying robotic superintendent project will minimize the need for manual operation and control of the UAVs and will provide analytics on construction performance.
This story was published May 28, 2015.