Automatic far-field camera calibration for construction scene analysis

Amin Assadzadeh, Mehrdad Arashpour, Alireza Bab-Hadiashar, Tuan Ngo, Heng Li

Research output: Journal article, peer-reviewed

20 Citations (Scopus)

Abstract

The use of cameras for safety monitoring, progress tracking, and site security has grown significantly on construction and civil infrastructure sites over the past decade. Localization of construction resources is a crucial prerequisite for many applications in automated construction management. However, most existing vision-based methods perform the analysis in the image plane, overlooking the effects of perspective and depth. The manual, labor-intensive nature of traditional calibration techniques, together with the busy and restrictive construction environment, makes on-site camera calibration a challenging task. This study proposes a framework for automatic camera calibration that requires no manual intervention. The framework uses convolutional neural networks for geometric scene analysis and object detection to estimate the locations of the horizon line and the vertical vanishing point, as well as objects with known height distributions. These estimates enable automatic recovery of the camera parameters and the metric scale. The proposed framework is evaluated on images from two major construction projects in Melbourne, Australia. Results show that the proposed method achieves a minimum accuracy of 90% in estimating the proximity of points on the ground and can facilitate further development of vision-based solutions for safety and productivity analysis.
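
The abstract summarizes the pipeline without equations. As an illustration only, the sketch below shows the standard single-view metrology relation (Criminisi et al.) commonly used to retrieve metric scale from an object of known height once the horizon line and vertical vanishing point are available. It is a minimal sketch under those assumptions, not the authors' implementation; the function names, example coordinates, and the 1.75 m worker height are hypothetical.

```python
import numpy as np

# Single-view metrology relation (Criminisi et al., 2000):
#   alpha * Z = ||b x t|| / ( |l . b| * ||v x t|| )
# where l is the ground-plane vanishing line (horizon), v the vertical
# vanishing point, b / t the base / top image points of an upright object of
# metric height Z, and alpha a global scale factor for the scene.
# Absolute values sidestep sign conventions that depend on how the
# homogeneous quantities are normalised.

def scale_from_known_height(horizon, v_vert, base, top, known_height):
    """Estimate alpha from one detected object whose real height is known
    (e.g. the mean of an assumed worker-height distribution)."""
    l, v = np.asarray(horizon, float), np.asarray(v_vert, float)
    b, t = np.asarray(base, float), np.asarray(top, float)
    num = np.linalg.norm(np.cross(b, t))
    den = abs(np.dot(l, b)) * np.linalg.norm(np.cross(v, t))
    return num / (den * known_height)

def height_from_scale(alpha, horizon, v_vert, base, top):
    """Recover the metric height of another upright object once alpha is known."""
    l, v = np.asarray(horizon, float), np.asarray(v_vert, float)
    b, t = np.asarray(base, float), np.asarray(top, float)
    num = np.linalg.norm(np.cross(b, t))
    den = abs(np.dot(l, b)) * np.linalg.norm(np.cross(v, t))
    return num / (den * alpha)

# Hypothetical usage: horizon and vertical vanishing point from a CNN-based
# scene-geometry stage, base/top points from a person detection, and 1.75 m
# as the assumed mean worker height.
horizon = np.array([0.0, 1.0, -400.0])    # image line a*x + b*y + c = 0
v_vert = np.array([960.0, 8000.0, 1.0])   # homogeneous image point
alpha = scale_from_known_height(
    horizon, v_vert,
    base=np.array([500.0, 700.0, 1.0]),
    top=np.array([500.0, 560.0, 1.0]),
    known_height=1.75,
)
```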

Original language: English
Journal: Computer-Aided Civil and Infrastructure Engineering
DOIs
Publication status: Accepted/In press - 2021

ASJC Scopus subject areas

  • Civil and Structural Engineering
  • Computer Science Applications
  • Computer Graphics and Computer-Aided Design
  • Computational Theory and Mathematics
