Parking Camera Calibration for Assisting Automated Road Defect Detection

Accepted version
Peer-reviewed

Type

Article

Authors

Radopoulou, Stefania C 
Brilakis, Ioannis 

Abstract

Accurate and timely information is essential for efficient road maintenance planning. Current practice relies mainly on manual visual surveys, which are laborious, time-consuming, subjective, and not frequent enough. We addressed this limitation in our previous work by proposing a method that automatically detects road defects in video frames collected by a parking camera. Because of its wide field of view, such a camera also captures the road's surroundings, such as sidewalks and the sky, which unnecessarily reduces the method's performance. This paper presents a process that identifies the correct Region of Interest (myROI): the portion of the camera's field of view that corresponds to the road lane, taking defect inspection guidelines into account. We use the theory of inverse perspective mapping (IPM) to map the road's frame coordinates to world coordinates. The camera specifications and position, the lane width, and road defect detection guidelines constitute the parking camera calibration parameters used to calculate myROI's span and boundaries. We performed computational experiments in MATLAB to calculate myROI and validated the results with field experiments, in which we measured the road defects with a metric tape. Preliminary results show that the proposed process is capable of calculating myROI.
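
As background for the IPM step described above, the sketch below illustrates how, for a flat road and an ideal pinhole camera, a ground-distance inspection range and a lane width can be mapped to pixel bounds of a region of interest. This is an illustrative reconstruction only, not the authors' MATLAB implementation; the focal length, frame size, camera height, pitch angle, and 2-6 m inspection range used here are assumed placeholder values.

    import numpy as np

    # Minimal flat-road inverse perspective mapping (IPM) sketch for a pinhole
    # camera looking down at the road. All numeric values are illustrative
    # placeholders, NOT the paper's calibration parameters.
    FOCAL_PX = 700.0                 # focal length in pixels (assumed)
    IMG_W, IMG_H = 1280, 720         # frame size in pixels (assumed)
    CX, CY = IMG_W / 2, IMG_H / 2    # principal point, assumed at image centre
    CAM_HEIGHT = 0.8                 # camera mounting height in metres (assumed)
    CAM_PITCH = np.deg2rad(15.0)     # downward pitch of optical axis (assumed)
    LANE_WIDTH = 3.5                 # lane width in metres (assumed typical value)

    def ground_to_row(d):
        """Image row of a road-plane point d metres ahead of the camera."""
        depression = np.arctan(CAM_HEIGHT / d)   # angle below the horizontal
        return CY + FOCAL_PX * np.tan(depression - CAM_PITCH)

    def lane_half_width_px(d):
        """Half the lane width, in pixels, at ground distance d."""
        # Depth of the point along the optical axis after the pitch rotation.
        depth = d * np.cos(CAM_PITCH) + CAM_HEIGHT * np.sin(CAM_PITCH)
        return FOCAL_PX * (LANE_WIDTH / 2.0) / depth

    # ROI for an assumed inspection range of 2 m to 6 m ahead of the camera.
    v_top, v_bottom = ground_to_row(6.0), ground_to_row(2.0)
    half = lane_half_width_px(2.0)   # lane is widest in pixels at the near edge
    u_left = max(0.0, CX - half)
    u_right = min(float(IMG_W), CX + half)
    print(f"myROI rows {v_top:.0f}-{v_bottom:.0f}, "
          f"cols {u_left:.0f}-{u_right:.0f}")

Because the lane's pixel width shrinks with distance, the true lane region is a trapezoid in the image; the rectangle computed above is simply its axis-aligned bound, and the actual extent depends entirely on the real calibration parameters.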

Description

This is the author-accepted manuscript. It is currently under an indefinite embargo pending publication by Osaka University.

Keywords

region of interest, inverse perspective mapping

Conference Name

International Conference on Computing in Civil and Building Engineering

Publisher

Osaka University

Sponsorship

This material is based in part upon work supported by the National Science Foundation under Grant Number 1031329.