
A machine learning-based method for the large-scale evaluation of the qualities of the urban environment

Published version
Peer-reviewed

Type

Article

Authors

Silva, EA 
Wang, H 

Abstract

Given the size of modern cities, it is beyond the perceptual capacity of most people to develop a good knowledge of the qualities of the urban space at every street corner. Correspondingly, it is also difficult for planners to accurately answer questions such as ‘where in the city is the physical environment most dilapidated, so that regeneration should be given first consideration’ and ‘how is the appearance of fast-urbanising cities changing’. To address this issue, we present a computer vision method comprising three machine learning models for the large-scale, automatic evaluation of the qualities of the urban environment, leveraging state-of-the-art machine learning techniques and wide-coverage street view images. From the various physical qualities that previous research has identified as important for the urban visual experience, we choose two key qualities to be measured in this study: the construction and maintenance quality of building facades and the continuity of the street wall. To test the validity of the proposed method, we compare the machine scores with public rating scores collected on-site from 752 passers-by at 56 locations in the city. We show that the machine learning models produce a medium-to-good estimation of people's real experience, and that the modelling results can be used in many ways by researchers, planners and local residents.
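As a minimal illustration of the validation step described above (not part of the paper; the file name and column names are hypothetical), the following Python sketch compares machine-predicted quality scores with averaged on-site ratings across survey locations using rank and linear correlation:

import pandas as pd
from scipy.stats import spearmanr, pearsonr

# Hypothetical input: one row per surveyed location, holding the model's
# predicted quality score and the mean rating given by passers-by there.
scores = pd.read_csv("location_scores.csv")  # columns: location_id, machine_score, mean_public_rating

rho, p_rho = spearmanr(scores["machine_score"], scores["mean_public_rating"])
r, p_r = pearsonr(scores["machine_score"], scores["mean_public_rating"])

print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
print(f"Pearson r    = {r:.2f} (p = {p_r:.3f})")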

Description

Keywords

machine learning, physical quality, street view image, urban design, architecture

Journal Title

Computers, Environment and Urban Systems

Conference Name

Journal ISSN

0198-9715
1873-7587

Volume

65

Publisher

Elsevier

Sponsorship

This research is funded by the National Natural Science Foundation of China (Grant No. 51478232), the Independent Research Project of Tsinghua University (Grant No. 20131089262) and a scholarship from the China Scholarship Council (CSC No. 201306210039).