Digital Twin Journeys: Teaching a Computer to See
Publication Date
2022-03-18
Series
Digital Twin Journeys
Publisher
Centre for Digital Built Britain
Type
Report
Metadata
Citation
Lamb, K., Fenby-Taylor, H., & Danish, M. (2022). Digital Twin Journeys: Teaching a Computer to See. https://doi.org/10.17863/CAM.82150
Abstract
For asset owners and managers, understanding how people move through and use the built environment is a high priority, enabling better, more user-focused decisions. However, many of the methods for gathering these insights can feel invasive to users. The latest output from Digital Twin Journeys looks at how a researcher at the University of Cambridge has addressed this problem by teaching a computer to see.
Working from the University of Cambridge Computer Laboratory, Matthew Danish is developing an innovative, low-cost sensor that tracks the movement of people through the built environment. DeepDish is based on open-source software and low-cost hardware, including a webcam and a Raspberry Pi. Using machine learning, Matthew first taught DeepDish to recognise pedestrians and track their journeys through a space, and then began training it to distinguish pedestrians from Cambridge's many cyclists.
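The report does not detail DeepDish's internals, but the core idea it describes, detecting people frame by frame and linking those detections into journeys, can be illustrated with a simplified centroid tracker. The sketch below is a hypothetical illustration, not DeepDish's actual algorithm: it assumes an upstream detector (e.g. a pretrained neural network) has already reduced each video frame to a list of person centroids, and it associates detections across frames by nearest-centroid matching.

```python
import math

class CentroidTracker:
    """Minimal multi-object tracker: links detections across frames by
    nearest-centroid matching. A simplified sketch for illustration only;
    DeepDish's real tracking pipeline is not described in this report."""

    def __init__(self, max_distance=50.0):
        self.next_id = 0        # ID to assign to the next new track
        self.tracks = {}        # track ID -> last known centroid (x, y)
        self.max_distance = max_distance  # match threshold in pixels

    def update(self, centroids):
        """Match each detection to the nearest existing track within
        max_distance; start a new track for unmatched detections."""
        assigned = {}
        unmatched = set(self.tracks)
        for c in centroids:
            best_id, best_d = None, self.max_distance
            for tid in unmatched:
                d = math.dist(c, self.tracks[tid])
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None:       # nothing close enough: new person
                best_id = self.next_id
                self.next_id += 1
            else:
                unmatched.discard(best_id)
            assigned[best_id] = c
        self.tracks = dict(assigned)  # drop tracks with no detection
        return assigned

tracker = CentroidTracker()
frame1 = tracker.update([(10, 10), (100, 100)])   # two people detected
frame2 = tracker.update([(14, 12), (105, 101)])   # both move slightly
print(sorted(frame2))  # the same track IDs persist across frames
```

Because only anonymous centroids are stored, a sensor built this way can count and trace journeys without retaining images of identifiable people, which is the privacy-preserving property the abstract highlights.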
Identifiers
This record's DOI: https://doi.org/10.17863/CAM.82150
This record's URL: https://www.repository.cam.ac.uk/handle/1810/335368
Rights
Attribution 4.0 International (CC BY 4.0)
Licence URL: https://creativecommons.org/licenses/by/4.0/