
Abstract

A large-scale base map is needed by metropolitan cities such as Surabaya for urban planning and smart city development. The most needed information in a large-scale base map is road geospatial information. Road network extraction is a challenging task for many reasons, including heterogeneous geometric and spectral attributes, the complexity of objects that are difficult to model, and poor sensor data. Visual interpretation by operators is still a commonly used approach for extracting information from orthophotos. Interpretation accuracy depends on the skill and experience of the operator, so data generated by different operators can be inconsistent. In recent years, the automatic extraction of roads from orthophotos or VHR images has become an important and challenging research issue. Many recent studies have explored deep learning to improve the quality of building and road extraction. In this study, we applied a Mask Region-based Convolutional Neural Network (Mask R-CNN) model to road network extraction from an orthophoto of an urban area in Surabaya City. Because the quality of the extracted geometry needs to be improved, several post-processing strategies, including polygon regularization using the Douglas-Peucker algorithm and polygon smoothing, are designed to achieve optimal extraction results. The method performs well for road extraction: precision is 90.28%, recall 85.85%, F1-score 88.01%, and IoU 78.59%; the overall accuracy is 95.25% and the kappa value is 90.5%.
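The paper itself does not include code, but the Douglas-Peucker simplification it uses for polygon regularization can be sketched as follows. This is an illustrative implementation only; the function names and the `epsilon` tolerance are our own assumptions, not taken from the paper:

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the infinite line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        # Degenerate chord: fall back to point-to-point distance.
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Simplify a polyline: keep only vertices farther than epsilon
    from the chord joining the current segment's endpoints."""
    if len(points) < 3:
        return list(points)
    start, end = points[0], points[-1]
    # Find the vertex with the maximum distance from the start-end chord.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], start, end)
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        # Significant vertex found: recurse on both halves and merge,
        # dropping the split point duplicated at the seam.
        left = douglas_peucker(points[:index + 1], epsilon)
        right = douglas_peucker(points[index:], epsilon)
        return left[:-1] + right
    # All intermediate vertices lie within tolerance: keep only the chord.
    return [start, end]
```

In the workflow described by the abstract, a routine like this would be applied to the boundary vertices of each road polygon predicted by Mask R-CNN, before the separate polygon-smoothing step.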

Original language: English
Article number: 012047
Journal: IOP Conference Series: Earth and Environmental Science
Volume: 1127
Issue number: 1
DOIs
Publication status: Published - 2023
Event: 7th Geomatics International Conference, GEOICON 2022 - Virtual, Online
Duration: 26 Jul 2022 → …

Title: Extraction of Road Network in Urban Area from Orthophoto Using Deep Learning and Douglas-Peucker Post-Processing Algorithm