Title: Pedestrian network extraction from fused aerial imagery (orthoimages) and laser imagery (Lidar)
Authors: Kasemsuppakorn, P. 
Karimi, H.A. 
Issue Date: 2013
Publisher: University of the Thai Chamber of Commerce
Source: P. Kasemsuppakorn, H.A. Karimi (2013) Pedestrian network extraction from fused aerial imagery (orthoimages) and laser imagery (Lidar). Photogrammetric Engineering and Remote Sensing, Vol. 79, No. 4, pp. 369-379.
Abstract: A pedestrian network is a topological map that contains the geometric relationship between pedestrian path segments (e.g., sidewalk, crosswalk, footpath), which is needed in a variety of applications, such as pedestrian navigation services. However, current pedestrian networks are not widely available. In an effort to provide an automatic means for creating pedestrian networks, this paper presents a methodology for extracting a pedestrian network from aerial and laser images. The methodology consists of data preparation and four steps: object filtering, pedestrian path region extraction, pedestrian network construction, and raster-to-vector conversion. An experiment, using ten images, was conducted to evaluate the performance of the methodology. Evaluation results indicate that the methodology can extract sidewalks, crosswalks, footpaths, and building entrances; it constructs pedestrian networks with 61 percent geometrical completeness, 67.35 percent geometrical correctness, 71 percent topological completeness, and 51.38 percent topological correctness. © 2013 American Society for Photogrammetry and Remote Sensing.
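The four-step pipeline named in the abstract (object filtering, path region extraction, network construction, raster-to-vector conversion) can be sketched as follows. This is a minimal illustrative skeleton, not the authors' implementation: the function names, thresholds, and toy fused raster (intensity plus Lidar height per cell) are all assumptions made for the example.

```python
# Hypothetical sketch of the four-step pipeline described in the abstract.
# Function names, thresholds, and the toy data are illustrative only.

# Toy fused input: each cell holds (orthoimage intensity, Lidar height in m).
GRID = [
    [(200, 0.1), (210, 0.1), (90, 5.0)],
    [(205, 0.2), (215, 0.1), (95, 5.2)],
    [(198, 0.1), (60, 0.0), (100, 5.1)],
]

def filter_objects(grid, max_height=1.0):
    """Step 1: drop elevated objects (buildings, trees) using Lidar height."""
    return [[cell if cell[1] <= max_height else None for cell in row]
            for row in grid]

def extract_path_region(grid, min_intensity=150):
    """Step 2: keep bright, paved-looking ground cells as candidate paths."""
    return [[cell is not None and cell[0] >= min_intensity for cell in row]
            for row in grid]

def build_network(mask):
    """Step 3: connect 4-adjacent path cells into a topological graph."""
    edges = []
    rows, cols = len(mask), len(mask[0])
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue
            if r + 1 < rows and mask[r + 1][c]:
                edges.append(((r, c), (r + 1, c)))
            if c + 1 < cols and mask[r][c + 1]:
                edges.append(((r, c), (r, c + 1)))
    return edges

def to_vector(edges):
    """Step 4: raster-to-vector — emit each edge as a line segment."""
    return [(a, b) for a, b in edges]

segments = to_vector(build_network(extract_path_region(filter_objects(GRID))))
```

In the toy grid the high-height right column is filtered out and the dark cell is rejected, leaving a small connected path region whose cell adjacencies become the network's segments.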
Rights: This work is protected by copyright. Reproduction or distribution of the work in any format is prohibited without written permission of the copyright owner.
Appears in Collections: RSO: Journal Articles

Files in This Item:
File: 74.pdf (149.39 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.