Supporting data for "HRHD-HK: A Benchmark Dataset of High-Rise and High-Density Urban Scenes for 3D Semantic Segmentation of Photogrammetric Point Clouds"
<h2><b>HRHD-HK: A Benchmark Dataset of High-Rise and High-Density Urban Scenes for 3D Semantic Segmentation of Photogrammetric Point Clouds</b></h2><p dir="ltr">This is the official repository of the HRHD-HK dataset. For technical details, please refer to:</p><p dir="ltr">Li, M., Wu, Y., Yeh, A. G. O., & Xue, F. (2023). HRHD-HK: A benchmark dataset of high-rise and high-density urban scenes for 3D semantic segmentation of photogrammetric point cloud. <i>Proceedings of 2023 IEEE International Conference on Image Processing Challenges and Workshops</i>, 3714-3718. IEEE. https://doi.org/10.1109/ICIPC59416.2023.10328383<br></p><h3><b>Overview of HRHD-HK</b></h3><p dir="ltr">This paper presents a benchmark dataset of high-rise, high-density urban point clouds, namely the High-Rise, High-Density urban scenes of Hong Kong (HRHD-HK), for 3D semantic segmentation.</p><ul><li>The semantic labels of HRHD-HK are 1) building, 2) vegetation, 3) road, 4) waterbody, 5) facility, 6) terrain, and 7) vehicle.</li><li>The point clouds of HRHD-HK were collected in Hong Kong with two per-point attributes, i.e., color and coordinates in the Hong Kong 1980 Grid system (EPSG:2326).</li><li>HRHD-HK is arranged in 150 tiles, contains approximately 273 million points, and covers 9.375 km<sup>2</sup>.</li><li>Each tile of point clouds is saved in the PLY format with seven channels, i.e., x, y, z, red, green, blue, and label.</li><li>HRHD-HK aims to supplement existing benchmark datasets with Asian HRHD urban scenes as well as subtropical natural landscapes, such as sea, vegetation, and mountains.</li></ul><p dir="ltr">For any inquiries, please feel free to contact Maosu Li at <a href="mailto:maosulee@connect.hku.hk" target="_blank">maosulee@connect.hku.hk</a> or Dr. Frank Xue at <a href="mailto:xuef@hku.hk" target="_blank">xuef@hku.hk</a>.</p><p dir="ltr">Please cite our paper if you find our work useful for your research.</p>
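<p dir="ltr">As a minimal sketch of working with the tile format described above, the following Python snippet parses an ASCII PLY file with the seven channels (x, y, z, red, green, blue, label) and maps labels to class names. The label numbering (1&ndash;7, following the listing above) and the sample coordinates are illustrative assumptions; verify the actual label encoding against the dataset files, and note that real tiles may use binary PLY, in which case a library such as plyfile or Open3D is more appropriate.</p>

```python
import os
import tempfile

# Assumed label-to-class mapping, taken from the numbered listing above
# (1=building ... 7=vehicle); verify against the actual dataset files.
CLASS_NAMES = {1: "building", 2: "vegetation", 3: "road", 4: "waterbody",
               5: "facility", 6: "terrain", 7: "vehicle"}

# A tiny two-point sample tile in ASCII PLY with the seven channels.
# Coordinates are illustrative values in the HK 1980 Grid (EPSG:2326).
SAMPLE_TILE = """ply
format ascii 1.0
element vertex 2
property double x
property double y
property double z
property uchar red
property uchar green
property uchar blue
property uchar label
end_header
836000.0 816000.0 10.5 120 118 110 1
836001.0 816002.0 3.2 60 140 70 2
"""

def read_hrhd_tile(path):
    """Parse an ASCII PLY tile; return (x, y, z, r, g, b, label) tuples."""
    with open(path) as f:
        lines = f.read().splitlines()
    # The point count is declared in the "element vertex <n>" header line.
    n = int(next(l for l in lines if l.startswith("element vertex")).split()[-1])
    body = lines[lines.index("end_header") + 1:]
    points = []
    for line in body[:n]:
        x, y, z, r, g, b, label = line.split()
        points.append((float(x), float(y), float(z),
                       int(r), int(g), int(b), int(label)))
    return points

if __name__ == "__main__":
    # Round-trip the sample tile through a temporary file.
    with tempfile.NamedTemporaryFile("w", suffix=".ply", delete=False) as f:
        f.write(SAMPLE_TILE)
        tmp = f.name
    for x, y, z, r, g, b, label in read_hrhd_tile(tmp):
        print(CLASS_NAMES[label], (x, y, z), (r, g, b))
    os.remove(tmp)
```

<p dir="ltr">Keeping the parser header-driven (reading the vertex count from "element vertex") rather than hard-coding point counts lets the same routine handle any of the 150 tiles regardless of size.</p>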
<h3><b>Funding</b></h3>
<p dir="ltr">This study was supported in part by the Hong Kong Research Grants Council (RGC) (27200520) and the Department of Science and Technology of Guangdong Province (GDST) (2020B1212030009, 2023A1515010757).</p>