Segment Lidar v0.2.1
Anass Yarroudh
Contents:
1 About
2 Support
CHAPTER ONE: ABOUT
The package segment-lidar is designed for unsupervised instance segmentation of aerial LiDAR data. It
brings together the Segment-Anything Model (SAM) developed by Meta Research and the segment-geospatial
(SamGeo) package from Open Geospatial Solutions to automate the segmentation of LiDAR data. If you use this
package for your research, please cite:
@misc{yarroudh:2023:samlidar,
  author = {Yarroudh, Anass},
  title = {LiDAR Automatic Unsupervised Segmentation using Segment-Anything Model (SAM) from Meta AI},
  year = {2023},
  howpublished = {GitHub Repository},
  url = {https://github.com/Yarroudh/segment-lidar}
}
The latest source code is available at GitHub. The package builds on top of existing works and when using specific
algorithms within segment-lidar, please also cite the original authors, as specified in the source code.
1.1 Installation
This guide describes the steps to install segment-lidar using PyPI or from source.
Before installing segment-lidar, you need to create an environment by running the following commands:
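A typical way to create such an environment (a sketch, assuming Conda is installed; the environment name samlidar and Python 3.9 follow the recommendation below):

```shell
# Create and activate a Conda environment for segment-lidar
conda create -n samlidar python=3.9
conda activate samlidar
```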
This command will create a new Conda environment named samlidar. We recommend using Python 3.9, but feel free
to test with other versions.
Please note that using a Conda environment is not mandatory, but it is highly recommended. Alternatively, you can
use virtualenv.
For the installation instructions and options, refer to the official PyTorch website: PyTorch Get Started.
Note: If you want to leverage GPU acceleration with PyTorch, make sure you have a CUDA-supported GPU and
install the corresponding CUDA toolkit. Follow the instructions in the official CUDA installation guide: NVIDIA
CUDA Installation Guide.
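As an illustration only (the exact command depends on your OS, package manager, and CUDA version; always take the command from the PyTorch site), a CPU-only install might look like:

```shell
# CPU-only PyTorch install; see pytorch.org for CUDA-enabled variants
pip install torch torchvision
```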
You can easily install segment-lidar from PyPI using the following command:
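As a sketch (assuming the PyPI distribution name matches the package name segment-lidar):

```shell
# Install segment-lidar from PyPI
pip install segment-lidar
```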
To make sure that segment-lidar is installed correctly, you can run the following command:
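For example, pip can report the installed version (assuming the distribution name segment-lidar):

```shell
# Print package metadata, including the Version field
pip show segment-lidar
```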
If the installation is successful, you should see the version that you have installed.
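For an install from source, a standard git-and-pip flow would be (a sketch; the repository URL is the one cited above):

```shell
# Clone the repository and install it into the current environment
git clone https://github.com/Yarroudh/segment-lidar.git
cd segment-lidar
pip install .
```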
1.2 Tutorial
In this tutorial, we will learn how to use the segment_lidar module for automatic unsupervised instance segmentation
of LiDAR data.
1.2.1 Prerequisites
For more information on how to install the module, please refer to the Installation page.
For testing purposes, you can download sample data here: pointcloud.las. These data were retrieved from AHN-4. For
more data, please visit AHN-Viewer.
Click the links below to download the checkpoint for the corresponding Segment-Anything model (SAM) type.
• default or vit_h: ViT-H SAM model.
• vit_l: ViT-L SAM model.
• vit_b: ViT-B SAM model.
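The numbered steps that follow assume the package modules have been imported first, as in the complete example later on this page:

```python
# Import the segment-lidar modules used throughout the tutorial
from segment_lidar import samlidar, view
```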
2. Define the viewpoint using the view module. You can choose between the following:
• TopView: Top view of the point cloud.
• PinholeView: Pinhole camera view of the point cloud, defined by its intrinsic and extrinsic parameters.
For example, to define a top view, you can do the following:
viewpoint = view.TopView()
The pinhole view can be defined either by providing the intrinsic and extrinsic parameters:
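As an illustrative sketch only, with hypothetical keyword names K, R and T (taken from the description below, not a confirmed signature) and placeholder camera parameters:

```python
import numpy as np
from segment_lidar import view

# Placeholder camera parameters (illustrative values only)
K = np.array([[1000.0, 0.0, 512.0],
              [0.0, 1000.0, 512.0],
              [0.0, 0.0, 1.0]])  # 3x3 intrinsic matrix
R = np.eye(3)                    # 3x3 rotation matrix
T = np.zeros((3, 1))             # 3x1 translation vector

# NOTE: the keyword names K, R, T are assumptions, not verified API
viewpoint = view.PinholeView(K=K, R=R, T=T)
```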
K is a 3x3 intrinsic matrix, R is a 3x3 rotation matrix and T is a 3x1 translation vector.
or by using the interactive mode:
viewpoint = view.PinholeView(interactive=True)
3. Create an instance of the SamLidar class and specify the path to the checkpoint file ckpt_path when instantiating
the class:
model = samlidar.SamLidar(ckpt_path="sam_vit_h_4b8939.pth")
4. Read the point cloud data from a .las/.laz file using the read method of the SamLidar instance. Provide the path to
the point cloud file pointcloud.las as an argument:
points = model.read("pointcloud.las")
5. Apply the Cloth Simulation Filter (CSF) algorithm for ground filtering using the csf method of the SamLidar
instance. This method returns the filtered point cloud cloud along with the non-ground (non_ground) and ground (ground)
indices:
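Continuing from the model and points created in the previous steps, the call matches the combined example in step 7:

```python
# Ground filtering with the Cloth Simulation Filter (CSF)
cloud, non_ground, ground = model.csf(points)
```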
6. Perform segmentation using the segment method of the SamLidar instance. This method takes the filtered point
cloud cloud as input; optionally, provide an image path image_path and a labels path labels_path to save the
segmentation results as an image and labels, respectively. The segment method returns the segmentation labels labels:
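A sketch of the call, consistent with the full example later on this page (cloud and viewpoint come from the previous steps):

```python
# Segment the filtered cloud; the raster and label images are written as side effects
labels, *_ = model.segment(points=cloud,
                           view=viewpoint,
                           image_path="raster.tif",
                           labels_path="labeled.tif")
```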
7. Save results to .las/.laz file using the write method of the SamLidar instance:
# Define viewpoint
viewpoint = view.TopView()
# Apply CSF
cloud, non_ground, ground = model.csf(points)
# Save results
model.write(points=points, non_ground=non_ground, ground=ground, segment_ids=labels,
            save_path="segmented.las")
8. The resulting point cloud contains a new scalar field called segment_id. For visualization and further processing, we
recommend using CloudCompare.
The following figure shows the results of the segmentation on the sample data from AHN-4:
The interactive mode allows you to define the viewpoint interactively using a GUI.
viewpoint = view.PinholeView(interactive=True)
You can rotate, move and zoom the camera using the mouse (please refer to the Open3D documentation for more details).
Once you are done, press p to save the image and the camera parameters, then esc to quit the interactive mode.
Example:
import os
from segment_lidar import samlidar, view

viewpoint = view.PinholeView(interactive=True)
model = samlidar.SamLidar(ckpt_path='sam_vit_h_4b8939.pth',
                          device='cuda:0',
                          algorithm='segment-anything')
model.mask.min_mask_region_area = 200
model.mask.points_per_side = 5

points = model.read('laundry.las')
os.makedirs("results/", exist_ok=True)
labels, *_ = model.segment(points=points,
                           view=viewpoint,
                           image_path="results/raster.tif",
                           labels_path="results/labeled.tif")
1.2.6 Configuration
The segment_lidar module provides a set of parameters that can be used to configure the segmentation process. These
parameters are passed to the SamLidar class as arguments when instantiating the class, as in the following example:
model = samlidar.SamLidar(ckpt_path="sam_vit_h_4b8939.pth",
                          algorithm="segment-geospatial",
                          model_type="vit_h",
                          resolution=0.5,
                          sam_kwargs=True)
model.mask.crop_n_layers = 1
model.mask.crop_n_points_downscale_factor = 2
model.mask.min_mask_region_area = 500
model.mask.points_per_side = 10
model.mask.pred_iou_thresh = 0.90
model.mask.stability_score_thresh = 0.92
Please refer to the segment-anything repository for more details about these parameters. See the complete argument
list of the SamLidar class here.
1.3 API
Return type:
    np.ndarray
Raises:
    ValueError – If the input file format is not supported.
segment(points: np.ndarray, view: TopView | PinholeView = TopView(), image_path: str = 'raster.tif', labels_path: str = 'labeled.tif', image_exists: bool = False, label_exists: bool = False) → Tuple[ndarray, ndarray, ndarray]
Segments a point cloud based on the provided parameters and returns the segment IDs, original image, and segmented image.
Parameters:
    • points (np.ndarray) – The point cloud data as a NumPy array.
    • view (Union[TopView, PinholeView]) – The viewpoint to use for segmenting the point cloud, defaults to TopView().
    • image_path (str) – Path to the input raster image, defaults to 'raster.tif'.
    • labels_path (str) – Path to save the labeled output image, defaults to 'labeled.tif'.
    • image_exists (bool) – Whether the raster image already exists, defaults to False.
    • label_exists (bool) – Whether the labeled image already exists, defaults to False.
Returns:
    A tuple containing the segment IDs, segmented image, and RGB image.
Return type:
    Tuple[np.ndarray, np.ndarray, np.ndarray]
class text_prompt(text: str | None = None, box_threshold: float = 0.24, text_threshold: float = 0.15)
Bases: object
write(points: ndarray, segment_ids: ndarray, non_ground: ndarray | None = None, ground: ndarray | None = None, save_path: str = 'segmented.las', ground_path: str | None = None) → None
Writes the segmented point cloud data to a LAS/LAZ file.
Parameters:
    • points (np.ndarray) – The input point cloud data as a NumPy array, where each row represents a point with x, y, z coordinates.
    • segment_ids (np.ndarray) – The segment IDs corresponding to each point in the point cloud.
    • non_ground (np.ndarray, optional) – Optional array of indices for non-ground points in the original point cloud (default: None).
    • ground (np.ndarray, optional) – Optional array of indices for ground points in the original point cloud (default: None).
    • save_path (str, optional) – The path to save the segmented LAS/LAZ file (default: 'segmented.las').
Returns:
    None
1.4 Citation
The use of open-source software repositories has become increasingly prevalent in scientific research. If you use this
repository for your research, please make sure to cite it appropriately in your work. The recommended citation format
for this repository is provided in the accompanying BibTeX citation. Additionally, please make sure to comply with
any licensing terms and conditions associated with the use of this repository.
@misc{yarroudh:2023:samlidar,
  author = {Yarroudh, Anass},
  title = {LiDAR Automatic Unsupervised Segmentation using Segment-Anything Model (SAM) from Meta AI},
  year = {2023},
  howpublished = {GitHub Repository},
  url = {https://github.com/Yarroudh/segment-lidar}
}
Yarroudh, A. (2023). LiDAR Automatic Unsupervised Segmentation using Segment-Anything Model (SAM) from
Meta AI [GitHub repository]. Retrieved from https://github.com/Yarroudh/segment-lidar
1.5 License
CHAPTER TWO: SUPPORT
Please contact us via email at ayarroudh@uliege.be or akharroubi@uliege.be for questions, and use the GitHub issue
tracker for bug reports, feature requests/additions, etc.