
Commit

WIP: update script
leavauchier committed May 30, 2024
1 parent f5f4267 commit 220a5be
Showing 5 changed files with 51 additions and 19 deletions.
3 changes: 2 additions & 1 deletion .github/workflows/cicd_docker.yml
Original file line number Diff line number Diff line change
@@ -10,8 +10,8 @@ env:
DOCKER_IMAGE_NAME: pdal_ign_plugin

jobs:

build_docker_image_and_run_tests:

runs-on: ubuntu-latest

steps:
@@ -24,3 +24,4 @@ jobs:
- name: Run tests in docker image
run: docker run ${{ env.DOCKER_IMAGE_NAME }}:test python -m pytest


3 changes: 1 addition & 2 deletions Dockerfile
@@ -9,7 +9,6 @@ RUN apt-get update && apt-get install --no-install-recommends -y cmake make buil

COPY src src
COPY CMakeLists.txt CMakeLists.txt

RUN cmake -G"Unix Makefiles" -DCONDA_PREFIX=$CONDA_PREFIX -DCMAKE_BUILD_TYPE=Release
RUN make -j4 install

@@ -31,4 +30,4 @@ RUN pip install .

# Add example scripts + test data (to be able to test inside the docker image)
COPY scripts /pdal_ign_plugin/scripts
COPY test /pdal_ign_plugin/test
4 changes: 2 additions & 2 deletions README.md
@@ -74,7 +74,7 @@ The name of the folder indicates the plugIN type (reader, writer, or filter).
The code should follow the plugin documentation provided by pdal: [build a pdal plugin](https://pdal.io/en/2.6.0/development/plugins.html).
Be careful to adapt it depending on whether the plugIN is a reader, a writer, or a filter.

-The CMakeList should contains:
+The CMakeList should contain:

```
file( GLOB_RECURSE GD_SRCS ${CMAKE_SOURCE_DIR} *)
@@ -90,14 +90,14 @@ install(TARGETS pdal_plugin_filter_my_new_PI)
```

You should complete the main CMakeList by adding the new plugIN:

```
add_subdirectory(src/filter_my_new_PI)
```

Each plugIN has its own md file in the doc directory, structured as the [model](./doc/_doc_model_plugIN.md).

Don't forget to update [the list](#list-of-filters) with a link to the documentation.
<<<<<<< Updated upstream

## `macro` python module usage

2 changes: 2 additions & 0 deletions environment_docker.yml
@@ -7,7 +7,9 @@ dependencies:
- python-pdal
- gdal
- pytest

# --------- pip & pip libraries --------- #
- pip
- pip:
- ign-pdal-tools

@@ -1,5 +1,7 @@
import argparse

import pdal

from macro import macro

"""
@@ -15,24 +17,48 @@ def parse_args():
parser.add_argument(
"--output_las", "-o", type=str, required=True, help="Output cloud las file"
)
-parser.add_argument("--output_dsm", "-s", type=str, required=True, help="Output dsm tiff file")
-parser.add_argument("--output_dtm", "-t", type=str, required=True, help="Output dtm tiff file")
+parser.add_argument(
+    "--dsm_dimension",
+    type=str,
+    required=False,
+    default="dsm_marker",
+    help="Dimension name for the output DSM marker",
+)
+parser.add_argument(
+    "--dtm_dimension",
+    type=str,
+    required=False,
+    default="dtm_marker",
+    help="Dimension name for the output DTM marker",
+)
+parser.add_argument(
+    "--output_dsm", "-s", type=str, required=False, default="", help="Output dsm tiff file"
+)
+parser.add_argument(
+    "--output_dtm", "-t", type=str, required=False, default="", help="Output dtm tiff file"
+)
return parser.parse_args()
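As an aside on the new optional arguments above: `--dsm_dimension` and `--dtm_dimension` fall back to `dsm_marker`/`dtm_marker` when omitted. A standalone sketch of the same argparse pattern, trimmed to the two new options and fed made-up CLI input:

```python
import argparse

# Same pattern as the two new marker-dimension arguments above
parser = argparse.ArgumentParser()
parser.add_argument("--dsm_dimension", type=str, required=False, default="dsm_marker")
parser.add_argument("--dtm_dimension", type=str, required=False, default="dtm_marker")

# Only one option given on the command line: the other keeps its default
args = parser.parse_args(["--dtm_dimension", "hauteur_dtm"])
print(args.dsm_dimension)  # dsm_marker
print(args.dtm_dimension)  # hauteur_dtm
```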


if __name__ == "__main__":
args = parse_args()

-pipeline = pdal.Reader.las(args.input)
+pipeline = pdal.Pipeline() | pdal.Reader.las(args.input)
+dsm_dim = args.dsm_dimension
+dtm_dim = args.dtm_dimension

# Get the dimensions of the input file
input_dimensions = pipeline.quickinfo["readers.las"]["dimensions"].split(", ")
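`quickinfo` reports the reader's dimensions as a single comma-separated string, so the `split(", ")` above turns it into a plain Python list. A quick sketch with a made-up metadata value (no PDAL needed):

```python
# Hypothetical value of pipeline.quickinfo["readers.las"]["dimensions"]
dimensions_info = "X, Y, Z, Intensity, Classification"
input_dimensions = dimensions_info.split(", ")
print(input_dimensions)  # ['X', 'Y', 'Z', 'Intensity', 'Classification']
```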

# 0 - add temporary dimensions
pipeline |= pdal.Filter.ferry(
-    dimensions=f"=>PT_GRID_DSM, =>PT_VEG_DSM, =>PT_GRID_DTM, =>PT_ON_BRIDGE"
+    dimensions="=>PT_GRID_DSM, =>PT_VEG_DSM, =>PT_GRID_DTM, =>PT_ON_BRIDGE"
)
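For context, `filters.ferry` with the `=>DIM` form creates each listed dimension (initialized to 0) rather than copying it from an existing one, which is what gives the pipeline its four temporary marker dimensions. A plain-Python analogy, treating a point as a dict rather than PDAL's actual array representation:

```python
new_dims = ["PT_GRID_DSM", "PT_VEG_DSM", "PT_GRID_DTM", "PT_ON_BRIDGE"]

point = {"X": 1.0, "Y": 2.0, "Z": 3.0}  # a single input point
for dim in new_dims:
    point.setdefault(dim, 0)  # "=>DIM" adds the dimension with default value 0

print(point["PT_ON_BRIDGE"])  # 0
```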

-## 1 - find the max vegetation points (4,5) on a regular grid, taking into account ground (2) and low
-## vegetation (3) points close to the vegetation
-## for the DSM computation
+# 1 - find the max vegetation points (4,5) on a regular grid, taking into
+# account ground (2) and low
+# vegetation (3) points close to the vegetation
+# for the DSM computation

pipeline |= pdal.Filter.assign(
value=["PT_VEG_DSM = 1 WHERE " + macro.build_condition("Classification", [4, 5])]
@@ -71,7 +97,7 @@ def parse_args():
resolution=0.75, value="PT_GRID_DSM=1", output_type="max", where="PT_VEG_DSM==1"
)
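The `gridDecimation` call above keeps, per 0.75 m cell, only the highest of the points matching `PT_VEG_DSM==1` and marks it with `PT_GRID_DSM=1`. A pure-Python sketch of that "max per grid cell" selection, with hypothetical points and no PDAL:

```python
resolution = 0.75
points = [
    {"X": 0.1, "Y": 0.2, "Z": 5.0},
    {"X": 0.3, "Y": 0.1, "Z": 7.0},  # same cell as the first point, but higher
    {"X": 1.0, "Y": 0.2, "Z": 3.0},
]

def cell(p):
    # Grid cell index at the given resolution
    return (int(p["X"] // resolution), int(p["Y"] // resolution))

# Keep the highest point per cell, then flag it
highest = {}
for p in points:
    if cell(p) not in highest or p["Z"] > highest[cell(p)]["Z"]:
        highest[cell(p)] = p
for p in points:
    p["PT_GRID_DSM"] = 1 if highest[cell(p)] is p else 0

print([p["PT_GRID_DSM"] for p in points])  # [0, 1, 1]
```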

-## 2 - select the points for the DTM and DSM
+# 2 - select the points for the DTM and DSM

# select DTM points (max) on a regular grid
pipeline |= pdal.Filter.gridDecimation(
@@ -98,7 +124,7 @@ def parse_args():
condition_out="PT_GRID_DSM=1",
)

-## 3 - bridge handling
+# 3 - bridge handling
# hole filling: filter the points (2,3,4,5,9) in the middle of the bridge by setting PT_ON_BRIDGE=1

pipeline = macro.add_radius_assign(
@@ -119,25 +145,29 @@ def parse_args():
)
pipeline |= pdal.Filter.assign(value=["PT_GRID_DSM=0 WHERE PT_ON_BRIDGE==1"])

-## 4 - points used for the DTM also serve for the DSM
+# 4 - points used for the DTM also serve for the DSM
pipeline |= pdal.Filter.assign(value=["PT_GRID_DSM=1 WHERE PT_GRID_DTM==1"])

-## 5 - export the point cloud and the DSMs
+# 5 - export the point cloud and the DSMs
# Add the output dimensions
pipeline |= pdal.Filter.ferry(dimensions=f"PT_GRID_DSM=>{dsm_dim}, PT_GRID_DTM=>{dtm_dim}")

-pipeline |= pdal.Writer.las(extra_dims="all", forward="all", filename=args.output_las)
+pipeline |= pdal.Writer.las(
+    extra_dims=input_dimensions + [dtm_dim, dsm_dim], forward="all", filename=args.output_las
+)
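The `extra_dims` change above builds an explicit list (input dimensions plus the two marker dimensions) instead of `extra_dims="all"`, so the temporary PT_* dimensions created by ferry are not carried into the output LAS. With made-up values:

```python
# Hypothetical dimensions read from the input file
input_dimensions = ["X", "Y", "Z", "Intensity", "Classification"]
dtm_dim, dsm_dim = "dtm_marker", "dsm_marker"

# The list handed to pdal.Writer.las(extra_dims=...)
extra_dims = input_dimensions + [dtm_dim, dsm_dim]
print(extra_dims[-2:])  # ['dtm_marker', 'dsm_marker']
```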
pipeline |= pdal.Writer.gdal(
gdaldriver="GTiff",
output_type="max",
resolution=2.0,
filename=args.output_dtm,
-    where="PT_GRID_DTM==1",
+    where=f"{dtm_dim}==1",
)
pipeline |= pdal.Writer.gdal(
gdaldriver="GTiff",
output_type="max",
resolution=2.0,
filename=args.output_dsm,
-    where="PT_GRID_DSM==1",
+    where=f"{dsm_dim}==1",
)

pipeline.execute()
