Reading point cloud data from a PostgreSQL pgPointcloud database using PDAL


I want to read point cloud data from a PostgreSQL database.

Here is my code:

Dockerfile

FROM ubuntu:latest

ENV TZ=UTC
ENV DEBIAN_FRONTEND=noninteractive

RUN apt-get update && apt-get install -y python3 python3-pip wget
RUN apt-get install libpcl-dev cmake libgtest-dev -y 
RUN apt-get install -y build-essential

RUN pip config set global.trusted-host "pypi.org files.pythonhosted.org pypi.python.org"

RUN wget -P /tmp https://repo.anaconda.com/miniconda/Miniconda3-py310_23.1.0-1-Linux-x86_64.sh --no-check-certificate
RUN bash /tmp/Miniconda3-py310_23.1.0-1-Linux-x86_64.sh -b -p /home/spacesium/miniconda
ENV PATH="/home/spacesium/miniconda/bin:$PATH"

RUN conda config --set ssl_verify false
RUN conda install -y -c conda-forge pdal python-pdal gdal

WORKDIR /app

docker-compose.yml

services:

  edge:
    build: .
    command: sh -c "pdal pipeline /app/pipeline.json"
    volumes:
      - .:/app

pipeline.json

[
    {
        "type":"readers.pgpointcloud",
        "connection":"dbname='dbname' user='username' password='password' host='localhost' port=5432",
        "table":"patches",
        "column":"pa",
        "where":"id=1"
    },
    {
        "type":"writers.text",
        "filename":"outputfile.txt"
    }
]
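As an aside, the connection option is a libpq-style string of space-separated key=value pairs, where values may be single-quoted. A minimal, illustrative parser (not PDAL's actual code; it ignores quoted values containing spaces) shows how the fields break down:

```python
# Illustrative only: split a libpq-style conninfo string into a dict.
# Real libpq parsing also handles quoted values with spaces and escapes.
def parse_conninfo(s: str) -> dict:
    return {key: value.strip("'")
            for key, _, value in (part.partition("=") for part in s.split())}

conn = "dbname='dbname' user='username' password='password' host='localhost' port=5432"
print(parse_conninfo(conn)["host"])  # localhost
```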

When I run it all using docker compose up, I get this error message:

 PDAL: connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused
   Is the server running on that host and accepting TCP/IP connections?
 connection to server at "localhost" (::1), port 5432 failed: Cannot assign requested address
   Is the server running on that host and accepting TCP/IP connections?

Interestingly, when I run the pdal command on my local development PC (i.e. not inside a Docker container) from a conda environment, it works fine:

pdal pipeline pipeline.json
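For what it's worth, the first error is standard Docker networking: inside a container, localhost resolves to the container itself, not to the host machine or to other containers. If Postgres runs in a sibling Compose service (say, db), the pipeline's connection string should target that service name instead, along the lines of (service name assumed):

```json
"connection":"dbname='dbname' user='username' password='password' host='db' port=5432"
```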



UPDATE:

I built the current container and tagged it: docker build . -t das-edge:latest

I then added this to the docker-compose file which hosts the database:

  edge:
    image: das-edge:latest
    command: sh -c "pdal pipeline /app/pipeline.json"
    depends_on:
      - db
    links:
      - "db:hdasdb"
    networks:
      - "app-network"
    volumes:
      - .:/app

I get this error message:

2023-12-11 03:53:15.148 UTC [33] LOG:  Pointcloud (1.2.5) module loaded
2023-12-11 03:53:15.148 UTC [33] STATEMENT:  SELECT PC_Typmod_Pcid(a.atttypmod) AS pcid FROM pg_class c, pg_attribute a WHERE c.relname = 'patches' AND a.attname = 'pa' AND a.attrelid = c.oid
(pdal pipeline readers.pgpointcloud Error) GDAL failure (1) PROJ: proj_create_from_database: Open of /home/spacesium/miniconda/share/proj failed
PDAL: Could not import coordinate system 'EPSG:32756': PROJ: proj_create_from_database: Open of /home/spacesium/miniconda/share/proj failed.

This seems to relate to this: Error Message Source Code

It's strange that I only get this error message when running inside a Docker container.
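The PROJ failure means proj.db cannot be found under the reported directory. A hedged workaround, assuming the Miniconda prefix used in the Dockerfile above, is to make sure the proj package's data files are present in the image and to point PROJ at them explicitly (older PROJ releases read PROJ_LIB rather than PROJ_DATA):

```dockerfile
# Assumed paths: proj.db ships with the conda-forge "proj" package
RUN conda install -y -c conda-forge proj
ENV PROJ_DATA="/home/spacesium/miniconda/share/proj"
```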

1 Answer

Answered by sav:

Thanks to DavidMaze for pointing me to the first problem.

The error message mentions proj_create_from_database, which appears in a number of issues in the PDAL repository.

Reading through these issues suggests that PDAL may still not be installed correctly in the Docker container.

I used the following Dockerfile:

FROM condaforge/mambaforge:latest as build

ENV LANG=C.UTF-8 LC_ALL=C.UTF-8

RUN conda config --set ssl_verify false
RUN pip config set global.trusted-host "pypi.org files.pythonhosted.org pypi.python.org"

RUN conda create -n pdal -y
ARG GITHUB_SHA
ARG GITHUB_REPOSITORY="PDAL/PDAL"
ARG GITHUB_SERVER_URL="https://github.com"

SHELL ["conda", "run", "-n", "pdal", "/bin/bash", "-c"]

RUN git config --global http.sslVerify false

RUN mamba install -c conda-forge git compilers conda-pack cmake make ninja sysroot_linux-64=2.17 && \
    mamba install --yes -c conda-forge pdal --only-deps

RUN git clone "${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}" pdal && \
    cd pdal && \
    git checkout ${GITHUB_SHA}

RUN mkdir -p pdal/build && \
    cd pdal/build  && \
    CXXFLAGS="-Werror=strict-aliasing" LDFLAGS="-Wl,-rpath-link,$CONDA_PREFIX/lib" cmake -G Ninja  \
        -DCMAKE_BUILD_TYPE=Release \
        -DCMAKE_LIBRARY_PATH:FILEPATH="$CONDA_PREFIX/lib" \
        -DCMAKE_INCLUDE_PATH:FILEPATH="$CONDA_PREFIX/include" \
        -DCMAKE_INSTALL_PREFIX="$CONDA_PREFIX" \
        -DBUILD_PLUGIN_CPD=OFF \
        -DBUILD_PLUGIN_PGPOINTCLOUD=ON \
        -DBUILD_PLUGIN_NITF=ON \
        -DBUILD_PLUGIN_ICEBRIDGE=ON \
        -DBUILD_PLUGIN_HDF=ON \
        -DBUILD_PLUGIN_TILEDB=ON \
        -DBUILD_PLUGIN_E57=ON \
        -DBUILD_PGPOINTCLOUD_TESTS=OFF \
        -DWITH_ZSTD=ON \
        ..

RUN cd pdal/build  && \
    ninja

#RUN cd pdal/build  && \
#    ctest -V

RUN cd pdal/build  && \
    ninja install

RUN conda-pack -n pdal --dest-prefix=/opt/conda/envs/pdal -o  /tmp/env.tar && \
     mkdir /venv && cd /venv && tar xf /tmp/env.tar  && \
     rm /tmp/env.tar

FROM condaforge/miniforge3

ENV CONDAENV="/opt/conda/envs/pdal"
COPY --from=build /venv "/opt/conda/envs/pdal"

ENV PROJ_NETWORK=TRUE
ENV PROJ_DATA="${CONDAENV}/share/proj"
ENV GDAL_DATA="${CONDAENV}/share/gdal"
ENV GEOTIFF_CSV="${CONDAENV}/share/epsg_csv"
ENV GDAL_DRIVER_PATH="${CONDAENV}/lib/gdalplugins"
ENV PATH="$PATH:${CONDAENV}/bin"
ENV GTIFF_REPORT_COMPD_CS=TRUE
ENV REPORT_COMPD_CS=TRUE
ENV OAMS_TRADITIONAL_GIS_ORDER=TRUE


SHELL ["conda", "run", "--no-capture-output", "-n", "pdal", "/bin/sh", "-c"]

This was mostly taken from the PDAL Docker Script.

Building and tagging this image: docker build . -t das-edge:latest

Docker Compose:

  edge:
    image: das-edge:latest
    command: sh -c "pdal pipeline /app/pipeline.json"
    depends_on:
      - db
    links:
      - "db:hdasdb"
    networks:
      - "app-network"
    volumes:
      - .:/app

pipeline.json

[
    {
        "type":"readers.pgpointcloud",
        "connection":"dbname='dbname' user='username' password='password' host='localhost' port=5432",
        "table":"patches",
        "column":"pa",
        "where":"id=1"
    },
    {
        "type":"writers.text",
        "filename":"outputfile.txt"
    }
]
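One caveat: this pipeline still connects to localhost. Given the links alias in the Compose file above (db:hdasdb), the reader should presumably target that alias (or the service name db) instead, e.g.:

```json
{
    "type":"readers.pgpointcloud",
    "connection":"dbname='dbname' user='username' password='password' host='hdasdb' port=5432",
    "table":"patches",
    "column":"pa",
    "where":"id=1"
}
```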