Closed: supton77 closed this issue 4 years ago
Your SID docker container works for me with the following pipeline:
import pdal
import json

pipe_dict = {
    'pipeline': [
        {
            'type': 'readers.las',
            'filename': '1.2-with-color.las'
        },
        {
            'type': 'filters.python',
            'module': 'anything',
            'function': 'fff',
            'source': "import numpy as np\ndef fff(ins,outs):\n\tX = ins['X']\n\tResult = np.equal(X, 637501.67)\n\touts['Mask'] = Result\n\tprint(outs)\n\treturn True\n"
        }
    ]
}

pipe_json = json.dumps(pipe_dict)
pipeline = pdal.Pipeline(pipe_json)
pipeline.validate()
pipeline.execute()
You have to execute() the pipeline to get anything to happen.
Thanks for the super-fast reply!
Your pipeline does run in my container, but my actual code (which does have an execute() call) continues to error out on the missing filter. I guess there is a ghost in my code.
I wonder if the script option is somehow messing with the environment. It's a little unusual here in that you're in Python while also using the Python filter; that is not a commonly tested scenario.
I was wondering that too, so I tested both my function reformatted into a 'source' string and just calling the function directly. I ran both from the pipeline in the Python interpreter in bash, and both succeed. But when I call the pipeline from my larger app, it still errors out.
What is the PDAL_DRIVER_PATH environment variable set to in your larger application? It must be set to a valid value for PDAL to pick up the plugin for the filters.python driver.
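One quick way to rule this out is to check, and if necessary set, the variable from Python before constructing any pipeline. A minimal sketch; the directories shown are the conda paths mentioned later in this thread and are an assumption for your image:

```python
import os

# Directories where PDAL's plugin shared libraries live. These are the
# conda locations from this thread; adjust for your environment.
plugin_dirs = [
    "/opt/conda/lib",
    "/opt/conda/lib/python3.8/site-packages/lib",
]

# Set PDAL_DRIVER_PATH before constructing or executing any pdal.Pipeline,
# so PDAL can locate the filters.python plugin library at load time.
os.environ.setdefault("PDAL_DRIVER_PATH", ":".join(plugin_dirs))

print(os.environ["PDAL_DRIVER_PATH"])
```

Using setdefault means an already-exported value (for example from the container's Dockerfile) wins over the fallback list.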
That was it. It was set to PDAL_DRIVER_PATH=/opt/conda/lib:/opt/conda/lib/python3.8/site-packages/lib:/opt/conda/lib/python3.8/site-packages/lib64 in the console in the container, but when I listed os.environ from the app it was unset.
Glad to know it works. I'm interested to hear why you're taking this particular approach. It seems kind of inside out 😄
I do a couple of things with Python filters in my codebase: I grab a spatial extent using shapely inline as I filter and sample a raw point cloud, and I apply an ML model on a per-point basis before handing the points to the GDAL writer to produce spatial means. These are places where I want to operate on the points inline instead of pulling out an array, operating on it, writing it, and kicking off another pipeline.
I have tried three approaches to getting a working 'filters.python' stage in a pipeline.
First, I tried building a container FROM pdal/pdal:2.1 and installing pdal==2.3.3 via pip (along with other application-necessary packages). This installs correctly, but pdal --drivers does not include 'filters.python', and running a test pipeline errors out on the missing filter.
Second, I did the same but using FROM debian:testing as my base image, with the same result. Third, I created a miniconda py3.8 container installing pdal=2.1 and python-pdal=2.3.3. This container builds correctly and pdal --drivers does include 'filters.python', but I get the same error when I try a pipeline with a Python stage. Am I missing something?