Closed iannesbitt closed 3 months ago
This should be ready for review.
Now it's ready for review. Installation and test steps are here. Please let me know if you need details or explanations of things and I'd be happy to give them.
FYI: the test dataset is tiled and output to ./pdgpoints/testdata/3dtiles/
@robyngit I have finished most of the requested changes from your review. A brief summary of what's changed:

- Replaced `getopt` with `argparse` (#19)
- Using `pathlib.Path` objects (#22) and removing relative references to the home directory
- Renamed `pdgpoints.defs.LOGCONFIG` to `pdgpoints.defs.LOGGING_CONFIG`
- `pdgpoints.__init__` calls `dictConfig(LOGGING_CONFIG)` at runtime; modules get loggers with `L = getLogger(__name__)`
- Writing the `x` and `y` of every 10,000th point to file, then using `pandas` to get the mean of each variable set
- The mean `x` and `y` are converted to `lat` and `lon` with `pyproj.Transformer`
- The adjustment at the computed `lat` and `lon` is calculated, and the vertical adjustment factor is set to the returned value

I've decided to bump the version to 0.0.2 given the many breaking changes.
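The centroid step above could look roughly like the sketch below: decimate to every 10,000th point, average with `pandas`, then reproject the mean with `pyproj.Transformer`. The CRS (EPSG:32618, UTM 18N) and the coordinates are illustrative assumptions only, not values from this project.

```python
import pandas as pd
from pyproj import Transformer

# synthetic stand-in for the decimated x/y sample written to file;
# on a full dataset this would be something like df.iloc[::10000]
sample = pd.DataFrame({
    "x": [500000.0, 500100.0, 500200.0],
    "y": [4900000.0, 4900100.0, 4900200.0],
})
mean_x = sample["x"].mean()
mean_y = sample["y"].mean()

# EPSG:32618 (UTM 18N) is assumed for illustration; the real source CRS
# comes from the file's WKT string. always_xy=True keeps (x, y) -> (lon, lat)
# axis ordering predictable.
transformer = Transformer.from_crs("EPSG:32618", "EPSG:4326", always_xy=True)
lon, lat = transformer.transform(mean_x, mean_y)
```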
What I still have yet to implement from your review:

- In parts of `viz-points`, the input path is used to construct the output path. I will need some help resolving this, as I am currently unfamiliar with node-based file management. It shouldn't be too difficult to fix based on how it's currently coded, however.
- The `py3dtiles` issue: I've noticed that their merge function tends to break the usability of tiled datasets (see for example #9). If you set `debugWireframe: true` in `Cesium.Cesium3DTileset`, you can see that the bounding box extends far beyond the boundaries of the points when merged, but is normal when simply tiled. I think this means they have an error in their bounding volume calculation that causes some issues with the camera and tile prioritization (apologies if my vocabulary isn't quite correct here). I've tried investigating, but I think I'm missing some knowledge about how Cesium expects to read the tile hierarchy in order to assess what's actually happening in py3dtiles. We should be able to resolve this easily if we bring our version of py3dtiles in line with Oslandia's. I'm not sure when I can get around to this, but it's on my list of big to-do items.

> Manually testing in Cesium during development
My `cesium.js` file is included below:
```js
function start() {
    Cesium.Ion.defaultAccessToken = "TOKEN";
    var tileset = new Cesium.Cesium3DTileset({
        url: "3dtiles/tileset.json",
        skipLevelOfDetail: true,
        debugWireframe: false,
    });
    tileset.style = new Cesium.Cesium3DTileStyle({
        pointSize: 1.5,
        show: true
    });
    tileset.pointCloudShading.attenuation = true;
    viewer.scene.primitives.add(tileset);
    window.zoom_to_me = function () {
        viewer.zoomTo(tileset);
    }
    viewer.scene.globe
    tileset.readyPromise.then(zoom_to_me).otherwise(error => { console.log(error) });
}
start()
```
This is a great idea and I will be implementing it as I work to make a stable version of this program. It seems like something that I can implement alongside #25.
With your approval I will move to merge this to `main`.
@iannesbitt the changes you made look great! I wasn't able to test everything because I got an error during installation:
The only other comment I have at this point is super minor - I think it's more typical to have the test scripts & test data at the root level of the repo rather than inside the `pdgpoints` package, but perhaps there's an argument for doing it this way?
@robyngit The error was an oversight on my part... I was logging in my home dir because I don't have `sudo` privileges on datateam. If you are running on datateam and don't have permission to create `/var/log/viz-points/`, you will have to edit the two `filename` fields in `pdgpoints/log/config.json` to point to your own log directory. Otherwise, logs will go to `/var/log/viz-points/...` as they are now configured to.
As for the test scripts and data, the reason I did that is because I'm pretty sure the manifest will only include files (such as the two test LAS files) when they exist inside a module. So maybe it makes sense to make a `test/` module?
@iannesbitt I'm not sure of the best way to set up logging, but it does seem like an atypical pattern to require a logging config file to exist and be properly set up before the user can install or import a python package. What do you think? Is there a way to enable users to configure the logging after the package has been installed and imported?
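One common pattern is to default to console-only logging and open a log file only when the user opts in after import. The sketch below is a hedged illustration of that idea using the standard library's `dictConfig`; the function name `configure_logging` and its defaults are hypothetical, not part of the pdgpoints API.

```python
import logging
import logging.config

def configure_logging(logfile=None, level="INFO"):
    """Configure logging at runtime; touches the filesystem only on request."""
    handlers = {"console": {"class": "logging.StreamHandler", "level": level}}
    root_handlers = ["console"]
    if logfile is not None:
        # Only add a file handler if the user explicitly asks for one,
        # so no privileged path like /var/log/... is required by default.
        handlers["file"] = {"class": "logging.FileHandler",
                            "filename": logfile, "level": level}
        root_handlers.append("file")
    logging.config.dictConfig({
        "version": 1,
        "disable_existing_loggers": False,
        "handlers": handlers,
        "root": {"level": level, "handlers": root_handlers},
    })

configure_logging()  # console-only by default; no log directory needed
logging.getLogger("pdgpoints").info("logging configured")
```

A user who does want file logging could then call `configure_logging(logfile="my/logs/pdgpoints.log")` after importing the package.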
I think your release is good as-is, without changing your test setup, but just to continue the discussion:

If I understand your proposed solution correctly, I do think that is a more common setup. So use a file structure like the one below and then add `recursive-include tests *` to the manifest?
```
viz-points/
├── pdgpoints/
│   ├── __init__.py
│   └── ... etc
├── tests/
│   ├── __init__.py
│   ├── test.py
│   └── data/
│       ├── README.md
│       ├── lp.png
│       ├── lp_jumps_e.laz
│       └── lp_jumps_w.laz
└── ... etc
```
BTW, also not necessary right away, but you could also include other dev dependencies you're using in `setup.py`, like:

```python
extras_require={
    'dev': [
        'sphinx',
    ]
}
```
Merging to main as all immediately relevant points have been addressed, and others have issues tracking them for a future release.
At present the workflow is contained in the `pdgpoints.pipeline.Pipeline` class, which can be executed with `Pipeline.run()`. In constructing it this way I hoped to make it easy to parallelize in the future, either with `threading` or a Ray/Slurm instance. Certain steps of this workflow are executed with the `LAStools` binaries. LAStools is multithreading-enabled, but I doubt that functionality can be scaled across machines easily.

This workflow contains 5 steps:

1. `las2las` to ensure the LiDAR file (LAS, LAZ, or other major filetype) has CRS information in a WKT string
2. `lasinfo` to read the WKT string and write it to file
3. `las2las` to rewrite the LiDAR file with a corrected header, writing the WKT string from file to the header. Write 8-bit intensity values to 16-bit R, G, and B channels if necessary.
4. The `py3dtiles.convert._Convert` class to tile the dataset, using the WKT string from file as input CRS info
5. `py3dtiles.merger.merge_from_files()` to merge the created tileset with existing tilesets

The test dataset is the Olympic Jumping Complex in Lake Placid, NY.
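The step-per-method shape described above can be sketched as below. This is a minimal illustration of why the design lends itself to parallelization (independent `Pipeline` instances could be handed to a thread pool or Ray/Slurm workers); all method names and bodies are stand-in stubs, not the actual pdgpoints code.

```python
from pathlib import Path

class Pipeline:
    """Illustrative stub of a five-step tiling pipeline."""

    def __init__(self, las: Path, out_dir: Path):
        self.las = las
        self.out_dir = out_dir
        self.log = []  # record of completed steps, for the sketch's sake

    def ensure_wkt(self):  self.log.append("las2las: ensure CRS WKT present")
    def read_wkt(self):    self.log.append("lasinfo: read WKT string to file")
    def rewrite(self):     self.log.append("las2las: rewrite header, 8->16 bit RGB")
    def tile(self):        self.log.append("py3dtiles: tile dataset with WKT CRS")
    def merge(self):       self.log.append("py3dtiles: merge with existing tilesets")

    def run(self):
        # Steps run sequentially within one Pipeline; separate Pipelines
        # over separate input files are independent of each other.
        for step in (self.ensure_wkt, self.read_wkt, self.rewrite,
                     self.tile, self.merge):
            step()
        return self.log

steps = Pipeline(Path("in.laz"), Path("3dtiles")).run()
```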