terraref / computing-pipeline

Pipeline to Extract Plant Phenotypes from Reference Data
BSD 3-Clause "New" or "Revised" License

Generic summary statistics from geotiff + leaf angle statistics from angle map #433

Closed: craig-willis closed this issue 6 years ago

craig-willis commented 6 years ago

Per https://github.com/terraref/computing-pipeline/issues/405, the next step is to compute the leaf angle summary statistics from the angle map.

See https://github.com/terraref/computing-pipeline/issues/338#issuecomment-357304136 for specifics of the R code using the leaf angle package. Basic statistics can be implemented in Python or R.

In the short term, you can calculate the basic summary statistics from any geotiff (including ir_geotiff or rgb_geotiff), but the actual leaf angle calculations require output from the las2dem process. As David mentioned, there is already a "mean temperature" extractor for the ir_geotiff that calculates the mean in a similar way (see https://github.com/terraref/extractors-multispectral/blob/master/meantemp/terra_meantemp.py). Ideally, the basic summary statistics would be implemented in a way that works with any geotiff.
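A minimal sketch of such a generic geotiff statistics routine (assuming GDAL and NumPy; the function name and the particular statistics chosen are illustrative, not the final extractor interface):

import numpy
from osgeo import gdal

def geotiff_summary_stats(tif_path, band=1):
    # Basic summary statistics for one band of any geotiff, skipping nodata pixels
    ds = gdal.Open(tif_path)
    b = ds.GetRasterBand(band)
    vals = b.ReadAsArray().astype(float).flatten()
    nodata = b.GetNoDataValue()
    if nodata is not None:
        vals = vals[vals != nodata]
    vals = vals[~numpy.isnan(vals)]
    if vals.size == 0:
        return None
    return {
        "count": int(vals.size),
        "mean": float(numpy.mean(vals)),
        "median": float(numpy.median(vals)),
        "std": float(numpy.std(vals)),
        "min": float(numpy.min(vals)),
        "max": float(numpy.max(vals)),
    }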

Completion criteria:

weiqin61 commented 6 years ago

https://github.com/terraref/laser3d/pull/9

weiqin61 commented 6 years ago

Currently we face the issue of distinguishing leaf angle from ground: the leaf angle calculation would only require angle information, while ground would need the height value (z), which is now in the current leaf angle geotiff. We might think of using the LAS point cloud to do the leaf angle fit in the future, and we will come back to this issue after the laser3d package is done.
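Purely as an illustration of that distinction, a hypothetical sketch (it assumes per-pixel angle and height (z) arrays are both available, and the 0.05 m ground threshold is invented):

import numpy

def leaf_angle_stats_above_ground(angle, z, ground_height=0.05):
    # Treat pixels at or below ground_height (metres) as ground and drop them;
    # compute angle statistics on the remaining (presumed leaf) pixels only.
    leaf_angles = angle[z > ground_height]
    leaf_angles = leaf_angles[~numpy.isnan(leaf_angles)]
    if leaf_angles.size == 0:
        return None
    return {"mean_angle": float(numpy.mean(leaf_angles)),
            "median_angle": float(numpy.median(leaf_angles))}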

max-zilla commented 6 years ago

@weiqin61 here is some code to help.

Getting PLY files from Globus

Right now you need 2 pieces from 2 different places.

Some dates you could use:

2017-05-10 small plants but rows visible
2017-05-20 slightly bigger plants
2017-06-10 canopy starting to close

...so pick a dataset (or datasets) from those dates, download the 2 PLY files from Level_1 and the metadata.json file from raw_data into the same folder, and use that folder below.

Conversion code for PLY files

Use this branch of the laser3d package: https://github.com/terraref/laser3d/pull/11. To avoid having to install a temporary version of the package, you can just copy these functions into your Python script directly:

from plyfile import PlyData, PlyElement
import laspy, math, numpy, json, subprocess, utm, os
from PIL import Image
from terrautils.formats import create_geotiff
from terrautils.spatial import scanalyzer_to_mac
from terrautils.metadata import clean_metadata  # clean_metadata() is called in the script below
from osgeo import gdal

# Copy these function definitions from the laser3d branch:
#   def ply_to_array(...)
#   def generate_las_from_ply(...)
#   def generate_tif_from_ply(...)

Then you can take PLY files and a metadata.json file and process them like so (plytest.py):

# If you use my docker code below, your data will be mapped to /PLYDATA directory
folder = "/PLYDATA/2018-06-01__01-33-47-593"

# Iterate through files in folder to find east/west/metadata
for f in os.listdir(folder):
    if f.endswith("east_0.ply"):
        inpe = os.path.join(folder, f)
    elif f.endswith("west_0.ply"):
        inpw = os.path.join(folder, f)
    elif f.endswith("metadata.json"):
        md = os.path.join(folder, f)

out = os.path.join(folder, "merged_utm.las")
tif = os.path.join(folder, "merged.tif")
inp = [inpe, inpw]

# Read metadata
with open(md) as jmd:
    raw_md = json.load(jmd)
    metadata = clean_metadata(raw_md, 'scanner3DTop')

generate_las_from_ply(inp, out, metadata)
# You might not care about this part
generate_tif_from_ply(inp, tif, metadata)

Note that instead of messing with installing dependencies, you can use Docker. If you have your script and PLY folders in /Users/wei/plydata, then:

# Start interactive terrautils image
docker run -it -v /Users/wei/plydata:/PLYDATA terraref/terrautils /bin/bash
# Install 2 new dependencies and create BETY key (see our Slack chat history for bety part)
pip install plyfile laspy
apt-get update && apt-get install -y pdal
echo PASTE_BETY_SECRET_KEY_HERE >> $HOME/.betykey
# Run your script as above
python plytest.py

...this should let you create an LAS file and a TIF file from any PLY data + metadata.
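A quick sanity check on the generated TIF (a sketch only; the path matches the example folder above, and the GDAL calls are standard):

from osgeo import gdal

ds = gdal.Open("/PLYDATA/2018-06-01__01-33-47-593/merged.tif")
print("size:", ds.RasterXSize, "x", ds.RasterYSize)
print("geotransform:", ds.GetGeoTransform())
# ComputeStatistics(False) returns [min, max, mean, std] over valid pixels
print("min/max/mean/std:", ds.GetRasterBand(1).ComputeStatistics(False))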

max-zilla commented 6 years ago

I will share several sample geotiffs from the new heightmap extractor with @weiqin61, and she will show how to run the current code in master for statistics and sample output.

max-zilla commented 6 years ago

Wei shared code with me via email and created a laser3d pull request. I need to evaluate it and will paste sample results here.

max-zilla commented 6 years ago

[attached image: histogram]

max-zilla commented 6 years ago

https://github.com/terraref/laser3d/pull/9/files

https://github.com/terraref/laser3d/pull/9/files#diff-56e434976a24d813d062cb84ad6c92d0R44

dlebauer commented 6 years ago

For just the beta distribution parameters, we could do this in Python as well; in R:

  xbar <- mean(x)
  xvar <- var(x)
  alpha <- ((1 - xbar) / xvar - 1 / xbar) * xbar ^ 2
  beta <- alpha * (1 / xbar - 1)

but we also want theta from the ellipsoidal distribution ... which is the value of using the rleafangle package, as sketched out here: https://github.com/terraref/computing-pipeline/issues/338#issuecomment-357304136
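A direct Python translation of that method-of-moments beta fit (a sketch; it assumes x is a NumPy array already scaled into the (0, 1) interval, and it does not cover the ellipsoidal theta):

import numpy

def beta_method_of_moments(x):
    # Method-of-moments estimates of Beta(alpha, beta), mirroring the R snippet above
    xbar = numpy.mean(x)
    xvar = numpy.var(x, ddof=1)  # sample variance, like R's var()
    alpha = ((1 - xbar) / xvar - 1 / xbar) * xbar ** 2
    beta = alpha * (1 / xbar - 1)
    return alpha, beta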

max-zilla commented 6 years ago

Merged the initial package, creating follow-up issue #507 to integrate it into the pipeline.