dlebauer closed this issue 8 years ago
@solmazhajmohammadi do you need a fully closed canopy, or an individual plant? Also, do you need these images to come from the field scanner system or can they be any pictures of a plant?
I think that there are some potted plants at the field site in MAC.
@rjstrand are you the right person to ask about getting some sample images for Solmaz to work on?
I need fully closed canopy images, preferably from the field. Images should be taken from the top, with at least 20-30% overlap between consecutive images. The point is to validate the algorithm in cases where the human eye cannot see the difference between two images (featureless images!).
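For context on why overlap matters even without visible features: translation between two overlapping top-down frames can be estimated with phase correlation, which works in the frequency domain rather than on detectable keypoints, so it degrades gracefully on low-texture canopy images. A minimal sketch (NumPy only; the function name and the synthetic test image are illustrative, not part of any existing pipeline):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the (row, col) translation between two equally sized
    grayscale images via phase correlation (FFT-based, so it does not
    depend on distinctive visual features)."""
    cross = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    # Normalized cross-power spectrum; epsilon guards against divide-by-zero.
    r = np.fft.ifft2(cross / (np.abs(cross) + 1e-12))
    peak = np.array(np.unravel_index(np.argmax(np.abs(r)), r.shape), dtype=float)
    # Shifts past half the image size wrap around; map them to negatives.
    dims = np.array(a.shape)
    peak[peak > dims / 2] -= dims[peak > dims / 2]
    return peak

# Synthetic low-texture image plus a circularly shifted copy of itself.
rng = np.random.default_rng(0)
img = rng.normal(size=(128, 128))
shifted = np.roll(img, shift=(5, -7), axis=(0, 1))
print(phase_correlation_shift(img, shifted))  # recovers [5, -7]
```

On real field images the overlap region is what makes this peak detectable, which is why a 20-30% overlap requirement is a reasonable minimum for validating alignment.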
@yanliu-chn or @pless do you have some sample images that Solmaz can start working with?
@dlebauer: I think the sample images need to come from this project's field/indoor scanners. For software evaluation purposes, each piece of software should have its own test datasets.
@rjstrand is the wheat canopy closed (or close enough)? The live stream suggests 'perhaps'
@solmazhajmohammadi presumably you can get to the sample data on Roger - either via ssh or Globus. Currently we have data from the following sensors:
will one of those work?
@solmazhajmohammadi does the wheat data that we have on Roger meet your needs here? If so, can this be closed?
I took a couple of images of grass and used them. But I still need hyperspectral data (with 30% overlap).
@solmazhajmohammadi - can this issue be closed?
@rachelshekar Yes.
From @solmazhajmohammadi on March 10, 2016 19:28
In order to validate the algorithm for image alignment and stitching, I need images from full-grown plants.
Copied from original issue: terraref/reference-data#23