Hi @long123524, you can download the Sentinel-2 time series from data platforms such as Google Earth Engine, THEIA, or Copernicus. The semantic segmentation annotations for France are obtained from this link. The annotations are usually in vector format, so you have to rasterise them to obtain pixel-level semantic labels. Cheers.
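For the rasterisation step, something along these lines should work (a minimal sketch, not the actual processing code; the file names "S2_patch_reference.tif" and "parcels.geojson" and the "crop_code" attribute are placeholders):

```python
import geopandas as gpd
import numpy as np
import rasterio
from rasterio.features import rasterize

# Hypothetical inputs: a reference Sentinel-2 GeoTIFF defining the target grid,
# and a vector file of parcel polygons with a "crop_code" attribute.
with rasterio.open("S2_patch_reference.tif") as ref:
    transform, out_shape, crs = ref.transform, (ref.height, ref.width), ref.crs

parcels = gpd.read_file("parcels.geojson").to_crs(crs)

# Burn each polygon's crop code into the raster grid; 0 stays as background.
labels = rasterize(
    ((geom, code) for geom, code in zip(parcels.geometry, parcels["crop_code"])),
    out_shape=out_shape,
    transform=transform,
    fill=0,
    dtype="int16",
)
np.save("labels.npy", labels)
```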
Thanks for your reply. I would like to know how the .npy data is obtained. What processing has been done? Could you provide the code?
I don't have a cleaned piece of code for that part, but it is quite straightforward. The process is as follows:
We do not use cloud gap filling.
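As a rough illustration (a sketch, not the actual processing code), one could stack the cropped observations of a patch into a single array, assuming each patch is stored as one 10-band GeoTIFF per date in a hypothetical patch_0001/ folder whose files are named by acquisition date:

```python
from pathlib import Path

import numpy as np
import rasterio

patch_dir = Path("patch_0001")
tif_paths = sorted(patch_dir.glob("*.tif"))  # date-named files -> chronological order

# Read each date as a (C, H, W) array and stack along time -> (T, C, H, W).
arrays = []
for path in tif_paths:
    with rasterio.open(path) as src:
        arrays.append(src.read())
stack = np.stack(arrays).astype(np.int16)

np.save(patch_dir.with_suffix(".npy"), stack)  # e.g. patch_0001.npy
dates = [path.stem for path in tif_paths]      # keep the dates for metadata.geojson
print(stack.shape, dates[:3])
```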
Thanks for your quick reply. I can do these steps, but I cannot produce the 'metadata.geojson' file as in the PASTIS dataset for our experiments. How do you produce it?
The main role of the metadata.geojson file is to store the observation dates of each time series. So you can construct it like this:
{
  patch_id: {
    geometry: patch geo-referenced polygon,
    dates-S2: list of observation dates (YYYY-MM-DD format) for that patch,
    Fold: fold this patch belongs to
  }
}
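For instance, with geopandas one could build it like this (a sketch with placeholder geometries, dates, folds, and an assumed CRS; the list of dates is serialised to a JSON string because the GeoJSON driver does not handle list-typed columns):

```python
import json

import geopandas as gpd
from shapely.geometry import box

# One record per patch, following the structure above.
records = [
    {
        "patch_id": 1,
        "dates-S2": json.dumps(["2018-09-24", "2018-09-29", "2018-10-04"]),
        "Fold": 1,
        "geometry": box(600000, 4900000, 601280, 4901280),  # patch footprint
    },
]

gdf = gpd.GeoDataFrame(records, crs="EPSG:32631")  # assumed UTM zone, adjust to your tiles
gdf.to_file("metadata.geojson", driver="GeoJSON")
```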
What are the names of the bands of the Sentinel-2 time series? And what is their order?
The bands are all bands except B01, B09, and B10, and they are in natural order (B02, B03, B04, ..., B12).
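Written out as a list, and assuming "natural order" places B8A between B08 and B11, that gives ten channels per observation:

```python
# All Sentinel-2 bands except B01, B09, and B10, in natural order.
S2_BANDS = ["B02", "B03", "B04", "B05", "B06", "B07", "B08", "B8A", "B11", "B12"]
assert len(S2_BANDS) == 10
```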
Hi @VSainteuf, thank you for your work. I have a concern about the dataset. The satellite data comes from Sentinel-2 acquisitions between 2018-09-24 and 2019-10-19, but the ground truth comes from this link. Maybe the crop type changed during this period at the same farm?
I'm not sure I understood your concern, but what I can say is that we used the ground truth corresponding to the same year as the Sentinel-2 observations.
How can I make my own time-series dataset for semantic segmentation? How do you generate training-ready datasets from satellite images and the corresponding labels?