I need functions that make it easy to call the neural network with
[ ] dicts of numpy arrays
[ ] stacked numpy arrays
[ ] xarray objects
[ ] dicts of torch arrays
To do this, I should follow a hub and spoke model. The hub will be a common data structure (e.g. a dict of torch arrays), and the spokes will convert the other data structures into this central one. Then I should only need to write one interface routine for the central data structure, plus translation routines from formats like xarray to it (see the sketch below).
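A minimal sketch of what the spokes could look like, assuming the hub is a plain dict of torch tensors; the function names (`from_numpy_dict`, `from_xarray`, `from_stacked_numpy`) are hypothetical placeholders, not an existing API:

```python
from typing import Dict, Mapping, Sequence

import numpy as np
import torch
import xarray as xr

# The hub: a plain dict of torch tensors.
TensorDict = Dict[str, torch.Tensor]


def from_numpy_dict(data: Mapping[str, np.ndarray]) -> TensorDict:
    """Spoke: dict of numpy arrays -> dict of torch tensors."""
    return {name: torch.from_numpy(np.asarray(arr)) for name, arr in data.items()}


def from_xarray(ds: xr.Dataset) -> TensorDict:
    """Spoke: xarray Dataset -> dict of torch tensors."""
    return {name: torch.from_numpy(ds[name].values) for name in ds.data_vars}


def from_stacked_numpy(arr: np.ndarray, names: Sequence[str]) -> TensorDict:
    """Spoke: single stacked numpy array -> dict of torch tensors.

    Assumes the variables are stacked along the first axis, one slab per name.
    """
    return {name: torch.from_numpy(arr[i]) for i, name in enumerate(names)}
```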
This central data format will be a dict of torch tensors with shape (time, y, x, z), where z can be either 1 or the number of vertical points. All tensors will need to be broadcastable to this shape. This won't work, though, because individual points are selected in x and y and then concatenated along one batch dimension.
What about a dict of tensors with shape (time, batch, z)? This is what I currently have.
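For concreteness, a sketch of the single interface routine against that (time, batch, z) hub format. The name `call_with_tensor_dict` is hypothetical, and it assumes the model accepts a dict of mutually broadcastable tensors:

```python
from typing import Dict

import torch

TensorDict = Dict[str, torch.Tensor]


def call_with_tensor_dict(model: torch.nn.Module, inputs: TensorDict) -> TensorDict:
    """One interface routine for the hub format: a dict of tensors that
    broadcast to a common (time, batch, z) shape, where z is either 1 or
    the number of vertical points."""
    # Broadcast everything to the full shape before handing it to the model.
    full_shape = torch.broadcast_shapes(*(t.shape for t in inputs.values()))
    expanded = {name: torch.broadcast_to(t, full_shape) for name, t in inputs.items()}
    return model(expanded)
```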