In `credit.data`, the `ERA5_and_Forcing_Dataset` class produces an `xarray.Dataset` with dimensions `[time, level, latitude, longitude]`.
In `credit.transforms`, the `ToTensor_ERA5_and_Forcing` class collects the `xarray.Dataset` variables and produces tensors of shape `[time, var, level, lat, lon]`.
In `credit.trainers`, the `Trainer` class in `trainerERA5_v2.py` receives `[batch, time, var, level, lat, lon]` from the dataloader and permutes it to `[batch, var, time, level, lat, lon]`.
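For reference, the dimension shuffle the trainer currently has to do looks like this (shapes are illustrative, not the actual trainer code):

```python
import torch

# Dataloader output in the current workflow: [batch, time, var, level, lat, lon]
x = torch.randn(2, 3, 4, 5, 64, 128)

# Permutation performed in the trainer: [batch, var, time, level, lat, lon]
x = x.permute(0, 2, 1, 3, 4, 5)
print(x.shape)  # torch.Size([2, 4, 3, 5, 64, 128])
```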
The process above could be handled more cleanly if `ToTensor_ERA5_and_Forcing` produced `[var, time, level, lat, lon]` directly; the dataloader would then yield `[batch, var, time, level, lat, lon]`, and the tensor permutation in `trainerERA5_v2.py` would no longer be needed.
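A sketch of why this works, assuming the default PyTorch collation (the shapes below are illustrative):

```python
import torch
from torch.utils.data import default_collate

# Each sample already comes out of the transform as [var, time, level, lat, lon]
sample = torch.randn(4, 3, 5, 64, 128)

# Default collation prepends the batch dimension: [batch, var, time, level, lat, lon]
batch = default_collate([sample, sample])
print(batch.shape)  # torch.Size([2, 4, 3, 5, 64, 128])
```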
Also note that all current models (their embedding layers) take `[batch, var, time, lat, lon]` tensors with "channel-first" ordering and `patch_size = [patch_time, patch_lat, patch_lon]`. The improved data workflow needs to remain compatible with this model requirement.
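A hedged sketch of that channel-first expectation, using a hypothetical `Conv3d` patch embedding (not a model from this repo) where `var` is the channel axis and patches are taken over `(time, lat, lon)`:

```python
import torch
import torch.nn as nn

patch_size = (1, 4, 4)  # [patch_time, patch_lat, patch_lon]

# Channel-first patch embedding: "var" is the channel axis of a Conv3d
embed = nn.Conv3d(in_channels=4, out_channels=256,
                  kernel_size=patch_size, stride=patch_size)

x = torch.randn(2, 4, 3, 64, 128)  # [batch, var, time, lat, lon]
tokens = embed(x)
print(tokens.shape)  # torch.Size([2, 256, 3, 16, 32])
```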