
Change hazard.frequency automatically to hazard.frequency/hazard.size during hazard computation in rf_glofas #141

Open
elianekobler opened this issue Sep 3, 2024 · 1 comment

@elianekobler (Collaborator)

The default hazard.frequency value of 1 leads to excessive impact numbers when plotting impacts from 50 ensemble members (GloFAS forecast). Plotting uses the _build_exp() function, which takes the eai_exp value at each centroid. At each centroid, all 50 impact values (one per event) are summed up, each with a frequency of 1, which leads to unexpectedly high impact values per centroid.
The tutorial describes it like this:
"The averaging function takes the frequency parameter of the hazard into account.
By default, all hazard events have a frequency of 1.
This may result in unexpected values for average impacts.
We therefore divide the frequency by the number of events before plotting."
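
To illustrate the effect with made-up numbers (a minimal sketch of the frequency-weighted sum, not the actual CLIMADA internals): with the default frequency of 1 per event, the per-centroid value ends up roughly 50 times the ensemble mean.

import numpy as np

n_events = 50
# hypothetical impact of each ensemble member at a single centroid
imp_per_event = np.full(n_events, 1000.0)

# eai_exp at a centroid is the frequency-weighted sum over events
freq_default = np.ones(n_events)         # default frequency of 1 per event
freq_rescaled = freq_default / n_events  # after hazard.frequency / hazard.size

print((freq_default * imp_per_event).sum())   # 50000.0 -- inflated by a factor of 50
print((freq_rescaled * imp_per_event).sum())  # 1000.0  -- the ensemble mean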

The issue occurs when you follow the climada_hazard_glofas_rf tutorial but forget to add this line before the impact calculation (see the corrected snippet after the example below):
hazard.frequency = hazard.frequency / hazard.size

The simplified calculation steps are as follows:

import xarray as xr
from climada.engine import ImpactCalc
from climada_petals.hazard.rf_glofas import (
    RiverFloodInundation, hazard_series_from_dataset, save_file,
)

rf = RiverFloodInundation()
discharge = rf.download_forecast()
ds_flood = rf.compute(discharge=discharge)
save_file(ds_flood, "flood-2024-08-04.nc")  # save and re-open the same file
with xr.open_dataset("flood-2024-08-04.nc", chunks="auto") as ds:
    hazard = hazard_series_from_dataset(ds, "flood_depth", "number")
# exposure and impf_set_affected are prepared beforehand as in the tutorial
impact = ImpactCalc(exposure, impf_set_affected, hazard).impact()
impact.plot_hexbin_eai_exposure()
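
For reference, a minimal sketch of the corrected last step, with the frequency rescaling from the tutorial inserted before the impact calculation (assuming hazard is a single Hazard object as above):

# divide the event frequencies by the number of events (as in the tutorial)
hazard.frequency = hazard.frequency / hazard.size
impact = ImpactCalc(exposure, impf_set_affected, hazard).impact()
impact.plot_hexbin_eai_exposure()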

This is what my plots looked like (the mean exposed population is calculated from impact.at_event values and is therefore not affected by the frequency):
[Screenshot of the resulting plots, 2024-09-03]

Would it be a better solution to divide hazard.frequency by hazard.size automatically during the hazard_series_from_dataset() computation?
Another solution would be to emit a warning so that users know they may want to adjust hazard.frequency themselves.
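
A minimal sketch of what such an automatic adjustment could look like (a hypothetical helper, not part of the current rf_glofas API; the name _rescale_frequency and its placement are assumptions):

import warnings
import numpy as np

def _rescale_frequency(hazard):
    """Hypothetical post-processing for hazards built by hazard_series_from_dataset()."""
    if hazard.size > 1 and np.allclose(hazard.frequency, 1.0):
        warnings.warn(
            "All event frequencies are 1; dividing them by the number of events "
            "so that averaged impacts (eai_exp) reflect the ensemble mean."
        )
        hazard.frequency = hazard.frequency / hazard.size
    return hazard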

@peanutfun (Member)

It sounds reasonable to set the frequency to 1/size automatically!
