arXiv:2407.04724v1 [physics.ao-ph] 26 Jun 2024

A Likelihood-Based Generative Approach for Precipitation Downscaling

Jose González-Abad
For this reason, in the climate context PP-SD models constitute the standard, as they rely on large-scale synoptic variables representing the state of the atmosphere, which are properly reproduced by the coarse resolution of GCMs. Several architectures have been explored, such as recurrent networks (Misra et al., 2018), combinations of convolutional and dense layers (Pan et al., 2019; Baño-Medina et al., 2020), and fully-convolutional models (Adewoyin et al., 2021; Quesada-Chacón et al., 2022), some of them even generating projections under future scenarios (Baño-Medina et al., 2021; Soares et al., 2023). Among these architectures, the U-Net (Ronneberger et al., 2015) has recently shown promising results (Quesada-Chacón et al., 2022; Adewoyin et al., 2021), even in related fields such as emulation (Doury et al., 2023; 2024).

2.2. Extreme Precipitation

Due to the dynamics of precipitation, which typically adheres to exponential probability distributions, and its non-continuous nature (occurrence and amount), regression-based DL models encounter difficulties in accurately modeling it. This often leads to significant issues, such as underestimation of extreme precipitation events (Rampal et al., 2024b).

To address this challenge, and drawing inspiration from previous work (Dunn, 2004; Cannon, 2008), the authors of (Baño-Medina et al., 2022) train a DL model by minimizing the Negative Log-Likelihood (NLL) of Bernoulli and gamma distributions for the occurrence and amount, respectively. By working under this assumption, which aligns with the dynamics of precipitation (Williams, 1997), they are able to model the whole distribution, including the extremes. Its success has led to its application across various regions (Sun & Lan, 2021; Rampal et al., 2022; Kheir et al., 2023; Hosseini Baghanam et al., 2024), making it the most widespread DL-based PP-SD model. Unfortunately, this approach models a different probability distribution for each of the grid-points forming the downscaled variable, resulting in spatial inconsistency when sampling from these distributions and leading to unrealistic projections (González-Abad et al., 2021).

2.3. Generative Precipitation Downscaling

Recently, Generative Adversarial Networks (GANs) (Goodfellow et al., 2014) have attracted significant attention in the SD field. Unlike regression-based DL models, which generally minimize loss functions aimed at capturing the mean (e.g., the Mean Squared Error, MSE), GANs minimize an adversarial loss which encourages the generator to better reproduce the underlying distribution of the data in order to fool the discriminator. This leads to improved reproduction of extremes and finer details in precipitation downscaling (Rampal et al., 2024b). In addition, GANs allow computing ensembles of predictions, although unlike the NLL approach, these are computed by passing noise as input to the generator, not by explicitly modeling the corresponding distribution. In the context of PP-SD, GANs have not been explored yet, although they show promise in related areas such as SR downscaling (Leinonen et al., 2020; Cheng et al., 2020b), meteorological downscaling (Price & Rasp, 2022; Harris et al., 2022) and emulation (Rampal et al., 2024a).

3. Experimental Framework

3.1. Region of Study and Data

In this work, we focus on daily precipitation downscaling over a domain centered on the Alps (37.6°N-50.4°N and 3.6°E-16.4°E), a region of interest due to its prominent orography, which significantly influences local precipitation. As we frame this study within the PP approach, we rely on the ERA5 reanalysis dataset (Hersbach et al., 2020) (quasi-observational) at 1° resolution for the large-scale variables (predictors) and on the observational dataset E-OBS (Cornes et al., 2018) for the local-scale variable (predictand) at 0.1° resolution. Following previous works (Baño-Medina et al., 2022; Soares et al., 2023), we select as predictors the air temperature, specific humidity, geopotential height, and the meridional and zonal wind components at 500, 700, and 850 hPa.

3.2. Standard Deep Learning Models

Following the aforementioned recent advances in PP-SD, we rely on the U-Net, a fully-convolutional model. Specifically, we adhere to the implementation details described in (Doury et al., 2023). We train two different versions of this model: U-Net (MSE) and U-Net (NLL). The former minimizes the MSE, whereas the latter minimizes the NLL of a Bernoulli-gamma distribution, as proposed in (Baño-Medina et al., 2022). While U-Net (MSE) directly computes the downscaled precipitation, U-Net (NLL) computes, for each grid-point in the predictand, the parameters p, α and β defining the corresponding Bernoulli-gamma distribution. Consequently, for NLL-based models, the final prediction corresponds to a random sample from the modeled distributions.

We divide the observational dataset into a training (1980-2010) and a test (2011-2022) period. These models are trained using the Adam optimizer (Kingma & Ba, 2014) with a learning rate of 10⁻⁴ and a batch size of 64.

3.3. Likelihood-Based Generative Approach

The main contribution of this work involves cGAN models (Mirza & Osindero, 2014) for PP-SD. Unlike standard GANs, cGANs allow conditioning the generation process
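For reference, the Wasserstein losses with gradient penalty used to train these cGAN models (Arjovsky et al., 2017; Gulrajani et al., 2017) can be written, in our own notation with predictors x, observed field y, noise z, generator G and discriminator D, as:

```latex
\mathcal{L}_D =
  \mathbb{E}_{z}\big[D\!\left(G(z, x), x\right)\big]
  - \mathbb{E}_{y}\big[D(y, x)\big]
  + \lambda\,\mathbb{E}_{\hat{y}}\Big[\big(\lVert \nabla_{\hat{y}} D(\hat{y}, x) \rVert_2 - 1\big)^2\Big],
\qquad
\mathcal{L}_G = -\,\mathbb{E}_{z}\big[D\!\left(G(z, x), x\right)\big],
```

where $\hat{y} = \epsilon y + (1 - \epsilon)\,G(z, x)$ with $\epsilon \sim \mathcal{U}(0, 1)$ and $\lambda$ is the gradient-penalty weight ($\lambda = 10$ in Gulrajani et al., 2017). The penalty is a soft version of the 1-Lipschitz constraint that the Wasserstein formulation requires of the discriminator.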
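To make the loss of Section 2.2 concrete, a minimal single-observation sketch of the Bernoulli-gamma NLL is given below. The function name, the shape/scale (α, β) parameterization, the ε guard, and the wet-day threshold are our own illustrative assumptions in the spirit of (Cannon, 2008; Baño-Medina et al., 2022), not the authors' code.

```python
import math

def bernoulli_gamma_nll(y, p, alpha, beta, wet_threshold=0.0):
    """Negative log-likelihood of one observation y (mm/day) under a
    Bernoulli-gamma model: p is the rain probability, and alpha (shape)
    and beta (scale) parameterize the gamma density of the rain amount."""
    eps = 1e-8  # guards the logarithms against zero arguments
    if y > wet_threshold:
        # Wet day: Bernoulli success plus the gamma log-density of the amount.
        return -(math.log(p + eps)
                 + (alpha - 1.0) * math.log(y + eps)
                 - y / beta
                 - alpha * math.log(beta)
                 - math.lgamma(alpha))
    # Dry day: only the Bernoulli failure term contributes.
    return -math.log(1.0 - p + eps)
```

Summed over all grid-points and days, this is the quantity that NLL-based models minimize; note that a dry day only constrains p, whereas a wet day constrains all three parameters.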
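The stochastic prediction step of Section 3.2, drawing the final field as a random sample from the per-grid-point Bernoulli-gamma distributions, can be sketched as follows. The data layout (nested lists of (p, α, β) tuples standing in for the three predicted parameter maps) and all names are illustrative assumptions, not the paper's code.

```python
import random

def sample_precip_field(params, rng):
    """Draw one precipitation field from per-grid-point Bernoulli-gamma
    parameters. `params` is a nested list of (p, alpha, beta) tuples."""
    field = []
    for row in params:
        out_row = []
        for p, alpha, beta in row:
            wet = rng.random() < p  # occurrence: Bernoulli(p)
            # Amount on wet days: gamma with shape alpha and scale beta.
            out_row.append(rng.gammavariate(alpha, beta) if wet else 0.0)
        field.append(out_row)
    return field

# A 1x2 toy field: one always-dry and one always-wet grid-point.
rng = random.Random(0)
field = sample_precip_field([[(0.0, 2.0, 5.0), (1.0, 2.0, 5.0)]], rng)
```

Because each grid-point is sampled independently, fields drawn this way exhibit exactly the spatial inconsistency discussed in Section 2.2; the adversarial loss of the proposed cGAN (NLL) is what mitigates it.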
For both cGAN (MSE) and cGAN (NLL) models, we use the same architecture for the generator as that used for the U-Net (MSE) and U-Net (NLL) models. For the discriminator, we implement a fully convolutional network that processes both the large- and regional-scale data through a series of convolutional and dense layers. Following previous works (Leinonen et al., 2020; Harris et al., 2022; Rampal et al., 2024a), we rely on the Wasserstein formulation of GANs (Arjovsky et al., 2017) with a gradient penalty term (Gulrajani et al., 2017). This training framework is popular due to its theoretical properties, such as the possibility to train the discriminator to optimality. For both cGAN models, and following (Arjovsky et al., 2017), we use RMSprop as the optimizer with a learning rate of 10⁻⁵ and a batch size of 64. The generator and discriminator are trained in an adversarial manner, with the discriminator being updated five times for each update of the generator. The training and test sets cover the years detailed in Section 3.2.

Figure 1. Violin plot showing the results for four different metrics computed on the test set: the relative bias of the mean and the SDII, the RMSE, and the ratio of standard deviations. Each metric displays the results corresponding to the different DL models intercompared: U-Net (MSE), U-Net (NLL), cGAN (MSE), and cGAN (NLL).

4. Results

Figure 2 displays the precipitation histogram during the test period across all grid-points in the predictand. The logarithmic scale of the y-axis facilitates the assessment of model
performance, particularly for less frequent extreme events. Examining the histogram within the 0-50 mm interval (top-right corner), we observe a notable decline in precipitation values from 0 to approximately 1 mm in the target dataset. The U-Net (MSE) model fails to adapt to this pattern, resulting in an overestimation of precipitation. In contrast, the other models accurately replicate this decrease. This discrepancy supports the U-Net (MSE) model's overestimation of the mean in the corresponding violin plot. The adversarial training of the cGAN (MSE) allows it to adjust to this decrease, while NLL-based models effectively handle it due to the underlying distributional assumption. Additionally, the histogram reveals that MSE-based models underestimate the distribution beyond approximately 15 mm, as minimizing the MSE loss function mainly involves fitting the mean. Conversely, NLL-based models accurately reproduce this segment of the distribution, as also evidenced in the SDII violin plot.

Expanding our focus to the histogram spanning the 0-350 mm interval, representing precipitation extremes, a similar pattern emerges: only NLL-based models successfully replicate these extreme values, including those surpassing 200 mm. This underscores the effectiveness of assuming a gamma distribution for modeling the precipitation amount.

Figure 2. Histogram of the precipitation distribution for the test period, aggregated across all grid-points in the predictand. The black line represents the target observational dataset, while the different colors correspond to the various DL models being compared. A zoomed-in view for values in the 0-50 mm interval is provided in the top-right corner of the histogram.

Figure 3 depicts the predictions of each DL model for a specific day of the test period. As anticipated, the U-Net (MSE) exhibits the blurry effect, lacking fine details in the generated field. In contrast, the cGAN (MSE) addresses this issue by leveraging the adversarial loss, which leads the generator to better capture these details to fool the discriminator, along with the effect of computing the MSE over the ensemble of predictions. As previously discussed, the independent nature of the distributions resulting from minimizing the NLL loss function leads to spatial inconsistencies in the generated field, evident in the U-Net (NLL) model's prediction. However, incorporating the adversarial loss enables the cGAN (NLL) to achieve greater consistency while still adhering to the Bernoulli-gamma distribution. In fact, the prediction of the cGAN (NLL) model falls between that of the cGAN (MSE) and the U-Net (NLL).

Figure 3. Comparison of predictions generated by the DL models intercompared for a day in the test period.

5. Conclusions

In this work, we have introduced a novel likelihood-based generative approach for precipitation downscaling. This method leverages a combination of likelihood and adversarial losses, enabling the model to properly reproduce the target distribution while generating spatially consistent precipitation fields, addressing a main challenge for standard likelihood-based DL models. Furthermore, likelihood-based loss functions enable generative models to produce explicit probability distributions (e.g., Bernoulli-gamma) for precipitation. This capability is crucial when downscaling future GCM projections, as estimating the probabilities of extreme events is vital for risk assessment.

Future work includes evaluating the likelihood-based generative model in the GCM space and comparing the resultant projections with those of established PP-SD models. We also plan to employ eXplainable Artificial Intelligence (XAI) techniques to understand the impact of the adversarial loss on the modeled probability distributions and learned patterns.
Acknowledgements

We acknowledge support from grant CPP2021-008510 funded by MICIU/AEI/10.13039/501100011033 and by the "European Union" and the "European Union NextGenerationEU/PRTR".

References

Adewoyin, R. A., Dueben, P., Watson, P., He, Y., and Dutta, R. TRU-NET: a deep learning approach to high resolution prediction of rainfall. Machine Learning, 110:2035-2062, 2021.

Arjovsky, M., Chintala, S., and Bottou, L. Wasserstein generative adversarial networks. In International Conference on Machine Learning, pp. 214-223. PMLR, 2017.

Baño-Medina, J., Manzanas, R., and Gutiérrez, J. M. Configuration and intercomparison of deep learning neural models for statistical downscaling. Geoscientific Model Development, 13(4):2109-2124, 2020.

Baño-Medina, J., Manzanas, R., and Gutiérrez, J. M. On the suitability of deep convolutional neural networks for continental-wide downscaling of climate change projections. Climate Dynamics, 57(11):2941-2951, 2021.

Baño-Medina, J., Manzanas, R., Cimadevilla, E., Fernández, J., González-Abad, J., Cofiño, A. S., and Gutiérrez, J. M. Downscaling multi-model climate projection ensembles with deep learning (DeepESD): contribution to CORDEX EUR-44. Geoscientific Model Development Discussions, 2022:1-14, 2022.

Cannon, A. J. Probabilistic multisite precipitation downscaling by an expanded Bernoulli-gamma density network. Journal of Hydrometeorology, 9(6):1284-1300, 2008.

Chen, D., Rojas, M., Samset, B., Cobb, K., Diongue Niang, A., Edwards, P., Emori, S., Faria, S., Hawkins, E., Hope, P., Huybrechts, P., Meinshausen, M., Mustafa, S., Plattner, G.-K., and Tréguier, A.-M. Framing, Context, and Methods. In Masson-Delmotte, V., Zhai, P., Pirani, A., Connors, S., Péan, C., Berger, S., Caud, N., Chen, Y., Goldfarb, L., Gomis, M., Huang, M., Leitzell, K., Lonnoy, E., Matthews, J., Maycock, T., Waterfield, T., Yelekçi, O., Yu, R., and Zhou, B. (eds.), Climate Change 2021: The Physical Science Basis. Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change, pp. 147-286. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 2021.

Cheng, J., Kuang, Q., Shen, C., Liu, J., Tan, X., and Liu, W. ResLap: Generating high-resolution climate prediction through image super-resolution. IEEE Access, 8:39623-39634, 2020a.

Cheng, J., Liu, J., Xu, Z., Shen, C., and Kuang, Q. Generating high-resolution climate prediction through generative adversarial network. Procedia Computer Science, 174:123-127, 2020b.

Cornes, R. C., van der Schrier, G., van den Besselaar, E. J., and Jones, P. D. An ensemble version of the E-OBS temperature and precipitation data sets. Journal of Geophysical Research: Atmospheres, 123(17):9391-9409, 2018.

Doury, A., Somot, S., Gadat, S., Ribes, A., and Corre, L. Regional climate model emulator based on deep learning: Concept and first evaluation of a novel hybrid downscaling approach. Climate Dynamics, 60(5):1751-1779, 2023.

Doury, A., Somot, S., and Gadat, S. On the suitability of a convolutional neural network based RCM-emulator for fine spatio-temporal precipitation. Toulouse School of Economics Repository, 2024.

Dunn, P. K. Occurrence and quantity of precipitation can be modelled simultaneously. International Journal of Climatology: A Journal of the Royal Meteorological Society, 24(10):1231-1239, 2004.

González-Abad, J., Baño-Medina, J., and Cachá, I. H. On the use of deep generative models for perfect prognosis climate downscaling. arXiv preprint arXiv:2305.00974, 2021.

Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., and Bengio, Y. Generative adversarial nets. Advances in Neural Information Processing Systems, 27, 2014.

Goodfellow, I., Bengio, Y., and Courville, A. Deep Learning. MIT Press, 2016.

Gulrajani, I., Ahmed, F., Arjovsky, M., Dumoulin, V., and Courville, A. C. Improved training of Wasserstein GANs. Advances in Neural Information Processing Systems, 30, 2017.

Harris, L., McRae, A. T., Chantry, M., Dueben, P. D., and Palmer, T. N. A generative deep learning approach to stochastic downscaling of precipitation forecasts. Journal of Advances in Modeling Earth Systems, 14(10):e2022MS003120, 2022.

Hersbach, H., Bell, B., Berrisford, P., Hirahara, S., Horányi, A., Muñoz-Sabater, J., Nicolas, J., Peubey, C., Radu, R., Schepers, D., et al. The ERA5 global reanalysis. Quarterly Journal of the Royal Meteorological Society, 146(730):1999-2049, 2020.

Hosseini Baghanam, A., Nourani, V., Bejani, M., and Ke, C.-Q. Improving the statistical downscaling performance of climatic parameters with convolutional neural networks. Journal of Water and Climate Change, pp. jwc2024592, 2024.

Kheir, A. M., Elnashar, A., Mosad, A., and Govind, A. An improved deep learning procedure for statistical downscaling of climate data. Heliyon, 9(7), 2023.

Kingma, D. P. and Ba, J. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980, 2014.

Leinonen, J., Nerini, D., and Berne, A. Stochastic super-resolution for downscaling time-evolving atmospheric fields with a generative adversarial network. IEEE Transactions on Geoscience and Remote Sensing, 59(9):7211-7223, 2020.

Maraun, D. and Widmann, M. Statistical Downscaling and Bias Correction for Climate Research. Cambridge University Press, 2018.

Mirza, M. and Osindero, S. Conditional generative adversarial nets. arXiv preprint arXiv:1411.1784, 2014.

Misra, S., Sarkar, S., and Mitra, P. Statistical downscaling of precipitation using long short-term memory recurrent neural networks. Theoretical and Applied Climatology, 134:1179-1196, 2018.

O'Neill, B. C., Tebaldi, C., Van Vuuren, D. P., Eyring, V., Friedlingstein, P., Hurtt, G., Knutti, R., Kriegler, E., Lamarque, J.-F., Lowe, J., et al. The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6. Geoscientific Model Development, 9(9):3461-3482, 2016.

Pan, B., Hsu, K., AghaKouchak, A., and Sorooshian, S. Improving precipitation estimation using convolutional neural network. Water Resources Research, 55(3):2301-2321, 2019.

Passarella, L. S., Mahajan, S., Pal, A., and Norman, M. R. Reconstructing high resolution ESM data through a novel fast super resolution convolutional neural network (FSRCNN). Geophysical Research Letters, 49(4):e2021GL097571, 2022.

Price, I. and Rasp, S. Increasing the accuracy and resolution of precipitation forecasts using deep generative models. In International Conference on Artificial Intelligence and Statistics, pp. 10555-10571. PMLR, 2022.

Quesada-Chacón, D., Barfus, K., and Bernhofer, C. Repeatable high-resolution statistical downscaling through deep learning. Geoscientific Model Development, 15(19):7353-7370, 2022.

Rampal, N., Gibson, P. B., Sood, A., Stuart, S., Fauchereau, N. C., Brandolino, C., Noll, B., and Meyers, T. High-resolution downscaling with interpretable deep learning: Rainfall extremes over New Zealand. Weather and Climate Extremes, 38:100525, 2022.

Rampal, N., Gibson, P. B., Sherwood, S., Abramowitz, G., and Hobeichi, S. A robust generative adversarial network approach for climate downscaling and weather generation. Authorea Preprints, 2024a.

Rampal, N., Hobeichi, S., Gibson, P. B., Baño-Medina, J., Abramowitz, G., Beucler, T., González-Abad, J., Chapman, W., Harder, P., and Gutiérrez, J. M. Enhancing regional climate downscaling through advances in machine learning. Artificial Intelligence for the Earth Systems, 3(2):230066, 2024b.

Ravuri, S., Lenc, K., Willson, M., Kangin, D., Lam, R., Mirowski, P., Fitzsimons, M., Athanassiadou, M., Kashem, S., Madge, S., et al. Skilful precipitation nowcasting using deep generative models of radar. Nature, 597(7878):672-677, 2021.

Ronneberger, O., Fischer, P., and Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Medical Image Computing and Computer-Assisted Intervention - MICCAI 2015: 18th International Conference, Munich, Germany, October 5-9, 2015, Proceedings, Part III, pp. 234-241. Springer, 2015.

Sharma, S. C. M. and Mitra, A. ResDeepD: A residual super-resolution network for deep downscaling of daily precipitation over India. Environmental Data Science, 1:e19, 2022.

Soares, P. M., Johannsen, F., Lima, D. C., Lemos, G., Bento, V., and Bushenkova, A. High resolution downscaling of CMIP6 earth system and global climate models using deep learning for Iberia. Geoscientific Model Development Discussions, 2023:1-46, 2023.

Sun, L. and Lan, Y. Statistical downscaling of daily temperature and precipitation over China using deep learning neural models: Localization and comparison with other methods. International Journal of Climatology, 41(2):1128-1147, 2021.

Vandal, T., Kodra, E., Ganguly, S., Michaelis, A., Nemani, R., and Ganguly, A. R. DeepSD: Generating high resolution climate change projections through single image super-resolution. In Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 1663-1672, 2017.

Wang, Z., Chen, J., and Hoi, S. C. Deep learning for image super-resolution: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence, 43(10):3365-3387, 2020.