Similarity-based semilocal estimation of post-processing models

Sebastian Lerch and Sándor Baran
German Probability and Statistics Days, Freiburg, March 2018

Probabilistic weather forecasts

Weather forecasts rely on numerical weather prediction (NWP) models that represent the physics and chemistry of the atmosphere through systems of partial differential equations. Major sources of uncertainty remain, including uncertainty about initial conditions and about the physical models themselves. Carefully designed ensembles of NWP model runs seek to quantify this uncertainty. However, despite their undisputed success, ensemble forecasts are subject to biases and lack calibration.

Example: Wind speed forecasts at Frankfurt airport

[Figure: ensemble forecasts and observations of wind speed at Frankfurt airport, 2011-03-20 to 2011-04-19.]

[Figure: ensemble forecasts and observations of wind speed at Frankfurt airport, 2010-10-14 to 2010-11-13.]

Statistical post-processing of ensemble forecasts

▷ Ensemble forecasts typically fail to represent the full model uncertainty and require statistical post-processing.
▷ Basic idea: exploit structure in past forecast–observation pairs to correct systematic errors in the model output by fitting distributional regression models.
▷ Post-processing combines physical weather models and statistical distributional regression modeling.

Non-homogeneous regression (NR) models

Fit a parametric predictive distribution, y | x1, …, xm ∼ F_θ(y | x1, …, xm), with parameters θ ∈ R^d depending on the ensemble forecasts through link functions, θ = g(x1, …, xm). This requires choosing an appropriate

▷ parametric model F_θ and link function g (Sándor Baran's talk)
▷ loss function for parameter estimation (Bernhard Klar's talk)
▷ training set (this talk)

An NR model for wind speed

y | x1, …, xm ∼ F_θ = N_[0,∞)(μ, σ²),
θ = (μ, σ²) = g(x1, …, xm) = (a + b X̄, c + d S²),

where X̄ denotes the ensemble mean and S² the ensemble variance. Parameters a, b, c, d are estimated over a rolling training period by minimizing the CRPS.

Thorarinsdottir, T.L. and Gneiting, T. (2010) Probabilistic forecasts of wind speed: Ensemble model output statistics by using heteroscedastic censored regression. Journal of the Royal Statistical Society Series A, 173, 371–388.
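The CRPS-minimizing fit above can be sketched as follows; this is a simplified illustration on synthetic data that uses a plain Gaussian in place of the zero-truncated normal (whose closed-form CRPS is more involved), with SciPy assumed available:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast for observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

rng = np.random.default_rng(0)
n, m = 300, 52                                        # training cases, ensemble members
truth = rng.gamma(2.0, 2.0, n)                        # synthetic wind speeds
ens = truth[:, None] + rng.normal(1.0, 1.5, (n, m))   # biased, dispersive ensemble
xbar, s2 = ens.mean(axis=1), ens.var(axis=1)

def mean_crps(params):
    """Mean CRPS of the NR model mu = a + b*Xbar, sigma^2 = c + d*S^2."""
    a, b, c, d = params
    mu = a + b * xbar
    sigma = np.sqrt(np.maximum(c + d * s2, 1e-6))     # keep the variance positive
    return crps_normal(mu, sigma, truth).mean()

init = np.array([0.0, 1.0, 1.0, 0.1])
res = minimize(mean_crps, init, method="Nelder-Mead")
a, b, c, d = res.x
```

In practice the optimization is re-run as the training window rolls forward, so the coefficients adapt to the most recent forecast cases.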

Parameter estimation methods

Decisions regarding the spatial composition of the training set have to be made.

▷ regional estimation: data from all available stations are pooled to form a single training set used for all stations
▷ local estimation: only forecast cases from the single observation station of interest are considered

Local estimation accounts for spatial variability and results in better predictions, but requires long training periods. Depending on the data, both options can be undesirable.

GLAMEPS data

▷ Grand Limited Area Model Ensemble Prediction System
▷ short-range multi-model EPS
▷ complex model structure combining different physical models and initialization times
▷ 18 h ahead forecasts of 10 m wind speed at 1738 observation stations in Europe and Northern Africa
▷ data for October–November 2013 and March–May 2014

Locations of observation stations

[Figure: map of the 1738 observation stations across Europe and Northern Africa.]

Structure of GLAMEPS ensemble

▷ 52 members, a combination of subensembles from 4 numerical models, each contributing
  ▷ 1 control forecast
  ▷ 6 perturbed members
  ▷ 6 perturbed members (lagged by 6 h)
▷ 12 groups of members which should have distinct NR coefficients b_i, i.e.,

  μ = a + Σ_{i=1}^{12} b_i X̄_i

  in the TN model
▷ we consider the following variants and simplifications
  ▷ full model: account for all 12 groups, 15 parameters
  ▷ lag-ignoring model: ignore the time-lagging, 11 parameters
  ▷ simplified model: ignore the existence of groups, 4 parameters
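The group-mean predictor used in the full model can be sketched as follows; the member ordering (per model: control, then perturbed, then lagged members) and the coefficient values are illustrative assumptions, not the GLAMEPS layout itself:

```python
import numpy as np

rng = np.random.default_rng(2)
ens = rng.gamma(2.0, 2.0, 52)                  # one forecast case, 52 members

# Assumed ordering: for each of the 4 models, 1 control member,
# 6 perturbed members, and 6 lagged members, giving 12 groups in total.
group_sizes = [1, 6, 6] * 4
bounds = np.cumsum([0] + group_sizes)
group_means = np.array([ens[bounds[i]:bounds[i + 1]].mean() for i in range(12)])

# Illustrative coefficients; in the full model a and the b_i are estimated.
a, b = 0.1, np.full(12, 1.0 / 12)
mu = a + b @ group_means                       # mu = a + sum_i b_i * Xbar_i
```

The lag-ignoring model merges each model's perturbed and lagged members into one group; the simplified model collapses all 12 groups into the overall ensemble mean.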

Challenges in post-processing GLAMEPS forecasts

▷ regional parameter estimation does not account for variability
  ▷ a single set of coefficients is undesirable for the large, heterogeneous ensemble domain
▷ local parameter estimation causes numerical issues
  ▷ lack of training data (cf. Hemri et al. (2014): optimal training period lengths of 365–1816 days)
  ▷ the large number of parameters leads to numerical problems
  ▷ only the simplified model can be estimated successfully
▷ ideal solution: compute re-forecasts of past cases with the current model; however, this is usually impossible in practice due to the large computational cost of generating ensemble forecasts

Hemri, S., Scheuerer, M., Pappenberger, F., Bogner, K. and Haiden, T. (2014) Trends in the predictive performance of raw ensemble weather forecasts. Geophysical Research Letters, 41, 9197–9205.

Similarity-based semi-local parameter estimation

▷ basic idea: augment the training data for a given station with data from stations with similar characteristics
▷ thereby combine the advantages of regional and local estimation
  ▷ locally adaptive, but parsimonious
  ▷ complex models can be estimated without numerical issues
  ▷ improved predictive performance
▷ the choice of similar stations is based on
  ▷ distance functions
  ▷ clustering

Distance-based model estimation

▷ two-step approach
  1. compute pair-wise similarities (distances) d(i, j) between stations
  2. add the corresponding forecast cases from the L most similar stations to the training set of the station of interest
▷ distances are based on characteristics of the stations, the distribution of the observations, and the forecast errors of the ensemble
▷ follows the basic idea of Hamill et al. (2008)

Hamill, T.M., Hagedorn, R. and Whitaker J.S. (2008) Probabilistic Forecast Calibration Using ECMWF and GFS Ensemble Reforecasts. Part II: Precipitation. Monthly Weather Review, 136, 2620–2632.

Example 1: Geographical locations (Hamill et al., 2008)

d^(1)(i, j) = √((X_i − X_j)² + (Y_i − Y_j)²)

[Figure: maps of stations selected as most similar under the Euclidean distance of locations.]

How many similar stations should be added?

[Figure: mean CRPS (Distance 1: geographical locations) as a function of the number of similar stations, for the simplified, lag-ignoring, and full models with n = 25, 50, 80.]

CRPS of GLAMEPS: 1.058; best regional model: 0.955; best local model: 0.790

Example 2: Station climatology + forecast errors

d^(4)(i, j) = (1/|S|) Σ_{x∈S} |F̂_i(x) − F̂_j(x)| + (1/|S′|) Σ_{x∈S′} |Ĝ_i^e(x) − Ĝ_j^e(x)|,

where the first term measures differences in station climatology and the second differences in the forecast errors of the ensemble; F̂_i = empirical CDF of the observations at station i, S = {0, 0.5, …, 14.5, 15}; Ĝ_i^e = empirical CDF of the forecast errors of the ensemble mean, S′ = {−10, −9.5, …, 10}.
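Evaluating this distance from raw station records can be sketched as follows; the observation and forecast-error samples are synthetic stand-ins, and the empirical CDFs are evaluated on the grids S and S′ given above:

```python
import numpy as np

def ecdf(sample, grid):
    """Empirical CDF of `sample` evaluated at each point of `grid`."""
    return (sample[None, :] <= grid[:, None]).mean(axis=1)

def d4(obs_i, obs_j, err_i, err_j):
    """Climatology + forecast-error distance between two stations."""
    S = np.arange(0.0, 15.5, 0.5)     # grid for the observation CDFs
    Sp = np.arange(-10.0, 10.5, 0.5)  # grid for the forecast-error CDFs
    clim = np.abs(ecdf(obs_i, S) - ecdf(obs_j, S)).mean()
    errs = np.abs(ecdf(err_i, Sp) - ecdf(err_j, Sp)).mean()
    return clim + errs

rng = np.random.default_rng(1)
obs_i, obs_j = rng.gamma(2, 2, 500), rng.gamma(2, 2, 500)   # synthetic wind records
err_i, err_j = rng.normal(0, 1, 500), rng.normal(0.5, 1, 500)
dist = d4(obs_i, obs_j, err_i, err_j)
```

Averaging the absolute CDF differences over the grid corresponds to the normalized sums in the definition of d^(4).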













● ● ●●● ●● ● ●



● ●





● ● ● ● ●



● ●●

●●















● ● ● ●

●●





●●●● ● ● ●● ●● ● ●●● ●● ●●● ● ● ●● ● ● ●● ● ●● ● ● ● ●●● ●● ● ● ● ● ● ● ● ●

50



● ● ●

40

●● ● ● ●















●●





●● ●

60



●● ●



●●

● ● ●

50



30



● ●

−20

● ●

● ● ●



● ● ● ● ● ●● ●●

● ●

Longitude



● ●

●● ●● ● ● ●●● ●● ●

● ●● ● ● ● ●● ● ●● ● ●









30 40

● ●



● ●



20

● ● ●●

●●



40



0



● ●

● ●



●●

● ●













● ● ● ● ●

● ●

● ●

● ●● ● ● ● ● ●● ● ● ● ● ●● ●● ● ● ● ●● ●







● ●











● ● ●●● ●●

● ● ● ●●● ●● ● ● ●



● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ●● ● ● ● ●●● ● ● ● ● ● ●● ● ●● ● ● ●●● ● ● ● ● ●● ● ● ● ● ● ●● ●● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ●● ● ●● ● ● ● ● ● ●● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ●● ● ●●● ● ● ● ●●● ● ● ●● ●● ● ● ●● ● ● ●●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ●●● ● ● ● ●● ● ● ●● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ●●●● ● ● ● ● ● ● ● ● ● ●● ● ●● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ●●●● ● ●● ● ●● ●● ●● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ●●● ●● ● ●● ● ● ● ● ● ● ● ● ● ●●●● ● ● ●● ● ● ● ● ●● ● ●● ●●● ● ● ● ● ● ● ● ● ● ● ●● ● ●● ● ●● ● ● ● ●● ● ● ● ●●●●● ●● ●● ● ● ● ● ●●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ●● ●●● ● ●● ● ● ●● ● ●● ● ●●● ●●● ●● ● ● ● ● ● ● ●● ● ● ●● ●● ● ●● ● ● ● ●●● ● ●● ● ● ● ●●● ● ● ● ●● ● ● ● ● ●● ●● ● ● ● ● ● ● ●● ● ●●● ● ● ●● ●● ● ● ● ● ●●● ●●●● ● ●●● ● ● ●●● ● ● ● ● ●● ● ● ● ● ●● ● ● ●● ● ● ●● ●●● ● ● ● ●● ●●● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ●● ● ● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ●●●● ● ● ● ● ● ● ● ● ● ● ●● ●●● ● ● ● ●● ●● ● ● ● ●● ● ● ● ● ● ●● ● ● ●●●●● ●● ●●● ●● ● ● ● ● ● ● ● ● ●● ● ●●●●●● ● ●●● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ●● ●● ● ● ●●●● ●● ●● ● ● ●● ●● ● ● ● ●● ● ● ●● ● ●●● ●● ● ● ●● ● ● ● ● ● ● ●● ● ●● ●● ● ● ● ● ● ● ● ● ● ●● ●● ●● ●● ● ●● ●● ● ●●● ●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ●●● ● ● ● ● ● ●● ●● ● ● ● ● ● ● ● ● ●● ●●● ● ● ●●●● ●●● ●●●●●● ● ●● ● ● ● ● ● ●● ●● ● ●●● ● ● ●● ● ● ● ● ● ●● ● ● ●● ●● ● ● ● ●● ●●● ● ● ●●● ●● ● ●●● ●●●● ● ● ● ● ● ●● ● ● ●● ●● ● ●●●●● ●●● ●● ● ● ● ● ●● ●● ● ●●● ● ● ●● ●● ●● ●●● ● ● ●●●●●●●●● ●● ●●●● ● ● ● ● ● ● ● ● ● ● ●● ●● ●● ●● ● ●● ●●● ●● ●●● ● ● ●● ● ● ●● ● ● ● ● ● ●● ●●● ● ● ●● ● ●● ●●● ● ● ● ● ● ● ● ● ●●● ●● ● ● ● ● ●●● ● ● ● ●●● ● ●● ● ● ●● ●● ● ●● ● ● ● ● ● ●● ● ●● ●● ● ● ● ● ● ●●● ● ● ● ● ●● ● ● ● ● ● ●● ● ● ●● ● ● ● ●●● ●● ● ● ● ● ● ●● ● ● ● ● ● ●● ●●●●● ●● ● 
● ● ● ● ● ● ● ●● ● ● ● ● ● ●●● ● ●● ● ●● ● ● ● ●●● ● ● ● ● ●●● ● ●● ●●● ●● ● ● ● ● ●● ● ● ● ● ●●●● ● ● ● ● ● ●● ● ● ● ●●● ● ●● ●●●●●●● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ●●●● ●● ● ● ● ● ● ●●● ● ●● ● ● ● ●● ●● ●● ● ●● ● ● ●●● ●● ● ● ● ● ● ●● ● ● ●● ●● ● ● ● ● ● ● ● ● ●● ● ●● ● ● ● ● ● ● ●● ●● ● ●● ● ● ● ●● ● ● ● ● ● ●● ●●● ● ● ●● ● ● ● ●● ● ● ●●● ●●● ● ● ●●● ●● ● ● ● ● ● ●● ● ● ● ● ●● ● ●● ● ●● ● ● ●● ● ●● ● ● ●● ● ● ● ● ● ●● ● ● ● ● ● ●●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ●●● ● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ● ●● ● ● ● ●● ● ●●● ● ● ● ● ● ●● ● ●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ●● ● ● ●● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ●● ●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●●● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ●









● ●





60

● ●

70



Latitude

Latitude

● ●●●

● ● ●



● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ●● ● ● ● ●●● ● ● ● ● ● ●● ● ●● ● ● ●●● ● ● ● ● ●● ● ● ● ● ● ●● ●● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ●● ● ●● ● ● ● ● ● ●● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ●● ● ●●● ● ● ● ●●● ● ● ●● ●● ● ● ●● ● ● ●●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ●●● ● ● ● ●● ● ● ●● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ●●●● ● ● ● ● ● ● ● ● ● ●● ● ●● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ●●●● ● ●● ● ●● ●● ●● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ●●● ●● ● ●● ● ● ● ● ● ● ● ● ● ●●●● ● ● ●● ● ● ● ● ●● ● ●● ●●● ● ● ● ● ● ● ● ● ● ● ●● ● ●● ● ●● ● ● ● ●● ● ● ● ●●●●● ●● ●● ● ● ● ● ●●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ●● ●●● ● ●● ● ● ●● ● ●● ● ●●● ●●● ●● ● ● ● ● ● ● ●● ● ● ●● ●● ● ●● ● ● ● ●●● ● ●● ● ● ● ●●● ● ● ● ●● ● ● ● ● ●● ●● ● ● ● ● ● ● ●● ● ●●● ● ● ●● ●● ● ● ● ● ●●● ●●●● ● ●●● ● ● ●●● ● ● ● ● ●● ● ● ● ● ●● ● ● ●● ● ● ●● ●●● ● ● ● ●● ●●● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ●● ● ● ●● ● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ●●●● ● ● ● ● ● ● ● ● ● ● ●● ●●● ● ● ● ●● ●● ● ● ● ●● ● ● ● ● ● ●● ● ● ●●●●● ●● ●●● ●● ● ● ● ● ● ● ● ● ●● ● ●●●●●● ● ●●● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ●● ●● ● ● ●●●● ●● ●● ● ● ●● ●● ● ● ● ●● ● ● ●● ● ●●● ●● ● ● ●● ● ● ● ● ● ● ●● ● ●● ●● ● ● ● ● ● ● ● ● ● ●● ●● ●● ●● ● ●● ●● ● ●●● ●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ●●● ● ● ● ● ● ●● ●● ● ● ● ● ● ● ● ● ●● ●●● ● ● ●●●● ●●● ●●●●●● ● ●● ● ● ● ● ● ●● ●● ● ●●● ● ● ●● ● ● ● ● ● ●● ● ● ●● ●● ● ● ● ●● ●●● ● ● ●●● ●● ● ●●● ●●●● ● ● ● ● ● ●● ● ● ●● ●● ● ●●●●● ●●● ●● ● ● ● ● ●● ●● ● ●●● ● ● ●● ●● ●● ●●● ● ● ●●●●●●●●● ●● ●●●● ● ● ● ● ● ● ● ● ● ● ●● ●● ●● ●● ● ●● ●●● ●● ●●● ● ● ●● ● ● ●● ● ● ● ● ● ●● ●●● ● ● ●● ● ●● ●●● ● ● ● ● ● ● ● ● ●●● ●● ● ● ● ● ●●● ● ● ● ●●● ● ●● ● ● ●● ●● ● ●● ● ● ● ● ● ●● ● ●● ●● ● ● ● ● ● ●●● ● ● ● ● ●● ● ● ● ● ● ●● ● ● ●● ● ● ● ●●● ●● ● ● ● ● ● ●● ● ● ● ● ● ●● ●●●●● ●● ● 
● ● ● ● ● ● ● ●● ● ● ● ● ● ●●● ● ●● ● ●● ● ● ● ●●● ● ● ● ● ●●● ● ●● ●●● ●● ● ● ● ● ●● ● ● ● ● ●●●● ● ● ● ● ● ●● ● ● ● ●●● ● ●● ●●●●●●● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ●● ● ●●●● ●● ● ● ● ● ● ●●● ● ●● ● ● ● ●● ●● ●● ● ●● ● ● ●●● ●● ● ● ● ● ● ●● ● ● ●● ●● ● ● ● ● ● ● ● ● ●● ● ●● ● ● ● ● ● ● ●● ●● ● ●● ● ● ● ●● ● ● ● ● ● ●● ●●● ● ● ●● ● ● ● ●● ● ● ●●● ●●● ● ● ●●● ●● ● ● ● ● ● ●● ● ● ● ● ●● ● ●● ● ●● ● ● ●● ● ●● ● ● ●● ● ● ● ● ● ●● ● ● ● ● ● ●●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ●●● ● ● ● ● ● ● ● ● ● ● ● ● ● ●● ● ● ●● ● ● ● ●● ● ●●● ● ● ● ● ● ●● ● ●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ●● ● ● ● ● ● ● ●● ● ● ● ●● ● ● ●● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●● ● ● ●● ●● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ●●● ● ●● ● ● ● ● ● ● ● ●● ● ● ● ● ● ● ● ● ● ● ● ● ● ●







● ● ● ● ● ●

●●

● ●

70

60

−20





0

20

Longitude

40

60

How many similar stations should be added?

[Figure: mean CRPS (Distance 4: climatology + forecast errors) as a function of the number of similar stations, for the simplified, lag-ignoring, and full models with n = 25, 50, 80.]

CRPS of GLAMEPS: 1.058; best regional model: 0.955; best local model: 0.790 (gray dashed line)

Clustering-based model estimation

▷ application of k-means clustering
  1. determine clusters based on various feature sets
  2. estimate parameters separately for each cluster, using only data from stations within the given cluster
▷ feature sets for clustering are based on
  ▷ the distribution of the observations
  ▷ the distribution of the forecast errors of the ensemble
  ▷ combinations thereof
▷ computationally much more efficient than distance-based semi-local model estimation
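The clustering step can be sketched as follows; a minimal plain-NumPy k-means on synthetic station records, with a 24-dimensional quantile feature set mimicking the combined climatology and forecast-error features:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: returns a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

rng = np.random.default_rng(3)
# Feature vector per station: equidistant quantiles of its observations and of
# the ensemble-mean forecast errors (12 + 12 = 24 features).
qs = np.linspace(0.05, 0.95, 12)
obs = rng.gamma(2, 2, (200, 365))      # 200 synthetic stations, one year of data
err = rng.normal(0, 1, (200, 365))
features = np.hstack([np.quantile(obs, qs, axis=1).T,
                      np.quantile(err, qs, axis=1).T])
labels = kmeans(features, k=30)
```

The NR coefficients are then estimated once per cluster, pooling the forecast cases of all stations with the same label.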

Example: Station climatology + forecast errors

Feature set given by equidistant quantiles of the distribution of observations + equidistant quantiles of the distribution of forecast errors of the ensemble mean.

[Figure: map of the resulting station clusters.]

How many clusters?

[Figure: mean CRPS (Feature set 3: climatology + forecast errors) as a function of the number of clusters, for the simplified, lag-ignoring, and full models with n = 25, 50, 80.]

All models estimated with a fixed number of 24 features.

Computational aspects & summary

▷ semi-local models outperform local and regional models; mean CRPS:
  distance-based semi-local < clustering-based semi-local < local < regional
▷ computational costs:
  regional < clustering-based semi-local < local < distance-based semi-local
▷ semi-local approaches allow for the estimation of complex models without numerical stability issues and are straightforward to implement

Outlook

▷ extension towards multivariate model estimation utilizing spatio-temporal dependencies (Annette Möller's talk)
▷ comparison and combination with methods based on historical analogs
▷ investigation of alternative, meteorologically meaningful similarity measures

Lerch, S. and Baran, S. (2017) Similarity-based semilocal estimation of post-processing models, Journal of the Royal Statistical Society Series C, 66, 29–51


Thank you for your attention.
