
ASHRAE and IBPSA-USA SimBuild 2016

Building Performance Modeling Conference


Salt Lake City, UT
August 8-12, 2016

INCORPORATING CLIMATE CHANGE PREDICTIONS IN THE ANALYSIS OF WEATHER-BASED UNCERTAINTY

Parag Rastogi and Marilyne Andersen


Interdisciplinary Laboratory of Performance-Integrated Design (LIPID)
École Polytechnique Fédérale de Lausanne (EPFL)
Lausanne, Switzerland
parag.rastogi@epfl.ch

ABSTRACT

This paper proposes randomly-generated synthetic weather time series incorporating climate change forecasts to quantify the variation in energy simulation results due to weather inputs, i.e., a Monte Carlo analysis for uncertainty and sensitivity quantification. The method is based on a small sample (e.g., a typical year) and can generate any number of years rapidly. Our work builds on previous work that has raised the need for viable complements to the currently-standard typical or reference years for simulation, and which identified the chief components of weather time series. While we make no special effort to reproduce either extreme or average temperatures, the sheer number of draws ensures that both are seen with the same or a higher probability than in recent recorded data.

INTRODUCTION

The analysis of uncertainty in building simulation is related to the need for risk-conscious design. Existing studies have largely focused on analysing the effect of uncertainty in material inputs (a kind of epistemic uncertainty) and of variations due to occupant behaviour (a type of aleatory uncertainty). A few have focused on examining the impacts of climate change and the effect of the uncertainties inherent in simulation based on 'future weather' (e.g., Belcher et al. 2005; Chinazzo et al. 2015a; Crawley 2008; Jentsch et al. 2008; Kershaw et al. 2011; Wilde et al. 2008). These uncertainties in the weather input arise from modelling assumptions (simplifications of physical phenomena, or the omission of phenomena that are not well understood), incomplete records (used to calibrate climate models), and 'downscaling' (where global circulation models have to be 'scaled' down to a region of interest), among other sources. Kershaw et al. (2011) argue that using a single typical or reference file is conceptually and computationally far simpler than working with several files, each of which has some probability of occurring. They point out that while the original advantage of reducing simulation time by using smaller weather files should now be irrelevant, the increasing complexity of building simulation codes has negated much of the gain in computational speed. In any case, typical files, of any sort, cannot be used to assess risk.

To simulate the future performance of a building, i.e., to obtain an explicit estimate of some performance parameter conditional on physically viable future projections, a 'future weather file' is needed. Belcher et al. (2005) proposed 'morphing', a simple solution that can be easily implemented in the context of building simulation, since it only requires one of three operations: addition (shifting), multiplication (linear stretching), or a combination of the two (shift and stretch). Shifting is applied to variables for which an absolute change of mean is available in climate change forecasts. Stretching works when the change to the mean or variance is given as a fractional change. Finally, the combination is used when both the mean and variance of a variable need to be changed, for example when the forecast includes a change of minimum and maximum temperatures in addition to a change of mean temperature. Belcher et al. (ibid.) demonstrated their method for three cities in the United Kingdom, showing the agreement of future heating degree day values calculated using their 'morphed' Test Reference Year (TRY) and Design Summer Year (DSY) files with those calculated directly from the UKCIP02 report itself (the forecasts on which the morphed files were based).
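The three morphing operations can be summarised in a few lines of code. The sketch below is illustrative only: it is not the authors' implementation, and the change factors are placeholders rather than values from any forecast.

```python
# Illustrative sketch of the three morphing operations of Belcher et al.
# (2005). The change factors used here are placeholders.
import numpy as np

def shift(x, delta):
    """Addition: apply an absolute change of the mean."""
    return x + delta

def stretch(x, alpha):
    """Multiplication: apply a fractional change of the mean or variance."""
    return x * (1.0 + alpha)

def shift_and_stretch(x, delta, alpha):
    """Combination: shift the mean and stretch the variation about it,
    e.g., for dry bulb temperature when min/max changes are also given."""
    return x + delta + alpha * (x - x.mean())

# Example: morph one month of hourly temperatures with assumed factors.
tdb_month = 10.0 + 5.0 * np.random.randn(24 * 30)            # placeholder data
tdb_future = shift_and_stretch(tdb_month, delta=1.8, alpha=0.05)
```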
Kershaw et al. (2011) used the latest future weather generator from the UK, the UKCP09, which is based on a future rainfall generator model. The baseline climate, as for most generators and projections, is 1961-1990. Upon calibration, "change factors [were] applied to [recorded data to] generate the future precipitation". All "...other variable[s were] created using mathematical and statistical relationships with daily precipitation and the previous day's weather" (ibid.). The UKCP09 generator outputs 100 runs of 30 years each, from which the authors constructed 100 reference years. Eames et al. (2011) used the percentiles of monthly mean Dry Bulb Temperature (TDB) to create reference years tied to certain percentiles, i.e., the median January "...combined with the median February, March, etc."
One question that arises in the creation of any synthetic data is its advantage over recorded data. If long-term, high-quality data is available for some location, is there any point in using synthetic data? As Kershaw et al. (2011) point out, the utility of recent records in predicting future return periods (i.e., probabilities of weather events of interest) is limited by the length of the record. For example, if a 100-year event (one that, over a long enough record, occurs roughly 1% of the time) happened thrice in the last 10 years, does that make it a 3-year event or not? While the return period obtained from any weather generator is speculative, it does at least provide bounds on a system's response. It is then the decision-makers who must choose the probability for which they would like to design. For example, HVAC system failure may be acceptable for some value of outdoor temperature, or for an episode of some intensity, that has a very low probability of occurrence. Kershaw et al. (ibid.) warn that using the UKCP09 weather generator to assign return periods should be done with "extreme care", and that "...return periods longer than 5-years should be used with caution".

A long record does enable a sensitivity analysis, but one is still hostage to the vagaries of the weather when using it. That is to say, there are several possible future conditions that may not have occurred in the recent past. There are no guarantees about what conditions may prevail in the future based on knowledge of past conditions. As far as we are aware, the temperatures of future years do not have to follow some well-defined mathematical relationship with temperatures from previous years, or even some well-defined periodic relation. The intention of our work on the creation of synthetic weather data is not to predict future weather. Incorporating stochasticity does not automatically improve the predictive power of simulation for a specific time in the future. Rather, we expand the role of simulation in exploring design options by broadening the test conditions.
This paper begins with an explanation of the method used to construct future weather time series. We then present some descriptive statistics about the generated series. Finally, the results of simulating a single family home with these 'future time series' are compared to simulation with recorded data from the last two decades. This building model has been previously described in Chinazzo (2014).

Figure 1: The single family home simulated as an example, details of which are in Chinazzo (2014).

METHOD

The work presented in this paper builds on previous work by the authors (Rastogi and Andersen 2015). We begin with a brief overview of that work, avoiding repetition as far as possible. The generation of synthetic future weather data presented here relies on two major steps: time series decomposition and resampling (we use the terms resampling and bootstrapping interchangeably in our work; see Davison and Hinkley (1997) and Politis (1998) for a refresher on the theory, or Rastogi (2016) for a summary). Both steps are explained in detail in our previous work, and summarised here in fig. 2. This paper uses the same terminology as previous work, as does fig. 2.
Figure 2: Generating synthetic weather time series from typical data and future forecasts of daily mean values. [Flowchart: the typical time series TSo is split by Fourier fitting into a periodic component (µt + ζt) and a non-periodic component; a SARMA model Ψ(L) is fitted to the non-periodic component and its residuals rt are block-bootstrapped into resampled residuals r̂t; the SARMA model is simulated with r̂t to produce synthetic noise ε̂t; µt is replaced with a Fourier fit to the future daily values TSf, and the parts are recombined into the synthetic time series TSsyn.]

Previous Work

Several publications, listed in Rastogi (2016) and Rastogi and Andersen (2015), have shown that temperature, solar radiation, and humidity can be divided into periodic and aperiodic components. That is, if the periodic part of the original time series is removed (by subtracting Fourier series with appropriate periods, µt and ζt, for example), the remainder is aperiodic noise (εt). These two parts are shown in fig. 3 for Geneva. This noise is not entirely free from structure, however, and is generally well described by low-order Seasonal Auto-Regressive Moving Average (SARMA) models (ψ(L)). Upon fitting these low-order models, the residual is near-white noise. This residual or remainder term (rt) is reshuffled, in 3-day blocks separated by month, to create new 'resampled' residuals (r̂t). These resampled series are input as the noise component when simulating the fitted SARMA model, creating new synthetic aperiodic noise components ε̂t.
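As a rough illustration of the resampling step just described, the sketch below reshuffles hourly residuals in 3-day (72-hour) blocks within each calendar month. It is a minimal sketch under assumed array names and shapes, not the authors' code; simulating the fitted SARMA model with these resampled residuals as the driving noise then yields a new synthetic aperiodic component ε̂t.

```python
# Minimal sketch of the 3-day block bootstrap of SARMA residuals.
# 'residuals' is a 1-D array of hourly values r_t; 'months' gives the
# calendar month (1-12) of each hour. Names and shapes are assumptions.
import numpy as np

def block_bootstrap(residuals, months, block_hours=72, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    resampled = np.empty_like(residuals)
    for m in range(1, 13):
        idx = np.flatnonzero(months == m)            # hours in this month
        # Consecutive 72-hour blocks of this month's residuals.
        blocks = [residuals[idx[i:i + block_hours]]
                  for i in range(0, len(idx), block_hours)]
        pos = 0
        while pos < len(idx):                        # refill the month by
            chunk = blocks[rng.integers(len(blocks))]    # drawing blocks
            take = min(len(chunk), len(idx) - pos)       # with replacement
            resampled[idx[pos:pos + take]] = chunk[:take]
            pos += take
    return resampled                                 # the resampled residuals
```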

A SARMA model is a combination of seasonal and non-seasonal Auto-Regressive (AR) and Moving Average (MA) terms. The idea is that the value of a time series at a certain point in time is predicted by a polynomial composed of two parts: an AR part and an MA part. The AR part is a regression of the time series on itself, i.e., on its own values in the past. The MA part is averaged white noise, a weighted average of a finite number of white noise draws preceding the current time step. The seasonal terms include every Qth past value, e.g., every 24 hours back from the present. The non-seasonal terms refer to the last p terms, e.g., 1-4 hours ago (generally p for AR and q for MA). The coefficients of the polynomial are estimated using maximum likelihood estimation to minimise the residuals, rt ∼ N(0, σ). The details can be found in a time series analysis book like Cryer and Chan (2008), or in the documentation of the software we used: MATLAB (arima, estimate, infer) and the forecast package in R (Rob J. Hyndman and R Core Team 2015).
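As a concrete illustration, the following sketch fits a low-order SARMA model to an aperiodic component and extracts its residuals. It uses Python's statsmodels as a stand-in for the MATLAB and R tools named above; the model orders and the placeholder input series are assumptions, not the authors' choices.

```python
# Sketch: fit a low-order SARMA model to the aperiodic component eps_t
# and recover the near-white residuals r_t. Orders (1,0,1)x(1,0,1,24)
# and the random placeholder series are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

eps_t = np.random.default_rng(42).standard_normal(8760)   # placeholder for the noise

model = sm.tsa.SARIMAX(eps_t,
                       order=(1, 0, 1),               # non-seasonal AR and MA
                       seasonal_order=(1, 0, 1, 24))  # seasonal terms at lag 24 h
fit = model.fit(disp=False)                           # maximum likelihood estimation
r_t = fit.resid                                       # residuals, r_t ~ N(0, sigma)

# A new aperiodic series can then be simulated from the fitted model; in
# the authors' method the driving noise is the bootstrapped residuals
# rather than the Gaussian draws used by default here.
eps_hat = fit.simulate(len(eps_t))
```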
Figure 3: The periodic part of the TDB series for Geneva (line) overlaid on the raw hourly values (dots). Three pairs of Fourier terms are used to create the periodic signal, with periods of 8760 hours, 4380 hours, and 24 hours.

Periodic Signal

Previous work to create future weather files focussed on a 'fixed' addition of some forecast to current data (morphing) or on the creation of future data through relationships with a single forecast variable (e.g., rainfall for the UKCP09 projections). The most popular approach, morphing, is limited to producing "...a future weather pattern...that is largely analogous to the present-day weather in terms of diurnal cycles and extremes" (Jentsch et al. 2008). In our work, the combination of a random SARMA model and climate change forecasts creates ensembles of future time series, each of which is unique. As with any data-based method, we cannot actually account for the changed physics of the atmosphere; that is what the Global Climate Model (GCM)-based forecasts are meant to simulate. The stochastic models added on to the low-resolution future series create variation around this forecast, generating (bootstrapped) confidence intervals.

The periodic parts of the meteorological time series being considered, TDB and Relative Humidity (RH), are generally composed of a low-frequency signal and a high-frequency signal. The temperature series needs three Fourier pairs: one for annual seasonal variability, i.e., a pair of terms with a period of 8760 hours; one for diurnal variability, i.e., a pair of terms with a period of 24 hours; and an additional pair with a period of half a year, or 4380 hours, to shift the peak slightly to the right of centre, towards August. Humidity shows no appreciable diurnal variation, so it reduces to aperiodic noise with the removal of just an annual signal. We decided not to de-trend and simulate the global horizontal irradiation series separately, since doing so creates additional artefacts that are difficult to remove without extensive manual post-processing.
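The periodic fit just described amounts to ordinary least squares on a small Fourier design matrix. Below is a minimal sketch, assuming one non-leap year of hourly TDB values; it is an illustration rather than the authors' implementation.

```python
# Sketch of the periodic fit: a mean plus three Fourier pairs with
# periods of 8760 h, 4380 h, and 24 h, fitted by least squares.
import numpy as np

def fourier_design(n_hours, periods=(8760.0, 4380.0, 24.0)):
    t = np.arange(n_hours)
    cols = [np.ones(n_hours)]
    for p in periods:
        cols.append(np.sin(2 * np.pi * t / p))
        cols.append(np.cos(2 * np.pi * t / p))
    return np.column_stack(cols)

def split_periodic(tdb_hourly):
    X = fourier_design(len(tdb_hourly))
    coeffs, *_ = np.linalg.lstsq(X, tdb_hourly, rcond=None)
    periodic = X @ coeffs                 # the periodic signal (annual + diurnal)
    aperiodic = tdb_hourly - periodic     # the noise passed to the SARMA step
    return periodic, aperiodic, coeffs
```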
The climate change forecasts available to us were daily mean values for this century (up to 2100). Two Representative Concentration Pathways (RCPs) were explored in our study, RCP 4.5 and RCP 8.5, details of which can be found in IPCC (2014, pg. 8). The first corresponds to an intermediate emissions scenario, while the latter corresponds to one with very high Green House Gas (GHG) emissions. These RCPs are simulated using GCMs, which are downscaled by meteorological agencies for their regions of interest. We had access to the Regional Climate Models (RCMs) for Europe through the CORDEX project website (World Climate Research Programme 2015). There are several GCM model runs available on the CORDEX website forecasting each emissions scenario for Europe, all of which can be considered equivalent. That is to say, there is no claim that any one model is more accurate or likely than another.

Incorporating Forecasts

The process begins with the selection of forecast daily values from one of the GCM/RCM model runs for either RCP 4.5 or RCP 8.5. One may pick any one of these model combinations to create a 'string' (or ensembles of strings) of 85 years (2015-2100), or use an average. For this study, only one GCM/RCM model combination of daily values is used for demonstration. Each string corresponds to one of the two RCPs, since each RCP separately represents a possible future outcome under a consistent set of assumptions.
These future daily values are used to replace the low-frequency Fourier series µt. Conceptually, a Fourier fit with a period of 365 days fitted to daily values is identical to one with a period of 8760 hours fitted to hourly values. Thus, we can insert this 'future' low-frequency signal in place of the 'present' low-frequency fit when reassembling a complete future time series. Referring back to fig. 2, instead of putting back the original µt to get the plain TSsyn, we put in a different µ̂t that represents the future daily mean values. If present, the daily term ζt remains unchanged. Finally, adding the simulated noise values (ε̂t) creates any number of variants (weather years) for a given combination of a future series (µ̂t) and the (unchanged) daily signal ζt, for a particular RCP. This procedure is used for TDB and RH, while the solar terms, Global Horizontal Irradiation (GHI), Diffuse Horizontal Irradiation (DHI), and Direct Normal Irradiation (DNI), are created using a nearest-neighbour bootstrap described below.
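In code, the reassembly step reduces to a sum of the three components. The sketch below is a simplified illustration under assumed inputs; in particular, repeating each forecast daily mean 24 times stands in for evaluating the low-frequency Fourier fit at hourly resolution.

```python
# Sketch of reassembling one synthetic future year:
#   TDB_future(t) = mu_hat_t (future low-frequency signal)
#                 + zeta_t   (unchanged diurnal Fourier term)
#                 + eps_hat_t (simulated SARMA noise)
# Inputs are illustrative: mu_hat_daily holds 365 daily means, while
# zeta_t and eps_hat_t are hourly arrays of length 8760.
import numpy as np

def reassemble_future_year(mu_hat_daily, zeta_t, eps_hat_t):
    mu_hat_t = np.repeat(mu_hat_daily, 24)[:len(zeta_t)]  # daily -> hourly
    return mu_hat_t + zeta_t + eps_hat_t
```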
As we mentioned in previous work on creating 'plain' synthetic files (i.e., files without climate change forecasts), these synthetic 'future' time series have to be cleaned (censored) due to the nature of the generation process. For example, the SARMA simulation added to the periodic signal may create a final value of 70°C for TDB or 0.5 for RH, because the procedure does not 'know' that these values are invalid. Even if these physically invalid values are removed, there are still 'outliers' seen in some series upon visual inspection. The definition of an outlier is a complicated matter, so we use historical data as a guide. For example, if a certain hourly change in temperature is seen in the source Typical Meteorological Year (TMY) file, then we assume that it is possible, meaning that the raw TDB values that caused this change need not be censored. There are several different techniques to remove outliers; we used a method based on standardised z-scores,

    zi = (xi − x̄) / s,    (1)

where x̄ is the sample mean and s is the sample standard deviation. By itself, this score does not indicate that a particular data point is an outlier. Rather, an arbitrary cut-off point must be decided. The advantage of using z-scores is that, instead of imposing arbitrary limits on the raw values of a parameter, which are highly climate- or context-dependent, it is possible to use standardised values and cut-offs. This helps to maintain consistency across climates and parameters. In our case, we found that choosing the larger of the 99.9 and 0.1 percentiles is sufficiently conservative to remove outrageous values (like 100°C or -100°C) but not so conservative as to remove extremes. The time series is censored for both high and low values. This is obviously an arbitrary choice, and the generation of weather files is only moderately affected if this cleaning is not carried out. We looked at various cut-off values and could not arrive at a conclusively universal one, because we do not take a position on which extreme is too extreme. We expect that visual inspection or expert opinion is as good as hard-coded checks in the generator. Most building simulation programs have their own cut-offs for valid values, but since we use daily mean TDB in subsequent steps, censored values are easier to work with. We censor both the raw hourly values and the first difference of the series (the hourly changes in values).
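A simplified sketch of this censoring step is given below. The paper specifies the standardised scores and the 0.1/99.9 percentile cut-offs but not the exact treatment of flagged values, so clipping them back to the cut-off (and the recombination of censored steps) is an assumption made here for illustration.

```python
# Sketch of censoring a synthetic TDB series using z-scores (eq. 1) and
# percentile cut-offs, applied to both raw values and first differences.
# Clipping flagged values to the cut-off is an illustrative assumption.
import numpy as np

def censor(series, lo_pct=0.1, hi_pct=99.9):
    mean, std = series.mean(), series.std(ddof=1)
    z = (series - mean) / std                        # eq. (1)
    z_lo, z_hi = np.percentile(z, [lo_pct, hi_pct])
    return np.clip(z, z_lo, z_hi) * std + mean

def censor_raw_and_diff(tdb_hourly):
    tdb = censor(tdb_hourly)                         # censor raw hourly values
    steps = censor(np.diff(tdb))                     # censor hourly changes
    return np.concatenate(([tdb[0]], tdb[0] + np.cumsum(steps)))
```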
GHI, DNI, and DHI are treated differently from the other variables, since no attempt was made to fit and remove a periodic component from any solar time series. As these three series are dealt with in tandem, we will discuss only the production of a synthetic time series for GHI. Instead of fitting models to create synthetic hourly values for solar radiation, we decided to resample from the values available in the TMY file. There is a reasonably strong correlation between the daily sum of GHI and the daily mean of TDB, as evidenced by values of 0.7-0.75 for Pearson's (linear) correlation coefficient (r) and Spearman's rank correlation coefficient (ρ) in most climates. This should not be over-interpreted to mean that daily mean TDB is necessarily well described by a linear function of the daily sum of GHI. Rather, it is an indication that, in addition to the effect of the season (which sets the 'band' of temperatures within which most values in a month will lie, and the hours of radiation in a day), high solar irradiation during the day will generally coincide with higher mean temperatures. In this case, we are merely exploiting this correlation to find valid day-long series of solar radiation. Belcher et al. (2005) point out that there do not seem to be any mechanisms in climate change models causing massive shifts in the amount of solar radiation delivered day by day. What might change for some climates is the number of cloudy or partially cloudy days, leading to a change in the quantum of solar radiation received over a long enough period, like a year. Since the length of day and the maximum values of GHI are related to latitude, altitude, and the solar constant, none of which are affected by the atmospheric concentration of GHGs, the authors propose that it is valid to use past or typical data as the source of future data.

The process of selecting future 'solar days' is split by month, since the length of the day depends on the time of year. For each day in a given 'future' month, we calculate the daily mean TDB. Then, we locate the k days in the TMY file, in the same month, that have the closest daily mean temperatures to the future daily mean temperature being considered. Of these k days, one is chosen at random for its solar profile (i.e., its hourly solar data), which becomes the hourly data for the future day. In this way, hourly temperature values (represented by their daily means) that have already occurred with certain hourly solar values occur in the future files as well. Some noise is introduced in the process by first locating the k nearest neighbouring days to a future day, in the same month, and then choosing one randomly. In our work, we used k = 10 nearest neighbours. While the forecasts available to us do include future daily mean GHI values, we chose not to work with these, since their distribution was not very different from the daily mean GHI values seen in the source TMY file. Basing the selection of future hourly values purely on mean GHI forecasts (i.e., by locating the same value in the TMY file) could break the cross-correlation between TDB and the solar time series.
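The selection of solar days can be sketched as a simple nearest-neighbour lookup. Array names and shapes below are illustrative assumptions, not part of the authors' code.

```python
# Sketch of the nearest-neighbour selection of 'solar days': for each
# future day, find the k TMY days in the same month with the closest
# daily mean TDB, pick one at random, and copy its hourly GHI profile.
import numpy as np

def pick_solar_days(future_daily_tdb, future_months,
                    tmy_daily_tdb, tmy_months, tmy_ghi_hourly,
                    k=10, rng=None):
    """tmy_ghi_hourly: array of shape (365, 24), hourly GHI per TMY day.
    Returns an array of shape (n_future_days, 24)."""
    rng = np.random.default_rng() if rng is None else rng
    out = np.empty((len(future_daily_tdb), 24))
    for i, (t_mean, month) in enumerate(zip(future_daily_tdb, future_months)):
        candidates = np.flatnonzero(tmy_months == month)
        dist = np.abs(tmy_daily_tdb[candidates] - t_mean)
        nearest = candidates[np.argsort(dist)[:k]]     # k closest TMY days
        out[i] = tmy_ghi_hourly[rng.choice(nearest)]   # copy its solar profile
    return out
```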
Variants

Our method creates any number of variants for a given year by simulating the SARMA model with bootstrapped residuals, as described in our previous work (Rastogi and Andersen 2015). The final values are a combination of the synthetic residuals ε̂t, the unchanged daily Fourier term ζt, and the future forecasts µ̂t, which are equivalent to the low-frequency Fourier term µt. Thus, each 'string' of 85 years is based on one GCM/RCM model and a set of simulated residuals. The individual variants for each year, of which there can be any number, form an ensemble representing the possible weather that may occur in the future. They are meant to be used together, not individually, since the authors do not claim that any one variant is more likely than another. The nature of Monte Carlo simulation dictates that a small sample, or worse a single sample, is almost certainly not representative of the phenomena being simulated. Similarly, the future time series should be interpreted loosely: neither climate change forecasts nor our methods are precise enough to predict a specific value in the future. A simple rule of thumb we propose is that the years of each decade should be treated as interchangeable.
RESULTS & DISCUSSION

Raw Weather

Figure 4 shows the empirical Cumulative Distribution Functions (eCDFs) of synthetic TDB and RH, alongside recorded values and TMY. The distributions are virtually identical.

Figure 4: The eCDF of hourly TDB [top] and RH [bottom] values for TMY, recorded, and future data.
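For readers unfamiliar with the comparison in fig. 4, an eCDF is straightforward to compute; the short sketch below evaluates two eCDFs on a common temperature grid (variable names are illustrative).

```python
# Sketch: empirical CDFs of hourly TDB from two sources, evaluated on a
# common grid so that the curves can be overlaid as in fig. 4.
import numpy as np

def ecdf(values, grid):
    values = np.sort(np.asarray(values))
    return np.searchsorted(values, grid, side="right") / len(values)

grid = np.linspace(-20.0, 40.0, 200)      # °C, roughly the range in fig. 4
# e.g., compare ecdf(tdb_synthetic, grid) against ecdf(tdb_recorded, grid)
```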
The monthly extents of the values seen in the synthetic and recorded data are given in fig. 6. These figures, and the percentiles given in table 1, show that the extreme temperatures and humidity values seen in the past 30-odd years of recorded data are well reproduced (and exceeded) in the synthetic series.

Table 1: ASHRAE design temperature percentiles for Geneva (50-sample run, values in °C). All TMY values are taken from the header of the TMY file, except for the 98th percentile, which was calculated and so represents the 98th percentile of the 'mean' signal.

Perc. (%)   Rec.     TMY      Syn.     RCP4.5   RCP8.5
99.6        31.13    30.05    30.80    32.56    34.43
99.0        29.21    28.33    29.00    30.24    31.93
98.0        27.35    26.80    27.20    28.00    29.56
50.0        10.10    10.00    10.41     9.66    10.53
 2.0        -2.77    -3.70    -1.90    -4.85    -4.09
 1.0        -4.04    -5.00    -4.80    -6.53    -5.80
 0.4        -5.63    -7.20    -6.90    -8.56    -7.82
Looking at the monthly extents (table 1 and fig. 6), it seems that both the recorded data (1984-2014) and the climate change forecast-based files are slightly warmer than the design temperatures, i.e., the lower extremes are less extreme and the higher extremes are worse. The fact that the recorded data contains warmer extremes is no surprise when one considers that the TMY file for Geneva is composed of months from the 1980s and 1990s, whereas the 2000s have broken several high temperature records. Depending on the source of the typical weather files for different locations, the 'baseline' or source years may be even older. However, the absence of winter extremes should not be taken as a given: climate change forecasts do not simply 'shift' the existing weather data upwards. The occurrence and intensity of extreme events, e.g., very low or very high temperatures, is unpredictable under anthropogenic climate change, since past records are less representative of the future. On the other hand, the 'average' temperature is marching inexorably upwards.

Figure 6: The extents of hourly TDB [top] and RH [bottom] values, by month, for TMY (dotted line), recorded (solid), and future (dashed) data. The upper lines represent monthly maxima (99th percentile), the lower lines monthly minima (1st percentile), and the lines in the middle monthly means. The synthetic data extremes are appreciably higher, but the probability of those extremes is still as low as in the recorded data.

The broad agreement between new data and our synthetic climate files suggests that the synthetic approach is usable for simulating diverse future conditions. The disagreement of the synthetic data with the TMY data is also a plus, as explained above. We should point out, however, that the 'plain' synthetic files (i.e., ones that did not incorporate forecasts) also did a good job of producing extreme values (as reported in our previous publication). Our preliminary conclusion is that the appearance of extremes depends more strongly on the SARMA simulation, so it occurs with or without the inclusion of climate change forecasts.
Simulation Results

The four different kinds of weather input files shown in fig. 7 are: recorded files, which include typical files from the United States Department of Energy (USDOE) website and the METEONORM (MN) software; plain synthetic files, which do not include climate change forecasts; and synthetic files incorporating projections from the two RCPs under consideration. The simulation results show very similar distributions. That is to say, the extents of the spread, and its shape, are roughly equal for the various kinds of files. The RCP4.5 files have a narrower distribution because fewer of them were simulated. The RCP8.5 values show the largest extents. In general, the synthetic files (both plain and with forecasts) show extremes comparable to or larger than those in the recorded data. This is an important result: the synthetic files reproduce extremes near the ones seen in the past 30-odd years, with nearly the same probability, and extend them a little further. We are confident that larger samples of synthetic files (plain and future) will show extremes of longer return periods (i.e., lower probability).

We expect that the annual sum of heating or cooling energy use should be more sensitive to shifts in overall temperatures than to the occurrence of intense events, so the significant overlap between the plain and future synthetic files is somewhat surprising. We expected that the addition of an upward signal would change the overall energy use appreciably. Looking at fig. 8, we see why this is not apparent in fig. 7. The range of values possible in the future, i.e., the spread due to different weather possibilities in the same climate, is so large as to drown out the gradual shift seen year by year. So, while the prediction is for a gradual warming of the climate, the uncertainty in future values makes forecasting noticeable reductions in heating (or increases in cooling) very inaccurate. In upcoming work, the authors are analysing other metrics, such as peak demand and overheating, to assess whether useful predictions can be made for those, for example the frequency of future extreme events described in Kershaw et al. (2011).
Figure 7: Histograms of EUI for heating [top] and cooling [bottom] [kWh/m2]. The distributions of the four different kinds of weather files are nearly identical.

Figure 8: The EUI [kWh/m2] plotted by year, heating [top] and cooling [bottom]. The line at 2015 represents the extent of results from plain synthetic files. Cumulative distributions of annual energy use values in the next few decades, 2010-2100, are plotted below each yearly plot.
CONCLUSION

In this paper, we have explained our method for incorporating climate change forecasts into an overall schema for generating synthetic weather files. These files are primarily meant to enable the exploration of what-if scenarios, vis-à-vis weather, to obtain a range of possible outcomes (e.g., the range of annual cooling energy used). So, while it is instructive to compare the synthetic data to recent recorded data, the generation process is also meant to create values that have not been seen before. The point of this exercise is not to predict the weather at a given point of time in the future, since that is beyond the ken of contemporary climate models. Instead, we are looking to provide a sufficient variety of physically-valid weather conditions based on GCM model outputs. Upon simulation, these conditions generate a statistically valid sample of outcomes, like energy use, giving an idea of the robustness of a building or design, an idea previously developed by the authors in Chinazzo et al. (2015a,b).

This paper demonstrates a method to include climate change forecasts with variation in individual values, but it cannot account for the physical effects of the build-up of GHGs in the atmosphere; users must rely on climate models for that. For Geneva, the forecasts show a very small upward trend in temperature. The synthetic time series created using TMY files from the 1980s-90s and the newest climate change forecasts tally well with recent recorded data, which include the 2000s. That is, the effect of including climate change forecasts on older data (the TMY files) is similar to that of using actual recently recorded data. This was to be expected, since the concentration of atmospheric GHGs has been increasing steadily for more than a century, and its effects have only become apparent in the past couple of decades.
We have also discussed why our proposal is distinct from previous efforts based on morphing and similar techniques. While morphing is unable to produce files with sufficient variety, we are able to rapidly produce very widely varying samples of weather from a future climate scenario. Like morphing and any other synthetic weather generator, our synthetic weather files do not explicitly account for geographical variability. That is to say, if a source TMY file is not representative of the building site (e.g., due to urbanisation), then our method will not correct for it. This is an important limitation, and one we will address only in upcoming work, since the 'change' in weather conditions due to urbanisation has nothing to do with the techniques we use here. It is possible to coincidentally reproduce urban conditions, but that is not guaranteed.

The method shown here, and in Rastogi (2016) and Rastogi and Andersen (2015), is also applicable when a long record of weather data is available. We have focussed on working with typical year files to expand applicability in practice. Longer, high-quality records, where available, could be a better basis for calculating the various periodic and aperiodic components we use in our method. The influence of the quality of the typical files is not formally addressed in our work, but the use of an ensemble of random files could somewhat ameliorate the impact of unrepresentative data on decision-making.
ACKNOWLEDGEMENTS

This work was carried out at the EPFL, and supported by the CCEM-SECURE project and the EuroTech Universities Alliance. The advice of Prof. A.C. Davison and M. Kuusela has been invaluable in the development of this work. G. Mavromatidis' help in obtaining and interpreting the climate change forecasts is gratefully acknowledged, along with his support. The large number of simulations shown here would not have been possible without the help of Dr. R. Evins.
mate: the interaction of weather and building envelopes
References

Belcher, SE, JN Hacker, and DS Powell (2005). "Constructing design weather data for future climates". In: Building Services Engineering Research and Technology 26.1, pp. 49-61.
Chinazzo, G (2014). "Refurbishment of Existing Envelopes in Residential Buildings: assessing robust solutions for future climate change". MSc thesis. Lausanne, Switzerland: EPFL.
Chinazzo, G, P Rastogi, and M Andersen (2015a). "Assessing robustness regarding weather uncertainties for energy-efficiency-driven building refurbishments". In: Proceedings of IBPC 2015. 6th International Building Physics Conference. Torino.
Chinazzo, G, P Rastogi, and M Andersen (2015b). "Robustness Assessment Methodology for the Evaluation of Building Performance With a View to Climate Uncertainties". In: Proceedings of BS 2015. 14th International Conference of the International Building Performance Simulation Association. Hyderabad, India.
Crawley, DB (2008). "Estimating the impacts of climate change and urbanization on building performance". In: Journal of Building Performance Simulation 1.2, pp. 91-115.
Cryer, JD and KS Chan (2008). Time Series Analysis: With Applications in R. Springer. 501 pp.
Davison, AC and DV Hinkley (1997). Bootstrap Methods and their Application. 1st ed. Cambridge University Press. 594 pp.
Eames, M, T Kershaw, and D Coley (2011). "On the creation of future probabilistic design weather years from UKCP09". In: Building Services Engineering Research and Technology 32.2, pp. 127-142.
IPCC (2014). Climate Change 2014 Synthesis Report: Summary for Policymakers. Geneva, Switzerland: Intergovernmental Panel on Climate Change (IPCC).
Jentsch, MF, AS Bahaj, and PAB James (2008). "Climate change future proofing of buildings - Generation and assessment of building simulation weather files". In: Energy and Buildings 40.12, pp. 2148-2168.
Kershaw, T, M Eames, and D Coley (2011). "Assessing the risk of climate change for buildings: A comparison between multi-year and probabilistic reference year simulations". In: Building and Environment 46.6, pp. 1303-1308.
Politis, D (1998). "Computer-intensive methods in statistical analysis". In: IEEE Signal Processing Magazine 15.1, pp. 39-55.
R Core Team (2015). R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing.
Rastogi, P (2016). "On the sensitivity of buildings to climate: the interaction of weather and building envelopes in determining future building energy consumption (in preparation)". PhD thesis. Lausanne, Switzerland: Ecole polytechnique fédérale de Lausanne.
Rastogi, P and M Andersen (2015). "Embedding Stochasticity in Building Simulation Through Synthetic Weather Files". In: Proceedings of BS 2015. 14th International Conference of the International Building Performance Simulation Association. Hyderabad, India.
Wilde, P de, Y Rafiq, and M Beck (2008). "Uncertainties in predicting the impact of climate change on thermal performance of domestic buildings in the UK". In: Building Service Engineering Research and Technology 29.1, pp. 7-26.
World Climate Research Programme (2015). CORDEX. WCRP CORDEX. URL: http://www.cordex.org/.
