This maproom presents an approximate decomposition by time scale of twentieth-century precipitation variations.
Three scales are defined, denoted "trend", "decadal" and "interannual". These correspond loosely to secular variation due to anthropogenic influence and the low- and high-frequency components of natural variability (variability intrinsic to the climate system), respectively.
The divide between decadal and interannual scales corresponds to a period of 10 years, so that variability due to the El Niño-Southern Oscillation (ENSO) falls into the interannual category, while variability on time scales of 10 years or longer is classified as decadal. The procedures used to separate these signal components, as well as some cautionary notes regarding their interpretation, are discussed in an accompanying EOS article and in a more detailed reference document.
A range of analysis and display options is available: The user may define a season of interest, in which case the decomposition will be performed on the corresponding seasonally-averaged data. Results may be displayed either as a map, or as time series, in the latter case at an individual gridpoint or averaged over a user-selectable area. Maps may display either the standard deviation or the percent of variance in the raw data explained by variability on the selected time scale.
At particular locations, data may include "filled" values, where instrumental measurements are lacking. Since the presence of many filled values may degrade analysis results, a screening procedure has been implemented, whereby individual gridpoints are rejected if their records contain too many such values. Since the imposition of such a requirement results in removal of gridpoints from consideration, stricter screening results in fewer available points, especially in the case of precipitation. The user is therefore given a choice in the Precipitation Maproom between a high level of temporal coverage (strict screening), a high level of spatial coverage (no screening) and a compromise between these two extremes. It is recommended that the high temporal coverage option be chosen if possible, the compromise option being a fallback. The high spatial coverage option may yield less reliable results, and is provided primarily so that the user can see the full geographic range of dataset coverage.
Although the decomposition of a signal into trend, low- and high-frequency components may seem straightforward, the analysis presented involves a number of subtleties. This document provides a more detailed look at the analytical procedures utilized than does the overview presented in Greene et al. (2011), and offers a number of caveats regarding the interpretation of maproom displays.
Data processing consists of three steps: screening the individual gridbox values for filled data and for very dry seasons and regions; detrending, in order to extract slow, trend-like changes; and filtering, to separate the high- and low-frequency components of the detrended data. Each of these steps is described below. Data are processed gridbox by gridbox, meaning that results in adjacent gridboxes are not compared or combined, except when the user requests that the analysis be performed on area-averaged data. In that case, averaging over gridboxes is performed prior to the time scales decomposition.
The underlying datasets employed are complete, i.e., they do not contain missing values. This does not mean, however, that actual measurements were available for every month and at every geographic location covered by the data. The completeness requirement has been imposed by the data providers with particular uses in mind and is met by "filling in" values for which actual station measurements do not exist. The exact manner by which this is accomplished is described in documentation linked on the dataset pages; these can be accessed via the maproom pages based on the respective datasets.
Relatively little filling has been performed on the temperature data, so in this case the screening requires that all data values represent actual measurements, rather than filled values. The screening procedure employed for precipitation is more flexible, with consideration given to both the number and the distribution in time of actual measurements contributing to each gridpoint value. A trade-off between spatial and temporal coverage then comes into play, with a higher degree of temporal coverage corresponding to fewer qualifying gridpoints, and vice versa. The user can control this balance by choosing among "high temporal coverage," "high spatial coverage" and "intermediate temporal and spatial coverage." Because the focus of the maproom is time series behavior, it is recommended that the user prefer "high temporal coverage" or the intermediate option whenever possible. The "high spatial coverage" option presents the data without any temporal screening, so results from the time scales decomposition may be less reliable than with the other choices.
The "high temporal coverage" option imposes the same requirement as that imposed on temperature, viz., that all data values must represent actual measurements, and that none can be filled. For the "intermediate" option this requirement is relaxed somewhat, with at least half of the data values required to represent measurements.
In addition, it is required that the actual measurements be relatively uniformly distributed in time. For example, under the intermediate option, the 50% of values that are not filled may not all be concentrated in the second half of the data series. As presently implemented, the uniformity requirement is based on a 10-year sliding window: the fraction of actual measurements within each such window is not permitted to fall below the specified threshold.
In addition to the temporal screening there is the requirement that, for precipitation, climatological seasonal rainfall must exceed 30 mm. Rainfall below such a threshold would represent very dry conditions, rendering the utility of the time scales decomposition questionable. Moreover, even small fluctuations in precipitation would seem large compared with such a dry climatology, increasing the relative variance of the estimated precipitation variations. The minimum-rainfall requirement avoids these situations.
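As an illustration, the sketch below shows how such a screening step might be applied to a single gridpoint's seasonal series. The function and variable names are hypothetical, not the maproom's actual code; the thresholds follow the text (at least 50% actual measurements for the intermediate option, a 10-year sliding window for the uniformity check, and a 30 mm climatological minimum).

```python
# Hypothetical sketch of the gridpoint screening described above; names and
# structure are illustrative only.
import numpy as np

def passes_screening(values, is_filled, min_measured_frac=0.5,
                     window_years=10, min_climatology_mm=30.0):
    """values: seasonal precipitation totals (mm), one per year;
    is_filled: boolean array, True where a value was filled rather than measured."""
    values = np.asarray(values, dtype=float)
    measured = ~np.asarray(is_filled, dtype=bool)

    # Overall fraction of actual measurements (intermediate option: >= 50%).
    if measured.mean() < min_measured_frac:
        return False

    # Uniformity: every 10-year sliding window must also contain at least
    # the required fraction of actual measurements.
    for start in range(len(values) - window_years + 1):
        if measured[start:start + window_years].mean() < min_measured_frac:
            return False

    # Dry-climate screen: climatological seasonal rainfall must exceed 30 mm.
    return values.mean() > min_climatology_mm
```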
Gridpoints failing these requirements are shown as blank on the maps presented; clicking on such points returns a "no data" message. When the user selects "area average" for a region, only those points meeting the minimum data requirements are utilized in computing the area-averaged data. It should be apparent that area-averaging over a large area that contains few qualifying gridboxes will not produce a result that is regionally representative.
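As a rough illustration, an area average restricted to qualifying gridboxes might be computed as follows. The cosine-latitude weighting is an assumption made for the sketch, not a documented detail of the maproom.

```python
# Illustrative area average over qualifying gridboxes only. The cosine-latitude
# weighting is an assumption of this sketch.
import numpy as np

def area_average(data, lats, qualifies):
    """data: array of shape (nyears, nlat, nlon);
    lats: latitudes in degrees, shape (nlat,);
    qualifies: boolean mask (nlat, nlon), True where the gridbox passed screening."""
    weights = np.cos(np.deg2rad(lats))[:, np.newaxis] * qualifies
    if weights.sum() == 0:
        raise ValueError("no qualifying gridboxes in the selected region")
    # Weighted mean over the spatial dimensions, one value per year.
    return (data * weights).sum(axis=(1, 2)) / weights.sum()
```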
Because even partially filled records may be expected to degrade analytical results to some degree, and because data at individual gridpoints may be noisy, it is probably best, for the sake of robustness, to average the data over at least a small region before applying the time scales decomposition.
Trends are often computed in the time domain, in which case they might be expressed, for example, as a change of so many millimeters per month occurring per decade. The common procedure of fitting a linear trend assumes that such a rate of change is constant with time.
The maproom takes a different approach, based on a simple conceptual model: rather than expressing local or regional trends as functions of time, we relate them instead to global temperature change. The assumption is not that precipitation (or temperature) changes simply as a result of the passage of time, but rather because of the warming of the planet. It is in this sense that the trend component, as computed in the maproom, can be identified with "climate change." Such a trend has a functional, rather than simply a numerical, significance.
Computation of the global temperature record against which local series are to be regressed is less simple than it sounds. Fluctuations in the Earth's climate have many sources, including "natural" variability: intrinsic variations that are not associated with anthropogenically-induced climate change. Such variations, if large enough in scale, can significantly influence, or "project" onto, the global mean temperature. If we take the latter to represent in some sense the signature of climate change, there is a risk that we will unintentionally include some component of natural variability, which will then mistakenly be identified with this signature.
To circumvent this problem the global temperature signal is computed using an ensemble of general circulation models (GCMs). These models, which constitute a comprehensive representation of our current understanding of the mechanisms of climate variability and change, underlie much of the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC, 2007). Simulations from the "Twentieth Century Climate in Coupled Models" (20C3M) experiment are used.
As with the real Earth, climate in each of these simulations includes both a "forced" response (what we think of as climate change) and natural, "unforced" variations. However, the unforced variability is incoherent from model to model: there is no synchronization or phase relationship among the models, and indeed, the character of each model's unforced variability differs to a greater or lesser degree from that of the others. To obtain an estimate of the forced response, we average together the 20th-century global mean temperature records from the members of this ensemble, which here includes 23 (nearly all) of the IPCC GCMs. Averaging has the effect of attenuating the incoherent (i.e., uncorrelated) unforced variability while enhancing that part of the response that the models have in common, namely the climate change signal. The multimodel averaging thus increases the signal-to-noise ratio relative to that of the individual simulations, where "signal" refers to the common climate change response and "noise" to the unforced natural variability. Most of the models provide multiple 20C3M simulations; to put the models on an even footing, a single simulation from each model is used to create the multimodel average.
The multimodel mean signal is further processed, by lowpass filtering. This has the effect of removing most of the residual year-to-year and decade-to-decade variability that has not been averaged away in the formation of the multimodel mean. The resulting smoothed global temperature signal, which serves as the signature of the forced climate change response, is shown in Fig. 1. Downward "bumps" in this signal in the 1900s, 1960s and early 1990s can probably be attributed, at least in part, to major volcanic eruptions, which have a short-term cooling effect; to the extent that these variations are expressed in regional signals they will be recognized as part of the forced response. Although the forcing is not anthropogenic in this case, it is nevertheless considered "external" as far as the maproom is concerned: Volcanic eruptions are not believed to be associated, at least in any easily demonstrable way, with natural climate variability, so it was deemed incorrect to treat them as such.
Figure 1: The global mean "climate change" temperature record used for detrending.
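For orientation, the forced-signal estimate of Fig. 1 could be assembled along the lines sketched below. The data structure is hypothetical, and the final smoothing is shown with the same order-two Butterworth design used later for the decadal/interannual split purely for illustration; the text above specifies only that the multimodel mean is lowpass filtered.

```python
# Illustrative construction of a smoothed, multimodel-mean global temperature
# signal: one 20C3M run per model, averaged, then lowpass filtered. The choice
# of filter here is an assumption made for illustration.
import numpy as np
from scipy.signal import butter, filtfilt

def forced_signal_estimate(runs_by_model, half_power_period_years=10.0):
    """runs_by_model: dict mapping model name -> list of annual global-mean
    temperature series (equal lengths assumed); one run per model is used."""
    one_run_each = np.array([runs[0] for runs in runs_by_model.values()])
    mm_mean = one_run_each.mean(axis=0)  # attenuates incoherent unforced variability
    b, a = butter(2, 1.0 / half_power_period_years, btype="low", fs=1.0)
    return filtfilt(b, a, mm_mean)       # smoothed "climate change" signature
```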
The trend component of a local temperature or precipitation signal is extracted by regressing the local series on the global temperature signal of Fig. 1. Fitted values from the regression represent, by construction, that part of the regional signal which is linearly dependent on global mean temperature. It is in this sense that the trend, as here computed, may be thought of as the climate change component of the regional signal.
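In code, this detrending step amounts to an ordinary least-squares regression with the smoothed global-mean temperature as the predictor. The sketch below is illustrative and assumes annually (or seasonally) resolved series of equal length.

```python
# Minimal sketch of the detrending step: regress the local (or area-averaged)
# series on the smoothed global-mean temperature of Fig. 1; the fitted values
# are taken as the "trend" component, the residual as natural variability.
import numpy as np

def detrend_on_global_T(local_series, global_T):
    X = np.column_stack([np.ones_like(global_T), global_T])
    coefs, *_ = np.linalg.lstsq(X, local_series, rcond=None)
    trend = X @ coefs                  # part linearly dependent on global T
    residual = local_series - trend    # unforced ("natural") component
    return trend, residual
```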
It is worth noting that for a local or regional signal being analyzed in the maproom, the entanglement of forced and natural components is still possible. This is because, while the signal of Fig. 1 has been effectively stripped of natural internal variability, a real-world signal may still contain natural components that are "masquerading" as trend. This might happen, for example, if some natural mode of variability were to increase over a relatively long time period, say the last 30 years of the 20th century. In such a case this mode might induce a similar increase in values of the regional series being analyzed, which then "maps" onto the global mean temperature increase shown in Fig. 1. The Atlantic Multidecadal Oscillation (AMO; see, e.g., Enfield et al., 2001) exhibits a signal something like this; to the extent that the AMO influences local climate, there may be some possibility for this sort of misidentification to occur (see, e.g., DelSole et al., 2011). In general, and for the more approximate type of assessment for which the maproom is designed, we do not believe that such entanglement will pose a major problem of interpretation.
Figure 2a illustrates the detrending step, as applied to a typical precipitation record obtained from the maproom. Note that the inferred trend is negative, and appears as a shifted, scaled inversion of the signal shown in Fig. 1. The inverse characteristic results from the fitting of a downward-trending regional signal; the fact that the inferred trend is a scaled, shifted version of the signal of Fig. 1 is a characteristic of the linear regression. Recall, finally, that the inferred trend represents a regression on global mean temperature; this explains its nonlinearity in the time domain.
Figure 2: Stages of the maproom decomposition process. (a) Trend component, represented by the fitted values in a regression of the local signal onto the multimodel mean temperature record of Fig. 1; (b) Residual signal from this regression and its lowpass-filtered counterpart, the latter identified with the decadal component of variability; (c) Interannual component, which is the residual signal in (b), from which the decadal component has been subtracted.
If the fitted values from the regression onto the global multimodel mean temperature record of Fig. 1 are taken as the "climate change" trend, the residuals from this regression then represent the natural, unforced component of variability. The next step in the analysis aims to decompose this residual signal into "decadal" and "interannual" signals, representing respectively the low- and high-frequency components of natural variability.
To do this, the residuals are lowpass filtered, using an order-two Butterworth filter with half-power at a period of 10 years. Although the Butterworth design has some desirable properties that make it well-suited to this task, any number of alternative filtering procedures could also have been used; testing indicates that results are not sensitive to the filter details. Filter parameters were chosen (a) so as to effect a clean separation between low- and high-frequency components without introducing instability in the filter response (this refers to the filter order) and (b) to effectively classify variability due to the El Niño-Southern Oscillation (ENSO) as "interannual." With the order-two filter, covariation between the two components generally amounts to no more than a few percent of the variance of the initial "raw" series. (In the real world a "perfect" separation of time scales is not achievable; all practical filter designs represent compromises in this regard.)
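A minimal sketch of this filtering step is given below, using the SciPy Butterworth implementation. The forward-backward (filtfilt) application is an assumption made here to avoid phase shift; the maproom's exact implementation may differ in such details. The interannual component, discussed below, is simply the residual minus its lowpassed counterpart.

```python
# Order-two Butterworth lowpass with its half-power (-3 dB) point at a period
# of 10 years, applied to the detrended residual. Forward-backward filtering
# (filtfilt) is an assumption of this sketch.
from scipy.signal import butter, filtfilt

def split_decadal_interannual(residual, half_power_period_years=10.0):
    """residual: annually resolved detrended series (one value per year)."""
    # With fs = 1 sample/year, a cutoff of 0.1 cycles/year corresponds to a
    # 10-year period; for a Butterworth filter the cutoff is the half-power point.
    b, a = butter(2, 1.0 / half_power_period_years, btype="low", fs=1.0)
    decadal = filtfilt(b, a, residual)
    interannual = residual - decadal
    return decadal, interannual
```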
ENSO exhibits a broad spectral peak in the 2-8 year band. Phenomena responsible for variability on longer time scales belong to a class of processes that are less well understood, and whose predictability is currently the subject of active research (see, e.g., Meehl et al., 2009). This "low-frequency" class includes large-scale modes such as the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO), as well as low-frequency stochastic variations. Thus the filtering effectively partitions variability by process class, not simply by nominal time scale.
This second stage in the decomposition is illustrated in Fig. 2b, which shows in black the "natural" residual from the detrending operation of Fig. 2a (i.e., the raw initial signal minus the trend component). Superimposed on this is the green "decadal" signal, which represents the output of the lowpass filter applied to the natural residual.
Finally, the interannual component is computed as the difference between the black and green traces in Fig. 2b, i.e., the residual from the detrending step minus its lowpassed incarnation. Shown in Fig. 2c, this signal represents that part of natural variability having its expression at periods shorter than ten years. These trend (red), decadal (green) and interannual (blue) signals are what the maproom displays when the user either clicks at a point or chooses "area average," the latter in order to display the time scale decomposition as applied to an area-averaged signal.
As noted above, detrending via regression on the multimodel temperature record represents only an approximate separation of forced and natural variability; rigorously performed, such separation is not a simple exercise. Depending on timing and period, natural fluctuations in the data could project onto the multimodel mean temperature signal and be incorrectly identified with the forced response. The potential for such entanglement becomes lower as length-of-record increases, and vice versa.
We recommend the analysis of area-averaged data if possible, for a number of reasons. First, data at an individual gridpoint are likely to be noisy. For example, a couple of wet years near one terminus of the series could create the impression of a trend, even though there is no overall trend in the series. Even if the gridpoint passes the intermediate screening criteria, a number of values may have been filled, with potentially misleading effects. These problems will generally be ameliorated to some degree by area-averaging, for the same reason that multimodel averaging increases the signal-to-noise ratio of the resultant series: anomalous events are less likely to occur in many locations at once, so their effects will be reduced.
The maproom shows the percent of variance in the raw data that is explained by each of the three components. The alert user will notice that these percentages often sum to less than one hundred percent. This is because the filtering procedure is necessarily imperfect, and does not completely separate the decadal and interannual components of the detrended series, leading to a degree of covariance between them. (This is the filtering problem known as "leakage.") The value for this covariance is also provided. Twice the covariance value, added to the trend, decadal and interannual variance percentages, should add to one hundred percent, give or take a percent for rounding error.
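This accounting can be verified directly: with the raw series written as the sum of trend, decadal and interannual components, the three variance fractions plus twice the decadal-interannual covariance (all expressed relative to the raw variance) should total approximately one. The sketch below assumes the components have already been computed as described above; cross-terms involving the trend are essentially zero by construction, since regression residuals are uncorrelated with the fitted values.

```python
# Illustrative check of the variance accounting; the component arrays are
# assumed to have been produced by the decomposition described above.
import numpy as np

def variance_accounting(raw, trend, decadal, interannual):
    v = np.var(raw, ddof=1)
    parts = {
        "trend": np.var(trend, ddof=1) / v,
        "decadal": np.var(decadal, ddof=1) / v,
        "interannual": np.var(interannual, ddof=1) / v,
        "2 x cov(decadal, interannual)": 2 * np.cov(decadal, interannual)[0, 1] / v,
    }
    parts["total"] = sum(parts.values())   # should be close to 1.0
    return parts
```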
After detrending, about 20% of the variance of annually-resolved white noise would be expected to accrue to the decadal component, as here defined. White noise is a random process having no "memory," in the sense that its value at a particular time does not exhibit any dependence on its values at previous times. This differs from processes having memory or "persistence," in which the process level is dependent on previous values (such processes tend to vary more slowly than white noise). Thus, a decadal variance fraction of as much as 20% (meaning the decadal fraction divided by the sum of decadal and interannual fractions) should not be mistaken for the signature of a systematic decadal "oscillation," or even of a slow random process that differs from white noise.
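This white-noise baseline can be illustrated with a short simulation (the detrending step is omitted for simplicity). With the order-two, 10-year half-power filter applied forward and backward as in the sketch above, the decadal fraction for pure white noise comes out a little under 20 percent, consistent with the approximate figure quoted here.

```python
# Decadal variance fraction of pure white noise under the 10-year lowpass
# split (detrending omitted); a long synthetic series is used so that the
# estimate is stable.
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
noise = rng.standard_normal(5000)                   # synthetic "annual" values
b, a = butter(2, 1.0 / 10.0, btype="low", fs=1.0)
decadal = filtfilt(b, a, noise)
interannual = noise - decadal
frac = np.var(decadal) / (np.var(decadal) + np.var(interannual))
print(f"decadal variance fraction for white noise: {frac:.2f}")  # a little under 0.2
```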
Although the variance decomposition, or other aspects of the maproom plots, may provide some sense of future expectations (in the statistical sense), the maproom is primarily a means of deconstructing past variations, not a predictive tool. In particular, it cannot tell us how the character of variability may change in response to global warming and thus, how the decomposition of variance might evolve with the passage of time.
Thank you for visiting the Time Scales Maproom. We anticipate that interaction with maproom users will help us to understand how the product might be improved. Questions or comments are therefore solicited, and may be addressed to help@iri.columbia.edu. Please include the phrase "Time scales" in the subject line.
DelSole, T., M.K. Tippett and J. Shukla, A significant component of unforced multidecadal variability in the recent acceleration of global warming, J. Climate, 24, 909-926, doi:10.1175/2010JCLI3659.1, 2011.
Enfield, D.B., A.M. Mestas-Nunez and P.J. Trimble, The Atlantic Multidecadal Oscillation and its relationship to rainfall and river flows in the continental U.S., Geophys. Res. Lett., 28, 2077-2080, doi:10.1029/2000GL012745, 2001.
IPCC, Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change [Solomon, S., D. Qin, M. Manning, Z. Chen, M. Marquis, K.B. Averyt, M. Tignor and H.L. Miller (eds.)], Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 2007.
Meehl, G.A., and Coauthors, Decadal prediction: Can it be skillful?, Bull. Amer. Meteor. Soc., 90, 1467-1485, doi:10.1175/2009BAMS2778.1, 2009.
Greene, A.M., L. Goddard and R. Cousin, Web tool deconstructs variability in twentieth-century climate, Eos Trans. AGU, 92(45), 397, doi:10.1029/2011EO450001, 2011.
Data sources: The global-mean, multimodel-mean temperature record is the CMIP3 multi-model ensemble mean; observations are monthly mean precipitation and temperature from CRU TS 3.1.