by Rowan Douglas*
Operational climate forecasting is emerging as the latest frontier in climate services for the financial sector.
A new supply chain: scientists, financial markets, regulators
Enabled by new technology and supported by international legislation, a new supply chain linking climate science, insurance markets and financial regulation has emerged. Around it, a fertile new scientific community is breaking down the walls between public and private, pure and applied, science and industry. This intellectual and technical fusion offers opportunities as we seek to develop greater resilience to climate change and extreme events.
The grand challenge confronting this new community is one of sustainability. How does society, at local and global scales, achieve resilience and share the costs of extreme events so that lives and communities can be rebuilt after losses and disasters? Whether one chooses public mechanisms (taxation), private ones (insurance) or hybrids of both, the fundamentals remain the same: pooling risk so that the losses of the few are shared by the many.
Hazards insurance: a global product
The non-life insurance industry collects approximately two trillion US dollars in premiums per year from people around the world. About one third of these premiums (and the single largest element) is used to insure property and other assets against natural hazards.
Insurance companies use about US$ 200 billion per year of this income to buy their own insurance, called reinsurance. Catastrophe cover is the largest segment of the global reinsurance market. Reinsurance protects insurers against extreme losses related to the frequency, severity and duration of natural hazards. In broad terms, 10% of the insurance premiums we pay for our homes, cars and businesses passes into a global pool called the reinsurance industry.
Around 90% of the funds are paid out in claims. (The remaining 10% covers administration and profit.) When there is a major bushfire in Australia, floods in central Europe, a typhoon in South-East Asia or hailstorms in the US Midwest, all of us who buy insurance contribute to supporting these communities. Insurance is the ultimate community product; reinsurance is the ultimate global community product.
The rise of catastrophe risk modelling
Until the late 1980s, insurance and reinsurance companies used historical claims records, coupled with the experience and instinct of underwriters, to set premium rates for all lines of insurance. Many countries had rates fixed by tariffs. A run of natural and man-made losses in the mid-1980s and early 1990s – including Hurricane Juan (1985), Hurricane Hugo (1989), European Windstorm Daria (1990) and Hurricane Andrew (1992) – helped tear this system apart. With losses far in excess of previous experience, many insurance and reinsurance companies failed. Risk had increased, and a new approach to underwriting was required.
This period of market stress coincided with technical innovations, not least the PC spreadsheet (so easily overlooked), which enabled new levels of analysis by underwriters and insurance executives. But the real seat of innovation was a small group of US engineering consultancies, driven by the losses from the Northridge earthquake. They took the view that there must be a more rigorous approach to evaluating the frequency and severity of hazards, the extent and vulnerability of exposed assets and the losses that could result. These firms (Applied Insurance Research, Risk Management Solutions and EQECAT), supported by reinsurance investors who encouraged new quantitative approaches, revolutionized the insurance industry over the last 15-20 years and created a new sub-industry: catastrophe risk modelling.
Catastrophe risk modelling represented the first effective mediation of science into the insurance industry, building firms’ resilience to extreme events. The fundamental approach and philosophy are common across different hazards and geographic areas.
Catastrophic events are infrequent and extreme. Consequently, models must be constructed that represent the potential range of events and their financial impact. Models are based on core components that reflect the essential elements of risk – the hazard, the exposure and the vulnerability – as well as the insurance and reinsurance conditions that operate to protect those assets. Each is a critical component in the risk calculation process.
Calculating risk
Increasingly sophisticated catastrophe models do provide rigour to the process of financial risk quantification, even if they do not capture the full spectrum of risks that exist in the real world.
Historical records and weather reanalysis data are used to inform stochastic event sets. These are developed to simulate extreme weather activity over thousands of years and generate a projection of the possible range and frequency of extreme events. A key step in hazard modelling is parameterizing the damage at and after each storm: simulating the wind-field ‘footprint’ by choosing the most appropriate damage-causing measure, such as maximum sustained gust speed or storm-surge depth.
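As a concrete, hedged illustration of such an event set, the sketch below draws annual storm counts from a Poisson distribution and peak gust speeds from a heavy-tailed distribution over thousands of simulated years; the rate and tail parameters are invented for illustration, not calibrated values from any real model.

```python
# Illustrative stochastic event set: simulate many years of storm
# activity by drawing an annual storm count from a Poisson distribution
# and each storm's peak gust speed from a heavy-tailed distribution.
import numpy as np

rng = np.random.default_rng(seed=42)
ANNUAL_RATE = 1.7    # assumed mean number of damaging storms per year
N_YEARS = 10_000     # length of the simulated catalogue

catalogue = []       # one array of peak gust speeds (m/s) per simulated year
for _ in range(N_YEARS):
    n_storms = rng.poisson(ANNUAL_RATE)
    # 20 m/s damage threshold plus a Pareto (Lomax) tail for rare extremes.
    catalogue.append(20.0 + 12.0 * rng.pareto(3.0, size=n_storms))

worst = max((year.max() for year in catalogue if year.size), default=0.0)
print(f"Most intense gust in {N_YEARS:,} simulated years: {worst:.1f} m/s")
```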
In parallel, databases are developed describing the location and character of exposed buildings and other assets. They cover factors such as age, construction type and height. The databases outline the resilience or vulnerability of a structure or other asset to the factor causing the damage.
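Combining the simulated hazard footprint with the exposure and vulnerability information, and then applying the insurance conditions, yields an insured loss for each event. The toy Python sketch below traces that chain for a single event and a single property; the vulnerability curve, values and policy terms are illustrative assumptions, not any vendor's actual model.

```python
# Illustrative only: a toy combination of hazard, exposure, vulnerability
# and insurance conditions for one event at one property.

def damage_ratio(gust_speed_ms: float) -> float:
    """Toy vulnerability curve: fraction of the asset's value destroyed
    as a function of peak gust speed (m/s). Real curves are fitted to
    engineering studies and claims data, and vary by construction type,
    age and height."""
    if gust_speed_ms < 20.0:
        return 0.0
    return min(1.0, ((gust_speed_ms - 20.0) / 60.0) ** 2)

def insured_loss(gust_speed_ms: float, asset_value: float,
                 deductible: float, limit: float) -> float:
    """Ground-up loss (hazard x exposure x vulnerability), then filtered
    through simple insurance conditions (deductible and policy limit)."""
    ground_up = damage_ratio(gust_speed_ms) * asset_value
    return min(max(ground_up - deductible, 0.0), limit)

# Example: a 55 m/s gust over a US$ 400,000 property with a US$ 5,000
# deductible and a US$ 350,000 policy limit.
print(round(insured_loss(55.0, 400_000, 5_000, 350_000)))  # -> 131111
```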
(Photo: ©NOAA) The underestimates of losses from Hurricane Katrina contributed to a revolution in modelling that brought more public science to financial markets.
Weather data reanalysis and asset databases are critical to the calibration of the catastrophe models. The vulnerability component provides the means to estimate damage for varying hazard intensity (varying gust speed, water depth). Importantly for the insurance industry, it also estimates the potential financial loss caused, using both engineering and statistical sources. By combining these data in a mathematical and statistical framework, one can estimate the probability of annual loss exceedance across a range of annual return periods. These outputs can be used to help decision-makers quantify loss potential.
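The exceedance step itself is simple once simulated annual losses exist: the loss exceeded on average once in N years is the (1 - 1/N) quantile of the simulated annual loss distribution. A minimal sketch, using a lognormal stand-in rather than real portfolio output:

```python
# Illustrative exceedance-probability (EP) curve: the 1-in-N-year loss is
# the (1 - 1/N) quantile of the simulated annual loss distribution.
import numpy as np

rng = np.random.default_rng(seed=0)
# Stand-in for the annual losses produced by a catastrophe model run.
annual_losses = rng.lognormal(mean=15.0, sigma=1.2, size=10_000)

def loss_at_return_period(losses: np.ndarray, years: float) -> float:
    """Loss with annual exceedance probability 1/years."""
    return float(np.quantile(losses, 1.0 - 1.0 / years))

for rp in (10, 25, 50, 100):
    print(f"1-in-{rp}-year loss: {loss_at_return_period(annual_losses, rp):,.0f}")
```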
This approach proved successful during the hurricane season of 2005, best known for Hurricane Katrina. The season was also noteworthy for the frequency of events, including Hurricanes Rita and Wilma later in the season, which also brought significant losses.
The insurance and reinsurance markets creaked a little under the strain of these losses, but there were few, if any, company failures. This was due to the rise of catastrophe risk modelling and its role in enabling insurers and reinsurers to become better capitalized against these events.
Public science in risk models
Yet the underestimates of the losses to be expected from a hurricane with the attributes of Katrina brought both the accuracy of, and the industry's dependency on, catastrophe modelling into question. There could be no retreat. The only answer was more and better modelling, and the seeds of the next revolution had been sown. That revolution brought the deeper integration of public science into catastrophe risk modelling and reinsurance, and is still underway. Its effects are profoundly important to financial and scientific communities alike.
Regulating climate risk
A related revolution, closely linked to the catastrophe modelling industry, has been underway in the world of financial regulation. It is led by authorities seeking to ensure that institutions and the customers they serve will be able to withstand market shocks and extreme events.
Throughout the world, the insurance policies that consumers buy for their homes, cars, lives and other risks are expected to work to a 1-in-200-year tolerance. In other words, insurance and reinsurance companies should have access to sufficient capital to withstand the maximum probable loss (or combination of loss events) expected once in every 200 years. This is a uniquely high tolerance level among financial institutions.
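Expressed numerically, a 1-in-200-year tolerance is an annual exceedance probability of 0.5%, so the capital test amounts to holding at least the 99.5th percentile of the modelled annual loss distribution. A minimal sketch on stand-in data:

```python
# Illustrative 1-in-200-year capital test: hold capital at least equal to
# the 99.5th percentile of modelled annual losses, so that the modelled
# probability of a worse year is below 0.5%. Stand-in data only.
import numpy as np

rng = np.random.default_rng(seed=1)
annual_losses = rng.lognormal(mean=15.0, sigma=1.2, size=50_000)  # stand-in

required_capital = float(np.quantile(annual_losses, 0.995))  # 1-in-200-year loss
ruin_probability = float((annual_losses > required_capital).mean())

print(f"Required capital: {required_capital:,.0f}")
print(f"Modelled probability of exceeding it: {ruin_probability:.3%}")  # ~0.5%
```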
Natural catastrophes, through their scale and impact on insurers and reinsurers, represent the single largest risk to insurance company capital. Hydro-meteorological hazards dominate both the frequency and magnitude of historic and modelled losses at the global scale.
Regulators’ concern over the financial market risks that the insurance industry faces (such as investment asset, liquidity and credit risk) is augmented by concern about climate risk: that clustering, correlation and uncertainty about climate variability and extreme events drive risk within a non-stationary climatic environment.
Regulatory requirements (such as the emerging European Union Solvency II Directive) are defined in terms of capital availability, to ensure that the probability of insolvency is below 0.5% per year. Catastrophes are a major risk to portfolios at this level of probability. As a result, catastrophe modelling will form a significant component of the wider modelling required to fulfil the risk quantification, management and strategic planning requirements of the regulator.
The importance of meteorological hazards to the loss potential at the 0.5% probability level, combined with the uncertainty attributed to climatic variability and the lack of observed data representing potential extreme-event ranges at these levels, creates a challenge for insurer and regulator alike: they must quantify the risk without confidence in the potential frequency and severity of the events that could cause the loss.
The US National Association of Insurance Commissioners has gone one step further, requiring insurers to disclose their estimation of the climate risk that their portfolios of insured assets face, as well as the steps those organizations are taking to protect their capital.
Public-private synergy
Continued synergy between public scientists and the catastrophe modelling community is critical. It can provide tools and understanding to help the insurance and reinsurance industries remain solvent in the face of extreme events. As regulatory requirements continue to focus on extremes, the convergence of global climate models and weather forecasting systems will be central to quantifying and assessing climate risk, and to financially protecting communities in the face of climate uncertainty.
The integration of public science with risk markets and the populations they serve may offer entirely new opportunities to manage risk. Global climate models provide greater insight into time and space connections that support more confident portfolio diversification.
For the first time, high-resolution global climate models allow extreme weather systems such as tropical cyclones to be represented at an equivalent resolution to observed weather data. This is underpinning the drive towards understanding the probability of climate events. Crucially, the models examine the causal role of global climatic systems in determining temporal and spatial patterns of hazard intensity and frequency that control extreme event occurrences at the 1:200 year or greater recurrence frequency.
Various initiatives address the links between public science and the risk and reinsurance/insurance sector. The largest is the Willis Research Network, sponsored by Willis Group, an insurance and reinsurance broker. The Willis Research Network supports open science at almost fifty universities and public science institutions worldwide.
For example, the UK National Centre for Atmospheric Science and the UK Met Office have produced a dataset of tropical cyclone tracks equivalent to 300 years of observed weather, at a consistent quality and resolution across the globe. Through research undertaken at the Universities of Reading and Exeter, these data have, for the first time, been successfully incorporated into a catastrophe modelling format equivalent to existing models.
The research models global climate system processes, such as the El Niño-Southern Oscillation and the Madden-Julian Oscillation, that are critical to determining annual distributions of extreme weather events. The tracks and their associated frequencies of occurrence can be analyzed at annual, seasonal and multi-annual time scales, taking account of global cycles of climatic anomalies. The results provide new insight into the likely future patterns of extreme events, over and above that observed in the much shorter historical record.
For the first time, the increased skill of these models and their operational counterparts is also allowing multi-annual hurricane forecasting in the North Atlantic. The next challenge is to augment industry catastrophe model event sets with these outputs to provide greater confidence in current levels of variability, as sketched below.
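One hypothetical form that augmentation could take is to reweight the annual occurrence rates of an existing event set according to the forecast climate state. In the sketch below, the event names and ENSO-phase multipliers are invented purely for illustration; deriving defensible values is precisely the research challenge.

```python
# Hypothetical sketch: condition an event set's annual occurrence rates
# on a forecast climate state. The events and multipliers are invented;
# estimating real conditional rates is the open research problem.

# Baseline annual occurrence rates for three illustrative event-set members.
baseline_rates = {"storm_A": 0.020, "storm_B": 0.008, "storm_C": 0.031}

# Assumed rate multipliers conditional on the forecast ENSO phase.
enso_multiplier = {"el_nino": 0.7, "neutral": 1.0, "la_nina": 1.3}

def conditioned_rates(phase: str) -> dict:
    """Return event rates scaled for the forecast climate state."""
    m = enso_multiplier[phase]
    return {event: rate * m for event, rate in baseline_rates.items()}

print(conditioned_rates("la_nina"))  # rates scaled up in an active phase
```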
In another example, the US National Center for Atmospheric Research (NCAR) and the Princeton University Geophysical Fluid Dynamics Laboratory focus on North Atlantic tropical cyclone risk. At NCAR, a groundbreaking programme is underway to provide global indices of hurricane risk that accurately reflect the characteristics of damage-producing storms. These indices offer a means of unlocking new sources of capital to help protect exposed populations via public or private programmes.
Meanwhile, flood risk modelling, at local, regional or even global scales, can now be informed by integrated climate and rainfall models. These are very early days, the dawn of a new era, with operational results many years away, but the trajectory and lines of integration are forming quickly.
Towards operational climate forecasting
All this forms the foundation of a movement that is likely to have revolutionary consequences for disaster risk management and insurance and reinsurance markets: operational climate forecasting.
The future direction of climate science and operational climate forecasting services, set by the resolutions of the World Climate Conference-3 (Geneva, 2009), is having a profound impact on the ability of exposed populations and markets to manage risk.
For reinsurers and insurers, the key operational time horizon is six to 18 months ahead; the time horizon for strategic management is 10-15 years. There are new opportunities to match capital with risk over these periods, driven by the potential for greater confidence in relative risk levels through better understanding of the pertinent climate systems.
The UK Met Office Climate Service has developed a close relationship with an international group of reinsurers to design and implement operational services that support situational awareness and decision-making. German authorities, the US National Oceanic and Atmospheric Administration and others have announced similar programmes. In light of national and intergovernmental policy, these trends are expected to continue. Within five to 10 years, operational climate forecasting is likely to play an important role in supporting the financial stability of re/insurance companies and the oversight of regulatory authorities.
These are early days in the development of a computationally sophisticated stable of climate and weather models. The science is still young, and researchers will have to investigate the outputs of the models and use them in conjunction with observed data to ensure their interpretation is appropriate. Yet this new source of robust, scientifically based, peer-reviewed data – a spin-off of wider global climate modelling programmes – provides the risk community with new, complementary information. Climate and weather models will help assess likely future levels of extreme weather risk and severity and improve risk decision-making in an increasingly uncertain future.
_______
* Chief Executive Officer, Global Analytics, Willis Re and Chairman, Willis Research Network