Learn how global regulations, demographic trends, and technology will impact insurers over the next few years and how they can best prepare for the changes.

What do you see as the main drivers influencing insurers over the next few years?

The insurance industry is still going through a period of change driven by a number of factors, as shown in Figure 1 – a few of which are worth discussing in detail.

First, as part of the ever-increasing regulatory demands, regimes such as Solvency II and IFRS now drive insurers to better understand risk and capital within their business. While Solvency II is a European initiative, its main principles are being adopted across the globe in countries such as Japan, South Africa, Mexico, and Australia. In addition, there is a growing trend toward consumer protection legislation, in response to the past mis-selling of insurance products. This legislation increases compliance costs and potentially impacts regulatory capital at a time when competitive pressures are enormous.

A second major factor is that, over the last few years, insurers have struggled with low investment returns on government and corporate bonds. Insurers are often required to hold a significant portion of their assets in bonds for regulatory purposes, the cash flows from bonds being a good match for their long-term liabilities. The prolonged period of low returns, however, is forcing insurers to look at other investment types that mirror the cash flow profile of bonds but provide a higher rate of return. As a result, insurers increasingly look toward infrastructure and corporate loans, once the sole domain of banks. Thus, the delineation between banks and insurers is more blurred than it used to be.

In addition, there is the perennial question of how to reduce administrative, IT, and sales costs. For years, insurers have been trying to become more efficient, often with limited success. Increasing consumer legislation and awareness, coupled with new distribution models and aggressive competitors, are refocusing efforts on cost reduction. Insurers are considering a range of measures, from outsourcing, process re-engineering, and replacing legacy technology infrastructures to new low-cost distribution channels. Reducing costs is not easy in an environment of low investment returns and significant pressure on margins.

Perhaps one of the most profound changes to which insurers have to adapt is the aging population, which requires new types of insurance products and services. A good example is the annuity market, where improving mortality and low returns are making traditional annuity products very unattractive. Thus, insurers are looking at new types of annuity products with lower charges and increased flexibility to draw down money as needed, rather than a typical rigid, deferred annuity policy.
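
To make the annuity economics concrete, the short sketch below prices a level annuity as the survival-weighted present value of its payments. The mortality and discount figures are illustrative assumptions only, not values drawn from this article.

```python
# Illustrative annuity pricing sketch (all figures are assumptions).
# Price of a level annuity of 1 per year for a life:
#   price = sum over t of v^t * (probability of surviving t years)

def annuity_price(annual_survival: float, max_years: int, rate: float) -> float:
    """Survival-weighted present value of 1 per year, paid in arrears."""
    v = 1.0 / (1.0 + rate)
    price, survival = 0.0, 1.0
    for t in range(1, max_years + 1):
        survival *= annual_survival   # cumulative probability of surviving year t
        price += survival * v ** t    # discounted expected payment
    return price

# Higher survival rates and lower investment returns both push the price up,
# meaning each unit of premium buys less guaranteed income.
print(annuity_price(0.98, 40, 0.05))  # higher-rate era: ~13.1
print(annuity_price(0.99, 40, 0.01))  # improving mortality, low rates: ~27.3
```

Under these assumed inputs, the price of the same income stream roughly doubles, which is precisely the pressure pushing insurers toward cheaper, more flexible drawdown designs.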

Allied with this, we are now at the cusp of huge advances in medical science, particularly in understanding DNA and its role in aging, disease, and gene therapy. These advances raise a whole range of ethical and underwriting issues that insurers must address. Based on current trends, an individual born today has a life expectancy of 94 years. The Cambridge-educated gerontologist Aubrey de Grey argues that lifespans could eventually extend to 1,000 years.

“We are just at the start of a revolutionary new understanding of how our own bodies work with incalculable consequences for our health and longevity.”

– Sir Tim Hunt and 49 other Nobel Prize winners, in a letter to the Financial Times, October 2012

The next few years present major challenges for insurers, but those that adapt and offer new products and services that customers seek will differentiate themselves.

Figure 1. Factors impacting the change in the global insurance industry
Source: Moody's Analytics

How will capital management challenges impact insurers in the short and long term?

The fundamental tenet of Solvency II is that insurers should better understand the relationship between risk and capital in their business. The implication is that both regulatory and economic capital (the capital the insurer needs to run the business) are central to any insurance company’s decision-making.

There has naturally been a recent focus on regulatory capital (the Solvency Capital Requirement, or SCR), but firms are now considering the wider capital implications and are asking questions such as:

  • How much risk capital will we need to survive the next five years?
  • What is the optimal legal entity structure from a capital perspective?
  • What is the most effective and profitable use of capital?
  • How can we grow the business profitably?
  • What should the firm’s optimal product and investment portfolios look like?
  • What scenarios might put the firm out of business?
  • How would an acquisition impact the firm’s capital requirement?

Regulatory capital is focused on a one-year, value-at-risk perspective, and while holding the requisite amount of regulatory capital is critical, in the longer term the focus will switch to “what-if” capital analysis. This approach is much more forward-looking, projects capital requirements over a longer time horizon, and is based on a range of possible economic scenarios. Thus, capital management is becoming the central theme of strategic planning.
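
As a rough illustration of the one-year, value-at-risk perspective, the sketch below estimates a capital requirement as the 99.5th percentile of a simulated one-year loss distribution, mirroring the Solvency II calibration of the SCR. The loss model and all parameters are purely assumed.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed one-year loss model: market P&L plus underwriting losses.
# Distributions and parameters are illustrative, not a standard formula.
n_sims = 100_000
market_pnl = rng.normal(loc=0.0, scale=80.0, size=n_sims)  # in millions
claims = rng.lognormal(mean=3.0, sigma=0.8, size=n_sims)   # in millions

loss = claims - market_pnl  # total one-year loss (positive = loss)

# Capital requirement as the 99.5% one-year VaR of the loss distribution.
scr_estimate = np.quantile(loss, 0.995)
print(f"Illustrative 99.5% one-year VaR: {scr_estimate:,.1f}m")
```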

The Own Risk and Solvency Assessment (ORSA) requires insurers to project their balance sheet three to five years into the future, encouraging a strategic approach to capital management. This projection is undertaken for a baseline scenario and a number of other macroeconomic scenarios that an insurer thinks might represent a plausible future operating environment (illustrated in Figure 2).
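
A minimal sketch of this kind of projection, under stated assumptions: own funds are rolled forward over a five-year horizon for a baseline and two hypothetical macroeconomic scenarios, and the solvency coverage ratio is tracked in each. The figures and the crude SCR proxy are invented for illustration.

```python
# Hypothetical multi-year solvency projection under named scenarios.
scenarios = {  # assumed annual net investment returns
    "baseline":     [0.03, 0.03, 0.03, 0.03, 0.03],
    "low_rates":    [0.01, 0.01, 0.01, 0.01, 0.01],
    "equity_shock": [-0.10, 0.02, 0.03, 0.03, 0.03],
}

own_funds_0 = 500.0   # opening own funds, millions (assumed)
liabilities = 4000.0  # technical provisions, millions (assumed)
scr_ratio = 0.10      # crude proxy: SCR as a fixed share of liabilities

for name, returns in scenarios.items():
    own_funds = own_funds_0
    coverage_path = []
    for r in returns:
        own_funds *= 1 + r                     # roll own funds forward one year
        scr = scr_ratio * liabilities          # proxy capital requirement
        coverage_path.append(own_funds / scr)  # solvency coverage ratio
    print(name, [f"{c:.2f}" for c in coverage_path])
```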

All insurers must consider regulatory capital in the short term, but those that invest long term in capital planning tools and methodologies, to better understand capital and risk in their businesses and allocate capital profitably, will undoubtedly achieve a competitive advantage over less sophisticated rivals.

Figure 2. Macroeconomic scenarios over a business planning horizon
Source: Moody's Analytics

Figure 3. Organizing and storing analytical data
Source: Moody's Analytics

What are the main challenges to implementing a centralized analytical repository to support business decision-making?

Risk and capital analytics and metrics are important both to meeting the ever-increasing demand from regulators for more complex reporting and to supporting effective decision-making. An example is the quantitative reporting templates (QRTs) required by Solvency II, which demand some 10,000 cells of analytical data organized in 70 templates. Supporting decision-making is perhaps more challenging: whereas regulatory reporting is highly prescribed, decision-making requires a complex mix of risk, actuarial, finance, and investment data and varies by business.

Regardless of the measures, metrics, and information the business needs to support its decisions, it is essential to be able to analyze the underlying data at a high level of granularity (e.g., by line of business, product category, or geography). Typically, the data to support the required calculations, analyses, and reports is scattered across non-integrated silos or disparate data architectures, applications, and methodologies, which can inhibit complete and accurate calculations. To circumvent this problem, many insurers are building a centralized analytical repository specifically to store finance, actuarial, risk, and investment data. Many already have repositories that store operational data, but these are often unsuitable for analytical data.
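
As one possible shape for such a repository, the sketch below defines a single granular fact table keyed by the dimensions mentioned above (legal entity, line of business, geography, period, metric). The schema, field names, and sample row are illustrative assumptions, not a reference design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for an enterprise data store

# One granular fact table for analytical results; each row holds a single
# metric value at the lowest level of detail the business reports on.
conn.execute("""
    CREATE TABLE analytical_results (
        legal_entity      TEXT NOT NULL,
        line_of_business  TEXT NOT NULL,
        geography         TEXT NOT NULL,
        period            TEXT NOT NULL,   -- e.g., '2016-Q1'
        metric            TEXT NOT NULL,   -- e.g., 'SCR', 'BEL', 'premium'
        value             REAL NOT NULL,
        source_system     TEXT,            -- lineage, to aid reconciliation
        PRIMARY KEY (legal_entity, line_of_business,
                     geography, period, metric)
    )
""")

conn.execute(
    "INSERT INTO analytical_results VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("EntityA", "Annuities", "UK", "2016-Q1", "SCR", 120.5, "actuarial_engine"),
)
```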

Organizing and storing analytical data at the lowest level of granularity greatly increases the flexibility for multi-dimensional analysis of results. In turn, this facilitates interactive dashboards and reports. Furthermore, when multi-dimensional analysis is generated from a common repository, it helps ensure data reconciliation and validation. The disadvantage of granularity, however, is that the repository may have to handle potentially huge volumes of data. Performance measures and methodologies continue to grow in complexity, becoming ever more data-hungry.

A centralized analytical repository means that risk and capital data and metrics are available to the whole enterprise to access and analyze. This approach also avoids duplication and unnecessary movement of data.

A risk- and capital-literate development team is essential for providing insight into how the repository can support risk-based decision-making. Nonetheless, building an effective repository can be a complex task, as it involves numerous detailed steps:

  1. Users within the business define the information and reports they require and the drill-down (granularity) capabilities needed – often the main point of failure
  2. IT builds a flexible data model to structure the data and implements the physical repository
  3. IT identifies where the data will come from, determines gaps, and then builds the data integration and transformation processes required to load the data into the repository in a standard and structured format
  4. IT constructs the OLAP cubes1 necessary to support the generation of reports and dashboards specified by the business (a simple illustration follows this list)
  5. The repository is deployed so that business users can access and generate the reports and dashboards, either automatically or on an ad-hoc basis
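
To make step 4 concrete, the toy sketch below aggregates granular results across dimensions so users can drill down from legal entity to line of business. A pandas pivot table stands in for a real OLAP engine, and the data is invented.

```python
import pandas as pd

# Toy analytical results at the lowest level of granularity (invented data).
df = pd.DataFrame({
    "legal_entity":     ["A", "A", "A", "B"],
    "line_of_business": ["Annuities", "Motor", "Motor", "Annuities"],
    "geography":        ["UK", "UK", "DE", "UK"],
    "metric":           ["SCR"] * 4,
    "value":            [120.5, 40.2, 35.1, 88.0],
})

# The pivot table plays the role of an OLAP cube: the same granular facts
# can be sliced by any combination of dimensions for drill-down analysis.
cube = df.pivot_table(index="legal_entity",
                      columns="line_of_business",
                      values="value",
                      aggfunc="sum",
                      margins=True)  # 'All' row/column gives roll-up totals
print(cube)
```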

How can insurers improve the quality of their analytical data?

The previous question looked at the importance of analytical data in the decision-making process. Information, though, is valueless if the underlying data from which it is derived is inaccurate or of low quality. The numerous sources of raw analytical data within an organization give rise to questions regarding its quality, consistency, and reliability, especially as the volume of data increases. To compound the problem, both analytical and operational data are often organized into separate silos, leading to duplication of data and inconsistent values. Analytical data often comes disaggregated from multiple silos according to different dimensions, such as legal entity, line of business, risk category, etc. The silo approach tends to produce low-quality data, mainly due to the proliferation of data duplication and multiple data quality approaches from one silo to the next.

Data can be considered high quality if it is fit for purpose in its intended use (e.g., statutory reports, business planning, and decision-making). Six main factors make up data quality:

  • Accuracy
  • Completeness
  • Appropriateness
  • Relevance
  • Consistency
  • Reliability

The quality of data in insurance organizations is often poor, so improvement is essential. Improving the quality of data, however, is a multi-faceted process that takes raw data and subjects it to a set of algorithms, business rules, and, as a last resort, common sense. This, coupled with expert judgment, enables the data to be validated or corrected. The data quality tools used to do this have built-in data “logic” in terms of patterns, trends, and rules developed over a number of years. Data is tested against this logic: simple errors can be corrected automatically, and flags are raised for data that requires expert judgment. The result may not always be perfect data (no process can achieve that), but the data should at least be fit for purpose.
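
A minimal sketch of the rule-based validation described above: each record is tested against a few simple built-in rules, trivially fixable errors are corrected automatically, and anything else is flagged for expert judgment. The rules and field names are illustrative assumptions.

```python
# Illustrative data quality pass: auto-correct simple errors, flag the rest.

def validate(record: dict) -> tuple[dict, list[str]]:
    """Return the (possibly corrected) record plus a list of quality flags."""
    flags = []

    # Completeness: required fields must be present and non-empty.
    for field in ("policy_id", "premium", "currency"):
        if record.get(field) in (None, ""):
            flags.append(f"missing {field}")

    # Consistency: currency codes can be normalized automatically.
    if isinstance(record.get("currency"), str):
        record["currency"] = record["currency"].strip().upper()

    # Accuracy: a negative premium cannot be fixed by rule alone.
    premium = record.get("premium")
    if isinstance(premium, (int, float)) and premium < 0:
        flags.append("negative premium: refer for expert judgment")

    return record, flags

rec, issues = validate({"policy_id": "P001", "premium": -250.0, "currency": " gbp "})
print(rec, issues)  # currency fixed automatically; premium flagged
```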

Table 1 looks at a typical process for improving the quality of raw data. The steps may not necessarily follow this precise order.

Besides regulations, which initiatives will take the most resources to complete?

Legacy technology is likely to be the other major area of expenditure. Many insurers will have to address it over the next few years, and doing so will represent a significant investment.

Table 1. Typical process for improving raw data quality
Source: Moody's Analytics

Legacy technologies

The insurance industry is undergoing a period of radical change, driven principally by cost reduction, legislation, competition, and the need for ever-greater critical mass. Thus, the challenge is to radically reduce costs while at the same time improving customer service and supporting new initiatives. Technology is the main weapon in meeting this challenge; yet many insurers are burdened by a plethora of antiquated core systems built on old technology.

  • The average insurer has 10 to 14 core systems installed. One multinational insurer reportedly had more than 80 core systems.2 This complexity means that a rip-and-replace strategy would be too risky for such a large project.
  • Estimates vary, but it is not uncommon for insurers to spend up to 80% of their IT budget on maintaining legacy systems.3 Many of these so-called legacy systems are still running on expensive mainframes and built in Fortran, COBOL, etc. Indeed, insurance systems today account for some 28 billion lines of COBOL code and the number is still growing.

Review the strategic plans of most insurers and they will invariably include provisions for the replacement or improvement of multiple legacy systems. The logic behind this is simple: insurers spend an enormous amount of time and effort developing workaround processes to cover the functional deficiencies of these systems. While there is nothing intrinsically wrong with the underlying languages, they are hardly the first choice for the straight-through, web-enabled processing necessary to reduce costs and provide efficient service.

Many of these systems have outlived their usefulness and continue to operate in expensive mainframe environments. Outdated green-screen systems impede an insurer’s ability to scale for growth or compete effectively. Finding people with the technical expertise and willingness to work on legacy platform applications is increasingly difficult, as many do not see a future in working with outdated development tools such as COBOL, Fortran, or C++.

Cost is a major issue, but the inherent inflexibility of legacy systems also means insurers cannot adapt at the same pace as the businesses they support. Moreover, lengthy development times for change requests can negatively impact an insurer’s productivity, product life cycles, and service delivery. A detailed analysis of the options for solving the legacy issue is beyond the scope of this article, but Figure 4 highlights the four main options. In many cases, the solution will involve several or even all of them.

What is clear from all the research is that updating the core technology platforms that underlie the main processes in insurance is imperative for survival and for competing effectively.

Figure 4. Four main options for solving legacy issues
Source: Moody's Analytics

Sources and Notes

1 An OLAP cube is an array of data in which each cell holds a number representing some measure of an element of the business, such as premiums, claims, capital, expenses, budget, or forecast. The OLAP cube effectively provides the drill-down capability and granularity required.

2 The Telegraph, Modernize insurance legacy systems without the pain, April 30, 2014.

3 Risk Management Magazine, Practical Strategies for Dealing with Legacy Systems, Gavin Lavelle, February 2005.
