Reinsurers Need a Data Strategy to Build Underwriting Transformation Programs
Years of low interest rates and mounting catastrophe claims have accelerated the hardening of the reinsurance market, as the January 2021 renewals clearly demonstrated; the COVID-19 pandemic has made matters worse, entrenching the ‘lower for longer’ interest rate environment.
With suboptimal underwriting profitability, reinsurers need to strengthen underwriting performance and improve pricing, but such an underwriting transformation program requires reinsurers to bolster advanced data and analytics capabilities.
Regardless of the systems in use, the fundamental intellectual property of an underwriting organization lies in its experience and human talent in understanding risk, and in the data describing the risk itself. Underwriting talent can be developed and honed, or hired; the data, however, is immutable. It forms the basis for catastrophe modeling, risk profiling, anomaly detection, reinsurance placement and the quantification of emerging risks.
Within any reinsurance company, an underwriting department seeking to improve and upgrade data and analytics maturity has three primary concerns:
Security is a permanent and essential concern. Reinsurers must not only comply with regulations such as the EU’s General Data Protection Regulation (GDPR), but must also, rightly, be concerned about the security of their systems, especially as more and more users move to the cloud. Here, reinsurers need to be sure that vendor security is more than sufficient and that internal IT resources have the capacity to respond to the constant threat of hacking, ransomware and malware.
Migration from one format to another should be handled by a trusted partner with experience in this space. The migration partner must not only understand the requirements of the different systems and environments between which data is migrated, but must also have a fundamental understanding of the risk-transfer activity itself. This is especially true in the world of reinsurance.
Finally, shaping data to fit a growing variety of systems is a major source of friction cost and uncertainty risk. Year after year, the same risk will come to market, but if it arrives from broker XYZ rather than broker ABC, the underwriter may be faced with a new presentation format that must be decoded and deciphered for different systems – including accounting and the loss, catastrophe and capital models available to the underwriter – and then entered into each of them, often in as many different formats as there are systems.
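To illustrate the friction described above, here is a minimal sketch of what normalizing two brokers’ submission formats into one canonical record might look like. All field names, broker labels and layouts are hypothetical, invented for illustration; they do not reflect any real broker standard.

```python
# Hypothetical sketch: two brokers submit the same risk in different layouts.
# Every field name here is illustrative, not a real market standard.

CANONICAL_FIELDS = ("insured", "peril", "limit", "attachment")

def from_broker_abc(row: dict) -> dict:
    # Broker ABC (hypothetical) uses flat keys with amounts as strings.
    return {
        "insured": row["insured_name"],
        "peril": row["peril_code"].upper(),
        "limit": float(row["limit_usd"]),
        "attachment": float(row["attach_usd"]),
    }

def from_broker_xyz(row: dict) -> dict:
    # Broker XYZ (hypothetical) nests the financial terms under other labels.
    terms = row["terms"]
    return {
        "insured": row["client"],
        "peril": row["hazard"].upper(),
        "limit": float(terms["occ_limit"]),
        "attachment": float(terms["retention"]),
    }

# One adapter per inbound format; the rest of the organization only ever
# sees the canonical shape.
ADAPTERS = {"ABC": from_broker_abc, "XYZ": from_broker_xyz}

def normalize(broker: str, row: dict) -> dict:
    record = ADAPTERS[broker](row)
    assert set(record) == set(CANONICAL_FIELDS)
    return record
```

The design point is that each new broker format costs one adapter, not one new decoding effort per downstream system.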
In catastrophe modeling, arguably one of the most important single cost areas for reinsurers, the trend towards open-source modeling and data formats is gaining ground. The Oasis Loss Modelling Framework (LMF) is an industry-backed initiative supported by more than 30 (re)insurers. The Oasis platform can host commercial models or in-house models, and models can also be developed directly on the platform.
Notably, open source does not have to apply to everything: users of the platform may well wish to keep their results confidential, and vendors of commercial models will want to keep their intellectual property private, which of course they can. In line with the Oasis open-source ethos, a community collaboration to develop open-source risk models is also planned.
On the data-format side, many players are involved in developing an open data format. Simplitium, a provider of risk analytics services, has launched a fully open-source data format for catastrophe modeling that is completely vendor-independent. The ultimate goal for reinsurers and insurers is a dataset formatted to the user’s specifications that can be retrieved as needed across the organization, providing a ‘single source of truth’ for all systems, including account management, accounting and claims.
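The ‘single source of truth’ idea can be sketched as a single canonical record from which each downstream system derives only the view it needs, rather than each system holding its own re-keyed copy. The record layout and field names below are hypothetical illustrations, not any published open data format.

```python
# Hypothetical sketch: one canonical contract record feeds several systems.
# All field names are illustrative assumptions.
canonical = {
    "contract_id": "R-2021-001",
    "cedant": "Example Insurance Co",
    "peril": "EQ",
    "limit": 10_000_000.0,
    "premium": 750_000.0,
    "inception": "2021-01-01",
}

def accounting_view(rec: dict) -> dict:
    # Accounting needs money and dates, not modeled peril detail.
    return {k: rec[k] for k in ("contract_id", "premium", "inception")}

def cat_model_view(rec: dict) -> dict:
    # The catastrophe model needs exposure terms, not the premium.
    return {k: rec[k] for k in ("contract_id", "peril", "limit")}
```

Because every view is derived from the same record, a correction made once in the canonical dataset propagates to accounting, modeling and claims alike.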
Data transparency becomes even more critical when discussing next-generation reinsurance business models, such as exchange-based secondary markets and digital platforms that enable automated reinsurance placement. Companies such as Tremor Technologies (Tremor), which pioneered a programmatic re/insurance risk-transfer marketplace, are leading the way in reducing data challenges to enable these data-centric business models. Tremor transforms the way insurers purchase reinsurance as a complement to the brokerage process, without requiring participants to normalize data and quotes or re-enter data into another system, dramatically improving the speed and quality of reinsurance placement execution.
As the reinsurance industry seeks ways to leverage data to increase efficiency, reduce cost drivers and improve the cedant experience, a concerted effort from all industry participants will be required to move towards open data formats, accelerate digitization and create a culture of data sharing and collaboration.
There are, however, practical and pragmatic concerns about business disruption, the high cost of upgrading existing systems, organizational inertia and a general lack of in-house digital talent. These issues can be effectively addressed by creating a strong data strategy that is tightly aligned with the organization’s business goals and data maturity, and by turning that strategy into action with the help of a data-driven technology and insurance partner.