A case study of inari’s bordereau management platform

When delegating authority to coverholders, there comes a point in the process when you need to receive large volumes of risk, premium and claims data from your coverholder partners. This data arrives hard and fast, and each coverholder generates their information from different systems, to different standards, so the data sets can vary widely in format and content. 
Typically, this information needs to be received, triaged, sanitised and then analysed, first in isolation and then as part of the wider data exchange for the period of the binder, and all at scale. This process can be very taxing for operations and underwriting departments, and hard to manage. The downstream impact can include risk accepted outside binding agreements, payments that are not properly monitored and, ultimately, financial loss. 
Our client, a leading market Managing Agent and Syndicate that manages a large amount of coverholder business, needed a platform that could ingest, sanitise, normalise and conform data, then run binder rules at scale. The goal: data conformed to a standard irrespective of origin, normalised to a common standard in terms of content, and then validated to ensure the information sits within the different binding agreements, taking into account cases such as rolling bordereaux, duplicate entries and data anomalies. 
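To make the rolling-bordereau case concrete: each monthly submission typically repeats rows already reported in earlier periods, so validation has to separate genuinely new entries from resubmissions. The sketch below is illustrative only, not the platform's actual implementation; the key fields (policy_ref, inception, premium) are assumptions.

```python
# Hedged sketch: de-duplicating a rolling bordereau, where each monthly
# submission repeats rows from earlier periods. Key fields are assumed.

def dedupe_rolling(previous_rows, current_rows):
    """Return only the rows in current_rows not already reported."""
    seen = {(r["policy_ref"], r["inception"], r["premium"]) for r in previous_rows}
    new_rows = []
    for row in current_rows:
        key = (row["policy_ref"], row["inception"], row["premium"])
        if key not in seen:
            seen.add(key)  # also guards against duplicates within one file
            new_rows.append(row)
    return new_rows

march = [{"policy_ref": "P-001", "inception": "2024-03-01", "premium": 1200.0}]
april = [
    {"policy_ref": "P-001", "inception": "2024-03-01", "premium": 1200.0},  # repeat
    {"policy_ref": "P-002", "inception": "2024-04-01", "premium": 800.0},   # new
]
print(dedupe_rolling(march, april))  # only P-002 survives
```

A real pipeline would use a richer composite key and fuzzy matching to catch near-duplicates, but the principle is the same: validation is stateful across reporting periods, not file by file.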
In a world without Matsuri, a typical rolling bordereau of 5,000 rows would take days to process, in addition to the time it then takes to work with the coverholder to resolve any issues. Multiply this by 10, 20 or 30 binder agreements, each with a different coverholder and a different volume of data, and you have an entire team of people “lost” in what is effectively a data management pipeline. 
With Matsuri, this client can upload a sample bordereau and map it to a common, standardised format. The system then automatically ingests the data, detects data issues (and attempts to fix known data types it finds in error), conforms the data to the standard format and runs business rules to make sure each submitted row is within the desired binding authority. The result is a clean, conformed and business-approved data set that can be used in a data warehouse and also sent to a policy administration system. 
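The ingest-and-validate flow described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the platform's actual code: the column mapping, binder limits and rule checks below are all hypothetical.

```python
# Hedged sketch of the flow: map coverholder headers to a common schema,
# fix known data-type issues, run binder rules per row, and partition the
# result into clean rows and exceptions. All names/limits are assumptions.

COLUMN_MAP = {"Pol Ref": "policy_ref", "Gross Prem": "premium", "Terr": "territory"}
BINDER = {"max_premium": 50_000.0, "territories": {"UK", "IE"}}

def normalise(raw_row):
    """Map coverholder-specific headers to the common schema and coerce types."""
    row = {COLUMN_MAP[k]: v for k, v in raw_row.items() if k in COLUMN_MAP}
    row["premium"] = float(str(row["premium"]).replace(",", ""))  # fix a known issue
    return row

def validate(row, binder=BINDER):
    """Return a list of binder-rule breaches for one normalised row."""
    errors = []
    if row["premium"] > binder["max_premium"]:
        errors.append("premium exceeds binder limit")
    if row["territory"] not in binder["territories"]:
        errors.append("territory outside binding authority")
    return errors

raw = [
    {"Pol Ref": "P-001", "Gross Prem": "1,200.00", "Terr": "UK"},
    {"Pol Ref": "P-002", "Gross Prem": "75,000.00", "Terr": "FR"},
]
clean, exceptions = [], []
for r in map(normalise, raw):
    errs = validate(r)
    (exceptions if errs else clean).append((r, errs))
print(len(clean), len(exceptions))  # 1 clean row, 1 exception
```

In practice the clean set would feed the data warehouse and policy administration system, while the exceptions would be returned to the coverholder with the specific rule breaches attached.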
This process happens within minutes instead of days, saving time, freeing operations and underwriting staff to focus on business-critical issues, preventing possible financial leakage and raising internal standards for data quality and consistency. 

Frank Perkins, CEO