Insurance providers are rich with data far beyond what they once had at their disposal for traditional historical analysis. The quantity, variety, and complexity of that data enhance the ability of insurers to gain greater insights into consumers, market trends, and strategies to improve their bottom line. But which projects offer you the best return on your investment? Here’s a glimpse at some of the most common insurance analytics project use cases that can transform the capabilities of your business.
Acquiring New Customers
Use your historical data to predict when a customer is most likely to buy a new policy.
Both traditional insurance providers and digital newcomers are competing for the same customer base. As a result, acquiring new customers requires targeted outreach with the right message at the moment a buyer is ready to purchase a specific type of insurance.
Predictive analytics allows insurance companies to evaluate the demographics of the target audience, their buying signals, preferences, buying patterns, pricing sensitivity, and a variety of other data points that forecast buyer readiness. This real-time data empowers insurers to reach prospective buyers with customized messaging that makes them more likely to convert.
Quoting Accurate Premiums
Provide instant access to correct quotes and speed up the time to purchase.
Consumers want the best value when shopping for insurance coverage, but if their quote fails to match their premium, they’ll take their business elsewhere. Insurers hoping to acquire and retain policyholders need to ensure their quotes are precise – no matter how complex the policy.
For example, one of our clients wanted to provide ride-share drivers with four-hour customized micro policies on-demand. Using real-time analytical functionality, we enabled them to quickly and accurately underwrite policies on the spot.
Improving Customer Experience
Better understand your customer’s preferences and optimize future interactions.
A positive customer experience means strong customer retention, a better brand reputation, and a reduced likelihood that a customer will leave you for the competition. In an interview with CMSWire, the CEO of John Hancock Insurance said many customers see the whole process as “cumbersome, invasive, and long.” A key solution is reaching out to customers in a way that balances automation and human interaction.
For example, the right analytics platform can help your agents engage policyholders at a deeper level. It can combine the customer story and their preferences from across customer channels to provide more personalized interactions that make customers feel valued.
Detecting Fraud
Stop fraud before it happens.
You want to provide all of your customers with the most economical coverage, but unnecessary costs inflate your overall expenses. Enterprise analytics platforms can evaluate petabytes of claims data to detect trends that indicate fraud, waste, and abuse.
See for yourself how a tool like Tableau can help you quickly spot suspicious behavior with visual insurance fraud analysis.
Improving Operations and Financials
Access and analyze financial data in real time.
In 2019, ongoing economic growth, rising interest rates, and higher investment income created ideal conditions for insurers. However, favorable conditions only pay off for companies that run efficient operations and keep a close watch on their ledgers.
Now, high-powered analytics has the potential to provide insurers with a real-time understanding of loss ratios, using a wide range of data points to evaluate which of your customers are underpaying or overpaying.
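At its core, the loss ratio is simple arithmetic: incurred losses divided by earned premium. Here is a minimal sketch in Python, using hypothetical policy IDs, figures, and column names rather than any particular insurer's data, of how that calculation surfaces policies running above break-even:

```python
import pandas as pd

# Hypothetical policy-level data: earned premium and incurred losses per policy.
policies = pd.DataFrame({
    "policy_id": [101, 102, 103, 104],
    "earned_premium": [1200.0, 950.0, 2100.0, 1800.0],
    "incurred_losses": [300.0, 1400.0, 0.0, 2500.0],
})

# Loss ratio = incurred losses / earned premium.
policies["loss_ratio"] = policies["incurred_losses"] / policies["earned_premium"]

# Policies with a ratio above 1.0 cost more in losses than they earn in premium,
# a rough proxy for customers who are underpaying relative to their risk.
underpriced = policies[policies["loss_ratio"] > 1.0]
print(policies)
print("Review pricing for policies:", underpriced["policy_id"].tolist())
```

In a real-time setting, the same calculation simply runs against continuously refreshed premium and claims feeds instead of a static extract.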
Are you interested in learning how a modern analytics platform like Tableau, Power BI, Looker, or other BI technologies can help you drive ROI for your insurance organization? Schedule a no-cost insurance whiteboarding strategy session to explore the full potential of your insurance data.
With your experience in the insurance industry, you understand more than most how the actions of a small number of people can cause disproportionate damage. An estimated $80 billion in fraudulent claims is paid out across all lines of insurance each year, through both soft and hard fraud perpetrated by lone individuals, sketchy auto mechanic shops, and the occasional organized crime ring. The challenge for most insurers is that detecting, investigating, and mitigating these deceitful claims is a time-consuming and expensive process.
Rather than accepting loss to fraud as part of the cost of doing business, some organizations are enhancing their detection capabilities with insurance analytics solutions. Here is how your organization can use insurance fraud analytics to enhance fraud detection, uncover emerging criminal strategies, and still remain compliant with data privacy regulations.
Recognizing Patterns Faster
When you look at exceptional claims adjusters or special investigation units, one of the major traits they all share is an uncanny ability to recognize fraudulent patterns. Their experience allows them to notice the telltale signs of fraud, whether it's frequent suspicious estimates from a body shop or complex billing codes intended to hide frivolous medical tests. Trustworthy as adjusters are, many rely on heuristic judgments (e.g., trial and error, intuition, etc.) rather than hard statistical analysis. And even when they do have statistical findings to back them up, they struggle to keep up with the sheer volume of claims.
This is where machine learning techniques can help to accelerate pattern recognition and optimize the productivity of adjusters and special investigation units. An organization starts by feeding a machine learning model a large data set that includes verified legitimate and fraudulent claims. Under supervision, the machine learning algorithm reviews and evaluates the patterns across all claims in the data set until it has mastered the ability to spot fraud indicators.
Let’s say this model was given a training set of legitimate and fraudulent auto insurance claims. While reviewing the data for fraud, the algorithm might spot links in deceptive claims between extensive damage in a claim and a lack of towing charges from the scene of the accident. Or it might notice instances where claims involve rental cars rented the day of the accident that are all brought to the same body repair shop. Once the algorithm begins to piece together these common threads, your organization can test how well the model applies those criteria to new, unlabeled claims and flags potential fraud on its own.
What’s important in this process is finding a balance between fraud identification and instances of false positives. If your program is overzealous, it might create more work for your agents, forcing them to prove that legitimate claims received an incorrect label. Yet when the machine learning model is optimized, it can review a multitude of dimensions to identify the likelihood of fraudulent claims. That way, if an insurance claim is called into question, adjusters can comb through the data to determine if the claim should truly be rejected or if the red flags have a valid explanation.
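As a rough illustration of that workflow, here is a minimal sketch in Python using scikit-learn. The file name, feature columns, and thresholds are hypothetical placeholders rather than a production fraud model; the point is simply supervised training on labeled claims followed by tuning the score threshold to balance fraud catch rate against false positives:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Hypothetical claims data: each row is a claim with engineered features and a
# verified label (1 = fraudulent, 0 = legitimate).
claims = pd.read_csv("labeled_claims.csv")  # assumed file
features = ["damage_amount", "towing_charge_present", "days_to_rental", "repair_shop_claim_freq"]
X_train, X_test, y_train, y_test = train_test_split(
    claims[features], claims["is_fraud"], test_size=0.25, random_state=42
)

# Supervised training on verified legitimate and fraudulent claims.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Rather than accepting the default 0.5 cutoff, tune the probability threshold
# to trade off fraud catch rate (recall) against false positives (precision).
scores = model.predict_proba(X_test)[:, 1]
for threshold in (0.3, 0.5, 0.7):
    flagged = (scores >= threshold).astype(int)
    print(
        f"threshold={threshold}: "
        f"precision={precision_score(y_test, flagged):.2f}, "
        f"recall={recall_score(y_test, flagged):.2f}"
    )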
Detecting New Strategies
The ability of analytics tools to detect known instances of fraud is only the beginning of their full potential. As with any type of crime, insurance fraud evolves with technology, regulations, and innovation. With that transformation comes new strategies to outwit or deceive insurance companies.
One recent example has emerged through automation. When insurance organizations began to implement straight through processing (STP) in their claim approvals, the goal was to issue remittances more quickly, easily, and cheaply than manual processes. For a time, this approach provided a net positive, but once organized fraudsters caught wind of this practice, they pounced on a new opportunity to deceive insurers.
Criminals learned to game the system, identifying amounts that were below the threshold for investigation and flying their fraudulent claims under the radar. In many cases, instances of fraud could potentially double without the proper tools to detect these new deception strategies. Though most organizations plan to enhance their anti-fraud technology, there’s still the potential for them to lose millions in errant claims – if their insurance fraud analytics are not programmed to detect new patterns.
In addition to spotting red flags for common fraud occurrences, analytics programs need to be attuned to any abnormal similarities or unlikely statistical trends. Using cluster analysis, an organization can detect statistical outliers and meaningful patterns that reveal potential instances of fraud (such as suspiciously identical fraud claims).
Even beyond the above automation example, your organization can use data discovery to find hidden indicators of fraud and predict future incidents. Splitting claims data into groups along a few parameters (such as region, physician, or billing code in healthcare) can reveal unexpected correlations and warning signs that your automation process, or even human adjusters, can flag as potential fraud.
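To make the cluster-analysis idea concrete, here is a hedged sketch using scikit-learn's DBSCAN. The input file and feature names are hypothetical; the takeaway is that claims falling outside every dense cluster become candidates for investigator review, while unusually large, near-identical clusters can also warrant a closer look:

```python
import pandas as pd
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# Hypothetical per-claim features, e.g., aggregated by provider and billing code.
claims = pd.read_csv("claims_features.csv")  # assumed file
features = ["billed_amount", "num_procedures", "days_between_visit_and_claim"]

# Scale features so no single dimension dominates the distance metric.
X = StandardScaler().fit_transform(claims[features])

# DBSCAN groups densely packed claims into clusters and labels sparse,
# dissimilar claims as noise (cluster -1), which become candidate outliers.
labels = DBSCAN(eps=0.8, min_samples=10).fit_predict(X)
claims["cluster"] = labels

outliers = claims[claims["cluster"] == -1]
print(f"{len(outliers)} claims flagged for investigator review")

# Suspiciously identical claims show up as unusually large, tight clusters.
cluster_sizes = claims[claims["cluster"] != -1]["cluster"].value_counts()
print(cluster_sizes.head())
```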
Safeguarding Personally Identifiable Information
As you work to improve your fraud detection, there’s one challenge all insurers face: protecting the personally identifiable information (PII) of policyholders while you analyze your data. The fines related to HIPAA violations can amount to $50,000 per violation, and other data privacy regulations can result in similarly steep fines. The good news is that insurance organizations can balance their fraud prediction and data discovery with security protocols if their data ecosystem is appropriately designed.
Maintaining data privacy compliance and effective insurance fraud analytics requires some maneuvering. Organizations that derive meaningful and accurate insight from their data must first bring all of their disparate data into a single source of truth. Yet, unless they also implement access control through a compliance-focused data governance strategy, there’s a risk of regulatory violations while conducting fraud analysis.
One way to limit your exposure is to create a data access layer that tokenizes the data, replacing any sensitive PII with unique identification symbols to keep data separate. Paired with clear data visualization capabilities, your adjusters and special investigation units can see clear-cut trends and evolving strategies without revealing individual claimants. From there, they can take their newfound insights into any red flag situation, saving your organization millions while reducing the threat of noncompliance.
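A minimal sketch of that tokenization idea, assuming a pandas-based access layer and hypothetical column names, might look like the following. A real implementation would pull the key from a managed secret store and handle format-preserving requirements, but the core pattern is replacing PII with deterministic tokens before the data reaches analysts:

```python
import hashlib
import hmac

import pandas as pd

# Secret key held by the data access layer, never exposed to analysts.
# In practice this comes from a key vault, not a hard-coded string.
TOKEN_KEY = b"replace-with-managed-secret"

PII_COLUMNS = ["policyholder_name", "ssn", "street_address"]  # hypothetical columns

def tokenize(value: str) -> str:
    """Replace a PII value with a deterministic, non-reversible token."""
    digest = hmac.new(TOKEN_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def tokenized_view(claims: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of the claims data with PII columns replaced by tokens."""
    safe = claims.copy()
    for col in PII_COLUMNS:
        safe[col] = safe[col].astype(str).map(tokenize)
    return safe

# Analysts and dashboards query the tokenized view; the same claimant always
# maps to the same token, so repeat-claim patterns stay visible without PII.
claims = pd.read_csv("raw_claims.csv")  # assumed file containing the columns above
print(tokenized_view(claims).head())
```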
Want to learn more about how the right analytics solutions can help you reduce your liability, issue more policies, and provide better customer service? Check out our insurance analytics solutions page for use cases that are transforming your industry.
Real-time analytics. Streaming analytics. Predictive analytics. These buzzwords are thrown around in the business world without a clear-cut explanation of their full significance. Each approach to analytics presents its own distinct value (and challenges), but it’s tough for stakeholders to make the right call when the buzz borders on white noise.
Which data analytics solution fits your current needs? In this post, we aim to help businesses cut through the static and clarify modern analytics solutions by defining real-time analytics, sharing use cases, and providing an overview of the players in the space.
TL;DR
Real-time or streaming analytics allows businesses to analyze complex data as it’s ingested and gain insights while it’s still fresh and relevant.
Real-time analytics has a wide variety of uses, from preventative maintenance and real-time insurance underwriting to improving preventive medicine and detecting sepsis faster.
To get the full benefits of real-time analytics, you need the right tools and a solid data strategy foundation.
What is Real-Time Analytics?
In a nutshell, real-time or streaming analysis allows businesses to access data within seconds or minutes of ingestion to encourage faster and better decision-making. Unlike batch analysis, data points are fresh and findings remain topical. Your users can respond to the latest insight without delay.
Yet speed isn’t the sole advantage of real-time analytics. The right solution is equipped to handle high volumes of complex data and still yield insight at blistering speeds. In short, you can conduct big data analysis at faster rates, mobilizing terabytes of information to allow you to strike while the iron is hot and extract the best insight from your reports. Best of all, you can combine real-time needs with scheduled batch loads to deliver a top-tier hybrid solution.
How does the hype translate into real-world results? Depending on your industry, there is a wide variety of examples you can pursue. Here are just a few that we’ve seen in action:
Next-Level Preventative Maintenance
Factories hinge on a complex web of equipment and machinery working for hours on end to meet the demand for their products. Through defects or standard wear and tear, a breakdown can occur and bring production to a screeching halt. Connected devices and IoT sensors now provide technicians and plant managers with warnings – but only if they have the real-time analytics tools to sound the alarm.
Azure Stream Analytics is one such example. You can use Microsoft’s analytics engine to monitor multiple IoT devices and gather near-real-time analytical intelligence. When a part needs a replacement or it’s time for routine preventative maintenance, your organization can schedule upkeep with minimal disruption. Historical results can be saved and integrated with other line-of-business data to cast a wider net on the value of this telemetry data.
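Stream Analytics jobs are typically written in a SQL-like query language, but the underlying rule is simple enough to sketch generically. The following Python snippet, with made-up thresholds and field names rather than the Azure API, shows the kind of rolling-window check such an engine evaluates as each sensor reading arrives:

```python
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)
VIBRATION_LIMIT = 7.0  # hypothetical threshold, e.g., mm/s

readings = deque()  # (timestamp, vibration) pairs inside the rolling window

def schedule_maintenance(avg_vibration: float) -> None:
    # Placeholder: a real pipeline would open a work order or page a technician.
    print(f"ALERT: 5-minute average vibration {avg_vibration:.1f} exceeds limit")

def on_reading(timestamp: datetime, vibration: float) -> None:
    """Process one telemetry reading as it streams in from a device."""
    readings.append((timestamp, vibration))
    # Drop readings that have fallen out of the rolling window.
    while readings and timestamp - readings[0][0] > WINDOW:
        readings.popleft()
    average = sum(v for _, v in readings) / len(readings)
    if average > VIBRATION_LIMIT:
        schedule_maintenance(average)

# Example: feed readings as they arrive from the message broker.
on_reading(datetime.now(), 7.4)
```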
Real-Time Insurance Underwriting
Insurance underwriting is undergoing major changes thanks to the gig economy. Rideshare drivers need flexibility from their auto insurance provider in the form of modified commercial coverage for short-term driving periods. Insurance agencies prepared to offer flexible micro policies that reflect real-time customer usage have the opportunity to increase revenue and customer satisfaction.
In fact, one of our clients saw the value of harnessing real-time big data analysis but lacked the ability to consolidate and evaluate their high-volume data. By partnering with our team, they were able to create real-time reports that pulled from a variety of sources ranging from driving conditions to driver ride-sharing scores. With that knowledge, they’ve been able to tailor their micro policies and enhance their predictive analytics.
Healthcare Analytics
How about this? Real-time analytics saves lives. Death by sepsis, an excessive immune response to infection that threatens the lives of 1.7 million Americans each year, is preventable when diagnosed in time. The majority of sepsis cases are not detected until manual chart reviews are conducted during shift changes, at which point the infection has often already compromised the bloodstream and/or vital tissues. However, if healthcare providers identified warning signs and alerted clinicians in real time, they could save countless patients before infections spread beyond treatment.
HCA Healthcare, a Nashville-based healthcare provider, undertook a real-time healthcare analytics project with that exact goal in mind. They created a platform that collects and analyzes clinical data from a unified data infrastructure to enable up-to-the-minute sepsis diagnoses. Gathering and analyzing petabytes of unstructured data in a flash, they are now able to get a 20-hour early warning that a patient is at risk of sepsis. Faster diagnosis results in faster, more effective treatment.
That’s only the tip of the iceberg. For organizations in the healthcare payer space, real-time analytics has the potential to improve member preventive healthcare. Once again, real-time data from smart wearables, combined with patient medical history, can provide healthcare payers with information about their members’ health metrics. Some industry leaders even propose that payers incentivize members to make measurable healthy lifestyle choices, lowering costs for both parties at the same time.
Getting Started with Real-Time Analysis
There’s clear value produced by real-time analytics but only with the proper tools and strategy in place. Otherwise, powerful insight is left to rot on the vine and your overall performance is hampered in the process. If you’re interested in exploring real-time analytics for your organization, contact us for an analytics strategy session. In this session lasting 2-4 hours, we’ll review your current state and goals before outlining the tools and strategy needed to help you achieve those goals.
It’s difficult to achieve your objectives when the goalposts are always in motion. Yet that’s often the reality for the healthcare industry. Ongoing changes in competition, innovation, regulation, and care standards demand real-time insight. Otherwise, it’s all too easy to miss watershed moments to change, evolve, and thrive.
Advanced or modernized analytics are often presented as the answer to reveal these hidden patterns, trends, or predictive insights. Yet when spoken about in an abstract or technical way, it’s hard to imagine the tangible impact that unspecified data can have on your organization. Here are some of the real-world use cases of big data analytics in healthcare, showing the valuable and actionable intelligence within your reach.
Improve Preventative Care
It’s been reported that six in ten Americans suffer from chronic diseases that impact their quality of life – many of which are preventable. Early identification and intervention reduce the risk of long-term health problems, but only if organizations can accurately identify vulnerable patients or members. Effective risk scoring is a balancing act between population-level overviews and individual specifics – a feat that depends on a holistic view of each patient or member.
A wide range of data contributes to risk scoring (e.g., patient/member records, social health determinants, etc.) and implementation (e.g., service utilization, outreach results, etc.). With data contained in an accessible, centralized infrastructure, organizations can pinpoint at-risk individuals and determine how best to motivate their participation in their preventive care. This can reduce instances of diabetes, heart disease, and other preventable ailments.
Encouraging healthy choices and self-care is just one potential example. Big data analytics has also proven to be an effective solution for preventing expensive 30-day hospital readmissions. Researchers at the University of Washington Tacoma used a predictive analytics model built on clinical data and demographic metrics to accurately predict which congestive heart failure patients would return to the hospital.
From there, other organizations have repurposed the same algorithmic framework to identify other preventable health issues and reduce readmission-related costs. One Chicago-based health system implemented a data-driven nutrition risk assessment that identified those patients at risk for readmissions. With that insight, they employed programs that combated patient malnutrition, cut readmissions, and saved $4.8 million. Those are huge results from one data set.
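The specifics of those models aren't spelled out here, but the general shape of a readmission risk model is straightforward to sketch. The file, feature names, and label below are hypothetical stand-ins for clinical and demographic inputs, not the UW Tacoma or Chicago implementations:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical discharge records: clinical and demographic features plus a label
# indicating whether the patient was readmitted within 30 days.
discharges = pd.read_csv("chf_discharges.csv")  # assumed file
features = ["age", "num_prior_admissions", "ejection_fraction", "length_of_stay", "lives_alone"]

X_train, X_test, y_train, y_test = train_test_split(
    discharges[features], discharges["readmitted_30d"], test_size=0.3, random_state=7
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Score each discharge with a readmission probability; care teams can target
# follow-up calls, nutrition programs, or home visits at the highest-risk patients.
risk = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, risk), 3))
```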
Boost Operational Efficiency
It’s well known that healthcare administrative costs in the United States are excessive. But it’s hard to keep your jaw from hitting the floor when you learn Canadian practices spend 27% of what U.S. organizations do for the same claims processing. That’s a clear sign of operational waste, yet one that doesn’t automatically illuminate the worst offenders. Organizations can shine a light on wastage with proper healthcare analytics and data visualizations.
For instance, the right analytics and BI platform is capable of accelerating improvements. It can cross-reference patient intake data, record-keeping habits, billing- and insurance-related costs, supply chain expenses, employee schedules, and other data points to extract hidden insight. With BI visualization tools, you can obtain actionable insight and make adjustments in a range of different functions and practices.
Additionally, predictive analytics solutions can help improve forecasting for both provider and payer organizations. For healthcare providers, a predictive model can help anticipate fluctuations in patient flow, enabling an appropriate workforce response to patient volume. Superior forecasting at this level reduces two types of waste: labor dollars from overscheduling and diminished productivity from under-scheduling.
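A patient-flow forecast doesn't have to start sophisticated. Assuming a hypothetical extract of daily emergency department arrivals and a made-up staffing ratio, a day-of-week baseline in pandas already gives schedulers something concrete to plan against:

```python
import pandas as pd

# Hypothetical history of daily emergency department arrivals.
history = pd.read_csv("ed_arrivals.csv", parse_dates=["date"]).set_index("date")

# Seasonal baseline: average arrivals by day of week over the last eight weeks.
recent = history.loc[history.index >= history.index.max() - pd.Timedelta(days=56)]
by_weekday = recent.groupby(recent.index.dayofweek)["arrivals"].mean()

# Forecast next week's volume and translate it into a staffing target,
# assuming (hypothetically) one nurse per five expected arrivals per day.
next_week = pd.date_range(history.index.max() + pd.Timedelta(days=1), periods=7)
forecast = pd.Series(next_week.dayofweek, index=next_week).map(by_weekday)
nurses_needed = (forecast / 5).round().astype(int)
print(pd.DataFrame({"expected_arrivals": forecast.round(1), "nurses_needed": nurses_needed}))
```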
Enhance Insurance Plan Designs
There is a distinct analytics opportunity for payers, third-party administrators, and brokers: enhancing their insurance plan designs. Whether you want to retain or acquire customers, your organization’s ability to provide a more competitive and customized plan than the competition will be a game-changer.
All of the complicated factors that contribute to the design of an effective insurance plan can be streamlined. Though most organizations have lots of data, it can be difficult to discern the big picture. But machine learning programs have the ability to take integrated data sources such as demographics, existing benefit plans, medical and prescription claims, risk scoring, and other attributes to build an ideal individualized program. The result? Organizations are better at catering to members and controlling costs.
Plenty of Other Use Cases Exist
And these are just a sample of what’s possible. Though there are still new and exciting ways you can analyze your data, there are also plenty of pre-existing roadmaps to elicit incredible results for your business. To get the greatest ROI, your organization needs guidance through the full potential of these groundbreaking capabilities.
Data is one of the insurance industry’s greatest assets, which is why data analytics is so important. Before digital transformations swept the business world, underwriters and claims adjusters were the original data-driven decision makers, gathering information to assess a customer’s risk score or evaluate potential fraud. Algorithms have accelerated the speed and complexity of analytics in insurance, but some insurers have struggled to implement the framework necessary to keep their underwriting, fraud detection, and operations competitive.
The good news is that we have a clear road map for how to implement data analytics in insurance that garners the best ROI for your organization. Here are the four steps you need to unlock even more potential from your data.
Step 1: Let your business goals, not your data, define your strategy
As masters of data gathering, insurers have no shortage of valuable and illuminating data to analyze. Yet the abundance of complex data flowing into their organizations creates an equally vexing problem: conducting meaningful analysis rather than spur-of-the-moment reporting.
It’s all too easy for agents working on the front lines to allow the data flowing into their department to govern the direction of their reporting. Though ad hoc reporting can generate some insight, it rarely offers the deep, game-changing perspective businesses need to remain competitive.
Instead, your analytics strategy should align with your business goals if you want to yield the greatest ROI. Consider this scenario: a P&C insurer wants to increase the accuracy of their policy pricing in a way that retains customers without incurring additional expenses from undervalued risk. With that goal defining the data strategy, it becomes a matter of identifying the data necessary to meet the objective.
If, for example, they lack complex assessments of the potential risks in the immediate radius of a commercial property (e.g., a history of flood damage, tornado warnings, etc.), the insurer can seek out that data from an external source to complete the analysis, rather than restricting the scope of their analysis to what they have.
Step 2: Get a handle on all of your data
The insurance industry is rife with data silos. Numerous verticals, LoBs, and M&A activity have created a disorganized collection of platforms and data storage, often with their own incompatible source systems. In some cases, each unit or function has its own specialized data warehouse or activities that are not consistent or coordinated. This not only creates a barrier to cohesive data analysis but can result in a hidden stockpile of information as LoBs make rogue implementations off the radar of key decision-makers.
Before you can extract meaningful insights, your organization needs to establish a single source of truth, creating a unified view of your disparate data sources. One of our industry-leading insurance clients provides a perfect example of the benefits of data integration. The organization had grown over the years through numerous acquisitions, and each LoB brought their own unique policy and claims applications into the fold. This piecemeal growth created inconsistencies that undermined enterprise-wide insight.
For example, the operational reports conducted by each LoB reported a different amount of paid losses on claims for the current year, calling into question their enterprise-wide decision-making process. As one of their established partners, 2nd Watch provided a solution. Our team conducted a current state assessment, interviewing a number of stakeholders to determine the questions each group wanted answered and the full spectrum of data sources that were essential to reporting.
We then built data pipelines (using SSIS for ETL and SQL Server) to integrate the 25 disparate sources we identified as crucial to our client’s business. We unified the metadata, security, and governance practices across the organization to provide a holistic view that also remained compliant with federal regulation. Now, their monthly P&L and operational reporting are simplified in a way that creates agreement across LoBs – and helps them make informed decisions.
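The production pipelines for that client ran on SSIS and SQL Server, but the conforming step at the heart of any source integration can be sketched in a few lines of pandas. The file and column names below are hypothetical examples of two LoB extracts describing the same concept in different vocabularies:

```python
import pandas as pd

# Two hypothetical LoB extracts that describe the same concept with different schemas.
auto_src = pd.read_csv("auto_claims.csv")          # columns: claim_no, paid_amt, closed_dt
property_src = pd.read_csv("property_claims.csv")  # columns: ClaimID, PaidLoss, DateClosed

# Conform each source to one shared, governed schema.
auto_conformed = auto_src.rename(
    columns={"claim_no": "claim_id", "paid_amt": "paid_loss", "closed_dt": "closed_date"}
).assign(line_of_business="auto")

property_conformed = property_src.rename(
    columns={"ClaimID": "claim_id", "PaidLoss": "paid_loss", "DateClosed": "closed_date"}
).assign(line_of_business="property")

# One unified table means every LoB reports paid losses from the same numbers.
claims = pd.concat([auto_conformed, property_conformed], ignore_index=True)
claims["closed_date"] = pd.to_datetime(claims["closed_date"])
print(claims.groupby("line_of_business")["paid_loss"].sum())
```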
Step 3: Create the perfect dashboard(s)
You’ve consolidated and standardized your data. You’ve aligned your analytics strategy with your goals. But can your business users quickly obtain meaning from your efforts? The large data sets analyzed by insurance organizations can be difficult to parse without a way to visualize trends and takeaways. For that very reason, building a customized dashboard is an essential part of the data analytics process.
Your insurance analytics dashboard is not a one-size-fits-all interface. Just as business goals should drive your strategy, they should also drive your dashboards. If you want people to derive quick insights from your data, the dashboard they’re using should evaluate KPIs and trends that are relevant to their specific roles and LoBs.
Claims adjusters might need a dashboard that compares policy type by frequency of utilization and cost, regional hotspots for claims submissions, or fraud priority scores for insurance fraud analytics. C-suite executives might be more concerned with revenue comparisons across LoBs, loss ratios per policy, and customer retention by vertical. All of those needs are valid. Each insurance dashboard should be designed and customized to satisfy the most common challenges of the target users in an interactive and low-effort way.
Much like the data integration process, you’ll find ideal use cases by conducting thorough stakeholder interviews. Before developers begin to build the interface, you should know the current analysis process of your end users, their pain points, and their KPIs. That way, you can encourage them to adopt the dashboards you create, running regular reports that maximize the ROI of your efforts.
Step 4: Prepare for ongoing change
A refined data strategy, consolidated data architecture, and intuitive dashboards are the foundation for robust data analytics in insurance. Yet the benchmark is always moving. There’s an unending stream of new data entering insurance organizations. Business goals are adjusting to better align with new regulations, global trends, and consumer needs. Insurers need their data analytics to remain as fluid and dynamic as their own organizations. That requires your business to have the answers to a number of questions.
How often should your dashboard update? Do you need real-time analytics to make up-to-the-minute assessments on premiums and policies? How can you translate the best practices from profitable use cases into different LoBs or roles? Though these questions (and many others) are not always intuitive, insurers can make the right preparations by working with a partner that understands their industry.
Here’s an example: One of our clients had a vision to implement a mobile application that enabled rideshare drivers to obtain commercial micro-policies based on the distance traveled and prevailing conditions. After we consolidated and standardized disparate data systems into a star schema data warehouse, we automated the ETL processes to simplify ongoing processes.
From there, we provided our client with guidance on how to build upon their existing real-time analytics to deepen the understanding of their data and explore cutting-edge analytical solutions. Creating this essential groundwork has enabled our team to direct them as we expand big data analytics capabilities throughout the organization, implementing a roadmap that yields greater effectiveness across their analytics.
P&C insurance is an incredibly data-driven industry. Your company’s core assets are data, your business revolves around collecting data, and your staff is focused on using data in their day-to-day workstreams. Although data is collected and used in normal operations, oftentimes the downstream analytics process is painful (think of those month-end reports). This is for any number of reasons:
Large, slow data flows
Unmodeled data that takes manual intervention to integrate
Legacy software that has a confusing backend and user interface
And more
Creating an analytics ecosystem that is fast and accessible is not a simple task, but today we’ll take you through the four key steps 2nd Watch follows to solve business problems with an insurance analytics solution. We’ll also provide recommendations for how best to implement each step, making the process as actionable as possible.
Step 1: Determine your scope.
What are your company’s priorities?
Trying to improve profit margin on your products?
Improving your loss ratio?
Planning for next year?
Increasing customer satisfaction?
To realize your strategic goals, you need to determine where you want to focus your resources. Work with your team to find out which initiative has the best ROI and the best chance of success.
First, identify your business problems.
There are so many ways to improve your KPIs that trying to identify the best approach can very quickly become overwhelming. To give yourself the best chance, be deliberate about how you go about solving this challenge.
What isn’t going right? Answer this question by talking to people, looking at existing operational and financial reporting, performing critical thinking exercises, and using other qualitative or quantitative data (or both).
Then, prioritize a problem to address.
Once you identify the problems that are impacting metrics, choose one to address, taking these questions into account:
What is the potential reward (opportunity)?
What are the risks associated with trying to address this problem?
How hard is it to get all the inputs you need?
RECOMMENDATION
Taking on a scope that is too large, too complex, or unclear will make it very difficult to achieve success. Clearly set boundaries and decide what is relevant to determine which pain point you’re trying to solve. A defined critical path makes it harder to go off course and helps you keep your goal achievable.
Step 2: Identify and prioritize your KPIs.
Next, it’s time to get more technical. You’ve determined your pain points, but now you must identify the numeric KPIs that can act as the proxies for these business problems.
Maybe your business goal is to improve policyholder satisfaction. That’s great! But what does that mean in terms of metrics? What inputs do you actually need to calculate the KPI? Do you have the data to perform the calculations?
Back to the policyholder satisfaction example, suppose you weigh your top three candidate KPIs against the inputs each one requires and whether that data is available today. One of those candidates is the time to close (TTC) a claim.
Based on that comparison, even though the TTC metric may be your third-favorite KPI for measuring customer satisfaction, its required inputs are already identified and the data is available. This makes it the best option for the data engineering effort at this point in time. It also helps you identify a roadmap for the future if you want to start collecting richer information.
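As a quick illustration of why TTC is attractive from a data engineering standpoint, the entire KPI reduces to two fields most claims systems already capture. A hedged pandas sketch with hypothetical file and column names looks like this:

```python
import pandas as pd

# Hypothetical claims extract with the two inputs TTC actually requires.
claims = pd.read_csv("claims.csv", parse_dates=["opened_date", "closed_date"])

# Time to close (TTC) in days, computed only for claims that have closed.
closed = claims.dropna(subset=["closed_date"]).copy()
closed["ttc_days"] = (closed["closed_date"] - closed["opened_date"]).dt.days

# The KPI itself, plus a monthly trend your satisfaction initiative can track.
print("Average TTC (days):", round(closed["ttc_days"].mean(), 1))
print(closed.groupby(closed["closed_date"].dt.to_period("M"))["ttc_days"].mean())
```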
RECOMMENDATION
As you identify the processes you’re trying to optimize, create a data dictionary of all the measures you want to use in your reporting. Keep in mind that a single KPI might:
Have more and higher-quality data available
Be easier to calculate
Be used to solve multiple problems
Be a higher priority to the executive team
Use this list to prioritize your data engineering effort and create the most high-value reports first. Don’t engineer in a vacuum (i.e., generate KPIs because they “seem right”). Always have an end business question in mind.
Step 3: Design your solution.
Now that you have your list of prioritized KPIs, it’s time to build the data warehouse. This will allow your business analysts to slice your metrics by any number of dimensions (e.g., TTC by product, TTC by policy, TTC by region, etc.).
2nd Watch’s approach usually involves a star schema reporting layer and a customer-facing presentation layer for analysis. A star schema has two main components: facts and dimensions. Fact tables contain the measurable metrics that can be summarized. In the TTC example, the fact claim table might contain a numeric measure for the number of days it took to close a claim. A dimension table then provides context for how you pivot the measure. For example, you might have a dimension policyholder table that contains attributes used to “slice” the KPI value (e.g., policyholder age, gender, tenure, etc.).
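Here is a small, hypothetical sketch of that fact/dimension split in pandas: a fact table carrying the days-to-close measure and a policyholder dimension whose attributes slice it. In practice these would be warehouse tables; the keys, columns, and values below are invented for illustration:

```python
import pandas as pd

# Fact table: one row per claim, carrying the measurable metric (days to close).
fact_claim = pd.DataFrame({
    "claim_key": [1, 2, 3, 4],
    "policyholder_key": [10, 10, 11, 12],
    "days_to_close": [12, 30, 7, 45],
})

# Dimension table: descriptive attributes used to pivot or "slice" the measure.
dim_policyholder = pd.DataFrame({
    "policyholder_key": [10, 11, 12],
    "age_band": ["25-34", "35-44", "25-34"],
    "tenure_years": [2, 9, 1],
    "region": ["Midwest", "West", "Midwest"],
})

# Join fact to dimension, then slice the TTC measure by any dimension attribute.
ttc = fact_claim.merge(dim_policyholder, on="policyholder_key")
print(ttc.groupby("region")["days_to_close"].mean())
print(ttc.groupby("age_band")["days_to_close"].mean())
```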
Once you’ve designed the structure of your database, you can build it. This involves transforming the data from your source system to the target database. You’ll want to consider the ETL (extract-transform-load) tool that will automate this transformation, and you’ll also need to consider the type of database that will store your data. 2nd Watch can help with all of these technology decisions.
You may also want to take a particular set of data standards into account, such as the ACORD Standards, to ensure more efficient and effective flow of data across lines of business, for example. 2nd Watch can take these standards into account when implementing an insurance analytics solution, giving you confidence that your organization can use enterprise-wide data for a competitive advantage.
Finally, when your data warehouse is up and running, you want to make sure your investment pays off by managing the data quality of your data sources. This can all be part of a data governance plan, which includes data consistency, data security, and data accountability.
RECOMMENDATION
Don’t feel like you need to implement the entire data warehouse at once. Prioritize your data sources and realize that you can gain many benefits by implementing just some of them.
Step 4: Put your insurance analytics solution into practice.
After spending the time to integrate your disparate data sources and model an efficient data warehouse, what do you actually get out of it? As an end business user, this effort can bubble up as flat file exports, dashboards, reports, or even data science models.
I’ve outlined three levels of data maturity below:
Level 1
The most basic product would be a flat file. Often, mid-to-large-sized organizations working with multiple source systems work in analytical silos. They connect directly to the back end of a source system to build analytics. As a result, intersystem analysis becomes complex with non-standard data definitions, metrics, and KPIs.
With all of that source data integrated in the data warehouse, the simplest way to begin to analyze the data is off of a single flat extract. The tabular nature of a flat file will also help business users answer basic questions about their data at an organizational level.
Level 2
Organizations farther along the data maturity curve will begin to build dashboards and reports off of the data warehouse. Regardless of your analytical capabilities, dashboards allow your users to glean information at a glance. More advanced users can apply slicers and filters to better understand what drives their KPIs.
By compiling and aggregating your data into a visual format, you make the breadth of information at your organization much more accessible to your business users and decision-makers.
Level 3
The most mature product of data integration would be data science models. Machine learning algorithms can detect trends and patterns in your data that traditional analytics would take far longer to uncover, if it could at all. Such models can help insurers screen cases more efficiently and predict costs with greater precision. When writing policies, a model can identify and price risk based on demographic or historical factors.
RECOMMENDATION
Start simple. As flashy and appealing as data science can be to stakeholders and executives, the bulk of the value of a data integration platform lies in making the data accessible to your entire organization. Synthesize your data across your source systems to produce file extracts and KPI scorecards for your business users to analyze. As users begin to adopt and understand the data, think about slowly scaling up the complexity of analysis.
Conclusion
This was a lot of information to absorb, so let’s summarize the roadmap to solving your business problems with insurance analytics:
Step 1: Determine your scope.
Step 2: Identify and prioritize your KPIs.
Step 3: Design your solution.
Step 4: Put your insurance analytics solution into practice.
2nd Watch’s data and analytics consultants have extensive experience with roadmaps like this one, from outlining data strategy to implementing advanced analytics. If you think your organization could benefit from an insurance analytics solution, feel free to get in touch to discuss how we can help.