Private Equity Roll-Up Strategies – Accelerated Value Creation Through Data and Analytics

Private equity firms face a major challenge when executing roll-up strategies in their investment sectors: rolling up multiple acquisitions creates an information and reporting nightmare. How do PE firms and their operating C-suite teams quickly get basic financial reporting from each acquisition, and how do they get consolidated financial information across the newly built enterprise? And once basic financial reporting is in place, how do they accelerate financial and operating transformation, finding cash flow enhancement and EBITDA improvement opportunities?


The Challenge: As your roll-up deals close, your financial data sources grow exponentially, and producing accurate financial reports becomes extremely difficult and time-consuming.

As deals in the acquisition roll-up pipeline begin to close, there is typically a proliferation of information management systems from the newly acquired companies. As the number of systems mounts, it becomes increasingly difficult to produce good financial reports. Typically, the result is a manual financial consolidation process in which each acquisition submits its statements and the PE team consolidates them by hand, a very difficult and time-consuming process at a point in time that is extremely critical for the deal.

Want better reporting or advanced analytics? Our private equity analytics consultants are here to help. Request a whiteboarding session today!

With this newly acquired “Tower of Financial Babel,” PE firms and their operators are looking to accelerate profit improvement and value creation. However, they are usually trying to make management and value transformation decisions while sorting through a legion of information management systems with different charts of accounts, conflicting formats, and diverse accounting policies. To deal with this mounting complexity, the deal accounting team must try to reconcile the different statements, frequently requiring multiple conversations with each acquired company’s executive team. In turn, this takes away valuable time those leadership teams need to manage their companies and create value.

The reality is that the financial consolidation process for the typical PE roll-up becomes highly complex, very manual, and very slow. As more roll-up deals are completed and added to the enterprise, the problem only gets worse and more fragmented. Given this difficult process, the consolidated statements are often delayed and can even become suspect, as the potential for reporting errors increases with process complexity.

Those are just the basics. From there, incremental value creation and financial transformation are extremely difficult. How can PE firms quickly begin the financial and operating transformation that needs to take place? Specifically, how does the typical PE firm identify the EBITDA improvement opportunities, as well as the cash flow harvesting, that need to happen across the new and expanding acquired enterprise?

The first thought might be to merge all of the new acquisitions onto one of the incumbent systems. However, these large-scale ERP migration projects are very risky and prone to failure, more often than not running late and well over budget. They are also incredibly disruptive to local operating management, who are integrating into the new ownership structure while trying to improve operations and financial results. It is easy to see why migrating each new acquisition from the deal pipeline onto a new ERP, EMR, or any other centralized management platform is difficult, complex, expensive, and disruptive at a time when the new acquisition is vulnerable. PE firms want their operating teams focused on execution, operations, and financial performance, especially when the roll-up strategy is in its initial stages.

Worst of all, the PE firm and the executive teams could be consolidating onto an existing enterprise information management system that is sub-optimal for the new, larger, consolidated company. That choice will most likely create short-, medium-, and long-term management issues that will surface during the deal exit process.

Instead of migrating onto an incumbent system, why not take a long-term approach by developing a truly data-driven strategy? Take time to develop a holistic view of the new enterprise and be deliberate when devising and implementing a new enterprise management system. This enterprise view will allow the roll-up to find and implement market-leading systems, creating the ultimate strategic competitive advantage for the future-state company. And this data strategy can be developed quickly, often in a matter of a few weeks.

How, then, do the PE firm and the roll-up’s operating executive team quickly get the financial information needed to run and manage the company and begin the financial and operational transformation?

The best way to accomplish these goals with minimal operating company disruption, and on a relatively short timeline, is to merge the portfolio companies at the data level. Pull each portfolio company’s data sources (ERP, CRM, EMR, auxiliary systems, social media, IoT, third-party, etc.) into a consolidated data warehouse, then integrate the companies virtually by designing and implementing a data model that enables the consolidation.
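To make “merging at the data level” concrete, here is a minimal sketch of the consolidation step, assuming each portfolio company exports a monthly trial balance as a CSV file and that a hand-maintained mapping table translates each local account code to the roll-up’s standard chart of accounts. The file names, column names, and mapping are illustrative; a production pipeline would land this data in a governed warehouse rather than flat files.

```python
import pandas as pd

# Illustrative inputs: one trial-balance extract per portfolio company and a
# hand-maintained mapping from each company's local account code to the
# roll-up's standard chart of accounts.
EXTRACTS = {
    "alpha_health": "alpha_health_trial_balance.csv",   # columns: account_code, period, amount
    "beta_clinics": "beta_clinics_trial_balance.csv",
    "gamma_care":   "gamma_care_trial_balance.csv",
}
ACCOUNT_MAP = pd.read_csv("standard_account_map.csv")   # columns: company, account_code, std_account

frames = []
for company, path in EXTRACTS.items():
    tb = pd.read_csv(path)
    tb["company"] = company
    frames.append(tb)

# Stack all companies, then translate local codes to the standard chart of accounts.
consolidated = (
    pd.concat(frames, ignore_index=True)
      .merge(ACCOUNT_MAP, on=["company", "account_code"], how="left")
)

# Anything that fails to map is surfaced for the deal accounting team to resolve.
unmapped = consolidated[consolidated["std_account"].isna()]

# Consolidated view by standard account and period feeds the reporting dashboards.
income_statement = (
    consolidated.dropna(subset=["std_account"])
                .groupby(["period", "std_account"], as_index=False)["amount"].sum()
)
print(income_statement.head())
```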

Once the data warehouse and consolidated data model are in place, standardized, single-source-of-truth finance and operating statement dashboards can be delivered on a near real-time basis to PE management, lenders, the executive management team, local operating managers, and employees. These dashboards enable self-service analytics, allowing for a much deeper understanding of portfolio company performance and for efficient operational reviews and conversations.

With the dashboards in place, the complete team can focus on value identification and creation with standardized KPIs (e.g., EBITDA, operating metrics, cash flow, etc.). Dashboards can be designed and built to benchmark across the portfolio companies and against industry best practices. Value trend lines can be measured, and best-case or lagging performers identified. Specific areas of performance can be targeted (e.g., A/R, inventory, fixed asset utilization, labor efficiency, etc.), again comparing performance and finding sources of cash flow improvement. With these analytics and insights, best practices can be identified and action plans created for implementation across the new enterprise. Conversely, remediation action plans can be put in place for lagging performers, with improvement monitored and scorecarded.
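As a small illustration of the benchmarking idea, the sketch below computes a couple of standardized KPIs per portfolio company from the consolidated model and compares each company against the portfolio median. The figures and KPI definitions are illustrative assumptions, not a prescribed metric set.

```python
import pandas as pd

# Illustrative trailing-twelve-month figures pulled from the consolidated data model.
financials = pd.DataFrame({
    "company":     ["alpha_health", "beta_clinics", "gamma_care"],
    "revenue":     [12_400_000, 8_100_000, 15_900_000],
    "ebitda":      [1_860_000, 890_000, 2_540_000],
    "receivables": [3_050_000, 2_430_000, 3_180_000],
})

# Standardized KPIs so every portfolio company is measured the same way.
financials["ebitda_margin"] = financials["ebitda"] / financials["revenue"]
financials["dso_days"] = financials["receivables"] / financials["revenue"] * 365  # days sales outstanding

# Compare each company against the portfolio median to flag leaders and laggards.
for kpi in ("ebitda_margin", "dso_days"):
    financials[f"{kpi}_vs_median"] = financials[kpi] - financials[kpi].median()

print(financials.sort_values("ebitda_margin", ascending=False))
```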

Building and implementing an enterprise data warehouse and business intelligence dashboards creates an asset of true incremental value. With all enterprise data consolidated and modeled, opportunities open up for predictive analytics, data science, and deep value discovery. New marketing analytics strategies can be put in place. New products and services can be imagined and developed through data science ROI use cases. Optimization opportunities will be found, analyzed, and implemented. And so on, as the operating teams become data-savvy.

In summary, the quickest and most effective way to effect a PE financial and operating transformation for a roll-up is to integrate at the data level. This data integration, consisting of a data warehouse and targeted financial and operating dashboards, will accelerate the value creation process. In turn, it will maximize the financial ROI of the deal, generating the greatest return for the investment firm and its fund investors.

If you’re ready to accelerate value creation throughout your portfolio, contact us today to schedule a complimentary Advanced Analytics and Reporting Whiteboard Strategy Session.


Data 101 for Marketers

In Data 101 for Marketers, we’ll cover the basics of data you might encounter as you seek to take control of your data, improve your analytics, and get more value from your MarTech investments. This includes:

  • The definition of data
  • Different types of data
  • What matters most for marketers
  • Examples of marketing data
  • The benefits of marketing data management


Data

Definition:

Data is any piece of information that can be used to analyze, manage, or connect with your buyers. Data is often stored in various systems throughout your organization, such as your website or email marketing tool.

Why it matters for marketers:

At the most basic level, data can be used to communicate with customers. As a marketing organization matures, the need to access, analyze, and leverage data becomes more critical.

Types of Data: Structured Data vs. Unstructured Data

There are two main types of data: structured and unstructured. Each contains valuable insights about your buyers. When the two are combined, your marketing team can create greater context for the data and expand the depth of your analysis.

Structured Data

Definition:

Structured data is highly organized, formatted, and searchable data that fits neatly into a field in a data table. This data gives you a basic understanding of who your customers and prospects are. It’s also known as quantitative data.

Examples:

In marketing, structured data is typically stored in systems such as customer relationship management (CRM) tools, enterprise resource planning (ERP) software, and point of sale (POS) systems. It includes information like:

  • Names
  • Dates
  • Phone numbers
  • Email addresses
  • Purchase history
  • Credit card numbers
  • Order numbers

How it is used:

Structured data is the data you use to connect with and understand your customers and prospects at the most basic level.

The information is used in:

  • Email communication in your CRM or marketing automation tool
  • Tracking of inbound and outbound sales, marketing, and service touchpoints through your CRM
  • Website and content optimization for search engine optimization (SEO)
  • Purchase history analysis

Real-world examples:


Example 1: Gmail uses structured data from your flight confirmation to provide a quick snapshot of your flight details within the email.


Example 2: Your marketing automation software uses structured data to pull customer names for customized email campaigns.


Unstructured Data

Definition:

Unstructured data is any data that does not fit into a pre-designed data table or database. This data often holds deeper insights into your customers but can be difficult to search and analyze. It’s also known as qualitative data.

Examples:

Unstructured data is relevant and insightful information about your customers and prospects from a variety of sources such as:

  • Email or chat messages
  • Images
  • Videos or video files
  • Contracts
  • Social media posts
  • Survey results
  • Reports

How it is used:

Unstructured data, often combined with structured data, can be used to find deep insights on customer or prospect behavior, sentiment, or intent such as:

  • Understanding buying habits
  • Gaining a 360 view of the customer
  • Measuring sentiment toward a product or service
  • Tracking patterns in purchases or behaviors

Real-world examples:

Social media data has a huge impact on businesses today. Social listening is a way to gain deeper insight into your customers and what they think of your business. Customers might comment on your posts, share their own user-generated content, or post about your business. All of those highly valuable data points are unstructured, or qualitative, in nature, but they provide a deeper dive into the minds of consumers.

Data Sources

Definition:

Data sources are the origin points of your data. They can be files, databases, or even live data feeds. Marketing data sources include web analytics, marketing automation platforms, CRM software, and POS systems.

Why it matters for marketers:

Each data source holds a fragment of a story about your customers and prospects. Often these data sources come from siloed systems throughout your business. By combining various data sources, you can uncover the full narrative and get a 360 view of your customers and prospects.
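As a minimal sketch of what combining data sources can look like in practice, the snippet below joins illustrative CRM, web analytics, and customer service extracts on email address to build a simple 360 view. The file and column names are assumptions; a real implementation would resolve customer identities more carefully than a straight email match.

```python
import pandas as pd

# Illustrative extracts from three siloed systems, keyed on email address.
crm = pd.read_csv("crm_contacts.csv")            # email, name, lifecycle_stage
web = pd.read_csv("web_analytics_sessions.csv")  # email, pages_viewed, last_visit
support = pd.read_csv("support_tickets.csv")     # email, open_tickets, last_issue

# Join the fragments into a single customer view.
customer_360 = (
    crm.merge(web, on="email", how="left")
       .merge(support, on="email", how="left")
)

# Example use: segment on behavior and service history together,
# e.g., engaged visitors with no open support issues.
target_segment = customer_360[
    (customer_360["pages_viewed"].fillna(0) >= 5)
    & (customer_360["open_tickets"].fillna(0) == 0)
]
print(target_segment[["email", "name", "lifecycle_stage"]].head())
```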

Making use of new technology to aggregate and analyze data sources can reduce the marketing dollars and time spent on multiple software tools to piece together the data you need for day-to-day questions or analysis.

Real-world examples:

CMOs and marketers are increasingly being asked to justify marketing spend against KPIs. This can be challenging because a lot of marketing activity is, by nature, indirect brand-building. However, that doesn’t mean we can’t get better at measuring it.

It isn’t an easy task, but centralizing your marketing data sources actually makes it easier to prove ROI. It cuts down on reporting time, enhances the customer experience, and makes it easy to use insights from one channel to inform another.

For example, customer service data can make a huge difference for the sales team. If a customer emails or calls a customer service rep with a complaint, that issue should not only get tracked in the service rep’s software but in the sales representative’s system as well. That way, when the sales rep calls on that customer again, they have the full history of service and/or repairs made, potentially making it easier to retain or upsell that customer.

We hope you found this intro into data management useful. Feel free to contact us with any questions or to learn more about marketing data solutions.

Want better data insights and customer analytics?

2nd Watch’s Marketing Analytics Starter Pack provides an easy way to get started or expand your current marketing reporting and analytics capabilities.



Your Roll-Up Strategy: What Should You Do about Disparate ERP Systems?

Your strategy is to roll up companies and drive value through economies of scale, market influence, and more efficient operational processes. That all makes sense, but what should you do about all the disparate ERP or core application data you need from all those disparate organizations and systems to drive that value? How can you best achieve your target returns without spending the money to rip and replace those expensive ERP applications, and without disrupting the business?

Our reference architecture for private equity firms with roll-up strategies is to design and build a data hub. The data hub is external to your ERP and existing application systems at your portfolio companies. It is likely cloud-based and serves as a single source of truth for managing and monitoring performance. Data is sourced from all the disparate ERPs in your world, scrubbed and standardized, and then used to build dashboards and analytics solutions for use across your organization.

Your goals with a roll-up strategy are:

  • Quickly integrate any acquired companies into your data and analytics world.
  • Provide a common view of to-be-measured processes and performance.
  • Establish benchmarks and target performance levels.
  • Broadly disseminate actual performance data.
  • Drive accountability for value creation through improved operational performance.

You have to deal with several data challenges in your roll-ups. The first is likely the use of different ERP or core applications, from different vendors, with no easy way to consolidate or integrate. Your acquired companies don’t all share a common set of metrics or KPIs, nor do they all define key metrics the same way. (Product gross margins, for example, are calculated differently at different companies.) The more organizations you roll up, the more likely it is that you can’t access the data you want, can’t compare performance easily across acquired companies, and can’t build accountability without a standard set of performance metrics that is broadly disseminated across the organization.

The data hub approach is the right solution. It sources the data from your existing disparate ERP systems without the expense or costly delay of replacing the ERP in your acquired companies. The data can be cleansed and standardized regardless of which ERP it came from. The data hub also incorporates a business layer that transforms the data into an easily understood and highly accessible form, so every executive in your portfolio companies can easily access, digest, analyze, and act on the data.
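As a small illustration of the business layer’s job, the sketch below standardizes one metric, product gross margin, across two ERPs that calculate it differently. The source column names and the policy choice (treating freight as cost of sales) are assumptions made for the example.

```python
import pandas as pd

# Illustrative product-level extracts landed in the hub from two different ERPs.
erp_a = pd.read_csv("erp_a_product_sales.csv")  # columns: net_sales, cogs
erp_b = pd.read_csv("erp_b_product_sales.csv")  # columns: gross_sales, discounts, material_cost, freight_cost

# Business layer: one gross-margin definition, applied consistently to both sources.
a = pd.DataFrame({
    "company": "company_a",
    "net_sales": erp_a["net_sales"],
    "cost_of_sales": erp_a["cogs"],
})
b = pd.DataFrame({
    "company": "company_b",
    "net_sales": erp_b["gross_sales"] - erp_b["discounts"],
    "cost_of_sales": erp_b["material_cost"] + erp_b["freight_cost"],  # freight included by policy
})

margins = pd.concat([a, b], ignore_index=True)
margins["gross_margin_pct"] = (
    (margins["net_sales"] - margins["cost_of_sales"]) / margins["net_sales"]
)

# Now the two companies can be compared on a like-for-like basis.
print(margins.groupby("company")["gross_margin_pct"].mean())
```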

We rely on a data hub reference architecture that is mature and proven across many companies. Using modern analytical tools, it provides real-time data in dashboards for everyone from CEOs down to individual account executives. It supports ad-hoc analytics and advanced data science modeling, and it leverages best-of-breed reporting software with real-time data feeds. Modern data hubs can also take in IoT data and unstructured web data such as social media feeds and sentiment analysis.

Data hubs are the tool to drive your value creation, but you also need to develop people and process tools such as an integration playbook, gamification dashboards, and alerts for driving real-time decision-making and management actions. Our preferred solution for most of our PE clients is 50% technology and 50% people and process changes to drive quick results and optimum performance.

If you buy three different companies, you are buying three sets of ERP data, three sets of KPI definitions, three sets of processes, and three sets of management styles. You need a data strategy, and a data hub, to move those companies to a single consolidated and standardized view and achieve quick returns on your investments. Get your data out of the three ERPs, or 30 ERPs, and into a data hub for quick value creation post-close, without waiting years or spending millions to standardize the ERP systems across your portfolio companies.

Need help getting a single source of truth from your portfolio companies’ systems? A 2nd Watch whiteboarding session is the first step to building an advanced analytics and reporting strategy to find the insights you need. Click here to learn more and get started.


28 Questions to Ask During Due Diligence to Accelerate Data Value Creation

Data and analytics are a major driver and source of great value for private equity firms. The best private equity firms know the full power of data and analytics. They realize that portfolio company enterprise data is typically the crown jewel of an acquisition or deal target.

Data and analytics are also the foundation of financial and operational transformation. Quickly pulling data from portfolio companies and consolidating it into actionable information will enable and accelerate financial and operational value opportunities, driving up EBITDA. Even better, the creation of data monetization revenue opportunities unlocks hidden sources of value creation. And down the road, a data-driven organization will always yield much higher financial valuations and returns to its investors.


Most firms doing due diligence on potential targets will only do basic due diligence, focusing on confirming the financial valuation and assessing risk. Most PE firms will therefore conduct standard IT due diligence, analyzing expense budgets, hardware and software capital assets, license and service contracts, and headcount/staffing. They will seek to understand the IT architecture and assess the network’s capability. Because it is top of mind, the due diligence effort will also focus heavily on cyber and network security and the architecture built to protect the portfolio company and its data. At that point, they will typically declare the due diligence effort complete.

Beyond classical IT due diligence, most dealmakers only try to understand their data assets once the deal has closed and they begin operating the acquired company. Best practice says otherwise: accelerating the data and analytics value creation curve really starts with data due diligence. Precise data due diligence serves as the foundation for the portfolio data strategy and uncovers hidden sources of potential and opportunistic strategic value. Doing data due diligence gives the PE firm and portfolio company a running start on data value creation once the deal has closed.

What should deal firms look for when doing data and analytics due diligence? Here are the key areas and questions to analyze when investigating a target portfolio company.

 

Step 1: Determine the target company’s current overall approach to managing and analyzing its data.

Develop an understanding of the target company’s current approach to accessing and analyzing its data. Understanding the current approach will tell you the effort needed to accelerate potential data value creation.

  1. Does the target company have a comprehensive data strategy to transform the company into a data-driven enterprise?
  2. Does the company have a single source of truth for data, analytics, and reporting?
  3. What is the target company’s usage of data-driven business decisions in operations, marketing, sales, and finance?
  4. What cloud services, architectures, and tools does the company use to manage its data?
  5. What is the on-prem data environment and architecture?
  6. What kind of cloud data and analytics proofs-of-concept does the company have in place to build out its capabilities?
  7. Has the company identified and implemented value prop use cases for data and analytics, realizing tangible ROI?
  8. Where is the target company on the data and analytics curve?

Step 2: Identify the data sources, what data they contain, and how clean the data is.

Data value depends on the breadth and quality of the target company’s data and data sources. Document what the data sources are, what purpose they serve, how the target company currently integrates data sources for analytics, the existing security and data governance measures, and the overall quality of the data.

  1. Inventory all of the company’s data sources, including a data dictionary, size, physical and logical location, data architecture, data model, etc.
  2. How many of the data sources have an API for ETL (extract, transform, load) to pull data into the data warehouse?
  3. Does the target company have a data warehouse, and are all of its data sources feeding the data warehouse?
  4. How much history does each data source have? Obviously, the longer the history, the greater the value of the data source.
  5. What kind of data security is in place to protect all data sources?
  6. What kind of data quality assessment has been conducted for each source? (A minimal profiling sketch follows this list.)
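To show what a lightweight data quality assessment during due diligence might look like, here is a minimal profiling sketch that summarizes row counts, duplicate keys, null rates, and history depth for a few illustrative data-room extracts. The file names and key columns are assumptions for the example.

```python
import pandas as pd

# Illustrative due-diligence profiling of extracts provided in the data room;
# each entry maps a source name to (file path, primary key column).
SOURCES = {
    "crm_accounts": ("crm_accounts.csv", "account_id"),
    "erp_invoices": ("erp_invoices.csv", "invoice_id"),
    "pos_orders":   ("pos_orders.csv", "order_id"),
}

profile = []
for name, (path, key) in SOURCES.items():
    df = pd.read_csv(path)
    date_cols = df.filter(like="date")          # crude guess at history depth
    history_start = (
        pd.to_datetime(date_cols.stack(), errors="coerce").min()
        if not date_cols.empty else pd.NaT
    )
    profile.append({
        "source": name,
        "rows": len(df),
        "columns": df.shape[1],
        "duplicate_keys": int(df[key].duplicated().sum()),
        "pct_null_cells": round(float(df.isna().mean().mean()) * 100, 1),
        "history_starts": history_start,
    })

# A one-page summary like this quickly shows which sources are deep, clean, and usable.
print(pd.DataFrame(profile))
```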

Step 3: Assess the quality of the target company’s analytics and reporting.

Review how the target company approaches reporting and analytics. This step should include a review of their tools and technologies, KPIs and metrics, and reporting (e.g., self-service, interactive, dashboards, Excel reports, reports delivered by IT, etc.).

  1. What kind of reporting does the company use?
  2. Does the portfolio company have a heavy dependence on Excel for producing reports?
  3. Describe the KPIs that are in place for each functional area. How has the company been tracking against these KPIs?
  4. Does the company enable self-service analytics across the enterprise?
  5. What is the inventory of all reports generated by the company?
  6. What percentage of the reports are delivered by way of dashboarding?

Step 4: Review the people and processes involved in data management and analytics.

Determine the extent to which the target company is a data-driven organization by examining the people and processes behind its data strategy. Document which FTEs are involved with data and analytics, how much time is dedicated to reporting and report development, and the current processes for analytics.

  1. How many FTEs are engaged in financial and operational report development?
  2. What does the data and analytics team consist of, in terms of data engineers, data scientists, data administrators, and others with data titles?
  3. What kind of data governance is in place for the target company to regulate the structure of data, as well as where and how data can flow through the organization?

Step 5: Find opportunities for target company data value creation.

Assess and understand the opportunities for marketing and operational improvements, cost reduction, untapped areas of growth, data monetization, cash flow improvement, and more.

  1. Which of the following advanced data and analytics use cases does the portfolio company have in place?
    • Customer acquisition
    • Marketing channel excellence
    • Working capital rationalization
    • Fixed asset deployment and maintenance
    • Operational labor transformation
    • Forecasting predictive analytics
    • Automated customer reporting
    • Supply chain optimization
  2. What use cases does the company conduct for data science predictive and prescriptive analytics?
  3. What is the target company’s data monetization strategy, and where are they with implementation?
  4. What is the company’s usage of big data to enhance marketing, sales, and customer service understanding and strategies?
  5. What third-party data does the company use to supplement internal data and drive enhanced insights into marketing and operations?

Conclusion

To accelerate data and analytics value creation for a portfolio company target, start the process during due diligence. Gaining tremendous insight into the potential for data will accelerate the plan once the deal is closed and allow for a running start on data analytics value creation. With these insights, the PE firm, in partnership with their portfolio company, will generate fast data ROI and enable financial and operational transformation, EBITDA growth, and enhanced cash flow.

At 2nd Watch, we help private equity firms implement comprehensive data analytics solutions from start to finish. Our data experts guide, oversee, and implement focused analytics projects to help clients attain more value from modern analytics. Contact us for a complimentary 90-minute whiteboard session to get started.

 

Jim Anfield, Principal and Health Care Practice Leader


Top 4 Data Management Solutions for Snowflake Success

The Data Insights practice at 2nd Watch saw the potential of Snowflake from the time it was a tech unicorn in 2015. Its innovative approach to storing and aggregating data is a game-changer in the industry! On top of that, Snowflake’s value proposition to its customers complements the data management expertise that 2nd Watch has been developing since its inception. Whether you’re a mid-sized insurance carrier or a Fortune 500 manufacturer, Snowflake and 2nd Watch know how to build scalable, tailored solutions for your business problems.

On top of skills in AI and machine learning, app development, and data visualization, here are the top four data engineering services 2nd Watch uses to deploy a successful cloud data platform initiative using a tool like the Snowflake Data Cloud.

Data Warehousing 

Snowflake offers powerful features in the data warehousing space that allow 2nd Watch delivery teams to stay laser-focused on business outcomes. Snowflake’s innovative technologies optimize your data for storage, movement, and active use in the cloud, and its ever-increasing array of tools significantly improves an organization’s ability to enrich and share large amounts of data with other companies.

But it doesn’t happen by magic…

At 2nd Watch, we leverage our vast industry and technical experience to create a data warehouse for your organization that provides a fast, accurate, and consistent view of your data from multiple sources. Using best practices and well-established methodologies, we combine data from different sources into a centralized repository, creating a single version of the truth and a unified view.

The final design contains a user-friendly enterprise data warehouse that connects with both legacy and modern business intelligence tools to help you analyze data across your organization. The data warehouse is optimized for performance, scaling, and ease-of-use by downstream applications.
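As a minimal sketch of how such a layered warehouse might be stood up on Snowflake, the snippet below creates a raw landing layer, a business layer, and a presentation layer, then promotes one curated view for BI tools. It assumes the snowflake-connector-python package; the account details, layer names, and table names are illustrative placeholders that loosely mirror the raw / business (data vault) / presentation split listed in the deliverables below.

```python
import snowflake.connector

# Illustrative connection details; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="TRANSFORM_WH",
    database="ENTERPRISE_DW",
    role="SYSADMIN",
)

# One database, three layers: raw landing, business (data vault style), and a
# presentation layer optimized for BI tools. Names are illustrative.
LAYERS = ["RAW", "BUSINESS", "PRESENTATION"]

cur = conn.cursor()
try:
    for schema in LAYERS:
        cur.execute(f"CREATE SCHEMA IF NOT EXISTS ENTERPRISE_DW.{schema}")

    # Example promotion: a curated dimension in the presentation layer built
    # from a business-layer hub/satellite join (table names are assumptions).
    cur.execute("""
        CREATE OR REPLACE VIEW ENTERPRISE_DW.PRESENTATION.DIM_CUSTOMER AS
        SELECT h.customer_key, s.customer_name, s.segment, s.load_ts
        FROM ENTERPRISE_DW.BUSINESS.HUB_CUSTOMER h
        JOIN ENTERPRISE_DW.BUSINESS.SAT_CUSTOMER_DETAIL s
          ON s.customer_key = h.customer_key
    """)
finally:
    cur.close()
    conn.close()
```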

Potential Deliverables

  • Conceptual and physical data models for dimensional and analytical systems
  • Deployment of three semantic layers for tracking data in a central hub (raw, business using data vault, and data warehouse optimized for visualizations)
  • Design and development of departmental data marts of curated data
  • Training of end users for the cloud-based data solution and critical data applications and tools

Data Integration 

Snowflake offers a lot of flexibility when it comes to data integration: the Snowflake Data Cloud allows companies to go beyond traditional extract, transform, and load (ETL) data flows. With the Snowflake ecosystem, companies can leverage data integration solutions that handle everything from data preparation to migration, movement, and management, all in an automated and scalable way.

The consultants at 2nd Watch will partner with you every step of the way, guiding the entire team toward your decision-makers’ specific goals and your organization’s business data needs. These are some of the popular data integration tools and technologies that 2nd Watch can help integrate with Snowflake:

  • Azure Data Factory
  • AWS Glue and Lambda
  • Google Cloud Data Fusion
  • Fivetran/HVR
  • Etlworks 
  • IBM DataStage 
  • SnapLogic 
  • Plus, all the classics, including SQL Server Integration Services (SSIS) and Informatica
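Whichever tool you choose, the underlying pattern is similar: extract from the source, land the data in the raw layer of the hub, and transform it downstream. As a minimal hand-rolled sketch of that landing step, the snippet below bulk-loads a legacy extract into Snowflake using the pandas helper in snowflake-connector-python; the file name, table name, and credentials are illustrative placeholders.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Pull an extract from a niche source system (illustrative CSV export).
orders = pd.read_csv("legacy_pos_orders_export.csv")
orders["load_ts"] = pd.Timestamp.now(tz="UTC")   # simple audit column

# Land it in the raw layer of the Snowflake data hub; credentials are placeholders.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="LOAD_WH",
    database="ENTERPRISE_DW",
    schema="RAW",
)

try:
    # write_pandas bulk-loads the DataFrame via an internal stage behind the scenes.
    success, n_chunks, n_rows, _ = write_pandas(
        conn, orders, table_name="POS_ORDERS", auto_create_table=True
    )
    print(f"Loaded {n_rows} rows into RAW.POS_ORDERS (success={success})")
finally:
    conn.close()
```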

Potential Deliverables

  • Integration of any number of sources to a centralized data hub
  • Establishment of a custom system that operates well with niche sources
  • Speeding up the ingestion process and increasing the auditing power
  • End-game integration to a data warehouse and other target systems

Data Modernization

Snowflake is a paradigm-shifting platform. Micro-partition storage, decentralized compute, and cross-cloud sharing open up new opportunities for companies to solve pain points in their analytics processing. Our consultants at 2nd Watch are trained in the latest technologies and have the technical expertise to tackle the challenges of making your legacy systems “just work” in modern ecosystems like Snowflake.

Using supplemental tools like dbt or sqlDBM, we transform your data platform by eliminating complexity, reducing latency, generating documentation, integrating siloed sources, and unlocking the ability to scale and upgrade your existing data solutions.

Potential Deliverables

  • Migration to Snowflake from existing high-maintenance deployments
  • Refactoring, redesigning, and performance tuning of data architecture 
  • Deploying the Snowpark API for integrating with Scala or Python applications (see the sketch after this list)
  • Supporting modern tool selection and integration
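For the Snowpark item above, here is a minimal sketch of what pushing computation down to Snowflake from a Python application can look like. The connection parameters and table and column names are illustrative assumptions; the point is that the aggregation executes inside Snowflake rather than in the application.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Illustrative connection parameters; in practice sourced from a secrets manager.
session = Session.builder.configs({
    "account": "your_account",
    "user": "your_user",
    "password": "your_password",
    "warehouse": "TRANSFORM_WH",
    "database": "ENTERPRISE_DW",
    "schema": "PRESENTATION",
}).create()

# Push the computation down to Snowflake instead of pulling data into the app:
# monthly revenue by product line from an illustrative sales table.
sales = session.table("FACT_SALES")
monthly_revenue = (
    sales.group_by(col("PRODUCT_LINE"), col("FISCAL_MONTH"))
         .agg(sum_(col("NET_REVENUE")).alias("REVENUE"))
         .sort(col("FISCAL_MONTH"))
)
monthly_revenue.show()   # executes in Snowflake; no data is copied out to the client

session.close()
```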

Data Governance 

Data governance is critical to organizations hoping to achieve and maintain long-term success. Snowflake offers outstanding features such as object tagging or data classification that improve the security, quality, and value of the data. Additionally, when you work with 2nd Watch, we can help your organization establish a data governance council and program.

2nd Watch will assist you in identifying and coaching early adopters and champions. We will help with establishing roles and responsibilities (e.g., business owners, stewards, custodians), as well as creating and documenting principles, policies, processes, and standards. Finally, we will identify the right technology to help automate these processes and improve your data governance maturity level.

Potential Deliverables

  • Data governance strategy
  • Change management: identification of early adopters and champions
  • Master data management implementation
  • Data quality: data profiling, cleansing, and standardization
  • Data security and compliance (e.g., PII, HIPAA, GRC)

2nd Watch will make sure your team is equipped to make the most of your Snowflake ecosystem and analytics tools, guiding the entire process through deployment of a successful initiative. Get started with our Snowflake Value Accelerator.


How to Build a Data Warehouse for the Insurance Industry

Insurance is a data-heavy industry with a huge upside to leveraging business intelligence. Today, we will discuss the approach we use at 2nd Watch to build out a data warehouse for insurance clients.


Understand the Value Chain and Create a Design

At its most basic, the insurance industry can be described by its cash inflows and outflows (e.g., the business collects premiums on effective policies and pays out claims resulting from accidents). From here, we can describe the measures that are relevant to these activities:

  • Policy Transactions: Quote, Written Premium, Fees, Commission
  • Billing Transactions: Invoice, Taxes
  • Claim Transactions: Payment, Reserve
  • Payment Transactions: Received Amount

From these four core facts, we can collaborate with subject matter experts to identify the primary “describers” of these measures. For example, a policy transaction will need to include information on the policyholder, coverage, covered items, dates, and connected parties. By working with the business users and analyzing the company’s front-end software like Guidewire or Dovetail, we can design a structure to optimize reporting performance and scalability.
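To make that design step concrete, here is a lightweight sketch of the four core facts and the “describers” (dimensions) that might attach to each, captured as a simple bus matrix before any tables are built. The dimension names are illustrative and would be refined with the subject matter experts.

```python
# A lightweight dimensional "bus matrix": each core fact (transaction grain)
# mapped to the conformed dimensions that describe it. Names are illustrative.
BUS_MATRIX = {
    "fact_policy_transaction":  {"measures": ["quote_amount", "written_premium", "fees", "commission"],
                                 "dimensions": ["date", "policy", "policy_holder", "coverage",
                                                "covered_item", "agency", "party"]},
    "fact_billing_transaction": {"measures": ["invoice_amount", "taxes"],
                                 "dimensions": ["date", "policy", "policy_holder", "billing_account"]},
    "fact_claim_transaction":   {"measures": ["payment_amount", "reserve_amount"],
                                 "dimensions": ["date", "policy", "claim", "claimant", "adjuster"]},
    "fact_payment_transaction": {"measures": ["received_amount"],
                                 "dimensions": ["date", "policy", "policy_holder", "payment_method"]},
}

# Conformed dimensions (those shared by every fact) anchor cross-process reporting,
# e.g., loss ratio = claim payments / written premium sliced by the same policy dimension.
shared = set.intersection(*(set(v["dimensions"]) for v in BUS_MATRIX.values()))
print(f"Conformed dimensions shared by all facts: {sorted(shared)}")
```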

Develop a Data Flow

Here is a quick overview:

  1. Isolate your source data in a “common landing area” (CLA): We have been working with an insurance client that has 20+ data sources (from many acquisitions). The first step of our process is to identify the source tables we need to build out the warehouse and to load that information into a staging database. (We create a schema per source and automate most of the development work.)
  2. Denormalize and combine data into a data hub: After staging the data in the CLA, our team creates “Get” Stored Procedures to combine the data into common tables. For example, at one client, we have 13 sources with policy information (policy number, holder, effective date, etc.) that we combined into a single [Business].[Policy] table in our database. We also created tables for tracking other dimensions and facts such as claims, billing, and payment.
  3. Create a star schema warehouse: Finally, the team loads the business layer into the data warehouse by assigning surrogate keys to the dimensions, creating references in the facts, and structuring the tables in a star schema (a minimal sketch of this step follows the list). If designed correctly, any modern reporting tool, from Tableau to SSRS, will be able to connect to the data warehouse and generate high-performance reporting.
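As a minimal sketch of step 3, the snippet below assigns integer surrogate keys to a business-layer policy table and swaps the natural policy number on a claim fact for that surrogate. The file and column names echo the [Business].[Policy] example above but are otherwise illustrative; a production load would also handle slowly changing dimensions and late-arriving facts.

```python
import pandas as pd

# Business-layer tables produced by the "Get" procedures (illustrative extracts).
policy = pd.read_csv("business_policy.csv")                # policy_number, holder, effective_date, source_system
claim_txn = pd.read_csv("business_claim_transaction.csv")  # policy_number, claim_number, txn_date, payment, reserve

# Assign surrogate keys to the dimension.
dim_policy = policy.drop_duplicates(subset=["policy_number"]).reset_index(drop=True)
dim_policy["policy_key"] = dim_policy.index + 1   # simple integer surrogate

# Build the fact by replacing the natural key with the surrogate reference.
fact_claim_transaction = (
    claim_txn.merge(dim_policy[["policy_number", "policy_key"]], on="policy_number", how="left")
             .drop(columns=["policy_number"])
)

# Unmatched rows point at data-quality issues to resolve before loading the warehouse.
orphans = fact_claim_transaction[fact_claim_transaction["policy_key"].isna()]
print(f"{len(fact_claim_transaction)} fact rows, {len(orphans)} without a matching policy")
```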

Produce Reports, Visualizations, and Analysis

By combining your sources into a centralized insurance data warehouse, the business creates a single source of truth. From here, users have a deep well of data from which to extract operational metrics, build predictive models, and generate executive dashboards. The potential for insurance analytics is endless: premium forecasting, geographic views, fraud detection, marketing, operational efficiency, call-center tracking, resource optimization, cost comparisons, profit maximization, and so much more!
