Private Equity Roll-Up Strategies – Accelerated Value Creation Through Data and Analytics

Private equity firms face a major challenge when executing roll-up strategies in their investment sectors. Rolling up multiple acquisitions creates an information and reporting nightmare. How do the PE firms and their operating C-suite teams quickly get basic financial reporting from each acquisition, and how can they get consolidated financial information across the newly built enterprise? And once basic financial reporting is in place, how do they accelerate financial and operating transformation, finding cash flow enhancement and EBITDA improvement opportunities?

The Challenge: As your roll-up deals close, your financial data sources grow exponentially, and producing accurate financial reports becomes extremely difficult and time-consuming.

As deals in the acquisition roll-up pipeline begin to close, there is typically a proliferation of information management systems from each of the newly acquired companies. As the number of information systems mounts, it becomes increasingly difficult to produce good financial reports. Typically, the result is a manual financial consolidation process in which each acquisition submits its statements and the PE team consolidates them by hand: a difficult and time-consuming process at a point in time that is extremely critical for the deal.

Want better reporting or advanced analytics? Our private equity analytics consultants are here to help. Request a whiteboarding session today!

With this newly acquired “Tower of Financial Babel,” PE firms and their operators are looking to accelerate profit improvement and value creation. However, they are usually trying to make management and value transformation decisions while sorting through a legion of information management systems with different charts of accounts, conflicting formats, and diverse accounting policies. To deal with this mounting complexity, the deal accounting team must try to reconcile these different statements, frequently requiring multiple conversations with each acquired company’s executive team. In turn, this takes away valuable time that each executive leadership team needs to manage its company and create value.

The reality is that the financial consolidation process for the typical PE roll-up becomes highly complex, very manual, and very slow. As more roll-up deals are completed and added to the enterprise, the problem only gets worse and more fragmented. And given this difficult process, the consolidated statements are often delayed and even become suspect, as the potential for reporting errors increases with process complexity.

Those are just the basics. From there, incremental value creation and financial transformation are extremely difficult. How can PE firms quickly begin the financial and operating transformation that needs to take place? Specifically, how does the typical PE firm identify the EBITDA improvement opportunities, as well as the cash flow harvesting, that need to happen across the new and expanding acquired enterprise?

The first thought might be to merge all of the new acquisitions onto one of the incumbent systems. However, these types of large-scale ERP migration projects are very risky and prone to failure, more often than not running late and way over budget. And they are incredibly disruptive to local operating management, who are integrating into the new ownership structure while also trying to improve operations and financial results. Therefore, it is easy to see why migrating each new acquisition from the deal pipeline onto a new ERP, EMR, or any other centralized management platform is difficult, complex, expensive, and disruptive to the new deal at a time when the new acquisition is vulnerable. PE firms want their operating teams focused on execution, operations, and financial performance – especially when the roll-up strategy is in its initial stages.

Worst of all, the PE and executive teams may be integrating onto an existing enterprise information management system that is sub-optimal for the new, larger consolidated company. This will most likely create short-, medium-, and long-term management issues that will surely surface during the deal exit process.

Instead of migrating onto an incumbent system, why not take a long-term approach by developing a truly data-driven strategy? Take time to develop a holistic view of the new enterprise and be deliberate when devising and implementing a new enterprise management system. This enterprise view will allow the roll-up to find and implement the market-leading systems, creating the ultimate strategic competitive advantage for the future-state company. And this data strategy can be developed quickly – often in a matter of a few weeks.

How, then, do the PE firm and the roll-up operating executive team quickly get the financial information needed to run and manage the company and begin the financial and operational transformation?

The best way to accomplish these goals with minimal operating company disruption on a relatively short timeline is to merge the portfolio companies on a data basis. Pull each portfolio company’s sources of data (ERP, CRM, EMR, auxiliary systems, social media, IoT, third-party, etc.) into a consolidated data warehouse. Then integrate the companies on a virtual basis by designing and implementing a powerful data model to enable the consolidation.
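
To make the idea concrete, here is a minimal sketch of data-level consolidation in Python, assuming each portfolio company can export a trial balance as a table of (period, local_account, amount); the company names and account codes are hypothetical.

```python
import pandas as pd

# Hypothetical mapping from each company's local GL accounts to one
# standard chart of accounts; in practice this mapping lives in the warehouse.
ACCOUNT_MAP = pd.DataFrame([
    {"company": "alpha", "local_account": "4000",   "std_account": "Revenue"},
    {"company": "alpha", "local_account": "5000",   "std_account": "COGS"},
    {"company": "beta",  "local_account": "400-10", "std_account": "Revenue"},
    {"company": "beta",  "local_account": "500-10", "std_account": "COGS"},
])

def consolidate(trial_balances: dict) -> pd.DataFrame:
    """Stack each company's trial balance, translate local accounts to
    the standard chart, and sum to an enterprise-wide view by period."""
    combined = pd.concat(
        [tb.assign(company=name) for name, tb in trial_balances.items()],
        ignore_index=True,
    )
    mapped = combined.merge(ACCOUNT_MAP, on=["company", "local_account"])
    return mapped.groupby(["period", "std_account"], as_index=False)["amount"].sum()
```

This mapping-and-conforming step is exactly what the warehouse’s data model does at scale: onboarding a new acquisition means adding rows to the account map, not migrating a system.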

Once the data warehouse and the consolidated data model are in place, standardized single-source-of-truth finance and operating statement dashboards can be delivered on a near real-time basis to PE management, their lenders, the executive management team, local operating managers, and employees. These dashboard reports enable self-service analytics, allowing for a much deeper understanding of portfolio company performance and more efficient operational reviews and conversations.

With the dashboards in place, the complete team can focus on value identification and creation with standardized KPIs (e.g., EBITDA, operating metrics, cash flow, etc.). Dashboards can be designed and built to benchmark across the portfolio companies and also against best industry practices. Value trend lines can be measured, and best-case or lagging performers identified. Specific areas of performance can be targeted (e.g., A/R, inventory, fixed asset utilization, labor efficiency, etc.), again comparing performance and finding sources of cash flow improvement. With these analytics and insights, best practices can be identified and action plans created for implementation across the new enterprise. Conversely, remediation action plans can be put in place for lagging performers, with improvement monitored and scorecarded.
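
As a toy illustration of the benchmarking idea, the sketch below computes one standardized KPI across portfolio companies and flags laggards against the portfolio median; all figures and company names are invented.

```python
import pandas as pd

# Invented standardized KPI table produced by the consolidated data model.
kpis = pd.DataFrame({
    "company":  ["alpha", "beta", "gamma"],
    "ebitda":   [1.2, 0.4, 2.1],   # $M
    "revenue":  [8.0, 5.0, 11.0],  # $M
    "dso_days": [42, 71, 38],      # days sales outstanding (A/R)
})
kpis["ebitda_margin"] = kpis["ebitda"] / kpis["revenue"]

# Flag companies lagging the portfolio median on each metric so
# remediation plans can be targeted and scorecarded.
for metric, higher_is_better in [("ebitda_margin", True), ("dso_days", False)]:
    median = kpis[metric].median()
    kpis[metric + "_lagging"] = (
        kpis[metric] < median if higher_is_better else kpis[metric] > median
    )
print(kpis)
```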

Building and implementing an enterprise data warehouse and business intelligence dashboards creates an asset of true incremental value. With all enterprise data consolidated and modeled, opportunities open up for predictive analytics, data science, and deep value discovery. New marketing analytics strategies can be put in place. New products and services can be imagined and developed through data science ROI use cases. Optimization opportunities will be found, analyzed, and implemented. And more will follow as the operating teams become data-savvy.

In summary, the quickest and most effective way to effect a PE financial and operating transformation for a roll-up is to integrate at the data level. This data integration, consisting of a data warehouse and targeted financial and operating dashboards, will accelerate the value creation deal process. In turn, this will maximize the financial ROI of the deal, generating the greatest return for the investment firm and its fund investors.

If you’re ready to accelerate value creation throughout your portfolio, contact us today to schedule a complimentary Advanced Analytics and Reporting Whiteboard Strategy Session.


Data 101 for Marketers

In Data 101 for Marketers, we’ll cover the basics of data you might encounter as you seek to take control of your data, improve your analytics, and get more value from your MarTech investments. This includes:

  • The definition of data
  • Different types of data
  • What matters most for marketers
  • Examples of marketing data
  • The benefits of marketing data management

Data

Definition:

Data is any piece of information that can be used to analyze, manage, or connect with your buyers. Data is often stored in various systems throughout your organization such as your website or email marketing tool.

Why it matters for marketers:

At the most basic level, data can be used to communicate with customers. As a marketing organization matures, the need to access, analyze, and leverage data becomes more critical.

Types of Data: Structured Data vs. Unstructured Data

There are two main types of data: structured and unstructured. Each contains valuable insights about your buyers. When the two are combined, your marketing team can create greater context for the data and expand the depth of your analysis.

Structured Data

Definition:

Structured data is highly organized, formatted, and searchable data that fits neatly into a field in a data table. This data gives you a basic understanding of who your customers and prospects are. It’s also known as quantitative data.

Examples:

An example of structured data in marketing is data stored in systems such as customer relationship management (CRM) tools, enterprise resource planning (ERP) software, or point of sale (POS) systems. It includes information like:

  • Names
  • Dates
  • Phone numbers
  • Email addresses
  • Purchase history
  • Credit card numbers
  • Order numbers

How it is used:

Structured data is the data you use to connect with and understand your customers and prospects at the most basic level.

The information is used in:

  • Email communication in your CRM or marketing automation tool
  • Tracking of inbound and outbound sales, marketing, and service touchpoints through your CRM
  • Website and content optimization for search engine optimization (SEO)
  • Purchase history analysis

Real-world examples:

Example 1: Gmail uses structured data from your flight confirmation to provide a quick snapshot of your flight details within the email.

Example 2: Your marketing automation software uses structured data to pull customer names for customized email campaigns.

Unstructured Data

Definition:

Unstructured data is any data that does not fit into a pre-designed data table or database. This data often holds deeper insights into your customers but can be difficult to search and analyze. It’s also known as qualitative data.

Examples:

Unstructured data is relevant and insightful information about your customers and prospects from a variety of sources such as:

  • Email or chat messages
  • Images
  • Videos or video files
  • Contracts
  • Social media posts
  • Survey results
  • Reports

How it is used:

Unstructured data, often combined with structured data, can be used to find deep insights on customer or prospect behavior, sentiment, or intent such as:

  • Understanding buying habits
  • Gaining a 360-degree view of the customer
  • Measuring sentiment toward a product or service
  • Tracking patterns in purchases or behaviors

Real-world examples:

Social media data has a huge impact on businesses today. Social listening is used to gain deeper insight into your customers and what they think of your business. They might comment, post their own user-generated content, or post about your business. All of these highly valuable data points are unstructured, or qualitative, in nature but provide a deeper dive into the minds of consumers.

Data Sources

Definition:

Data sources are the origin points of your data. They can be files, databases, or even live data feeds. Marketing data sources include web analytics, marketing automation platforms, CRM software, or POS systems.

Why it matters for marketers:

Each data source holds a fragment of a story about your customers and prospects. Often these data sources come from siloed systems throughout your business. By combining various data sources, you can uncover the full narrative and get a 360-degree view of your customers and prospects.

Making use of new technology to aggregate and analyze data sources can reduce the marketing dollars and time spent on multiple software tools to piece together the data you need for your daily questions or analysis.

Real-world examples:

CMOs and marketers are increasingly being asked to justify marketing spend against KPIs. This can be challenging because a lot of marketing activity is, by nature, indirect brand-building. However, that doesn’t mean we can’t get better at measuring it.

It isn’t an easy task, but centralizing your marketing data sources actually makes it easier to prove ROI. It cuts down on reporting time, enhances the customer experience, and makes it easy to use insights from one channel to inform another.

For example, customer service data can make a huge difference for the sales team. If a customer emails or calls a customer service rep with a complaint, that issue should not only get tracked in the service rep’s software but in the sales representative’s system as well. That way, when the sales rep calls on that customer again, they have the full history of service and/or repairs made, potentially making it easier to retain or upsell that customer.

We hope you found this intro to data management useful. Feel free to contact us with any questions or to learn more about marketing data solutions.

Want better data insights and customer analytics?

2nd Watch’s Marketing Analytics Starter Pack provides an easy way to get started or expand your current marketing reporting and analytics capabilities.


Your Roll-Up Strategy: What Should You Do about Disparate ERP Systems?

Your strategy is to roll up companies and drive value through economies of scale, market influence, and more efficient operational processes. That all makes sense, but what should you do about all the disparate ERP and core application data you need to drive that value from all those disparate organizations and systems? How can you best achieve your target returns without spending a fortune to rip and replace those expensive ERP applications and without disrupting the business?

Our reference architecture for private equity firms with roll-up strategies is to design and build a data hub. The data hub is external to your ERP and existing application systems at your portfolio companies. It is likely cloud-based and serves as a single source of truth for managing and monitoring performance. Data is sourced from all the disparate ERPs in your world, scrubbed and standardized, and then used to build dashboards and analytics solutions for use across your organization.

Your goals with a roll-up strategy are:

  • Quickly integrate any acquired companies into your data and analytics world.
  • Provide a common view of to-be-measured processes and performance.
  • Establish benchmarks and target performance levels.
  • Broadly disseminate actual performance data.
  • Drive accountability for value creation through improved operational performance.

You have to deal with several data challenges in your roll-ups. The first is likely the use of different ERP or core applications, from different vendors, with no easy way to consolidate or integrate. Your acquired companies don’t all share a common set of metrics or KPIs, nor do they all define key metrics the same way. (Product gross margins, for example, are calculated differently at different companies.) The more organizations you roll up, the more likely it is that you can’t access the data you want, can’t compare performance easily across acquired companies, and can’t build accountability without a standard set of performance metrics that is broadly disseminated across the organization.
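
One concrete fix is to define each contested metric exactly once in the data hub’s business layer and apply that definition to every company’s conformed data. Here is a minimal sketch; the cost categories are assumed for illustration.

```python
def product_gross_margin(revenue: float, cogs: float,
                         freight_in: float = 0.0,
                         warranty_cost: float = 0.0) -> float:
    """One portfolio-wide gross margin definition. Which costs count as
    cost of goods sold (freight, warranty, etc.) is precisely what tends
    to vary by company, so the data hub fixes the formula once and
    applies it to every acquisition's conformed data."""
    if revenue == 0:
        return 0.0
    all_in_cogs = cogs + freight_in + warranty_cost
    return (revenue - all_in_cogs) / revenue

# Same inputs, same answer, regardless of which ERP the numbers came from.
print(product_gross_margin(revenue=100.0, cogs=60.0, freight_in=5.0))  # 0.35
```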

The data hub approach is the right solution. It sources the data from your existing disparate ERP systems without the expense or costly delay of replacing the ERP in your acquired companies. The data can be cleansed and standardized regardless of which ERP it came from. The data hub also incorporates a business layer that transforms the data into an easily understood and highly accessible form, so every executive in your portfolio companies can easily access, digest, analyze, and act on the data.

We rely on a data hub reference architecture that is mature and proven across many companies. Using modern analytical tools, it provides real-time data in dashboards for everyone from CEOs down to individual account executives. It supports ad hoc analytics and advanced data science modeling, and it leverages best-of-breed reporting software with real-time data feeds. Modern data hubs can also take in IoT data and unstructured web data like social media feeds and sentiment analysis.

Data hubs are the tool to drive your value creation, but you also need to develop people and process tools like an integration playbook, gamification dashboards, and alerts for driving real-time decision-making and management actions. Our preferred solution for most of our PE clients is 50% technology and 50% people and process changes to drive quick results and optimum performance.

If you buy three different companies, you are buying three sets of ERP data, three sets of KPI definitions, three sets of processes, and three sets of management styles. You need a data strategy, and a data hub, to move those companies to a single consolidated and standardized view and achieve quick returns on your investments. Get your data out of the three ERPs, or 30 ERPs, and into a data hub for quick value creation post-close, without waiting years or spending millions to standardize your ERP systems across your portfolio companies.

Need help getting a single source of truth from your portfolio companies’ systems? A 2nd Watch whiteboarding session is the first step to building an advanced analytics and reporting strategy to find the insights you need. Click here to learn more and get started.


What Is Sisu? An Intro to Sisu via Data Analytics in the Telecom Industry

Sisu is a fairly new and distinctive tool that applies a user-friendly interface to robust, deep-diving business analytics, such as the example of big data analytics in the telecom industry we’ll cover in this blog post. With well-defined KPIs and a strong grasp of the business decisions relying on the analytics, even non-technical users can confidently answer questions using the power of machine learning through Sisu.

Below, we’ll detail the process of using Sisu to uncover the main drivers of customer churn for a telecom company, showing you what kind of data is appropriate for analysis in Sisu, what analysis 2nd Watch has performed using Sisu, and what conclusions our client drew from the data analysis. Read on to learn how Sisu may offer your organization the competitive advantage you’re looking for.

What is Sisu?

Sisu uses a high-level declarative query model to allow users to tap into existing data lakes and identify the key features impacting KPIs, even enabling users who aren’t trained data analysts or data scientists. Analysis improves with time as data increases and more users interact with Sisu’s results.

Sisu moves from user-defined objectives to relevant analysis in five steps:

  1. Querying and Processing Data: Sisu ingests data from a number of popular platforms (e.g., Amazon Redshift, BigQuery, Snowflake) with light transformation and can update/ingest over time.
  2. Data Quality, Enrichment, and Featurization: Automated, human-readable featurization exposes the most relevant statistical factors.
  3. Automated Model and Feature Selection: Sisu trains multiple models to investigate KPIs on a continuous or categorical basis.
  4. Personalized Ranking and Relevance: Sisu ranks facts by several measures that prioritize human time and attention, improving the personalized model over time.
  5. Presentation and Sharing: To dig into facts, Sisu offers natural language processing (NLP), custom visualization, supporting statistics, and related facts that illustrate why a fact was chosen.

How does Sisu help users leverage data to make better data-driven decisions?

Sisu can help non-technical users analyze data from various data sources (anything from raw data in a CSV file to an up-and-running database), improving data-driven decision-making across your organization. A couple of things to keep in mind: the data should already be cleaned and of high integrity; and Sisu works best with numerical data, not text-based data.

Once the data is ready for analysis, you can easily create a simple visualization:

  1. Identify your key variable.
  2. Choose a tracking metric.
  3. Select the time frame, if applicable.
  4. Run the visualization and apply to A/B groups as necessary.

With Sisu, users don’t need to spend time on feature selection. When a user builds a metric, Sisu queries the data, identifies high-ranking factors, and presents a list of features with the most impact. This approach subverts the traditional OLAP and BI process, making it easier and faster to ask the right questions and get impactful answers – requiring less time while offering more value.

Simplicity and speed are key contributors to why Sisu is so advantageous, from both a usability standpoint and a financial point of view. Sisu can help you increase revenue and decrease expenses with faster, more accurate analytics. Plus, because Sisu puts the ability to ask questions in the hands of non-technical users, it creates more flexibility for teams throughout your organization.

How did 2nd Watch use Sisu to reduce customer churn for a telecom company?

Being able to pick out key drivers in any set of data is essential for users to develop specific business-impacting insights. Instead of creating graphics from scratch or analyzing data through multiple queries like other analytical tools require, Sisu allows your teams to query their data in a user-friendly way that delivers the answers they need.

For our client in the telecommunications industry, group comparisons were crucial in determining who would likely become long-standing customers and who would have a higher rate of churn. Filtering and grouping the demographics of our client’s customer base allowed them to outline their target market and begin understanding what attracts individuals to stay longer. Of course, this then enables the company to improve customer retention – and ultimately revenue.

Sisu can also be employed in other areas of our client’s organization. In addition to customer churn data, they can investigate margins, sales, network usage patterns, network optimization, and more. With the large volumes of data in the telecom industry, our client has many opportunities to improve their services and solutions through the power of Sisu’s analytics.

How can Sisu benefit your organization?

Sisu reduces barriers to high-level analytical work because its automated factor selection and learning capabilities make analytics projects more efficient. Using Sisu to focus on who is driving business-impacting events (like our telecom client’s customer churn) allows you to create user profiles, monitor those profiles, and track goals and tweak KPIs accordingly. In turn, this allows you to be more agile, move from reactive to proactive, and ultimately increase revenue.

Because feature selection is outsourced to Sisu’s automated system, Sisu is a great tool for teams lacking in high-level analytics abilities. If you’re hoping to dive into more advanced analytics or data science, Sisu could be the stepping stone your team needs.

Learn more about 2nd Watch’s data and analytics solutions or contact us to discuss how we can jumpstart your organization’s analytics journey.

By Sarah Dudek, 2nd Watch Data Insights Consultant


28 Questions to Ask During Due Diligence to Accelerate Data Value Creation

Data and analytics are a major driver and source of great value for private equity firms. The best private equity firms know the full power of data and analytics. They realize that portfolio company enterprise data is typically the crown jewel of an acquisition or deal target.

Data and analytics are also the foundation of financial and operational transformation. Quickly pulling data from their portfolio companies, and consolidating it into actionable information, will enable and accelerate financial and operational value opportunities, driving up EBITDA. Even better, the creation of data monetization revenue opportunities unlocks hidden sources of value creation. And down the road, a data-driven organization will always yield much higher financial valuations and returns to their investors.

Most firms doing due diligence on potential targets will only do basic due diligence, focusing on confirming the financial valuation and assessing risk. Therefore, most PE firms will conduct standard IT due diligence, analyzing expense budgets, hardware and software capital assets, license and service contracts, and headcount/staffing. They will seek to understand the IT architecture, as well as assess the network in terms of capability. Because it is top of mind, the due diligence effort will also heavily focus on cyber and network security and the architecture built to protect the portfolio company and its data. Then, typically, they will declare the due diligence effort complete.

Beyond classical IT due diligence, most dealmakers only try to understand their data assets once the deal has closed and they begin operating the acquired company. However, best practice says otherwise. Accelerating the data and analytics value creation curve really starts with data due diligence. Precise data due diligence serves as the foundation for portfolio data strategy and uncovers hidden sources of potential and opportunistic strategic value. Doing data due diligence will give the PE firm and portfolio company a running start on data value creation once the deal has closed.

What should deal firms look for when doing data and analytics due diligence? Here are key areas and questions for investigation and analysis when evaluating a target portfolio company.

 

Step 1: Determine the target company’s current overall approach to managing and analyzing its data.

Develop an understanding of the target company’s current approach to accessing and analyzing their data. Understanding their current approach will let you know the effort needed to accelerate potential data value creation.

  1. Does the target company have a comprehensive data strategy to transform the company into a data-driven enterprise?
  2. Does the company have a single source of truth for data, analytics, and reporting?
  3. What is the target company’s usage of data-driven business decisions in operations, marketing, sales, and finance?
  4. What cloud services, architectures, and tools does the company use to manage its data?
  5. What is the on-prem data environment and architecture?
  6. What kind of cloud data and analytics proofs-of-concept does the company have in place to build out its capabilities?
  7. Has the company identified and implemented value prop use cases for data and analytics, realizing tangible ROI?
  8. Where is the target company on the data and analytics curve?

Step 2: Identify the data sources, what data they contain, and how clean the data is.

Data value depends on breadth and quality of the target company’s data and data sources. Document what the data sources are, what purpose they serve, how the target company currently integrates data sources for analytics, the existing security and data governance measures, and the overall quality of the data.

  1. Inventory all of the company’s data sources, including a data dictionary, size, physical and logical location, data architecture, data model, etc.
  2. How many of the data sources have an API for ETL (extract, transform, load) to pull data into the data warehouse? (A minimal extract sketch follows this list.)
  3. Does the target company have a data warehouse, and are all of its data sources feeding the data warehouse?
  4. How much history does each data source have? Obviously, the longer the history, the greater the value of the data source.
  5. What kind of data security is in place to protect all data sources?
  6. What kind of data quality assessment for each source has been conducted?
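
To make question 2 concrete, here is a minimal sketch of the kind of API extract an ETL pipeline performs against each source; the endpoint, parameters, and auth scheme are hypothetical placeholders for whatever the source system actually exposes.

```python
import requests

def extract_invoices(base_url: str, api_key: str, since: str) -> list:
    """Page through a source system's (hypothetical) REST API and return
    raw records ready to land in a warehouse staging table."""
    records, page = [], 1
    while True:
        resp = requests.get(
            f"{base_url}/invoices",  # hypothetical endpoint
            params={"updated_since": since, "page": page},
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:  # empty page: extraction complete
            return records
        records.extend(batch)
        page += 1
```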

Step 3: Assess the quality of the target company’s analytics and reporting.

Review how the target company approaches reporting and analytics. This step should include a review of their tools and technologies, KPIs and metrics, and reporting (e.g., self-service, interactive dashboards, Excel reports, reports delivered by IT, etc.).

  1. What kind of reporting does the company use?
  2. Does the portfolio company have a heavy dependence on Excel for producing reports?
  3. Describe the KPIs that are in place for each functional area. How has the company been tracking against these KPIs?
  4. Does the company enable self-service analytics across the enterprise?
  5. What is the inventory of all reports generated by the company?
  6. What percentage of the reports are delivered by way of dashboarding?

Step 4: Review the people and processes involved in data management and analytics.

Determine the extent of the target company as a data-driven organization by examining the people and processes behind the data strategy. Document which FTEs are involved with data and analytics, how much time is dedicated to reporting and report development, as well as the current processes for analytics.

  1. How many FTEs are engaged in financial and operational report development?
  2. What does the data and analytics team consist of, in terms of data engineers, data scientists, data administrators, and others with data titles?
  3. What kind of data governance is in place for the target company to regulate the structure of data, as well as where and how data can flow through the organization?

Step 5: Find opportunities for target company data value creation.

Assess, understand, and determine the opportunities for marketing and operational improvements, cost reduction, untapped areas of growth, data monetization, cash flow improvement, and more.

  1. Which of the following advanced data and analytics use cases does the portfolio company have in place?
    • Customer acquisition
    • Marketing channel excellence
    • Working capital rationalization
    • Fixed asset deployment and maintenance
    • Operational labor transformation
    • Forecasting predictive analytics
    • Automated customer reporting
    • Supply chain optimization
  2. What data science use cases does the company run for predictive and prescriptive analytics?
  3. What is the target company’s data monetization strategy, and where are they with implementation?
  4. What is the company’s usage of big data to enhance marketing, sales, and customer service understanding and strategies?
  5. What third-party data does the company use to supplement internal data to drive enhanced insights into marketing and operations?

Conclusion

To accelerate data and analytics value creation for a portfolio company target, start the process during due diligence. Gaining tremendous insight into the potential for data will accelerate the plan once the deal is closed and allow for a running start on data analytics value creation. With these insights, the PE firm, in partnership with their portfolio company, will generate fast data ROI and enable financial and operational transformation, EBITDA growth, and enhanced cash flow.

At 2nd Watch, we help private equity firms implement comprehensive data analytics solutions from start to finish. Our data experts guide, oversee, and implement focused analytics projects to help clients attain more value from modern analytics. Contact us for a complimentary 90-minute whiteboard session to get started.

 

Jim Anfield, Principal and Health Care Practice Leader


Top 4 Data Management Solutions for Snowflake Success

The Data Insights practice at 2nd Watch saw the potential of Snowflake from the time it was a tech unicorn in 2015. Its innovative approach to storing and aggregating data is a game-changer in the industry! On top of that, Snowflake’s value proposition to its customers complements the data management expertise that 2nd Watch has been developing since its inception. Whether you’re a mid-sized insurance carrier or a Fortune 500 manufacturer, Snowflake and 2nd Watch know how to build scalable, tailored solutions for your business problems.

On top of skills in AI and machine learning, app development, and data visualization, here are the top four data engineering services 2nd Watch uses to deploy a successful cloud data platform initiative using a tool like the Snowflake Data Cloud.

Data Warehousing 

Snowflake offers powerful features in the data warehousing space that allow 2nd Watch delivery teams to stay laser-focused on business outcomes. Its innovative technologies optimize your data for storage, movement, and active use (cloud computing), and its ever-increasing array of valuable tools significantly improves an organization’s ability to enrich and share large amounts of data with other companies.

But it doesn’t happen by magic…

2nd Watch can leverage our vast industry and technical experience to create a data warehouse for your organization that provides a fast, accurate, and consistent view of your data from multiple sources. Using best practices and well-established methodologies, 2nd Watch combines data from different sources into a centralized repository, creating a single version of the truth and a unified view.

The final design contains a user-friendly enterprise data warehouse that connects with both legacy and modern business intelligence tools to help you analyze data across your organization. The data warehouse is optimized for performance, scaling, and ease-of-use by downstream applications.

Potential Deliverables

  • Conceptual and physical data models for dimensional and analytical systems
  • Deployment of three semantic layers for tracking data in a central hub (raw, business using data vault, and data warehouse optimized for visualizations)
  • Design and development of departmental data marts of curated data
  • Training of end users for the cloud-based data solution and critical data applications and tools

Data Integration 

Snowflake has a lot of flexibility when it comes to the data integration process, meaning Snowflake’s Data Cloud allows companies to go beyond traditional extract, transform, and load data flows. With the Snowflake ecosystem, companies can leverage data integration solutions that do everything from data preparation, migration, movement, and management, all in an automated and scalable way.

The consultants at 2nd Watch will partner with you every step of the way and guide the entire team in the right direction to meet your decision-makers’ specific goals and your organization’s business data needs. These are some of the popular data integration tools and technologies that 2nd Watch can help integrate to Snowflake:

  • Azure Data Factory
  • AWS Glue and Lambda
  • Google Cloud Data Fusion
  • Fivetran/HVR
  • Etlworks 
  • IBM DataStage 
  • SnapLogic 
  • Plus, all the classics, including SQL Server Integration Services (SSIS) and Informatica

Potential Deliverables

  • Integration of any number of sources to a centralized data hub
  • Establishment of a custom system that operates well with niche sources
  • Speeding up the ingestion process and increasing the auditing power
  • End-game integration to a data warehouse and other target systems

Data Modernization

Snowflake is a paradigm-shifting platform. Micro-partition storage, decentralized compute, and cross-cloud sharing open up new opportunities for companies to solve pain points in their analytics processing. Our consultants at 2nd Watch are trained in the latest technologies and have the technical expertise to tackle the challenges posed by making your legacy systems “just work” in modern ecosystems like Snowflake.

Using supplemental tools like dbt or sqlDBM, this process will transform your data platform by eliminating complexities, reducing latency, generating documentation, integrating siloed sources, and unlocking the ability to scale and upgrade your existing data solutions.

Potential Deliverables

  • Migration to Snowflake from existing high-maintenance deployments
  • Refactoring, redesigning, and performance tuning of data architecture 
  • Deploying Snowpark API for integrating with Scala or Python applications 
  • Supporting modern tool selection and integration

Data Governance 

Data governance is critical to organizations hoping to achieve and maintain long-term success. Snowflake offers outstanding features such as object tagging or data classification that improve the security, quality, and value of the data. Additionally, when you work with 2nd Watch, we can help your organization establish a data governance council and program.
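
As a small illustration of the tagging feature, the sketch below applies a classification tag to a sensitive column by issuing Snowflake’s SQL tag commands from Python; the connection details and object names are placeholders.

```python
import snowflake.connector

# Placeholder credentials and object names.
conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    database="ANALYTICS", schema="GOVERNANCE",
)
cur = conn.cursor()

# Define the tag once, then attach it to sensitive columns so security
# policies, audits, and data discovery can key off the classification.
cur.execute("CREATE TAG IF NOT EXISTS pii_type COMMENT = 'PII classification'")
cur.execute(
    "ALTER TABLE ANALYTICS.CRM.CUSTOMERS "
    "MODIFY COLUMN EMAIL SET TAG pii_type = 'email'"
)
conn.close()
```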

2nd Watch will assist you in identifying and coaching early adopters and champions. We will help with establishing roles and responsibilities (e.g., business owners, stewards, custodians), as well as creating and documenting principles, policies, processes, and standards. Finally, we will identify the right technology to help automate these processes and improve your data governance maturity level.

Potential Deliverables

  • Data governance strategy
  • Change management: identification of early adopters and champions
  • Master data management implementation
  • Data quality: data profiling, cleansing, and standardization
  • Data security and compliance (e.g., PII, HIPAA, GRC)

2nd Watch will make sure your team is equipped to make the most of your Snowflake ecosystem and analytics tools, guiding the entire process through deployment of a successful initiative. Get started with our Snowflake Value Accelerator.


Mind the Gap! The Leap from Legacy to Modern Applications 

Most businesses today have evaluated their options for application modernization. Planned movement to the cloud happened ahead of schedule, driven by the need for rapid scalability and agility in the wake of COVID-19.

Legacy applications already rehosted or replatformed in the cloud saw increased load, highlighting painful inefficiencies in scalability and sometimes even causing outages. Your business has likely already taken some first steps in app modernization and updating legacy systems. 

Of the seven options for modernizing legacy systems outlined by Gartner, 2nd Watch commonly works with clients who have already successfully rehosted and replatformed applications. To a lesser extent, we see mainframe applications encapsulated in a modern RESTful API or replaced altogether. Businesses frequently take those first steps in their digital transformation but find themselves stuck crossing the gap to a fully modern application.

What common issues do businesses face as they move away from outdated technologies and progress toward fully modern applications, and what are the solutions?

Keeping the Goal in Mind 

Overcoming the inertia to begin a modernization project is often a lengthy process, requiring several months or as much as a year or more to complete the first phases. Development teams require training, thorough and careful planning must occur, and unforeseen challenges are encountered and overcome. Through it all, the needs of the business never slow down, and the temptation to halt or dramatically slow legacy modernization efforts after the initial phases of modernization can be substantial. 

No matter what the end state of the modernization journey looks like, it can be helpful to keep it at the forefront of the development team’s minds. In today’s remote and hybrid working environment, that’s not as easy as keeping a whiteboard or poster in a room. Sprint ceremonies should include a brief reminder of long-term business goals, especially backlog or sprint reviews. Keep the team invested in the business and technical reasons for modernizing, and keep the question “Why modernize legacy applications?” at the forefront of their minds. Most importantly, solicit their feedback on the process required to accomplish the long-term strategic goals of the business.

With the goal firmly in your development team’s minds, it’s time to tackle tactics in migrating from legacy apps to newer systems. What are some of these common stumbling blocks on the road to refactoring and rearchitecting legacy software? 

(Related article: Rehost vs Refactor vs Replatform | AppMod Essentials) 

Refactoring 

Refactoring an application can encompass a broad set of areas. Refactoring is sometimes as straightforward as reducing technical debt, or it can be as complex as breaking apart a monolithic application into smaller services. In 2nd Watch’s experience, some common issues when refactoring running applications include: 

  • Limited knowledge of cloud-based architectural patterns.
    Even common architectures like 2- and 3-tier applications require some legacy code changes when an application has moved from a data center to a cloud service provider or among cloud service providers. Where an older application may have hardcoded IP addresses or DNS names, a modern approach to accessing application tiers uses environment variables configured at runtime, pointing at load balancers (see the configuration sketch after this list).
  • Lack of telemetry and observability.
    Development teams are frequently hesitant to make changes quickly because there are too many unknowns in their application. Proper monitoring of known unknowns (metrics) and unknown unknowns (observability) can demystify the impact of refactoring. For more context around the types of unknowns and how to work with them in an application, Charity Majors frequently writes on the topic. 
  • Lack of thorough automated tests.
    A lack of automated tests also slows the ability to make changes because developers cannot anticipate what their changes might break. Improved telemetry and observability can help, but automated testing is the other side of the equation. Tools like Codecov can initially help improve test coverage, but unless carefully attended, incentivizing a percentage of test coverage across the codebase can lead to tests that do not thoroughly cover all common use cases. Good unit tests and integration testing can halt problems before they even start. 
  • No blueprint for optimal refactoring.
    Without a clear blueprint for understanding what an optimally refactored app looks like, development and information technology teams can become frustrated or unclear about their end goals. Heroku’s Twelve-Factor App methodology is one commonly used framework for crafting or refactoring modern applications. It has the added benefit of being applicable to many deployment models – single- or multiple-server, containers, or serverless. 
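
To illustrate the first issue above, here is a minimal twelve-factor-style configuration sketch: service endpoints come from environment variables set at deploy time, typically pointing at a load balancer’s DNS name, rather than being hardcoded. The variable names and defaults are examples only.

```python
import os

# Endpoints are injected at runtime (e.g., a load balancer's DNS name),
# so the same build artifact runs unchanged in any environment.
DB_HOST = os.environ.get("DB_HOST", "localhost")
DB_PORT = int(os.environ.get("DB_PORT", "5432"))
DB_NAME = os.environ.get("DB_NAME", "app")

def database_dsn() -> str:
    """Assemble a connection string from the environment instead of a
    hardcoded IP address baked into the code."""
    return f"postgresql://{DB_HOST}:{DB_PORT}/{DB_NAME}"
```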

Rearchitecting

Rearchitecting an application to leverage better capabilities, such as those found in a cloud service provider’s Platform-as-a-Service (PaaS) or Software-as-a-Service (SaaS) options, may present some challenges. The most common challenge 2nd Watch encounters with clients is not fully understanding the options available in modern environments. Older applications are the product of their time and typically were built optimally for the available technology and needs. However, when rearchitecting those applications, sometimes development teams either don’t know or don’t have details about better options that may be available. 

Running a MySQL database on the same machine as the rest of the monolithic application may have made sense when initially writing the application. Today, many applications can run more cheaply, more securely, and with the same or better performance using a combination of cloud storage buckets, managed caches like Redis or Memcached, and secrets managers. These consumption-based cloud options tend to be significantly cheaper than managed databases or databases running on cloud virtual machines. Scaling automatically with end-user demand and reduced management overhead are additional benefits of software modernization. 
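
As one example of the pattern, the sketch below shows a cache-aside read against a managed cache using the redis-py client; the cache endpoint and the fetch_product_from_db helper are hypothetical stand-ins for your own infrastructure.

```python
import json

import redis

# Placeholder endpoint for a managed cache (e.g., a cloud Redis service).
cache = redis.Redis(host="cache.example.internal", port=6379)

def get_product(product_id: int, ttl_seconds: int = 300) -> dict:
    """Cache-aside read: serve from the managed cache when possible;
    on a miss, fall back to the database and populate the cache."""
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    row = fetch_product_from_db(product_id)  # hypothetical DB helper
    cache.setex(key, ttl_seconds, json.dumps(row))
    return row
```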

Rearchitecting an application can also be frustrating for experienced systems administrators tasked with maintaining and troubleshooting production applications. For example, moving from VMs to containers introduces an entirely different way of dealing with logs. Sysadmins must forward them to a log aggregator instead of storing them on disk. And autoscaling can mean tracing an issue across potentially dozens or hundreds of instances instead of a small handful. Application modernization impacts every person involved with the long-term success of that application, not just developers and end-users.
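
A small sketch of that logging shift: the service writes structured lines to stdout and lets the platform or a sidecar ship the stream to the aggregator, rather than managing files on disk. The service name and log format are illustrative.

```python
import logging
import sys

# Log to stdout so the container platform (or a sidecar) forwards the
# stream to the aggregator; no log files accumulate on local disk.
logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format='{"ts": "%(asctime)s", "level": "%(levelname)s", '
           '"service": "orders", "msg": "%(message)s"}',
)
logging.info("order created order_id=123")
```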

Conclusion 

Application modernization is a long-term strategic activity, not a short-term tactical one. Over time, you will realize the benefits of lower total cost of ownership (TCO), increased agility, and faster time to market. Recognizing and committing to the future of your business will help you overcome the short- and mid-term challenges of app modernization.

Engaging a trusted partner to accelerate your app modernization journey and lead the charge across that gap is a powerful strategy to overcome some of the highlighted problems. It can be difficult to overcome a challenge with the same mindset that led to creating that challenge. An influx of different ideas and experiences can be the push development teams need to reach the next level for a business. 

If you’re wondering how to modernize legacy applications and are ready to work with a trusted advisor that can help you cross that gap, 2nd Watch will meet you wherever you are in your journey. Contact us to schedule a discussion of your goals, challenges, and how we can help you reach the end game of modern business applications. 

Michael Gray, 2nd Watch Senior Cloud Consultant 


Our Takeaways from Insurance AI and Innovative Tech 2022

The 2nd Watch team attended the Reuters Insurance AI and Innovative Tech conference this past month, and we took away a lot of insightful perspectives from the speakers and leaders at the event. The insurance industry has a noble purpose in the world: insurance organizations strive to provide fast service to customers suffering from injury and loss, all while allowing their agents to be efficient and profitable. For this reason, insurance companies need to constantly innovate to satisfy all parties involved in the value chain.

But this is no easy business model. Ensuring the satisfaction and success of all parties is becoming increasingly difficult for the following reasons:

  • The expectations and standards for a good customer experience are very high.
  • Insurers have a monumental amount of data to ingest and process.
  • The skills required to build useful analyses are at a premium.
  • It is easy to fail or get poor ROI on a technical initiative.

To keep up with the revolution, traditional insurance companies must undergo a massive digital transformation that supports a data-driven decision-making model. However, this sort of shift is daunting and riddled with challenges throughout the process. In presenting you with our takeaways from this eye-opening conference, we hope to address the challenges associated with redefining your insurance company and highlight new solutions that can help you tackle these issues head-on.

What are the pitfalls of an insurer trying to innovate?

The paradigm in the insurance industry has changed. As a result, your insurance business must adapt and improve digital capabilities to keep up with the market standards. While transformation is vital, it isn’t easy. Below are some pitfalls we’ve seen in our experience and that were also common themes at the Reuters event.

Your Corporate Culture Is Afraid of Failure

If your corporate culture avoids failure at all costs, then the business will be paralyzed in making necessary changes and decisions toward digital innovation. A lack of delivery can be just as damaging as bad delivery.

Your organization should prioritize incentivizing innovation and celebrating calculated risks. A culture that embraces quick failures will lead to more innovation because teams have the psychological safety net of trying out new things. Innovation cannot happen without disruption and pushing boundaries. 

You Ignore the Details and Only Focus on the Aggregate

Insurtech 1.0 of the 2000s failed (Metromile, Lemonade, etc.), but from that failure we gained valuable lessons. Ultimately, it taught us that anyone can grow while unintentionally losing money, and that this pitfall can be avoided by understanding the detailed events that have the greatest effect on key performance indicators.

Insurtech 1.0 leaders wanted to grow fast at all costs, but when these companies IPO’d, they flopped. Why? The short answer is that they focused only on growth and ignored the critical importance of high-quality underwriting. The growth-focused mindset led these Insurtech companies to write bad business to very risky customers (without realizing it!) because they were ignoring the “black swan” events that can have a major effect on your loss ratio.

Your insurance company should take note of the painful lessons Insurtech 1.0 had to go through. Be mindful of how you are growing by using technology to understand the primary drivers of cost. 

You Don’t Pursue an Initiative Because It Doesn’t Have a Quick ROI

Innovation initiatives don’t always have an instant ROI, but that shouldn’t scare you away from them. The results of new technologies often aren’t immediately clearly defined and can take some time to come to fruition. Auto insurers using telematics is an example of a trend worth pursuing, even though the ROI initially feels ambiguous.

To increase your confidence in documenting ROI, utilize historical data sources to establish your baseline. You can’t measure the impact of a new solution without comparing the before and after! From there, you can select which metrics to track to determine ROI. By leveraging your historical data, you can gather new data, leverage all data sets, and create new value.
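
As a toy example of establishing that baseline, the sketch below compares a monthly loss ratio before and after a hypothetical telematics launch date; all figures are invented.

```python
import pandas as pd

# Invented monthly loss-ratio history; the telematics program launches
# in 2022-01, so every earlier month forms the baseline.
history = pd.DataFrame({
    "month": pd.period_range("2021-07", "2022-06", freq="M"),
    "loss_ratio": [0.68, 0.71, 0.69, 0.72, 0.70, 0.69,
                   0.66, 0.64, 0.65, 0.63, 0.62, 0.64],
})
launch = pd.Period("2022-01", freq="M")

baseline = history.loc[history["month"] < launch, "loss_ratio"].mean()
post = history.loc[history["month"] >= launch, "loss_ratio"].mean()
print(f"Loss ratio: {baseline:.2%} baseline vs. {post:.2%} post-launch")
```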

How can you avoid these pitfalls?

The conference showed us that there are plenty of promising new technologies, solutions, and frameworks to help insurers resolve these commonly seen pain points. Below are key ways these new products can contribute to a successful digital transformation of your insurance offerings:

Create a Collaborative and Cross-Functional Corporate Culture

In order to drive an innovation-centric strategy, your insurance company must promote the right culture to support it. Innovation shouldn’t be centralized; take a strong interest in the new technologies and ideas put forward by individuals. Additionally, you should develop a technical plan that ties back to the business strategy. A common goal, and alignment toward that goal, will foster teamwork and shared responsibility around innovation initiatives.

Ultimately, you want to land in a place where you have created a culture of innovation. This should be a grassroots approach where every member of the organization feels capable and empowered to develop the ideas of today into the innovations and insurance products of tomorrow. Prioritize diversity of perspectives, access to leadership, employee empowerment, and alignment on results.  

Become More Customer-Centric and Less Operations-Focused

Your insurance company should make a genuine effort to understand your customers fully. This allows you to create tailored customer experiences for greater customer satisfaction. Empower your agents to use data to personalize and customize their touchpoints to the customer, and they can provide memorable customer experiences for your policyholders. 

Fraud indicators, quote modifiers, and transaction-centric features are operations-focused ways to use your data warehouse. These tools are helpful, but they can distract you from building a customer-oriented data warehouse. Your insurance business should make customers the central pillar of your technologies and frameworks.

Pilot Technologies Based on Your Company’s Strategic Business Goals

Every insurance business has a different starting point, and you have to deal with the cards that you are dealt. Start by understanding what your technology gap is and where you can reduce the pain points. From there, you can build a strong case for change and begin to implement the tools, frameworks, and processes needed to do so. 

Once you have established your business initiatives, there are powerful technologies for insurance companies that can help you transform and achieve your goals. For example, using data integration and data warehousing on cloud platforms, such as Snowflake, can enable KPI discovery and self-service. Another example is artificial intelligence and machine learning, which can help your business with underwriting transformation and provide you with “Next Best Action” by combining customer interests with the objectives of your business. 

Conclusion

Any tool or model you have in production today is already “legacy.” Digital insurance innovation doesn’t just mean upgrading your technologies and tools. It means creating an entire ecosystem and culture to form hypotheses, take measured risks, and implement the results! A corporate shift to embrace change in the insurance industry can seem overwhelming, but partnering with 2nd Watch, which has experts in both the technology and the insurance industry, will set your innovation projects up for success. Contact us today to learn how we can help you revolutionize your business!


Data Strategy for Insurance: How to Build the Right Foundation for Analytics and Machine Learning

Analytics and machine learning technologies are revolutionizing the insurance industry. Rapid fraud detection, improved self service, better claims handling, and precise customer targeting are just some of the possibilities. Before you jump head first into an insurance analytics project, however, you need to take a step back and develop an enterprise data strategy for insurance that will ensure long-term success across the entire organization.

Want better dashboards? Our data and analytics insurance team is here to help. Learn more about our data visualization starter pack.

Here are the basics to help get you started – and some pitfalls to avoid.

The Foundation of Data Strategy for Insurance

Identify Your Current State

What are your existing analytics capabilities? In our experience, data infrastructure and analysis are rarely implemented in a tidy, centralized way. Departments and individuals choose to implement their own storage and analytical programs, creating entire systems that exist off the radar. Evaluating the current state and creating a roadmap empowers you to conduct accurate gap analysis and arrange for all data sources to funnel into your final analytics tool.

Define Your Future State

A strong ROI depends on a clear and defined goal from the start. For insurance analytics, that means understanding the type of analytics capabilities you need (e.g., real-time analytics, predictive analytics) and the progress you want to make (e.g., more accurate premiums, reduced waste, more personalized policies). Through stakeholder interviews and business requirements, you can determine the exact fix to reduce waste during the implementation process.

Pitfalls to Avoid

Even with a solid roadmap, some common mistakes can hinder the end result of your insurance analytics project. Keep these in mind during the planning and implementation phases.

Don’t Try to Eat the Elephant in One Bite

Investing $5 million in an all-encompassing, enterprise-wide platform is good in theory. However, that’s a hefty price tag for an untested concept. We recommend our clients start with a more strategic proof of concept that can deliver ROI in months rather than years.

Maximize Your Data Quality

Your insights are only as good as your data. Even with a well-constructed data hub, your analytics cannot turn low-quality data into gems. Data quality management within your business provides a framework for better outcomes by identifying stale or unreliable data. But your team needs to take it a step further, acting with care to input accurate and timely data that your internal systems can use for analysis.
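As a simple illustration of what automated data quality management can look like, here is a minimal sketch of a pandas-based check that flags missing values, duplicates, and stale records. The column names (policy_id, premium, last_updated) and the 90-day staleness threshold are assumptions for the example, not a prescribed standard.

```python
# A minimal data quality check over a hypothetical table of policy records.
import pandas as pd

def quality_report(df: pd.DataFrame, max_age_days: int = 90) -> dict:
    """Flag missing values, duplicate policies, and stale records."""
    cutoff = pd.Timestamp.now() - pd.Timedelta(days=max_age_days)
    return {
        "missing_premium": int(df["premium"].isna().sum()),
        "duplicate_policies": int(df["policy_id"].duplicated().sum()),
        "stale_records": int((df["last_updated"] < cutoff).sum()),
        "total_rows": len(df),
    }

# Tiny sample dataset to show the output shape.
df = pd.DataFrame({
    "policy_id": ["P-1", "P-2", "P-2"],
    "premium": [1200.0, None, 950.0],
    "last_updated": pd.to_datetime(["2024-01-05", "2021-06-30", "2024-02-10"]),
})
print(quality_report(df))
```

Running checks like these on every load, rather than once a quarter, is what keeps low-quality data from silently contaminating downstream analytics.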

Align Analytics with Your Strategic Goals

Alignment with your strategic goals is a must for any insurance analytics project. There needs to be consensus among all necessary stakeholders – business divisions, IT, and top business executives – or each group will pull the project in different directions. This challenge is avoidable if the right stakeholders and users are included in planning the future state of your analytics program.

Integrate Analytics with Your Whole Business

Incompatible systems result in significant waste in any organization. If an analytics system cannot access the range of data sources it needs to evaluate, then your findings will fall short. During one project, our client wanted to launch a claims system and assumed it would be a simple integration of a few systems. When we conducted our audit, we found that 25 disparate source systems existed. Taking the time up front to run these types of audits prevents headaches down the road when you can’t analyze a key component of a given problem.

If you have any questions or are looking for additional guidance on analytics, machine learning, or data strategy for insurance, 2nd Watch’s insurance data and analytics team is happy to help. Feel free to contact us here.



What You Gain from a Well-Defined Enterprise Data Strategy

We find most businesses are eager to dig into their data. The possibility of solving persistent problems, revealing strategic insights, and reaching a better future state has mass appeal. However, when those same businesses insist on pursuing the latest and greatest technology without a game plan, I pump the brakes. Analytics without an enterprise data strategy is a lot like the Griswolds’ journey in National Lampoon’s Vacation: they ended up at their destination…but at what cost? Beyond a more streamlined process, here is what your business can expect from putting in the time to define your enterprise data strategy.


Verification of Your Current State

Data is messier than most businesses imagine. Without a robust governance strategy, data practices, storage, and quality issues often morph data beyond recognition. That’s why we recommend that any business interested in enterprise analytics first audit its current state – a key component of an effective enterprise data strategy.

Take one of our former enterprise clients as an example. Prior to our data strategy and architecture assessment, they believed there were no issues with their reporting environment. The assumption was that all of their end users could access the reporting dashboards and that there were negligible data quality issues. Our assessment uncovered a very different scenario.

Right off the bat, we found that many of their reports were still produced manually, and plenty of business users were waiting on reporting requests. There were data quality and performance issues in both the reporting environment and their transactional system. With that reality brought to the forefront, we were able to effectively address those issues and bring the client to an improved future state.

Even in less drastic instances, conducting the assessment phase of an enterprise data strategy enables a business to verify that assumptions about their data architecture and ecosystem are based in fact. In the long run, this prevents expensive mistakes and maximizes the future potential of your data.

Greater Need Fulfillment

An ad hoc approach to data restricts your return on investment. Without an underlying strategy, any data storage architecture or analytics solution is at best reactive, addressing an immediate concern without consideration for the big picture. Enterprise-wide needs go unfulfilled, and the shelf life of any data solution lasts about as long as a halved avocado left out overnight.

A firm enterprise data strategy avoids all these issues. For starters, there is an emphasis on holistic needs assessment across the enterprise. Interviews conducted with management-level stakeholders and a wide array of end users help you gain a panoramic view of the pain points and opportunities that data can help address. This leads to fewer organizational blind spots and a greater understanding of real-world scenarios.

How do you gain an enterprise-wide perspective? Asking the following questions in an assessment is a good start:

  •   What data is being measured? What subject areas are important to analyze?
  •   Which data sources are essential parts of your data architecture?
  •   What goals are you trying to achieve? What goals are important to your end users?
  •   Which end users are using the data? Who needs to run reports?
  •   What manual processes exist? Are there opportunities to automate?
  •   Are there data quality issues? Is data compliant with industry regulations?
  •   What are your security concerns? Are there any industry-specific compliance mandates?
  •   Which technologies are you using in-house? Which technologies do you want to use?

This is only the start of the process. Our own data strategy and architecture assessment goes in-depth with the full range of stakeholders to deliver data solutions that achieve the greatest ROI.

A Clearer Roadmap for the Future

How do your data projects move from point A to point B? The biggest advantage of a data strategy is providing your organization with a roadmap to achieve your goals. This planning phase outlines how your team will get new data architecture or analytics initiatives off the ground.

For example, one of our clients in the logistics space wanted to improve their enterprise-wide analytics. We analyzed their current data ingestion tool, SQL Server data warehouse, and separate data sources. The client knew their new solution would need to pull from a total of nine data sources ranging from relational databases on SQL Server and DB2 to API sources. However, they didn’t know how to bridge the gap between their vision and a real outcome for their business.

We conducted a gap analysis to determine what steps existed between their current and future state and incorporated findings from our stakeholder assessments. From there, we were able to build out a roadmap for a cloud-based data warehouse that would offer reports for executives and customers alike, in addition to providing access to advanced analytics. Our roadmap provided them with timelines, the technologies needed, incremental project milestones, and workforce requirements to facilitate a streamlined process.
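To give a flavor of the first step on such a roadmap, here is a minimal sketch of multi-source ingestion into a staging layer, of the kind a warehouse like this would sit on: two relational sources (SQL Server, DB2) and a REST API landing as raw files. The connection strings, API URL, table names, and staging path are all illustrative placeholders, not the client’s actual systems.

```python
# Minimal sketch: landing data from relational sources and a REST API in one
# staging layer. All endpoints and names below are hypothetical.
import pandas as pd
import requests
from sqlalchemy import create_engine

SOURCES = {
    "orders": create_engine("mssql+pyodbc://user:pwd@sqlserver_dsn"),
    "shipments": create_engine("ibm_db_sa://user:pwd@db2host:50000/LOGISTICS"),
}

def ingest_relational(table: str, engine) -> pd.DataFrame:
    """Pull a full-table snapshot from a relational source."""
    return pd.read_sql(f"SELECT * FROM {table}", engine)

def ingest_api(url: str) -> pd.DataFrame:
    """Pull JSON records from a REST endpoint."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())

frames = {name: ingest_relational(name, eng) for name, eng in SOURCES.items()}
frames["tracking_events"] = ingest_api("https://api.example.com/tracking")

for name, frame in frames.items():
    # Land raw snapshots for the warehouse; assumes a staging/ directory exists.
    frame.to_parquet(f"staging/{name}.parquet")
```

In a production build-out, each of the nine sources would get its own incremental-load logic and scheduling, but the staging-first pattern stays the same.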

With a similar roadmap at your disposal, you will start your organization on the right path to building out an effective data and analytics solution. 

Jason Maas
