Controlling Your AWS Spend for Innovation Investments

Cloud Spend 101: What is it, and why does it matter?

Cloud spend is the amount of money an organization spends in AWS and across all cloud platforms. A common belief is that moving to the cloud will significantly decrease your total cost of ownership (TCO) quickly, easily, and almost by default. Unfortunately, reaping the infrastructure cost savings of AWS is not that simple, but it is certainly attainable. To achieve a lower TCO while simultaneously boosting productivity and gaining operational resilience, business agility, and sustainability, you must strategize your migration and growth within AWS.

The most common mistake made when migrating from on-premises environments to the cloud is going “like-for-like” – creating a cookie-cutter image of what existed on-prem in the new cloud environment. Because the two are completely different types of infrastructure, organizations end up heavily over-provisioned and paying for unnecessary, expensive On-Demand Instance capacity.

Ideally, you want a well-developed game plan before migration starts to avoid losing money in the cloud. With the advice and support of a trusted cloud partner, a comprehensive strategy takes your organization from design to implementation to optimization. That puts you in the best position to stay on top of costs during every step of migration and once you’re established in AWS. Cost savings realized in the cloud can be reinvested in innovation that expands business value and grows your bottom line.

The 6 pillars of cloud spend optimization.

While it’s best to have a comprehensive strategy before migrating to the cloud, cloud spend optimization is an ongoing necessity in any cloud environment. With hundreds of cloud services and thousands of configuration and pricing options available today, choosing the right tools is overwhelming and leaves plenty of room for missteps. At the same time, there are also a lot of opportunities available. Regardless of where you are in your cloud journey, the six pillars of cloud spend optimization provide a framework for targeted interventions.

#1: Reserved Instances (RIs)

RIs deliver meaningful savings on Amazon EC2 costs compared to On-Demand Instance pricing. RIs aren’t physical instances but a billing discount applied to On-Demand Instances running in your account. Pricing is based on the instance attributes (instance type, Region, tenancy, and platform), the term commitment, the payment option, and the offering class.
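For a back-of-the-envelope sense of how those variables play out, here is a small sketch comparing a year of On-Demand usage with a one-year, no-upfront Standard RI. The hourly rates are illustrative placeholders, not published AWS prices; substitute the figures for your own instance type and Region.

```python
# Illustrative only: compare a year of On-Demand usage to a 1-year, no-upfront
# Standard RI. The rates below are placeholders, not published AWS prices.
HOURS_PER_YEAR = 8760

on_demand_rate = 0.17     # assumed $/hour for the instance type
ri_effective_rate = 0.11  # assumed effective $/hour under the RI

on_demand_annual = on_demand_rate * HOURS_PER_YEAR
ri_annual = ri_effective_rate * HOURS_PER_YEAR

savings = on_demand_annual - ri_annual
savings_pct = savings / on_demand_annual * 100

# Utilization below this point means the RI costs more than paying On-Demand,
# since a no-upfront RI is billed for every hour of the term regardless of use.
break_even_utilization = ri_effective_rate / on_demand_rate * 100

print(f"On-Demand annual cost:  ${on_demand_annual:,.2f}")
print(f"RI annual cost:         ${ri_annual:,.2f}")
print(f"Savings:                ${savings:,.2f} ({savings_pct:.0f}%)")
print(f"Break-even utilization: {break_even_utilization:.0f}%")
```

With these assumed rates, the RI saves roughly a third of the annual bill, but only if the instance runs most of the time; below the break-even utilization, On-Demand remains cheaper.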

#2: Auto-Parking

A significant benefit of the cloud is scalability, but the flip side is that resources sit under individual control. Team members often forget – or are not prompted or incentivized – to shut down resources when they aren’t being used. Auto-Parking schedules and automates the spin-up/spin-down process based on hours of use to prevent paying for idle resources. It is especially helpful for development and test environments.
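As a minimal sketch of the idea, the function below stops any running EC2 instances that carry a hypothetical auto-park tag. Run on an evening schedule (for example, via Amazon EventBridge), with a mirror-image start function each morning, it implements the spin-down half of Auto-Parking.

```python
import boto3

ec2 = boto3.client("ec2")

def stop_parked_instances(event=None, context=None):
    """Stop running EC2 instances that carry the (hypothetical) auto-park tag.

    Intended to run on an evening schedule; a matching function calling
    start_instances would bring the same instances back each morning.
    """
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[
            {"Name": "tag:auto-park", "Values": ["true"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )

    instance_ids = [
        instance["InstanceId"]
        for page in pages
        for reservation in page["Reservations"]
        for instance in reservation["Instances"]
    ]

    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return {"stopped": instance_ids}
```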

#3: Right-Sizing

Making sure you have exactly what you need and nothing you don’t requires an analysis of resource consumption, chargebacks, auto-parked resources, and available RIs. Using those insights, organizations can implement policies and guardrails to reduce overprovisioning by tagging resources for department-level chargebacks and properly monitoring CPU, memory, and I/O (input/output).
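As one hedged example of the monitoring piece, the sketch below pulls two weeks of average CPU utilization for an instance from Amazon CloudWatch and flags it as a right-sizing candidate when it stays under an assumed 20% threshold. Memory and I/O checks follow the same pattern (memory metrics require the CloudWatch agent).

```python
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")

def is_rightsizing_candidate(instance_id: str, threshold_pct: float = 20.0) -> bool:
    """Flag an instance whose 14-day average CPU stays under the threshold.

    The 20% threshold is an assumption; tune it to your own workload baseline.
    """
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=14)

    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=start,
        EndTime=end,
        Period=3600,  # hourly datapoints
        Statistics=["Average"],
    )

    datapoints = stats["Datapoints"]
    if not datapoints:
        return False  # no data, nothing to conclude

    avg_cpu = sum(dp["Average"] for dp in datapoints) / len(datapoints)
    return avg_cpu < threshold_pct
```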

#4: Family Refresh

Instance types, VM series, and instance families all describe the ways cloud providers package instances based on the underlying hardware. When instance types are retired and replaced with new technology, cloud pricing changes based on compute, memory, and storage parameters – a process referred to as “Family Refresh.” Organizations must closely monitor instances and expected costs to manage these price fluctuations and prevent redundancies.

#5: Waste

Inherent in optimization is waste reduction. You need the checks and balances we’ve discussed to prevent unnecessary costs and reap the financial benefits of a cloud environment. Identifying waste and stopping the leaks takes time and regular, accurate reporting within each business unit. For example, when developers are testing, make sure they’re only spinning up new environments for a specific purpose. Once those environments are no longer used, they should be decommissioned to avoid waste.

#6: Storage

Storage is a catalyst for many organizations’ move to the cloud because it’s a valuable way to reduce on-prem hardware spend. Again, to realize those savings, businesses must keep a watchful eye on what is being stored, why it’s being stored, and how much it will cost. There are typically four components impacting storage costs:

  1. Size – How much storage do you need?
  2. Data transfer (bandwidth) – How often does data move from one location to another?
  3. Retrieval time – How quickly do you need to access the data?
  4. Retrieval requests – How often do you need access to the data?

Depending on your answers to these questions, there are different ways to manage your environment using file storage, databases, data backup, and data archives. With a solid data lifecycle policy, organizations can estimate storage costs while right-sizing storage capacity and bandwidth.
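On AWS, those four questions translate naturally into an S3 lifecycle policy. The sketch below is illustrative – the bucket name, prefix, and day counts are assumptions – and transitions objects to infrequent-access storage after 30 days, archives them to Glacier after 90, and expires them after a year.

```python
import boto3

s3 = boto3.client("s3")

# Bucket name, prefix, and day counts are illustrative; align them with your
# own data lifecycle policy and retrieval requirements.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-analytics-archive",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-expire-raw-data",
                "Filter": {"Prefix": "raw/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```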

Private Pricing Agreements

Another way to control your AWS spend is with a PPA, or Private Pricing Agreement – formerly known as an EDP, or Enterprise Discount Program. A PPA is a business-led pricing agreement with AWS built around a specific term and commit amount. Organizations that are already in the cloud and love the service can use their expected growth over the next three or five years to earn a discount for committing to that spend with AWS over that term. In addition to expected compute usage, Reserved Instances, and existing Savings Plans, the business can also include software purchases from AWS Marketplace in the agreement for further discounts.

Choosing a cloud optimization partner.

It’s easy to know what to do to control spend, but it’s a whole other beast to integrate cloud optimization into business initiatives and the culture of both IT teams and finance teams. Of course, you can go it alone if you have the internal cloud expertise required for optimization, but most businesses partner with an external cloud expert to avoid the expenses, risk, and time needed to see results. Attempting these strategies without an experienced partner can cost you more in the long run without achieving the ROI you expected.

In fact, when going it alone, businesses gain about 18% savings on average. While that may sound satisfying, companies that partner with the cloud experts at 2nd Watch average 40% savings on their compute expenses alone. How? We aim high, and so should you. Regardless of how you or your cloud optimization partner tackles cloud spend, target 90% or greater coverage with Reserved Instances and Savings Plans. In addition to the six pillars of optimization and PPAs, you or your partner also need to…

  • Know how to pick the right services and products for your business from the thousands of options available.
  • Develop a comprehensive cloud strategy that goes beyond just optimizing cost.
  • Assess the overall infrastructure footprint to determine the effectiveness of serverless or containerization for higher efficiency.
  • Evaluate applications running on EC2 instances to identify opportunities for application modernization.

Take the next step in your cloud journey.

2nd Watch saves organizations hundreds of thousands of dollars in the cloud every year, and we’d love to help you reallocate your cloud spend toward business innovation. Our experienced cloud experts work with your team to teach cloud optimization strategies that can be carried out independently in the future. As an AWS Premier Partner with 10 years of experience, 2nd Watch advisors know how to maximize your environment within budget so you can grow your business. Contact Us to learn more and get started!


Value-Focused Due Diligence with Data Analytics

Private equity funds are shifting away from asset due diligence toward value-focused due diligence. Historically, the due diligence (DD) process centered around an audit of a portfolio company’s assets. Now, private equity (PE) firms are adopting value-focused DD strategies that are more comprehensive in scope and focus on revealing the potential of an asset.

Data analytics is key to supporting private equity groups as they conduct value-focused due diligence. Investors realize the power of data analytics technologies to accelerate deal throughput, reduce portfolio risk, and streamline the whole process. Data and analytics are essential enablers for any kind of value creation, and with them, PE firms can precisely quantify the opportunities and risks of an asset.


The Importance of Taking a Value-Focused Approach to Due Diligence

Due diligence is an integral phase in the merger and acquisition (M&A) lifecycle. It is the critical stage that grants prospective investors a view of everything happening under the hood of the target business. What is discovered during DD will ultimately impact the deal negotiation phase and inform how the sale and purchase agreement is drafted.

The traditional due diligence approach inspects the state of assets, and it is comparable to a home inspection before the house is sold. There is a checklist to tick off: someone evaluates the plumbing, another looks at the foundation, and another person checks out the electrical. In this analogy, the portfolio company is the house, and the inspectors are the DD team.

Asset-focused due diligence has long been the preferred method because it simply has worked. However, we are now contending with an ever-changing, unpredictable economic climate. As a result, investors and funds are forced to embrace a DD strategy that adapts to the changing macroeconomic environment.

With value-focused DD, partners at PE firms are not only using the time to discover cracks in the foundation, but they are also using it as an opportunity to identify and quantify huge opportunities that can be realized during the ownership period. Returning to the house analogy: during DD, partners can find the leaky plumbing and also scope out the investment opportunities (and costs) of converting the property into a short-term rental.

The shift from traditional asset due diligence to value-focused due diligence largely comes from external pressures, like an uncertain macroeconomic environment and stiffening competition. These challenges place PE firms in a race to find ways to maximize their upside to execute their ideal investment thesis. The more opportunities a PE firm can identify, the more competitive it can be for assets and the more aggressive it can be in its bids.

Value-Focused Due Diligence Requires Data and Analytics

As private equity firms increasingly adopt value-focused due diligence, they are crafting a more complete picture using data they are collecting from technology partners, financial and operational teams, and more. Data is the only way partners and investors can quantify and back their value-creation plans.

During the DD process, there will be mountains of data to sift through. Partners at PE firms must analyze it, discover insights, and draw conclusions from it. From there, they can execute specific value-creation strategies that are tracked with real operating metrics, rooted in technological realities, and modeled accurately to the profit and loss statements.

This makes data analytics an important and powerful tool during the due diligence process. Data analytics can come in different forms:

  • Data Scientists: PE firms can hire data science specialists to work with the DD team. Data specialists can process and present data in a digestible format for the DD team to extract key insights while remaining focused on key deal responsibilities.
  • Data Models: PE firms can use a robustly built data model to create a single source of truth. The data model can combine a variety of key data sources into one central hub. This enables the DD team to easily access the information they need for analysis directly from the data model.
  • Data Visuals: Data visualization can aid DD members in creating more succinct and powerful reports that highlight key deal issues.
  • Document AI: Harnessing the power of document AI, DD teams can glean insights from a portfolio company’s unstructured data to create an ever more well-rounded picture of a potential acquisition.

Data Analytics Technology Powers Value

Value-focused due diligence requires digital transformation. Digital technology is the primary differentiating factor that can streamline operations and power performance during the due diligence stage. Moreover, technology choices can increase or decrease the value of a company.

Data analytics ultimately allows PE partners to find operationally relevant data and KPIs needed to determine the value of a portfolio company. There will be enormous amounts of data for teams to wade through as they embark on the DD process. However, savvy investors only need the right pieces of information to accomplish their investment thesis and achieve value creation. Investing in robust data infrastructure and technologies is necessary to implement the automated analytics needed to more easily discover value, risk, and opportunities. Data and analytics solutions include:

  • Financial Analytics: Financial dashboards can provide a holistic view of portfolio companies. DD members can access on-demand insights into key areas, like operating expenses, cash flow, sales pipeline, and more.
  • Operational Metrics: Operational data analytics can highlight opportunities and issues across all departments.
  • Executive Dashboards: Leaders can access the data they need in one place. This dashboard is highly tailored to present hyper-relevant information to executives involved with the deal.

Conducting value-focused due diligence requires timely and accurate financial and operating information available on demand. 2nd Watch partners with private equity firms to develop and execute the data, analytics, and data science solutions PE firms need to drive these results in their portfolio companies. Schedule a no-cost, no-obligation private equity whiteboarding session with one of our private equity analytics consultants.


Live Broadcast Delivery and Playout

 Delivering Live Broadcasts and Playout with AWS: How 2nd Watch Can Help

Live broadcasts and playouts are critical requirements for media companies to satisfy their audiences. More customers are cutting cords and watching live channels on the web, mobile, tablets, and smart TVs. As a result, media companies are pressured to bring additional channels to market and scale up their delivery capabilities.

Amazon Web Services (AWS) gives media companies a platform for delivering live broadcasts and playout. AWS offers various services to help media companies design, build, and manage a scalable, reliable, cost-effective live broadcast and playout solution. These services include AWS Elemental MediaLive, a real-time video encoding service; Amazon CloudFront, a content delivery network (CDN); and AWS Elemental MediaConnect, a transport service for live video streams. In addition, partnerships with leading media and entertainment technology companies – such as Amagi, BCNEXXT, Grass Valley, and Imagine Communications – can provide expertise and support in implementing and managing a live broadcast and playout solution on AWS.

A Terraform template for this pattern defines a MediaLive channel that receives an SRT input and produces an HLS output, writes that output to an S3 bucket, and creates a CloudFront distribution to serve the HLS output to viewers.

Any such template must be customized to your specific use case. For instance, you must specify the SRT input settings, such as the IP address and port of the SRT source, as well as the details of the S3 bucket and CloudFront distribution.

2nd Watch is an experienced AWS partner with the knowledge and resources to support media companies in designing, building, and managing an effective live broadcast and playout solution on AWS. Our team works across a range of AWS services – including AWS Elemental MediaLive, Amazon CloudFront, and AWS Elemental MediaConnect – and with leading media and entertainment technology partners. Contact us today to learn more about our consulting services for live broadcast and playout on AWS.

Aleksander Hansson | Specialist Solutions Architect | 2nd Watch


Data & AI Predictions in 2023

As we reveal our data and AI predictions for 2023, join us at 2nd Watch to stay ahead of the curve and propel your business towards innovation and success. How do we know that artificial intelligence (AI) and large language models (LLMs) have reached a tipping point? It was the hot topic at most families’ dinner tables during the 2022 holiday break.


AI has become mainstream and accessible. Most notably, OpenAI’s ChatGPT took the internet by storm, so much so that even our parents (and grandparents!) are talking about it. Since AI is here to stay beyond the Christmas Eve dinner discussion, we put together a list of 2023 predictions we expect to see regarding AI and data.

#1. Proactively handling data privacy regulations will become a top priority.

Regulatory changes can have a significant impact on how organizations handle data privacy: businesses must adapt to new policies to ensure their data is secure. Modifications to regulatory policies require governance and compliance teams to understand data within their company and the ways in which it is being accessed. 

To stay ahead of regulatory changes, organizations will need to prioritize their data governance strategies. This will mitigate the risks surrounding data privacy and potential regulations. As a part of their data governance strategy, data privacy and compliance teams must increase their usage of privacy, security, and compliance analytics to proactively understand how data is being accessed within the company and how it’s being classified. 

#2. AI and LLMs will require organizations to consider their AI strategy.

The rise of AI and LLM technologies will require businesses to adopt a broad AI strategy. AI and LLMs will open opportunities in automation, efficiency, and knowledge distillation. But, as the saying goes, “With great power comes great responsibility.” 

There is disruption and risk that comes with implementing AI and LLMs, and organizations must respond with a people- and process-oriented AI strategy. As more AI tools and start-ups crop up, companies should consider how to thoughtfully approach the disruptions that will be felt in almost every industry. Rather than being reactive to new and foreign territory, businesses should aim to educate, create guidelines, and identify ways to leverage the technology. 

Moreover, without a well-thought-out AI roadmap, enterprises will find themselves plateauing technologically, with teams unable to adapt to the new landscape and no return on investment to show: they won’t be able to scale or support the initiatives they put in place. Poor roadmapping will lead to siloed, fragmented projects that don’t contribute to a cohesive AI ecosystem.

#3. AI technologies, like Document AI (or information extraction), will be crucial to tap into unstructured data.

According to IDC, 80% of the world’s data will be unstructured by 2025, and 90% of this unstructured data is never analyzed. Integrating unstructured and structured data opens up new use cases for organizational insights and knowledge mining.

Massive amounts of unstructured data – such as Word and PDF documents – have historically been a largely untapped data source for data warehouses and downstream analytics. New deep learning technologies, like Document AI, have addressed this issue and are more widely accessible. Document AI can extract previously unused data from PDF and Word documents, ranging from insurance policies to legal contracts to clinical research to financial statements. Additionally, vision and audio AI unlocks real-time video transcription insights and search, image classification, and call center insights.
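As a hedged illustration of the concept (not any specific vendor’s Document AI product), the sketch below pulls the text out of a PDF with the open-source pypdf library and runs an extractive question-answering model from Hugging Face Transformers over it; the file name and question are placeholders.

```python
from pypdf import PdfReader
from transformers import pipeline

# Placeholder document and question; swap in your own policy, contract, or report.
reader = PdfReader("insurance_policy.pdf")
text = "\n".join(page.extract_text() or "" for page in reader.pages)

# A general-purpose extractive QA pipeline; a production setup would fine-tune
# or prompt a domain-specific model instead of using the default checkpoint.
qa = pipeline("question-answering")
answer = qa(question="What is the policy's coverage limit?", context=text)

print(answer["answer"], f"(score: {answer['score']:.2f})")
```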

Organizations can unlock brand-new use cases by integrating these capabilities with their existing data warehouses. Fine-tuning general-purpose models on domain data then adapts them to a wide variety of domain- and task-specific use cases.

#4. “Data is the new oil.” Data will become the fuel for turning general-purpose AI models into domain-specific, task-specific engines for automation, information extraction, and information generation.

Snorkel AI coined the term “data-centric AI,” which is an accurate paradigm to describe our current AI lifecycle. The last time AI received this much hype, the focus was on building new models. Now, very few businesses need to develop novel models and algorithms. What will set their AI technologies apart is the data strategy.

Data-centric AI enables us to leverage existing models and calibrate them to an organization’s data. Applying an enterprise’s data to this new paradigm will accelerate a company’s time to market, especially for companies that have already modernized their data and analytics platforms and data warehouses.

#5. The popularity of data-driven apps will increase.

Snowflake recently acquired Streamlit, which makes application development more accessible to data engineers. Additionally, Snowflake introduced Unistore and hybrid tables (OLTP) so that data science and app teams can work jointly off a single source of truth in Snowflake, eliminating silos and data replication.

Snowflake’s big moves demonstrate that companies are looking to fill gaps that traditional business intelligence (BI) tools leave behind. With tools like Streamlit, teams can automate data sharing and deployment work that is traditionally manual and Excel-driven. Most importantly, Streamlit can become the conduit that lets business users work directly with AI-native and data-driven applications across the enterprise.
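A minimal sketch of what that can look like, assuming a hypothetical load_pipeline_metrics() helper that would normally query the warehouse (for example, through the Snowflake connector) instead of returning inline sample data:

```python
import pandas as pd
import streamlit as st

def load_pipeline_metrics() -> pd.DataFrame:
    """Hypothetical helper: a real app would query the warehouse (e.g., Snowflake)
    here instead of returning an inline sample."""
    return pd.DataFrame(
        {
            "month": ["2023-01", "2023-02", "2023-03"],
            "pipeline_value": [1_200_000, 1_450_000, 1_380_000],
            "win_rate": [0.21, 0.24, 0.22],
        }
    )

st.title("Sales Pipeline Overview")

df = load_pipeline_metrics()
metric = st.selectbox("Metric", ["pipeline_value", "win_rate"])

st.line_chart(df.set_index("month")[metric])
st.dataframe(df)
```

Running `streamlit run app.py` turns this script into a small interactive data app that business users can explore without touching SQL or spreadsheets.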

#6. AI-native and cloud-native applications will win.

Customers will start expecting AI capabilities to be embedded into cloud-native applications. Harnessing domain-specific data, companies should prioritize building upon modular, data-driven application blocks with AI and machine learning. AI-native applications will win over AI-retrofitted applications. 

When applications are custom-built for AI, analytics, and data, they are more accessible to data and AI teams, enabling business users to interact with models and data warehouses in a new way. Teams can classify and label data in a centralized, data-driven way – rather than manually and repetitively in Excel – and feed it into a human-in-the-loop system for review, improving the overall accuracy and quality of models. Traditional BI tools like dashboards, on the other hand, often limit business users to consuming and viewing data in a “what happened?” manner rather than interacting with it in a more targeted way.

#7. There will be technology disruption and market consolidation.

The AI race has begun. Microsoft’s strategic partnership with OpenAI and integration of it into “everything,” Google’s introduction of Bard and funding of the foundation-model startup Anthropic, AWS’s own native models and partnership with Stability AI, and a wave of new AI-related startups are just a few of the major signals that the market is changing. These emerging AI technologies are driving market consolidation: smaller companies are being acquired by incumbents looking to take advantage of the developing technology. 

Mergers and acquisitions are key growth drivers, with larger enterprises leveraging their existing resources to acquire smaller, nimbler players to expand their reach in the market. This emphasizes the importance of data, AI, and application strategy. Organizations must stay agile and quickly consolidate data across new portfolios of companies. 

Conclusion

The AI ball is rolling. At this point, you’ve probably dabbled with AI or engaged in high-level conversations about its implications. The next step in the AI adoption process is to actually integrate AI into your work and understand the changes (and challenges) it will bring. We hope that our data and AI predictions for 2023 prime you for the ways it can have an impact on your processes and people.

Think you’re ready to get started? Find out with 2nd Watch’s data science readiness assessment.


Modern Data Warehouses and Machine Learning: A Powerful Pair

Artificial intelligence (AI) technologies like machine learning (ML) have changed how we handle and process data. However, AI adoption isn’t simple. Most companies utilize AI only for the tiniest fraction of their data because scaling AI is challenging. Typically, enterprises cannot harness the power of predictive analytics because they don’t have a fully mature data strategy.


To scale AI and ML, companies must have a robust information architecture that executes a company-wide data and predictive analytics strategy. This requires businesses to focus their use of data beyond cost reduction and operations alone. Fully embracing AI will require enterprises to make judgment calls and face challenges in assembling a modern information architecture that readies company data for predictive analytics. 

A modern data warehouse is the catalyst for AI adoption and can accelerate a company’s data maturity journey. It’s a vital component of a unified data and AI platform: it collects and analyzes data to prepare the data for later stages in the AI lifecycle. Utilizing your modern data warehouse will propel your business past conventional data management problems and enable your business to transform digitally with AI innovations.

What is a modern data warehouse?

On-premises and legacy data warehouses are no longer sufficient for a competitive business. Today’s market demands that organizations rely on massive amounts of data to best serve customers, optimize business operations, and increase their bottom lines. On-premises data warehouses are not designed to handle this volume, velocity, and variety of data and analytics.

If you want to remain competitive in the current landscape, your business must have a modern data warehouse built on the cloud. A modern data warehouse automates data ingestion and analysis, which closes the loop that connects data, insight, and analysis. It can run complex queries to be shared with AI technologies, supporting seamless ML and better predictive analytics. As a result, organizations can make smarter decisions because the modern data warehouse captures and makes sense of organizational data to deliver actionable insights company-wide.

How does a modern data warehouse work with machine learning?

A modern data warehouse operates at different levels to collect, organize, and analyze data to be utilized for artificial intelligence and machine learning. These are the key characteristics of a modern data warehouse:

Multi-Model Data Storage

Data is stored in the warehouse to optimize performance and integration for specific business data. 

Data Virtualization

Data that is not stored in the data warehouse is accessed and analyzed at the source, which reduces complexity, risk of error, cost, and time in data analysis. 

Mixed Workloads

This is a key feature of a modern data warehouse: mixed workloads support real-time warehousing. Modern data warehouses can concurrently and continuously ingest data and run analytic workloads.

Hybrid Cloud Deployment

Enterprises choose hybrid cloud infrastructure to move workloads seamlessly between private and public clouds for optimal compliance, security, performance, and costs. 

A modern data warehouse can collect and process the data to make the data easily shareable with other predictive analytics and ML tools. Moreover, these modern data warehouses offer built-in ML integrations, making it seamless to build, train, and deploy ML models.

What are the benefits of using machine learning in my modern data warehouse?

Modern data warehouses employ machine learning to adjust and adapt to new patterns quickly. This empowers data scientists and analysts to receive actionable insights and real-time information, so they can make data-driven decisions and improve business models throughout the company. 

Let’s look at how this applies to the age-old question, “how do I get more customers?” We’ll discuss two different approaches to answering this common business question.

The first methodology is the traditional approach: develop a marketing strategy that appeals to a specific audience segment. Your business can determine the segment to target based on your customers’ buying intentions and your company’s strength in providing value. Coming to this conclusion requires asking inductive questions about the data:

  • What is the demand curve?
  • What product does our segment prefer?
  • When do prospective customers buy our product?
  • Where should we advertise to connect with our target audience?

There is no shortage of business intelligence tools and services designed to help your company answer these questions. This includes ad hoc querying, dashboards, and reporting tools.

The second approach utilizes machine learning within your data warehouse. With ML, you can harness your existing modern data warehouse to discover the inputs that impact your KPIs most. You simply have to feed information about your existing customers into a statistical model, then the algorithms will profile the characteristics that define an ideal customer. We can ask questions around specific inputs:

  • How do we advertise to women with annual income between $100,000 and $200,000 who like to ski?
  • What are the indicators of churn in our self-service customer base?
  • What are frequently seen characteristics that will create a market segmentation?

ML builds models within your data warehouse that let you discover your ideal customer from those inputs. For example, you can describe your target customer to the model, and it will find potential customers that fall under that segment. Or, you can feed it data on your existing customers and have the machine learn which characteristics matter most. 
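As a hedged sketch of this second approach, the example below trains a scikit-learn model on a customer table exported from the warehouse – the customers.csv file and its column names are assumptions – and then ranks which characteristics matter most for spotting churn or defining an ideal customer.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Assumed export from the data warehouse; file name and columns are illustrative.
customers = pd.read_csv("customers.csv")
features = ["annual_income", "tenure_months", "support_tickets", "monthly_spend"]
target = "churned"  # 1 if the customer churned, 0 otherwise

X_train, X_test, y_train, y_test = train_test_split(
    customers[features], customers[target], test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")

# Rank the characteristics the model leaned on most heavily.
importances = pd.Series(model.feature_importances_, index=features)
print(importances.sort_values(ascending=False))
```

The same pattern works for ideal-customer profiling: swap the churn label for a "high lifetime value" flag and let the feature importances describe that segment.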

Conclusion

A modern data warehouse is essential for ingesting and analyzing data in our data-heavy world.  AI and predictive analytics feed off more data to work effectively, making your modern data warehouse the ideal environment for the algorithms to run and enabling your enterprise to make intelligent decisions. Data science technologies like artificial intelligence and machine learning take it one step further and allow you to leverage the data to make smarter enterprise-wide decisions.

2nd Watch offers a Data Science Readiness Assessment to provide you with a clear vision of how data science will make the greatest impact on your business. Our assessment will get you started on your data science journey, harnessing solutions such as advanced analytics, ML, and AI. We’ll review your goals, assess your current state, and design preliminary models to discover how data science will provide the most value to your enterprise.

-Ryan Lewis | Managing Consultant at 2nd Watch

Get started with your Data Science Readiness Assessment today to see how you can stay competitive by automating processes, improving operational efficiency, and uncovering ROI-producing insights.


3 Data Priorities for Organic Value Creation

Organic value creation focuses on a few main areas, including improving current performance (both financial and operational) of your companies, establishing a pattern of consistent growth, strengthening your organizational leadership team, and building the potential for a brighter future through product and competitive positioning. All of these are supported by and/or partially based on the data foundation you create in your companies. At exit, your buyers want to see and feel confident that the created organic value is sustainable and will endure. Data and analytics are key to proving that. 


Companies that solely focus on competition will ultimately die. Those that focus on value creation will thrive. — Edward De Bono

To organically create and drive value, there are a few key data priorities you should consider:

  1. A starting point is data quality, which underpins all you will ever do and achieve with data in your organization. Achieving better-quality data is an unrelenting task, one that many organizations overlook.
  2. Data monetization is a second priority and is also not top-of-mind for many organizations. The adage that “data is the new oil” is at least partially true, and most companies have ways and means to leverage the data they already possess to monetize and grow revenue for improved financial returns.
  3. A third data priority is to focus on user adoption. Having ready data and elite-level analytical tools is not sufficient. You need to be sure the data and tools you have invested in are broadly used – and not just in the short term. You also need to continue to evolve and enhance both your data and your tools to grow that adoption for future success.

Data Quality

Data quality is a complicated topic worthy of a separate article. Let’s focus our data quality discussion on two things: trust and the process of data quality.

If you are organically growing your companies and increasing the use of and reliance upon your data, you better make sure you trust your data. The future of your analytics solutions and broad adoption across your operational management teams depend on your data being trustworthy. That trust means that the data is accurate, consistent across the organization, timely, and involved in a process to ensure the continuing trust in the data. There is also an assumption that your data aligns with external data sources. You can measure accuracy of your portfolio company’s data in many ways, but the single best measure is going to be how your operating executives answer the question, “How much do you trust your data?”

Data quality is never stagnant. There are always new data sources, changes in the data itself, outside influences on the data, etc. You cannot just clean the data once and expect it to stay clean. The best analogy is a stream that can get polluted from any source that feeds into the stream. To maintain high data quality over time, you need to build and incorporate processes and organizational structures that monitor, manage, and own the quality of your company’s data.

One “buzzwordy” term often applied to good data governance is data stewardship – the idea being that someone within your enterprise has the authority and responsibility to keep your data of the highest quality. There are efficient and effective ways to dramatically improve your company data and to keep it of the highest quality as you grow the organization. Simply put, do something about data quality, make sure that someone or some group is responsible for data quality, and find ways to measure your overall data quality over time.

A leading equipment distributor found new revenue sources and increased competitive edge by leveraging the cloud data warehouse that 2nd Watch built for their growing company to share data on parts availability in their industry. Using the centralized data, they can grow revenue, increase customer service levels, and have more industry leverage from data that they already owned. Read this private equity case study here.

Data Monetization

Organic value creation can also come from creating value out of the data your portfolio companies already own. Data monetization for you can mean such options as:

Enriching your internal data – Seek ways to make your data more valuable internally. This most often comes from cross-functional data creation (e.g., taking costing data and marrying it with sales/marketing data to infer lifetime customer value). The unique view that this enriched internal data offers will often lead to better internal decision-making and will drive more profitable analytics as you grow your analytics solutions library.

Finding private value buyers – Your data, cleansed and anonymized, is highly valuable. Your suppliers will pay for access to more data and information that helps them customize their offerings and prices to create value for customers. Your own customers would pay for enhanced information about your products and services if you can add value to them in the process. Within your industry, there are many ways to anonymize and sell the data that your portfolio companies create.

Finding public value buyers – Industry trade associations, consultancies, conference organizations, and the leading advisory firms are all eager to access unique insights and statistics they can use and sell to their own clients to generate competitive advantage.

Building a data factory mindset – Modern cloud data warehouse solutions make the technology to monetize your data quite easy. There are simple ways to make the data accessible and a marketplace for selling such data from each of the major cloud data warehouse vendors. The hardest part is not finding buyers or getting them the data; it is building an internal mindset that your internal data is a valuable asset that can be easily monetized. 

User Adoption

Our firm works with many private equity clients to design, build, and implement leading analytics solutions. A consistent learning across our project work is that user adoption is a critical success factor in our work.

More accurate, more timely, or more enriched data won’t necessarily increase the adoption of advanced analytical solutions in your portfolio companies. Not all of your operating executives are data-driven, nor are they all analytically driven. Just because they capably produce their monthly reporting package and get it to you on time does not mean they are acting on the issues and opportunities they should be able to discern from the data. Better training, organizational change techniques, internal data sharing, and other approaches can dramatically increase the speed and depth of user adoption in your companies.

You know how to seek value when you invest. You know how to grow your companies post-close. Growing organically during your hold period will drive increased exit valuations and let you outperform your investment thesis. Focus on data quality and broad user adoption as two of your analytics priorities for strong organic value creation across your portfolio.

Contact us today to set up a complimentary private equity data whiteboarding session. Our analytics experts have a template for data monetization and data quality assessments that we can run through with you and your team.


What Is the Difference Between Snowflake and Amazon Redshift?

The modern business world is data-centric. As more businesses turn to cloud computing, they must evaluate and choose the right data warehouse to support their digital modernization efforts and business outcomes. Data warehouses can increase the bottom line, improve analytics, enhance the customer experience, and optimize decision-making. 

A data warehouse is a large repository of data businesses utilize for deep analytical insights and business intelligence. This data is collected from multiple data sources. A high-performing data warehouse can collect data from different operational databases and apply a uniform format for better analysis and quicker insights.

Two of the most popular data warehouse solutions are Snowflake and Amazon Web Services (AWS) Redshift. Let’s look at how these two data warehouses stack up against one another. 


What is Snowflake?

Snowflake is a cloud-based data warehousing solution that runs on third-party cloud compute resources, such as Azure, Google Cloud Platform, or Amazon Web Services (AWS). It is designed to provide users with a fully managed, cloud-native database solution that can scale up or down as needed for different workloads. Snowflake separates compute from storage – a non-traditional approach to data warehousing. With this method, data remains in a central repository while compute instances are managed, sized, and scaled independently. 

Snowflake is a good choice for companies that are conscious about their operational overhead and need to quickly deploy applications into production without worrying about managing hardware or software. It is also the ideal platform to use when query loads are lighter, and the workload requires frequent scaling. 

The benefits of Snowflake include:

  • Easy integration with most components of data ecosystems
  • Minimal operational overhead: companies are not responsible for installing, configuring, or managing the underlying warehouse platform
  • Simple setup and use
  • Abstracted configuration for storage and compute instances
  • Robust and intuitive SQL interface

What is Amazon Redshift?

Amazon Redshift is an enterprise data warehouse built on Amazon Web Services (AWS). It provides organizations with a scalable, secure, and cost-effective way to store and analyze large amounts of data in the cloud. Its cloud-based compute nodes enable businesses to perform large-scale data analysis and storage. 

Amazon Redshift is ideal for enterprises that require quick query outputs on large data sets. Additionally, Redshift offers several options for efficiently managing its clusters: the AWS CLI, the Amazon Redshift console, the Amazon Redshift Query API, and the AWS SDKs. Redshift is a great solution for companies already using AWS services and running applications with a high query load. 

The benefits of Amazon Redshift include:

  • Seamless integration with the AWS ecosystem
  • Multiple data output formatting support
  • Easy console to extract analytics and run queries
  • Customizable data and security models

Comparing Data Warehouse Solutions

Snowflake and Amazon Redshift both offer impressive performance capabilities, like scalability across multiple servers and high availability with minimal downtime. There are some differences between the two that will determine which one is the best fit for your business.

Performance

Both data warehouse solutions harness massively parallel processing (MPP) and columnar storage, which enables advanced analytics and efficiency on massive jobs. Snowflake boasts a unique architecture that supports structured and semi-structured data, with storage, compute, and cloud services abstracted so each can be optimized independently. Redshift more recently introduced concurrency scaling, coupled with machine learning, to compete with Snowflake’s approach to handling concurrent workloads. 

Maintenance

Snowflake is a pure SaaS platform that doesn’t require any maintenance. All software and hardware maintenance is handled by Snowflake. Amazon Redshift’s clusters require manual maintenance from the user.

Data and Security Customization

Snowflake supports fewer customization choices in data and security. Snowflake’s security utilizes always-on encryption enforcing strict security checks. Redshift supports data flexibility via partitioning and distribution. Additionally, Redshift allows you to tailor its end-to-end encryption and set up your own identity management system to manage user authentication and authorization.

Pricing

Both platforms offer on-demand pricing but are packaged differently. Snowflake doesn’t bundle usage and storage in its pricing structure and treats them as separate entities. Redshift bundles the two in its pricing. Snowflake tiers its pricing based on what features you need. Your company can select a tier that best fits your feature needs. Redshift rewards businesses with discounts when they commit to longer-term contracts. 

Which data warehouse is best for my business?

To determine the best fit for your business, ask yourself the following questions in these specific areas:

  • Do I want to bundle my features? Snowflake splits compute and storage, and its tiered pricing provides more flexibility to your business to purchase only the features you require. Redshift bundles compute and storage to unlock the immediate potential to scale for enterprise data warehouses. 
  • Do I want a customizable security model? Snowflake grants security and compliance options geared toward each tier, so your company’s level of protection is relevant to your data strategy. Redshift provides fully customizable encryption solutions, so you can build a highly tailored security model. 
  • Do I need JSON storage? Snowflake’s native JSON support wins out over Redshift’s. With Snowflake, you can store and query JSON with native functions (see the sketch after this list). With Redshift, JSON is split into strings, making it difficult to query and work with. 
  • Do I need more automation? Snowflake automates issues like data vacuuming and compression. Redshift requires hands-on maintenance for these sorts of tasks. 
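To make the JSON point concrete, here is a hedged sketch using the Snowflake Python connector to query a semi-structured VARIANT column directly with Snowflake’s path syntax; the connection details, table, and column names are placeholders.

```python
import snowflake.connector

# Connection parameters are placeholders; use your own account and credentials.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="EVENTS",
)

# 'payload' is assumed to be a VARIANT column holding raw JSON events.
query = """
    SELECT
        payload:customer.id::string    AS customer_id,
        payload:purchase.total::number AS purchase_total
    FROM web_events
    WHERE payload:purchase.total::number > 100
    LIMIT 10
"""

cur = conn.cursor()
try:
    cur.execute(query)
    for customer_id, purchase_total in cur:
        print(customer_id, purchase_total)
finally:
    cur.close()
    conn.close()
```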

Conclusion

A data warehouse is necessary to stay competitive in the modern business world. The two major data warehouse players – Snowflake and Amazon Redshift – are both best-in-class solutions. One product is not superior to the other, so choosing the right one for your business means identifying the one best for your data strategy.

2nd Watch is an AWS Certified Partner and an Elite Snowflake Consulting Partner. We can help you choose the right data warehouse solution and support your business regardless of which data warehouse you choose.

We have been recognized by AWS as a Premier Partner since 2012, as well as an audited and approved Managed Service Provider and Data and Analytics Competency partner for our outstanding customer experiences, depth and breadth of our products and services, and our ability to scale to meet customer demand. Our engineers and architects are 100% certified on AWS, holding more than 200 AWS certifications.

Our full team of certified SnowPros has proven expertise to help businesses implement modern data solutions using Snowflake. From creating a simple proof of concept to developing an enterprise data warehouse to customized Snowflake training programs, 2nd Watch will help you to utilize Snowflake’s powerful cloud-based data warehouse for all of your data needs.

Contact 2nd Watch today to help you choose the right data warehouse for your business!


4 Data Principles for Operational Resilience

Scaling your portfolio companies creates value, and increasing their native agility multiplies the value created. The foundation of better resilience in any company is often based on the ready availability of operational data. Access to the data you need to address problems or opportunities is necessary if you expect your operating executives and management teams to run the business more effectively than their competitors.

Resilience is the strength and speed of our response to adversity – and we can build it. It isn’t about having a backbone. It’s about strengthening the muscles around our backbone. — Sheryl Sandberg


You need and want your portfolio companies to be operationally resilient – to be ready and able to respond to changes and challenges in their operations. We all have seen dramatic market changes in recent years, and we all should expect continued dynamic economic and competitive pressures to challenge even the best of our portfolio companies. Resilient companies will respond better to such challenges and will outperform their peers.

This post highlights four areas that you and your operating executives should consider as you strive to make yourself more operationally resilient:

  1. Data engineering takes time and effort. You can do a quick and dirty version of data engineering, also called loading it into a spreadsheet, but that won’t be sufficient to achieve what you really need in your companies.
  2. Building a data-driven culture takes time. Having the data ready is not enough; you need to change the way your companies use the data in their tactical and strategic decision-making. And that takes some planning and some patience to achieve.
  3. Adding value to the data takes time. Once you have easily accessible data, as an organization you should strive to add or enrich the data. Scoring customers or products, cleaning or scrubbing your source data, and adding external data are examples of ways you can enrich your data once you have it in a centrally accessible place.
  4. Get after it. You need and want better analytics in every company you own or manage. This is a journey, not a single project. Getting started now is paramount to building agility and resiliency over time on that journey.

Data Engineering can be Laborious

Every company has multiple application source systems that generate and store data. Those systems keep the data in their own proprietary databases, in formats that suit transactional workloads, and often redundantly store common reference data like customer number, customer name, address, and so on. Getting all that data extracted, standardized, scrubbed, and modeled the way you need to manage your business takes months. You likely must hire consultants to build the data pipelines, create a data warehouse to store the data, and then build the reports and dashboards for data analysis.

On most of our enterprise analytics projects, data engineering consumes 60-70% of the time and effort put into the project. Ask any financial analyst or business intelligence developer – most of their time is spent getting their hands on the right, clean data. Dashboards and reports are quickly built once the data is available.

CASE STUDY

The CEO of a large manufacturing company wanted to radically increase the level of data-driven decision-making in his company. Working with his executive team, we quickly realized that functional silos, prior lack of easy data access, and ingrained business processes were major inhibitors to achieving their vision. 2nd Watch incorporated extensive organizational change work while we built a new cloud-based analytics warehouse to facilitate and speed the pace of change. Read the full case study.

A Data-driven Culture needs to be Nurtured and Built

Giving your executives access to data and reports is only half the battle. Most executives are used to making decisions without the complete picture and without a full set of data. Resiliency comes from having the data and from using it wisely. If you build it, not all will come to use it.

Successful analytics projects incorporate organizational change management elements to drive better data behaviors. Training, better analytics tools, collaboration, and measuring adoption are just some of the best practices that you can bring to your analytics projects to drive better use of the data and analysis tools that will lead to more resilience in your portfolio companies.

Data Collaboration Increases the Value of your Data

We consistently find that cross-functional sharing of data and analytics increases the value and effectiveness of your decision-making. Most departments and functions have access to their own data – finance has access to the GL and financial data, marketing has access to marketing data, etc. Building a single data model that incorporates all of the data, from all of the silos, increases the level of collaboration that lets your executives from all functions simultaneously see and react to the performance of the business.

Let’s be honest, most enterprises are still managed through elaborate functional spreadsheets that serve as the best data source for quick analysis. Spreadsheets are fine for individual analysis and reporting, and for quick ad-hoc analytics. They are not a viable tool for extensive collaboration and won’t ever enable the data value enhancement that comes from a “single source of truth.”

Operating Executives need to Build Resilience as they Scale their Companies.

Change is constant, markets evolve, and today’s problems and opportunities are not tomorrow’s. Modern data and analytics solutions can radically improve your portfolio companies’ operational resilience and drive higher value. These solutions can be technically and organizationally complex and will take time to implement and achieve results. Start building resiliency in your portfolio companies by mapping out a data strategy and creating the data foundation your companies need.

Contact us today to set up a complimentary whiteboarding session. Our analytics experts will work through a high-level assessment with you.


How to Maximize the Business Value of The Cloud Using Cloud Economics

Cloud economics is crucial for an organization to make the most out of their cloud solutions, and business leaders need to prioritize shifting their company culture to embrace accountability and trackability. 

When leaders hear the phrase “cloud economics,” they think about budgeting and controlling costs. Cost management is an element of cloud economics, but it is not the entire equation. In order for cloud economics to be implemented in a beneficial way, organizations must realize that cloud economics is not a budgetary practice, but rather an organizational culture shift.

The very definition of “economics” indicates that the study is more than just a numbers game. Economics is “a science concerned with the process or system by which goods and services are produced, sold, and bought.” The practice of economics involves a whole “process or system” where actors and actions are considered and accounted for. 

With this definition in mind, cloud economics means that companies are required to look at key players and behaviors when evaluating their cloud environment in order to maximize the business value of their cloud. 

Once an organization has fully embraced the study of cloud economics, it will be able to gain insight into which departments are utilizing the cloud, what applications and workloads are utilizing the cloud, and how all of these moving parts contribute to greater business goals. Embodying transparency and trackability enables teams to work together in a harmonious way to control their cloud infrastructure and prove the true business benefits of the cloud. 

If business leaders want to apply cloud economics to their organizations, they must go beyond calculating cloud costs. They will need to promote a culture of cross-functional collaboration and honest accountability. Leadership should prioritize and facilitate the joint efforts of cloud architects, cloud operations, developers, and the sourcing team. 

Cloud economics will encourage communication, collaboration, and change in culture, which will have the added benefit of cloud cost management and cloud business success. 

Where do companies lose control of their cloud costs?

When companies lose control of cloud costs, the business value of the cloud disappears as well. If cloud spend keeps climbing and there is no business value to show for it, how are leaders supposed to feel good about their cloud infrastructure? Going over budget with no benefits would not be a sound business case for any enterprise in any industry. 

Losing control of cloud spending is surprisingly easy, and it usually boils down to poor business decisions that come from leadership. Company leaders should first recognize that they wield the power to manage cloud costs and foster communication between teams. If they make poor business decisions – like prioritizing speedy delivery over well-written code or not promoting transparency – they are allowing practices that negatively impact cloud costs. 

When leaders push their teams to be fast rather than thorough, it creates technical debt and tension between teams. The following sub-optimal practices can happen when leadership is not prioritizing cloud cost optimizations:

  • Developers ignore seemingly small administrative tasks that are actually immensely important and consequential, like rightsizing infrastructure or turning off inactive applications. 
  • Architects select suboptimal designs that are easier and faster to run but are more expensive to implement.
  • Developers use inefficient code and crude algorithms in order to ship a feature faster, then fail to revisit performance optimizations that would reduce resource consumption.
  • Developers forgo deployment automation that would help to automatically rightsize.
  • Developers build code that isn’t inherently cloud-native, and therefore not cloud-optimized.
  • Finance and procurement teams are only looking at the bottom line and don’t fully understand why the cloud bill is so high, therefore, creating tension between IT/dev and finance/procurement. 

When these actions compound, they lead to an infrastructure mess that is incredibly difficult to clean up. Poorly implemented designs that are not easily scalable require significant development time to fix, leaving companies with inefficient cloud infrastructure and preposterously high cloud costs.

Furthermore, these high and unexplained cloud bills cause rifts between teams and are detrimental to collaboration efforts. Lack of accountability and visibility causes developer and finance teams to have misaligned business objectives. 

Poor cloud governance and culture are derived from leadership’s misguided business decisions and muddled planning. If leaders don’t prioritize cloud cost optimization through cloud economics, the business value of the cloud is diminished and company collaboration will suffer. Developers and architects will continue to execute processes that create high cloud costs, and finance and procurement teams will forever be at odds with the IT team.

What are the benefits of cloud economics?

Below are a few common business pitfalls that leaders can easily address if they embrace the practice of cloud economics:

Decentralized Costs and Budgets

Knowing budgets may seem obvious, but more often than not, leaders don’t even know what they are spending on the cloud. This is usually due to siloed department budgets and a lack of disclosure. Cloud economics requires leaders to create visibility into their cloud spend and open channels of communication about allocation, budgeting, and forecasting.

Lack of Planning and Unanticipated Usage 

If organizations don’t plan, then they will end up over-utilizing the cloud. Failing to forecast or proactively budget cloud resources will lead to using too many unnecessary and/or unused resources. With cloud economics, leaders are responsible for strategies, systems, and internal communications to connect cloud costs with business goals. 

Non-Committal Mindset 

This issue is a culmination of other problems. If business leaders are unsure of what they are doing in the cloud, they are less willing to commit to long-term cloud contracts. Unwillingness to commit to contracts is a missed opportunity for business leaders because long-term engagements are more cost-friendly. Once leaders have implemented cloud economics to inspire confidence in their cloud infrastructure, they can assertively evaluate purchasing options in the most cost-effective way.

What are the steps to creating a culture around cloud economics?

Cloud economics is a study that goes beyond calculating and cutting costs. It is a company culture that is a cross-functional effort. Though it seems like a significant undertaking, the steps to get started are quite manageable. Below is a high-level plan that business leaders must take charge of to create a culture around prioritizing cloud economics:

#1. Inform

Stage one consists of collecting data and understanding the current cloud situation. Company leaders need to know the true costs of the cloud before they can proceed. Creating visibility into the current state is also the first step toward building a culture of communication and transparency among teams and stakeholders.

#2. Optimize

Once the baseline is understood, leadership can analyze the data in order to optimize cloud costs. The visibility of the current state is crucial for teams and leadership to understand what they are working with and how they can optimize it. This stage is where a lot of conversations happen amongst teams to come up with an optimization action plan. It requires teams and stakeholders to communicate and work together, which ultimately builds trust among each other.

#3. Operate

Finally, the data analysis and learnings can be implemented. With the optimization action plan, leaders should know what areas of the cloud demand optimization first and how to optimize these areas. At this point in the process, teams and stakeholders are comfortable with cross-collaboration and honest communications amongst each other. This opens up a transparent feedback loop that is necessary for continuous improvement. 

Conclusion

The entire organization stands to gain when cloud economics is prioritized. A cost-efficient cloud infrastructure will lead to improved productivity, cross-functional collaboration between teams, and focused efforts towards greater business objectives. 

When it comes to cloud economics and optimization, 2nd Watch is the go-to partner for enterprise-level services and support. Our team of experts and cloud-accredited professionals help businesses plan, analyze, and recommend strategies to create a culture of cloud economics and accountability. Control cloud costs and maximize the business value of your cloud today by contacting a 2nd Watch cloud expert.

Mary Fellows | Director of Cloud Economics at 2nd Watch
