
Cloud Crunch Podcast: 5 Strategies to Maximize Your Cloud’s Value – Create Competitive Advantage from your Data

AWS Data Expert, Saunak Chandra, joins today’s episode to break down the first of five strategies used to maximize your cloud’s value – creating competitive advantage from your data. We look at topics including Amazon Redshift, the RA3 node type, best practices for performance, data warehouses, and varying data structures. Listen now on Spotify, iTunes, iHeart Radio, Stitcher, or wherever you get your podcasts.

We’d love to hear from you! Email us at CloudCrunch@2ndwatch.com with comments, questions and ideas.


Google Cloud, Open-Source and Enterprise Solutions

In 2020, a year where enterprises had to rethink their business models to stay alive, Google Cloud was able to grow 47% and capture market share. If you are not already looking at Google Cloud as part of your cloud strategy, you probably should.

Google has made conscious choices about not locking in customers with proprietary technology. Open-source technology has, for many years, been a core focus for Google, and many of Google Cloud’s solutions can integrate easily with other cloud providers.

Kubernetes (GKE), Knative (Cloud Run), TensorFlow (Machine Learning), and Apache Beam (Data Pipelines) are some examples of cloud-agnostic tools that Google has open-sourced and which can be deployed to other clouds, as well as on-premises, if you ever have a reason to do so.

Specifically, some of Google Cloud’s services and its go-to-market strategy set Google Cloud apart. Modern and scalable solutions like BigQuery, Looker, and Anthos fall into this category. They are best-in-class tools for their use cases, and if you are serious about your digital transformation efforts, you should evaluate their capabilities and understand what they can do for your business.

Three critical challenges we see repeatedly from our enterprise clients here at 2nd Watch are:

  1. How to get started with public cloud
  2. How to better leverage their data
  3. How to take advantage of multiple clouds

Let’s dive into each of these.

Foundation

Ask any architect if they would build a house without a foundation, and they would undisputedly tell you “No.” Unfortunately, many companies new to the cloud do precisely that. The most crucial step in preparing an enterprise to adopt a new cloud platform is to set up the foundation.

Future standards are dictated in the foundation, so building it incorrectly will cause unnecessary pain and suffering for your valuable engineering resources. A proper foundation includes a project structure aligned with your project lifecycle and environments, and a CI/CD pipeline that pushes infrastructure changes through code. It will enable your teams to become more agile while managing infrastructure in a modern way.

A foundation’s essential blocks include project structure, network segmentation, security, IAM, and logging. For logging, Google offers Cloud Operations, a multi-cloud tool for log management, reporting, and alerting; alternatively, you can ingest logs into your existing tools. From the Google Cloud Marketplace, you can also deploy the brand of firewall you’re most familiar and comfortable with. Depending on your existing tools and industry regulations, compliance best practices might vary slightly, guiding you in one direction or another.

DataOps

Google has, since its inception, been an analytics powerhouse. The amount of data moving through Google’s global fiber network at any given time is incredible. Why does this matter to you? Google has now made some of its internal tools that manage large amounts of data available to you, enabling you to better leverage your data. BigQuery is one of these tools.

Because BigQuery is serverless, you can get started on a budget, and it can scale to petabytes of data without breaking a sweat. If you have managed data warehouses, you know that scaling them and keeping them performant is no easy task. With BigQuery, it is easy.
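Part of what makes BigQuery budget-friendly is that its on-demand model bills by bytes scanned rather than by provisioned servers, so capacity planning reduces to estimating scan volume. A minimal sketch of that math, with the rate treated as an assumption (roughly $5 per TiB historically for US on-demand pricing; verify current pricing for your region):

```python
# Rough BigQuery on-demand cost estimator.
# ASSUMPTION: $5.00 per TiB scanned (historical US on-demand rate;
# check current regional pricing before relying on this number).

PRICE_PER_TIB_USD = 5.00

def estimate_query_cost(bytes_scanned: int) -> float:
    """Estimate on-demand cost in USD for a query scanning `bytes_scanned`."""
    tib = bytes_scanned / 2**40  # bytes -> TiB
    return tib * PRICE_PER_TIB_USD

# A query scanning 2 TiB of a petabyte-scale table:
cost = estimate_query_cost(2 * 2**40)
print(f"${cost:.2f}")  # $10.00
```

The practical takeaway is that techniques like partitioning and clustering, which shrink bytes scanned, translate directly into lower bills.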

Another valuable tool, Looker, makes visualizing your data easy. It enables departments to share a single source of truth, which breaks down data silos and enables collaboration between departments with dashboards and views for data science and business analysis.

Hybrid Cloud Solutions

Google Cloud offers several services for multi-cloud capabilities, but let’s focus on Anthos here. Anthos provides a way to run Kubernetes clusters on Google Cloud, AWS, Azure, on-premises, or even on the edge while maintaining a single pane of glass for deploying and managing your containerized applications.

With Anthos, you can deploy applications virtually anywhere and serve your users from the cloud datacenter nearest them, across all providers, or run apps at the edge – like at local franchise restaurants or oil drilling rigs – all with the familiar interfaces and APIs your development and operations teams know and love from Kubernetes.

BigQuery Omni, currently in preview, will soon be released to the public. It lets you extend the capabilities of BigQuery to the other major cloud providers. Behind the scenes, BigQuery Omni runs on top of Anthos, and Google takes care of scaling and running the clusters, so you only have to worry about writing queries and analyzing data, regardless of where your data lives. For enterprises that have already adopted BigQuery, this can mean significant savings on data transfer charges between clouds, as your queries run where your data lives.

Google Cloud offers some unmatched open-source technology and solutions for enterprises you can leverage to gain competitive advantages. 2nd Watch has helped organizations overcome business challenges and meet objectives with similar technology, implementations, and strategies on all major cloud providers, and we would be happy to assist you in getting to the next level on Google Cloud.

2nd Watch is here to serve as your trusted cloud data and analytics advisor. When you’re ready to take the next step with your data, contact us.

Learn more

Webinar: 6 Essential Tactics for your Data & Analytics Strategy

Webinar: Building an ML foundation for Google BigQuery ML & Looker

-Aleksander Hansson, 2nd Watch Google Cloud Specialist


Are You Ready to Migrate Your Data to the Cloud? Answer These 4 Questions to Find Out

Many companies are already storing their data in the cloud, and even more are considering making the migration. The cloud offers unique benefits for data access and consolidation, but some businesses choose to keep their data on-prem for various reasons. Data migration isn’t a one-size-fits-all formula, so when developing your data strategy, think about your long-term needs and goals for optimal results.

We recommend evaluating these 4 questions before making the decision to migrate your data to the cloud:

1. Why do you want to move your data?

Typically, there are two reasons businesses find themselves wanting to change their IT infrastructure: either your legacy platform is reaching end of life (EOL) and you’re forced to make a change, or it’s time to modernize. If you’re faced with the latter – your business has outgrown its existing platform – it’s a good indication that migrating to the cloud is right for you. The benefits of cloud-based storage can drastically improve your business agility.

2. What is important to you?

You need to know why you’re choosing the platform you are deploying and how it’s going to support your business goals better than other options. Three central arguments for cloud storage – that are industry and business agnostic – include:

  • Agility: If you need to move quickly (and what business doesn’t?), the cloud is for you. It’s easy to start, and you can spin up a cloud environment and have a solution deployed within minutes or hours. There’s no capital expense, no server deployment, and no need for an IT implementation team.
  • Pay as you go: If you like starting small, testing things before you go all in, and only paying for what you use, the cloud is for you. It’s a very attractive feature for businesses hesitant to move all their data at once. You get the freedom and flexibility to try it out, with minimal financial risk. If it’s not a good fit for your business, you’ve learned some things, and can use the experience going forward. But chances are, the benefits you’ll find once utilizing cloud features will more than prove their value.
  • Innovation: If you want to ride the technology wave, the cloud is for you. Companies release new software and features to improve the cloud every day, and there are no long release cycles. Modernized technologies and applications are available as soon as they’re released to advance your business capabilities based on your data.

3. What is your baseline?

The more you can plan for potential challenges in advance, the better. As you consider data migration to the cloud, think about what your data looks like today. If you have an on-prem solution, like a data warehouse, lift and shift is an attractive migration plan because it’s fairly easy.

Many businesses have a collection of application databases and haven’t yet consolidated their data. They need to pull the data out, stage it, and store it without interfering with the applications. The main cloud providers offer similar options for getting your data into a place where it can be used: AWS offers S3, Google Cloud has Cloud Storage, and Azure provides Blob Storage. Later, you can pull the data into a data warehousing solution like Amazon Redshift, Google BigQuery, Azure Synapse, or Snowflake.
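The stage-then-load pattern is the same across providers: land files in object storage, then issue a bulk-load command from the warehouse. As one illustration, here is a small helper that builds an Amazon Redshift `COPY` statement for staged CSV files; the bucket, table, and IAM role names are hypothetical:

```python
def build_redshift_copy(table: str, s3_path: str, iam_role_arn: str) -> str:
    """Build a Redshift COPY statement that bulk-loads staged CSV files
    from S3 into a warehouse table."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role_arn}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

# Hypothetical names, for illustration only:
sql = build_redshift_copy(
    table="staging.orders",
    s3_path="s3://example-staging-bucket/orders/",
    iam_role_arn="arn:aws:iam::123456789012:role/RedshiftLoadRole",
)
print(sql)
```

Google BigQuery and Azure Synapse have analogous bulk-load mechanisms from Cloud Storage and Blob Storage, so the staging area you choose rarely locks you out of a warehouse choice later.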

4. How do you plan to use your data?

Always start with a business case and think strategically about how you’ll use your data. The technology should fit the business, not the other way around. Once you’ve determined that, garner the support and buy-in of sponsors and stakeholders to champion the proof of concept. Bring IT and business objectives together by defining the requirements and the success criteria. How do you know when the project is successful? How will the data prove its value in the cloud?

As you move forward with implementation, start small, establish a reasonable timeline, and take a conservative approach. Success is crucial for ongoing replication and investment. Once everyone agrees the project has met the success criteria, celebrate loudly! Demonstrate the new capabilities, and highlight overall business benefits and impact, to build and continue momentum.

Be aware of your limitations

When entering anything unknown, remember that you don’t know what you don’t know. You may have heard things about the cloud or on-prem environments anecdotally, but making the decision of when and how to migrate data is too important to do without a trusted partner. You risk missing out on big opportunities, or worse, wasting time, money, and resources without gaining any value.

2nd Watch is here to serve as your trusted cloud advisor, so when you’re ready to take the next step with your data, contact us.

Learn more about 2nd Watch Data and Analytics services

-Sam Tawfik, Sr Product Marketing Manager, Data & Analytics


Migrating Data to Snowflake – An Overview

When considering migrating your data to the cloud, everyone’s familiar with the three major cloud providers – AWS, Google Cloud, and Microsoft Azure. But there are a few other players you should also take note of. Snowflake is a leading cloud data platform that offers exceptional design, scalability, simplicity, and return on investment (ROI).

What is Snowflake?

The Snowflake cloud data platform was born in the cloud for data warehousing. It’s built entirely to maximize cloud usage and designed for almost unlimited scalability. Users like the simplicity, and businesses gain significant ROI from the wide range of use cases Snowflake supports.

Out of the box, Snowflake is easy to interact with through its web interface. Without having to download any applications, users can connect with Snowflake and create additional user accounts for a fast and streamlined process. Additionally, Snowflake performs as a data platform, rather than just a data warehouse. Data ingestion is cloud native and existing tools enable effortless data migration.

Business Drivers

The decision to migrate data to a new cloud environment, or data warehousing solution, needs to be based on clearly defined value. Why are you making the transition? What’s your motivation? Maybe you need to scale up, or there’s some sort of division or business requirement for migration. Oftentimes, companies have a particular implementation that needs to change, or they have specific needs that aren’t being met by their current data environment.

Take one of our clients, for instance. When the client’s company was acquired, they came to utilize a data warehouse shared by all the companies the acquiring company owned. When the client was eventually sold, they needed their own implementation and strategy for migrating data into the cloud. Together, we took the opportunity to evaluate some of the newer data platform tools, like Snowflake, for their specific business case and to migrate quickly to an independent data platform.

With Snowflake, set up was minimal and supported our client’s need for a large number of database users. Migrating from the shared data warehouse to Snowflake was relatively easy, and it gave all users access through a simple web interface. Snowflake also provided more support for unstructured data usage, which simplified querying things like JSON or nested data.

Implementation

Migrating data to Snowflake is generally a smooth transition because Snowflake accepts data from your existing platform. For instance, if your data is stored in Amazon S3, Google Cloud Storage, or Azure, you can create Snowflake environments in each and then ingest the data using SQL commands and configuration. Not only can you run all the same queries with minor tweaks and get the same output, but Snowflake also fits additional needs and requirements. If you’ve worked with SQL in any manner – on an application database or in data warehousing – training is minimal.

Another advantage with Snowflake is its ability to scale either horizontally or vertically to pull in any amount of data. And since it is cloud native, Snowflake has embraced the movement toward ‘pay as you go’ – in fact, that’s their entire structure. You only pay for the ingestion time and when the data warehouse is running. After that, it shuts off, and so does your payment. Cost-effective implementation lets you experiment, compare, test, and iterate on the best way to migrate each piece of your data lifecycle.

Long Term Results

Snowflake has yielded successful data migrations with users because of its ease of use and absence of complications. Users also see performance improvements because they’re able to get their data faster than ever and they can grow with Snowflake, bringing in new and additional data sources and tools, taking advantage of artificial intelligence and machine learning, increasing automation, and experimenting and iterating.

From a security and governance perspective, Snowflake is strong. Snowflake enforces a multi-layer security structure, including user management. You can grant access to certain groups, organize them accordingly, integrate with your Active Directory, and have it run with those permissions. You assign an administrator to regulate access to tables in specified areas. Snowflake also lets you choose your desired security level during implementation: you have the option of enterprise-level features, HIPAA compliance, or maximum security, at a higher rate per second.
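The group-based access model described above maps directly onto role-based SQL grants. A sketch of the kinds of statements involved (role, schema, and user names are hypothetical, and the statements are shown as generated strings rather than executed against a live account):

```python
def rbac_grant_statements(role: str, schema: str, user: str) -> list:
    """Generate Snowflake-style SQL that creates a role, grants it read
    access to a schema, and assigns the role to a user."""
    return [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"GRANT USAGE ON SCHEMA {schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO ROLE {role};",
        f"GRANT ROLE {role} TO USER {user};",
    ]

# Hypothetical names, for illustration:
for stmt in rbac_grant_statements("analyst", "analytics.reporting", "jdoe"):
    print(stmt)
```

Because grants attach to roles rather than individuals, the administrator regulates access by adjusting a handful of roles instead of touching every user account.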

Do you want to explore data migration opportunities? Make the most of your data by partnering with trusted experts. We’re here to help you migrate, store, and utilize data to grow your business and streamline operations. If you’re ready to take the next step in your data journey, contact us.

Learn more about 2nd Watch Data and Analytics services

-Sam Tawfik, Sr Product Marketing Manager, Data & Analytics


3 Ways McDonald’s France is Preparing their Data for the Future

Data access is one of the biggest influences on business intelligence, innovation, and strategy to come out of digital modernization. Now that so much data is available, the competitive edge for any business is derived from understanding and applying it meaningfully. McDonald’s France is gaining business-changing insights after migrating to a data lake, but it’s not just fast food that can benefit. Regardless of your industry, gaining visibility into and governance around your data is the first step for what’s next.

1. No More Manual Legacy Tools

Businesses continuing to rely on spreadsheets and legacy tools that require manual processes are putting in a lot more than they’re getting out. Not only are these outdated methods slow, tedious, subject to human error, and expensive in both time and resources – there’s also a high probability the information is incomplete or inaccurate. Data-based decision making is powerful; however, without a data platform, a strong strategy, automation, and governance, you can’t easily or confidently implement takeaways.

Business analysts at McDonald’s France historically relied on Excel-based modeling to understand their data. Since partnering with 2nd Watch, they’ve been able to take advantage of big data analytics by leveraging a data lake and data platform. Architected from data strategy and ingestion, to management and pipeline integration, the platform provides business intelligence, data science, and self-service analytics. Now, McDonald’s France can rely on their data with certainty.

2. Granular Insights Become Opportunities for Smart Optimization

Once intuitive solutions for understanding your data are implemented, you gain granular visibility into your business. Since completing the transition from a data warehouse to a data lake, McDonald’s France has new means to integrate and analyze data at the transaction level. Aggregate information from locations worldwide provides McDonald’s with actionable takeaways.

For instance, after establishing the McDonald’s France data lake, one of the organization’s initial projects focused on speed of service and order fulfilment. Speed of service encompasses both food preparation time and time spent talking to customers in restaurants, drive-thrus, and on the online application. Order fulfilment is the time it takes to serve a customer – from when the order is placed to when it’s delivered. With transaction-level purchase data available, business analysts can deliver specific insights into each contributing factor of both processes. Maybe prep time is taking too long because restaurants need updated equipment, or the online app is confusing and user experience needs improvement. Perhaps the menu isn’t displayed intuitively and it’s adding unnecessary time to speed of service.
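With transaction-level data in a data lake, insights like these reduce to straightforward aggregations. A toy illustration with invented sample records (the channels and timings are made up), computing average order-fulfilment time per sales channel:

```python
from collections import defaultdict

# Invented sample transactions: (sales_channel, fulfilment_seconds)
transactions = [
    ("drive_thru", 180), ("drive_thru", 240),
    ("in_store",   300), ("in_store",   360),
    ("app",        120), ("app",        150),
]

def average_fulfilment_by_channel(rows):
    """Average order-fulfilment time (seconds) per sales channel."""
    totals = defaultdict(lambda: [0, 0])  # channel -> [sum, count]
    for channel, seconds in rows:
        totals[channel][0] += seconds
        totals[channel][1] += 1
    return {ch: s / n for ch, (s, n) in totals.items()}

print(average_fulfilment_by_channel(transactions))
# {'drive_thru': 210.0, 'in_store': 330.0, 'app': 135.0}
```

In practice the same aggregation would run at warehouse or data-lake scale over millions of rows, but the shape of the question – which channel is slow, and by how much – stays this simple.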

Multiple optimization points provide more opportunity to test improvements, scale successes, apply widespread change, fail fast, and move ahead quickly and cost-effectively. Organizations that make use of data modernization can evolve with agility to changing customer behaviors, preferences, and trends. Understanding these elements empowers businesses to deliver a positive overall experience throughout their customer journey – thereby impacting brand loyalty and overall profit potential.

3. Machine Learning, Artificial Intelligence, and Data Science

Clean data is absolutely essential for utilizing machine learning (ML), artificial intelligence (AI), and data science to conserve resources, lower costs, enable customers and users, and increase profits. Leveraging data for computers to make human-like decisions is no longer a thing of the future, but of the present. In fact, 78% of companies have already deployed ML, and 90% of them have made more money as a result.

McDonald’s France identifies opportunity as the most important outcome of migrating to a data lake and strategizing on a data platform. Now that a wealth of data is not only accessible, but organized and informative, McDonald’s looks forward to ML implementation in the foreseeable future. Unobstructed data visibility allows organizations in any industry to predict the next best product, execute on new best practices ahead of the competition, tailor customer experience, speed up services and returns, and on, and on. We may not know the boundaries of AI, but the possibilities are growing exponentially.

Now it’s Time to Start Preparing Your Data

Organizations worldwide are revolutionizing their customer experience based on data they already collect. Now is the time to look at your data and use it to reach new goals. 2nd Watch Data and Analytics Services uses a five-step process to build a modern data management platform with strategy to ingest all your business data and manage the data in the best fit database. Contact Us to take the next step in preparing your data for the future.

-Ian Willoughby, Chief Architect and Vice President

Listen to the McDonald’s team talk about this project on the 2nd Watch Cloud Crunch podcast.


McDonald’s France Gains Business-Changing Insights from New Data Lake

McDonald’s is famous for cheeseburgers and fries, but with 1.5 million customers a day, and each transaction producing 20 to 30 data points, it has also become a technology organization. With the overarching goal to improve customer experience, and as a byproduct increase conversion and brand loyalty, McDonald’s France partnered with 2nd Watch to build a data lake on AWS.

Customer Priorities Require Industry Shifts

As is common in many industries today, the fast-food industry has shifted from a transaction-centric view to a customer-centric view. The emphasis is no longer on customer satisfaction, but on customer experience. It’s this variable that impacts conversion rate and instills loyalty. Consequently, McDonald’s wanted to build a complete perspective of a customer’s lifetime value, with visibility into each step of their journey. Understanding likes and dislikes based on data would give McDonald’s the opportunity to improve experience at a variety of intersections across global locations.

McDonald’s is a behemoth in its size, multi-national reach, and the abundance of data it collects. Making sense of that data required a new way of storing and manipulating it, with flexibility and scalability. The technology necessary to accomplish McDonald’s data goals has significantly reduced in cost, while increasing in efficiency – key catalysts for initiating the project within McDonald’s groups, gaining buy-in from key stakeholders, and engaging quickly.

From Datacenter to Data Lake

To meet its data collection and analysis needs, McDonald’s France needed a fault-tolerant data platform equipped with data processing architecture and a loosely coupled distribution system. But, the McDonald’s team needed to focus on data insights rather than data infrastructure, so they partnered with 2nd Watch to move from a traditional data warehouse to a data lake, allowing them to reduce the effort required to analyze or process data sets for different properties and applications.

During the process, McDonald’s emphasized the importance of ongoing data collection from anywhere and everywhere across their many data sources. From revenue numbers and operational statistics to social media streams, kitchen management systems, commercial, regional, and structural data – they wanted everything stored for potential future use. Historical data will help to establish benchmarks, forecast sales projections, and understand customer behavior over time.

The Data Challenges We Expect…And More

With so much data available, and the goal of improving customer experience as motivation, McDonald’s France wanted to prioritize three types of data – sales, speed of service, and customer experience. Targeting specific sets of data helps to reduce the data inconsistencies every organization faces in a data project. While collecting, aggregating, and cleaning data is a huge feat on its own, McDonald’s France also had to navigate a high level of complexity.

As an omnichannel restaurant, McDonald’s juggles information from point-of-sale systems with sales happening online, offline, and across dozens of different locations. Data sources include multiple data vendors, mobile apps, loyalty programs, customer relationship management (CRM) tools, and other digital interfaces. Combined in one digital ecosystem, this data is the force that drives the entire customer journey. Once it’s all there, the challenge is to find the link for any given customer that transforms the puzzle into a holistic picture.

Endless Opportunities for the Future

McDonald’s France now has visibility into speed of service with a dedicated dashboard and can analyze and synthesize that data. National teams can make accurate, data-based decisions using the dashboard and implement logistical changes in operations. They’re able to impact operational efficiency by using knowledge of prep time to influence fulfilment.

The data lake was successful in showing the organization where it was losing opportunities by not taking advantage of the data it had. McDonald’s also proved it was possible, affordable, and advantageous to invest in data. While their data journey has only begun, these initial steps opened the door to new data usage possibilities. The models established by McDonald’s France will be used as an example to expand data investments throughout the McDonald’s corporation.

If your organization is facing a similar issue of too much data and not enough insight, 2nd Watch can help. Our data and analytics solutions help businesses make better decisions, faster, with a modern data stack in the cloud. Contact us to start talking about the tools and strategies necessary to reach your goals.

-Ian Willoughby, Chief Architect and Vice President

Listen to the McDonald’s team talk about this project on the 2nd Watch Cloud Crunch podcast.
