3 Types of Employees That Can Use AI Offerings on Google Cloud

The Google Cloud Platform (GCP) comes with a number of services, databases, and tools to operationalize company-wide data management and analytics. With the insights and accessibility provided, you can leverage data into artificial intelligence (AI) and machine learning (ML) projects cost-efficiently. GCP empowers employees to apply their ideas and experience into data-based solutions and innovation for business growth. Here’s how.

Google Cloud’s AI and ML Offerings for Every Type of Employee

1. Developers without Data Science Experience

With GCP, developers can combine their software engineering experience with AI capabilities to produce powerful results. Using Google’s pre-trained product APIs, developers can incorporate ML into a product without ever having to build or train a model.

Take training videos, for example. Your company has thousands of training videos varying in length and subject, from full-day BigQuery trainings to minutes-long security trainings. How do you operationalize all that information so employees can quickly find exactly what they want?

Using Google’s Cloud Video Intelligence API, a developer can not only transcribe every video word-for-word, but also capture the start and end time of every word in every video. The developer builds a search index on top of the API, and just like that, users can search for specific content across thousands of videos. Results display both the relevant videos and the timestamps within them where the keyword appears. Now employees can immediately find the topic they want to learn more about without sifting through hours of unrelated information.
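
As a sketch of how that works under the hood, the following Python example uses the google-cloud-videointelligence client library to request a transcription with word-level timestamps. The bucket and file names are placeholders.

```python
# Minimal sketch: transcribe a training video with word-level timestamps.
# Assumes the google-cloud-videointelligence library and default credentials;
# the gs:// URI is a placeholder.
from google.cloud import videointelligence

client = videointelligence.VideoIntelligenceServiceClient()

config = videointelligence.SpeechTranscriptionConfig(
    language_code="en-US",
    enable_automatic_punctuation=True,
)
operation = client.annotate_video(
    request={
        "input_uri": "gs://your-training-videos/bigquery-101.mp4",  # placeholder
        "features": [videointelligence.Feature.SPEECH_TRANSCRIPTION],
        "video_context": videointelligence.VideoContext(
            speech_transcription_config=config
        ),
    }
)
result = operation.result(timeout=600)  # long videos take a while to process

# Every word comes back with start/end offsets -- the raw material for a search index.
for transcription in result.annotation_results[0].speech_transcriptions:
    alternative = transcription.alternatives[0]
    print(alternative.transcript)
    for word_info in alternative.words:
        start = word_info.start_time.total_seconds()  # timedelta in recent clients
        end = word_info.end_time.total_seconds()
        print(f"  {word_info.word}: {start:.1f}s -> {end:.1f}s")
```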

Additional APIs include Cloud Natural Language, Speech-to-Text, Text-to-Speech, Cloud Data Loss Prevention, and many others.

2. Non-Developers without Data Science Experience

Cloud AutoML enables your less technical employees to harness the power of machine learning. It bridges the gap between calling a pre-trained API and building your own ML model from scratch. Using AutoML, anyone can create custom models tailored to your business needs and then integrate those models into applications and websites.

For this example, let’s say you’re a global organization that needs to translate communications across dialects and business domains. The intricacies and complexities of natural language normally require expensive linguists and specialist translators with domain-specific expertise. How do you communicate in real time effectively, respectfully, and cost-efficiently?

With AutoML Translation, almost anyone can create translation models that return results specific to your domain, across 50 different language pairs. Through a graphical interface, it ingests your data from a Sheet or CSV file. The only input required is pairs of sentences that mean the same thing in the source language and the target language. Google then layers that domain-specific training on top of its generic translation capability, helping the model choose the right translation for specialized, niche vocabularies. Within an hour, the model translates based on your domain, taxonomy, and the data you provided.
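
For a sense of what that looks like programmatically (the same steps are available in the console UI), here is a minimal sketch using the google-cloud-automl Python client. The project ID, language pair, and Cloud Storage path are placeholders.

```python
# Minimal sketch: create an AutoML Translation dataset and import sentence
# pairs. Assumes the google-cloud-automl library; all names are placeholders.
from google.cloud import automl

project_id = "your-project-id"  # placeholder
client = automl.AutoMlClient()
parent = f"projects/{project_id}/locations/us-central1"

# The dataset metadata fixes the language pair; the training data is simply
# pairs of equivalent sentences in the source and target languages.
dataset = automl.Dataset(
    display_name="my_domain_en_fr",
    translation_dataset_metadata=automl.TranslationDatasetMetadata(
        source_language_code="en",
        target_language_code="fr",
    ),
)
created = client.create_dataset(parent=parent, dataset=dataset).result()

# Import sentence pairs from a TSV/CSV exported from your Sheet (placeholder URI).
input_config = automl.InputConfig(
    gcs_source=automl.GcsSource(input_uris=["gs://your-bucket/sentence_pairs.tsv"])
)
client.import_data(name=created.name, input_config=input_config).result()
```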

Beyond translation, Cloud AutoML is available for sight (AutoML Vision and Video Intelligence), structured data (AutoML Tables), and additional language capabilities (AutoML Natural Language).

3. Data Scientists

Data scientists have the experience and data knowledge to take full advantage of GCP AI tools for ML. One issue data scientists often confront is notebook functionality and accessibility. Whether it’s TensorFlow, PyTorch, or JupyterLab, these open source tools typically require more resources than a local computer can provide, and they don’t connect easily to BigQuery.

Google AI Platform Notebooks is a managed service that provides a pre-configured environment supporting these popular data science libraries. From a security standpoint, AI Platform Notebooks is attractive to enterprises for the added security of the cloud; relying on a local device, you run the risk of human error, theft, and accidents that destroy the machine. Equipped with a hosted, integrated, secure, and protected JupyterLab environment, data scientists can do the following (a brief example follows the list):

  • Virtualize in the cloud
  • Connect to GCP tools and services, including BigQuery
  • Develop new models
  • Access existing models
  • Customize instances
  • Use Git / GitHub
  • Add CPUs, RAM, and GPUs to scale
  • Deploy models into production
  • Back up machines
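
As a minimal sketch of the BigQuery item above: inside a Notebooks instance, the BigQuery client authenticates with the instance’s service account automatically, so pulling query results into a pandas DataFrame takes only a few lines. The public dataset here is just an example.

```python
# Minimal sketch: query BigQuery from an AI Platform Notebooks instance.
# The client picks up the instance's credentials automatically; pandas and
# pyarrow ship with the standard Notebooks images.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""
df = client.query(sql).to_dataframe()  # results land in a DataFrame for modeling
print(df.head())
```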

With a seamless experience from data to a deployed ML model, data scientists are empowered to work faster, smarter, and safer. Contact Us to further your organization’s ability to maximize data, AI, and ML.

Sam Tawfik, Sr Product Marketing Manager


Maximizing Cloud Data with Google Cloud Platform Services

If you’re trying to run your business smarter, not harder, using data to inform decision making gives you a competitive advantage. Cloud data offerings put that data to work in the cloud, and the Google Cloud Platform (GCP) is full of options. Whether you’re migrating data, upgrading to enterprise-class databases, or transforming the customer experience on cloud-native databases – Google Cloud services can fit your needs.

Highlighting some of what Google has to offer

With so many data offerings from GCP, it’s nearly impossible to summarize them all. Some are open source projects being distributed by other vendors, while others were organically created by Google to service their own needs before being externalized to customers. A few of the most popular and widely used include the following.

  • BigQuery: Core to GCP, this serverless, scalable, multi-cloud data warehouse enables business agility – from data manipulation and transformation to serving as the engine for AI, machine learning (ML), and forecasting.
  • Cloud SQL: Traditional relational database in the cloud that reduces maintenance costs with fully managed services for MySQL, PostgreSQL, and SQL Server.
  • Spanner: Another fully managed relational database, offering virtually unlimited scale, strong consistency, and near-100% availability – ideal for workloads like supply chain and inventory management that span regions.
  • Bigtable: Low-latency, fully managed NoSQL database for ML, forecasting, and other analytical and operational workloads over very large amounts of data.
  • Data Fusion: Fully managed, cloud-native data integration tool that enables you to move different data sources to different targets – includes over 150 preconfigured connectors and transformers.
  • Firestore: From the Firebase world comes the next generation of Datastore. This cloud-native, NoSQL, document database lets you develop custom apps that directly connect to the database in real-time.
  • Cloud Storage: Object-based storage that can even act like a database when paired with BigQuery – you can query objects in storage using standard SQL (see the sketch after this list).
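
As a minimal sketch of that last point, a temporary external table lets a BigQuery query read CSV objects in a bucket in place. The bucket path below is a placeholder, and the schema is auto-detected.

```python
# Minimal sketch: query CSV objects in Cloud Storage directly from BigQuery
# via a temporary external table. The gs:// path is a placeholder.
from google.cloud import bigquery

client = bigquery.Client()

external_config = bigquery.ExternalConfig("CSV")
external_config.source_uris = ["gs://your-bucket/sales/*.csv"]  # placeholder
external_config.autodetect = True  # infer the schema from the files

job_config = bigquery.QueryJobConfig(table_definitions={"sales": external_config})
query = "SELECT COUNT(*) AS rows_in_storage FROM sales"

for row in client.query(query, job_config=job_config).result():
    print(row.rows_in_storage)
```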

Why BigQuery?

After more than 10 years of development, BigQuery has become a foundational data management tool for thousands of businesses. With a large ecosystem of integration partners and a powerful engine that shards queries across petabytes of data and delivers a response in seconds, there are many reasons BigQuery has stood the test of time. It’s more than just super speed, data availability, and insights.

Standard SQL language
If you know SQL, you know BigQuery. As a fully managed platform, it’s easy to learn and use. Simply populate the data and that’s it! You can also bring in large public datasets to experiment and further learn within the platform.
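
For instance, here is a minimal sketch that runs standard SQL against one of those public datasets, assuming the google-cloud-bigquery client library and default credentials.

```python
# Minimal sketch: standard SQL against a BigQuery public sample dataset.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT word, SUM(word_count) AS uses
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY uses DESC
    LIMIT 5
"""
for row in client.query(query).result():
    print(f"{row.word}: {row.uses}")
```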

Front-end data
If you don’t have Looker, Tableau, or another business intelligence (BI) tool to visualize dashboards on top of BigQuery, you can use the software development kit (SDK) for web-based front-end data display. For example, government health agencies can show the public real-time COVID-19 case numbers as they’re reported. The BigQuery ecosystem is broad enough to serve as the source of truth for your reports, dashboards, and external data representations.

Analogous across offerings

Coming from on-prem, you may be pulling data into multiple platforms – BigQuery being one of them. GCP offerings share a similar interface and easy navigation, so functionality, user experience, and even endpoint verbs are consistent across services. Easily manage different types of data based on the platforms and tools that deliver the most value.

BigQuery Omni

One of the latest GCP services extends BigQuery’s familiar API and console to data living in other clouds. With BigQuery Omni, you can connect to and combine data from outside GCP using standard SQL, without having to learn a new language.

Ready for the next step in your cloud journey?

As a Google Cloud Partner, 2nd Watch is here to be your trusted cloud advisor throughout your cloud data journey, empowering you to fuel business growth while reducing cloud complexity. Whether you’re embracing cloud data for the first time or finding new opportunities and solutions with AI, ML, and data science, our team of data scientists can help. Contact Us for a targeted consultation and explore our full suite of advanced capabilities.

Learn more

Webinar: 6 Essential Tactics for your Data & Analytics Strategy

Webinar: Building an ML foundation for Google BigQuery ML & Looker

-Sam Tawfik, Sr Product Marketing Manager


3 Ways McDonald’s France is Preparing their Data for the Future

Data access is one of the biggest influences on business intelligence, innovation, and strategy to come out of digital modernization. Now that so much data is available, the competitive edge for any business is derived from understanding and applying it meaningfully. McDonald’s France is gaining business-changing insights after migrating to a data lake, but it’s not just fast food that can benefit. Regardless of your industry, gaining visibility into and governance around your data is the first step for what’s next.

1. No More Manual Legacy Tools

Businesses that continue to rely on spreadsheets and legacy tools requiring manual processes are putting in a lot more than they’re getting out. Not only are these outdated methods slow, tedious, subject to human error, and expensive in both time and resources, but there’s a high probability the resulting information is incomplete or inaccurate. Data-based decision making is powerful; however, without a data platform, a strong strategy, automation, and governance, you can’t easily or confidently act on the takeaways.

Business analysts at McDonald’s France historically relied on Excel-based modeling to understand their data. Since partnering with 2nd Watch, they’ve been able to take advantage of big data analytics by leveraging a data lake and data platform. Architected from data strategy and ingestion through management and pipeline integration, the platform provides business intelligence, data science, and self-service analytics. Now, McDonald’s France can rely on its data with certainty.

2. Granular Insights Become Opportunities for Smart Optimization

Once intuitive solutions for understanding your data are implemented, you gain fine-grained visibility into your business. Since completing the transition from data warehouse to data lake, McDonald’s France has new means to integrate and analyze data at the transaction level. Aggregated information from locations worldwide provides McDonald’s with actionable takeaways.

For instance, after establishing the McDonald’s France data lake, one of the organization’s initial projects focused on speed of service and order fulfillment. Speed of service encompasses both food preparation time and time spent talking to customers in restaurants, drive-thrus, and on the online application. Order fulfillment is the time it takes to serve a customer – from when the order is placed to when it’s delivered. With transaction-level purchase data available, business analysts can deliver specific insights into each contributing factor of both processes. Maybe prep time is taking too long because restaurants need updated equipment, or the online app is confusing and the user experience needs improvement. Perhaps the menu isn’t displayed intuitively, adding unnecessary time to speed of service.
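
To make that concrete, here is a purely hypothetical sketch of such a transaction-level query. The dataset, table, and column names are invented for illustration; they are not McDonald’s actual schema.

```python
# Hypothetical sketch: average order-fulfillment time by channel from
# transaction-level data in BigQuery. All names below are illustrative.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT
      channel,  -- e.g. restaurant, drive-thru, app
      AVG(TIMESTAMP_DIFF(delivered_at, placed_at, SECOND)) AS avg_fulfillment_s
    FROM `your-project.analytics.orders`
    GROUP BY channel
    ORDER BY avg_fulfillment_s DESC
"""
for row in client.query(query).result():
    print(f"{row.channel}: {row.avg_fulfillment_s:.0f}s")
```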

Multiple optimization points provide more opportunity to test improvements, scale successes, apply widespread change, fail fast, and move ahead quickly and cost-effectively. Organizations that make use of data modernization can evolve with agility to changing customer behaviors, preferences, and trends. Understanding these elements empowers businesses to deliver a positive overall experience throughout their customer journey – thereby impacting brand loyalty and overall profit potential.

3. Machine Learning, Artificial Intelligence, and Data Science

Clean data is absolutely essential for utilizing machine learning (ML), artificial intelligence (AI), and data science to conserve resources, lower costs, enable customers and users, and increase profits. Leveraging data for computers to make human-like decisions is no longer a thing of the future, but of the present. In fact, 78% of companies have already deployed ML, and 90% of them have made more money as a result.

McDonald’s France identifies opportunity as the most important outcome of migrating to a data lake and strategizing on a data platform. Now that a wealth of data is not only accessible, but organized and informative, McDonald’s looks forward to ML implementation in the foreseeable future. Unobstructed data visibility allows organizations in any industry to predict the next best product, execute on new best practices ahead of the competition, tailor customer experience, speed up services and returns, and on, and on. We may not know the boundaries of AI, but the possibilities are growing exponentially.

Now it’s Time to Start Preparing Your Data

Organizations worldwide are revolutionizing their customer experience based on data they already collect. Now is the time to look at your data and use it to reach new goals. 2nd Watch Data and Analytics Services uses a five-step process to build a modern data management platform, with a strategy to ingest all your business data and manage it in the best-fit database. Contact Us to take the next step in preparing your data for the future.

-Ian Willoughby, Chief Architect and Vice President

Listen to the McDonald’s team talk about this project on the 2nd Watch Cloud Crunch podcast.


Cloud Crunch Podcast: Data, AI & ML on Google Cloud

If you’re trying to run your business smarter, not harder, chances are you’re utilizing data to gain insights into the decision-making process and gain a competitive advantage. In the latest episode of our podcast, we talk with Rui Costa, a data, AI, and ML expert at Google Cloud, about why and when to use cloud data offerings and how to make the most of your data in the cloud. Listen now on Spotify, iTunes, iHeart Radio, Stitcher, or wherever you get your podcasts.

We’d love to hear from you! Email us at CloudCrunch@2ndwatch.com with comments, questions and ideas.


Top Enterprise IT Trends for 2021

Between the global pandemic and the resulting economic upheaval, it’s fair to say many businesses spent 2020 in survival mode. Now, as we turn the page to 2021, we wonder what life will look like in this new normal. Whether it’s employees working from home, the shift from brick and mortar to online sales and delivery, or the need to accelerate digital transformation efforts to remain competitive, 2021 will be a year of reinvention for most companies.

How might the new normal impact your company? Here are five of the top technology trends we predict will drive change in 2021:

1. The pace of cloud migration will accelerate

Most companies by now have started the journey to the public cloud or to a hybrid cloud environment. The events of 2020 added fuel to the fire, creating an urgency to maximize cloud usage within companies that now understand that the speed, resilience, security, and universal access provided by cloud services are vital to the success of the organization.

“By the end of 2021, based on lessons learned in the pandemic, most enterprises will put a mechanism in place to accelerate their shift to cloud-centric digital infrastructure and application services twice as fast as before the pandemic,” says Rick Villars, group vice president, worldwide research at IDC. “Spending on cloud services, the hardware and software underpinning cloud services, and professional and managed services opportunities around cloud services will surpass $1 trillion in 2024.”

The progression for most companies will be to ensure customer-facing applications take priority. In the next phase of cloud migration, back-end functionality embodied in ERP-type applications will move to the cloud. The easiest and fastest way to move applications to the cloud is a simple lift-and-shift, where applications remain essentially unchanged. Companies looking to improve and optimize business processes, though, will most likely refactor, containerize, or completely rewrite applications, turning to “cloud-native” approaches.

2. Artificial intelligence (AI) and machine learning (ML) will deliver business insight

Faced with the need to boost revenue, cut waste, and squeeze out more profits during a period of economic and competitive upheaval, companies will continue turning to AI and machine learning to extract business insight from the vast trove of data most collect routinely, but don’t always take advantage of.

According to a recent PwC survey of more than 1,000 executives, 25% of companies reported widespread adoption of AI in 2020, up from 18% in 2019. Another 54% are moving quickly toward AI. Either they have started implementing limited use cases or they are in the proof-of-concept phase and are looking to scale up. Companies report the deployment of AI is proving to be an effective response to the challenges posed by the pandemic.

Ramping up AI and ML capabilities in-house can be a daunting task, but the major hyperscale cloud providers have platforms that enable companies to perform AI and ML in the cloud. Examples include Amazon’s SageMaker, Microsoft’s Azure AI and Google’s Cloud AI.

3. Edge computing will take on greater importance

For companies that can’t move to the cloud because of regulatory or data security concerns, edge computing is emerging as an attractive option. With edge computing, data processing is performed where the data is generated, which reduces latency and provides actionable intelligence in real time. Common use cases include manufacturing facilities, utilities, transportation, oil and gas, healthcare, retail and hospitality.

The global edge computing market is expected to reach $43.4 billion by 2027, fueled by an annual growth rate of nearly 40%, according to a report from Grand View Research.

The underpinning of edge computing is IoT: the instrumentation of devices (everything from autonomous vehicles to machines on the factory floor to a coffee machine in a fast-food restaurant) and the connectivity between the IoT sensor and the analytics platform. IoT platforms generate a vast amount of real-time data, which must be processed at the edge because it would be too expensive and impractical to transmit all of it to the cloud.

Cloud services providers recognize this reality and are now bringing forth specific managed service offerings for edge computing scenarios, such as Amazon’s new IoT Greengrass service that extends cloud capabilities to local devices, or Microsoft’s Azure IoT Edge.

4. Platform-as-a-Service will take on added urgency

To increase the speed of business, companies are shifting to cloud platforms for application development rather than developing apps on in-house infrastructure. PaaS offers a variety of benefits, including the ability to take advantage of serverless computing, which delivers scalability, flexibility, and quicker time to develop and release new apps. Popular serverless platforms include AWS Lambda and Microsoft’s Azure Functions.
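
To illustrate the serverless model, here is a minimal AWS Lambda-style handler in Python. The event shape is illustrative; the platform handles provisioning, scaling, and per-invocation billing, reducing the app to a function.

```python
# Minimal sketch of a serverless function: an AWS Lambda-style handler.
# The event fields are illustrative; the platform supplies event and context.
import json

def handler(event, context):
    """Entry point the platform invokes for each request or event."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```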

5. IT Automation will increase

Automating processes across the entire organization is a key trend for 2021, with companies prioritizing and allocating money for this effort. Automation can cut costs and increase efficiency in a variety of areas – everything from Robotic Process Automation (RPA) for low-level business processes, to automated security procedures such as anomaly detection and incident response, to automating software development functions with new DevOps tools.

Gartner predicts that, through 2024, enhancements in analytics and automatic remediation capabilities will refocus 30% of IT operations efforts from support to continuous engineering. And by 2023, 40% of product and platform teams will use AIOps for automated change risk analysis in DevOps pipelines, reducing unplanned downtime by 20%.

Tying it all together

These trends are not occurring in isolation. They’re all part of the larger digital transformation effort underway as companies pursue a multi-cloud strategy encompassing public cloud, private cloud, and edge environments. Regardless of where the applications live or where the processing takes place, organizations are seeking ways to use AI and machine learning to optimize processes, conduct predictive maintenance, and gain critical business insight as they try to rebound from the events of 2020 and reinvent themselves for 2021 and beyond.

Where will 2021 take you? Contact us for guidance on how you can take hold of these technology trends to maximize your business results and reach new goals.

-Mir Ali, Field CTO
