
BigQuery, Looker, and Cloud Functions – Let’s create a data solution.

As a cloud consulting company, we see enterprise clients with a lot of data, and for most of them that data is siloed, with no easy, universal access to the information. Their data stores are essentially islands of misfit toys.

During an internal hackathon, Nick Centola and I decided to take up the challenge of creating an enterprise-class solution that would extract, transform, and load (ETL) data from multiple sources into a data warehouse, perform advanced forecasting, and be 100% serverless by design, which inherently keeps running costs to a minimum. We decided to keep the scope relatively simple and used the publicly available Citi Bike NYC dataset. The Citi Bike NYC dataset includes monthly trip data exported as public CSV files and a near real-time API, which in our experience is a pattern we often see in enterprises. The diagram below represents what we were trying to achieve.


At 2nd Watch, we love Functions-as-a-Service (FaaS) and Cloud Functions as we can create very scalable solutions, have no infrastructure to manage, and in most instances, we will not have to worry about the cost associated with the Cloud Functions.

There were two ETL jobs to write. One was to take the zipped CSV data from the public S3 trip data bucket and land it in our Google Cloud Storage bucket for an automated daily import into BigQuery. The other function was to grab data from the stations’ near real-time RESTful API endpoint and insert it into our BigQuery table.
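The heart of the second function is a simple flattening step: turn the station feed’s nested JSON into flat rows for a BigQuery insert. Here is a minimal sketch of that transform, assuming a GBFS-style station_status payload; the field and column names are illustrative assumptions, not our actual schema.

```python
from datetime import datetime, timezone


def station_rows(payload: dict) -> list:
    """Flatten a station-status JSON payload into BigQuery-ready row dicts."""
    # Timestamp the whole batch from the feed's own last_updated epoch seconds.
    fetched_at = datetime.fromtimestamp(
        payload["last_updated"], tz=timezone.utc
    ).isoformat()
    rows = []
    for station in payload["data"]["stations"]:
        rows.append({
            "station_id": station["station_id"],
            "bikes_available": station["num_bikes_available"],
            "docks_available": station["num_docks_available"],
            "fetched_at": fetched_at,
        })
    return rows
```

In the deployed function, the returned rows would be handed to the BigQuery client’s streaming insert; the transform itself stays a pure, easily testable function.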

Nick is most efficient with Python; I am most efficient with NodeJS. As both languages are acceptable production code languages for most organizations we work with, we decided to write a function in our respective preferred languages.

The data that we pulled into BigQuery was already clean. We did not need to enrich or transform the data for our purpose – this is not always the case, and cleaning and enriching data are areas where we usually spend most of our time when building similar solutions for our customers.

Machine Learning

We wanted to enable a relatively simple forecast of bike demand at individual stations across New York City. BigQuery ML is incredibly powerful and has more than 30 built-in machine learning models. The model of choice for our use case was the ARIMA model, which takes time series data as input. I won’t go into too much detail on why the ARIMA model is a good fit compared to the multitude of other available models; the full form of the acronym describes why: AutoRegressive (AR) Integrated (I) Moving Average (MA).
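In BigQuery ML, training such a model is a single SQL statement. The sketch below shows the shape of that statement as a Python string a client would submit; the dataset, table, and column names (`citibike.trips`, `starttime`, and so on) are assumptions for illustration, not our actual schema.

```python
# Illustrative BigQuery ML statement: one ARIMA time series per station,
# built from hourly trip counts. Names are hypothetical placeholders.
CREATE_MODEL_SQL = """
CREATE OR REPLACE MODEL `citibike.station_demand_arima`
OPTIONS (
  model_type = 'ARIMA_PLUS',
  time_series_timestamp_col = 'trip_hour',
  time_series_data_col = 'trip_count',
  time_series_id_col = 'start_station_id'
) AS
SELECT
  TIMESTAMP_TRUNC(starttime, HOUR) AS trip_hour,
  start_station_id,
  COUNT(*) AS trip_count
FROM `citibike.trips`
GROUP BY trip_hour, start_station_id
"""
```

A BigQuery client would run this with an ordinary query call; there is no separate training infrastructure to stand up, which is what keeps the whole pipeline serverless.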


Bringing it all together, we created our LookML models in Looker and interacted with the data exceptionally easily. We made a couple of heat map-based visualizations of New York City to easily visualize the popular routes and stations and a station dashboard to monitor expected supply and demand over the next hour. With the bike stations API data flowing into BQ every 5 seconds, we get a close-to-real-time dashboard that we can use for the basis of alerting staff of an inadequate number of bikes at any station across NYC.

The station forecast shows the upper and lower bound forecast for each hour over the next month. We use the upper bound forecast as our predicted “number of bikes in the next hour” and pull in our available bikes from the real-time API. If you use your imagination, you can think of other use cases where a similar prediction could be relevant: ingredient forecasting for franchise restaurants, or inventory and staffing forecasts at retailers to service customers – the possibilities are endless.
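The supply check itself is a one-liner: compare the upper-bound forecast for the next hour against live availability from the API. A minimal sketch:

```python
def bikes_needed(forecast_upper: float, bikes_available: int) -> int:
    """Bikes to reposition so current supply meets the upper-bound forecast.

    Returns 0 when the station already has enough bikes on hand.
    """
    return max(0, round(forecast_upper) - bikes_available)
```

A dashboard or alerting job would evaluate this per station on each refresh, flagging any station where the result is positive.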

One of the coolest things we did, from Nick’s and my perspective, was to drive model training and forecasting straight from Looker and LookML, allowing us to kick off model training every time we receive new data in BigQuery – all from the convenient interface of Looker.

Enterprise solution

As this was a quick prototyping effort, we took a few shortcuts compared to our delivery standards at 2nd Watch. We did not use infrastructure as code, a best practice we implement for all production-ready customer engagements. Second, we decided not to worry about data quality, which would be something we would clean, enrich, and transform based on your documented business requirements. Third, we did not set up telemetry that would allow us to respond to things like slow queries and broken ETL jobs or visualizations.

Is this hard?

Yes and no. For us it was not – our combined experience adds up to thousands of hours building and documenting data pipelines and distributed systems. If you are new to this and your data footprint includes more than a few data sources, we highly recommend that you bring in enterprise expertise to build out your pipeline. You’ll need a team with in-depth experience to help you set up LookML, as this will be the foundation for self-service within your organization. Ultimately, though, experiments like this can serve to create business intelligence and allow your organization to proactively respond to events to meet your corporate and digital transformation initiatives.

If you want to see a demo of our solution, check out our webinars below:

Aleksander Hansson, 2nd Watch Google Cloud Specialist



Digital Transformation in Healthcare Accelerates in 2021

The COVID-19 pandemic has driven change throughout industry. For healthcare and healthcare professionals in general, attention on digital transformation and use of cloud technology has come to the forefront in a convergence of regulatory requirements, changing modes of care, data security threats, and evolving patient expectations.

Accelerating digitalization and identifying best practices for carrying data through to actionable insight has become a necessity. Many organizations have been forced to adjust their IT strategies and priorities. So, what is digital transformation in healthcare, and what will it look like? According to a 2021 Stoltenburg/CHIME survey, the top rearranged priorities among CIOs focused on digital transformation include:

  • Using digital health to improve patient engagement
  • Updating or modernizing EHRs for better data flow
  • Strengthening cybersecurity
  • Improving data analytics

Of course, healthcare is inherently data driven. “Data drives nearly every aspect of our healthcare industry. It helps identify longitudinal treatment trends, socioeconomic risks, missed payment opportunities and more. The broad scope of all of these inputs illustrates how mission critical it is to use the right information sources, tools and expertise,” says James Bohnsack, Sr. VP and Chief Strategy Officer of global information company TransUnion Healthcare.

Hospitals and health systems collect and store volumes of medical records and patient data estimated to be increasing 48% annually. Data collected across admissions, diagnostics, treatment and discharge as well as patient-provider communications represent a rich resource to those looking to improve care and reduce cost. The opportunities for organizational improvement owing to use of this “big data” have never been better. But how many healthcare organizations have designed future-ready systems to effectively move beyond the collection, storage and simple exchanges of data to leverage all the value of that collective data in the form of actionable insights?

In the past, organizations developed reporting capabilities (operational, regulatory, quality, financial) with disparate paths and processing of data. Although we’ve seen improved data collection, processing and access to data analytics among providers, the pandemic forced healthcare providers to quickly leverage new technologies and see a brighter light. Cloud-enabled supply chain tracking and patient scheduling for vaccines provided a clear example of the value of technology in improving workflow efficiency and management.

Moving forward, the good news is that healthcare providers and payers today have access to numerous solutions for efficiently and effectively sharing, managing, mining, and integrating data to guide and improve their operational and clinical decision making. As digital transformation in healthcare accelerates in 2021, cloud computing and data lakes that enable machine learning and artificial intelligence will enhance the way organizations gain value from their vast volumes of data and information.

Data standardization through use of HL7 FHIR is helping to build secure interoperability. Transforming from disparate sources and types of data to real-time actionable intelligence and insight has the potential for increased cost efficiency.

Data heavy trends such as telehealth and increased patient interaction through mobile applications are poised to push the need for more efficient technology and data flow further. As healthcare organizations continue to advance along the digitalization journey, we’re seeing a number of consistently mentioned priorities and needs in 2021. These include:

  • Increased digital transformation initiatives with emphasis on flexibility and future readiness
  • Greater investment in digital technologies that are transformative
  • Rising attention to compliance and security
  • Identification of strategies to best manage multi-cloud environments
  • Application modernization to improve patient engagement and boost productivity
  • Post-pandemic revisits of business continuity plans
  • Continued increases in use of telehealth
  • Continued interest in leveraging Artificial Intelligence (AI) and Machine Learning (ML)
  • Increased resource needs for technology initiatives
  • Desire to work with expert partners, not generic suppliers of services

While “89% of healthcare leaders surveyed are accelerating their digital transformation” (MIT Technology Review Study, 2020), most provider organizations do not have the internal resources and expertise to support their initiatives. It becomes critical to success to find expert partners that can provide a premium level of service and engagement.

2nd Watch is ready for your next step in digital transformation

As a partner for AWS, Google Cloud, and Microsoft Azure, 2nd Watch is a trusted cloud advisor. With digital transformation in healthcare accelerating in 2021, we can enable you to achieve your digital transformation objectives and fuel performance improvement while reducing cloud complexity. Whether you’re embracing cloud data for the first time, strengthening compliance and security, or seeking improvements through advanced analytics, our team of data scientists can help. Contact us to discuss your current priorities and explore our full suite of advanced cloud-native capabilities.

-Tom James, Sr. Marketing Manager, Healthcare


Is a Hybrid Cloud Environment Right for Your Enterprise? …Probably

Finding the perfect cloud platform for your business isn’t black and white. Nothing is 100% accurate or can guarantee a right fit, and no two organizations are the same. However, there are practical ways to think about the structure as your enterprise evolves. Introducing a hybrid cloud solution into your overall computing environment offers enterprises a number of benefits, from innovation and enablement to cybersecurity and applications.

Hybrid Cloud Environment

Choice and flexibility

Different departments and employees are going to view cloud platforms through the perspective of their responsibilities, tasks, and goals. This typically results in a variety of input as to which type of cloud infrastructure is best. For example, the marketing team might be drawn to Salesforce because of its 360-degree customer view. Some techs might favor Azure for consistency and mobility between on-prem and public cloud environments, while others like the resources and apps available within Amazon Web Services (AWS).

More than ever before, companies are taking advantage of the seemingly endless opportunities with a hybrid cloud strategy. And that is something to embrace. You don’t want to get stuck on a single cloud vendor and miss out on the competitive drive of the market. Competition moves technology forward with new applications, customer-based cost structure, service delivery, and so on. With a hybrid approach, you can take advantage of those innovations to build the best system for your business.

Business continuity

Since the digital transformation fast-tracked and remote work became the ‘new norm,’ bad actors have been having a field day. Ransomware attacks continue to spike, and human error remains the number one cause of data loss. Hybrid cloud environments offer enterprises the backup and recovery tools necessary to keep business moving.

If you’re using the cloud for the bulk of your operations, you can back up and restore from an on-premises environment. If you’re focused on-premises, you can use the cloud for backup and restore. With both systems able to work interchangeably in a hybrid cloud architecture, you get an ideal model for data protection and disaster recovery.

Artificial intelligence

Technology requires enterprises to always look ahead in order to remain competitive. Data science, AI, and machine learning are the latest developments for business enablement using data-based decision making. Key to implementing AI is both having the capacity necessary to collect incoming and historical data, as well as the tools to make it operational. AWS provides a huge amount of storage, while Google Cloud Platform (GCP) maximizes data with a variety of services and AI access.

A hybrid infrastructure lets you leverage the best resources and innovation available in the dynamic cloud marketplace. You’re better equipped to meet targeted AI functionalities and goals with more opportunities. Aware of the benefits and customer preference for hybrid environments, cloud providers are making it easier to ingest data from platform to platform. While interoperability can induce analysis paralysis, the hybrid environment removes a lot of the risks of a single cloud environment. If something doesn’t work as expected, you can easily consume data in a different cloud, using different services and tools. With hybrid cloud, it’s OK to use 100 applications and 100 different cloud-based sources to achieve your desired functionality.


A service-oriented architecture (SOA) calls on enterprises to build IT granularly and responsively. According to the SOA manifesto, a set of guiding principles similar to the agile manifesto, IT should not be a monolith. Instead, let business needs be the focus and stay close to those as you evolve. SOA is really the foundation of a hybrid cloud environment that allows you to ebb and flow as necessary. It’s common to get distracted by shiny new features – especially in a hybrid cloud environment – but the business needs to drive strategy, direction, and implementation. If you stay focused, you can both leverage hybrid cloud opportunities and follow SOA to accomplish enterprise goals.

Next step toward a hybrid cloud environment

If you agree with the title of this article, then it’s time to see what a hybrid cloud could look like in your enterprise. 2nd Watch is an AWS Premier Partner, a Microsoft Azure Gold Partner, and a Google Cloud Partner with 10 years of experience in cloud. Our experts and industry veterans are here to help you build your environment for lasting success.

Contact Us to discuss picking your public cloud provider, or providers; utilizing on-prem resources; ensuring financial transparency and efficiency; and to get impartial advice on how best to approach your cloud modernization strategy.






Multi-Cloud Challenges and Solutions: Cloud Cost Sprawl and Integration

Multi-cloud strategies suggest that enterprises run their applications and workloads in whatever cloud environment makes the most sense from a cost, performance and functionality perspective. That’s the theory anyway. In practice however, a multi-cloud environment requires a fair amount of tooling. Many enterprises grapple with integrating technologies created by competing suppliers in the hopes of achieving the elusive, single pane of glass.

Regardless of the challenges, a multi-cloud strategy has inherent benefits. Understanding problems upfront, and mitigating the consequences, is step one to realizing those benefits.

Multi-Cloud Challenges and Solutions: Cloud Cost Sprawl and Integration

Problem: Cloud cost sprawl

Typically, the first migration in a company is initiated as a cost saving measure. Maybe you’re moving from CAPEX to OPEX, or joining a monthly subscription plan, so there’s a clear strategy and goal. As you move deeper into the cloud, developers see the potential of new applications, and opportunities for innovation. The shiny new tools and enhanced flexibility of the cloud can lead to unexpected and expensive surprises.

All too often, an organization moves to a monthly subscription model and slowly but surely, everybody increases and expands their use of services. Next thing you know, you’re getting huge bills from your cloud provider that nearly equal or exceed the cost of buying equipment. Cloud cost sprawl is the expensive result of unrestricted and unregulated use of cloud resources. It’s so rampant that an estimated 35% of enterprise cloud spend is wasted via cloud cost sprawl.

Solution: Cloud cost optimization

There’s more than one way to wrangle cloud use, achieve your goals, and maintain your budget while cutting out waste. Cloud cost optimization is a complex organizational process that runs parallel to cloud migration and cloud-based service use.

With simplified cloud billing, an understanding of how cloud cost sprawl happens, why cost optimization is important, and an ongoing optimization effort – large enterprises can save up to 70%. Through a combination of software, services and strategy, cost optimization helps businesses immediately achieve significant cost savings.

Learn more in the links below and schedule your free, 2-hour, Cloud Cost Optimization Discovery Session with 2nd Watch cloud experts to discover how best to get started.

Problem: Environment integration

With various types of environments coming together in multi-cloud, it can be hard to integrate, interoperate, and move data across the infrastructure for performance and use. Each environment has its own managing and monitoring systems that require certain expertise.

Infrastructure as a service (IaaS), including cloud providers like AWS, GCP and Azure, are one layer of the environment – and higher level services, or platform as a service (PaaS), is another layer. Platforms like Salesforce and NetSuite offer additional tools to build within specific domains, but the challenge is bringing everything together.

Solution: Expertise and tools

If you don’t have the in-house knowledge for a multi-cloud environment, outsourcing cloud management to an expert who also provides guidance and direction for cloud growth is a cost-effective solution. Regardless of who is in charge of integration, cloud providers offer tools and services to help monitor and manage the infrastructure as a whole. Recently, services have been introduced to directly address integration with other environments in a multi-cloud infrastructure.

For example, Google’s latest service, BigQuery Omni, lets you connect, combine, and query data from outside GCP without having to learn a new language. AWS is taking a more multi-cloud approach with ECS and EKS Anywhere. There’s also Anthos on GCP, and Arc on Azure – all services that allow organizations to run containers in the environments of their choosing.

See how cloud providers are embracing the movement toward multi-cloud to make multi-cloud integration easier.

Managed Cloud Services

Accelerate the delivery of innovative solutions and gain competitive advantage, without impacting current operations, using 2nd Watch Managed Cloud Services. Your dedicated cloud experts provide a simple, flexible set of Day 2 operational services to support cost optimization, management, monitoring, and integration within a multi-cloud infrastructure. Contact Us to make multi-cloud a success at your organization.


6 Cloud Trends from 2020 Continuing in 2021

At the end of summer 2020, 2nd Watch surveyed over 100 cloud-focused IT directors about their cloud use. Now in 2021, we’re looking back at the 2020 Enterprise Cloud Trends Report to highlight six situations to anticipate going forward. As you would expect, COVID-19 vaccination availability, loosening of restrictions, and personal comfort levels continue to be an influential focus of cloud growth, and a significant factor in the acceleration of digital transformation.

1. Remote work adoption

The forced experiment of work from home, work from anywhere, and remote work in general has proven effective for many organizations. Employees are happy with the flexibility, and many businesses are enjoying increased productivity, a larger talent pool, and significant cost savings. While just about half (46%) of survey respondents said more than 50% of their employees were working remotely in summer 2020, that number is expected to grow by 14%. Rather than pulling back on remote work enablement, 60% of companies say almost 60% of employees will work away from the office.

2. Remote work challenges

It’s anticipated that as the number of remote workers grows, so will the challenges of managing the distributed work environment. Remote access, specifically into corporate systems, is the highest-ranked challenge among survey respondents. Other issues include the capacity of conferencing and collaboration tools, and user competence. The complexities of both managing remote workers and being a remote worker continue to evolve.

Business and IT leaders are constantly having to revisit the cloud infrastructure established in 2020, to provide flexibility, access, and business continuity during 2021 and beyond.

3. Cloud services and business collaboration

The cloud services market is maintaining its COVID-19 growth spurt, but the relationship between provider and client is shifting. The digital transformation and increasing reliance on cloud-based services is creating a new level of engagement and desire for collaboration from businesses. Organizations want to work alongside cloud providers and service providers so they can upskill along the way, and for the future. Businesses are using providers to build their cloud foundation and establish some pipelines – particularly around data migration – and in effect, learning on the job. As the business gets more involved in each project, and continues to build skills and evolve its DevOps culture, it can ultimately reduce its dependence, and associated costs, on cloud partners.

4. Growing cloud budgets

Surviving and thriving organizations have been positioning themselves for the long haul, and continue to do so. Just over 64% of survey respondents said their cloud budgets had either remained the same or increased. And almost 60% say their cloud budget will grow over the next 12 months.

Many are utilizing this time to gain competitive advantage by improving mobile app development, customer experience, and operations. The expectation of a payback period has businesses focused on boosting ROI using cloud-based services. 2020 forced business leaders to re-adjust how they see IT within their organization. It’s no longer a cost center, but something that can propel and enable the company forward.

5. Cloud security and data governance

As everyone moved out of the office in 2020, hackers took notice. Since then, ransomware attacks have been steadily increasing, and there are no signs of slowing down. The majority of survey respondents, 75%, agreed that cloud security and data governance are their number one concern and priority.

High profile breaches and remote work risks are grabbing headlines, causing organizations to question their security posture, and the tools necessary to prevent incidents. The role of proactive AI in cloud security is enabling faster response times and higher visibility into recovery and prevention. While tools are getting better, threats are also getting bigger.

6. Optimism

Overall, the majority of respondents are leaning in to today’s circumstances and have a positive perspective on the future. With many organizations responding by accelerating cloud use to support the current environment, the most successful are also thinking ahead. Increasing cloud budgets, fostering external cloud collaboration for skill growth, and relying on the cloud to support remote employees showcases how the business landscape has changed. Data is more critical than ever as organizations accelerate toward the digital transformation – and the future looks bright.

Stay tuned for the 2021 Enterprise Cloud Trends Report for a focus on IT departments, see what security improvements have been made, and how organizations are continuing to use and support remote environments. Until then, if you’re moving applications to the cloud, modernizing your licensed database systems, and optimizing cloud use, let 2nd Watch help! With premier services across cloud platforms, and as a trusted cloud advisor, we embrace your unique journey. Contact Us to take the next step.

Nicole Maus – Director of Marketing


3 Reasons Businesses Are Using Google Cloud Platform for AI

Google Cloud Platform (GCP) offers a wide scope of artificial intelligence (AI) and machine learning (ML) services fit for a range of industries and use cases. With more businesses turning to AI for data-based innovation and new solutions, GCP services are proving effective. See why so many organizations are choosing Google Cloud to motivate, manage, and make change easy.

1. Experimentation and cost savings

Critical to the success of AI and ML models are data scientists. The more you enable, empower, and support your data scientists through the AI lifecycle, the more accurate and reliable your models will be. Key to any successful new strategy are flexibility and cost management. One way GCP reduces costs while offering enterprise flexibility is with Google’s AI Platform Notebooks.

Managed JupyterLab notebook instances give data scientists functional flexibility – including access to BigQuery, the ability to add CPUs, RAM, and GPUs to scale, cloud security, and data access – with a streamlined experience from data to deployment. In on-prem environments, data scientists are limited by resource availability and a variety of costs related to data warehousing infrastructure, hosting, security, storage, and other expenses. JupyterLab notebooks and BigQuery, on the other hand, are pay-as-you-go and always available via AI Platform Notebooks. With cost-effective experimentation, you avoid overprovisioning, only pay for what you use when you run it, and give data scientists powerful tools to get data solutions fast.

2. Access and applications

AI and ML projects are only possible after unifying data. A common challenge to accomplishing this first step are data silos across the organization. These pockets of disjointed data across departments threaten the reliability and business outcomes of data-based decision making. The GCP platform is built on a foundation of integration and collaboration, giving teams the necessary tools and expansive services to gain new data insights for greater impacts.

For instance, GCP enables more than just data scientists to take advantage of their AI services, databases, and tools. Developers without data science experience can utilize APIs to incorporate ML into the solution without ever needing to build a model. Even others, who don’t have knowledge around data science, can create custom models that integrate into applications and websites using Cloud AutoML.

Additionally, BigQuery Omni, a new service from GCP, enables compatibility across platforms. BigQuery Omni enables you to query data residing in other places using standard SQL with the powerful engine of BigQuery. This innovation furthers your ability to join data quickly and without additional expertise for unobstructed applicability.

3. ML Training and labs

Google enables users with best practices for cost-efficiency and performance. Through its Qwiklabs platform, you get free, temporary access to GCP and AWS to learn the cloud on the real thing, rather than simulations. Google also offers training courses ranging from 30-minute individual sessions to multi-day sessions. The courses are built for introductory users all the way up to expert level, and are instructor-led or self-paced. Thousands of topics are covered, including AI and ML, security, infrastructure, app dev, and many more.

With educational resources at their fingertips, data teams can roll up their sleeves, dive in, and find some sample data sets and labs, and experience the potential of GCP hands-on. Having the ability to experiment with labs without running up a bill – because it is in a sandbox environment – makes the actual implementation, training, and verification process faster, easier, and cost-effective. There is no danger of accidentally leaving a BigQuery system up and running, executing over and over, with a huge cost to the business.

Next steps

If you’re contemplating AI and ML on Google Cloud Platform, get started with Qwiklabs to see what’s possible. Whether you’re the one cheerleading AI and ML in your organization or the one everyone is seeking buy-in from, Qwiklabs can help. See what’s possible on the platform before going full force on a strategy. Google is constantly adding new services and tools, so partner with experts you can trust to achieve the business transformation you’re expecting.

Contact 2nd Watch, a Google Cloud Partner with over 10 years of cloud experience, to discuss your use cases, level of complexity, and our advanced suite of capabilities with a cloud advisor.

Learn more

Webinar: 6 Essential Tactics for your Data & Analytics Strategy

Webinar: Building an ML foundation for Google BigQuery ML & Looker

-Sam Tawfik, Sr Product Marketing Manager


3 Types of Employees That Can Use AI Offerings on Google Cloud

The Google Cloud Platform (GCP) comes with a number of services, databases, and tools to operationalize company-wide data management and analytics. With the insights and accessibility provided, you can leverage data into artificial intelligence (AI) and machine learning (ML) projects cost-efficiently. GCP empowers employees to apply their ideas and experience into data-based solutions and innovation for business growth. Here’s how.

1. Developers without data science experience

With GCP, developers can connect their software engineering experience with AI capabilities to produce powerful results. Using product APIs, developers can incorporate ML into the product without ever having to build a model.

Let’s take training videos for example – Your company has thousands of training videos varying in length and across subjects. They include everything from full-day trainings on BigQuery, to minutes-long security trainings. How do you operationalize all that information for employees to quickly find exactly what they want?

Using Google’s Cloud Video Intelligence API, the developer can transcribe not only every single video, word-for-word, but also document the start and end time of every word, in every video. The developer builds a search index on top of the API, and just like that, users can search specific content in thousands of videos. Results display both the relevant videos and timestamps within the videos, where the keyword is found. Now employees can immediately find the topic they want to learn more about, without needing to sift through what could be hours of unrelated information.
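A toy version of that search index is easy to sketch. The snippet below assumes the transcription step has already produced per-word timing tuples of the form (word, start_sec, end_sec) per video, which is the shape of information the API’s word-level timestamps provide; the data here is made up for illustration.

```python
from collections import defaultdict


def build_index(transcripts: dict) -> dict:
    """Map each lowercased word to its (video_id, start, end) occurrences.

    transcripts: {video_id: [(word, start_sec, end_sec), ...]}
    """
    index = defaultdict(list)
    for video_id, words in transcripts.items():
        for word, start, end in words:
            index[word.lower()].append((video_id, start, end))
    return index


def search(index: dict, keyword: str) -> list:
    """Return every (video_id, start, end) where the keyword is spoken."""
    return index.get(keyword.lower(), [])
```

A production index would also handle phrases, stemming, and ranking, but the core idea – word to (video, timestamp) lookups – is the same.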

Additional APIs include Cloud Natural Language, Speech-to-Text, Text-to-Speech, Cloud Data Loss Prevention, and many other ML services.

2. Everyone without data science experience, who isn’t a developer

Cloud AutoML enables your less technical employees to harness the power of machine learning. It bridges the gap between the API and building your own ML model. Using AutoML, anyone can create custom models tailored to your business needs, and then integrate those models into applications and websites.

For this example, let’s say you’re a global organization that needs to translate communications across dialects and business domains. The intricacies and complexities of natural language require expensive linguists and specialist translators with domain-specific expertise. How do you communicate in real time effectively, respectfully, and cost-efficiently?

With AutoML Translation, almost anyone can create translation models that return query results specific to your domain, in 50 different language pairs. It ingests your data graphically from a Sheet or CSV file. The required input is pairs of sentences that mean the same thing in both the language you want to translate from and the one you want to translate to. Google bridges the gap between generic translation and specific, niche vocabularies with an added layer of specificity that helps the model get domain-specific material right. Within an hour, the model translates based on your domain, your taxonomy, and the data you provided.
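Those sentence pairs are typically supplied as tab-separated values (AutoML Translation accepts TSV and TMX training files). A small sketch of preparing that input; the sentence pairs below are invented for illustration:

```python
import csv
import io

# Illustrative domain-specific sentence pairs: English source, German target.
pairs = [
    ("The invoice is past due.", "Die Rechnung ist überfällig."),
    ("Please confirm the purchase order.", "Bitte bestätigen Sie die Bestellung."),
]

# Write one source/target pair per line, tab-separated, ready for upload.
buffer = io.StringIO()
writer = csv.writer(buffer, delimiter="\t", lineterminator="\n")
writer.writerows(pairs)
tsv = buffer.getvalue()
print(tsv.splitlines()[0])
```

Each line pairs a source sentence with its approved in-domain translation, which is exactly the signal the custom model trains on.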

Cloud AutoML is also available for sight, structured data, and additional language capabilities.

3. Data scientists

Data scientists have the experience and data knowledge to take full advantage of GCP AI tools for ML. One of the issues data scientists often confront is notebook functionality and accessibility. Whether it’s TensorFlow, PyTorch, or JupyterLab, these open-source ML platforms require more resources than a local computer can easily provide and don’t readily connect to BigQuery.

Google AI Platform Notebooks is a managed service that provides a pre-configured environment supporting these popular data science libraries. From a security standpoint, AI Platform Notebooks is attractive to enterprises for the added protection of the cloud: relying on a local device, you run the risk of human error, theft, and hardware failure. Equipped with a hosted, integrated, secure, and protected JupyterLab environment, data scientists can do the following:

  • Virtualize in the cloud
  • Connect to GCP tools and services, including BigQuery
  • Develop new models
  • Access existing models
  • Customize instances
  • Use Git / GitHub
  • Add CPUs, RAM, and GPUs to scale
  • Deploy models into production
  • Backup machines

With a seamless experience from data to a deployed ML model, data scientists are empowered to work faster, smarter, and safer. Contact Us to further your organization’s ability to maximize data, AI, and ML.

Learn more

Webinar: 6 Essential Tactics for your Data & Analytics Strategy

Webinar: Building an ML foundation for Google BigQuery ML & Looker

-Sam Tawfik, Sr Product Marketing Manager


Maximizing Cloud Data with Google Cloud Platform Services

If you’re trying to run your business smarter, not harder, using data to inform decision making gives you a competitive advantage. Cloud data offerings let you put that data to work in the cloud, and the Google Cloud Platform (GCP) is full of options. Whether you’re migrating data, upgrading to enterprise-class databases, or transforming customer experience on cloud-native databases – Google Cloud services can fit your needs.

Highlighting some of what Google has to offer

With so many data offerings from GCP, it’s nearly impossible to summarize them all. Some are open source projects being distributed by other vendors, while others were organically created by Google to service their own needs before being externalized to customers. A few of the most popular and widely used include the following.

  • BigQuery: Core to GCP, this serverless, scalable, multi-cloud data warehouse enables business agility – including data manipulation and transformation – and is the engine for AI, machine learning (ML), and forecasting.
  • Cloud SQL: Traditional relational database in the cloud that reduces maintenance costs with fully managed services for MySQL, PostgreSQL, and SQL Server.
  • Spanner: Another fully managed relational database, offering unlimited scale, strong consistency, and up to 99.999% availability – ideal for supply chain and inventory management across regions.
  • Bigtable: Low latency, NoSQL, fully managed database for ML and forecasting, using very large amounts of data in analytical and operational workloads.
  • Data Fusion: Fully managed, cloud-native data integration tool that enables you to move different data sources to different targets – includes over 150 preconfigured connectors and transformers.
  • Firestore: From the Firebase world comes the next generation of Datastore. This cloud-native, NoSQL, document database lets you develop custom apps that directly connect to the database in real-time.
  • Cloud Storage: Object-based storage that can effectively serve as a database because of everything you can do with BigQuery – including using standard SQL to query objects in storage.
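As an illustration of that last point, BigQuery can expose files sitting in a Cloud Storage bucket as a table and query them with standard SQL, without loading the data first. The dataset name, bucket path, and columns below are hypothetical; the statements are a sketch of BigQuery’s external-table DDL, not taken from any particular project.

```sql
-- Hypothetical: treat CSV files in a Cloud Storage bucket as a table.
CREATE EXTERNAL TABLE demo_dataset.trip_events (
  trip_id STRING,
  duration_sec INT64
)
OPTIONS (
  format = 'CSV',
  uris = ['gs://example-bucket/trips/*.csv']
);

-- Then query the objects in storage with ordinary standard SQL.
SELECT COUNT(*) AS trips
FROM demo_dataset.trip_events;
```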

Why BigQuery?

After more than 10 years of development, BigQuery has become a foundational data management tool for thousands of businesses. With a large ecosystem of integration partners and a powerful engine that shards queries across petabytes of data and delivers a response in seconds, there are many reasons BigQuery has stood the test of time. It’s more than just super speed, data availability, and insights.

Standard SQL language
If you know SQL, you know BigQuery. As a fully managed platform, it’s easy to learn and use. Simply populate the data and that’s it! You can also bring in large public datasets to experiment and further learn within the platform.

Front-end data
If you don’t have Looker, Tableau, or another type of business intelligence (BI) tool to visualize dashboards off of BigQuery, you can use the software development kit (SDK) for web-based front-end data display. For example, government health agencies can show the public real-time COVID-19 case numbers as they’re being reported. The ecosystem of BigQuery is so broad that it’s a source of truth for your reports, dashboards, and external data representations.

Analogous across offerings

Coming from on-prem, you may be pulling data into multiple platforms – BigQuery being one of them. GCP offerings have a similar interface and easy navigation, so functionality, user experience, and even endpoint verbs are the same. Easily manage different types of data based on the platforms and tools that deliver the most value.

BigQuery Omni

One of the latest GCP services, it was built with an API and platform console similar to various other platforms. That compatibility enables you to query data living in other places using standard SQL. With BigQuery Omni, you can connect and combine data from outside GCP without having to learn a new language.

Ready for the next step in your cloud journey?

As a Google Cloud Partner, 2nd Watch is here to be your trusted cloud advisor throughout your cloud data journey, empowering you to fuel business growth while reducing cloud complexity. Whether you’re embracing cloud data for the first time or finding new opportunities and solutions with AI, ML, and data science, our team of data scientists can help. Contact Us for a targeted consultation and explore our full suite of advanced capabilities.

Learn more

Webinar: 6 Essential Tactics for your Data & Analytics Strategy

Webinar: Building an ML foundation for Google BigQuery ML & Looker

-Sam Tawfik, Sr Product Marketing Manager


App Modernization in the Cloud

The cloud market is maturing, and organizations worldwide are well into implementing their cloud strategies. In fact, a recent McKinsey survey estimates that, by 2022, 75% of all workloads will be running in either public or private clouds. Additionally, according to VMware, 72% of businesses are looking for a path forward for their existing applications, and it is important to consider an app modernization strategy as part of these migration efforts. Whether the goal is to containerize, utilize cloud-native services, increase agility, or realize cost savings, the overall aim should be to deliver business value faster in the rapidly changing cloud environment.

Modern Application

Application modernization focuses on legacy or “incumbent” line-of-business applications, with approaches ranging anywhere from re-hosting from the datacenter to the cloud, to full cloud-native application rewrites. We prefer a pragmatic approach: address the issues with legacy applications that hinder organizations from realizing the benefits of modern software and cloud-native practices, while retaining as much as possible of the intellectual property built into incumbent applications over the years. Additionally, we find ways of augmenting existing code bases to make use of modern paradigms.

Application Modernization Strategies

When approaching legacy software architecture, people often discuss breaking monolithic applications apart into microservices. However, the most important architectural decisions should center on how to best allow the application to function well in the cloud, with scalability, fault-tolerance, and observability all being important aspects. A popular approach is to consider the tenets of the 12-Factor App to help guide these decisions.
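One concrete 12-Factor tenet, storing config in the environment (factor III), is easy to illustrate. A minimal sketch; the setting names and defaults below are hypothetical, and the point is that the same build artifact runs unchanged in every environment:

```python
import os

def load_config(env=os.environ):
    """Read deploy-specific settings from the environment (12-Factor, factor III)
    instead of baking them into the code or a per-environment build."""
    return {
        "database_url": env.get("DATABASE_URL", "sqlite:///local.db"),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

# Locally, defaults apply; in the cloud, the platform injects real values.
config = load_config({"DATABASE_URL": "postgres://db.internal/app"})
print(config["database_url"])  # postgres://db.internal/app
```

Because nothing environment-specific lives in the code, the application can scale horizontally and move between environments without a rebuild, which is exactly what cloud platforms and container schedulers assume.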

Architecture discussions go hand in hand with considering platforms. Containerization and serverless functions are popular approaches, but equally valid is traditional VM clustering or even self-hosting. Additionally, we start to think about utilizing cloud services to offload some application complexity, such as AWS S3 for document storage or AWS KMS for key management. This leads us to consider different cloud providers themselves for best fit for the organization and the applications overall, whether it be AWS, Azure, GCP (Google cloud platform), or hybrid-cloud solutions.

Another very important aspect of application modernization, especially in the cloud, is ensuring that applications have proper automation. Strong continuous integration and continuous deployment (CI/CD) pipelines should be implemented or enhanced for legacy applications. Additionally, we apply CI/CD automation to deploying database migrations and performing infrastructure-as-code (IaC) updates, and ensure paradigms like immutable infrastructure (i.e., pre-packaging machine images or utilizing containerization) are followed.

Last, there is an important cultural aspect to modernization from an organizational to team level. Organizations must consider modernization a part of their overall cloud strategy and support their development teams in this area. Development teams must adapt to new paradigms to understand and best utilize the cloud – adopting strong DevOps practices and reorganizing teams along business objectives instead of technology objectives is key.

By implementing a solid modernization strategy, businesses can realize the benefits the cloud provides, deliver value to their customers more rapidly, and compete in a rapidly changing cloud environment. If you’re ready to implement a modernization strategy in your organization, contact us for guidance on how to get started. Learn more about application modernization here.

– James Connell, Sr Cloud Consultant


What is App Modernization?

Application modernization is the process of migrating an incumbent or legacy software application to modern development patterns, paradigms and platforms with the explicit purpose of improving business value. It’s a part of your entire application modernization strategy and implies improving the software architecture, application infrastructure, development techniques and business strategy using a cloud native approach. Essentially, it allows you to derive increased business value from your existing application code.

Modernizing software architecture is often described as splitting a monolithic codebase, but it can imply any improvement to the software itself, such as decoupling components or addressing tech debt in the codebase. Other examples might be finding new design patterns that allow for scale, addressing resiliency within an application, or improving observability through logs and tracing.

We often think of application modernization in the context of cloud, and when planning a migration to cloud or modernizing an application already in the cloud, we look at what services and platforms are beneficial to the effort. Utilizing a service such as Amazon S3 for serving documents instead of a network share or utilizing ElasticSearch instead of the database for search are examples of infrastructure improvements. Containerization and other serverless platforms are also considered.

Development techniques also need to be addressed in the context of modernization. Developers should focus on the parts of the application that deliver value to customers and provide competitive advantage. If developers are focused on maintenance, long manual deployments, bugs, and log investigation, they are unable to deliver value quickly.

When working with modern distributed cloud applications, teams need to follow strong DevOps practices in order to be successful. CI/CD, unit testing, diagnostics and alerting are all areas that development teams can focus on modernizing.

Legacy Application and Legacy Systems

In this context, legacy software refers to an incumbent application or system that blocks or slows an organization’s ability to accomplish its business goals. These systems still provide value and are great candidates for modernization.

Legacy can imply many things, but some common characteristics of legacy apps are:

  • Applications that run older libraries, outdated frameworks, or development platforms or operating systems that are no longer supported.
  • Architectural issues – monolithic or tightly coupled systems can lead to difficulties in deployment, long release cycles and high defect rates.
  • Large amounts of technical debt, dead or unused code, teams who no longer understand how older parts of the application work, etc.
  • Security issues caused by technical debt, outdated security paradigms, unpatched operating systems, and improper secret management.
  • Lack of instrumentation, with no way to observe the application.
  • Session state maintained on the server (requiring sticky sessions, etc.).
  • Manual deployment, or deployment that must happen in specific ways due to tight coupling.

Pillars of Application Modernization

When approaching a modernization project, we specifically look to ensure the following:

Flexible Architecture

The modernization initiative should follow a distributed computing approach, meaning it should take advantage of concepts such as elasticity, resiliency, and containerization. Converting applications to adhere to the principles of the 12-Factor App in order to take advantage of containerization is a prime example.


Modern CI/CD

The application must be built, tested, and deployed using modern CI/CD processes. Older source control paradigms such as RCS or SVN should be replaced with a distributed version control system (Git). Infrastructure as code should be included as part of the CI/CD system.


Observability

Holistically integrate logs, metrics, and events, enabling “the power to ask new questions of your system, without having to ship new code or gather new data in order to ask those new questions” (Charity Majors, https://www.honeycomb.io/blog/observability-a-manifesto). Observability is key to understanding performance, error rates, and communication patterns, and enables the ability to measure your system and establish baselines.
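In practice, being able to ask new questions without shipping new code starts with emitting wide, structured events rather than bare log lines. A minimal sketch; the field names are illustrative, and a real system would ship these events to a log pipeline rather than stdout:

```python
import json
import time

def emit_event(**fields):
    """Emit one wide, structured event per unit of work; every field
    becomes queryable later without a code change."""
    event = {"timestamp": time.time(), **fields}
    print(json.dumps(event, sort_keys=True))  # stand-in for a log shipper
    return event

event = emit_event(service="checkout", route="/cart", status=200, duration_ms=42)
```

Because each event carries arbitrary key-value context, questions like "what is the p95 latency of /cart for status 200?" can be answered from data already collected, instead of requiring new instrumentation.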


Team Alignment

Application teams should be aligned along business function, not technology – meaning multi-disciplinary teams that can handle operations (DevOps), database, testing (QA), and development. A culture of ownership is important in a cloud-native application.

App Modernization Examples

Application Modernization is not:

  • Just containerization – To take full advantage of containerization, applications must be properly architected (12-factor), instrumented for observability and deployed using CI/CD.
  • Just technical solutions adapting the latest framework or technology – The technology might be “modern” in a sense but doesn’t necessarily address cultural or legacy architectural issues.
  • Just addressing TCO – Addressing cost savings without addressing legacy issues does not constitute modernization.
  • Just running a workload in the cloud
  • Just changing database platforms – Licensing issues or the desire to move to open source clustered cloud databases does not equate to modernization.
  • Limited to specific programming languages or specific cloud providers, as a hybrid cloud approach can be deployed.

Application modernization includes, among others, combinations of:

  • Moving a SaaS application from a single to multi-tenant environment.
  • Breaking up a monolithic application into microservices.
  • Applying event-driven architecture to decouple and separate concerns.
  • Utilizing cloud services such as S3 to replace in-house solutions.
  • Refactoring to use NoSQL technologies such as MongoDB, ElasticSearch, or Redis.
  • Containerization and utilization of PaaS technologies such as Kubernetes or Nomad.
  • Utilization of Serverless (FaaS) technologies such as AWS Lambda, Azure Functions, OpenFaas, or Kubeless.
  • Creating strong API abstractions like REST or gRPC and utilizing API Gateways.
  • Transitioning to client-side rendering frameworks (React, Vue.js, etc.) and serverless edge deployment of UI assets, removing the webserver.
  • Moving long running synchronous tasks to asynchronous batch processes.
  • Utilizing saga patterns or business process workflows.

Ultimately, these efforts focus on enhancing business applications, improving customer experience, and enabling rapid digital transformation for the organization.

If you’re ready to start considering application modernization in your organization, contact us for guidance on how to get started.

-James Connell, Sr Cloud Consultant