Data & AI Predictions in 2023

As we reveal our data and AI predictions for 2023, join us at 2nd Watch to stay ahead of the curve and propel your business towards innovation and success. How do we know that artificial intelligence (AI) and large language models (LLMs) have reached a tipping point? It was the hot topic at most families’ dinner tables during the 2022 holiday break.

AI has become mainstream and accessible. Most notably, OpenAI’s ChatGPT took the internet by storm, so much so that even our parents (and grandparents!) are talking about it. Since AI is here to stay beyond the Christmas Eve dinner discussion, we put together a list of 2023 predictions we expect to see regarding AI and data.

#1. Proactively handling data privacy regulations will become a top priority.

Regulatory changes can have a significant impact on how organizations handle data privacy: businesses must adapt to new policies to ensure their data is secure. Modifications to regulatory policies require governance and compliance teams to understand data within their company and the ways in which it is being accessed. 

To stay ahead of regulatory changes, organizations will need to prioritize their data governance strategies. Doing so mitigates data privacy risks and prepares them for potential new regulations. As part of their data governance strategy, data privacy and compliance teams must increase their use of privacy, security, and compliance analytics to proactively understand how data is being accessed within the company and how it’s being classified. 

#2. AI and LLMs will require organizations to consider their AI strategy.

The rise of AI and LLM technologies will require businesses to adopt a broad AI strategy. AI and LLMs will open opportunities in automation, efficiency, and knowledge distillation. But, as the saying goes, “With great power comes great responsibility.” 

There is disruption and risk that comes with implementing AI and LLMs, and organizations must respond with a people- and process-oriented AI strategy. As more AI tools and start-ups crop up, companies should consider how to thoughtfully approach the disruptions that will be felt in almost every industry. Rather than being reactive to new and foreign territory, businesses should aim to educate, create guidelines, and identify ways to leverage the technology. 

Moreover, without a well-thought-out AI roadmap, enterprises will find themselves plateauing technologically, with teams unable to adapt to the new landscape and initiatives lacking a return on investment: they won’t be able to scale or support what they put in place. Poor road mapping will lead to siloed and fragmented projects that don’t contribute to a cohesive AI ecosystem.

#3. AI technologies, like Document AI (or information extraction), will be crucial to tap into unstructured data.

According to IDC, 80% of the world’s data will be unstructured by 2025, and 90% of this unstructured data is never analyzed. Integrating unstructured and structured data opens up new use cases for organizational insights and knowledge mining.

Massive amounts of unstructured data – such as Word and PDF documents – have historically been a largely untapped data source for data warehouses and downstream analytics. New deep learning technologies, like Document AI, have addressed this issue and are more widely accessible. Document AI can extract previously unused data from PDF and Word documents, ranging from insurance policies to legal contracts to clinical research to financial statements. Additionally, vision and audio AI unlock real-time video transcription and search, image classification, and call center insights.
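
To make that concrete, here is a minimal sketch of pulling text and entities out of a PDF such as an insurance policy. It assumes the Google Cloud Document AI Python client and an already-created processor; the project, location, and processor IDs are hypothetical placeholders.

```python
# Minimal Document AI sketch: extract text and entities from a PDF.
# Assumes google-cloud-documentai is installed and a processor already exists;
# the project, location, and processor IDs below are hypothetical placeholders.
from google.cloud import documentai

client = documentai.DocumentProcessorServiceClient()
name = client.processor_path("my-project", "us", "my-processor-id")

with open("policy.pdf", "rb") as f:
    raw_document = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

result = client.process_document(
    request=documentai.ProcessRequest(name=name, raw_document=raw_document)
)

document = result.document
print(document.text[:500])  # extracted text, ready to land in a warehouse table
for entity in document.entities:
    print(entity.type_, entity.mention_text, entity.confidence)
```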

Organizations can unlock brand-new use cases by integrating this extracted data with their existing data warehouses. Fine-tuning these general-purpose models on domain data turns them into engines for a wide variety of domain-specific use cases. 

#4. “Data is the new oil.” Data will become the fuel for turning general-purpose AI models into domain-specific, task-specific engines for automation, information extraction, and information generation.

Snorkel AI coined the term “data-centric AI,” which is an accurate paradigm to describe our current AI lifecycle. The last time AI received this much hype, the focus was on building new models. Now, very few businesses need to develop novel models and algorithms. What will set their AI technologies apart is the data strategy.

Data-centric AI enables us to leverage existing models and calibrate them to an organization’s own data. Applying an enterprise’s data to this new paradigm will accelerate a company’s time to market, especially for those who have modernized their data and analytics platforms and data warehouses.
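
As a rough illustration of that paradigm, the sketch below adapts a pre-trained model to an enterprise’s own labeled data rather than building a new one. It assumes the Hugging Face transformers and datasets libraries and hypothetical CSVs of support tickets with text and label columns.

```python
# Data-centric sketch: calibrate an existing pre-trained model to domain data.
# Assumes Hugging Face transformers/datasets and hypothetical CSVs with
# "text" and "label" columns (e.g., labeled support tickets).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

data = load_dataset("csv", data_files={"train": "tickets_train.csv",
                                       "test": "tickets_test.csv"})
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")

data = data.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=4)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ticket-classifier", num_train_epochs=3),
    train_dataset=data["train"],
    eval_dataset=data["test"],
)
trainer.train()  # the differentiator is the labeled domain data, not a novel model
```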

#5. The popularity of data-driven apps will increase.

Snowflake recently acquired Streamlit, which makes application development more accessible to data engineers. Additionally, Snowflake introduced Unistore and hybrid tables (OLTP) to allow data science and app teams to work jointly off of a single source of truth in Snowflake, eliminating silos and data replication.

Snowflake’s big moves demonstrate that companies are looking to fill gaps that traditional business intelligence (BI) tools leave behind. With tools like Streamlit, teams can automate data sharing and deployment, which have traditionally been manual and Excel-driven. Most importantly, Streamlit can become the conduit that allows business users to work directly with AI-native and data-driven applications across the enterprise.
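
A rough sketch of what such an app can look like, assuming Streamlit and a hypothetical campaign_results.csv (with week, channel, spend, and revenue columns) standing in for a Snowflake query:

```python
# Minimal Streamlit data app: a business user filters campaign results interactively
# instead of waiting on an Excel export. File and column names are hypothetical.
import pandas as pd
import streamlit as st

@st.cache_data
def load_campaign_results() -> pd.DataFrame:
    # In practice this would query Snowflake (e.g., via snowflake-connector-python);
    # a local CSV stands in here to keep the sketch self-contained.
    return pd.read_csv("campaign_results.csv", parse_dates=["week"])

df = load_campaign_results()

st.title("Campaign ROI")
channel = st.selectbox("Channel", sorted(df["channel"].unique()))
filtered = df[df["channel"] == channel]

st.metric("Spend", f"${filtered['spend'].sum():,.0f}")
st.metric("Revenue", f"${filtered['revenue'].sum():,.0f}")
st.line_chart(filtered.set_index("week")[["spend", "revenue"]])
```

Run it with streamlit run app.py; in practice the loader would query Snowflake directly so every user sees the same single source of truth.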

#6. AI-native and cloud-native applications will win.

Customers will start expecting AI capabilities to be embedded into cloud-native applications. Harnessing domain-specific data, companies should prioritize building upon modular, data-driven application building blocks with AI and machine learning. AI-native applications will win over AI-retrofitted applications. 

When applications are custom-built for AI, analytics, and data, they are more accessible to data and AI teams, enabling business users to interact with models and data warehouses in a new way. Teams can classify and label data in a centralized, data-driven way, rather than manually and repeatedly in Excel, and feed it into a human-in-the-loop review system that improves the overall accuracy and quality of models. Traditional BI tools like dashboards, on the other hand, often limit business users to viewing data in a “what happened?” manner, rather than interacting with it in a more targeted way.

#7. There will be technology disruption and market consolidation.

The AI race has begun. Microsoft’s strategic partnership with OpenAI and integration into “everything,” Google’s introduction of Bard and funding into foundational model startup Anthropic, AWS with their own native models and partnership with Stability AI, and new AI-related startups are just a few of the major signals that the market is changing. The emerging AI technologies are driving market consolidation: smaller companies are being acquired by incumbent companies to take advantage of the developing technologies. 

Mergers and acquisitions are key growth drivers, with larger enterprises leveraging their existing resources to acquire smaller, nimbler players to expand their reach in the market. This emphasizes the importance of data, AI, and application strategy. Organizations must stay agile and quickly consolidate data across new portfolios of companies. 

Conclusion

The AI ball is rolling. At this point, you’ve probably dabbled with AI or engaged in high-level conversations about its implications. The next step in the AI adoption process is to actually integrate AI into your work and understand the changes (and challenges) it will bring. We hope that our data and AI predictions for 2023 prime you for the ways it can have an impact on your processes and people.

Think you’re ready to get started? Find out with 2nd Watch’s data science readiness assessment.


Marketing Dashboard Examples for Data-Driven Marketers

In recent years, marketers have seen a significant emphasis on data-driven decision-making. Additionally, there is an increased need to understand customer behavior and key metrics such as ROI or average order value (AOV). With an endless number of data sources (social media, email marketing, ERP systems, etc.) and a rapidly growing amount of data, both executives and analysts struggle to make sense of all the available information and respond in a timely manner.

A well-developed marketing dashboard helps companies overcome these challenges by organizing data into digestible metrics and reports that update automatically. Dashboards provide intuitive visuals that unlock the business value within your data quickly and without the manual effort of getting pieces of information from multiple places. Below, we have outlined four ways that dashboards benefit the entire marketing team, from the executives to the analysts.

Download Now: Sales and Marketing Dashboard Lookbook

1. Dashboards centralize all of your key information in one place.

Dashboards combine all of the essential information that would typically be found across various reports from disparate systems such as your CRM, ERP, or even third-party reports. Questions such as “How does our average order value compare to this time last year?”, “Which marketing channels are driving the most new customers?”, or “What is our marketing ROI?” can be quickly, consistently, and reliably answered without waiting on IT to put the information together or spending your department’s working day downloading a new report from each system and marrying the reports together. Instead, you can simply refer to a dashboard that highlights key statistics.
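
For instance, the year-over-year AOV question above reduces to a small, repeatable calculation once the order data is centralized; a dashboard simply runs something like the following on a schedule. The column names here (order_id, order_date, channel, revenue) are hypothetical.

```python
# Sketch of a dashboard metric: year-over-year average order value (AOV) by channel.
# Assumes a centralized orders extract with hypothetical column names.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
orders["year"] = orders["order_date"].dt.year

aov = (orders.groupby(["year", "channel"])
             .agg(revenue=("revenue", "sum"), orders=("order_id", "nunique")))
aov["aov"] = aov["revenue"] / aov["orders"]

# One row per channel, one column per year, so this year sits next to last year.
print(aov["aov"].unstack("year").round(2))
```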

2. Dashboards automate manual processes and ensure reliable and consistent reports.

Marketing analysts often spend more time wrangling, cleaning, and validating data for a report than they do gleaning insights from it. Even after these reports are created, it’s difficult to ensure the metrics will remain consistent each time they are delivered to executives. Dashboards are typically built using one source of data that has predefined metrics and inputs, to ensure the reports remain consistent. They can automatically refresh data on a set schedule or surface it as it’s collected. Not only does this create more time for marketing analysts to focus on creating campaigns and incentives based on these insights, but it also allows executives to make decisions using accurate and consistent data points.

This dashboard highlights the trend in a selected metric, including its predicted future value, allowing marketers to quickly pivot when current campaign ROI is trending downward. It also allows the marketing department to demonstrate ROI to the company as a whole, which often leads to an increase in marketing budget.

This dashboard highlights the order activity associated with experimental products to allow the merchandising team to pivot quickly when a new product isn’t working, or it can allow the marketing team to increase advertising dollars if a new product isn’t getting enough attention.

3. Quick and reliable understanding of customer behavior paves the way to a stronger customer relationship.

Dashboards consolidate and visualize the story your data is telling. This often reflects the reality of how customers interact with your brand and clearly points out new trends.

For example, creating a picture that combines key pieces of information across systems, such as how many orders a customer placed and the value of those orders from your ERP with click analytics for that customer from your email marketing platform, allows you to identify like-customers who may respond to similar incentives. These customer profiles can then grow and change over time as you gather more data, leading to insights that allow you to more efficiently target the audiences who are more likely to convert based on that incentive. It also cuts down on the number of incentives or touchpoints you put in front of a customer who isn’t interested in that particular part of your business. Less spam for your customers and more conversions for you.

This dashboard quickly highlighted which customer segment was more engaged when the brand pushed social/email/web content, giving the marketing department the perfect focus group on which they could test new ideas.

This dashboard provides an executive-level overview of marketing performance.

This dashboard shows the comparison in purchase behavior across loyalty and non-loyalty customers. Our clients use dashboards like these to inform when they should incentivize customers to participate in a loyalty program or jump to the next tier and when the loyalty program is actually costing them more money than it’s yielding.

This dashboard highlights the time between orders across your customer base. Say, for example, you have a set of customers who place an order every six weeks, but they haven’t returned for 10. Our clients automate the discovery of these customers and send that email list to their email service provider (ESP) to automatically re-engage that customer base.
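
A hedged sketch of how that discovery can be automated: compute each customer’s typical gap between orders, flag anyone who has gone well past it, and hand the list to the ESP. The table and column names are hypothetical.

```python
# Sketch: flag customers whose silence exceeds their usual ordering cadence.
# Assumes an orders extract with hypothetical customer_id / order_date columns.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
today = pd.Timestamp.today().normalize()

by_customer = orders.sort_values("order_date").groupby("customer_id")["order_date"]
typical_gap = by_customer.apply(lambda d: d.diff().dt.days.median())  # days between orders
days_quiet = (today - by_customer.max()).dt.days                      # days since last order

# "Lapsed" here means quiet for 1.5x longer than the customer's own median gap.
lapsed = days_quiet > 1.5 * typical_gap
reengage = lapsed[lapsed].index.tolist()  # send to the ESP for an automated win-back email
print(f"{len(reengage)} customers to re-engage")
```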

4. Dashboards are a gateway to advanced customer analytics.

With your analysts no longer consumed by manually building out reports, you’re on your way to identifying strong use cases for machine learning (ML) and predictive analytics. This form of advanced customer analytics covers a wide variety of use cases.

A common marketing use case for ML is predicting customer lifetime value (CLV/LTV) prior to investing marketing dollars on acquisition by matching a potential customer to the profile of existing customers that either yield high net profit or end up costing your business money. Another great marketing use case for data science is predicting the probability of conversion for a specific campaign or promotion based on a customer or customer segment’s previous behavior with like-campaigns. Your branding and merchandising teams may want to focus on identifying products that would yield a higher profit or increase in orders as a bundle instead of being sold individually.
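
One hedged sketch of the conversion-probability use case, assuming a historical table of campaign sends with a converted flag and a few hypothetical behavioral features:

```python
# Sketch: score each customer's likelihood of converting on a campaign from past behavior.
# The file, feature, and label names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

history = pd.read_csv("campaign_history.csv")
features = ["orders_last_12m", "avg_order_value", "emails_clicked", "days_since_last_order"]

X_train, X_test, y_train, y_test = train_test_split(
    history[features], history["converted"], test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Rank the audience so campaign budget goes to the customers most likely to convert.
history["p_convert"] = model.predict_proba(history[features])[:, 1]
target_list = history.sort_values("p_convert", ascending=False).head(5000)
```

CLV prediction follows the same pattern with a regression target (projected net profit) instead of a binary conversion flag.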

Regardless of the use case, your dashboards will put you in a strong position to have a more targeted and therefore effective data science use case.

This is an example of a marketing dashboard that helps marketers better understand customer demographics.

This dashboard gives marketers a place to test theories on customer demographics that would yield the highest LTV for a specific campaign.

Implementing strategic, goal-oriented dashboards significantly improves your marketing efforts at all levels. They provide analysts with the ability to spend their time acting on information rather than searching for and cleaning up data. More importantly, they enable executives to make informed decisions that ultimately increase ROI and ensure marketing budget is spent on impactful efforts.

2nd Watch has a vast array of experience helping marketers create dashboards that unlock valuable insights. Contact us or check out our Marketing Analytics Starter Pack to quickly gain the benefits listed above with a marketing dashboard specialized for your company.


3 Reasons Businesses Use Google Cloud Platform (GCP) for AI

Google Cloud Platform (GCP) offers a wide scope of artificial intelligence (AI) and machine learning (ML) services fit for a range of industries and use cases. With more businesses turning to AI for data-based innovation and new solutions, GCP services are proving effective. See why so many organizations are choosing Google Cloud to motivate, manage, and make change easy.

1. Experimentation and Cost Savings

Critical to the success of AI and ML models are data scientists. The more you enable, empower, and support your data scientists through the AI lifecycle, the more accurate and reliable your models will be. Key to any successful new strategy are flexibility and cost management. One way GCP reduces costs while offering enterprise flexibility is with Google’s AI Platform Notebooks.

Managed JupyterLab notebook instances give data scientists functional flexibility – including access to BigQuery, with the ability to add CPUs, RAM, and GPUs to scale – cloud security, and data access with a streamlined experience from data to deployment. In on-prem environments, data scientists are limited by resource availability and a variety of costs related to data warehousing infrastructure, hosting, security, storage, and other expenses. JupyterLab notebooks and BigQuery, on the other hand, are pay-as-you-go and always available via the AI Platform Notebooks. With cost-effective experimentation, you avoid overprovisioning, only pay for what you use when you run it, and give data scientists powerful tools to reach data solutions fast.
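
A minimal sketch of that pay-as-you-go workflow from a managed notebook, assuming the google-cloud-bigquery client and a hypothetical project and table; the dry run shows the bytes (and therefore the cost) a query would incur before you commit to running it.

```python
# Sketch: query BigQuery from a managed JupyterLab notebook on a pay-as-you-go basis.
# Project, dataset, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")
sql = """
    SELECT channel, COUNT(DISTINCT order_id) AS orders, SUM(revenue) AS revenue
    FROM `my-analytics-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
    GROUP BY channel
"""

# Dry-run first to see how many bytes the query would scan (what you'd pay for).
dry = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"Query would process {dry.total_bytes_processed / 1e9:.2f} GB")

df = client.query(sql).to_dataframe()  # run for real and pull results into pandas
```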

2. Access and Applications

AI and ML projects are only possible after unifying data. A common obstacle to this first step is data silos across the organization. These pockets of disjointed data across departments threaten the reliability and business outcomes of data-based decision-making. The GCP platform is built on a foundation of integration and collaboration, giving teams the necessary tools and expansive services to gain new data insights for greater impact.

For instance, GCP enables more than just data scientists to take advantage of its AI services, databases, and tools. Developers without data science experience can utilize pre-trained APIs to incorporate ML into a solution without ever needing to build a model. Even those without data science knowledge can use Cloud AutoML to create custom models that integrate into applications and websites.
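
For example, a developer can get sentiment out of customer text with one call to a pre-trained API, no model building required. This is a minimal sketch assuming the google-cloud-language client and default credentials:

```python
# Sketch: use a pre-trained Google Cloud API (Natural Language) without training a model.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content="The onboarding was smooth but support response times are slow.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

sentiment = client.analyze_sentiment(request={"document": document}).document_sentiment
print(f"score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")
```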

Additionally, BigQuery Omni, a new service from GCP, enables compatibility across platforms. BigQuery Omni lets you query data residing in other clouds using standard SQL and the powerful BigQuery engine. This innovation furthers your ability to join data quickly, without additional expertise, wherever it lives.

3. ML Training and Labs

Google enables users with best practices for cost-efficiency and performance. Through its Qwiklabs platform, you get free, temporary access to GCP and AWS to learn the cloud on the real thing rather than simulations. Google also offers training courses ranging from 30-minute individual sessions to multi-day sessions. The courses serve everyone from introductory users up to expert level, and are instructor-led or self-paced. Thousands of topics are covered, including AI and ML, security, infrastructure, app dev, and many more.

With educational resources at their fingertips, data teams can roll up their sleeves, dive into sample data sets and labs, and experience the potential of GCP hands-on. Being able to experiment in labs without running up a bill – because everything happens in a sandbox environment – makes the actual implementation, training, and verification process faster, easier, and more cost-effective. There is no danger of accidentally leaving a BigQuery job up and running, executing over and over, with a huge cost to the business.

Next Steps

If you’re contemplating AI and ML on Google Cloud Platform, get started with Qwiklabs to see what’s possible. Whether you’re the one cheerleading AI and ML in your organization or the one everyone is seeking buy-in from, Qwiklabs can help. See what’s possible on the platform before going full force on a strategy. Google is constantly adding new services and tools, so partner with experts you can trust to achieve the business transformation you’re expecting.

Contact 2nd Watch, a Google Cloud Partner with over 10 years of cloud experience, to discuss your use cases, level of complexity, and our advanced suite of capabilities with a cloud advisor.

Learn more

Webinar: 6 Essential Tactics for your Data & Analytics Strategy

Webinar:  Building an ML foundation for Google BigQuery ML & Looker

-Sam Tawfik, Sr Product Marketing Manager


Maximizing Cloud Data with Google Cloud Platform Services

If you’re trying to run your business smarter, not harder, utilizing data to gain insights into decision making gives you a competitive advantage. Cloud data offerings empower utilization of data in the cloud, and the Google Cloud Platform (GCP) is full of options. Whether you’re migrating data, upgrading to enterprise-class databases, or transforming customer experience on cloud-native databases – Google Cloud services can fit your needs.

Highlighting some of what Google has to offer

With so many data offerings from GCP, it’s nearly impossible to summarize them all. Some are open source projects being distributed by other vendors, while others were organically created by Google to service their own needs before being externalized to customers. A few of the most popular and widely used include the following.

  • BigQuery: Core to GCP, this serverless, scalable, multi-cloud data warehouse enables business agility, including data manipulation and data transformation, and it is the engine for AI, machine learning (ML), and forecasting.
  • Cloud SQL: Traditional relational database in the cloud that reduces maintenance costs with fully managed services for MySQL, PostgreSQL, and SQL Server.
  • Spanner: Another fully managed relational database offering unlimited scale, strong consistency, and almost 100% availability – ideal for workloads like supply chain and inventory management that span regions.
  • Bigtable: Low latency, NoSQL, fully managed database for ML and forecasting, using very large amounts of data in analytical and operational workloads.
  • Data Fusion: Fully managed, cloud-native data integration tool that enables you to move different data sources to different targets – includes over 150 preconfigured connectors and transformers.
  • Firestore: From the Firebase world comes the next generation of Datastore. This cloud-native, NoSQL, document database lets you develop custom apps that directly connect to the database in real-time.
  • Cloud Storage: Object-based storage that can behave like a database layer thanks to BigQuery – including the ability to query objects in storage using standard SQL (see the sketch after this list).
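
Here is the sketch referenced above: defining a BigQuery external table over CSV objects in a bucket and then querying it with standard SQL. It assumes the google-cloud-bigquery client, and the bucket, dataset, and column names are hypothetical.

```python
# Sketch: treat objects in Cloud Storage as a queryable table via a BigQuery external table.
# Bucket, dataset, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

client.query("""
    CREATE OR REPLACE EXTERNAL TABLE staging.orders_raw (
      order_id STRING,
      channel STRING,
      revenue NUMERIC
    )
    OPTIONS (
      format = 'CSV',
      skip_leading_rows = 1,
      uris = ['gs://my-data-lake/orders/*.csv']
    )
""").result()

rows = client.query(
    "SELECT channel, SUM(revenue) AS revenue FROM staging.orders_raw GROUP BY channel"
).result()
for row in rows:
    print(row.channel, row.revenue)
```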

Why BigQuery?

After more than 10 years of development, BigQuery has become a foundational data management tool for thousands of businesses. With a large ecosystem of integration partners and a powerful engine that shards queries across petabytes of data and delivers a response in seconds, there are many reasons BigQuery has stood the test of time. It’s more than just super speed, data availability, and insights.

Standard SQL language
If you know SQL, you know BigQuery. As a fully managed platform, it’s easy to learn and use. Simply populate the data and that’s it! You can also bring in large public datasets to experiment and further learn within the platform.
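
A small sketch of that: standard SQL against one of Google’s public datasets (the USA names sample here), with nothing to provision first. It assumes the google-cloud-bigquery client and default credentials.

```python
# Sketch: learn BigQuery against a public dataset using plain standard SQL.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = 'TX'
    GROUP BY name
    ORDER BY total DESC
    LIMIT 10
"""
for row in client.query(sql).result():
    print(row.name, row.total)
```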

Front-end data
If you don’t have Looker, Tableau, or another type of business intelligence (BI) tool to visualize dashboards off of BigQuery, you can use the software development kit (SDK) for web-based front-end data display. For example, government health agencies can show the public real-time COVID-19 case numbers as they’re being reported. The ecosystem of BigQuery is so broad that it’s a source of truth for your reports, dashboards, and external data representations.

Analogous across offerings

Coming from on-prem, you may be pulling data into multiple platforms – BigQuery being one of them. GCP offerings have a similar interface and easy navigation, so functionality, user experience, and even endpoint verbs are the same. Easily manage different types of data based on the platforms and tools that deliver the most value.

BigQuery Omni

One of the latest GCP services, BigQuery Omni was built with the same API and console experience as the rest of the platform. That compatibility enables you to query data living in other places using standard SQL. With BigQuery Omni, you can connect and combine data from outside GCP without having to learn a new language.

Ready for the next step in your cloud journey?

As a Google Cloud Partner, 2nd Watch is here to be your trusted cloud advisor throughout your cloud data journey, empowering you to fuel business growth while reducing cloud complexity. Whether you’re embracing cloud data for the first time or finding new opportunities and solutions with AI, ML, and data science, our team of data scientists can help. Contact us for a targeted consultation and explore our full suite of advanced capabilities.

Learn more

Webinar: 6 Essential Tactics for your Data & Analytics Strategy

Webinar:  Building an ML foundation for Google BigQuery ML & Looker

-Sam Tawfik, Sr Product Marketing Manager


AWS re:Invent Keynote Recap – Wednesday

I have been looking forward to Andy Jassy’s keynote since I arrived in Las Vegas. Like the rest of the nearly 50k cloud-geeks in attendance, I couldn’t wait to learn about all of the cool new services and feature enhancements that will be unleashed that can solve problems for our clients, or inspire us to challenge convention in new ways.

Ok, I’ll admit it. I also look forward to the drama of the now obligatory jabs at Oracle, too!

Andy’s 2017 keynote was no exception to the legacy of previous re:Invents on those counts, but my takeaway from this year is that AWS has been able to parlay their flywheel momentum of growth in IaaS to build a wide range of higher-level managed services. The thrill I once got from new EC2 instance type releases has given way to my excitement for Lambda and event-based computing, edge computing and IoT, and of course AI/ML!

AWS Knows AI/ML

Of all the topics covered in the keynote, the theme that continues to resonate throughout this conference for me is that AWS wants people to know that they are the leader in AI and machine learning. As an attendee, I received an online survey from Amazon prior to the conference asking for my opinion on AWS’s position as a leader in the AI/ML space. While I have no doubts that Amazon has unmatched compute and storage capacity, and certainly has access to a wealth of information to train models, how does one actually measure a cloud provider’s AI/ML competency? Am I even qualified to answer without an advanced math degree?

That survey sure makes a lot more sense to me following the keynote as I now have a better idea of what “heavy lifting” a cloud provider can offload from the traditional process.

Amazon has introduced SageMaker, a fully managed service that enables data scientists and developers to quickly and easily build, train, and deploy machine learning models at any scale. It integrates with S3, and with RDS, DynamoDB, and Redshift by way of AWS Glue. It provides managed Jupyter notebooks and even comes supercharged with several common ML algorithms that have been tuned for “10x” performance!
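
As a rough illustration of that build-train-deploy flow, here is a hedged sketch using today’s SageMaker Python SDK with the built-in XGBoost algorithm; the S3 paths and IAM role are hypothetical placeholders.

```python
# Sketch: train and deploy a model with SageMaker's built-in XGBoost algorithm.
# S3 paths and the IAM role ARN are hypothetical placeholders.
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # hypothetical role
image = sagemaker.image_uris.retrieve("xgboost", session.boto_region_name, version="1.5-1")

estimator = Estimator(
    image_uri=image,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-ml-bucket/models/",
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="binary:logistic", num_round=100)
estimator.fit({"train": TrainingInput("s3://my-ml-bucket/train/", content_type="text/csv")})

# One call stands up a managed HTTPS endpoint serving the trained model.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```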

In addition to SageMaker, we were introduced to Amazon Comprehend, a natural language processing (NLP) service that uses machine learning to analyze text. I personally am excited to integrate this into future chatbot projects, but the applications I see for this service are numerous.
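
A minimal sketch of calling Comprehend through boto3, for example to score the sentiment of a chatbot utterance before deciding how to respond:

```python
# Sketch: sentiment and entity detection with Amazon Comprehend via boto3.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")
text = "I've been waiting two weeks for my refund and nobody has replied."

sentiment = comprehend.detect_sentiment(Text=text, LanguageCode="en")
print(sentiment["Sentiment"], sentiment["SentimentScore"])

entities = comprehend.detect_entities(Text=text, LanguageCode="en")["Entities"]
print([(e["Type"], e["Text"]) for e in entities])
```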

After you’ve built and trained your models, you can run them in the cloud, or with the help of AWS Greengrass and its new machine learning inference feature, you can bring those beauties to the edge!

What is a practical application for running ML inference at the edge, you might ask?

Dr. Matt Wood demoed a new hardware device called DeepLens for the audience that does just that! DeepLens is a deep-learning enabled wireless video camera specifically designed to help developers of all skill levels grow their machine learning skills through hands-on computer vision tutorials. Not only is this an incredibly cool device to get to hack around with, but it signals Amazon’s dedication to raising the bar when it comes to AI and machine learning by focusing on the wet-ware: hungry minds looking to take their first steps.

Andy’s keynote included much more than just AI/ML, but to me, the AI/ML services announced on Tuesday signal a future of higher-level services that will keep Amazon the dominant cloud provider for years to come.

 

–Joe Conlin, Solutions Architect, 2nd Watch