
AWS re:Invent 2020 was a little different, to say the least. The line for the restroom was way shorter; if you entered the Tatonka Challenge, you were all but guaranteed a shot at the trophy (depending on how many of your family members joined in and how many wings your air fryer can handle); and the lines for shuttle buses were non-existent, with the commute between sessions reduced from hours to seconds, depending on how fast you can click. However, in typical AWS fashion, they made lemonade out of lemons and put on one of the best public cloud virtual events of the year.

Instead of the typical action-packed, sleepless week in Vegas, AWS broke the event up into three weeks, sprinkling its major announcements throughout. Vendors set up breakout sessions and virtual booths to discuss their solutions and products and held one-on-one sessions with potential leads via chat and live demos. Hunters of the precious swag had to engage with vendors and participate in specific activities to earn their various rewards. With all of the turmoil going on in the world, AWS was still able to announce over 140 new products and features at re:Invent 2020. Here are just a few of the highlights.

Week 1

For the first time ever, re:Invent was open to the world free of charge and attracted over 500,000 participants. Andy Jassy's overall keynote theme centered on customers driving innovation within AWS through the needs they ask it to solve. In part due to the pandemic, cloud adoption accelerated this year and fueled AWS' continued growth.

AWS announced new compute innovations, including macOS instances for EC2 (literally a Mac mini integrated into a server chassis), as well as tremendous investments in the processor space with its Graviton2 processors and Trainium chips. If you didn't catch week 1, here's what you missed:

Reinventing Compute:

  • EC2: macOS instances, alongside Intel, AMD, and ARM-based Graviton2 options
  • New C6g Graviton2 instances announced, with almost 50% savings (see the sketch after this list)
  • Lower costs with AWS Inferentia, the inference chip used by Alexa
  • Habana Gaudi-based EC2 instances: accelerator-powered instances for machine learning training
  • AWS Trainium: an AWS-designed ML chip, coming to EC2 and SageMaker
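If you want to kick the tires on the new silicon, launching a Graviton2-based instance looks just like launching any other EC2 instance; you only need an arm64 AMI. Here's a minimal boto3 sketch; the AMI ID, key pair, and region are placeholders I made up for illustration, not values from AWS's announcement.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a Graviton2-based (ARM64) instance type such as c6g.xlarge.
# The AMI ID and key pair below are placeholders; use an arm64 image from your account.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: an arm64 Amazon Linux 2 AMI
    InstanceType="c6g.xlarge",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",             # placeholder key pair
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "graviton2-test"}],
    }],
)

print("Launched:", response["Instances"][0]["InstanceId"])
```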

Reinventing Storage:

  • gp3 volumes for EBS, allowing 4x peak throughput (see the sketch after this list)
  • io2 Block Express: the first SAN built for the cloud
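What makes gp3 interesting is that IOPS and throughput are provisioned independently of volume size. Here's a minimal boto3 sketch of creating one; the Availability Zone and the specific IOPS and throughput values are illustrative, not prescriptive.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a gp3 volume. Unlike gp2, IOPS and throughput are set independently
# of volume size (gp3 baseline: 3,000 IOPS and 125 MiB/s).
volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",  # illustrative AZ
    VolumeType="gp3",
    Size=100,        # GiB
    Iops=6000,       # provisioned IOPS beyond the baseline
    Throughput=500,  # provisioned throughput in MiB/s
)

print("Created volume:", volume["VolumeId"])
```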

The mindset of “100% in the cloud all the time” is slowly shifting to include new options for hybrid environments, with the announcements of ECS Anywhere and EKS Anywhere allowing customers to run those workloads in their own data centers. Taking it a step further is the announcement of Amazon Monitron, which uses machine learning to detect abnormal behavior in industrial equipment and help predict failures. Placing compute closer to the customer (edge computing) has become more important, especially as connectivity providers roll out 5G, and AWS Wavelength was released to enable that evolution. Also, additional Outposts options (1U and 2U server sizes) were announced for customers who don't require a full cabinet of hardware.

Data science, AI, and machine learning have become front and center as customers continue to take advantage of cloud native technologies. Making the best use of your data and making it work for your business have been a huge focus this year. Some of the highlights include:

  • Amazon SageMaker Data Wrangler: Clean and aggregate data to prepare it for machine learning.
  • AWS Glue Elastic Views: Easily combine and replicate data from different data stores.
  • Amazon CodeGuru: Automate code reviews and identify your most expensive lines of code.
  • Amazon DevOps Guru: Automatically detect operational issues and recommend actions to fix them.
  • Amazon QuickSight Q: Ask any question in natural language and get answers in seconds.
  • Amazon Connect Wisdom: Reduces the time agents spend finding answers for customers.

AWS partner relationships continue to be a central focus as well, and this was highlighted by Doug Yeum in his keynote:

  • Cohesity DMaaS (Data Management as a Service) service announcement.
  • AWS SaaS Boost: Open source SaaS reference environment to accelerate traditional applications to SaaS on AWS.
  • AWS ISV Partner Path: More access to millions of active AWS customers through AWS field sellers globally.
  • Managed entitlements for AWS Marketplace: Automate 3rd party software license distribution and simplify entitlement tracking.
  • AWS Service Catalog AppRegistry: Define and associate resources to better manage applications.
  • AWS Energy Competency: Helping customers accelerate their transition to a more balanced and sustainable energy future.

Week 2

Kicking off week two was an infrastructure-specific deep dive with Peter DeSantis. Given my background in the data center space, I found his keynote extremely interesting, as I have noticed over the past few years that questions and conversations about how cloud services are actually delivered are very common. Before the “cloud” and even virtual machines existed, servers were deployed into data centers and enterprises ran their mission-critical workloads on them. Some companies deployed and managed their own physical infrastructure, and some outsourced the management of those environments to MSPs, but the overall principles have not changed over the years. Yes, your workloads run “in the cloud,” but behind that are still data centers housing servers, networking gear, storage, cooling, water chillers, power distribution, connectivity, etc.

AWS has taken those principles and scaled them to another level, focusing on redundancy and sustainability to ensure that, if built properly, their customers' workloads have no single point of failure and can keep running should an outage occur. AWS has not only made strides in the disk storage and processor space, but it has also designed and integrated its own switchgear control systems and custom-designed, rack-installed UPS infrastructure.

These are items that users of the cloud don't have to deal with, and that is one of the major selling points of moving to the cloud. You don't have to worry about rack space, power, cooling, hardware purchases, maintenance contracts, and the list goes on and on. But rest assured that the man behind the curtain is very aware of these items and is taking best-in-class steps to ensure that the infrastructure behind the scenes is always on.

Next on the list was the machine learning keynote with Swami Sivasubramanian. This was a deeper dive into some of the announcements made by Andy Jassy in week one, and he did not disappoint. As customers continue the shift to cloud native, ML and AI services have become front and center in their application modernization journey. Of the 250+ new products and product enhancements announced by AWS in 2020, most were centered around SageMaker and 11 other AI and ML products.

ML Frameworks and Infrastructure

AWS highlighted AWS Inferentia, its high-performance machine learning inference chip that powers EC2 Inf1 instances. Inferentia boasts 45% lower costs and 30% higher throughput than comparable GPU-based instances and helps Alexa achieve 25% lower end-to-end latency. AWS Trainium is another high-performance machine learning chip, delivering the most teraflops of compute power for ML and enabling a broader set of ML applications.

Amazon SageMaker

AWS had several announcements around Amazon SageMaker.

“Thus, we need a platform where the data scientist will be able to leverage his existing skills to engineer and study data, train and tune ML models and finally deploy the model as a web-service by dynamically provisioning the required hardware, orchestrating the entire flow and transition for execution with simple abstraction and provide a robust solution that can scale and meet demands elastically.” – Jojo John Moolayil, AWS AI Research Scientist

  • SageMaker Data Wrangler is a faster way to prepare data for ML without a single line of code.
  • SageMaker Clarify provides machine learning developers with greater visibility into their training data and models so they can identify and limit bias and explain predictions.
  • SageMaker Debugger helps identify training bottlenecks, visualizes system resources such as GPU, CPU, I/O, and memory utilization, and provides adjustment recommendations (see the sketch after this list).
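As a rough idea of how Debugger plugs into a training job, here's a minimal sketch using the SageMaker Python SDK's built-in rules; the role ARN, container image, instance type, and S3 paths are placeholders for illustration, not a recommended setup.

```python
from sagemaker.debugger import Rule, rule_configs
from sagemaker.estimator import Estimator

# Placeholders: the role ARN, training image, and S3 paths would come from your
# own account and training setup.
role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"

estimator = Estimator(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-training-image:latest",
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    # Built-in Debugger rules watch tensors emitted during training and flag
    # common problems such as a loss that stops decreasing or overfitting.
    rules=[
        Rule.sagemaker(rule_configs.loss_not_decreasing()),
        Rule.sagemaker(rule_configs.overfit()),
    ],
)

# Kick off training; Debugger findings show up alongside the job in SageMaker Studio.
estimator.fit({"train": "s3://my-bucket/train/"})
```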

AI Services:

The most important take-away from this keynote is AWS’ goal of the democratization of machine learning, or the transparent embedding of ML functionality into other AWS services.

“The company’s overall aim is to enable machine learning to be embedded into most applications before the decade is out by making it accessible to more than just experts.” – Andy Jassy, AWS CEO

With that goal in mind, AWS announced Redshift ML, which brings trained models into the data warehouse and makes them accessible through standard SQL. You can use SQL statements to create and train Amazon SageMaker machine learning models on your Redshift data and then embed the resulting predictions directly in queries and reports.
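To make that concrete, here's a rough sketch of what a Redshift ML workflow could look like, submitting a CREATE MODEL statement through the Redshift Data API with boto3. The table, columns, cluster name, IAM role, and S3 bucket are all hypothetical, and the exact CREATE MODEL options should be checked against the Redshift ML documentation.

```python
import boto3

# Hypothetical example: the table, columns, cluster, IAM role, and bucket are
# placeholders, and the CREATE MODEL options should be verified against the docs.
create_model_sql = """
CREATE MODEL customer_churn
FROM (SELECT age, monthly_charges, tenure, churned
      FROM customer_activity
      WHERE signup_date < '2020-01-01')
TARGET churned
FUNCTION predict_churn
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftMLRole'
SETTINGS (S3_BUCKET 'my-redshift-ml-bucket');
"""

client = boto3.client("redshift-data", region_name="us-east-1")
response = client.execute_statement(
    ClusterIdentifier="my-redshift-cluster",
    Database="dev",
    DbUser="awsuser",
    Sql=create_model_sql,
)
print("Statement submitted:", response["Id"])

# Once training finishes, the generated function can be used like any SQL function:
#   SELECT predict_churn(age, monthly_charges, tenure) FROM customer_activity;
```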

Aurora ML enables you to add ML-based predictions to applications via the familiar SQL programming language, so you don’t need to learn separate tools or have prior machine learning experience. It provides simple, optimized, and secure integration between Aurora and AWS ML services without having to build custom integrations or move data around.
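As an illustration of that SQL-first experience, here's a sketch of calling the Aurora MySQL integration with Amazon Comprehend from Python. The connection details and table are hypothetical, and the aws_comprehend_detect_sentiment() function name is taken from the Aurora ML documentation as I recall it, so treat it as an assumption and confirm it for your engine version (the cluster also needs an IAM role that allows calling Comprehend, which is omitted here).

```python
import pymysql

# Hypothetical connection details; Aurora ML also requires the cluster to be granted
# IAM access to Comprehend (or SageMaker), which is not shown here.
conn = pymysql.connect(
    host="my-aurora-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",
    user="admin",
    password="example-password",
    database="reviews",
)

# aws_comprehend_detect_sentiment() is the Aurora MySQL built-in that calls Amazon
# Comprehend behind the scenes; verify the function name for your engine version.
sql = """
SELECT review_id,
       aws_comprehend_detect_sentiment(review_text, 'en') AS sentiment
FROM product_reviews
LIMIT 10;
"""

with conn.cursor() as cur:
    cur.execute(sql)
    for review_id, sentiment in cur.fetchall():
        print(review_id, sentiment)

conn.close()
```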

Neptune ML brings predictions to their fully managed graph database service in the form of graph neural networks and the Deep Graph Library.

For companies that handle medical data, Amazon HealthLake is worth looking at. With built-in data query, search, and ML capabilities, you can seamlessly transform data to surface meaningful medical information at petabyte scale.

Week 3

Wrapping up the final week of re:Invent 2020 was Werner Vogels, rocking his typically iconic t-shirt but, unfortunately, not announcing who would be playing at re:Play this year. Presenting from the historic SugarCity factory in the Netherlands, he masterfully wove in the story of transforming and adapting to external events. To say that COVID has impacted all aspects of our lives in 2020 would be an understatement, but when presented with challenges, innovators continue to find ways to overcome those obstacles.

Collaboration and remote working were beyond challenging for everyone in 2020. AWS CloudShell was announced to give users a browser-based shell, launched from the AWS console and pre-authenticated with their console credentials, that includes the AWS CLI and 1GB of persistent storage at no additional cost. In addition, enhancements to AWS Cloud9 were announced that enable users to develop, run, and debug code from a browser.
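Because CloudShell inherits the credentials of the console session, small one-off scripts work without configuring any keys. For instance, a throwaway boto3 snippet like this sketch runs as-is in the shell (the calls shown are standard STS and S3 APIs; nothing here is specific to CloudShell itself).

```python
import boto3

# CloudShell picks up the credentials of your console session, so the default
# credential chain works without any configuration.
sts = boto3.client("sts")
print("Running as:", sts.get_caller_identity()["Arn"])

# Quick sanity check: list the S3 buckets visible to this session.
s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```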

To help mitigate potential issues in the future, AWS announced the Fault Injection Simulator, which sounds like a load test on steroids built around chaos engineering. Chaos engineering deliberately pushes an application or environment to its limits to surface potential issues, bottlenecks, or failures before they affect end users in production.
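Fault Injection Simulator wasn't generally available at the time of the announcement, so as a rough illustration of the chaos-engineering idea (and explicitly not the FIS API), here's a hand-rolled boto3 sketch that terminates a random instance carrying a hypothetical Environment=chaos-test tag and lets you watch whether the application self-heals.

```python
import random
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Find running instances in a hypothetical test environment, identified by tag.
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:Environment", "Values": ["chaos-test"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

instances = [i["InstanceId"] for r in reservations for i in r["Instances"]]

if instances:
    victim = random.choice(instances)
    print(f"Terminating {victim} to verify the application self-heals...")
    ec2.terminate_instances(InstanceIds=[victim])
else:
    print("No candidate instances found; nothing to terminate.")
```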

Additionally, Werner focused on helping the community and on sustainability. The pandemic has financially hurt millions of people, and AWS has developed the re/Start program, designed to help the unemployed build new skills that will allow them to pursue new career paths.

In summary, AWS continues to dominate the public cloud market and rapidly innovates based on their customer requirements. We may not have been standing elbow to elbow with 60,000 of our closest friends, navigating the miles and miles of casino floors, or enjoying all of the surprises of re:Invent in-person this year, but AWS did a stellar job of bringing us together virtually. Hopefully in a year’s time, we will all be back together and enjoying the wonderful craziness that is AWS re:Invent, Vegas style.

-Jeff Collins, Optimization Product Manager, 2nd Watch
