AWS Media2Cloud: Efficient Digital Archive Transformation and Migration

In our previous post, we introduced the concept of Digital Archive Transformation and Migration and its significance in managing digital assets. In this follow-up post, we’re taking a technical deep dive into Media2Cloud on AWS, which plays a crucial role in streamlining Old Tape Library (OTL) migrations by leveraging artificial intelligence. Keep reading to understand the four key technical components of Media2Cloud on AWS, see seven valuable reasons for migrating your media archive to AWS, and discover five ways that 2nd Watch can help make it a success.

What is Media2Cloud on AWS?

Media2Cloud on AWS is a serverless solution built on the AWS platform that automates digital media content ingestion, analysis, and organization within the media supply chain. By leveraging AWS services like AWS Step Functions, AWS Lambda, Amazon S3, Amazon Rekognition, Amazon Transcribe, and Amazon Comprehend, Media2Cloud enables efficient cloud migration and management of digital assets.

4 Key Technical Components of Media2Cloud on AWS

  1. Ingestion Workflow: The ingestion workflow is initiated by uploading media files to an Amazon Simple Storage Service (S3) bucket. AWS Lambda triggers the processing pipeline, which includes Amazon Rekognition and Amazon Transcribe for metadata extraction using machine learning. AWS Step Functions coordinate the overall workflow, ensuring the extracted metadata is indexed in Amazon OpenSearch Service for easy search and retrieval.
  2. Analysis Workflow: The analysis workflow is powered by AWS Lambda and Amazon AI services. Some examples include using Amazon Rekognition to verify identity with facial analysis, applying Amazon Transcribe to convert speech to text, and implementing Amazon Comprehend, a natural-language processing (NLP) service for sentiment analysis. The metadata is indexed in Amazon OpenSearch Service, allowing you to build custom applications for advanced media search and retrieval.
  3. Media Asset Management: Media2Cloud on AWS relies on Amazon S3 for storing media assets and their associated metadata. This provides a scalable, durable, and cost-effective storage solution, ensuring your digital assets are always accessible and protected as part of an effective media management system.
  4. AWS CloudFormation: Media2Cloud on AWS utilizes AWS CloudFormation to automate the deployment of the solution and its components. This makes deploying and configuring the solution easy, enabling you to focus on migrating and managing your digital assets.
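The ingestion trigger described in step 1 can be sketched as a minimal Lambda handler. This is an illustrative sketch, not the actual Media2Cloud code: the state machine ARN is a placeholder, the payload is simplified, and the deployed solution passes a richer set of metadata to its workflows.

```python
import json
import urllib.parse

# Placeholder ARN; in practice this points at the ingest state machine
# deployed by the Media2Cloud CloudFormation stack.
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:ingest-main"

def build_execution_input(event):
    """Extract the uploaded object's location from an S3 event notification."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    # Object keys arrive URL-encoded in S3 event notifications.
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    return {"input": {"bucket": bucket, "key": key}}

def lambda_handler(event, context):
    """Start the ingest state machine for each object uploaded to the bucket."""
    import boto3  # deferred so the pure helper can be tested without AWS credentials
    sfn = boto3.client("stepfunctions")
    execution_input = build_execution_input(event)
    response = sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps(execution_input),
    )
    return {"executionArn": response["executionArn"]}
```

From there, AWS Step Functions drives the Rekognition, Transcribe, and Comprehend steps and indexes the results in Amazon OpenSearch Service.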

7 Reasons to Migrate Your Media Archive to AWS

  1. Enhanced Data Protection: Storing valuable media assets on hard drives or old tapes poses significant risks, such as data loss due to hardware failure, damage, or obsolescence. By migrating to Media2Cloud on AWS, you safeguard your assets in a dependable and secure infrastructure with built-in redundancies, encryption, and access control mechanisms.
  2. Improved Accessibility and Collaboration: Migrating your media archive to AWS allows you to centralize your assets, making them easily accessible to teams across the globe. Without the limitations imposed by physical storage, you enhance collaboration, ensuring that the right content is available for the right people at the right time.
  3. Scalability and Flexibility: Media2Cloud on AWS offers virtually unlimited storage, allowing your media library to grow without constraints. Moreover, its flexible infrastructure accommodates changing business needs and technology advancements, so assets remain relevant and accessible long-term.
  4. Cost Efficiency: By migrating to Media2Cloud on AWS, you eliminate the overhead costs associated with maintaining physical storage infrastructure. With the pay-as-you-go model, you only pay for the storage and services you use, delivering significant cost savings.
  5. Advanced AI-driven Workflows: Media2Cloud on AWS integrates with powerful AI-driven AWS services, such as Amazon Rekognition, Amazon Transcribe, and Amazon Comprehend. These services enable advanced workflows for image recognition and video analysis, speech-to-text conversion, and sentiment extraction, thereby unlocking new levels of value and insights from your media assets.
  6. Revenue Generation through Monetization of Assets: By enabling your team to search and find relevant assets easily, you maximize the utilization of your media library and monetize more of your content. This can include licensing, repurposing, or distributing previously underutilized assets, ultimately contributing to your organization’s bottom line.
  7. AWS Migration Acceleration Program (MAP): AWS’s MAP is designed to help organizations offset the costs associated with cloud migration. By participating in the program, you gain access to AWS credits, specialized training, and expert support to accelerate the migration process. This makes moving to Media2Cloud on AWS even more cost-effective and feasible for organizations of all sizes.

5 Ways 2nd Watch Can Help

As an experienced AWS Premier Partner, 2nd Watch can manage the entire Digital Archive Transformation and Migration process for you. Our team of experts handles all the heavy lifting, including leveraging specialized equipment to transfer your media assets from tapes or hard drives to the AWS Cloud. Our end-to-end migration services include the following:

  1. Assessment and Strategy Development: We analyze your existing media archive and develop a comprehensive migration strategy tailored to your organization’s needs and goals.
  2. AWS Landing Zone Deployment: Managing the intricacies of AWS can be daunting. To adhere to best practices, we devised an architectural blueprint called the AWS Landing Zone. This framework creates a secure, multi-account AWS environment that lays a robust foundation for media asset migration. As part of the Landing Zone, we can deploy a tailored version of the Media2Cloud solution on AWS with additional features to meet your specific requirements.
  3. Physical-to-Digital Migration: Our team handles the physical-to-digital conversion of your media assets using specialized equipment and best practices for a smooth and efficient transfer.
  4. Cloud Migration and Integration: Once your assets are digitized, we migrate them to Media2Cloud on AWS and integrate them with your existing systems and workflows.
  5. Optimization and Ongoing Support: After the migration is complete, we help optimize media workflows and provide ongoing support to ensure media assets remain secure, accessible, and valuable.

Digital Archive Transformation and Migration is a complex and technical process, but the benefits of adopting a solution like Media2Cloud on AWS are worth the effort. Leveraging the expertise of a trusted cloud partner like 2nd Watch guarantees a smooth transition to AWS, so your organization can fully harness the power of your digital assets. Contact us today to learn more about how we can help you embrace the future of media management.

Delivering Live Broadcasts and Playout with AWS

Live broadcasts and playouts are critical requirements for media companies to satisfy their audiences. More customers are cutting cords and watching live channels on the web, mobile, tablets, and smart TVs. As a result, media companies are pressured to bring additional channels to market and scale up their delivery capabilities.

Amazon Web Services (AWS) gives media companies a platform for delivering live broadcasts and playouts. AWS offers various services to help media companies design, build, and manage a scalable, reliable, cost-effective live broadcast and playout solution. These services include AWS Elemental MediaLive, a real-time video encoding service; Amazon CloudFront, a content delivery network (CDN); and AWS Elemental MediaConnect, a transport service for live video streams. In addition, partnerships with leading media and entertainment technology companies – such as Amagi, BCNEXXT, Grass Valley, and Imagine Communications – can provide expertise and support in implementing and managing a live broadcast and playout solution on AWS.

Here is an example Terraform template that creates a MediaLive channel that receives an SRT input and outputs HLS to CloudFront:
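Since the exact configuration depends on your environment, the sketch below shows only the key resources. Resource names, the bucket name, and all settings are illustrative placeholders; the MediaLive encoder settings are abbreviated, and a working template also needs the full encoder configuration and an IAM role for MediaLive.

```hcl
# Illustrative sketch only; names and settings are placeholders.

resource "aws_s3_bucket" "hls_output" {
  bucket = "example-live-hls-output"
}

# SRT caller input; the address and port of the SRT source are
# configured on the input.
resource "aws_medialive_input" "srt" {
  name = "example-srt-input"
  type = "SRT_CALLER"
}

resource "aws_medialive_channel" "live" {
  name          = "example-live-channel"
  channel_class = "SINGLE_PIPELINE"
  role_arn      = aws_iam_role.medialive.arn # role granting MediaLive write access to S3

  input_specification {
    codec            = "AVC"
    input_resolution = "HD"
    maximum_bitrate  = "MAX_10_MBPS"
  }

  input_attachments {
    input_attachment_name = "srt-input"
    input_id              = aws_medialive_input.srt.id
  }

  destinations {
    id = "hls-destination"
    settings {
      url = "s3://${aws_s3_bucket.hls_output.id}/live/index"
    }
  }

  encoder_settings {
    # Video/audio descriptions and an HLS output group that
    # references "hls-destination" go here.
  }
}

resource "aws_cloudfront_distribution" "hls" {
  enabled = true

  origin {
    domain_name = aws_s3_bucket.hls_output.bucket_regional_domain_name
    origin_id   = "hls-s3-origin"
    # In practice, restrict bucket access with origin access control.
  }

  default_cache_behavior {
    target_origin_id       = "hls-s3-origin"
    viewer_protocol_policy = "redirect-to-https"
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```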

This template creates a MediaLive channel with an SRT input and an HLS output. The HLS output is written to an S3 bucket, and a CloudFront distribution is created to serve the HLS output to users.

This is only an example; you will need to customize it for your specific use case. For instance, you must specify the SRT input settings, such as the IP address and port of the SRT source, along with the details of the S3 bucket and CloudFront distribution.

2nd Watch is an experienced AWS partner with the knowledge and resources to support media companies in designing, building, and managing an effective live broadcast and playout solution on AWS. Our team is familiar with a range of AWS services, including AWS Elemental MediaLive, Amazon CloudFront, and AWS Elemental MediaConnect, as well as partnerships with leading media and entertainment technology companies. Contact us today to learn more about our consulting services for live broadcast and playout on AWS.

Aleksander Hansson | Specialist Solutions Architect | 2nd Watch


Data Clean Rooms: Share Your Corporate Data Fearlessly

Data sharing has become more complex, both in its application and our relationship to it. There is a tension between the need for personalization and the need for privacy. Businesses must share data to be effective and ultimately provide tailored customer experiences. However, legislation and practices regarding data privacy have tightened, and data sharing is tougher and fraught with greater compliance constraints than ever before. The challenge for enterprises is reconciling the increased demand for data with increased data protection.

Data Clean Rooms

The modern world runs on data. Companies share data to facilitate their daily operations. Data distribution occurs between business departments and external third parties. Even something as innocuous as exchanging Microsoft Excel and Google Sheets spreadsheets is data sharing!

Data collaboration is entrenched in our business processes. Therefore, rather than avoiding it, we must find the tools and frameworks to support secure and privacy-compliant data sharing. So how do we govern the flow of sensitive information from our data platforms to other parties?

The answer: data clean rooms. Data clean rooms are the modern vehicle for various data sharing and data governance workflows. Across industries – including media and entertainment, advertising, insurance, private equity, and more – a data clean room can be the difference-maker in your data insights.

Ready to get started with a data clean room solution? Schedule time to talk with a 2nd Watch data expert.

What is a data clean room?

There is a classic thought experiment, known as Yao’s Millionaires’ Problem, wherein two millionaires want to find out who is richer without actually sharing how much money they are individually worth. The data clean room solves this issue by allowing parties to ask approved questions, which require external data to answer, without actually sharing the sensitive information itself!

In other words, a data clean room is a framework that allows two parties to securely share and analyze data by granting both parties control over when, where, and how said data is used. The parties involved can pool together data in a secure environment that protects private details. With data clean rooms, brands can access crucial and much-needed information while maintaining compliance with data privacy policies.

Data clean rooms have been around for about five years, with Google being the first company to launch a data clean room solution (Google Ads Data Hub) in 2017. The era of user privacy kicked off in 2018 when data protection and privacy became law, most notably with the General Data Protection Regulation (GDPR).

This was a huge shake-up for most brands. Businesses had to adapt their data collection and sharing models to operate within the scope of the new legislation and the walled gardens that became popular amongst all tech giants. With user privacy becoming a priority, data sharing has become stricter and more scrutinized, which makes marketing campaign measurements and optimizations in the customer journey more difficult than ever before.

Data clean rooms are crucial for brands navigating the era of consumer protection and privacy. Brands can still gain meaningful marketing insights and operate within data privacy laws in a data clean room.

Data clean rooms work because the parties involved have full control over their data. Each party agrees upon access, availability, and data usage, while a trusted data clean room offering oversees data governance. This yields the secure framework needed to ensure that one party cannot access the other’s data and upholds the foundational rule that individual, user-level data cannot be shared between different parties without consent.

Personally identifiable information (PII) remains anonymized and is processed and stored in a way that is not exposed to any of the parties involved. Thus, data sharing within a data clean room complies with privacy regulations such as the GDPR and the California Consumer Privacy Act (CCPA).

How does a data clean room work?

Let’s take a deeper dive into the functionality of a data clean room. Four components are involved with a data clean room:

#1 – Data ingestion
Data is funneled into the data clean room. This can be first-party data (generated from websites, applications, CRMs, etc.) or second-party data from collaborating parties (such as ad networks, partners, publishers, etc.).

#2 – Connection and enrichment
The ingested data sets are matched at the user level. Tools like third-party data enrichment complement the data sets.

#3 – Analytics
The data is analyzed for intersections/overlaps, measurement and attribution, and propensity scoring. Data will only be shared where the data points intersect between the two parties.

#4 – Application
Once the data has finished its data clean room journey, each party will have aggregated data outputs. These outputs provide the business insights needed to accomplish crucial tasks such as optimizing the customer experience, performing reach and frequency measurements, building effective cross-platform journeys, and conducting deep marketing campaign analyses.
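The analytics step can be illustrated with a toy sketch. The hashing scheme, shared salt, and cohort threshold here are simplified assumptions (production clean rooms run inside a governed platform rather than shared code), but the principle is the same: each party pseudonymizes its identifiers, and only aggregated overlap figures, never the underlying rows, leave the room.

```python
import hashlib

def pseudonymize(user_ids, salt):
    """Hash raw identifiers so neither party sees the other's raw PII."""
    return {hashlib.sha256((salt + uid).encode()).hexdigest() for uid in user_ids}

def overlap_report(party_a_ids, party_b_ids, salt, min_cohort=50):
    """Return only an aggregated overlap count, suppressing small cohorts."""
    a = pseudonymize(party_a_ids, salt)
    b = pseudonymize(party_b_ids, salt)
    matched = len(a & b)
    # Clean rooms typically enforce a minimum aggregation threshold so
    # that no individual user can be singled out from the output.
    if matched < min_cohort:
        return {"matched_users": None, "note": "below minimum cohort size"}
    return {"matched_users": matched}
```

For example, two partners could each feed in hashed customer email addresses and learn only how many customers they share, which is exactly the kind of aggregated output described in step 4.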

What are the benefits of a data clean room?

Data clean rooms can benefit businesses in any industry, including media, retail, and advertising. In summary, data clean rooms are beneficial for the following reasons:

You can enrich your partner’s data set.
With data clean rooms, you can collaborate with your partners to produce and consume data regarding overlapping customers. You can pool common customer data with your partners, find the intersection between your business and your partners, and share the data upstream without sharing sensitive information with competitors. An example would be sharing demand and sales information with an advertising partner for better-targeted marketing campaigns.

You can create governance within your enterprise.
Data clean rooms provide the framework to achieve the elusive “single source of truth.” You can create a golden record encompassing all the data in every system of record within your organization. This includes sensitive PII such as social security numbers, passport numbers, financial account numbers, transactional data, etc.

You can remain policy compliant.
In a data clean room environment, you can monitor where the data lives, who has access to it, and how it is used. Think of it as an automated middleman that validates requests for data. This allows you to share data and remain compliant with all the important acronyms: GDPR, HIPAA, CCPA, FCRA, ECPA, etc.

But you have to do it right…

With every data security and analytics initiative, there is a set of risks if the implementation is not done correctly. A truly “clean” data clean room will allow you to unlock data for your users while remaining privacy compliant. You can maintain role-based access, tokenized columns, and row-level security – which typically lock down particular data objects – and share these sensitive data sets quickly and in a governed way. Data clean rooms satisfy the need for efficient access and the need for the data producer to limit the consumer to relevant information for their use case.

Of course, there are consequences if your data clean room is actually “dirty.” Your data must be federated, and you need clarity on how your data is stored. If your room is dirty, you risk:

  • Loss of customer trust
  • Fines from government agencies
  • Inadvertently oversharing proprietary information
  • Locking out valuable data requests due to a lack of process

Despite the potential risks of utilizing a data clean room, it is the most promising solution to the challenges of data-sharing in a privacy-compliant way.


To get the most out of your data, your business needs to create secure processes to share data and decentralize your analytics. This means pooling together common data with your partners and distributing the work to create value for all parties involved.

However, you must govern your data. It is imperative to treat your data like an asset, especially in the era of user privacy and data protection. With data clean rooms, you can reconcile the need for data collaboration with the need for data ownership and privacy.

2nd Watch can be your data clean room guide, helping you to establish a data mesh that enables sharing and analyzing distributed pools of data, all while maintaining centralized governance. Schedule time to get started with a data clean room.

Fred Bliss | CTO, Data Insights | 2nd Watch