In this age of ever-expanding data security challenges, which have only increased with the mass move to remote workforces, data-centric organizations need to access data easily but securely. Enter ALTR: a cloud-native platform delivering Data Security as a Service (DSaaS) and helping companies optimize data consumption governance.
Not sure you need another tool in your toolkit? We’ll dive into ALTR’s benefits so you can see for yourself how this platform can help you get ahead of the next changes in data security, simplify processes and enterprise collaboration, and maximize your technology capabilities, all while staying in control of your budget.
How Does ALTR Work?
With ALTR, you’re able to track data consumption patterns and limit how much data can be consumed. Even better, it’s simple to implement, immediately adds value, and is easily scalable. You’ll be able to see data consumption patterns from day one and optimize your analytics while keeping your data secure.
ALTR delivers security across three key stages:
Observe – ALTR’s DSaaS platform offers critical visibility into your organization’s data consumption, including an audit record for each request for data. Observability is especially critical as you determine new levels of operational risk in today’s largely remote world.
Detect and Respond – You can use ALTR’s observability to understand typical data consumption for your organization and then determine areas of risk. With that baseline, you’re able to create highly specific data consumption policies. ALTR’s cloud-based policy engine then analyzes data requests to prevent security incidents in real time.
Protect – ALTR can tokenize data at its inception to secure data throughout its lifecycle. This ensures adherence to your governance policies. Plus, ALTR’s data consumption reporting can minimize existing compliance scope by assuring auditors that your policies are solid.
What Other Benefits Does ALTR Offer?
ALTR offers various integrations to enhance your data consumption governance:
Share data consumption records and security events with your favorite security information and event management (SIEM) software.
View securely shared data consumption information in Snowflake.
Analyze data consumption patterns in Domo.
ALTR delivers undeniable value through seamless integration with technologies like these, which you may already have in place; paired with the right consultant, the ROI is even more immediate. ALTR may be new to you, but an expert data analytics consulting firm like 2nd Watch is always investigating new technologies and can ease the implementation process. (And if you need more convincing, ALTR was selected as a finalist for Bank Director’s 2020 Best of FinXTech Awards.)
Dedicated consultants can more quickly integrate ALTR into your organization while your staff stays on top of daily operations. Consultants can then put the power in the hands of your business users to run their own reports, analyze data, and make data-driven decisions. Secure in the knowledge your data is protected, you can encourage innovation by granting more access to data when needed.
As a tech-agnostic company, 2nd Watch helps you find the right tools for your specific needs. Our consultants have a vast range of product expertise to make the most of the technology investments you’ve already made, to implement new solutions to improve your team’s function, and to ultimately help you compete with the companies of tomorrow. Reach out to us directly to find out if ALTR, or another DSaaS platform, could be right for your organization.
Data sharing has become more complex, both in its application and our relationship to it. There is a tension between the need for personalization and the need for privacy. Businesses must share data to be effective and ultimately provide tailored customer experiences. However, legislation and practices regarding data privacy have tightened, and data sharing is tougher and fraught with greater compliance constraints than ever before. The challenge for enterprises is reconciling the increased demand for data with increased data protection.
The modern world runs on data. Companies share data to facilitate their daily operations. Data distribution occurs between business departments and external third parties. Even something as innocuous as exchanging Microsoft Excel and Google Sheets spreadsheets is data sharing!
Data collaboration is entrenched in our business processes. Therefore, rather than avoiding it, we must find the tools and frameworks to support secure and privacy-compliant data sharing. So how do we govern the flow of sensitive information from our data platforms to other parties?
The answer: data clean rooms. Data clean rooms are the modern vehicle for various data sharing and data governance workflows. Across industries – including media and entertainment, advertising, insurance, private equity, and more – a data clean room can be the difference-maker in your data insights.
There is a classic thought experiment (Yao’s Millionaires’ Problem) wherein two millionaires want to find out who is richer without actually sharing how much money they are individually worth. The data clean room solves this kind of problem by allowing parties to ask approved questions that require external data to answer, without actually sharing the sensitive information itself!
In other words, a data clean room is a framework that allows two parties to securely share and analyze data by granting both parties control over when, where, and how said data is used. The parties involved can pool together data in a secure environment that protects private details. With data clean rooms, brands can access crucial and much-needed information while maintaining compliance with data privacy policies.
Data clean rooms have been around for about five years, with Google being the first company to launch a data clean room solution (Google Ads Data Hub) in 2017. The era of user privacy kicked off in 2018, when data protection and privacy became law, most notably with the General Data Protection Regulation (GDPR).
This was a huge shake-up for most brands. Businesses had to adapt their data collection and sharing models to operate within the scope of the new legislation and the walled gardens that became popular amongst all tech giants. With user privacy becoming a priority, data sharing has become stricter and more scrutinized, which makes marketing campaign measurements and optimizations in the customer journey more difficult than ever before.
Data clean rooms are crucial for brands navigating the era of consumer protection and privacy. Brands can still gain meaningful marketing insights and operate within data privacy laws in a data clean room.
Data clean rooms work because the parties involved have full control over their data. Each party agrees upon access, availability, and data usage, while a trusted data clean room offering oversees data governance. This yields the secure framework needed to ensure that one party cannot access the other’s data, and it upholds the foundational rule that individual, user-level data cannot be shared between different parties without consent.
Personally identifiable information (PII) remains anonymized and is processed and stored in a way that it is not exposed to any of the parties involved. Thus, data sharing within a data clean room complies with privacy policies such as the GDPR and the California Consumer Privacy Act (CCPA).
How does a data clean room work?
Let’s take a deeper dive into the functionality of a data clean room. Four components are involved with a data clean room:
#1 – Data ingestion
Data is funneled into the data clean room. This can be first-party data (generated from websites, applications, CRMs, etc.) or second-party data from collaborating parties (such as ad networks, partners, publishers, etc.)
#2 – Connection and enrichment
The ingested data sets are matched at the user level. Tools like third-party data enrichment complement the data sets.
#3 – Analytics
The data is analyzed to determine intersections/overlaps, perform measurement and attribution, and calculate propensity scores. Data will only be shared where the data points intersect between the two parties.
#4 – Application
Once the data has finished its clean room journey, each party will have aggregated data outputs. These outputs provide the business insights needed to accomplish crucial tasks such as optimizing the customer experience, performing reach and frequency measurements, building effective cross-platform journeys, and conducting deep marketing campaign analyses.
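In a SQL-based clean room (Snowflake is one common foundation), the analytics step described above often boils down to aggregate-only queries over matched identifiers. The sketch below is hypothetical: the table and column names are invented, and it assumes both parties have loaded records keyed by a salted hash of a shared identifier so that raw emails never enter the shared environment.

```sql
-- Hypothetical clean-room overlap query: each party has loaded customer
-- records keyed by a salted hash of the email address, so raw identifiers
-- never enter the shared environment.
SELECT
    COUNT(DISTINCT a.email_hash) AS overlapping_customers
FROM party_a_customers a
JOIN party_b_customers b
    ON a.email_hash = b.email_hash;

-- Aggregate-only outputs: the clean room returns counts and scores,
-- never the underlying user-level rows.
```

A production clean room would additionally enforce that only pre-approved query shapes run and that results below a minimum audience size are suppressed.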
What are the benefits of a data clean room?
Data clean rooms can benefit businesses in any industry, including media, retail, and advertising. In summary, data clean rooms are beneficial for the following reasons:
You can enrich your partner’s data set.
With data clean rooms, you can collaborate with your partners to produce and consume data regarding overlapping customers. You can pool common customer data with your partners, find the intersection between your business and your partners, and share the data upstream without sharing sensitive information with competitors. An example would be sharing demand and sales information with an advertising partner for better-targeted marketing campaigns.
You can create governance within your enterprise.
Data clean rooms provide the framework to achieve the elusive “single source of truth.” You can create a golden record encompassing all the data in every system of record within your organization. This includes sensitive PII such as social security numbers, passport numbers, financial account numbers, transactional data, etc.
You can remain policy compliant.
In a data clean room environment, you can monitor where the data lives, who has access to it, and how it is used. Think of it as an automated middleman that validates requests for data. This allows you to share data and remain compliant with all the important acronyms: GDPR, HIPAA, CCPA, FCRA, ECPA, etc.
But you have to do it right…
With every data security and analytics initiative, there is a set of risks if the implementation is not done correctly. A truly “clean” data clean room will allow you to unlock data for your users while remaining privacy compliant. You can maintain role-based access, tokenized columns, and row-level security – which typically lock down particular data objects – and share these sensitive data sets quickly and in a governed way. Data clean rooms satisfy the need for efficient access and the need for the data producer to limit the consumer to relevant information for their use case.
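As a hedged illustration of the row-level security mentioned above, here is roughly what a Snowflake row access policy using a mapping-table pattern can look like. The policy, table, role, and column names here are hypothetical, invented for the example.

```sql
-- Hypothetical row access policy: each consuming role sees only the rows
-- mapped to it in partner_role_map; an internal admin role sees everything.
CREATE ROW ACCESS POLICY partner_filter AS (partner_id STRING)
RETURNS BOOLEAN ->
    CURRENT_ROLE() = 'DATA_ADMIN'
    OR EXISTS (
        SELECT 1
        FROM partner_role_map m
        WHERE m.role_name = CURRENT_ROLE()
          AND m.partner_id = partner_id
    );

-- Attach the policy so it filters every query against the shared table.
ALTER TABLE shared_sales ADD ROW ACCESS POLICY partner_filter ON (partner_id);
```

The mapping-table approach keeps the sharing rules themselves in data, so granting a partner access to more rows is an insert, not a policy rewrite.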
Of course, there are consequences if your data clean room is actually “dirty.” Your data must be federated, and you need clarity on how your data is stored. If your room is dirty, you risk:
Loss of customer trust
Fines from government agencies
Inadvertently oversharing proprietary information
Locking out valuable data requests due to a lack of process
Despite the potential risks of utilizing a data clean room, it is the most promising solution to the challenges of data-sharing in a privacy-compliant way.
To get the most out of your data, your business needs to create secure processes to share data and decentralize your analytics. This means pooling together common data with your partners and distributing the work to create value for all parties involved.
However, you must govern your data. It is imperative to treat your data like an asset, especially in the era of user privacy and data protection. With data clean rooms, you can reconcile the need for data collaboration with the need for data ownership and privacy.
2nd Watch can be your data clean room guide, helping you to establish a data mesh that enables sharing and analyzing distributed pools of data, all while maintaining centralized governance. Schedule time to get started with a data clean room.
Data governance is a broad-ranging discipline that affects everyone in an organization, whether directly or indirectly. It is most often employed to improve and consistently manage data through deduplication and standardization, among other activities, and can have a significant and sustained effect on reducing operational costs, increasing sales, or both.
Data governance can also be part of a more extensive master data management (MDM) program. The MDM program an organization chooses and how they implement it depends on the issues they face and both their short- and long-term visions.
For example, in the insurance industry, many companies sell various types of insurance policies that renew annually over a number of years, such as industrial property coverages and workers’ compensation casualty coverages. Two sets of underwriters will more than likely underwrite the business. Having two sets of underwriters using data systems specific to their lines of business is an advantage when meeting the coverage needs of their customers, but it often becomes a disadvantage when considering all of the data together. It doesn’t have to be, though.
The disadvantage arises when an agent or account executive needs to know the overall status of a client, including long-term profitability during all the years of coverage. This involves pulling data from policy systems, claims systems, and customer support systems. An analyst may be tasked with producing a client report for the agent or account executive to truly understand their client and make better decisions on both the client and company’s behalf. But the analyst may not know where the data is stored, who owns the data, or how to link clients across disparate systems.
Fifteen years ago, this task was very time-consuming and even five years ago was still quite cumbersome. Today, however, this issue can be mitigated with the correct data governance plan. We will go deeper into data governance and MDM in upcoming posts; but for this one, we want to show you how innovators like Snowflake are helping the cause.
What is data governance?
Data governance ensures that data is consistent, accurate, and reliable, which allows for informed and effective decision-making. This can be achieved by centralizing the data into one location from a few or many siloed locations. Ensuring that data is accessible in one location enables data users to understand and analyze it and make effective decisions. One way to accomplish this centralization of data is to implement the Snowflake Data Cloud.
Snowflake not only enables a company to store their data inexpensively and query the data for analytics, but it can foster data governance. Dynamic data masking and object tagging are two new features from Snowflake that can supplement a company’s data governance initiative.
What is dynamic data masking?
Dynamic data masking is a Snowflake security feature that selectively masks plain-text data in table or view columns at query time, based on predefined masking policies. The purpose of masking or hiding data in specific columns is to ensure that data is accessed on a need-to-know basis. This kind of data is most likely sensitive and doesn’t need to be accessed by every user.
When is dynamic data masking used?
Data masking is usually implemented to protect personally identifiable information (PII), such as a person’s social security number, phone number, home address, or date of birth. An insurance company would likely want to reduce risk by hiding data pertaining to sensitive information if they don’t believe access to the data is necessary for conducting analysis.
However, data masking can also be used for non-production environments where testing needs to be conducted on an application. The users testing the environment wouldn’t need to know specific data if their role is just to test the environment and application. Additionally, data masking may be used to adhere to compliance requirements like HIPAA.
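To make this concrete, a Snowflake masking policy is defined once and then attached to the columns it should protect. The sketch below is illustrative only; the role, policy, table, and column names are hypothetical.

```sql
-- Hypothetical masking policy: the full SSN is returned only for a
-- privileged compliance role; every other role sees a masked value.
CREATE MASKING POLICY ssn_mask AS (val STRING)
RETURNS STRING ->
    CASE
        WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
        ELSE '***-**-****'
    END;

-- Attach the policy to the sensitive column; masking then happens
-- automatically at query time, with no changes to the stored data.
ALTER TABLE customers MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;
```

Because the policy is evaluated dynamically per query, analysts and testers can run the same SQL as privileged users and simply see masked results.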
What is object tagging?
Another resource for data governance within Snowflake is object tagging. Object tagging enables data stewards to track sensitive data for compliance and discovery, as well as group desired objects such as warehouses, databases, tables or views, and columns.
When a tag is created for a table, view, or column, data stewards can determine if the data should be fully masked, partially masked, or unmasked. When tags are associated with a warehouse, a user with the tag role can view the resource usage of the warehouse to determine what, when, and how this object is being utilized.
When is object tagging used?
There are several instances where object tagging can be useful. One use would be applying a “PII” tag to a column and adding extra text to describe the type of PII data located there. Another would be creating a tag for a warehouse dedicated to the sales department, enabling you to track usage and deduce why a specific warehouse is being used.
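For illustration, creating and applying tags in Snowflake looks roughly like the following. The tag names, allowed values, and object names here are hypothetical.

```sql
-- Hypothetical tags: classify a column as PII and label a warehouse
-- by the department that owns it.
CREATE TAG pii_type ALLOWED_VALUES 'ssn', 'phone', 'address';
CREATE TAG department;

-- Mark the SSN column so stewards can discover and track it.
ALTER TABLE customers MODIFY COLUMN ssn SET TAG pii_type = 'ssn';

-- Label the sales team's warehouse to attribute its resource usage.
ALTER WAREHOUSE sales_wh SET TAG department = 'sales';
```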
Where can data governance be applied?
Data governance applies to many industries that maintain a vast amount of data from their systems, including healthcare, supply chain and logistics, and insurance; and an effective data governance strategy may use data masking and object tagging in conjunction with each other.
As previously mentioned, one common use case for data masking is for insurance customers’ PII. Normally, analysts wouldn’t need to analyze the personal information of a customer to uncover useful information leading to key business decisions. Therefore, the administrator would be able to mask columns for the customer’s name, phone number, address, social security number, and account number without interfering with analysis.
Object tagging is also valuable within the insurance industry as there is such a vast amount of data collected and consumed. A strong percentage of that data is sensitive information. Because there is so much data and it can be difficult to track those individual pieces of information, Snowflake’s object tagging feature can help with identifying and tracking the usage of those sensitive values for the business user.
Using dynamic data masking and object tagging together, you will be able to gain insight into where your sensitive data lives and how heavily specific warehouses, tables, or columns are being used.
Think back to the situation we mentioned earlier, where the property coverage sales department is on legacy system X while the workers’ compensation sales department is on another legacy system Y. How are you supposed to create a report to understand the profitability of these two departments?
One option is to use Snowflake to store all of the data from both legacy systems. Once the information is in the Snowflake environment, object tagging would allow you to tag the databases or tables that involve data about their respective departments. One tag can be specified for property coverage and another tag can be set for workers’ compensation data. When you’re tasked with creating a report of profitability involving these two departments, you can easily identify which information can be used. Because the tag was applied to the database, it will also be applied to all of the tables and their respective columns. You would be able to understand what columns are being used. After the data from both departments is accessible within Snowflake, data masking can then be used to ensure that the new data is only truly accessible to those who need it.
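A hedged sketch of that workflow in Snowflake might look like the following. The database, tag, and value names are invented for the example, and the account usage view typically reflects new tag associations only after a latency of up to a couple of hours.

```sql
-- Hypothetical setup for the two-department scenario: tag each legacy
-- system's database; tables and columns inside inherit the tag.
CREATE TAG line_of_business;

ALTER DATABASE legacy_x_property SET TAG line_of_business = 'property';
ALTER DATABASE legacy_y_workers_comp SET TAG line_of_business = 'workers_comp';

-- Locate every object and column that carries the tag when building
-- the cross-department profitability report.
SELECT object_database, object_name, column_name, tag_value
FROM snowflake.account_usage.tag_references
WHERE tag_name = 'LINE_OF_BUSINESS';
```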
This was just a small introduction to data governance and the new features that Snowflake has available to enable this effort. Don’t forget that this data governance effort can be a part of a larger, more intricate MDM initiative. In other blog posts, we touch more on MDM and other data governance capabilities to maintain and standardize your data, helping you make the most accurate and beneficial business decisions. If you have any questions in the meantime, feel free to get in touch.
It has been said that the “hero of a successful digital transformation is GRC.” The ISACA website states, “to successfully manage the risk in digital transformation you need a modern approach to governance, risk and regulatory compliance.” For GRC program development, it is important to understand the health information technology resources and tools available to enable long term success.
What is GRC and why is it important?
According to the HIPAA Journal, the average cost of a healthcare data breach is now $9.42 million. In the first half of 2021, 351 significant data breaches were reported, affecting nearly 28 million individuals. The need for effective information security and controls has never been more acute among healthcare providers, insurers, and biotechnology and health research companies. Protecting sensitive data and establishing a firm security posture is essential. Improving health care and reducing cost rely on structured approaches and thoughtful implementation of available technologies to help govern data and mitigate risk across the enterprise.
Effective and efficient management of governance, risk, and compliance, or GRC, is fast becoming a business priority across industries. Leaders at hospitals and health systems of all sizes are looking for ways to build operating strategies that harmonize and enhance efforts for GRC. Essential to that mission are effective data governance, risk management, regulatory compliance, business continuity management, project governance, and security. But rather than stand-alone or siloed security or compliance efforts, a cohesive program coupled with GRC solutions allow for organizational leaders to address the multitude of challenges more effectively and efficiently.
What are the goals for I.T. GRC?
For GRC efforts, leaders are looking to:
Safeguard protected healthcare data
Meet and maintain compliance with evolving regulatory mandates and standards
Identify, mitigate, and prevent risk
Reduce operational friction
Build in and utilize best practices
Managing governance, risk, and compliance in healthcare enterprises is a daunting task. GRC implementation for healthcare risk managers can be difficult, especially during this time of rapid digital and cloud transformation. But relying on internal legacy methods and tools leads to the same issues that have been seen on-premises, stifling innovation and improvement. As organizations adapt to cloud environments as a key element of digital transformation and integrated health care, leaders are realizing that now is the time to leverage the technology to implement GRC frameworks that accelerate their progress toward positive outcomes. What’s needed is expertise and a clear roadmap to success.
Cloud Automation of GRC
The road to success starts with a framework, aligned to business objectives, that provides cloud automation of Governance, Risk, and Compliance. Breaking this into three distinct phases, ideally this would involve:
Building a Solid Foundation – within the cloud environment, ensuring infrastructure and applications are secured before they are deployed.
Image/Operating System hardening automation pipelines.
Infrastructure Deployment Automation Pipelines including Policy as Code to meet governance requirements.
CI/CD Pipelines including Code Quality and Code Security.
Disaster Recovery as a Service (DRaaS) meeting the organization’s Business Continuity Planning requirements.
Configuration Management to allow automatic remediation of your applications and operating systems.
Cost Management strategies with showback and chargeback implementation.
Automatic deployment and enforcement of standard security tools, including file integrity monitoring (FIM), IDS/IPS, antivirus, and anti-malware tooling.
IAM integration for authorization and authentication with platforms such as Active Directory, Okta, and PingFederate, allowing for more granular control over users and elevated privileges in the clouds.
Reference Architectures created for the majority of the organization’s needs, pre-approved with security baked in, to be used in the infrastructure pipelines.
Self-service CMDB integration with tools such as ServiceNow, Remedy, and Jira Service Desk, allowing business units to provision their own infrastructure while providing the proper governance guardrails.
Resilient Architecture designs
Proper Configuration and Maintenance – Infrastructure misconfiguration is the leading cause of data breaches in the cloud, and a big reason misconfiguration happens is infrastructure configuration “drift,” or change that occurs in a cloud environment post-provisioning. Using automation to monitor and self-remediate the environment will ensure the cloud environment stays in the proper configuration, eliminating the largest cause of incidents. Since workloads will live most of their life in this phase, it is important to ensure there isn’t any drift from the original secure deployment. An effective program will need:
Cloud Integrity Monitoring using cloud native tooling.
Log Management and Monitoring with centralized logging, critical in a well-designed environment.
Managed Services including patching to resolve issues.
SLAs to address incidents and quickly get them resolved.
Cost Management to ensure that budgets are met and there are no runaway costs.
Perimeter security utilizing cloud native and 3rd party security appliance and services.
Use of Industry-Leading Tools – for risk assessment, reporting, verification, and remediation. These tools thwart future problems and provide evidence to stakeholders that the cloud environment is rock solid. Tools and verification components would include:
Risk Registry integration into tools
Future attestations (BAAs)
Audit evidence generation
Where do you go from here?
Your organization needs to innovate faster and drive value with the confidence of remaining in compliance. You need to get to a proactive state instead of being reactive. Consider an assessment to help you evaluate your organization’s place in the cloud journey and how the disparate forms of data in the organization are collected, controlled, processed, stored, and protected.
Start with an assessment that includes:
Identification of security gaps
Identification of foundational gaps
Managed service provider onboarding plan
A Phase Two (Foundational/Remediation) proposal and Statement of Work
About 2nd Watch
2nd Watch is a trusted and proven partner, providing deep skills and advisory to leading organizations for over a decade. We earned a client Net Promoter Score of 85, a good way of telling you that our customers nearly always recommend us to others. We can help your organization with cloud native solutions. We offer skills in the following areas:
Developing cloud first strategies
Migration of workloads to the cloud
Implementing automation for governance and security guardrails
Implementing compliance controls and processes
Pipelines for data, infrastructure and application deployment
Cloud adoption throughout all industries has become incredibly pervasive in recent years. With cloud management as a relatively newer concept, business organizations may struggle to understand each aspect that is required to effectively run a cloud environment. One aspect that should be involved at every layer of the cloud is security, yet many organizations fail to implement a strong security system in their cloud until an attack happens and it is too late.
A cloud environment, and the controls necessary to orchestrate a robust security and governance platform within it, are not the same as your traditional on-premises environment.
The State of Cloud Security Today
As beneficial as the public cloud is for companies globally today, lack of security in the cloud can be a major issue. A report from Sophos indicated that most cloud attacks stem simply from misconfigurations of organizations’ cloud security; thus, they can be prevented if the environment is configured and managed properly. Orca Security’s 2020 State of Public Cloud Security Report revealed that 80.7% of organizations have at least one neglected, internet-facing workload – meaning the OS is unsupported or unpatched. Attackers can use one small vulnerability as leverage to move across an organization, which is how most data breaches occur.
Managed cloud security services help lay a strong foundation for security in the cloud that is automated and continuous with 24/7 management. With constant management, threats and attacks are detected before they cause damage, and your business avoids the repercussions that come with security misconfigurations.
What are managed cloud security services?
Managed cloud security services provide security configurations, automation, 24/7 management, and reporting from an external cloud security provider. Without this protection, an attack can result in downtime and the loss of money and data. Additionally, the lack of a well-rounded security system can lead to regulatory compliance challenges.
Monitoring and maintaining strong security requires continuous attention to be effective. Employing a managed security service gives businesses the protection they need while simultaneously providing IT departments with additional time to focus on other business concerns. Redirecting cybersecurity efforts to an external provider not only provides IT departments with flexibility, but also reduces costs compared to handling cybersecurity in house. Managing cybersecurity independently creates costs such as staffing, software licensing, hardware, implementation, and management. All the costs and management required for effective security can be overwhelming, and a managed security service takes the weight of maintaining the security of your data off your shoulders.
What are the benefits of using cloud security services?
Implementing strong cloud security may seem like an obvious choice for a business to make, but many businesses may not want to devote the time, resources, or money to building and maintaining a strong cybersecurity system. Investing your resources into cloud security is imperative for your business and pays off in the long run.
Five different benefits resulting from a strong cloud security system include:
Automation: Once your configurations have been set up, there is reduced reliance on human intervention. This minimizes time spent managing security while also reducing the risk for error.
Efficiency: Cloud services improve the security of your data and maintain regulatory compliance through timely patching and automated updates with less downtime.
Safety: Data is well-protected with cloud security due to 24/7 monitoring and real-time threat detection.
Proactive Defense: Threats are identified quickly and addressed proactively, limiting impact should an incident occur in the cloud.
Cost-effective: The cloud requires a unique approach to security. While managed cloud security services can seem costly upfront, they prove to be worthwhile in the long run by utilizing expertise that may not be available in-house. Additionally, cloud security services will ensure the safety of your workloads and data, and prevent the costs associated with a data breach.
2nd Watch Managed Cloud Security
At 2nd Watch, we understand cloud security is important at every step of your cloud journey. 2nd Watch has a dedicated Managed Security Team that monitors your cloud environments 24/7/365, remediating vulnerabilities quickly. Rather than putting security on the backburner, we believe security is a pillar of business, and building it into the foundation of a company is important to meet evolving compliance needs in a cost-effective manner.
Companies just getting started in the cloud can rely on 2nd Watch to get security right for them the first time. Even for companies already established in the cloud, we can take an in-depth look at security and compliance maturity, existing capabilities, and growth trajectory to provide a prescriptive security roadmap. No matter where you are in your cloud journey, we ensure your security is well-integrated into your cloud environments.
At 2nd Watch we are with you from beginning to end, monitoring your security even after implementation. At a glance, our end-to-end services include:
Security Review: Ensures the proper safeguards are utilized for your multi-cloud environments with a single point of contact for your security needs. Our security assessment and remediation offering can reveal how your cloud security posture stacks up to industry standards such as CIS, GDPR, CCPA, HIPAA, NIST, PCI DSS, and SOC 2.
Environment Monitoring: 24/7/365 multi-cloud monitoring protects against the most recent vulnerabilities.
Threat Analysis: Managed Reliability Operations Center (ROC) proactively analyzes and remediates potential threats.
Issue Resolution: Identified issues are quickly resolved, providing enterprise-class, proactive defense.
Security should be integrated into every layer of your public cloud infrastructure. We can help you achieve that through our comprehensive suite of security services and a team of experts that cares about your success in the cloud. To learn more about our managed cloud security services, visit our Cloud, Compliance, Security, & Business Continuity page or talk to someone directly through our Contact Us page.
The report, although disturbing, is not shocking by any measure. Although businesses continue to migrate to the cloud, many have failed to make it a core part of their business strategy. The reasons for this vary – poorly trained staff, inability to utilize cloud resources effectively, or the absence of a strategy that leverages the power of cloud.
For these reasons and many others, businesses incur unexpected costs, unproductive workflows, and cybersecurity risks to their data on the cloud. These organizations need a set of protocols for utilizing cloud resources efficiently, effectively, and securely. In short, they need a cloud governance framework that enables them to extract the benefits of the cloud.
Organizations can fully realize these benefits only when their cloud policies are designed to leverage them. Therefore, a well-designed cloud governance framework is critical to the success of cloud adoption. What is cloud governance and how does it lay the foundation for the success of your cloud adoption?