
Data Security in Cloud

In a cloud environment, multiple organizations share the same resources, which creates the potential for data misuse. It is therefore necessary to protect data at rest in repositories, in transit, and during processing. This cannot be achieved with a single data protection technology or policy: multiple techniques, such as authentication, encryption, data masking, and data integrity checks, should be combined to create a security model for the cloud.

This article discusses the data security features available in Amazon AWS and Microsoft Azure, compared against the CSA Cloud Controls Matrix (CCM) control framework.

Classification – (Sub Control)

Data and objects containing data shall be assigned a classification by the data owner based on data type, value, sensitivity, and criticality to the organization.

Data objects and resources can be classified through a tagging process, but not the data itself (i.e., the files inside the storage). Data classification helps the organization understand the value of its data and the associated risk, and to implement controls to mitigate that risk.

AWS:

AWS supports tagging for AWS resources such as S3 (Simple Storage Service) buckets and EC2 (Elastic Compute Cloud) instances. For example, if a particular S3 bucket holds confidential data, it can be tagged with Data Classification = “Critical”. Based on such tags, access to the resources can be restricted or encryption can be applied to secure the data.
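As a minimal sketch of this tagging step, the snippet below applies a classification tag to an S3 bucket with boto3. The bucket name, tag key, and tag value are illustrative assumptions, not taken from the article.

```python
# Sketch: tagging an S3 bucket with a data-classification tag via boto3.
# The bucket name and tag key/value below are illustrative assumptions.

def build_tag_set(classification: str) -> dict:
    """Build the Tagging payload expected by S3's put_bucket_tagging."""
    return {"TagSet": [{"Key": "Data Classification", "Value": classification}]}

def tag_bucket(bucket_name: str, classification: str) -> None:
    """Apply the classification tag to a bucket (requires AWS credentials)."""
    import boto3  # imported lazily; the call below needs valid credentials
    s3 = boto3.client("s3")
    s3.put_bucket_tagging(Bucket=bucket_name,
                          Tagging=build_tag_set(classification))

# tag_bucket("my-confidential-bucket", "Critical")  # hypothetical bucket name
```

Once the tag is in place, IAM policies can condition access on it, or automation can check the tag and enforce encryption.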

Amazon recently launched the Amazon Macie service for data security. Amazon Macie can automatically discover and classify data stored in Amazon S3. Like traditional data classification tools, it uses keywords, regular expressions, and machine learning. However, it does not provide the freedom to use custom regexes or keywords: the predefined patterns can only be enabled or disabled, and custom patterns cannot be created to match organizational requirements. Macie assigns each matching object a severity of low, medium, or high; the severity scale and criteria are predefined and cannot be customized. Data classification can also be implemented with third-party solutions hosted on AWS.

Azure:

Azure resources can be organized using tags, applied based on the resource environment (Production/Non-Production) or sensitivity (Confidential/Public). Resource policies can be created to ensure that resources are tagged with appropriate values. The Azure Information Protection service helps organizations classify and label data stored or accessed in the Azure cloud. Predefined patterns can be used for automatic classification, and Azure Information Protection also supports custom strings or regular expressions for data classification. Policies can be set to apply classification automatically, or to prompt users to apply the recommended classification.
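To illustrate the resource-policy enforcement mentioned above, a minimal Azure Policy rule can deny creation of resources that lack a classification tag. This is a sketch; the tag name `Data Classification` is an assumption, not specified by the article.

```json
{
  "if": {
    "field": "tags['Data Classification']",
    "exists": "false"
  },
  "then": {
    "effect": "deny"
  }
}
```

Assigned at a subscription or resource-group scope, a rule like this blocks untagged deployments instead of relying on manual audits.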

Handling / Labeling / Security Policy – (Sub Control)

Policies and procedures shall be established for the labeling, handling, and security of data and objects which contain data. Mechanisms for label inheritance shall be implemented for objects that act as aggregate containers for data.

AWS:

AWS resources can be labeled using tags, applied based on environment (Production/Non-Production), security, or business function. As discussed in the previous section, AWS supports tagging for resources such as S3 buckets and EC2 instances.

When a resource is tagged, the tag is not automatically applied to its dependent or attached resources; those must be identified and tagged manually. This process can be automated with third-party tools (e.g., Graffiti Monkey).
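The label-inheritance automation described above can be sketched with boto3: copy selected tags from an EC2 instance to its attached EBS volumes, which is the kind of propagation Graffiti Monkey performs. The tag keys chosen for inheritance are an assumption.

```python
# Sketch: propagating tags from an EC2 instance to its attached EBS volumes,
# the label inheritance that tools like Graffiti Monkey automate.
# The inherited tag keys are illustrative assumptions.

def tags_to_inherit(instance_tags: list,
                    keys: tuple = ("Data Classification", "Environment")) -> list:
    """Select the instance tags that attached volumes should inherit."""
    return [t for t in instance_tags if t["Key"] in keys]

def propagate_tags(instance_id: str) -> None:
    """Copy selected instance tags to attached volumes (needs AWS credentials)."""
    import boto3  # imported lazily; the calls below need valid credentials
    ec2 = boto3.client("ec2")
    reservations = ec2.describe_instances(InstanceIds=[instance_id])["Reservations"]
    instance = reservations[0]["Instances"][0]
    inherited = tags_to_inherit(instance.get("Tags", []))
    volume_ids = [m["Ebs"]["VolumeId"]
                  for m in instance.get("BlockDeviceMappings", [])
                  if "Ebs" in m]
    if inherited and volume_ids:
        ec2.create_tags(Resources=volume_ids, Tags=inherited)
```

Run periodically (e.g., from a scheduled Lambda), this keeps dependent resources aligned with the classification of their parent.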

Azure:

As mentioned in the previous sections, Azure resources can be labeled using tags. Tags are not inherited automatically, so they must be applied manually to dependent resources. Resource policies can be used to verify that tags are applied properly.

Non-Production Data – (Sub Control)

Production data shall not be replicated or used in non-production environments. Any use of customer data in non-production environments requires explicit, documented approval from all customers whose data is affected, and must comply with all legal and regulatory requirements for scrubbing of sensitive data elements.

AWS:

AWS does not provide data masking as a built-in database service. Third-party tools can be used to achieve data masking in an AWS cloud environment; DataGuise, HexaTier, Mentis, and Camouflage are a few of the tools available on the market for data masking in the cloud.

Azure:

Azure SQL Database is a relational database-as-a-service offered by Azure, and it supports Dynamic Data Masking (DDM). DDM hides sensitive data in the result set while the data stored in the database is unchanged. DDM can be configured on Azure SQL DB using PowerShell cmdlets or the REST API, and particular users can be excluded from masking so that they see the original data.

The DDM feature is available only for Azure SQL Database as a service, not for databases configured on virtual machines; third-party data masking solutions must be used to mask data in those databases.

Secure Disposal – (Sub Control)

Policies and procedures shall be established with supporting business processes and technical measures implemented for the secure disposal and complete removal of data from all storage media, ensuring data is not recoverable by any computer forensic means.

In many organizations, production data is replicated and used for testing, leaving sensitive data unprotected in the test environment. Protecting sensitive data there is often necessary to meet data security compliance. Masking sensitive test data with dummy values serves the testing purpose while keeping the data protected.

AWS:

When a user deletes an object from Amazon S3, the mapping from the public name is removed first, which blocks remote access to the deleted data object; the underlying storage is then reused by the system for other purposes. Amazon EFS will never serve deleted data. If an organization must follow the procedures in DoD 5220.22-M (“National Industrial Security Program Operating Manual”) or NIST 800-88 (“Guidelines for Media Sanitization”), AWS suggests conducting a specialized wipe procedure before deleting the file system.
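A wipe procedure of the kind mentioned above can be sketched as overwriting a file's contents before unlinking it. A single random-data pass is a simplification for illustration; it does not by itself establish NIST 800-88 compliance.

```python
# Sketch: a simplified single-pass overwrite before deletion, in the spirit
# of the specialized wipe AWS suggests before deleting a file system.
# One random pass is an illustrative simplification, not full 800-88 sanitization.
import os

def wipe_and_delete(path: str, chunk: int = 1 << 16) -> None:
    """Overwrite a file with random bytes, flush to disk, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        remaining = size
        while remaining > 0:
            n = min(chunk, remaining)
            f.write(os.urandom(n))  # replace plaintext with random bytes
            remaining -= n
        f.flush()
        os.fsync(f.fileno())  # force the overwrite to stable storage
    os.remove(path)
```

On shared or log-structured storage the old blocks may still exist elsewhere, which is why cloud providers pair logical deletion with physical media sanitization.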

Azure:

Microsoft uses procedures and a media-wiping solution that are NIST 800-88 compliant. Hard drives that cannot be wiped are destroyed (e.g., shredded), rendering recovery of the information impossible, and records of the destruction are retained.

Author

Dev@Kivyo2020