Encryption and Key Management
The first rule of cloud is: encrypt everything! While many firms have data classification strategies in place, data tagging is less common, and encryption deployed consistently against the classification strategy is rarer still. To simplify implementation, encryption is embedded in the platforms of the three major cloud service providers, so you no longer have to bear the computational or time expense of encryption; it is provided, rapidly, by the platform. As always, when you encrypt something, you need a key. A key is like your house key: anyone you give it to has access to your house. So if you want to limit access to a set of data, you must limit the people to whom you give the encryption key. You will also want to limit your “blast radius,” defining your key scoping by the levels of risk in your data classification policies.
Here is a simple example. You have four business functions: finance, trading, marketing and customer service. All have confidential data to protect. While it is possible to use just a single key to protect the private data across all these functions, if that one key is compromised, all the data is compromised.
In this scenario, you could manage blast radius with a small set of keys: one key that covers general data across your AWS account, plus a dedicated key for each business unit’s private data. At a minimum, each group gets its own key, and private information that spans the groups is protected by one additional key that all four groups can use.
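To make the key-scoping idea concrete, here is a minimal Python sketch. The business-unit names come from the example above; the HMAC-derived toy cipher and the `shared` key name are purely illustrative assumptions (a real deployment would use a managed service such as AWS KMS, not a hand-rolled cipher).

```python
# Toy illustration of key scoping to limit blast radius.
# NOT real cryptography -- a managed KMS would handle this in practice.
import hashlib
import hmac
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against an HMAC-SHA256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# One key per business unit, plus one shared key for cross-unit data.
units = ["finance", "trading", "marketing", "customer_service"]
keys = {unit: secrets.token_bytes(32) for unit in units}
keys["shared"] = secrets.token_bytes(32)

ciphertexts = {
    unit: keystream_xor(keys[unit], f"{unit} confidential record".encode())
    for unit in units
}

# Compromising the finance key exposes only finance data; the other
# units' ciphertexts stay protected because they use different keys.
assert keystream_xor(keys["finance"], ciphertexts["finance"]) == b"finance confidential record"
assert keystream_xor(keys["finance"], ciphertexts["trading"]) != b"trading confidential record"
```

The point of the sketch is the key map, not the cipher: because each unit's data is sealed under its own key, the blast radius of any single key compromise is one business unit.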
This simple model is typically expanded to distinguish between service types (e.g., Amazon S3), and it should be extended to support your specific data classification strategy.
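One way to extend the model is a consistent naming convention for keys that encodes business unit, service type and classification level. The path-style `alias/unit/service/classification` format below is a hypothetical team convention, not an AWS requirement:

```python
# Hypothetical key-alias convention scoping keys by unit, service
# and classification; the format is an illustrative assumption.
def key_alias(unit: str, service: str, classification: str) -> str:
    """Build a KMS-style alias such as alias/finance/s3/confidential."""
    parts = (unit, service, classification)
    if not all(parts):
        raise ValueError("all alias components must be non-empty")
    return "alias/" + "/".join(p.lower() for p in parts)

print(key_alias("finance", "S3", "Confidential"))  # alias/finance/s3/confidential
```

A convention like this makes it mechanical to find the right key for a given workload and to audit whether every classification level actually has a key behind it.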
A Strategy for Data Leakage
Since data rarely stays in one place, you will also need to put strategies in place for monitoring and preventing data leakage.
Data loss prevention (DLP) is the ability to understand and prevent data from going someplace it should not be. In the cloud, leakage is easy to monitor but harder to stop, because cloud DLP tools are not as mature as on-premises ones. Even the more cloud-ready DLP tools remain focused on the perimeter rather than on the workloads themselves. Knowing where your data is located and wrapping a DLP tool around it offers a layer of protection, but it imposes an on-premises-oriented architecture.
Cloud providers also offer tools to help with DLP, such as Amazon Macie on AWS and Azure Advanced Threat Protection, both of which have data leakage monitoring and alerting capabilities. These tools are still maturing and, in the near term, need to be augmented with enforcement capabilities.
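Until the provider tools gain enforcement features, one interim approach is to triage their monitoring findings yourself and escalate the severe ones for action. The sketch below assumes a simplified finding structure; the `id`, `severity` and `resource` fields are illustrative, not the actual Macie API schema.

```python
# Sketch of a local triage step layered on monitoring-only DLP findings.
# The finding shape is a simplified assumption, loosely modeled on the
# idea of Amazon Macie findings, not its real schema.
from typing import Iterable

SEVERITY_ORDER = {"LOW": 1, "MEDIUM": 2, "HIGH": 3}

def findings_to_escalate(findings: Iterable[dict], threshold: str = "HIGH") -> list[dict]:
    """Return findings at or above the given severity, most severe first."""
    floor = SEVERITY_ORDER[threshold]
    hits = [f for f in findings if SEVERITY_ORDER.get(f.get("severity", "LOW"), 0) >= floor]
    return sorted(hits, key=lambda f: SEVERITY_ORDER[f["severity"]], reverse=True)

findings = [
    {"id": "f-1", "severity": "LOW", "resource": "s3://logs"},
    {"id": "f-2", "severity": "HIGH", "resource": "s3://finance-exports"},
    {"id": "f-3", "severity": "MEDIUM", "resource": "s3://marketing"},
]
assert [f["id"] for f in findings_to_escalate(findings)] == ["f-2"]
assert [f["id"] for f in findings_to_escalate(findings, "MEDIUM")] == ["f-2", "f-3"]
```

The escalated list could feed a ticketing queue or an automated remediation step, supplying the enforcement layer the monitoring tools do not yet provide.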
How It All Evolves
As your cloud estate grows, so too does its data gravity. This requires evolving and scaling your data management and data protection strategies in the cloud. To understand whether data gravity and location are creating friction, you will need to expand your logging and monitoring capabilities, and continue to actively monitor performance across the chasms of your hybrid IT environment. Increased cloud data gravity also requires automated compliance enforcement and remediation in order to deliver security and data protection effectively. Finally, your business continuity and disaster recovery capabilities must expand to protect this growing critical asset. With a well-planned data protection foundation, all of these capabilities can be scaled effectively.
Conclusion
Organizations are leveraging their data to its fullest extent in an effort to improve operations and gain market share. Protecting data is therefore a mission-critical priority. To fend off threats and manage data resources well into the future, organizations should take specific steps and develop comprehensive data security policies. Data gravity complicates the landscape. Leaders need to examine practices governing classification, tagging, encryption and key management with an understanding of where their data will be located. Using this information, they will need to reexamine the use of on-premises tools for data protection in a hybrid IT environment and confirm those tools can meet the needs of hybrid IT workloads.
FALL 2019 | THE DOPPLER | 31