Recent events – most notably a string of Amazon Web Services (AWS) S3 data leaks – have left organizations exposed, risking not only their corporate data but also the data of their customers. It’s a huge problem that has affected Dow Jones, Verizon and millions upon millions of users whose sensitive data has been exposed. Yes, much of it was due to misconfiguration, but there are proactive things you can do to ensure this doesn’t happen again.
Cloud data breaches are on the rise because organizations do not follow best practices or make the recommended configuration changes to their S3 instances in AWS. As a start, when companies transition to the cloud they should diligently study AWS’s Shared Responsibility Model. This shared model helps relieve customers’ operational burden: AWS protects the infrastructure that runs all of the services offered in the AWS Cloud, while the customer assumes responsibility for managing the guest operating system and other associated application software. Users and organizations must be educated on who is responsible for what, or they will leave themselves exposed to a potential breach.
If you take these three simple steps you can help protect your data and prevent breaches in S3:
1. Turn on logging
Server access logging comes disabled by default in AWS, so turn it on. Once enabled, S3 delivers detailed access logs for a bucket to a target bucket that you designate. A related tool, Amazon S3 inventory, creates lists of the objects in an S3 bucket and writes output files of that data; the bucket whose objects the inventory lists is called the source bucket.

Storing logs will accrue some cost, but you can delete them as you see fit as you come and go within your S3 instance.
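To make the first step concrete, here is a minimal sketch of the payload that S3’s bucket-logging API expects; the bucket names are placeholders, not real resources:

```python
import json

def logging_config(target_bucket: str, prefix: str) -> dict:
    """Build the BucketLoggingStatus payload that S3's
    put-bucket-logging API expects (names are placeholders)."""
    return {
        "LoggingEnabled": {
            "TargetBucket": target_bucket,  # bucket that receives the logs
            "TargetPrefix": prefix,         # key prefix for delivered log files
        }
    }

config = logging_config("my-log-target-bucket", "access-logs/")
print(json.dumps(config, indent=2))
```

With boto3, a dict like this can be passed as `BucketLoggingStatus` to `s3_client.put_bucket_logging(...)` to enable logging on a bucket.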
2. Use native encryption before you even send your data up to S3
There are a couple options for encrypting data in Amazon S3.
First, there is server-side encryption – encrypting data at rest – using techniques like Amazon S3-Managed Keys (SSE-S3), AWS KMS-Managed Keys (SSE-KMS) or Customer-Provided Keys (SSE-C). You can request that S3 encrypt your objects before saving them on disks in its data centers and decrypt them when you download them.
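A common way to enforce server-side encryption is a bucket policy that denies any upload missing the `x-amz-server-side-encryption` header. The sketch below builds such a policy in Python; the bucket name is a placeholder:

```python
import json

def require_sse_policy(bucket: str) -> dict:
    """Bucket policy that denies any PutObject request lacking the
    x-amz-server-side-encryption header (bucket name is a placeholder)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyUnencryptedUploads",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            # The Null condition matches requests where the header is absent.
            "Condition": {
                "Null": {"s3:x-amz-server-side-encryption": "true"}
            },
        }],
    }

print(json.dumps(require_sse_policy("my-data-bucket"), indent=2))
```

The resulting JSON can be attached to the bucket (for example via `put_bucket_policy` in boto3), after which unencrypted uploads are rejected.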
You can also opt for client-side encryption, which allows an organization to manage the entire process by encrypting data before it goes across the network to AWS. You can leverage a few tools and options to do this, including an AWS KMS–managed customer master key (CMK) or a client-side master key.
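Client-side encryption typically follows an envelope pattern: encrypt each object with a one-time data key, then wrap that data key with a master key that never leaves your control. The sketch below illustrates only the flow – the XOR "cipher" is a deliberately insecure stand-in for AES, and the key handling is simplified:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' standing in for AES -- NOT secure, used only
    to illustrate the envelope-encryption flow."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def client_side_encrypt(plaintext: bytes, master_key: bytes):
    # 1. Generate a fresh one-time data key for this object.
    data_key = os.urandom(16)
    # 2. Encrypt the object with the data key.
    ciphertext = xor_cipher(plaintext, data_key)
    # 3. Wrap the data key with the master key; with a KMS-managed CMK,
    #    AWS KMS would perform this wrapping step for you.
    wrapped_key = xor_cipher(data_key, master_key)
    # Upload ciphertext + wrapped key; the master key stays on the client.
    return ciphertext, wrapped_key

def client_side_decrypt(ciphertext: bytes, wrapped_key: bytes,
                        master_key: bytes) -> bytes:
    data_key = xor_cipher(wrapped_key, master_key)  # unwrap the data key
    return xor_cipher(ciphertext, data_key)         # recover the plaintext

master = os.urandom(16)
ct, wk = client_side_encrypt(b"customer record", master)
assert client_side_decrypt(ct, wk, master) == b"customer record"
```

In practice the AWS SDK encryption clients implement this pattern for you with real ciphers; the point is that AWS only ever sees ciphertext and the wrapped key.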
Without encryption, your data can easily be viewed, transferred and compromised by anyone who may have access.
3. Set permissions correctly
The best way to set permissions in S3 is to leverage AWS Identity and Access Management (IAM) to create policies that define which actions by users and groups are allowed or denied. The easiest route is managed policies, which are simple to use and are automatically updated with the required actions as the service evolves.
Once you’re in S3, make sure you are setting your permissions correctly. Logging and monitoring data on its way into S3 will help you verify your permissions, but setting them correctly in the first place is crucial.
If you do not set up proper IAM policies, you will leave your data exposed to non-privileged users and open yourself up to a possible breach.
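As a sketch of the least-privilege idea above, here is a minimal identity policy granting read-only access to a single bucket; the bucket name is a placeholder for illustration:

```python
import json

def read_only_policy(bucket: str) -> dict:
    """Least-privilege identity policy: read-only access to one bucket
    (bucket name is a placeholder)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",      # ListBucket applies to the bucket
                f"arn:aws:s3:::{bucket}/*",    # GetObject applies to the objects
            ],
        }],
    }

print(json.dumps(read_only_policy("my-data-bucket"), indent=2))
```

Attaching a scoped policy like this to a group, rather than granting broad `s3:*` access, keeps non-privileged users away from data they have no business reading or writing.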
AWS has a lot of great documentation and guidelines for getting started, so you can not only understand the Shared Responsibility Model but also learn the different options and techniques for configuring and securing your workloads. Once you have a clear understanding of what you need to do to fulfill your role as part of the model, determine the best strategy for your organization, and monitor and log everything for visibility at scale.
About the Author
As Sumo Logic’s Vice President of Security and Compliance, George Gerchow brings 18 years of information technology and systems management expertise to the application of IT processes and disciplines. His expertise informs the security, compliance and operational status of complex, heterogeneous, virtual and cloud computing environments. George’s practical experience and insight from managing the infrastructures of some of the world’s largest corporate and government institutions make him a highly-regarded speaker and invited panelist on topics including secure cloud architecture design, virtualization, configuration management, operational security and compliance. George is one of the original founders of the VMware Center for Policy and Compliance and the coauthor of “Center for Internet Security Quick Start Cloud Infrastructure Benchmark v1.0.0.” He is also a faculty member for IANS, the Institute of Applied Network Security.