During the 1992 presidential election, strategist James Carville famously hung a sign in then-candidate Bill Clinton's Little Rock headquarters, meant to keep the campaign on message. It read:
The economy, stupid.
That’s how our blog works. It’s interactive. Let’s learn together.
The ever-increasing torrent of reports of "misconfigured" S3 buckets contributing to egregious breaches of customer data has become an epidemic, unfairly placing a black mark on the name of Amazon Web Services' (AWS) otherwise outstanding object store. Worse, these breaches are completely avoidable through simple, automated compliance enforcement. Tying my last two posts together, let's take a quick look at how applying a DevOps mentality could save these companies the public embarrassment, and the expense of remediation, that result from human carelessness.
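The "automated compliance enforcement" mentioned above can start as small as a script that sweeps each bucket's ACL and flags grants that expose data to the world. Here is a minimal sketch: the grant dictionaries mirror the shape returned by boto3's `get_bucket_acl()`, but the helper itself (`public_grants`, a name of my choosing, not an AWS API) is a pure function, so it runs without any AWS credentials.

```python
# The two predefined S3 groups that make an ACL grant "public":
# AllUsers (anyone on the internet) and AuthenticatedUsers (any AWS account).
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(grants):
    """Return the subset of ACL grants that expose a bucket publicly."""
    return [
        g for g in grants
        if g.get("Grantee", {}).get("Type") == "Group"
        and g.get("Grantee", {}).get("URI") in PUBLIC_GROUPS
    ]

# Example input: one private owner grant, one world-readable grant.
grants = [
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
     "Permission": "FULL_CONTROL"},
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
]
print(len(public_grants(grants)))  # → 1 (the AllUsers READ grant)
```

In a real pipeline you would feed this check the live ACLs from `boto3.client("s3").get_bucket_acl(Bucket=name)` and fail the compliance run (or auto-remediate) whenever it returns a non-empty list.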
From a business perspective, cloud migrations are driven largely by a desire for flexibility and resilience. When we move systems to the cloud, we expect them to be both more adaptable and more reliable than on-premises solutions. These two objectives are somewhat at odds, however. A Jenga tower is most likely to fall while you are moving a piece: adding flexibility naturally introduces change, and change puts stability at risk.
With a number of recent high-profile leaks of personal data from Amazon Web Services' (AWS) S3 service, it seems like a good time to review the security mechanisms that govern the storage and sharing of data in AWS. If your organization uses S3 to store PHI, PII, or other sensitive data, you should be aware that failure to properly secure, restrict access to, and log this information can carry hefty fines and even jail time. Leaking users' personal data can also be detrimental to your business.
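One of the most basic of those security mechanisms is the bucket policy itself. As a hedged illustration (the bucket name `example-phi-bucket` is a placeholder, not a real resource), the policy below denies every request that arrives over plain HTTP, so sensitive objects can only be read or written over TLS:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyInsecureTransport",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::example-phi-bucket",
        "arn:aws:s3:::example-phi-bucket/*"
      ],
      "Condition": {
        "Bool": { "aws:SecureTransport": "false" }
      }
    }
  ]
}
```

Policies like this complement, rather than replace, tight ACLs and access logging; we will look at each of those mechanisms in turn.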