Data breaches are very expensive affairs. To be more precise, Rs 12.8 crore was the average cost of a data breach for Indian organisations between July 2018 and April 2019 (Cost of a Data Breach Report, conducted by the Ponemon Institute and IBM). Close to 70 per cent of Indian organisations are at risk of a data breach, according to another report from Frost & Sullivan.
When we consider these facts alongside the growing adoption of the cloud model, the scenario becomes even more complicated. Security is the pillar of a cloud architecture and is critical to the success of any cloud workload. As the cloud market leader, AWS has always been at the forefront of ensuring that its customers meet core security and compliance requirements. At the same time, security in the cloud is one of the most misunderstood concepts.
In the initial years, security stood in the way of organisations embracing the cloud. But that hesitation lasted only until IT pros themselves figured out that security can actually be better in the cloud. Nevertheless, some recent high-profile cloud breach incidents have brought cloud security back into focus, and raised a very important question—who is responsible for cloud security?
AWS answers this with its shared responsibility model. The ‘security OF the cloud’ part is fairly straightforward, and AWS provides some of the best-in-class security features in the industry. But ‘security IN the cloud’ is ultimately defined by the customer. How do you get that right?
AWS recommends approaching security from a data perspective, rather than compartmentalising it into on-premises and off-premises security. To achieve this, organisations need to focus on three broad categories, the cloud provider advises. These include:
Data classification and security-zone modelling: How you classify data not only shapes your security posture but can also be critical in bringing much-touted cloud benefits like agility and flexibility into the environment. It’s important to add the right level of fidelity to your data classification model, and the data security control models need to be designed to match the varying degrees of sensitivity of the data. The data classification model can then be combined with a ‘security zone’, which provides a well-specified network perimeter that protects all the critical assets within.
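The idea above can be sketched in a few lines of code: a classification label on each data asset determines the security-zone controls it must carry. The labels, zone names and control flags below are illustrative assumptions, not AWS-prescribed values.

```python
# Minimal sketch: mapping a data classification label to security-zone
# controls. Labels, zones and controls are hypothetical examples.
from dataclasses import dataclass

# Controls required per sensitivity level, most restrictive last.
ZONE_CONTROLS = {
    "public":       {"zone": "dmz",        "encrypt_at_rest": False, "private_subnet": False},
    "internal":     {"zone": "app",        "encrypt_at_rest": True,  "private_subnet": True},
    "confidential": {"zone": "restricted", "encrypt_at_rest": True,  "private_subnet": True},
}

@dataclass
class DataAsset:
    name: str
    classification: str  # one of the ZONE_CONTROLS keys

def required_controls(asset: DataAsset) -> dict:
    """Look up the security-zone controls the asset's classification demands."""
    # Unclassified or unknown labels default to the most restrictive zone.
    return ZONE_CONTROLS.get(asset.classification, ZONE_CONTROLS["confidential"])

payroll = DataAsset("payroll-db", "confidential")
print(required_controls(payroll)["zone"])  # → restricted
```

Defaulting unknown labels to the most restrictive zone is one way to keep the agility benefit without letting unclassified data slip into a weak perimeter.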
Defense in depth: This model focuses on a layered security control environment, so that one control still works if another fails. In today’s dynamic technology and business landscapes, it’s critical to have both preventive and detective security measures, AWS emphasises in its Cloud Adoption Framework. Preventive controls cover aspects like IAM, infrastructure security and encryption/tokenisation.
On the detective control side, organizations should prioritize areas such as detection of unauthorized traffic, configuration drift and fine-grained audits.
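One of the detective controls named above, configuration drift, can be illustrated as a simple diff between a declared baseline and the configuration actually observed on a resource. The resource shape and the keys below are assumptions for illustration, not a real AWS API.

```python
# Hedged sketch of a detective control: flag configuration drift by
# diffing a declared baseline against the observed configuration.

def detect_drift(baseline: dict, observed: dict) -> dict:
    """Return every setting whose observed value deviates from the baseline."""
    drift = {}
    for key, expected in baseline.items():
        actual = observed.get(key)
        if actual != expected:
            drift[key] = {"expected": expected, "actual": actual}
    return drift

# Hypothetical storage-bucket settings.
baseline = {"encryption": "aws:kms", "public_access": False, "logging": True}
observed = {"encryption": "aws:kms", "public_access": True,  "logging": True}

print(detect_drift(baseline, observed))
# → {'public_access': {'expected': False, 'actual': True}}
```

The detective layer only reports; pairing it with a preventive control (for example, a policy that blocks the drifted setting from being applied at all) is what makes the layering “defense in depth”.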
Swim-lane isolation: Swim-lane isolation looks at security through a business domain-driven design, and recommends approaching the security of the data stores attached to each microservice from the context of its business domain. This helps ensure that sensitive data from one microservice domain does not leak out through another microservice domain.
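The isolation rule can be sketched as an ownership check: each domain owns its data stores, and any cross-domain access is rejected. The domain and store names here are hypothetical.

```python
# Sketch of swim-lane isolation: each microservice domain owns its own
# data store, and cross-lane access is denied. Names are hypothetical.

DOMAIN_STORES = {
    "billing":  {"billing-db"},
    "shipping": {"shipping-db"},
}

def can_access(domain: str, store: str) -> bool:
    """A domain may only touch data stores in its own swim lane."""
    return store in DOMAIN_STORES.get(domain, set())

print(can_access("billing", "billing-db"))   # → True
print(can_access("billing", "shipping-db"))  # → False: stays in its lane
```

In practice the same check would be enforced with per-service credentials or network policy rather than application code, so that a compromised service in one lane cannot reach another lane’s store.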
Cloud service providers like AWS have been investing heavily in public cloud security. But ultimately, customers are equally responsible for the security of their data. In today’s cloud-driven world, look beyond the traditional tools and methods of securing data, and reassess your security postures and strategies.