The amount of data stored in the cloud in the United States may grow by 278 percent by 2025, from 900,000 petabytes to 3.6 million petabytes, with about 55 percent of it stored in a public cloud, according to the IDC report "Data Age 2025: The Digitization of the World." Just as striking, more than 11.7 petabytes of data (enough music to play continuously for roughly 24,000 years) is currently exposed on cloud servers because of misconfiguration.
These suggestions can help agencies realize the benefits of the cloud while minimizing the risk of data theft.
1. Control Federal Users' Access to Data
Giving workers access to data they don't need opens the door to breaches. Access should be granted according to the principle of least privilege: users receive only the minimum set of permissions required to do their jobs. With public cloud services, this means setting policies for network controls, inbound access, identity and access management, and other access controls.
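As a minimal sketch of a least-privilege review, an agency could compare each user's granted permissions against the set their job role actually requires and flag the excess. The role names and permission strings below are hypothetical, not tied to any particular cloud provider's IAM model.

```python
# Illustrative least-privilege check: flag permissions a user holds
# beyond what their job role requires. Role and permission names are
# made-up examples, not a real provider's IAM vocabulary.

ROLE_REQUIRED_PERMISSIONS = {
    "analyst": {"storage:read", "logs:read"},
    "operator": {"storage:read", "storage:write", "compute:restart"},
}

def excess_permissions(role: str, granted: set[str]) -> set[str]:
    """Return permissions granted beyond the role's minimum set."""
    required = ROLE_REQUIRED_PERMISSIONS.get(role, set())
    return granted - required

# An analyst with write access to storage is over-privileged:
print(excess_permissions("analyst", {"storage:read", "storage:write", "logs:read"}))
# → {'storage:write'}
```

A periodic job running a check like this across all accounts gives administrators a concrete list of permissions to revoke.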
2. Enable Policy as Code
No matter how well an agency's security policies are written, they won't make a difference if they aren't implemented correctly or if administrators aren't aware of all of them. Security tools that support policy as code express those policies in machine-readable form, so compliance can be checked automatically before any infrastructure is deployed.
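The idea can be sketched in a few lines: policies become executable checks that run against a proposed resource configuration before deployment. The policy names and configuration fields below are illustrative assumptions, not a real policy engine's schema.

```python
# Minimal policy-as-code sketch: security policies expressed as
# functions, evaluated against a proposed resource configuration
# before it is deployed. Field names are hypothetical.

def no_public_buckets(resource: dict) -> bool:
    return not (resource.get("type") == "storage_bucket"
                and resource.get("public_access", False))

def encryption_required(resource: dict) -> bool:
    return resource.get("encrypted", False)

POLICIES = [no_public_buckets, encryption_required]

def violations(resource: dict) -> list[str]:
    """Return the names of policies the resource would violate."""
    return [p.__name__ for p in POLICIES if not p(resource)]

proposed = {"type": "storage_bucket", "public_access": True, "encrypted": False}
print(violations(proposed))  # → ['no_public_buckets', 'encryption_required']
```

Because the policies are code, they can run in a deployment pipeline and block a noncompliant change before it ever reaches production.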
3. Log Everything to Track Misconfigurations
In large organizations, tracking changes made by users in a cloud environment can be difficult, if not impossible. Logging services enable administrators to track activity and identify any misconfigurations. Without logging, the agency will be blind to malicious activity until a breach occurs.
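A simple sketch of what such tracking enables: scanning structured audit-log entries for configuration changes that open a resource to the public. The event names and JSON fields below are assumptions for illustration, not any real provider's log schema.

```python
# Illustrative scan of structured audit-log entries for configuration
# changes that expose a resource publicly. Event names and fields are
# hypothetical, not a real provider's log format.

import json

def risky_config_changes(log_lines: list[str]) -> list[dict]:
    """Return change events that set public access on a resource."""
    flagged = []
    for line in log_lines:
        event = json.loads(line)
        if (event.get("action") == "update_config"
                and event.get("new_value", {}).get("public_access")):
            flagged.append(event)
    return flagged

logs = [
    '{"user": "alice", "action": "read", "resource": "bucket-1"}',
    '{"user": "bob", "action": "update_config", "resource": "bucket-2", '
    '"new_value": {"public_access": true}}',
]
print([e["resource"] for e in risky_config_changes(logs)])  # → ['bucket-2']
```

With logging enabled, a misconfiguration like the one above is attributable to a specific user and timestamp instead of being discovered only after an incident.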
4. Deploy Network Visibility Tools to See Traffic Flow
There’s an axiom that “you can’t secure what you can’t see,” and most security professionals can’t see what’s happening in the cloud. Network visibility tools enable administrators to see every application flow. Once a baseline for traffic and “normal” behavior is established, any deviation could indicate a breach. For example, if data that normally moves from the cloud to a certain location starts flowing to an unknown location, access should be terminated.
5. Audit Continuously to Uncover Errors and Security Gaps
Establishing policies and configuring cloud resources once is not sufficient to protect data. Most federal agencies are highly dynamic, and things can change quickly. Organizations should conduct regular audits to uncover configuration errors and maintain compliance with internal policies. The major cloud providers offer tools to conduct these audits and to automate them if desired. For sensitive data and other critical resources, some vendors can also automate remediation of any misconfiguration.
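An audit pass of this kind can be sketched as a comparison of each resource's live settings against its expected configuration, with optional auto-remediation of any drift. The resource records and setting names below are hypothetical.

```python
# Illustrative audit pass: compare each resource's live settings to an
# expected configuration and, optionally, remediate drift in place.
# Resource records and setting names are made-up examples.

EXPECTED = {"encrypted": True, "public_access": False, "logging": True}

def audit(resources: list[dict], remediate: bool = False) -> dict[str, list[str]]:
    """Map resource name -> list of settings that drifted from EXPECTED."""
    findings = {}
    for res in resources:
        drift = [k for k, v in EXPECTED.items() if res.get(k) != v]
        if drift:
            findings[res["name"]] = drift
            if remediate:
                res.update({k: EXPECTED[k] for k in drift})
    return findings

fleet = [{"name": "db-1", "encrypted": True, "public_access": True, "logging": True}]
print(audit(fleet, remediate=True))  # → {'db-1': ['public_access']}
print(fleet[0]["public_access"])     # → False
```

Run on a schedule, a loop like this turns auditing from a periodic manual exercise into continuous enforcement.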