Faced with updating an aging data center infrastructure and increased demand to support business needs, IT leaders at the Office of Justice Programs pondered a question two years ago that every agency is now wrestling with: Should they go with a public cloud, build their own private cloud or use a hybrid approach?
OJP, an agency of the Justice Department, deliberated and ultimately chose to go private because security was one of its top priorities. The agency collects crime data, analyzes trends and provides local, state and tribal law enforcement with strategies, training and nearly $3 billion in annual grants to support criminal justice.
The agency will consider taking advantage of a public cloud for non-mission-critical applications in the future, but IT leaders feel that they can better protect the agency's grant management system and other core applications and data by keeping everything in-house in a private cloud, says OJP Deputy CIO Angel Santa.
"Disbursing $3 billion in grants annually, or any amount, comes with a fiduciary responsibility, so security and integrity is paramount," Santa says. "If you look at private industry, they can go from boom to bust if their core data is compromised. And in government, if we don't take appropriate measures to secure access to confidential and sensitive data or the ability to 'draw down funds' appropriately, it can impact public confidence."
Agencies have made steady progress in the two years since the Obama administration began pushing cloud computing as a way to run government IT more efficiently, improve services and cut costs.
Some early adopters have moved full speed into the cloud, but others are cautiously dipping a toe in the waters first. For many federal IT leaders, security remains a top concern and a barrier to adoption.
To allay those fears, the government has released several publications on cloud security over the past year, and more are on the way. This fall, the National Institute of Standards and Technology (NIST) will release a draft cloud computing roadmap, which identifies high-priority interoperability, portability and security standards and recommended actions. The roadmap is intended to help agencies planning to deploy cloud services.
The General Services Administration has spearheaded an interagency effort, called the Federal Risk and Authorization Management Program (FedRAMP), to create a uniform set of security requirements for cloud providers. FedRAMP, which is expected to become operational in some capacity by the fall, will include baseline security controls, processes to continuously monitor cloud systems for security and proposed uniform approaches for the government to better leverage security assessments and authorizations for cloud services.
The goal is for the government to certify a cloud service once, so each agency doesn't have to go through the lengthy and costly process on its own. And if agencies' security requirements go beyond FedRAMP, they can leverage the work done through FedRAMP certification and focus their testing, assessment and certification efforts on the delta resulting from their unique requirements, says Sanjeev Bhagowalia, who recently left his post as deputy associate administrator for GSA's Office of Citizen Services and Innovative Technologies to become CIO of Hawaii.
While the government finalizes its security standards, early adopters are making do on their own, using existing security guidance from NIST special publications to comply with the Federal Information Security Management Act.
The challenge with cloud computing is that IT leaders are split: roughly half feel the cloud is safe, while the other half see it as a security risk, says Pete Lindstrom, research director for Spire Security, an industry analyst firm. One reason is that every agency differs in its mission, size, IT requirements and IT expertise. In general, moving to the cloud does increase security risk, so agencies should apply or require more security controls to safeguard their new environments, he says.
Federal IT administrators say new NIST guidance and FedRAMP will help. But it's important for each agency to perform its own security risk assessment to determine what deployment model to choose — public, private, community or hybrid cloud — and to determine what security requirements are necessary, says Kevin Smith, deputy CIO of the U.S. Patent and Trademark Office, whose agency is deploying a hybrid cloud.
"Security, at the end of the day, is a risk-based decision, and it depends on what levels of security are required by each agency for its data," Smith says.
OJP Goes Private
At OJP, the IT department focused on security from the beginning. That's one of the key drivers behind its choice to build a private cloud on DOJ's premises.
"By staying within the DOJ perimeter, we inherit the security controls of DOJ's data centers, and that helps tremendously in securing our data," Santa says. For example, data is encrypted on the Justice Uniform Network, the department's private backbone network.
Two years ago, OJP's IT leaders began planning an upgrade of the agency's antiquated data center infrastructure. At that time, Congress had just passed the American Recovery and Reinvestment Act, which doubled OJP's grant funding to $6 billion, doubling both the number of applications received and the number of crime-prevention grants OJP would award. The agency needed new data center equipment to improve business continuity and to handle the increased demand on network and server resources expected from grant applicants, Santa says.
The IT department spent 16 months planning and implementing the private cloud, which cost $9 million to build. The cloud straddles three data centers (in Rockville, Md., Dallas and Washington, D.C.) using state-of-the-art tools, including VMware virtualization software; Oracle middleware and database farms; EMC storage area networks and replication technology; Cisco Systems 10-Gigabit Ethernet networking; and F5 Networks load-balancing equipment.
When the cloud launched in May 2010, the first two applications to go live were OJP's grant management system and grant payment system, which support approximately 70,000 users nationwide, says Victor Pham, chief of OJP's Systems Engineering and Operations Branch. Since then, the IT staff has migrated three other mission-critical applications as well as its web servers, e-mail and BlackBerry server to the cloud.
OJP uses a multilayered security approach that's no different from how it traditionally has protected its data centers, Pham says.
"The fabric of the OJP cloud is virtualization," he says. "To secure such an infrastructure, the protection strategy and technology must be cloud-adaptable, such as [the use of] Cisco virtual routing and forwarding for [traffic] segmentation, virtual domain firewalls and attribute-based access control."
OJP deployed two types of firewalls: web application firewalls to protect web servers from malicious traffic and unified threat management firewalls that include antivirus software and intrusion prevention and detection tools, Pham says.
The strategy to protect data in a private cloud is much different from a public or a hybrid cloud, he says. In a private cloud, data resides in multiple, secure physical locations. The data inherits a lot of physical and logical security controls and regulatory compliance from these in-house locations and operations.
"The focus of safeguarding data then shifts to data classification, segregation and access points," Pham says. "Data has to be tiered and accessible within an attribute-based access control mechanism and framework."
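The attribute-based access control Pham describes can be sketched as a policy check that compares a user's attributes against what a resource requires. The sketch below is illustrative only; the attribute names, tiers and sample policy are hypothetical, not OJP's actual rules.

```python
# Minimal attribute-based access control (ABAC) sketch.
# Attribute names and the sample policy below are hypothetical.

def is_allowed(subject: dict, resource: dict, action: str) -> bool:
    """Grant access only if the action is permitted and every attribute
    the resource's policy requires is matched by the subject."""
    policy = resource.get("policy", {})
    # The requested action must be one the policy explicitly permits.
    if action not in policy.get("actions", []):
        return False
    # Every required attribute (e.g. role, data tier) must match exactly.
    for attr, required in policy.get("require", {}).items():
        if subject.get(attr) != required:
            return False
    return True

grant_record = {
    "name": "grant-record",
    "policy": {
        "actions": ["read"],
        "require": {"role": "grant_manager", "tier": "sensitive"},
    },
}
manager = {"role": "grant_manager", "tier": "sensitive"}
intern = {"role": "intern", "tier": "public"}

print(is_allowed(manager, grant_record, "read"))   # True
print(is_allowed(intern, grant_record, "read"))    # False: attributes don't match
print(is_allowed(manager, grant_record, "write"))  # False: action not permitted
```

In this model, tiering data means attaching stricter `require` attributes to more sensitive records, so access decisions follow the data rather than the network location.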
OJP architected its private cloud so it can evolve into a hybrid cloud, Santa says. For example, in the future, if OJP needs 400 desktop computers for training purposes, it may be more cost-effective to turn to a public cloud provider to deliver virtual clients to users. This use of the public cloud is highly secure because no sensitive data would be involved, Pham says.
"Do we rent a truckload of computers, or do we deliver desktops using virtual desktop infrastructure to those locations?" he says. "It's a lot cheaper if we can leverage a public cloud and be able to buy the service and deliver our image for 400 virtual desktops. Once we are done, we close it up."
USDA Takes the Hybrid Road
The Agriculture Department is adopting a hybrid approach, having built its own private cloud while leveraging commercial cloud offerings for its e-mail and unified communications.
The department, which built its private cloud in 2010, provides infrastructure, platform and software as a service to its agencies — including virtual clients, databases and soon-to-be-implemented geospatial solutions. By consolidating data centers in its private cloud, USDA has reduced data center operating costs by 60 percent for those applications that have been moved, Agriculture CIO Chris Smith says.
To secure the cloud, the IT staff performs vulnerability scanning and penetration testing of applications. On the network perimeter, the IT department has deployed seven technologies, including data loss prevention and intrusion prevention tools. "It's a multilayered defense," he says.
When software developers spin up a virtual machine, the standard operating system has been hardened, meaning unnecessary services and features are disabled, making it harder to break into the server, Smith says.
"You can harden that OS and lock it down as tightly as you need to, and it's monitored by security tools and managed by a privileged few," he says.
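The hardening Smith describes, disabling everything a server does not need, can be illustrated with a small audit that flags enabled services missing from an approved baseline. This is a sketch; the service names and allowlist are hypothetical examples, not USDA's actual baseline.

```python
# Sketch of a hardening audit: flag any enabled service that is not
# on the approved baseline. Service names are hypothetical examples.

APPROVED_SERVICES = {"sshd", "syslog", "ntpd"}

def audit_services(enabled_services):
    """Return the services that should be disabled before deployment."""
    return sorted(set(enabled_services) - APPROVED_SERVICES)

enabled = ["sshd", "telnet", "ftp", "ntpd", "cups"]
print(audit_services(enabled))  # ['cups', 'ftp', 'telnet']
```

Running a check like this each time a virtual machine is spun up is one way to keep the standard image locked down as tightly as policy requires.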
Furthermore, USDA provides every employee with security awareness training, such as appropriate handling of data and passwords, he adds.
Agriculture had developed its own internal e-mail system for the cloud. But by mid-2010, Smith felt public cloud e-mail offerings had matured enough that turning to a public cloud provider would be more cost-effective and provide USDA with higher levels of service.
This March, the department turned to Microsoft's Business Productivity Online Services (BPOS) to provide the agency's 120,000 users with e-mail services, collaboration and unified communications tools over the cloud. To safeguard data, it set stringent security requirements in the contract, so that security standards are the same as for services operating within USDA's own facility, Smith says.
When asked what is holding their organization back from further implementing cloud computing, 32 percent of cloud users and 45 percent of nonusers cite security concerns.
BPOS employs Microsoft's Forefront Online Protection for Exchange for the USDA e-mail service. FOPE leverages multiple layers of inspection to ensure users receive only valid, clean e-mail. The application provides a Domain Name System block list, safe-sender list and directory synchronization to filter out spam. It also has three antivirus engines to nab malware before it reaches the e-mail system. USDA also monitors the continuous and detailed real-time security reports provided by BPOS, Smith says.
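The layered filtering described above, a safe-sender list, a block list and content scanning, amounts to an ordered pipeline: each message falls through the layers until one makes a decision. The sketch below illustrates the idea only; the domain lists are hypothetical and this is not FOPE's actual logic.

```python
# Sketch of layered inbound-mail filtering: an explicit safe-sender
# list overrides the block list; anything else falls through to
# content scanning. Domain names below are hypothetical examples.

SAFE_SENDERS = {"usda.gov", "partner-agency.gov"}
BLOCKED_DOMAINS = {"spam-farm.example", "malware.example"}

def classify(sender: str) -> str:
    """Decide what to do with a message based on the sender's domain."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain in SAFE_SENDERS:
        return "deliver"   # safe-sender list wins
    if domain in BLOCKED_DOMAINS:
        return "reject"    # DNS-style block list
    return "scan"          # hand off to antivirus/content engines

print(classify("analyst@usda.gov"))         # deliver
print(classify("offer@spam-farm.example"))  # reject
print(classify("someone@unknown.example"))  # scan
```

Ordering matters in a design like this: checking the safe-sender list first prevents a legitimate partner from being silently dropped by an overly broad block-list entry.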
USDA required its cloud provider to meet FISMA's moderate impact data security level, which includes implementing security controls found in NIST Special Publication 800-53 and the six-step risk management process in NIST Special Publication 800-37.
The department also made sure that its overall security posture aligned with the goals outlined in the FedRAMP draft, Smith says. It helps that there are many synergies between the NIST special publications and FedRAMP, he added.
Microsoft houses USDA's services in a separate, dedicated infrastructure in secure facilities in the United States that use biometric access controls. "They have to secure the facility and secure the network so it meets our standards," Smith says.
Among federal cloud users who say they have successfully reduced the cost of their applications by moving them to the cloud, the average savings is 22 percent.
NOAA Adopts a Spirit of Sharing
As a nod to FedRAMP, the National Oceanic and Atmospheric Administration is one agency that is already taking advantage of certification and accreditation work completed by another agency.
Needing to modernize its e-mail system, NOAA in June picked a public cloud service provider because it will deliver better service and save the agency money, says Larry Reed, director of the NOAA IT Security Office. The service provider had already passed FISMA requirements when GSA hired the vendor for its e-mail services, so NOAA didn't have to duplicate the work.
"We are using the GSA's work. There's no reason for us to repeat what they have done," Reed says. "But where we have unique requirements and risk management issues, we will focus our efforts on those areas."
NOAA, which provides everything from daily weather forecasts and severe storm warnings to fisheries management and coastal restoration, will migrate its 25,000 users to the vendor's cloud-based e-mail, calendaring and collaboration tools by year's end.
The agency is addressing security through the normal risk management framework process as detailed in NIST Special Publication 800-37, Reed says. That includes annual testing of security controls to verify they are working as intended and using those results to make risk management decisions.
NOAA has also embarked on several other cloud pilots and projects, including a cloud-based emergency notification service and using a public cloud for web hosting. For example, the agency is making geographic information system data available to researchers and the general public over the public cloud.
One benefit of public web hosting is the ability to increase server resources as needed, Reed says. Securitywise, putting the information on the public cloud is low-risk because the data is public information, he says.
For extra security, however, the agency does keep a master copy in its own data center. That way, if there are any problems with the cloud service, an original copy of the data is secure within NOAA.
"It's a way to minimize risk because we have secured the data, and we're not worried about losing information or having it corrupted in the cloud," he says. "However, we have to be careful with how we present NOAA data to the public; we want to make sure it is accurate and true."
Overall, cloud security will always be a concern. But through FISMA, NIST guidance and FedRAMP, the government is providing pointers on how to secure cloud infrastructure. As long as agencies deploy sound security processes, procedures and best practices, they can secure their systems in the cloud, Reed says.
USDA's Smith agrees. "My strong belief is that you can secure the cloud as long as you set standards for commercial purveyors in the public cloud and monitor them," he says. "We assess, document and monitor them on an ongoing basis, so I'm confident that it is as secure as it needs to be."