The Many Faces of Cloud
It takes the defense industry years to design and build a new tank or jet fighter. But when it comes to cloud computing, the Defense Information Systems Agency has rolled out services faster than most of its civilian counterparts, at a pace that rivals the private sector.
DISA, which provides IT support for the Defense Department, built its own private cloud called Rapid Access Computing Environment in 2008. RACE lets Defense agencies rent servers and storage space so they can test and deploy new applications quickly and affordably without having to buy and install their own hardware.
DISA continues to embrace cloud computing. It has built Forge.mil, a cloud service for software developers to access development tools, share software components and collaborate. The agency offers real-time communication and collaboration tools over the cloud, including web conferencing, blogging and wiki software. Most recently, DISA announced a pilot to make file storage and Microsoft Office applications available to employees as hosted software.
“We’ve been trying to move our data and services to the cloud because it promotes sharing of information and it promotes mobility,” says DISA Chief Technology Officer David Mihelcic. “A user can move from one base to another without worrying about local files. They point to a URL and access the services they need.”
As the government works to develop a detailed cloud computing strategy that includes guidance on standards and security requirements, early adopters such as DISA and NASA are already reaping the cloud’s benefits. The two agencies have built private clouds, while other agencies have turned to external clouds for hosted applications and services.
Cloud computing is critical to the Obama administration’s IT modernization effort. The White House has directed agencies to consolidate data centers, increase hardware utilization and reduce energy consumption. The ultimate goal is a more efficient IT infrastructure, improved services and cost savings.
“The attraction to a cloud-based service is it’s fast and cheap, and if you need to scale, the cloud is a lot more elastic,” says John Sloan, lead analyst at Info-Tech Research Group.
DISA estimates that its Forge.mil software development cloud service saves between $200,000 and $500,000 per project, with $15 million in total cost avoidance.
DISA Speeds Software Development
Instead of having to buy and install their own hardware, a process that can take weeks or months, all Defense organizations can now take advantage of DISA’s private cloud and gain access to virtual machines and provisioned storage within 24 hours.
RACE, built with HP servers, VMware virtualization software and Hitachi Data Systems storage equipment, was initially created for testing and developing Windows and Linux applications. But in 2009, DISA made the service available for production systems as well.
Through a self-service portal, military IT staffers can provision as many VMs and as much storage as they need. They can increase or decrease computing resources as requirements change, and when their project is complete, they can cancel the service. It’s a pay-as-you-go model — DISA charges organizations only for what they use, which saves everyone money, says Alfred Rivera, DISA’s director of computing services.
“You basically request the OS stack, memory and storage that you require, and over e-mail, we provide you the ID for login and virtual private network access, and you can start developing,” Rivera says.
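RACE’s actual request flow runs through a web portal and e-mail, but the workflow Rivera describes maps naturally onto a provisioning call. The Python sketch below is purely illustrative: the endpoint, payload fields and response shape are hypothetical stand-ins, not DISA’s real interface.

    # Hypothetical sketch of a RACE-style self-service order.
    # The URL, fields and response format are invented for
    # illustration; the real RACE front end is a web portal,
    # and login credentials arrive by e-mail.
    import requests

    RACE_API = "https://race.example.mil/api/v1/instances"  # placeholder URL

    order = {
        "os_stack": "linux",        # requested operating system
        "memory_gb": 4,             # requested memory
        "storage_gb": 100,          # provisioned storage
        "environment": "test",      # test/development or production
        "billing_code": "PROJ-001", # pay-as-you-go: billed only for use
    }

    resp = requests.post(RACE_API, json=order, timeout=30)
    resp.raise_for_status()
    grant = resp.json()

    print("Login ID:", grant["login_id"])          # hypothetical field
    print("VPN endpoint:", grant["vpn_endpoint"])  # hypothetical field

When a project wraps up, canceling the service would be the analogous teardown request, which is what keeps the pay-as-you-go billing honest.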
DISA’s private cloud is a natural extension of the net-centric approach the agency has taken since it launched its classified and unclassified networks in the mid-1990s, Mihelcic says. Initially, RACE was available only for the Nonclassified IP Router Network (NIPRNet). But this October, DISA will make RACE available for use on the Secret IP Router Network (SIPRNet), where classified information and messages are exchanged.
“We can share data more effectively and build applications much more rapidly than before,” Mihelcic says.
During a typical week this summer, 170 users took advantage of RACE’s on-demand server and storage services, Rivera says. Hundreds of applications, including command and control systems and satellite programs, have been developed and tested on RACE.
To further accelerate application development, DISA in 2009 introduced a companion cloud service called Forge.mil, a site where developers can access open-source development tools, collaborate, and share and reuse software components. The service streamlines development because developers don’t have to reinvent the wheel on certain components, such as security artifacts, Mihelcic says.
Forge.mil comes in two flavors: SoftwareForge for development in public and ProjectForge for those who need to develop in private. So far, 5,000 users have signed on to the service, resulting in more than 400 software projects. It’s been so successful that the General Services Administration may create a similar service, Forge.gov, for all agencies.
“The journey we are on is to make developers more rapid and agile and lower the barrier of entry to build software and make it operational on the DOD network,” Mihelcic says.
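In day-to-day terms, that lower barrier looks like any hosted forge: find a vetted component and pull it into your project instead of rebuilding it. The sketch below is illustrative only; the repository URL and component name are placeholders, not real Forge.mil paths, and actual access sits behind DOD authentication.

    # Illustrative only: reusing a shared component from a hosted
    # forge rather than writing it from scratch. The URL and
    # component name are placeholders, and real access would
    # require DOD credentials.
    import subprocess

    REPO = "https://svn.example.forge.mil/security-artifacts/trunk"  # placeholder

    subprocess.run(
        ["svn", "checkout", REPO, "security-artifacts"],
        check=True,  # raise an error if the checkout fails
    )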
NASA Serves Up Cloud in a Container
If there’s one agency that should be in the clouds, it’s NASA.
The space agency in 2008 launched a $2 million private cloud pilot at Ames Research Center in Mountain View, Calif., providing servers, storage and high-speed network access as a hosted service to its researchers. The project, called Nebula, is now a NASA-wide program used for education, public outreach and mission support.
Rather than go through a lengthy procurement process, scientists can get the computing and storage services they need almost immediately from Nebula, says NASA CTO for IT Chris C. Kemp. It’s faster and cheaper, and they don’t have to worry about installation, maintenance and security. The cloud handles it for them.
“If you need 100 terabytes of storage and 1,000 cores of CPUs, the difference is six months versus six minutes,” Kemp says, comparing a NASA project buying its own equipment with provisioning data processing through Nebula.
Having access to the larger computing power of Nebula can accelerate research, Kemp adds. For example, scientists who buy small clusters of servers may take three months to process data. But with Nebula, they can provision 1,000 CPUs and have that same data processed in two or three days, he says.
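Nebula’s interfaces evolved over time (NASA’s work here later seeded the OpenStack project), so the following Python sketch shows only the general shape of such a request through an EC2-style API. The endpoint, credentials, machine image and instance sizing are placeholders, not Nebula’s actual values.

    # Illustrative only: batch-provisioning worker VMs through an
    # EC2-compatible API with the boto library. The endpoint,
    # credentials, image ID and sizing are placeholders, not
    # Nebula's actual values.
    from boto.ec2.connection import EC2Connection
    from boto.ec2.regioninfo import RegionInfo

    region = RegionInfo(name="nebula", endpoint="cloud.example.nasa.gov")
    conn = EC2Connection(
        aws_access_key_id="RESEARCHER_KEY",         # placeholder
        aws_secret_access_key="RESEARCHER_SECRET",  # placeholder
        region=region,
    )

    # 125 eight-core instances approximates the 1,000 CPUs Kemp cites.
    reservation = conn.run_instances(
        "ami-dataproc",            # placeholder machine image
        min_count=125,
        max_count=125,
        instance_type="m1.xlarge",
    )
    print("Launched", len(reservation.instances), "instances")

Whatever the API details, the point of Kemp’s comparison stands: the request is a few lines, not a procurement cycle.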
Nebula, housed in a shipping container, is built with open-source software and hardware from several manufacturers, including Cisco Systems' Unified Computing System. It provides about 12,000 cores of processing power, 16 petabytes of storage and 10 Gigabit Ethernet network connections. Kemp standardized on direct-attached storage to keep costs down.
NASA built its cloud in a container because containers allow a much higher density of equipment. And because the container is water-cooled, it takes less energy to keep the gear cool than it would in a traditional data center, Kemp says. Containers are also easier to deploy: the agency can expand the cloud simply by replicating the equipment in additional containers.
NASA considered turning to commercial cloud providers but built its own cloud because of its users’ unique needs, Kemp says. Its scientists work with vast amounts of data and require high-speed, high-bandwidth networks, which aren’t always easy to obtain commercially. Security was another reason: unlike commercial vendors’ clouds, NASA’s cloud is compliant with the Federal Information Security Management Act.
With Nebula’s initial success, NASA is expanding its cloud this fall by deploying a second container at the Goddard Space Flight Center in Greenbelt, Md. The container at Ames will serve West Coast users, while Goddard will serve East Coast users. “We want to keep the computing and storage close to users, so they have faster access to data,” Kemp says.
NASA’s research community will ultimately determine Nebula’s future, Kemp says. “If scientists adopt this model, there could be dozens of containers — easily.”
Going Public for the Public
This spring, the Recovery Accountability and Transparency Board (RATB) contracted with a third-party cloud provider to host its Recovery.gov website, which provides details on how stimulus funds are distributed and used.
RATB, created by the American Recovery and Reinvestment Act of 2009, oversees stimulus spending and is in charge of identifying fraud, waste and abuse. The agency, which previously paid GSA to host the site, switched to a third-party cloud-hosting firm to save money, improve security and allow its staff to focus on oversight, says Michael Wood, Recovery.gov’s director.
“It allowed us to get out of the nuts and bolts of worrying about running the data center and allowed us to focus on what we are really about, and that’s putting up rich content on the website,” Wood says.
The move will save a total of $754,800 across fiscal 2010 and 2011. Security is also improved because the provider’s security platform works in conjunction with RATB’s own security systems, he says. And moving the site to the cloud allowed the IT staff to redirect about $1 million in hardware and software expenses to oversight operations.
Most important, the move to the public cloud has worked well. Wood even gave the project the ultimate transparency compliment: “It’s been a positive experience. No one has noticed a difference.”