Design, reliability and screaming speed. When you think Air Force, these three qualities are typically synonymous with jets. But they also drove the thinking of the Hill Air Force Base systems staff when they set out to upgrade the Air Force Materiel Command’s data center operations.
The command at the northern Utah base repairs and maintains F-16 and A-10 jet aircraft and intercontinental ballistic missiles. The mission: Keep these aircraft and missiles ever-ready for war. To do their jobs, military personnel rely on 80 applications supported by Oracle databases. Although the apps worked fine, in recent years, server sluggishness and downtime had become a problem, says Mike Jolley, chief of the Operational Policy Branch and program manager for the command’s computer center.
The aging infrastructure — 42 antiquated servers strewn across six buildings — needed a major overhaul. The 3- to 5-year-old servers, running Hewlett-Packard OpenVMS, Microsoft Windows and multiple flavors of Unix, crashed frequently, causing outages that lasted as long as four hours. In fact, when the information technology department analyzed data services over a three-month period, the servers crashed six times, recalls Doug Babb, chief architect of the base’s new data center systems design.
“It was hard to manage, unstable and highly volatile,” he says. “With our increasing server and storage demands, it would have been impossible to sustain the environment.”
This spring, the base’s IT team completed rollout of a state-of-the-art data center with new servers, new storage devices and a comprehensive data backup and recovery environment that gives the base the uptime, reliability, server power and storage capacity it needs.
The 11-month-long project serves as a model for server consolidation and disaster recovery planning that other federal agencies can emulate, says Chuck Horton, EMC’s client solutions manager for the Air Force. “They are trendsetters and have created a solution that can be repeated elsewhere,” Horton adds.
To build its new data center, dubbed Project BonFire, Hill Air Force Base’s IT department took a three-pronged approach:
The base replaced its 42 existing servers with 24 new x86 servers, resulting in massive cost savings, Babb says. “The x86 servers represent a small fraction of the cost, and [they’re] outperforming the legacy environment.”
In addition, the base was able to buy equipment to support new needs. It bought backup servers for continuity of operations and servers for a new testing environment that lets the IT staff try security patches and new applications before putting them in production. In the past, when problems arose, the IT team had to make fixes on the fly and run them in live production settings before pushing programs out to all users.
Although the price was right, there were other motivating factors for consolidating servers, says Jolley. Previously, many of the Oracle database applications ran on standalone servers with direct-attached storage. “We had a lot of stovepipe systems,” Jolley says.
A second issue was disaster recovery. Although it backed its data up to tape, the center had no redundancy and was unable to restore services immediately — or even quickly sometimes — when hardware faltered.
The IT department built the new data center using a mix of HP ProLiant DL585 and DL385 servers, which have dual-core AMD Opteron processors and up to 32 gigabytes of RAM each. The memory capacity and the move from 32-bit to 64-bit chips let each server handle 10 times more users, Babb says. Previously, each server could support between 200 and 400 users; now each server can handle up to 4,000 users.
To make its servers easy to manage and more affordable, the base standardized on one operating system, Red Hat Enterprise Linux. The base also built storage area networks and network-attached storage systems around EMC storage systems and a Brocade Communications Systems Fibre Channel network.
For continuity of operations planning, the IT team built a second data center, made up of seven servers, that replicates all the data from the primary data center. If a disaster takes down the primary center, the shift from one data hub to the other would be almost transparent to users. For added measure, the base also backs up data to tape nightly.
To ensure availability of applications and data, the IT department turned to grid computing. “Gridding allowed us to have multiple layers of redundancy,” Babb says. By pooling servers and storage devices into one big system, the center can handle complex computing tasks by spreading the workload across multiple machines, he says.
With Oracle Database 10g, Hill Air Force Base now runs its applications across the network of servers. The servers share the processing load, and if one fails, the others pick up the slack and keep apps running.
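In practice, Oracle RAC handles this failover transparently, but the behavior can be sketched in a few lines. The node names and the simulated outage below are hypothetical illustrations, not Hill's actual configuration:

```python
# Minimal sketch of client-side failover across a pool of grid nodes.
# Node names and the simulated outage are illustrative only; Oracle RAC
# performs this redistribution transparently in the real environment.

class NodeDown(Exception):
    """Raised when no grid node can service a request."""

def run_query(sql, nodes, node_is_up):
    """Try each node in turn; the first healthy one services the request."""
    for node in nodes:
        if node_is_up(node):
            return f"{node} executed: {sql}"
        # This node failed -- fall through so the next one picks up the slack.
    raise NodeDown("no grid node available")

nodes = ["db-node-1", "db-node-2", "db-node-3"]

# Simulate db-node-1 crashing: the workload shifts to db-node-2.
result = run_query("SELECT 1 FROM dual", nodes, lambda n: n != "db-node-1")
print(result)  # db-node-2 executed: SELECT 1 FROM dual
```

The design point is that clients address the pool, not an individual machine, so a single server failure costs a retry rather than an outage.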
The grid approach also improves load balancing. If one app needs more processing power, it can borrow CPU power from other servers. Before, Hill had widely fluctuating server utilization, Babb says. Some servers were overtaxed at 80 percent utilization, while others were underused at 3 percent utilization. Now, server utilization is about 40 percent across the grid, he says, leaving room for growth.
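At its simplest, the rebalancing Babb describes amounts to routing each new task to the least-loaded node. A toy sketch, using the before-and-after utilization figures quoted above (server names are hypothetical):

```python
# Toy load balancer: send each new task to the server with the most spare
# CPU capacity. Utilization figures mirror the ones quoted in the article;
# server names are made up for illustration.

def pick_server(utilization):
    """Return the name of the least-utilized server."""
    return min(utilization, key=utilization.get)

before = {"server-a": 0.80, "server-b": 0.03, "server-c": 0.40}
print(pick_server(before))  # server-b, the 3-percent-utilized machine
```

Spreading work this way is what pulls the 80-percent and 3-percent extremes toward the roughly 40 percent average the grid now sees.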
To store and protect the base’s 100 terabytes of data, the IT department also put in place a tiered storage strategy when it upgraded the data center. The most important data resides on high-end disk storage devices; less-crucial data has been moved to lower-cost disk devices. A third tier of storage systems handles data backup and archiving. Storage management software helps automate the flow of files between the different tiers.
“To improve total cost of ownership, you only use high-performance storage where you need it,” Babb says. “Tier 1 data has the highest performance and uptime characteristics, while in Tier 2, if applications go down for two hours, it wouldn’t hurt as bad.”
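The tiering policy Babb describes can be expressed as a simple routing rule. The thresholds below are illustrative assumptions, not the base's actual storage-management settings:

```python
# Sketch of a tiered-storage policy: place data on a tier according to how
# critical it is and how recently it was accessed. The 30-day threshold is
# a hypothetical value, not the base's real rule; storage management
# software automates this movement in practice.

def assign_tier(critical, days_since_access):
    """Return the storage tier (1, 2 or 3) for a piece of data."""
    if critical:
        return 1   # high-end disk: highest performance and uptime
    if days_since_access <= 30:
        return 2   # lower-cost disk for less-crucial data
    return 3       # backup and long-term archive

print(assign_tier(critical=True, days_since_access=1))     # 1
print(assign_tier(critical=False, days_since_access=7))    # 2
print(assign_tier(critical=False, days_since_access=400))  # 3
```

Only data that actually needs Tier 1 performance pays Tier 1 prices, which is the total-cost-of-ownership argument in Babb's quote.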
The base uses EMC Symmetrix DMX storage for the first tier, which houses 8TB of data. For the second tier, the center stores 50TB using grid-enabled SANs built around EMC CLARiiON CX700 and CX300 arrays. This tier also maintains backup copies of Tier 1 data. “In case of a catastrophe, we can recover the data within minutes,” Babb says.
Tier 3 servers at the backup center mostly support data backup and long-term archival needs. For this tier, which houses 42TB of data, the base uses EMC CLARiiON CX300s and tape libraries for data backup and EMC Centera Content Addressed Storage Systems for data archiving.
Now that the main effort is complete, Hill Air Force Base’s IT department is busy making improvements to the new data center.
For starters, Babb says, the IT team plans to bolster its backup technology by converting to disk-based virtual tape libraries that will give users faster access to offline data. Once Hill moves to a disk-to-disk-to-tape model, the VTL systems will sit in front of the tape libraries.
Additionally, Hill wants to move its second center out of state to better protect the data. The IT department is also investigating ways to use blade servers and VMware virtualization software to further consolidate servers.
With its central data center servers upgraded, the IT department has also embarked on an effort to consolidate about 400 file servers in use across the base.
For this effort, Babb and Jolley will use EMC Rainfinity Global File Virtualization software to create a unified view of the file servers and improve load balancing across these systems.
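The core idea behind file virtualization is a mapping layer: clients see one logical namespace while the virtualization layer decides which physical file server holds each path, so data can move between servers without clients noticing. A minimal sketch of that idea (server names and paths are hypothetical; Rainfinity performs this mapping and live migration transparently):

```python
# Sketch of file virtualization: a logical namespace maps client-visible
# paths to physical file servers. Paths and server names are made up for
# illustration; this is the concept, not Rainfinity's implementation.

class VirtualNamespace:
    def __init__(self):
        self._location = {}   # logical path -> physical server

    def publish(self, logical_path, server):
        self._location[logical_path] = server

    def resolve(self, logical_path):
        """Find which physical server currently holds a logical path."""
        return self._location[logical_path]

    def migrate(self, logical_path, new_server):
        """Move data to another server without changing the client path."""
        self._location[logical_path] = new_server

ns = VirtualNamespace()
ns.publish("/shared/reports/q1.doc", "fileserver-17")
ns.migrate("/shared/reports/q1.doc", "fileserver-02")   # rebalance load
print(ns.resolve("/shared/reports/q1.doc"))  # fileserver-02
```

Because clients only ever see the logical path, the IT team can shuffle data among the 400 file servers to even out load without breaking links or remapping drives.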
The success of Project BonFire makes these further data management improvements appealing, Babb says. “It’s just amazing the reliability, performance, savings and the reduced time to value that we get from consolidation.”