FedTech Interview: Dawn Leaf

Senior Executive for Cloud Computing, NIST

Cloud computing is definitely a game changer, says Dawn Leaf.

For agencies to take advantage of the technology will require work on their part, on industry’s part and on the part of organizations such as the National Institute of Standards and Technology. At NIST, Leaf is leading efforts to create a roadmap to speed standards development and identify the best ways to overcome common hurdles.

Leaf chatted with FedTech Managing Editor Vanessa Jo Roberts about NIST’s Cloud Computing Program and Leaf’s perspective on what to expect next.

FedTech: NIST has launched a multi-pronged cloud effort at the behest of federal CIO Vivek Kundra and is working with the CIO Council. Could you explain the key projects under way at NIST?

Leaf: NIST has been charged with a technical leadership role in advancing interoperability, portability and security standards and guidance to support agencies in secure and effective use of cloud computing. Our goal is also to more broadly support the advancement of cloud technology through our work and collaboration.

There are two parts to the NIST Cloud Computing Program: the strategic piece and the tactical piece.

First, to look at the strategic part of the program, NIST spent May through November of last year developing a strategy for cloud computing — our role in cloud computing.

We completed fact-finding with industry, standards organizations, academia, and federal, state and local government agencies, as well as the international community. And then, based on that input, in consultation with the federal CIO community, we defined our strategy. Basically, it’s a NIST-led effort to collaboratively develop a U.S. Government Cloud Computing Standards Roadmap.

“Roadmap” is a term that’s often used. For us, the goal is to develop a prioritized list of interoperability, security and portability standards, guidance, R&D, pilots, prototypes and policy decisions that are needed to facilitate agencies in their secure and effective application of cloud — in other words, it’s our job to advance cloud work in the area of standards and guidance.

FedTech: … give agencies a helping hand, make migration to cloud possible.

Leaf: Exactly. The roadmap then is the strategy for standards work. It will be directly used to drive and focus scarce resources on those cloud computing technology requirements that are the most critical, from the agency perspective, to support their mission.

It’s such a big space, and we are trying to narrow the focus to make sure that NIST is not only supporting agency efforts effectively but that we are really focusing our resources on the highest priorities from an operational perspective.

The strategy has three steps, and the steps are performed in parallel — iteratively and then incrementally. It’s a cycle, and each step is associated with a project and then a voluntary working group that’s open to the public.

So, Step 1 is defining target U.S. government business-use cases.

Step 2 is defining a neutral reference architecture, and that means a reference architecture for cloud computing that is not aligned with or constrained to a particular vendor’s product or service or implementation.

And then in Step 3, we bring these two together. We will use the reference architecture as a logical and physical model to frame the discussion about government-use cases to help understand how they can be implemented and supported using cloud services.

Agencies can then identify the gaps or, in other words, the requirements that are needed to make this happen. The list of priorities is the actual roadmap. So we will be exercising these target use cases against this concept of cloud computing and then figuring out where the holes are, where the government needs a standard or needs guidance to go forward.

FedTech: And these steps will align with an agency’s broader enterprise architecture?

Leaf: Absolutely. In fact, Step 1 from our perspective is really owned by the agencies — we are the facilitators. We will help bring the perspectives in the projects of the different agencies together, but obviously they own their mission space and they are the ones who best understand each use case.

We kicked off the strategy in November at the NIST Cloud Computing Forum and Workshop II. We publicly presented it to see what sort of reception it would get; we got a very positive reception across the board. Over the ensuing six weeks, we held one-on-one meetings with specific agencies to further define how the process would work and to gather their opinions and ideas on how exactly we would do this.

We have initiated the working groups, launched a collaboration site and completed some work already. By the end of March, we want to finalize the detailed plan based on that collaborative effort. We are holding another forum and workshop on April 7 and 8, and then will ramp up execution.

The goal is to have an initial draft roadmap completed by the end of fiscal 2011.

FedTech: Which is pretty quick.

Leaf: It is, and that’s where that iterative incremental step comes in. You have to start, but then you have a milestone to assess how well it’s working and then to change what you are doing. That covers the strategy piece.

Then, we have the tactical projects. There are two in particular. One is SAJACC — that’s the Standards Acceleration to Jumpstart the Adoption of Cloud Computing. SAJACC targets that interim period between when a technology model such as cloud computing is emerging and when standards are formalized, which is sometime later.

The goal of SAJACC is to reduce uncertainty during that interim period by defining interoperability, security and portability requirements and developing tasks. The tasks, when executed, measure the extent to which candidate interface specifications — the ones that are not yet formalized standards — satisfy the requirements.
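To make that concrete, here is a minimal Python sketch of the kind of requirement-versus-specification checking being described; the requirement names, the CandidateSpec class and the operation names are hypothetical illustrations, not actual SAJACC artifacts.

```python
# Hypothetical sketch: scoring how well a candidate cloud interface
# specification satisfies a set of portability/interoperability
# requirements, in the spirit of the SAJACC tasks described above.

from dataclasses import dataclass, field


@dataclass
class CandidateSpec:
    """A not-yet-formalized interface specification under evaluation."""
    name: str
    supported_operations: set = field(default_factory=set)


# Each requirement maps to the operations a candidate spec must expose
# for the corresponding task to pass.
REQUIREMENTS = {
    "export data in a provider-neutral format": {"export_data"},
    "import previously exported data": {"import_data"},
    "enumerate stored objects": {"list_objects"},
    "delete data on demand": {"delete_data"},
}


def evaluate(spec: CandidateSpec) -> dict:
    """Run every task against the spec and report which requirements pass."""
    return {
        requirement: needed_ops.issubset(spec.supported_operations)
        for requirement, needed_ops in REQUIREMENTS.items()
    }


if __name__ == "__main__":
    spec = CandidateSpec("ExampleStorageAPI",
                         {"export_data", "list_objects", "delete_data"})
    results = evaluate(spec)
    for requirement, passed in results.items():
        print(f"{'PASS' if passed else 'GAP '}  {requirement}")
    coverage = sum(results.values()) / len(results)
    print(f"{spec.name}: {coverage:.0%} of requirements satisfied")
```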

FedTech: This will help you meet the acceleration goal that you have set.

Leaf: Right. So, if I can tie it to the strategy a little bit: Let’s say, for example, that there is a hypothetical business-use case for the Census Bureau to use cloud services to set up their telework program. In that strategic process, we would go through scenarios to figure out what the different options are in cloud services using the reference architecture to make that business-use case work.

One requirement that might emerge, and this is also hypothetical, is that the Census Bureau would need to be sure that if they started using services from one cloud provider and put data into that provider’s environment, they could take it out and move to a different provider. That’s the portability piece.

What you would expect is that that strategic business case would then drive the requirements in a tactical SAJACC program. One of the reasons I mention that is because we are using the term “use case” in two different ways: Use case is just a methodology, an engineering term for defining requirements using scenarios. In the strategy, the use case is a target business-use case — at a very high level. But SAJACC uses really low-level generic technical use cases that could apply to many agencies’ business cases. It’s the same methodology; we are just using it two different ways, and there are two different sets of use cases.

FedTech: This hypothetical portability example at Census illustrates something every agency needs, right?

Leaf: Exactly. But the reason you need to do the business cases is that in executing that strategic piece, you may identify generic cases that you wouldn’t have thought of otherwise. There is broad community agreement generally as to what these generic low-level SAJACC cases are, but we don’t necessarily know which ones are most important.

FedTech: How to prioritize them ...

Leaf: Exactly. The other tactical effort is the Koala Project, which is a complex computing and simulation model. The goal is to study resource allocation algorithms that infrastructure-as-a-service providers would execute in their various solutions. We want to be able to characterize the behavior in a cloud environment when you need to expand resource capacity.
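This is not the Koala code itself, but a minimal Python sketch, under assumed numbers, of the kind of question such a simulation asks: given a simple allocation algorithm (first-fit here) and a stream of virtual machine requests, how many requests are rejected and how full do the hosts get before capacity has to expand?

```python
# Hypothetical sketch of an IaaS resource-allocation simulation: first-fit
# placement of VM requests onto hosts, tracking how many requests are
# rejected as demand grows -- the point at which capacity must expand.

import random


def first_fit(hosts, request):
    """Place a VM request (in capacity units) on the first host that fits."""
    for i, free in enumerate(hosts):
        if free >= request:
            hosts[i] = free - request
            return True
    return False  # no host can absorb the request


def simulate(num_hosts=20, host_capacity=16, num_requests=200, seed=1):
    """Run one demand scenario and report rejections and utilization."""
    random.seed(seed)
    hosts = [host_capacity] * num_hosts
    rejected = 0
    for _ in range(num_requests):
        request = random.choice([1, 2, 4, 8])  # VM sizes in capacity units
        if not first_fit(hosts, request):
            rejected += 1
    used = num_hosts * host_capacity - sum(hosts)
    return rejected, used / (num_hosts * host_capacity)


if __name__ == "__main__":
    rejected, utilization = simulate()
    print(f"rejected requests: {rejected}, utilization: {utilization:.0%}")
```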

FedTech: What do you view as the biggest benefits agencies can expect from migrating to the cloud?

Leaf: The bottom line — and it is a commonly held view — is that by providing IT resources as a utility type of service, agencies have the opportunity to drive down IT costs. Agencies also will be able to respond more quickly when they have new mission requirements or change mission requirements.

If you take that down a level, what the cloud computing model really does is extend the ability that we have today to leverage physical hardware and software capacity to support multiple uses.

Basically, you can support more computing systems and more users with fewer physical hardware and software resources, and you can get to the services more quickly and more easily.

One way to differentiate between cloud and more traditional data center resources and externally hosted resources is to think about each of the unique characteristics of cloud, one by one.

One characteristic is the ability of the consumer to initiate service demands. That means they can get to the service quicker than in traditional procurement models. Even when you outsource, you have to use a longer procurement lead time to get to those services.

The broad access through commonly available network and end-user devices again makes it easier to get to the IT services — you are not locked into a particular set of end devices, and you have many, many network options.

The multitenancy aspect, and the ability to respond to elastic demand, allows the consumer to only use and pay for the services that they need without having to invest in reserve capacity.

And then the metered and measured characteristic gives you a basis for determining how much capacity you have used, so that you can manage it and pay for it.

Those particular characteristics on top of the traditional model are what really give agencies the extra ability to leverage their resources, which is critical when they will have fewer resources to support more needs.

FedTech: Based on the timelines for your projects, when do you expect to see broader use and expansion of cloud?

Leaf: We expect that agencies will initially support the adoption of cloud computing using the private delivery model because there is less uncertainty there. It gets back to the physical control and physical security of the infrastructure.

The logical course of action is to expand from a private-cloud model to a hybrid or community model as agencies learn more, as there is more experience and more of an experienced user base.

A natural time to consider use of cloud is either when you are doing a technology refresh or when you have to make a new investment. That piece is the same. You always consider your next approach in computing services when you have a driver for change — either a new requirement or you need to reinvest.

It’s also natural to focus first on infrastructure because that’s where the most is known.

FedTech: You have had a long career in the government, and before that you worked on several government projects as a contractor. When you look at the things that have happened in technology in the government, how do you rank cloud relative to other changes?

Leaf: I do agree with the consensus that cloud computing is a transformational technology model.

Even if you consider the case of applying the cloud computing model only internally to improve return on investment, there is tremendous potential. It’s because we have a great and growing dependency on IT to support all of our missions. The flexibility we get from it — not just the cost reduction — will allow agencies to improve their ability to respond quickly to mission requirements.

I think cloud computing is a game changer, and it’s a changer in the sense that it will allow us to improve in ways that we can’t envision now because the technology is not fully defined.

Innovation takes place in response to operational problems; you innovate, and then you have new capabilities.

The cloud model is here; it’s not going away.

FedTech: What are some of the key challenges agencies will face?

Leaf: One set of the hurdles is not new: We do have security, privacy and data ownership issues and policy questions that are related to the potential separation of data from its physical source of origin. These hurdles emerged even with the development of Internet access and distributed systems.

FedTech: Virtualization and shared distributed systems.

Leaf: Exactly. But these issues have gotten a lot more attention now because of the focus on cloud computing. If you aren’t close to technology in general, it may appear or sound like these challenges are new just because of the phrase “cloud computing.”

That being said, there is a set of new technical and policy questions and issues emerging because we are using IT resources differently in cloud computing. There are new boundaries beyond the model of just data center outsourcing and subscriber services.

A few topics get the most attention and justifiably so.

One area, of course, is security. I think the real question is, how do we change the way we protect the data and the systems from the way we did it when they were physically stored and running on a single dedicated set of hardware and at a dedicated location?

And, because the model that’s emerging is one in which many of the cloud services are provided by a relatively small set of providers that use proprietary solutions, how do we ensure that we can switch providers easily? That’s portability.

Then, given the dependence of the cloud model on dynamically making use of services from different sources, how do we make sure that these services work together? That’s interoperability.

Another hurdle I would include is the fact that cloud computing is still emerging, so there are a lot of unknowns. This is true when there is any new technology model. Even though cloud computing is based on a foundation of existing technologies, there isn’t a large installed base compared to the total level of IT.

FedTech: That’s particularly true in the federal government, where some agencies have massive infrastructures and data processing capability.

Leaf: That’s a very valid related point, and not only for the government but for commercial organizations too. It’s much, much easier for very small startup firms using new services to decide to leverage commodity-type IT resources through a cloud provider than it is for a large, established commercial organization with mission-specific legacy system requirements.

FedTech: Legacy systems have always been an issue for any kind of paradigm shift of technology.

Leaf: It’s generally accepted that everyone uses cloud services to some extent. If you use free online social media or access free applications through the Internet, you are by definition using some services that are delivered through a cloud model. And while that may be true, it’s a relatively narrow set of applications.

The majority of IT services are still provided in traditional data centers, whether for the federal government or for business, and even in cases where the services are outsourced. So translating legacy systems to cloud services cost-effectively is something we have to look at very carefully, because they are not necessarily like the common commodity-type applications, which make use of a very common and narrow set of configuration requirements.

Going back to that example with Census: If you were to implement telework in the cloud, then you need to think about all the small legacy applications that might be distributed throughout Census where individuals or organizations have customized configuration requirements. Even though you can translate those to cloud services, you need to define an approach for migrating or replacing legacy applications that maintains the cloud model’s cost advantages.

FedTech: Let’s touch back again on security.

Leaf: One main focus in cloud computing security is the trade-off of control between the consumer and provider. Another aspect is the potential abstraction of the physical location of the data and the systems. Then, you have additional trade-offs depending on whether the delivery model is public, private, hybrid or community and whether the service level is infrastructure, platform or software.

Again, the risks in general are not new as a result of cloud computing when you compare it to distributed computing and outsourced computing, but they may be more complex, and they are more visible because of the attention focused on the cloud. A broader issue and challenge — regardless of whether the model is cloud or traditional outsourcing — is to evolve past security reliance on auditing and physical location and boundaries to develop technology for data protection and systems protection regardless of the location.

FedTech: In the past few years, the move has been toward data protection. Plus, continuous monitoring will probably come into play because it looks at the environment more dynamically.

Leaf: Yes, these are both trends. The way continuous monitoring works is that it uses automatically collected information about the configuration and the security measures and events that take place, and then constantly provides a reading on that or calculates an assessment of the security posture. And of course that’s where the potential physical abstraction comes into play in cloud. Even in a cloud computing model, continuous monitoring relies on getting accurate measurements from the hardware and software components where the data, software and systems are deployed.
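As a rough illustration of that idea, the following Python sketch rolls automatically collected measurements up into a single posture reading each monitoring cycle; the check names, weights and threshold are hypothetical, not drawn from any NIST guidance.

```python
# Hypothetical sketch of continuous monitoring: each cycle, automatically
# collected measurements are scored and rolled up into a single security
# posture reading that can be tracked over time or alerted on.

CHECK_WEIGHTS = {  # hypothetical checks and their relative weights
    "patches_current": 3,
    "encryption_enabled": 3,
    "config_matches_baseline": 2,
    "no_critical_events_last_24h": 2,
}


def posture_score(measurements: dict) -> float:
    """Weighted fraction of checks that pass, from 0.0 (worst) to 1.0 (best)."""
    total = sum(CHECK_WEIGHTS.values())
    passed = sum(weight for check, weight in CHECK_WEIGHTS.items()
                 if measurements.get(check, False))
    return passed / total


if __name__ == "__main__":
    # One monitoring cycle's collected measurements (illustrative values).
    measurements = {
        "patches_current": True,
        "encryption_enabled": True,
        "config_matches_baseline": False,
        "no_critical_events_last_24h": True,
    }
    score = posture_score(measurements)
    print(f"security posture: {score:.0%}")
    if score < 0.8:
        print("posture below threshold, flag for review")
```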

What you will see — not only in the federal government but for commercial service providers, too — is the creation of private cloud services where the physical boundaries and controls are still in place even though the cloud model characteristics apply, such as dynamic response to elastic demand and end-user-initiated demand.

What happens currently is that a commercial vendor sells computing services to the government or to another user using the cloud model, thereby making it a publicly available service, but the actual implementation is still a protected private cloud. What has to happen to move past this model to a true public, interoperable cloud is technology development that gives the owner the ability to protect data regardless of its physical location. This isn’t a case where the commercial environment has implemented it and the government has not; it’s that the whole technology base needs to move.

FedTech: Are there also issues of bandwidth and other infrastructure components for agencies as well?

Leaf: There are. There are building-block technologies that are the foundation of cloud: These are virtualization, network capacity and the security infrastructure. All of this is, of course, provided by industry.

One interesting point is that agencies need these foundation technologies and capacities regardless of whether it’s a cloud model or not. It’s just that the way they access or use them may change.

So to your point, if you look at the Defense Department example — and again, this is hypothetical — if they wanted to make a broad change to use cloud computing services, they have such a large, geographically dispersed client base that, regardless of a public or private cloud model, they would need to ensure that telecommunications performance supports their entire user base, or identify the exceptions as part of their strategy.

What we are really focusing on is that we are expanding the boundaries with cloud. These are the same issues we have already dealt with in data centers and outsourcing, but now we are bumping up against the edges of those boundaries. It’s the same thing with security measures and potential performance issues or latency.

When we dynamically reassign resources — going back to the NIST Koala model on resource allocations between virtual machines — we need to look at how that’s going to behave in a new cloud model.

FedTech: What about internal collaboration within government — how crucial is that, and how do you see that working so far?

Leaf: We are seeing very strong collaboration, not only in the area of cloud computing but in cybersecurity and other areas. The reason that that’s important is that there is a lot of commonality in the requirements.

Even though there are some mission-specific and mission-critical requirements that each agency understands best, there is also a lot of commonality. What we are seeing is a real willingness to work collaboratively. The reason collaboration works is that there is a fundamental benefit to applying the different strengths of the agencies.

We all have very scarce and limited resources and when you look at something complex like cloud, the devil is in the details. You really do need to look at those specific agency implementations, and it just makes sense to leverage our knowledge and our resources together and to do that with industry toward a common objective.

FedTech: The little guys may not have enough money to go it alone, but if they can ride with the bigger players, they have the same opportunities.

Leaf: That’s true — and that’s actually true for industry and standards development organizations and nation states, as well. Of course, NIST is very sensitive to the role of being neutral and supporting a level playing field — by sharing this information publicly and broadly, you really do go a long way towards leveling the playing field for the small players.

FedTech: And there is the driver of data center consolidation. That’s a huge push.

Leaf: It’s a model complementary to cloud computing. Cloud just takes it several steps further.

FedTech: But in terms of drivers, from the administration’s point of view, that obviously would be a factor.

Leaf: It is. From a cost perspective alone, it’s generally recognized that IT services are becoming more and more a part of each agency’s mission. Many agencies have a relatively narrow capital investment scope. In some cases all they may have in terms of significant investment is IT and people. That means when they are trying to drive down costs, they either have to focus on IT or they have to focus on people. The obvious goal is to reduce IT cost if possible.