Cloud Computing: A Way to Reduce Risk?

Two Experts Offer Differing Views

NIST's Ron Ross sees the cloud as a helpful tool in reducing the complexity of managing data and keeping it secure. But security expert Eugene Spafford of Purdue University has a different point of view.


"The cloud provides opportunities for organizations to make some decisions about the criticality and sensitivity of their data," Ross, who leads the National Institute of Standards and Technology's efforts to develop information risk guidance, says in the first part of a two-part joint interview with Information Security Media Group [transcript below].

Once that less important data is moved to the cloud, Ross contends, the remaining critical information is more manageable.

Spafford, a computer science professor at Purdue, sees issues that often aren't discussed in cloud computing conversations. "Too often, organizations [are] told that moving things to the cloud will be safer and cheaper, and cheaper, as we know, is always what tends to dominate these conversations and lead to new vulnerabilities," Spafford says.

"I agree with [Ross] that good security and risk management requires that you have an understanding of what your assets are, where they're located and where the vulnerabilities are on your defenses," he says.

"Moving things into a cloud environment may not reduce complexity," Spafford says. "It may add to it because you may not know then where your assets are, how they're protected and what the threats are to them."

In the interview:

  • Ross explains why complexity is the biggest information risk to organizations and how moving data to the cloud can mitigate it;
  • Spafford explains why moving data into the cloud could add, not reduce, risk organizations face;
  • Both security experts strive to identify areas of agreement between them.

Ross, a NIST fellow, serves as the architect of the risk-management framework that integrates the suite of NIST security standards and guidelines into a comprehensive enterprise security program. He leads NIST's Federal Information Security Management Act Implementation Project as well as the Joint Task Force Transformation Initiative Working Group, a partnership of NIST and the Defense Department, the intelligence community and the Committee on National Security Systems, to develop a unified information security framework for the federal government.

Spafford also serves as executive director of the Purdue Center for Education and Research in Information Assurance and Security. A leading expert in information security, Spafford has served on the Purdue computer science faculty since 1987. His research focuses on information security, computer crime investigation and information ethics.

Addressing Complexity Using the Cloud

ERIC CHABROW: Why is complexity the biggest IT security risk and how can the cloud help mitigate that risk?

RON ROSS: There's no doubt that the federal government and many of our private-sector organizations today are building very complex information systems, with lots of hardware, software and firmware. All of this complexity goes back to the basic principles of computer science: you have to really understand what you have, how it's all put together, what the information flows are and how you enforce policies across all these complex entities. It just gets harder and harder the larger our digital footprint grows.

That's why we're hoping that some of the new computing paradigms, some of the new technologies provided by cloud computing environments with lots of virtualization, can help us manage and reduce that complexity. The cloud provides opportunities for organizations to make some decisions about the criticality and sensitivity of their data. We have a lot of different data types in the federal government. There are lots of different missions, and all of that data is important, but it's not all critical. Some data can be moved off to the public cloud. [It's] being able to decide where your critical assets are, what data is actually the most critical, and what you want to retain within your own organizational boundaries. If you can make those kinds of decisions and then move some of the other data, which is less important, into the public cloud - where maybe your risk might be a little bit greater, and you can decide whatever your risk tolerance might be - that's at least a first step to try to reduce some of that complexity. What remains behind within your organization is more manageable.

Issues with the Cloud

EUGENE SPAFFORD: I definitely agree with Ron in that good security and good risk management requires that you have an understanding of what your assets are, where they're located, where the vulnerabilities are on your defenses, where the threats are and how to counter them. All of those are important in protecting those assets, and complexity makes that more difficult because it's much more difficult to keep in mind all the threats, vulnerabilities and where the assets are located. Moving things into a cloud environment may not reduce complexity. It may add to it because you may not know then where your assets are, how they're protected and what the threats are to them.

ROSS: I wouldn't disagree with that part. In fact, it depends on what part of the problem we're looking at. What I'm suggesting is that even if you go to the public cloud, there's going to be some degree of complexity there - if you look at some of the major cloud providers, they have lots of customers and lots of information technologies within their environments.

However, what I'm saying is that if you move the information you consider less important, less critical, to the public cloud, then even if there's some complexity out there in the cloud, it reduces the complexity on your side of the problem. That information may be at greater risk - or maybe not, depending on what your risk tolerance is. We use FIPS 199 as our categorization standard, so we can divide information up into one of three categories: low, moderate or high impact.
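FIPS 199 rates each information type against three security objectives - confidentiality, integrity and availability - at one of those three levels, and the type's overall category is the highest of the three. The sketch below illustrates that "high water mark" rule in Python; the data types and ratings are invented for illustration and are not drawn from any NIST catalog.

```python
# Illustrative sketch of FIPS 199-style categorization. The low/moderate/high
# scale and the "high water mark" rule come from FIPS 199; the example data
# types and their ratings below are hypothetical.

IMPACT_ORDER = {"low": 1, "moderate": 2, "high": 3}

def overall_impact(confidentiality: str, integrity: str, availability: str) -> str:
    """The overall rating is the highest impact across the three objectives."""
    return max(confidentiality, integrity, availability,
               key=IMPACT_ORDER.__getitem__)

# Hypothetical inventory: data type -> (confidentiality, integrity, availability).
data_types = {
    "public web content": ("low", "low", "low"),
    "personnel records": ("moderate", "moderate", "low"),
    "mission-critical operations data": ("high", "high", "moderate"),
}

for name, (c, i, a) in data_types.items():
    level = overall_impact(c, i, a)
    # As discussed below, FedRAMP at the time had baselines only for
    # low- and moderate-impact systems.
    note = "FedRAMP baseline exists" if level != "high" else "no FedRAMP baseline yet"
    print(f"{name}: {level} ({note})")
```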

The FedRAMP program, for example - the Federal Risk and Authorization Management Program operated by GSA - is the program that's dealing with cloud services for the federal government. They've developed specific requirements for low-impact and moderate-impact cloud systems. They haven't done it yet for the high-impact range, because data there is much more critical: we define it as information so critical that a compromise or system breach could have a severe or catastrophic effect on a federal agency's mission or business operations. But for the low and moderate data, we've got specific requirements defined now, and our federal agencies are starting to move into the FedRAMP program and move some of that low and moderate data off to the public clouds. I agree with what Spaf said, but it reduces complexity on our side of the defense, not necessarily on the cloud provider's side.

SPAFFORD: I think your remarks actually illustrate two of the problems that I have when people talk about the cloud being more secure or providing simplification. You qualified that in two ways. First of all, you said moving the low and moderate risks. You have to understand the criticality of your services and data sufficiently well to determine which ones you might move, and many organizations don't have that understanding, or their data and services aren't separable.

The second is the public cloud versus a private or a hybrid cloud, and too often inside organizations they're told that moving things to the cloud will be safer and cheaper, and cheaper as we know is always what tends to dominate these conversations and lead to new vulnerabilities. The generic "it will be less complex to move to the cloud" or "it will be safer in the cloud" is too often as far as decision makers bother to look. They don't look for the nuances that obviously you're aware of and that are documented.

Classifying Data Types

ROSS: Again, I don't disagree with that. I think the challenge for every organization - and, Spaf, as you just illustrated - is to really have a good handle on what's critical, what's important within the organization's mission space, and how they're supporting their customers. We assume that organizations can do that. It doesn't all get done down to the level of every specific data type, because they're dealing with a lot of different data types.

One of the things that the feds are doing now - I'll give a shout to our NARA organization, the National Archives and Records Administration - they recently became the executive agent under the executive order on controlled unclassified information. They're working with every federal agency to develop common data types for the federal government, and they're now in the process of assigning levels of protection to those different data types. That's going to help provide more of a structure. We have something in our Special Publication 800-60, which goes with the FIPS publication I just mentioned, where you categorize data as low, moderate or high impact. The NARA work is going to help provide more structure. Having that structure is one thing, but actually using it and executing on it to make those decisions, so you can take the next step in moving information to a different environment - that's another issue.
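SP 800-60 applies the same scale one step up: a system that handles several data types takes, for each security objective, the highest impact found among those types. Below is a minimal sketch of that rollup, again with hypothetical ratings.

```python
# Illustrative rollup of per-type impact ratings into a system-level
# category, in the spirit of SP 800-60; all ratings here are hypothetical.

IMPACT_ORDER = {"low": 1, "moderate": 2, "high": 3}
OBJECTIVES = ("confidentiality", "integrity", "availability")

def system_category(type_ratings):
    """For each objective, take the highest impact across all data types."""
    return {
        obj: max((tr[i] for tr in type_ratings),
                 key=IMPACT_ORDER.__getitem__)
        for i, obj in enumerate(OBJECTIVES)
    }

# Hypothetical system that processes two data types.
ratings = [
    ("low", "moderate", "low"),       # e.g., routine correspondence
    ("moderate", "moderate", "low"),  # e.g., personnel records
]
print(system_category(ratings))
# {'confidentiality': 'moderate', 'integrity': 'moderate', 'availability': 'low'}
```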

Having said that, there's a need - as Spaf said - to understand the nuances of the cloud. There are cost issues in going to the cloud. It isn't always cheaper, because you sometimes have to restructure some of your basic mission and business processes to make them cloud-enabled, if you will, because it's more of an on-demand type of service. There can be some greater cost initially as you restructure those things.

Part of that process - and this ties back not just to cloud but also to enterprise architecture - is going through that analysis where you're categorizing your different data types. You're looking at your different mission and business processes. You're trying to find consolidation, optimization and standardization opportunities within some of these very large infrastructures. Just going through that drill gives you an opportunity to understand a little more about how things are put together within that IT infrastructure today and where there are opportunities to reduce and manage complexity. When I say that, I mean getting smaller in a good way, because we have a lot of redundancy, and it's not always necessary.

It's not a trivial task, but I think it's one that's tractable. It's something that organizations can take on, and I think it's a good investment of their time, because long-term they're going to save a great deal of money if they can re-engineer some of these basic mission and business processes to make them more cloud-enabled.

Low Visibility into Cloud Environments

SPAFFORD: I would add to this. Another aspect of this re-engineering is that a lot of organizations don't know how to do it; they don't have good procedures and tools to do it effectively. And the limited visibility into the cloud environment - not just the public cloud environment we might be using now, but what it will be in five or 10 years - significantly complicates making these decisions.

What I mean is, first of all, knowing exactly what measures some of these public cloud vendors have in place - fault tolerance, continuity of access, integrity of the data, maintaining confidentiality against leakage - and knowing that those measures will be kept at the appropriate level for as long as our data and services reside there is a major challenge. Cloud providers doing this as a contractual service will certainly have service guarantees written into a contract, but for things that are very valuable, there's a chance that the cloud provider can mess up, possibly even go out of business, and the recourse that's normally contractually available can be far less than the value of the services and the data, or even the ability to retrieve them. Having a cloud provider go out of business, for instance, or have all of its disks confiscated by a law enforcement agency, is a contingency that's very difficult for many organizations to plan for or even think about, particularly several years down the line. ...

One of the goals of the cloud is location independence. If some [cloud providers] are located across jurisdictional borders, it's now the case where ... the data that we're storing may be subject to laws that are different from what we expect, because the data is going someplace other than where we expect. We're already seeing that in some respects: some multinational corporations that are bound by privacy laws in Europe, Canada and other places that are more stringent than the U.S.'s do not want to use cloud providers in the U.S. because, once their data is located here, it doesn't have the same protections. It may be subject to search under national security letters or other kinds of measures. They have a concern. We haven't quite seen U.S. companies gain that same visibility and awareness yet. I'm concerned that when people talk about the cloud, these issues are not brought out. ...

FedRAMP

CHABROW: The FedRAMP program, which the agencies use to vet providers, does it address some of the concerns that Spaf just mentioned?

ROSS: It does, actually. In fact, we talked about some of those issues. When you talk about complexity, we really have two different aspects of it. There's the complexity that's involved in building the infrastructure, and then there's the complexity of the operational side that Spaf just mentioned, where you've got laws, cross-jurisdictional issues and lots of other things that come into play and can complicate the operation of that cloud as it appears to the customer.

In the FedRAMP program, one of the things that we've done is use the controls in 800-53, one of our special publications. Those controls also go as deep as you would like. ... In FedRAMP, there are controls that you can draw upon that require the provider to store the data within a U.S. facility, for example, or on U.S. territory. What we tell our customers is that if you put that requirement in your RFP ... you may limit which cloud providers respond, because it may be an onerous requirement for some providers and they may not bid on that contract. But it's a contractual relationship, and if it's really important to you to know where your data is stored, that's a requirement you can put into the contract.

The other point Spaf brought up was a good one. We've talked for a long time about the contingency plan if something happens to the cloud provider. In our control catalog, in the contingency planning family, we have a whole set of controls that organizations typically are required to implement with regard to continuity of operations. Under the FedRAMP program, the cloud provider now has to provide those contingency plans, and they have to be consistent with what you would expect in your own contingency planning operations.

One of the things that we have cautioned the federal agencies using FedRAMP about is what happens if a cloud provider goes bankrupt, as Spaf was saying before, while you have data at the moderate level in that provider's system. By definition, if that data is compromised, that's a serious potential adverse impact on your operations. With that knowledge, the only recourse you may have is to chase that provider through the court system, because if they go bankrupt, you're going to be lined up with all the other folks trying to get restitution, and it may take months and months, if ever, [before] that gets resolved. That consideration has to be on the table if you're putting moderate-level data into the cloud.

I think Spaf makes a good point that this visibility doesn't always come about for every organization that may be thinking about going to the cloud. These are issues that we should be talking about. Then, when you go into that relationship and that contract with the cloud provider, you go with your eyes wide open. You understand the implications if that provider goes belly-up, or if the facility has a fire or there's an earthquake and the cloud provider's services are down. What's their back-up plan? There are ways to do that, and to do it in a manageable way with regard to your risk tolerance.

Scrutinizing Agreements Carefully

SPAFFORD: I would just add two things to that. The first is that organizations need to understand the size of the business they would be bringing to a cloud provider and how much negotiating room they actually have to get statements built into contracts or service agreements. Many smaller organizations don't have the purchasing clout, and they should read the agreements pretty carefully, because the cloud providers may very significantly limit their liability or their service commitments when they don't have to go out of their way. The government is certainly a different purchasing entity than many smaller businesses that are being sold the idea of moving to the cloud.

The second issue is something that has been talked about and promoted as a benefit of the cloud to many smaller organizations: that the size of the cloud providers allows them to aggregate resources and apply better security or better back-up than smaller organizations can provide on their own. They can employ full-time people who know more about these issues, or they can license better security software. It's really important that organizations capture guarantees on that, because many of the cloud providers will claim it's better but aren't actually putting those services in place. They're saying it's really up to the customers to protect their own data, and that they're just providing the spinning real estate for disk storage or the virtual CPUs for the services or platforms. It's a visibility and awareness issue.

The cloud has great potential for small businesses that can't afford a big investment in their own platforms and services, but they have to be able to approach it with their eyes wide open and understand the trade-offs. From everything I've seen at conferences and in promotional literature, not everybody is being honest about this or giving it enough detail, particularly for those who aren't versed in security.

ROSS: I would agree with that, and I think that's one of the things the FedRAMP program has tried to address. NIST has helped the General Services Administration and the FedRAMP program build this third-party independent assessment capability. The assessment organizations are actually accredited under international standards, so they can go into a cloud provider, and if that provider has claimed to have put in the NIST security controls at the low and moderate impact levels, those controls actually get assessed for effectiveness by an independent organization. The cloud provider actually hires those folks to come in; they pick an organization that's been accredited. That addresses maybe not all of Spaf's concerns, but some of them, about making sure the cloud providers are actually doing what they say they're going to do.

Now, you can go control by control, and that assessment will give you evidence that will help you make that final risk-based decision. Certainly, cloud providers are no different from the Feds and their systems. There are going to be controls that you thought went into the system that never made it. There are going to be some controls that went in but aren't quite as effective as you thought because there was a problem with implementation. All of these things that we've been discovering in our systems for years - that visibility will hopefully now be there for the cloud providers as well.

This is not a "gotcha" drill as much as it is providing the Feds with good information so they can make good risk-based decisions. For smaller or even mid-size organizations that don't have the ability to run large assessments or make big investments in the controls, the FedRAMP program really can provide great value: they can leverage work that's already been done and that all the agencies can take advantage of.


About the Author

Jeffrey Roman

News Writer, ISMG

Roman is the former News Writer for Information Security Media Group. Having worked for multiple publications at The College of New Jersey, including the College's newspaper "The Signal" and alumni magazine, Roman has experience in journalism, copy editing and communications.



