FISMA Reform Without Reforming FISMA

New NIST Guidance Advances IT Security Without New Law
Must Congress reform the 7-year-old Federal Information Security Management Act to get government agencies to move away from paper compliance to real-time monitoring of digital assets to show their IT systems are secure? Not necessarily, says one of the leading computer scientists at the National Institute of Standards and Technology.

Among the top goals of FISMA reform legislation before Congress is to codify practices agencies must follow to measure IT security in real time, and not annually or triennially as is the current practice.

But Ron Ross, NIST senior computer scientist and FISMA implementation project leader, says a draft of revised guidance from NIST, known as Special Publication 800-37, shows that a move to real-time metrics doesn't require new laws. A collaborative effort involving NIST, the Defense Department, the Office of the Director of National Intelligence and the Committee on National Security Systems is behind the new guidance, which should change the way federal agencies and other organizations secure their IT systems without Congress telling them to do so.

"Legislation will come and legislation will go," Ross said in an interview (transcript below). "We are making fundamental changes on the ground here that will significantly impact our federal agencies' ability to protect their systems."

The release of Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach represents a sea change in the way federal agencies determine the safety of their digital assets.

"The obvious ones that everybody is talking about are the continuous monitoring aspects (of the guidance), and this really reflects the significant uptick in the threats and the types of attacks that we have seen grow almost exponentially over the past couple of years," Ross said. "The adversaries are launching more attacks, they are more sophisticated, and we have to have the tools, techniques and technologies available, and deploy the appropriate strategies and tactics, to really make a difference in helping defend our systems."

In the interview, Ross also addresses:

  • How the six-step risk management framework aimed at building security into new technology can be employed to minimize risk in legacy systems.
  • How a three-year collaboration with information security experts from the military, intelligence agencies and the private sector produced common guidance.
  • How architecting security into a system from the beginning will alleviate problems in the future.

Ross was interviewed by managing editor Eric Chabrow.

Ross also supports the State Department in the international outreach program for information security and critical infrastructure protection. He previously served as the director of the National Information Assurance Partnership, a joint activity of NIST and the National Security Agency.

A graduate of the United States Military Academy at West Point, Ross served in a variety of leadership and technical positions during his 20-year career in the Army. While assigned to the National Security Agency, he received the Scientific Achievement Award for his work on an interagency national security project and was awarded the Defense Superior Service Medal upon his departure from the agency. He's a two-time recipient of the Federal 100 award for his leadership and technical contributions to critical information security projects affecting the federal government. During his military career, Ross served as a White House aide and as a senior technical advisor to the Department of the Army.

Last month, the Information Systems Security Association named Ross its distinguished fellow, the group's highest tribute, for his leadership in the development of influential information security documents.

Ross is a graduate of the Program Management School at the Defense Systems Management College and holds a master's degree and a Ph.D. in computer science from the United States Naval Postgraduate School.

ERIC CHABROW: First off, why is the revision of SP 800-37 significant?

RON ROSS: There are a lot of reasons. I think the obvious ones that everybody is talking about are the continuous monitoring aspects, and this really reflects the significant uptick in the threats and the types of attacks that we have seen grow almost exponentially over the past couple of years. The adversaries are launching more attacks, they are more sophisticated, and we have to have the tools, techniques and technologies available, and deploy the appropriate strategies and tactics, to really make a difference in helping defend our systems.

So the new 37 is intended to recast the previous C&A process, that is certification and accreditation, that we have been using for decades, to reflect the up tempo of the threat space that we operate in today.

CHABROW: There is a lot of talk in Congress about reforming FISMA, but doesn't the revised 800-37 demonstrate the fact that reforms in information security can occur even without a new law?

ROSS: Well, that is very true. I mean, legislation will come and legislation will go, and obviously at NIST we are charged with implementing legislation when they call on us to do so. But we have been working at the grass-roots level for the past three years to do some very important things. The 800-37 is just the second of five documents that we are working on with our partners in the Defense Department and the Office of the Director of National Intelligence, in association with the Committee on National Security Systems. The first of these publications, which we produced in August of this year, was Special Publication 800-53, Revision 3, which really unified all of our security requirements and controls in one catalog that all federal agencies can now use.

So the ability to make change is happening at the grass-roots level. The 37 document, again, is part of the unified framework and part of the partnership with the DNI and the DOD. We are making fundamental changes on the ground here that will significantly impact our federal agencies' ability to protect their systems, and we are hoping that the private sector will also choose to adopt some of these standards and guidelines on a voluntary basis to help protect whatever missions or business operations they have going as well, because they are actually subject to the same types of attacks that we are.

CHABROW: I was going to get to it a little later, but let's talk a little about this: this publication was developed in cooperation with the Joint Task Force Transformation Initiative, which is, I guess, the group you were just referring to. You are a member of that interagency working group, so tell us about the initiative and why this type of collaboration is important in developing guidance to secure federal government IT assets.

ROSS: It is a recognition of some very important things that have occurred over the past five to 10 years, maybe more in the last five years. There has been kind of a blending of the national security interests and those on the economic side. When I first came into the business, probably 20 years ago, there was a very bright line between what were considered national security things and what were on the other side of that divide, so to speak. But when you talk about the great dependence that we have on information technology today, not just at the federal level but state and local governments and private sector organizations, that dependency, and all of us using pretty much the same commercial off-the-shelf products to build our systems and carry out our missions, that commonality really drove us to a point where we had to collaborate.

We have found that most of the things that we were doing, whether it was on the Defense Department side, the intelligence community side, or the NIST civil side, we were doing about 95 percent of those things in common, and there were only a small number of things where we really diverged with regard to the protection of these systems.

I think having a unified framework shows leadership, number one; it presents a unified approach to all of our support contractors, so a contractor is not developing three different types of solutions for the intelligence folks, the Defense Department and then the NIST civil side and our customer base. It gives us a unified structure in how to deal with some of these very persistent and advanced cyber threats. It is the right thing to do. It will make us more cost effective, give us better solutions, and I think it will provide a much better foundation as we go forward in the future.

CHABROW: Back to SP 800-37: it promotes what is called "near real-time" risk management and ongoing information system authorization through the implementation of robust continuous monitoring. What does "near real-time" mean?

ROSS: Well, near real time is a recognition that it is very difficult to do some of the types of things that we know we have to do with security on a real-time basis. It is an attempt to say let's monitor on a much more frequent basis than we have in the past, but recognize that getting to a true real-time stage may not be achievable. So the definition of near real time is a way to increase the up tempo of how we are monitoring our key controls and our ability to take action on what we find in a more timely manner.

But the bigger part of the 800-37 that sometimes goes unnoticed is we are trying to also change the focus from what I call "back-end security" to what is commonly referred to as "front-end security." What that means is that we spend a whole lot of time chasing vulnerabilities and trying to fix things after these systems are deployed and we don't spend nearly enough time on building the right types of products and engineering those products into more secure systems at the front-end of the process.

So with the new 37, by using the six-step risk management framework, which goes through the lifecycle approach, we are now able to put as much emphasis on defining good sets of requirements and controls up front, making sure those are implemented correctly and operating as they are supposed to, and then finding out, through some kind of testing and evaluation process, how effective those controls really are, and then assuming some level of risk at the end of that whole process. Moving into the continuous monitoring mode then takes us forward in time, so as the threat space changes, as our technologies evolve, as our missions change, we can react to those changes and look at the security impact, the security state of our systems, after those changes have taken place.

The other big story of the 37 is a tighter integration into enterprise architecture and also into the system development lifecycle process.

CHABROW: Tell us a little bit more about this lifecycle approach.

ROSS: The lifecycle approach is really characterized by the six-step risk management framework, which we developed very early on to try to unify all of our standards and guidelines. And it really goes back to building security into your missions and business processes early in the development process. We tend to want to look at security late in the process, after we purchase products, after those products have been integrated.

By defining your requirements early, and I'm talking about when you are first defining what the core missions of the organization really are and how those business operations are carried out, what are the information flows that are going through to allow you to carry out those missions? And then eventually you get down to purchasing technologies that will actually reflect the hardware, the software, the firmware and the applications that allow you to carry out those types of activities.

But getting back to the enterprise architecture, that is where you first start to define the important requirements. And so having the new C&A process reflect those six steps will really ensure that we get started early, we define our requirements to the best of our ability, and then we can go through the steps in sequence in the RMF, the Risk Management Framework, to make sure that the requirements actually ended up in the system and the products that we specify. And of course there will always be residual risks, because we can never get perfect technology or perfect solutions; we will always have to manage risk to make sure that, whatever we have deployed and however we end up using that technology, we are able to manage the risk in an appropriate way. Everybody has a risk tolerance, and you have to decide what that is so you can effectively protect whatever missions you are asked to carry out.
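The sequence Ross describes follows the six RMF steps named in SP 800-37: Categorize, Select, Implement, Assess, Authorize, Monitor, with the final step feeding back into the cycle. A minimal sketch of that lifecycle, with a placeholder system name and purely illustrative step logic:

```python
# Sketch of the six-step Risk Management Framework lifecycle from SP 800-37.
# The step names are from the publication; the function itself is a toy
# illustration, not any official NIST artifact.

RMF_STEPS = ["Categorize", "Select", "Implement", "Assess", "Authorize", "Monitor"]

def run_rmf(system: str) -> list[str]:
    """Walk a system through the six RMF steps in order, returning a log."""
    log = [f"{step}: {system}" for step in RMF_STEPS]
    # In practice "Monitor" is continuous: changes in threats, technology
    # or mission feed back into the earlier steps (near real-time monitoring).
    return log

for entry in run_rmf("payroll-system"):
    print(entry)
```

The point of the sequence is the front-loading Ross emphasizes: requirements and controls are defined in the first two steps, before any product is deployed, and monitoring closes the loop rather than starting it.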

CHABROW: Obviously most of the systems exist already; they are legacy systems. How does this lifecycle fit into that?

ROSS: That is a very, very common question, because I would say the vast majority of our federal systems are legacy systems. The risk management framework is a perfect tool, even in a legacy environment. The way you can use the framework is to start with the same six steps, beginning with the categorization step, which is looking at the value of information within whatever missions you are being asked to carry out. You are going to go back and look at what security controls you would select today, knowing what you know about the threat, your environment of operations and what your critical missions are.

If you were going back today and you were developing your security plan and you were selecting a set of controls that you think would be the optimal set to really protect against a certain class of adversaries, you can go ahead and develop that plan and then you can use that plan to compare against what you actually have deployed into the current system that you are operating.

And what that may show is what we call a "gap analysis." These are the controls that I should be deploying, these are the controls that I actually have deployed, so where is the delta between what I have and what I need? That would be very instructive to know, because it can then drive your plan of action and milestones, which is a document that talks about the weaknesses and deficiencies that exist, and we have them in all systems, and what is my prioritized approach for filling that gap, or bringing it up to code, so to speak.
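The gap analysis Ross describes reduces to a set difference: the controls you would select today minus the controls actually deployed. A minimal sketch, where the SP 800-53-style control identifiers are hypothetical examples, not an official baseline:

```python
# Sketch of the legacy-system "gap analysis": compare the control set you
# would select today against what is actually deployed. The control IDs
# below are illustrative placeholders in NIST SP 800-53 style.

def gap_analysis(planned: list[str], deployed: list[str]) -> list[str]:
    """Return controls that are planned but not yet deployed (the delta)."""
    return sorted(set(planned) - set(deployed))

planned_controls = ["AC-2", "AU-6", "CM-6", "IR-4", "SI-4"]   # selected today
deployed_controls = ["AC-2", "CM-6"]                          # in the legacy system

gap = gap_analysis(planned_controls, deployed_controls)
print(gap)  # -> ['AU-6', 'IR-4', 'SI-4']
```

The resulting delta is what would feed a plan of action and milestones: each missing control becomes a prioritized remediation item.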

About the Author

Eric Chabrow

Retired Executive Editor, GovInfoSecurity

Chabrow, who retired at the end of 2017, hosted and produced the semi-weekly podcast ISMG Security Report and oversaw ISMG's GovInfoSecurity and InfoRiskToday. He's a veteran multimedia journalist who has covered information technology, government and business.
