Infosec Guru Ron Ross on NIST's Revolutionary Guidance

As the government looks to deploy cloud computing and other new technologies securely, just-issued guidance from the National Institute of Standards and Technology shows how agencies can pool resources to qualify technologies and services for purchase and, in turn, save taxpayers millions of dollars, says a senior NIST computer scientist.

Traditionally, each agency had been required to have its own authorizing official judge whether the technology or service being acquired met certain IT security standards. But Special Publication 800-37, Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach, shows how agencies can either have their authorizing official team up with counterparts from other agencies - known as joint authorization - or piggyback on the work performed by another agency's authorizing official - leveraged authorization - to qualify IT products and services for acquisition.

Ron Ross - who led the team that wrote the revised SP 800-37, which NIST released late last month - explained in an interview with GovInfoSecurity.com (transcript below) how the new process works:

"For example, the GSA (General Services Administration) may go out and accredit or authorize cloud providers information systems. Then, there may be a string of other federal agencies that decide somewhere down the line, after that authorization is completed, that they also want to use that cloud providers services. Instead of having to go back and having each of those agencies do a complete reauthorization for their own purposes, they can now use the documentation and evidence used as part of that first agency's authorization, and use that as the basis of their risk decision."
"This has the potential to save the federal government literally millions of dollars so every agency doesn't have to go forward and do the same process over and over and over again.

Ross, in the conversation with GovInfoSecurity.com's Eric Chabrow, also discussed the:

  • Importance of the new guidance in providing real-time monitoring of IT systems.
  • Challenges federal agencies face in adopting NIST IT security guidance.
  • State of cybersecurity in the federal government.

Ross, the highly regarded NIST senior computer scientist and information security researcher, serves as the institute's Federal Information Security Management Act implementation project leader. He also supports the State Department in the international outreach program for information security and critical infrastructure protection. Ross previously served as the director of the National Information Assurance Partnership, a joint activity of NIST and the National Security Agency.

A graduate of the United States Military Academy at West Point, Ross served in a variety of leadership and technical positions during his 20-year career in the Army. While assigned to the National Security Agency, he received the Scientific Achievement Award for his work on an interagency national security project and was awarded the Defense Superior Service Medal upon his departure from the agency. He's a two-time recipient of the Federal 100 award for his leadership and technical contributions to critical information security projects affecting the federal government. During his military career, Ross served as a White House aide and as a senior technical advisor to the Department of the Army.

Ross is a graduate of the Program Management School at the Defense Systems Management College and holds a master's degree and a Ph.D. in computer science from the United States Naval Postgraduate School.

ERIC CHABROW: Before we get into the specifics of NIST 800-37, I would like to get your characterization of the current state of cybersecurity in America. At a hearing on Feb. 23, former National Intelligence Director Mike McConnell said that if we were in a cyber war today, the United States would lose. Is the state of cybersecurity in the United States and the federal government that bad?

RON ROSS: I think we are evolving. With data security, we have made enormous progress in the last decade. We have had so many areas to attend to at one time; there has been a tremendous amount of progress, but there is still a lot of work to be done. His characterization reflects the situation: we continue to see a fairly devastating set of cyber attacks being launched by adversaries around the globe, and these are still impacting our systems within the federal government and also within the private sector. I think this is going to be an ongoing thing. I don't think there is ever going to be a point where we are actually done and can say everything is buttoned down and we are totally secure.

Technologies are evolving: we are going to cloud computing, we are moving into new technology areas, we are increasing our network connectivity all of the time, and we are using technology aggressively. Our job as security professionals is always going to be finding a way to characterize the risk in an appropriate way - and I am talking about risks to our operations, assets, individuals and missions. How do we characterize that risk and make sure we are applying all of the security controls we need to protect these missions and operations to the best of our ability?

I do agree that there is still much work to be done, but I also agree that we have made an awful lot of progress as well.

CHABROW: In our previous chats, you have maintained that if federal agencies, as well as other organizations, rigorously followed NIST cybersecurity guidance, many of the problems they face in securing their digital assets would be solved. Rep. David Wu, who chairs the House subcommittee that has NIST oversight, says NIST has created some great guidance, but it is not necessarily in a language that everybody understands. Are the standards too complicated to follow? If that's not the case, why do you think it is hard for some agencies to follow the IT security guidance that NIST provides?

ROSS: It was Albert Einstein who said things need to be as simple as possible, but no simpler. We try, in all of our standards and guidelines, to provide guidance that is both technically correct and implementable. To reach that bar, it has to be understandable to our constituents.

Now, some of the issues we deal with are by nature complicated, but I believe the way our guidance is written - and you can go down every one of our publications, from 800-53, where we have our management, operational and technical controls, all the way to the new risk management framework, which characterizes a new C&A (certification and accreditation) process - makes it very understandable to our constituents, and I believe they are able to implement, and are implementing, those guidelines very effectively today.

CHABROW: Let's talk about the guide for applying the risk management framework to federal information systems. Briefly explain the main points presented in this publication and why they are so important.

ROSS: In our redesign of the certification and accreditation process, there were certain very large issues that we wanted to make sure we addressed in the new 800-37. The one that comes to the top of the stack for me, and the most important one, is achieving what I call "better front-end security." I use the term front-end security to convey that building better products and building better systems ends up with better security at the end of the day.

We talk about building security into these systems early in the life cycle. By using the risk management framework, which has our six-step process that characterizes a good security program, we are able to spend as much time up front defining good sets of requirements and good sets of security controls, and implementing those controls in the most effective manner we can. Once we get those steps accomplished, we can go on to the traditional activities of certification and accreditation, where we assess the controls to see how effective they are. We then end up with whatever deficiencies and weaknesses remain - residual vulnerabilities that remain uncovered - which can go into some sort of risk determination and risk acceptance process.
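The six steps Ross refers to are the RMF steps defined in SP 800-37: Categorize, Select, Implement, Assess, Authorize and Monitor. As a rough illustration of the front-end-first ordering he emphasizes, here is a minimal Python sketch; the step names come from the publication, but the class and its fields are hypothetical.

```python
# A minimal sketch of the six RMF steps from SP 800-37 as an ordered
# workflow. The step names follow the publication; everything else
# (the dataclass and its fields) is illustrative only.
from dataclasses import dataclass, field

RMF_STEPS = [
    "Categorize",   # categorize the information system by impact level
    "Select",       # select a baseline of security controls (SP 800-53)
    "Implement",    # implement the controls in the system
    "Assess",       # assess whether the controls are effective (SP 800-53A)
    "Authorize",    # accept the residual risk and authorize operation
    "Monitor",      # continuously monitor controls and risk posture
]

@dataclass
class SystemAuthorization:
    """Tracks a system's progress through the RMF life cycle."""
    system_name: str
    completed: list = field(default_factory=list)

    def advance(self, step: str) -> None:
        # Enforce the front-end-first ordering Ross describes:
        # each step must follow the one before it.
        expected = RMF_STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected step {expected!r}, got {step!r}")
        self.completed.append(step)

auth = SystemAuthorization("payroll-system")
for step in RMF_STEPS:
    auth.advance(step)
print(auth.completed)
```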

The ability for us to focus on the front-end is greatly increased by using the risk management framework because each of those steps is equally important and we don't end up focusing all of our attention on just the certification and accreditation part of the process, which has always been important, but no more important than getting things done up front in the right way.

By the same token, it also allows us to focus on continuous monitoring after we have made the initial authorization and risk acceptance decision, and that is really where the action is today. Continuous monitoring is critical to making sure we understand, on an ongoing basis, the security state of our systems - not just every three years or every six months, but on a day-by-day, hour-by-hour basis. That is the tempo our adversaries are operating at today as they launch these very sophisticated cyber attacks against our critical systems.
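To make that shift in cadence concrete, here is a minimal sketch of a monitoring loop that assesses controls continuously rather than waiting for a periodic reauthorization. check_control() and the control list are hypothetical stand-ins for an agency's real assessment tooling; the point is the cadence, not the checks themselves.

```python
# A minimal sketch contrasting periodic reaccreditation with the
# continuous-monitoring posture Ross describes.
import random
import time

CONTROLS = ["AC-2 account management", "SI-4 system monitoring", "CM-6 config settings"]

def check_control(control: str) -> bool:
    # Placeholder assessment; a real implementation would query
    # scanners, logs, or configuration-management data.
    return random.random() > 0.05

def continuous_monitoring(interval_seconds: float, cycles: int) -> None:
    # Assess controls on an hour-by-hour (or faster) cadence instead
    # of waiting for a triennial reauthorization.
    for _ in range(cycles):
        failing = [c for c in CONTROLS if not check_control(c)]
        if failing:
            print("risk posture changed; flag for authorizing official:", failing)
        time.sleep(interval_seconds)

continuous_monitoring(interval_seconds=0.1, cycles=3)
```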

CHABROW: What happens next in getting agencies to follow this guidance?

ROSS: As with all of our publications, as they are updated, agencies have one year to implement the new guidance for their legacy systems. Any brand-new systems currently going through life cycle development will be expected to comply with the new guidance as they are fielded. That policy has already been established by OMB (Office of Management and Budget) and continues to be carried out through our guidance. So there will be a transition period with all of these new publications as agencies start to adopt the new guidance.

CHABROW: You issued a draft of this revision last fall. Is there anything new in the final revision that was added since the draft was issued?

ROSS: Yes. There are a couple of new things in here that I think our customers are going to be very excited to see. We start to address something that has been going on for a long time: service-oriented architectures and cloud computing are examples of what we characterize as dynamic subsystems within the new 800-37.

This is an acknowledgement that our classic information system boundary, which for years and years we viewed as static, has now become more porous. As we start to use external services and build service-oriented architectures, sometimes the components of your system are not there all of the time; they get brought in on an on-demand basis. We address dynamic subsystems in the new 800-37 by talking about how to make sure that those services, wherever they emanate from, have some standard of security due diligence applied to them, too. That way you can make certain assumptions, and you can establish certain constraints on how those services are used and how they impact the other operating parts of the system you are using to carry out your core missions.
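One way to picture a dynamic subsystem is as a component that is admitted inside the system boundary only if it meets constraints the organization established up front. The sketch below is illustrative only; the fields and the minimum-assurance rule are assumptions, not SP 800-37 text.

```python
# A minimal sketch of the "dynamic subsystem" idea: components join the
# system boundary on demand, and each one is admitted only if it meets
# security constraints established up front.
from dataclasses import dataclass

@dataclass(frozen=True)
class ExternalService:
    name: str
    authorized: bool        # has the provider been authorized?
    assurance_level: int    # due-diligence level asserted by the provider

MIN_ASSURANCE = 2  # constraint the acquiring organization sets up front

def admit(service: ExternalService, boundary: set) -> bool:
    """Admit a dynamic component into the system boundary only if it
    satisfies the organization's security constraints."""
    if service.authorized and service.assurance_level >= MIN_ASSURANCE:
        boundary.add(service.name)
        return True
    return False

boundary: set = set()
print(admit(ExternalService("payment-api", True, 3), boundary))   # True
print(admit(ExternalService("geocoder", False, 3), boundary))     # False
print(boundary)
```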

The other very important thing we have added, in Appendix F, is that we have extended the types of authorization approaches an organization can use. The traditional approach has a single authorizing official - or, if you are in the DOD, a Designated Approving Authority - making a single authorization decision for each system. We have added two new approaches.

One is called a joint authorization, where you can have multiple authorizing officials working together through all of the steps in the risk management framework, from defining requirements all the way through implementation, and then together making a collective, joint authorization decision. This could be a situation where, for example, several federal agencies are considering using an external service or an external service provider, and they want to be involved all the way through the process to make sure everything that is important to them as organizations - to support their missions - is reflected in the authorization process as they apply the risk management framework.
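A joint authorization, in other words, is a collective decision. The following minimal sketch assumes, for illustration, that every participating authorizing official must accept the residual risk; the per-agency acceptance functions and package structure are hypothetical.

```python
# A minimal sketch of a joint authorization: multiple authorizing
# officials evaluate the same package and the decision is collective.
# The unanimity rule here is an illustrative assumption.
def joint_authorization(officials, package) -> bool:
    # Every participating agency's authorizing official must accept
    # the residual risk for the shared service.
    return all(official(package) for official in officials)

# Hypothetical per-agency risk-acceptance functions.
gsa = lambda pkg: pkg["residual_risk"] <= 3
dhs = lambda pkg: pkg["residual_risk"] <= 2

package = {"system": "shared-cloud-service", "residual_risk": 2}
print(joint_authorization([gsa, dhs], package))  # True: both accept
```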

The second new type - now the third type of authorization - is called a leveraged authorization. This is probably going to apply in a pretty big way to some of the new paradigms out there, like cloud computing. A leveraged authorization would work something along these lines: a federal agency, for example the GSA, may go out and accredit or authorize a cloud provider's information systems, and then, after that authorization process is completed, there may be a string of other federal agencies that decide somewhere down the line that they also want to use that cloud provider's services. But instead of having to go back and have each of those agencies do a complete reauthorization for their own purposes, they can now use the documentation and evidence used as part of that first agency's authorization, and they can use that as the basis of their risk decision.

Sometimes that is all they need to go forward and make their own authorization decision - their own risk determination and acceptance. This has the potential to save the federal government literally millions of dollars, because every agency doesn't have to go forward and do the same process over and over and over again.
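As a rough sketch of that reuse, the leveraging agency below makes its own risk determination from the first agency's existing authorization package instead of repeating the whole process. The structures and thresholds are illustrative assumptions, not the publication's data model.

```python
# A minimal sketch of a leveraged authorization: a later agency reuses
# the documentation and assessment evidence from the first agency's
# authorization rather than performing a complete reauthorization.
from dataclasses import dataclass

@dataclass
class AuthorizationPackage:
    system: str
    authorizing_agency: str
    assessment_evidence: list   # reusable artifacts from the first authorization
    residual_risk: int

def leveraged_decision(package: AuthorizationPackage, risk_tolerance: int) -> bool:
    # The leveraging agency still makes its own risk determination and
    # acceptance, but based on the existing package.
    return package.residual_risk <= risk_tolerance

gsa_package = AuthorizationPackage(
    system="cloud-provider-x",
    authorizing_agency="GSA",
    assessment_evidence=["security plan", "assessment report", "POA&M"],
    residual_risk=2,
)
print(leveraged_decision(gsa_package, risk_tolerance=3))  # True: reuse, no redo
```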

CHABROW: Is it a change of culture among IT and IT security professionals in government, or the technology itself, or both, that has prompted these new kinds of authorizations?

ROSS: Well, there is definitely a change in culture. If you had come to us three decades ago - or even two decades ago, even a decade ago - and said, let's try to bring these three communities together (Defense, national security and non-national security), I think it would have been next to impossible.

What has brought us together is that each of the three communities is looking at the types of attacks and threats we are up against today, and we all agree that in order to be successful and defend ourselves to the best of our ability, we need to cooperate and bring together all of the expertise we have across the entire federal government - and across the private sector, where organizations choose to adopt our standards and guidelines. We are bringing everyone together to unify our forces and develop the most effective solutions we possibly can.

So yes, there has been a change in culture. There has also been evolving technology, which always happens. We are going into cloud computing now, and that is a new computing paradigm - a new technology area - that we are going to face head on with our current standards, guidelines and best practices. It is all about being able to adapt to new technologies, assessing risk to the greatest degree possible, and making sure that you go into your mission operations with your eyes wide open, knowing everything you can possibly know about where you stand with regard to security.

CHABROW: So what is next for you?

ROSS: Well, this is our second publication in a series of five that the DOD and the intelligence community have agreed to work on with NIST. We are moving on to 800-53A, the document that defines all of the assessment procedures so agencies can figure out whether their controls are actually working effectively. From there, we are going to finish our enterprise-wide risk management guideline, 800-39; that document will be out sometime in early summer. And then, of course, we are going to finish the series of five with the new 800-30, the risk assessment guideline.

While all of that is going on, there are two new publications I wanted to mention to you and your listeners that are not part of the five in the joint task force, but they are very important.

One is a systems and security engineering guideline. This gets back to the basic theme we are trying to emphasize this year of getting back to the fundamentals: better products, better systems, better security. This guideline will talk about practices for integrating commercial products into an information system, and using our best practices to do that integration effectively and as securely as we possibly can.

We are also going to be publishing, later this year, a guideline on application-level security. This one covers things like web applications - any type of application where we can build some of the security controls and best practices into the application itself. Applications are a very critical part of the information technology stack as you go from applications to middleware to operating systems down to hardware, and we have to worry about all of these layers in a defense-in-depth type of solution.
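As a simple illustration of that layering, the sketch below pairs each layer of the stack Ross names with a couple of example controls. The layer names come from the interview; the specific controls are illustrative, not drawn from the guideline.

```python
# A minimal sketch of the defense-in-depth layering Ross mentions:
# controls applied at every layer of the stack, so a gap at one layer
# is backstopped by the others.
STACK = {
    "application":      ["input validation", "session management"],
    "middleware":       ["message authentication", "least-privilege service accounts"],
    "operating system": ["patching", "host firewall"],
    "hardware":         ["trusted boot", "physical access control"],
}

for layer, controls in STACK.items():
    print(f"{layer}: {', '.join(controls)}")
```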



