Making Privacy Concrete (Three Words Not Usually Found Together)

By: Sean Brooks, Mike Garcia, Naomi Lefkovitz, Suzanne Lightman, Ellen Nadeau

Most in the IT space won’t know this, but NIST has one of the world’s best concrete engineering programs. Maybe we just have concrete on the mind since a couple of us in the office are doing house renovations, but with today’s publication of the NIST Internal Report 8062, An Introduction to Privacy Engineering and Risk Management in Federal Systems (NISTIR 8062), we are taking a page from the concrete folks’ book with a document that we believe hardens the way we treat privacy, moving us one step closer to making privacy more science than art. NISTIR 8062 introduces the concept of applying systems engineering practices to privacy and provides a new model for conducting privacy risk assessments on federal systems.

There were several reasons for venturing into this territory. Certainly the Office of Management and Budget’s July 2016 update to Circular A-130 gave us a strong impetus, but our ongoing trusted identities pilot program was also a significant earlier driver. The pilots need to demonstrate their alignment with the NSTIC Guiding Principles, but in the first couple of years of the program, grant recipients often had difficulty expressing to us how their solutions aligned with the Privacy Guiding Principle. Even agreeing about the kinds of privacy risks that were of greatest concern in federated identity solutions could drag out over multiple rounds of discussion.

NIST has produced a wealth of guidance on information security risk management (the foundation of which is NIST’s Risk Management Framework), but there is no comparable body of work for privacy. While there are international privacy framework standards that include the need for identifying privacy risk, there are no widely accepted models for doing the actual assessment.

We learned from stakeholders that part of the problem is the absence of a universal vocabulary for talking about the privacy outcomes that organizations want to see in their systems. In information security, organizations understand that they are trying to avoid losses of confidentiality, integrity and availability in their systems. The privacy field has the Fair Information Practice Principles, but as high-level principles they aren’t written in terms that system engineers can easily understand and apply. Oftentimes, privacy policy teams must make ad hoc translations to implement them in specific systems.

To try to bridge this communication gap and produce processes that are repeatable and could lead to measurable results, we began by considering how privacy and information security are related and how they are distinct. The Venn diagram below illustrates how information security operates in the space of unauthorized behavior within the system, whereas privacy deals with the processing of personally identifiable information (PII) that is permissible, or authorized. The two fields overlap around the security of PII.

[Figure: Security and Privacy Concerns Venn Diagram]

We also reflected on whether having privacy engineering objectives that had some functional equivalency to confidentiality, integrity, and availability could help bridge the gap between privacy principles and their implementation in systems. Here’s what we came up with.

[Figure: Privacy engineering objectives]

Lastly, we developed, and confirmed with stakeholders, a privacy risk model to use in conducting privacy risk assessments. We needed a frame of reference for analysis, a clear outcome, that organizations could understand and identify. In information security, the risk model is based on the likelihood that a threat could exploit a system vulnerability, and the impact if that occurs. But what is the adverse event when a system processes data about people in an authorized manner (meaning any life cycle action the system takes with data, from collection through disposal)? We know that people can experience a variety of problems as a result of data processing, from psychologically based problems such as embarrassment to more quantifiable problems such as identity theft. We think that if organizations could identify the likelihood that any given action the system takes with data could create a problem for individuals, and the impact if it did, they would have a clearer frame of reference for analyzing their systems and addressing any concerns they discovered.
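The risk model described above can be sketched in a few lines of code. This is a minimal illustration, not the PRAM worksheets themselves: the data actions, problems, and 1-10 scoring scales below are hypothetical examples we made up for the sketch, since NISTIR 8062 does not prescribe specific scales or scores.

```python
from dataclasses import dataclass

@dataclass
class DataAction:
    """One authorized life cycle action a system takes with data."""
    name: str        # the data action, from collection through disposal
    problem: str     # problem individuals could experience as a result
    likelihood: int  # 1 (unlikely) .. 10 (near certain) -- assumed scale
    impact: int      # 1 (minor) .. 10 (severe) -- assumed scale

    @property
    def risk(self) -> int:
        # The model in the post: likelihood that the data action
        # creates a problem for individuals, times its impact.
        return self.likelihood * self.impact

# Hypothetical data actions for a federated identity system.
actions = [
    DataAction("collect SSN at enrollment", "identity theft", 3, 9),
    DataAction("share email with partner site", "embarrassment", 6, 4),
    DataAction("retain logs past need", "loss of trust", 7, 5),
]

# Prioritize the highest-risk data actions for review, as the pilots did
# before selecting policy-based and technical controls.
for a in sorted(actions, key=lambda a: a.risk, reverse=True):
    print(f"{a.risk:3d}  {a.name} -> {a.problem}")
```

The point of the sketch is the frame of reference: each row is an authorized data action, not an attack, and the "adverse event" is a problem experienced by an individual.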

How did this work out for our pilots? Frankly, it exceeded our expectations. Using this privacy risk model, they could identify new privacy risks, prioritize the risks, communicate them to senior management, and implement controls as appropriate (usually some combination of policy-based and technical controls). Shoutout to the pilots—we greatly appreciate your insights!

NISTIR 8062 is only an introduction to privacy engineering and risk management concepts. In the coming months and years, we will continue our engagement with stakeholders to refine these ideas and develop guidance on how to apply them. One of the properties of concrete that makes it so useful is that you can mold it into just about any shape, but once it sets you know exactly what to expect of its performance. This sort of flexible but consistent performance has long eluded those who care about systems-implementable privacy protections.


7 Responses to Making Privacy Concrete (Three Words Not Usually Found Together)


    Way to concrete the foundation

  2. Ann Cavoukian says:

    I applaud this development relating to privacy engineering! I view it as a concrete extension of Privacy by Design, that not only complements PbD but provides solid measures that may be taken by engineers and systems designers. Great work! Ann Cavoukian

  3. Xavier Le Hericy says:

    Much welcomed addition. Hopefully this effort will adhere to the KISS principle and acknowledge that not all PII present the same risks. Privacy could in fact suffer if controls were too rigid and not commensurate to the risks.

  4. Richard Lopez says:

    The premise of this report and other policies emerging from OMB shows the emerging importance of Privacy and Security practitioners having a mutual responsibility to work together in designing, altering, or integrating systems containing PII. Now if we can get the local and state governments on board, these projects can have a positive psychological effect on every citizen.

  5. John Moehrke says:

    It is so good to see NIST bring Privacy out of the closet. I promoted the hints of Privacy in NIST 800-53, but always needed to enhance it with a Privacy Framework, Privacy Impact Assessment, and Privacy Risk Management. It is great to see NIST bring these things into the NIST privacy/security body of work as distinct, yet related. Well done.

  6. David Staggs says:

    Was the Privacy Risk Assessment Methodology (PRAM) and the PRAM forms (in the original appendix D) removed from the latest version of 8062 for a specific reason? Will there be a separate work product discussing PRAM?

    • TIG says:

      We removed the PRAM from the final version of NISTIR 8062 in an effort to streamline the document and clarify that it is an introductory document rather than guidance. However, we do plan to post publicly the PRAM documents in the near future. In the meantime, if you’d like to leverage these worksheets, please email us at and we’ll send them to you directly.
