
Employing Technology to Ensure Privacy

DARPA's 'Brandeis' Program Seeks to Protect PII
A DARPA PII protection initiative is named after Louis Brandeis.

Automating the process of excising personally identifiable information when sharing data is a challenge that the Defense Department hopes to overcome.


The Defense Advanced Research Projects Agency, known as DARPA, will consider proposals from the public for technology that would expedite how organizations safeguard PII while sharing the data with others. It's a technology problem that has vexed the information security and privacy world for years.

The goal of the initiative, known as Brandeis, is to "break the tension" between maintaining privacy and being able to tap into the huge value of data, DARPA Program Manager John Launchbury says. "Rather than having to balance these public goods, Brandeis aims to build a third option: Enabling safe and predictable sharing of data while reliably preserving privacy," he says.

DARPA envisions the project - named after Louis Brandeis, a privacy proponent as an associate justice of the Supreme Court nearly a century ago - as developing tools and techniques to enable the building of systems in which private data would be used only for its intended purpose and no other.

Democratic Need

Sharing information is a growing focus of government and industry as Congress considers legislation to encourage businesses to voluntarily share cyberthreat information with the government and each other. Those measures would require the stripping of PII before the data is exchanged. "Democracy and innovation depend on creativity and the open exchange of diverse ideas, but fear of a loss of privacy can stifle those processes," Launchbury says.

As he explains, the existing approaches to protect PII fall into two broad categories: (1) trusting the user of the data to furnish robust safeguards and (2) filtering the release of data at the source, such as by removing a person's Social Security number from a record.

Trusting the holder of the data to safeguard PII is problematic, as proven by the millions of Social Security numbers, credit card numbers and other PII exposed by a series of high-profile breaches in the past few years.

Filtering data also is tough because advanced algorithms can cross-correlate redacted data with public information to re-identify an individual. Research conducted by Harvard University's Latanya Sweeney, when she was on the Carnegie Mellon University faculty, shows that publicly available birthdates, ZIP codes and gender can identify nearly 90 percent of Americans.
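To see concretely how such cross-correlation works, the sketch below links a hypothetical "de-identified" data release against an equally hypothetical public record set on exactly those quasi-identifiers - birthdate, ZIP code and gender. The datasets, field names and matching logic are illustrative assumptions, not material from Sweeney's study or the Brandeis program.

```python
# Minimal sketch of a linkage ("re-identification") attack of the kind
# Sweeney's research describes. All records and field names here are
# hypothetical illustrations, not part of the Brandeis program.

# A "de-identified" release: names removed, quasi-identifiers kept.
deidentified_claims = [
    {"birthdate": "1972-03-14", "zip": "47906", "sex": "M", "diagnosis": "asthma"},
    {"birthdate": "1985-11-02", "zip": "47906", "sex": "F", "diagnosis": "diabetes"},
]

# A public dataset (think of a voter roll) that still carries names.
public_records = [
    {"name": "J. Doe",   "birthdate": "1972-03-14", "zip": "47906", "sex": "M"},
    {"name": "A. Smith", "birthdate": "1985-11-02", "zip": "47906", "sex": "F"},
]

QUASI_IDENTIFIERS = ("birthdate", "zip", "sex")

def key(record):
    """Project a record onto its quasi-identifiers."""
    return tuple(record[q] for q in QUASI_IDENTIFIERS)

# Index the public records by their quasi-identifier combination.
index = {}
for rec in public_records:
    index.setdefault(key(rec), []).append(rec["name"])

# Cross-correlate: any claim whose quasi-identifiers match exactly one
# public record is effectively re-identified.
for claim in deidentified_claims:
    matches = index.get(key(claim), [])
    if len(matches) == 1:
        print(f"Re-identified {matches[0]}: {claim['diagnosis']}")
```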

Ridding Windows of Exposure

To illustrate that point, Purdue University Computer Science Professor Gene Spafford cites an insurance company that shares information about a policyholder without revealing the customer's name. An individual who conducts a query of publicly available information about all males with a certain model of a car, living in a specific ZIP code and having a discernable driving pattern - and then combines those results with the sanitized information from the insurer - might be able to identify the policyholder's name.

In addition, Spafford says, PII that was previously protected might unintentionally become exposed when data is recoded. And encrypting a database, he says, could make it difficult to analyze the non-PII information, linked to the private data, that organizations seek to evaluate. "The objective really is to find a way to transform or mask the data so it's still useable but eliminate those windows of potential exposure," Spafford says.
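One common way to "transform or mask" data along the lines Spafford describes is to generalize quasi-identifiers until each combination is shared by many records - the intuition behind k-anonymity. The short sketch below, reusing the hypothetical records from the earlier example, shows the idea; the specific generalization rules are assumptions for illustration, not techniques DARPA has endorsed for Brandeis.

```python
# Minimal sketch of generalizing quasi-identifiers so records are harder
# to link back to individuals (the intuition behind k-anonymity).
# The rules below - truncating ZIP codes and keeping only the birth year -
# are illustrative assumptions, not Brandeis requirements.

def generalize(record):
    masked = dict(record)
    masked["zip"] = record["zip"][:3] + "**"        # "47906" -> "479**"
    masked["birthdate"] = record["birthdate"][:4]   # keep birth year only
    return masked

claims = [
    {"birthdate": "1972-03-14", "zip": "47906", "sex": "M", "diagnosis": "asthma"},
    {"birthdate": "1985-11-02", "zip": "47906", "sex": "F", "diagnosis": "diabetes"},
]

for row in (generalize(c) for c in claims):
    print(row)

# The coarsened fields still support aggregate analysis (e.g., diagnoses
# by region and birth year), but the exact birthdate/ZIP/gender
# combination that enables linkage attacks is no longer released.
```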

That's what DARPA hopes to do by funding projects it'll select later this year. A DARPA spokesman says four organizations will be selected and that Congress has budgeted $60 million for Brandeis; he declines to identify the organizations. DARPA expects the process of developing new tools under Brandeis to take at least 4½ years, split into three 18-month phases. Each phase will result in the demonstration of experimental systems that show privacy technologies at work.

Still, new technologies themselves would provide only part of the solution, says Julia Horwitz, coordinator of the open government program at the Electronic Privacy Information Center, a public interest research group. "We need stronger laws that require companies to limit the amount of data they collect in the first place and then encrypt and strip PII once they collect it," she says. "In order to help that process along, we need to make sure that technology steps up."


About the Author

Eric Chabrow


Retired Executive Editor, GovInfoSecurity

Chabrow, who retired at the end of 2017, hosted and produced the semi-weekly podcast ISMG Security Report and oversaw ISMG's GovInfoSecurity and InfoRiskToday. He's a veteran multimedia journalist who has covered information technology, government and business.



