Vault 7: The Concern Over Vulnerability Disclosure

On March 7, 2017, the whistleblower website WikiLeaks began releasing a series of classified Central Intelligence Agency (CIA) documents, now called the "Vault 7" leaks.[1] Some of these documents shed light on the CIA's surveillance capabilities, including the ability to hack into electronic devices like smartphones and televisions. For many, this brought back memories of Edward Snowden's release of National Security Agency (NSA) documents, which exposed the NSA's mass collection of metadata from phone conversations around the country.[2] The released CIA documents may raise legal questions over how far the intelligence agencies' spying powers reach, but unlike the NSA leak, the CIA leak does not yet contain any sign that the CIA hacked to spy on anyone domestically, or to an extent where Fourth Amendment concerns might come into play.[3] Instead, the bigger legal and policy concern the leak dredges up is the intelligence agencies' responsibility to reveal software or hardware vulnerabilities that let hackers in, be they the government or private persons. The government deals with these issues through a vulnerability review process that determines when vulnerabilities should be disclosed. However, it is currently unclear how effective this process is or how often intelligence agencies use it. In this post, I hope to explain some of its deficiencies and analyze the benefits and detriments of potential amendments to the current system.

Intelligence agencies in the United States often find and exploit vulnerabilities, in theory to prevent crimes like terrorism. For example, one notable discovery in the Vault 7 leaks is that Samsung televisions contain a vulnerability that would allow the CIA to use the TV's camera to spy on potential criminals.[4] This is a powerful spying tool to be sure, but it could also become a powerful spying tool against the United States and its citizens if others find the vulnerability as well. It might therefore be in the U.S. government's interest to disclose the vulnerability to Samsung so that the company can fix it and protect its consumers. Thus, when intelligence officers find a vulnerability, they must balance the value it adds to performing their duties against the damage it could cause to the product's consumers.

The government's current answer to this problem is the Vulnerabilities Equities Process ("VEP").[5] Under the VEP, a council decides when a vulnerability should be disclosed, based on criteria set by the VEP's creators.[6] The idea was introduced in 2008 and implemented in 2010. Until at least 2014, however, the policy was not fully implemented: many vulnerabilities, reportedly including the famous Heartbleed bug, were not disclosed, and the White House essentially admitted the process was not being used.[7] In 2014, in response to the Heartbleed bug and the Snowden leaks, the White House "reinvigorated" the VEP by defaulting toward disclosure of vulnerabilities unless national security is at risk.[8] The Vault 7 leaks have revealed many undisclosed vulnerabilities, which puts the VEP back into the spotlight and may highlight a need for reform.[9]

The leak sheds further light on some of the VEP's potential flaws. First, there is very little public knowledge of the VEP; it was not until 2014 that the first significant details were divulged, in a blog post by White House Cybersecurity Coordinator Michael Daniel.[10] Further details were released only a year or so later, after lengthy Freedom of Information Act litigation.[11] Second, the VEP is not enforced through law or an executive order, so it is not clear how often vulnerabilities are submitted to the process.[12] For example, the San Bernardino iPhone hack by an FBI-hired company relied on a vulnerability, but the FBI did not put it through the VEP, claiming it lacked technical details about the vulnerability.[13] Sometimes intelligence officers will purchase vulnerabilities and agree, as part of the deal, not to disclose them. Third, it is not clear that VEP decisions are revisited.[14] A vulnerability might at one point be more useful than dangerous, but that balance can change later, putting consumers at risk. Fourth, the VEP might take too long.[15] Many exploits are already fixed by the time they are disclosed, implying the process is delayed. Lastly, the VEP's Executive Secretary appears to be positioned within the NSA.[16] The NSA seems to have some bias toward non-disclosure, or at least creates the appearance of bias to the public, which might lead to dangerous decisions or public distrust.[17]

The intelligence community needs some freedom to ensure the security of the nation, but it also has limited ability to evaluate when something should be disclosed, so some type of system is needed to ensure the right decisions are made. Unfortunately, from what we can gather about the VEP, it needs improvements to mitigate the issues detailed above.

To deal with each of these issues, Ari Schwartz and Rob Knake of Harvard's Belfer Center have proposed the most comprehensive reform of the VEP.[18] They suggest implementing the VEP through an executive order with congressional oversight, with public criteria for determining disclosure and annual public reports. This would make the process more likely to be used by the intelligence community and more transparent to the public. They also suggest that the process be clearly defined and periodically reviewed, so that it is quicker and more equitable. Further, the VEP Executive Secretary should be moved to the Department of Homeland Security (DHS), and public agencies should be barred from entering non-disclosure agreements. The former suggestion should help counter the NSA's potential bias; the DHS should not share that bias because it is a separate organization and "has developed a strong capability in vulnerability research and software assurance."[19] The latter would keep contracts from standing in the way of beneficial disclosure.

Of course, implementing some of these ideas means sacrificing some of the secrecy the intelligence community desires. For example, congressional oversight and a new Executive Secretary mean more people outside the agencies knowing the details of vulnerabilities, which increases the risk of leaks. However, congressional oversight would force agencies to follow the VEP more consistently than if the entire process stayed within the intelligence community. The non-disclosure ban could also carry security risks, as some companies selling vulnerabilities might not offer exclusivity. Thus, while these additions would make the VEP more effective in theory, they might undermine the security interests the VEP was meant to balance.

These secrecy issues apply only to some of the proposed solutions, though, and the other proposed amendments might avoid them. The rules about publicizing the process make sense as long as they do not expose the vulnerabilities themselves, and explaining the process could help other countries adopt similar ones. Making the process clearer and including periodic review also seem like easy ways to make it more efficient without sacrificing security. Lastly, even the non-disclosure ban could be limited to cases where exclusivity can be bought: if the source selling the vulnerability is willing to sell it exclusively to the agencies for a premium, the agencies would have to disclose it, but if the source refuses, they would not. This would be costlier but would avoid the security concern.

These amendments will not solve every issue related to vulnerability disclosure. For example, a non-disclosure ban would push more vulnerabilities through the process, but vulnerabilities like the one used in the San Bernardino hack would still bypass the VEP because the hack was carried out by an outside company without revealing the vulnerability to the FBI.[20] Still, these seem like strong steps toward mitigating the VEP's current shortcomings. Thus, while the Vault 7 leak, like the VEP itself, is still shrouded in unknowns, it helps highlight some of the policy concerns around vulnerability disclosure and reminds us that reform is needed before we can be confident the intelligence community is acting effectively for the safety of America.

[1] https://www.nytimes.com/2017/03/07/world/europe/wikileaks-cia-hacking.html

[2] Id.

[3] Id.; U.S. Const. amend. IV; http://www.cnn.com/2015/05/07/politics/nsa-telephone-metadata-illegal-court/

[4] https://www.nytimes.com/2017/03/07/world/europe/wikileaks-cia-hacking.html

[5] https://www.eff.org/files/2016/01/18/37-3_vep_2016.pdf

[6] Id.

[7] http://www.belfercenter.org/sites/default/files/legacy/files/vulnerability-disclosure-web-final3.pdf

[8] Id.

[9] https://www.theguardian.com/media/2017/mar/07/wikileaks-publishes-biggest-ever-leak-of-secret-cia-documents-hacking-surveillance

[10] https://obamawhitehouse.archives.gov/blog/2014/04/28/heartbleed-understanding-when-we-disclose-cyber-vulnerabilities

[11] https://www.eff.org/files/2016/01/18/37-3_vep_2016.pdf

[12] http://www.belfercenter.org/sites/default/files/legacy/files/vulnerability-disclosure-web-final3.pdf

[13] http://cyberlaw.stanford.edu/events/government-hacking-vulnerabilities-equities-process

[14] Id.

[15] https://epic.org/privacy/cybersecurity/vep/

[16] http://www.belfercenter.org/sites/default/files/legacy/files/vulnerability-disclosure-web-final3.pdf

[17] Id. at 15.

[18] Id.

[19] Id.

[20] http://cyberlaw.stanford.edu/events/government-hacking-vulnerabilities-equities-process
