So the Heartbleed bug in OpenSSL has caused a big ruckus this week. OpenSSL is one of the most widely used pieces of encryption software on the planet—and rightly so. This means that most of us—billions of users of some of the most highly trafficked and trusted retail, search, and web services—may have had our passwords and other sensitive information compromised over the last couple of years without any indication that it was happening.
So how did this vulnerability go undetected for the past year and a half by the legions of volunteer experts who have access to the code? Isn't open-source software meant to be more secure because it has such unlimited availability for review by the best of the best?
In a word, yes.
Contributors to open-source encryption software are among the most brilliant minds working today. Although they are volunteers, they are highly motivated, and their rigor and commitment are exceptional.
And the great part about Heartbleed—if anything about it is great—is that the open-source process worked.
The Highly Motivated Minds behind Open-Source Encryption
What's interesting to me is the thoughtful, well-planned disclosure of the vulnerability, which should serve as a template for the open-source software community as a whole. As Chester Wisniewski said on CNN Opinion, "The bug itself is a simple, honest mistake in the computer code that was intended to reduce the computing resources encryption consumes." Researchers at Google and Codenomicon proactively found the bug and held back from disclosing it until a solution had been found and a patch was readily available, thus limiting the time that black hats would have to take advantage of what is now seen as a gaping back door in the code.
But let's think about the other half of the equation. If black hats have taken advantage of this back door—and I wouldn't bet in Vegas that they haven't—there would have been no signature to detect or blacklist, because the resulting activity would appear legitimate to security detection systems. The only sign that something was amiss would have come in the later stages of a concerted attack, when anomalous processes and activity were running on affected endpoints. The way to spot these anomalies is with analytics that establish baselines of normal behavior and give early indicators of unauthorized or unusual activity. EnCase Analytics can help with this.
What do you think about the Heartbleed bug and open-source encryption? I welcome discussion in the Comments section below. And click here for a white paper on "Six Essential Steps in Managing Cyber Breaches."