Heartbeat request caused a “heartbleed”

[Image: Heartbleed logo]

What is Heartbleed?

[Image: Heartbleed cartoon]

The Heartbleed Bug website defines this bug as "[A] serious vulnerability in the popular OpenSSL cryptographic software library. This weakness allows stealing the information protected, under normal conditions, by the SSL/TLS encryption used to secure the Internet." Many prominent security experts have called it one of the biggest security issues the internet has seen to date. It essentially allows anyone on the internet to read a chunk of memory that OpenSSL uses to keep your data protected. This means your usernames, passwords, content, and, even worse, the keys used to encrypt all of this information can be targets of this attack. If attackers get those keys, they can then read anything that OpenSSL tries to "hide". To make matters worse, OpenSSL is one of the most widely used encryption tools on the internet.
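
To make the "read a chunk of memory" part concrete, here is a minimal C sketch of the kind of over-read involved. This is not the actual OpenSSL source; the struct and function names are hypothetical and the layout is simplified. The key point is that a heartbeat request carries a length field claimed by the sender, and the vulnerable code copied that many bytes into its reply without checking the claim against what was actually received.

```c
/*
 * Simplified sketch (NOT the real OpenSSL code) of the Heartbleed over-read.
 * A heartbeat record is assumed to contain: 1 type byte, a 2-byte claimed
 * payload length, and then the payload itself.
 */
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical representation of a received heartbeat record. */
struct hb_record {
    const uint8_t *data;   /* type (1) + claimed length (2) + payload */
    size_t         length; /* number of bytes actually received */
};

uint8_t *build_heartbeat_reply(const struct hb_record *rec, size_t *out_len)
{
    const uint8_t *p = rec->data;
    uint8_t type = *p++;                                   /* heartbeat type */
    uint16_t claimed = (uint16_t)((p[0] << 8) | p[1]);     /* attacker-controlled */
    p += 2;
    (void)type;

    /* Vulnerable behavior: no check that `claimed` fits inside rec->length,
     * so the memcpy below can read far past the real payload into whatever
     * happens to sit in adjacent memory, and send it back to the requester. */
    uint8_t *reply = malloc(3 + (size_t)claimed);
    if (!reply)
        return NULL;
    reply[0] = 2;                          /* heartbeat response type */
    reply[1] = (uint8_t)(claimed >> 8);
    reply[2] = (uint8_t)(claimed & 0xff);
    memcpy(reply + 3, p, claimed);         /* <-- the "heartbleed" over-read */

    *out_len = 3 + (size_t)claimed;
    return reply;
}
```

An attacker can claim a payload length of up to 64 KB while sending only a few real payload bytes, and the reply then leaks up to that much of the server's memory per request.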

All of this may sound like the kind of thing people only find out about when a hacker breaks into a big server. However, this flaw had been around since 2012, and nobody knew about it until about two weeks ago, when it was independently found by Neel Mehta, a security engineer at Google, and by a group of security engineers at Codenomicon.

What did they do?

As far as I know, they reported it to NCSC-FI and the OpenSSL team and publicized it. This pushed the big server operators such as Facebook, Yahoo, and Microsoft to fix the issue, because now everybody knew about it. Five days after the discovery of the bug, this list was released, showing the top 1000 sites and whether they were vulnerable. 48 of these websites were still vulnerable at that point in time, and among them were some big names such as Yahoo!, Stack Overflow, and Flickr.

Ethical issues:

The main question we can ask here is: who is to blame? One answer is that the people developing OpenSSL are to blame. PCMAG writes about Robin Seggelmann, the programmer who uploaded the code with the heartbeat request feature on December 31, 2011. Seggelmann says, "I am responsible for the error, because I wrote the code and missed the necessary validation by an oversight. Unfortunately, this mistake also slipped through the review process and therefore made its way into the released version."
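
For context, the "necessary validation" he refers to amounts to a bounds check along the lines of the one sketched below, reusing the hypothetical struct hb_record from the earlier snippet; the real patch in OpenSSL is structured differently, so treat this only as an illustration of the idea.

```c
/* Sketch of the missing bounds check (assumed names from the snippet above).
 * RFC 6520 requires that a heartbeat message whose claimed payload length
 * exceeds what was actually received be silently discarded, not answered. */
static int heartbeat_request_is_valid(const struct hb_record *rec,
                                      uint16_t claimed)
{
    /* Need at least 1 type byte + 2 length bytes, and the claimed payload
     * must fit inside the bytes that really arrived on the wire. */
    if (rec->length < 3)
        return 0;
    return (size_t)claimed <= rec->length - 3;
}
```

With a check like this in place, a malformed request is dropped instead of triggering the over-read shown earlier.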

Another question is: who was taking advantage of this bug during the roughly two years it was out there?
As Bruce Schneier puts it in his blog post, "[a]t this point of time, the probability is close to one that every target has had its private keys extracted by multiple intelligence agencies." Supporting Schneier, the Electronic Frontier Foundation (EFF) recounts two stories in this article about evidence suggesting that an intelligence agency could have been taking advantage of this bug all along.

I think that if this is true, it is completely unethical. It is like a company finding a way to access its employees' data and, instead of fixing the problem, taking advantage of its own employees. What do you think?

3 Responses to Heartbeat request caused a “heartbleed”

  1. It seems as though there are multiple parties responsible for this bug making it out into the wild and living for so long. Most obvious is the developer, Robin Seggelmann, who wrote the code containing the Heartbleed bug. I think most of us can agree that one of the primary ethical responsibilities of any developer is to put out error-free code that does not endanger its users. The person who did his code review is also to blame, as it was specifically his job to catch exactly this sort of thing. They both bear a pretty big chunk of responsibility, but I think we can take them both at their word that the bug was not maliciously inserted and was instead an honest mistake. While I don't feel this lets them off the hook, I think it counts for something. Rule and act utilitarianism would both say that they were morally wrong, as they have caused a pretty good chunk of harm, and developing code without fatal, world-altering bugs seems like a fairly safe rule for the public good.
    What I find far more concerning is the rampant speculation that the NSA knew about Heartbleed and chose to exploit it rather than report it. I think that if the NSA did know about it, then they are ultimately responsible and also morally wrong. Act utilitarianism would say that by keeping Heartbleed a secret, the NSA did a huge amount of damage to users by putting their data at risk. I think one of the most interesting things to come out of this, though, is the statement from the White House about when the NSA must disclose zero-day bugs.

  2. youstolemycouch.

    This is a very interesting issue. I hope that the accusations against the NSA are untrue, and I will wait until more time has passed and facts become available before I criticize them. The point in the Bloomberg article (that Rake posted) about open source security, development, and testing is very interesting. Most open source software, like OpenSSL, is developed and tested without a very large budget by a very small group of programmers, whose credentials may not even be known. I imagine it becomes harder to test as the development group gets smaller, because the number of ideas dwindles. While the blame certainly rests on the shoulders of the developers and reviewers, and from what I understand the Heartbleed bug is a pretty glaring flaw that should have been found early in development, I think the developers had the deck stacked against them from the start.
    I think it will be interesting to see how this affects the open source movement as a whole. Projects like Mozilla, GNU/Linux, and Apache have huge market share, and obviously OpenSSL is very popular. Will people stop using these projects in favor of licensed software that has a bigger development and testing budget? Will people start to give more money to the open source projects they use, so that the open source community can spend more time on development and testing?

  3. This issue has stirred much debate over open-source software as a whole, and I would like to offer my opinion on it. To start, I believe the developers and testers are the people responsible for this security flaw. They developed and tested the code and should have found the bug. But, as a computer science/programming student myself, I know how hard it can be to find subtle bugs such as this. Even with the most rigorous test cases and plans, bugs can still slip through the cracks, regardless of whether the software was developed as open source or proprietary. While this bug has definitely had more of an impact than others have in the past, the following should not be forgotten: programmers and software engineers are human just like everyone else, and they make mistakes. The most important thing to take from a mistake such as this is to learn from it and move on.

    The people who exploited this bug definitely acted unethically. The only ethical action someone who knew about this bug could have taken would have been to inform OpenSSL of the vulnerability so they could quickly and quietly fix it. Therefore, I believe anyone who knew of this bug before it was publicly disclosed and did not inform OpenSSL acted unethically.