Back in 2009 the European Commission raised a controversial proposal to grant consumer rights to software users, much like those applying to cars or other “physical” goods. With the recent Heartbleed failure of OpenSSL, it’s likely that these populist proposals will appear again. Why wasn’t that a good idea? There’s a well-known analogy, which basically goes that if cars were like computers they would cost $100 but crash for no reason twice a day. Seemingly annoyed by that, the commissioners proposed “more accountability for software makers”, which basically meant that software licenses would be required to provide consumers with “the right to get a product that works”.
Why is this intuitive approach not a very good idea? In the first place, software is not an autonomous product like a toaster. Software depends on the underlying hardware, the operating system and hundreds of third-party libraries, which together create a very complex system of mutual dependencies. Some programs (e.g. games) won’t work with some hardware (e.g. GPUs), versions of operating systems (e.g. the latest Windows) or libraries (e.g. .NET Framework or GPU drivers).
It’s not that we don’t know how to develop secure and reliable software. We do, and we have a number of tools for programming mission-critical systems, but the biggest problem is that delivering highly reliable software is an extremely expensive and time-consuming process. Which is exactly why it’s used to deliver mission-critical systems and not consumer software.
To give you an idea, here are lists of software and hardware products that obtained Common Criteria and FIPS 140-2 certification. Each list contains a few hundred products, out of the millions present on the market. I don’t know about FIPS, but a few years ago Common Criteria certification at the lowest level, EAL1, required around 9 months of examination and cost around €50k. The next level was around 18 months and €150k.
It’s actually quite likely that the version of OpenSSL with the Heartbleed bug was FIPS certified, but there’s no contradiction there. Each certification is issued for a very specific version and environment in which the product was tested (look at this FIPS cert for OpenSSL to get an idea), it usually has a very narrow, precisely defined scope (see here for what is actually certified in the OpenSSL certification), and it in no way guarantees the “overall” correct operation of the product.
So what is the current state of software quality? Most people, even if they don’t realise it, simply prefer buggy software that costs $100 over bug-free software that costs $100k per license. Tolerance of bugs is the price the software market pays for being as innovative and competitive as it is now.
What could be improved?
While I’m against mandatory software security guarantees, there’s a whole continuum between “mandatory” and the current situation, especially in regard to open-source projects that have suddenly found themselves part of critical infrastructure.
OpenSSL has come a long way since it was started in the ’90s, from an amateur open-source project to an industry-grade security library that even gets security certified. At the same time, its code has always suffered from being rather messy and poorly documented, which strongly correlates with security bugs.
But before throwing stones at the OpenSSL team, we should remember that it’s a relatively small team of volunteers, who occasionally get commercial consulting contracts to implement specific features. At the same time, a large number of vendors have been using the project as the cryptographic base for their commercial products, allowing them to cut costs significantly, without contributing much back to the project.
It looks like this practice has backfired with Heartbleed. What OpenSSL needs now is a significant increase in both funding and engagement from commercial vendors, in a way that would allow the project to deploy a decent secure software development program, including static scans, code reviews, fuzzing, change management etc.
And you can help too, as I did!