Last Tuesday Chris Strohm and Jordan Robertson posted an article on Bloomberg Government (BGOV) (now available on Bloomberg.com) about open source, suggesting that recent hacks on web services have “shaken the confidence in the free software movement.”  Really?  Whose confidence, precisely?

To be sure, BGOV is a competent and comprehensive web-based information service for professionals who interact with or are impacted by the federal government.  If you are such a professional and aren’t a subscriber, you should consider becoming one.  While our work is directed locally to states and counties and their elections jurisdictions, content on the BGOV web site is often relied upon by a wide range of government officials.  And that’s why we felt compelled to speak up about it.

Notwithstanding the article’s final two paragraphs, here are two points we want to make about this:

  1. It’s just not sensible to suggest “open source software isn’t secure;” and
  2. The article appears to miss the real point altogether.

We’ll take those in order.

1. Licensing and Software Quality are Nearly Orthogonal
Open source is a way of licensing, and a model for development.  The licensing terms for software have nothing to do with its security.  Licensing terms only control what licensees can do with software, not the care with which the code is written.

When people talk about open source and security, they are talking about the development model.  If you believe that two heads are better than one, or that “given enough eyeballs, all bugs are shallow,” then open source software, developed by a community, is probably more secure than software developed by a small group.

But in reality, some open source software—and some proprietary software—is developed by small groups.  That means there can be single points of failure, which is probably what happened in the case of OpenSSL and the Heartbleed bug.  The open source community responded by creating a fund to channel resources to key open source projects and shore up their development.  One thing is certain: such a community effort could not happen for proprietary software, much of which is also developed by small companies and small groups.

Some open source software is developed by hundreds of contributors.  That can have its own challenges, but the key committers on big open source projects are some of the most respected engineers in the business.

Any popular software eventually will be a target for hackers.  But at least with open source software, you can fix it yourself.

From a different angle, the article seems to miss the point altogether, as our CTO observes:

2. All Software is Made Similarly—Adoption Sees No Color
It's true that people working outside the for-profit proprietary software industry created a variety of infrastructure software.  It's also true that a surprisingly large portion of this software has turned out to be critical.  But the point is that academic and research folks, who created this software because they needed it for their own research, were the first to recognize the need for these very basic components. Then, as Internet technology was adopted more broadly, these components became so useful that commercial organizations incorporated the same software—and why not? It was free!  This, of course, further catalyzed the spread, adoption, adaptation, and incorporation of the software that weaves the fabric of the Internet as we know it today.  It all started with a simple TCP/IP stack, complemented by HTTP servers and browsers.

Of course, in a few cases, the commercial market re-engineered implementations of the same concepts. So, today we users of commercial systems depend on lots of critical software, which like all software, has bugs.

But complaining about the historical fact that non-commercial people originally developed this stuff is just as silly as making the licensing model part of the blame game.

If you use a commercial product of any kind and it has flaws in a part the vendor included, you complain to the vendor. It was the vendor's supply chain, their choice, and their failure to discover the flaws in the component. It doesn't matter whether the component was free, or what type of license was involved.

And it should also be remembered that this will not change: wherever commercial players in technology markets fail to deliver something that's needed—whether it is everybody's need or the need of a niche—there is an opportunity for the non-commercial technology ecosystem to deliver it, and in many cases it will do so with excellence, unencumbered by a profit mandate.

We close by simply observing that open source software carries just as much risk as closed source, but no more than that.  The "rewards," if you will, that help offset the risk of vulnerabilities or bugs in open source include:

  1. a broader audience and user community which can help identify issues faster;
  2. a licensing scheme that promotes a perpetual harvest of enhancements and modifications (including bug repairs); and
  3. as a means for broad adoption, adaptation, and fostering of innovation, open source is a strategy employed by some of the largest of the commercial ilk.  Example? In a word: Android.
