A few months back, I picked up an issue of Scientific American. Nothing too unusual there, but I was quite surprised by one of the articles I found inside. In “Dependable Software by Design”, Daniel Jackson put the high-integrity software industry in the public eye.

The article was well-written, informative, and far from sensationalist; but something about it was very different. I’ve seen the occasional newspaper article on the topic, but this was the first time I’d seen it covered in a large-circulation national magazine with a target demographic that goes far beyond IT professionals. According to Scientific American’s own demographic business profile, only 24% of its readership is employed in technology or engineering. That leaves a whopping 76% who were probably learning about some of these issues for the first time — far more if you consider that only a very small fraction of the 24% deal with high-integrity software in any capacity.

It seems to me that this article may illustrate a growing consciousness and concern among the broader public about the need for “Dependable Software”.

Since computers first came into use in business, we have talked and complained about them (and implicitly, their software) as the source of errors and delays. It has been a running joke for the public. It’s so common that we hardly even notice when the banker says “the computers are really slow today”, or the automotive tech tells us about a computer glitch that “messed up his records”. Within the past few months, I spoke with a technical headhunter who was apologizing for a little computer magic within her firm… somehow my email address had become attached to more than 20 other contacts in her database. We hear the occasional story of an erroneous $20,000 residential utility bill, and we all laugh a bit harder. All’s well that ends well. Most people in the U.S. probably know someone who has been affected in some memorable way by a computer error. Personally, I still have a receipt (though the ink has faded so that it is no longer legible) that showed my bank account balance at $999,999.99. It was issued by an ATM following a withdrawal. This happened circa 1988, when I was a young NCO in the U.S. Air Force. Trust me, it was an error.

Once upon a time, we could be fairly secure that the software that impacted our health and personal well-being was backed by trustworthy authorities.  We knew who to blame when something went wrong.

Today software is all around us. The internet is the best example. We use PayPal or a credit card, and our money passes through an untold number of servers on its way to a destination. The little padlock encourages us to trust the security, yet most people don’t know enough about internet security to question its strength. Even if we pay in person or by telephone, the merchant will, in all likelihood, retransmit the information via the internet. The pressure increases daily to use the internet to provide personal information for business, finance, and health, though poor internet security may be the largest single enabler of identity theft.

Today software keeps us safe. It assists us in operating our vehicles. It provides medical diagnosis and treatment. It controls the traffic lights, elevators, and other infrastructure of our daily environments. Can we trust in this safety? The public is beginning to wonder.

I believe the Scientific American article is among the first of many such articles that will appear. Each will broaden the demographic until the topic is brought to everyone via an episode of 20/20 or Dateline.

One likely outcome of all of this media attention will be some form of certification or licensing requirement for any software professional involved with high-integrity applications. This is not a new idea, but to date, it’s been promoted primarily by professionals within the industry. If the public should express concern, the eventual licensing of software professionals seems inevitable.

Another possible outcome of all of this media attention may be a classification system requiring all software affecting the public to be evaluated for security, safety, and reliability attributes. Now, this is an intriguing idea… Maybe MTBF (mean time between failures) data should be included as well. I’d love to see statistical measures of how frequently I should expect to see the infamous Blue Screen of Death.

About Max H:
Max is a father, a husband, and a man of many interests. He is also a consulting software architect with over three decades of experience in the design and implementation of complex software. View his LinkedIn profile at http://www.linkedin.com/pro/swarchitect