
The #1 Problem in Cybersecurity: The Truth You Don’t Want to Know

"The truth you don't want to know" is my new favorite saying. It seems to be rampant in the halls of success, but perhaps nowhere more so than in the IT arena when looking at IT security. As I cross a broad swath of commercial and government enterprises in the cyber defense domain, it is a continuing theme, one that was crystallized for me by reading a somewhat old book, Truth, Lies, and O-Rings: Inside the Space Shuttle Challenger Disaster.

Now for mature (cough, older) gentlemen such as myself, this is an event I recall well. Like the generation before me who could say where they were when they heard Kennedy was shot, or the younger generation that recalls 9/11 marring their childhood, I know exactly where I was when I heard about the Challenger disaster: chopping up to 5-4 in venerable Bancroft Hall as a Plebe at the United States Naval Academy. For anyone not familiar with the event, this was the first space shuttle disaster, when the shuttle's external fuel tank exploded during launch, resulting in the eventual death of everyone on board.

Of course, the call for an investigation was immediate, with NASA conducting its own failure investigation and a presidential commission likewise looking into how this could have happened. In brief, the O-rings on the solid rocket boosters (the same kind of sealing rings you would find in a waterproof flashlight) failed, letting a jet of flame impinge on the external fuel tank until it burned through and introduced fire to thousands of pounds of rocket fuel, with predictable results.

The fascinating part was that there had been evidence for years that these O-rings were susceptible to failure at low launch temperatures (that was the coldest launch in space shuttle program history by a good 25 degrees). The engineers who built the rocket boosters were, to say the least, concerned before the launch. They called a special telecon with the NASA program manager (PM) the night before. In that telecon, the NASA manager essentially said "prove to me this will fail" instead of what he should have been asking: "prove to me this will work."

Aborting a launch was bad. The PM would look bad, and his superiors had made it clear that "their section" did not hold up launches. So the O-ring problem was the truth he did not want to know, and he pushed back on the engineers for "proof" until he got the answer he wanted: "It is OK to launch, boss. Our contract is up for renewal, and really, we are sorry we bothered you." Indeed, it was the truth NASA had resisted knowing for some time. The seven astronauts, including a high school teacher turned first-time space traveler, all died.

Then, of course, came the "here is why it is not my fault" part of the exercise, and indeed it seems NASA was still working not to know that cold temperatures and the failure of the O-rings were the causes of the accident. The book does an admirable job laying all of this out from one astonishing page to the next.

What really struck me, though, was the thought: "This is the problem we have in cyber." Same issue. The truth we do not want to know. "We care a lot about security and want to make sure we are secure, blah blah blah." Oh, but not that truth. I don't really want to know that the things we spent a lot of money on are not wired together well. We have an expensive Intrusion Detection System (IDS). Yay us. We must be good, right? Well, of course, the alerts go to a log file that nobody looks at, but we can tell the auditors we have one.

This afternoon I was reading an older Verizon Data Breach Digest, a publication in which Verizon lays out common scenarios from the past year's worth of breaches it has investigated, to provide lessons learned to the broader community. Buried in the middle of paragraph 16 of scenario #1, I found: "Many of the core pieces of security existed – anti-virus deployments, intrusion detection sensors, and NetFlow capture were all available, but mostly unused." Yep. That completely matches my experience. And that was just scenario #1. I can hardly wait to get through the other 17.

As we look to move forward as a security industry, this is really our top problem. We have to be willing to know the real truth. Only then can we take smart, effective steps to improve security.
