Jurassic Plaque: The U-Curve of Security
By CIOinsight
"Limits are what we fear."
– R. Buckminster Fuller
The organizations that enforce the strictest corporate security are often the ones that are least secure. With the exception of organizations whose very mission is security (say, the Coast Guard, intelligence agencies, banks, or firms that rent out uniformed security personnel), the more resources an outfit throws at security, the less likely it is, generally, to be getting any bottom-line value for them.
Unless you have an unlimited budget, every dollar spent on securing assets is a dollar subtracted from something productive. At best, dollars spent on successful security are resources you could have spent on R&D, marketing, customer service or dividends, and that are now lost to you forever.
Sometimes the complexity of an initiative that requires extremely secure systems makes it nearly impossible to succeed, even for a capable organization with unlimited resources and a skilled systems integrator. A recent example that serves as a perfect warning is the $104+ million bloodbath the FBI suffered in its Trilogy project: a perfectly straightforward case-file management and sharing system so larded with the need for absolute security that essential parts of it will never be deployed.
If the FBI, with a mission that everyone recognizes as vital and with resources supplied to match, can't get to the finish line, I suggest it's not going to be any easier for anyone with a less vital mission.
Subtraction by Addition
This doesn't mean you should try to lead your group into a zero-security zone. It just means that as you add technology, processes and procedures, predators' ability to exploit your structures follows a U-curve, an upside-down bell curve. When you're starting from zero, the first steps you take decrease vulnerability the most. Incrementally adding more stringent procedures, updating porous system software and buying new locks to add to existing locks will keep improving protection, but only up to the point where people and systems can still cope with the complexity.
You can asymptotically approach total security, but past that point you get subtraction by addition: complexity overcomes your efforts, and every new endeavor merely weakens the stability of the system and, with it, your protection.
The example you already know is mandating long, complex passwords that contain no meaningful sequences of characters. The organizations that force 10-character nonsense strings on their users, blending numbers with capitalized and lowercase letters and changing every month or so, are the shops where people write down their passwords and stick those keys to entry in a convenient location.
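The trade-off can be made concrete with a back-of-the-envelope entropy calculation. This is a rough sketch, not a policy recommendation: the function and the word-list size are illustrative assumptions, it assumes passwords are chosen uniformly at random, and real-world strength also depends on how people actually pick and store their credentials.

```python
import math

def entropy_bits(pool_size: int, length: int) -> float:
    """Theoretical entropy, in bits, of a secret built from `length`
    symbols each drawn uniformly at random from `pool_size` choices."""
    return length * math.log2(pool_size)

# A mandated 10-character string mixing digits with upper- and lowercase
# letters (a 62-symbol pool) -- the kind of policy described above.
complex_policy = entropy_bits(62, 10)   # roughly 59.5 bits

# A four-word passphrase drawn from a 7,776-word Diceware-style list --
# comparable strength, but far easier to remember.
passphrase = entropy_bits(7776, 4)      # roughly 51.7 bits
```

The arithmetic shows the mandated string is only modestly stronger on paper; that margin evaporates the moment the string ends up on a sticky note under the keyboard.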
But that's merely the tip of the iceberg.
Because most IT practitioners aren't "people people" but the kind of individuals who prefer the company of machines, they are likely to think of the problem as a human one: if they could just get rid of the users, it could all be perfect. That naïve view, beyond its obvious unattainability, is a fallacy.
It's the systems that are most vulnerable to complexity. The more complex a single piece of software or hardware is, the more likely it is to be flawed and the less likely it is the flaws will be detected before an attack. The greater the number of pieces you try to cobble together to approach absolute security, the more likely it is that the attempt to integrate them will be imperfect and some border between them will offer a seam someone could exploit.
The result of adding security on top of such Rube Goldbergiana, however, is a plaquing up of the information channels that allow users to be productive; that plaque will always slow down, and sometimes kill, an organization. One chicken that recently came home to roost involves an ex-client of mine, a freight forwarder.
The firm suffered a triple whammy from 9/11. First, its management got very afraid, almost timid, about the external world; the event triggered a general failure of nerve. They started paying more attention to what might go wrong than to how to cope with the inevitable changes to their operations and business model. Second, because of the business they were in, government agencies started paying more attention to the company's transactions, which cost it extra auditing time and effort, all overhead, nothing productive.
The coup de grâce: to help allay its fears, the company hired a manager-level fellow as IT's security czar. He was even more driven by anxiety than the executive team, and was hyper-energetic and very ambitious. These personality factors spun together into a perfect spit-storm. While he was tireless in devising procedures that slowed down network log-on and restricted access to data except when signed off on by a note from the worker's mother, he was larding up the network with new products and security patches, and attempting to outsource his help desk overseas to pay for all the new techno-binky purchases he hoped would ease his anxiety.
His actions inevitably added overhead (time, energy, cash), and that inevitably degraded productivity. The declining economy undercut the company's gross revenues at the same time, a perfectly fatal recipe.
Like a car owner who welds his sedan's doors shut and epoxies his windows into their frames, the company became a possession no one could hijack, but one that couldn't be legitimately used either. The close-to-perfectly-secured company shut its doors for good by late 2002.
But wait, there's more…
Social and human factors, the ways end users work within an over-secured system, present another set of overhead costs that diminish an organization's ability to weather excessively locked-down environments. I'll explore that next time and give you a tool that can be of some use when you're confronting superfluous security silliness.
No need to weld those car doors when for $40 you could buy The Club.
Read part 2 of this article to see How Fear Impedes Security.