Modernizing Authentication — What It Takes to Transform Secure Access
Why Is It So Difficult to Educate or Enforce Compliance?
Does this sound familiar? Ian Fleming is his company's top IT executive and a passionate believer in strict security procedures. But convincing other executives at his company, an electric utility that he requested not be named, has been a struggle. On several occasions, Fleming says, he's been chastised by other executives for insisting they follow simple security procedures such as changing their passwords. And in December, Fleming found himself arguing with the CEO, who insisted on choosing a permanent password for himself. While the CEO now supports the security policy, says Fleming, "most managers in the utility industry see security as an obstacle to performing everyday business."
Judging by the results of our latest survey, Fleming has plenty of company both inside and outside his industry. Fewer than 10 percent of IT executives feel their companies' security is less than adequate. But ask them about enforcing security policy and educating employees about security, and the level of confidence drops considerably: 34 percent feel they are doing a less-than-excellent job of enforcing their security policies, while 45 percent say they miss the mark when it comes to education.
Given the attention directed to computer viruses, hackers and terrorism, why is managing security so tough? No surprise here: according to the IT executives we spoke with, user resistance and denial are at the root of the problem. No one has a sure-fire solution. Instead, they rely on practices that seem to work, or that they hope will work, somewhat better than others.
Users regard security policies as an inconvenience, even a nuisance. "Security makes it harder for people to use the infrastructure we have built," says Vijay Sharma, a vice president of relationship management at Sodexho USA, a food and facilities-management company. Until people begin to doubt the integrity of the data or the systems, users think security "is more or less an annoyance," he says. That leads employees, and even managers, to ignore policies and known risks.
IT executives also say they often run into a "can't happen here" or "can't happen to me" attitude. Employees and managers may feel their industry isn't likely to be a target for hackers. Others think they know enough about computers to safely disregard company policies, so they download software off the Web, install their own programs or even change the configuration of their computers in order to speed them up, leaving themselves and their networks open to viruses, intruders and system crashes. Randy Kjell, VP of IT at Knowles Electronics Inc., a manufacturer in Itasca, Ill., sees this attitude among his company's engineers. "They think other people are the problem, not them. They think their stuff won't hurt the company, so even after educating them, the user community does not agree that these are truly high security issues."
How do CIOs overcome these attitudes? One way is to make the system, not the users, do the work of maintaining security, so that education and enforcement become moot. Executives like Kjell are putting intrusion detection, spam managers and virus filters on the network or firewall, out of sight and reach. And while Fleming is willing to isolate the engineering department if employees there engage in risky behavior, other CIOs simply show users safe ways to do what they want to do without endangering the company's network. "I've said no, but here's a way to get the results you want that minimizes our risk," says George Brenckle, CIO of the University of Pennsylvania Health System in Philadelphia.
The CIOs we spoke to use newsletters, intranets, e-mails and meetings to educate employees, but none works, they admit, as well as talking with employees after they've damaged their computers. "They tend to learn their lesson when their machines don't work for a period of time," says Kjell. Otherwise, what matters is not the technique so much as how often they contact employees about security, and whether the message is framed in terms that mean something to users. Sharma, an ex-college food service director, stresses the importance of meeting with employees and explaining the need for security on their terms. For example, at a meeting with a team working on an e-business project, Sharma discussed recent articles about security problems. "If companies like Microsoft can be compromised, what makes you think we can't be?" he asked. The team then discussed how Sodexho's business would be hurt if security problems caused customers to lose trust in the company.
The only potential breakthrough anyone cited is in the healthcare industry, where new HIPAA regulations require companies to train and test staff on privacy and security policies. In January, Penn Health's Brenckle began to use Web-based training to teach, test, record results and provide yearly refresher courses on HIPAA privacy. His staff is now putting together a similar program for security.
Still, the only way to solve the security problem is to make it a non-issue by designing systems so that they place no demands on users. Until that goal is reached, the problem of user resistance, deeply rooted in human nature, will remain a tough nut to crack.