Security is all about protecting assets—money, credit card numbers, or even resources like phone usage. When software controls access to these assets, our job is to ensure those applications are reasonably secure.
As an industry, we’ve taken steps in the right direction by realizing that security isn’t something that can be solved by dropping a box (firewall, IDS, spam filter) on the network. Application code also needs to be fortified.
We’ve made efforts to banish buffer overruns, squash SQL injection vulnerabilities and weed out some of the other common issues that creep into software development. The interesting thing about software “security,” though, is that no matter how well we design a system, write the code, or fortify the network, we still have to deal with people who must use, operate and sometimes inadvertently circumvent security entirely as a part of their job.
I’m talking about the insider, the operator, but not the mischievous or malevolent one. I’m talking about the hard-working, smart, dedicated but ill-informed employee.
Oftentimes we turn software over to operations but offer little insight into how to deploy it or, more importantly, how to use it securely. In addition to securing software, we need to build a culture of security to go along with it. Without one, the effectiveness of the other is likely to be minimal.
The Unknowing Element
A story I recently heard from a newly graduated college student may help illustrate this. First, a little context: there is no more dangerous, creative and resourceful adversary than a cash-strapped college student. This story involves a mid-sized university that has a fairly common yet complex phone system—one ruled by four-digit extensions.
The system also offers the ability to dial an office or room directly by using a regular telephone number consisting of a three-digit fixed prefix followed by the four-digit internal extension number. Certain extensions have the ability to transfer a call to another extension, like forwarding someone from the admissions department to the student records office.
The school had one toll-free number that connected directly to the admissions office, where either a secretary or a work-study student would answer the phone. After 5 p.m. on weekdays, the extension would forward to a voicemail system.
Security vulnerabilities often lurk in additional or side-effect functionality, and in this case the offending functionality was in the voicemail system. The system allowed a caller to key in an alternate extension, essentially granting toll-free access to any extension (including dorm rooms) at the university.
Some students had been abusing this for years, saving thousands of dollars in long-distance calls from family and friends. A design flaw, to be certain, and by itself not a particularly interesting one, other than to illustrate that the phone system’s developers weren’t thinking about how attackers would attempt to abuse it.
What was really interesting, though, is what happened once university officials, months and tens of thousands of dollars in phone bills later, realized the vulnerability existed. The technical issue was fixed: the voicemail for the admissions office was turned into a dead end for callers coming through the 800 number. A much more insidious vulnerability remained, however—uninformed users.
Many students then changed their scheme to exploit the human gatekeeper. The new strategy was to call the school in the afternoon, before the admissions office closed, claim they had dialed the wrong extension and ask to be transferred to the dorm extension of their choice.
There were incidents of long-distance boyfriends or girlfriends calling in at 4:50 p.m., then both parties leaving their phones off the hook and connected until 10 p.m., when they began their real conversations. The university continued to hemorrhage money through the phone system, largely unnoticed by administrators.
Plugging the Holes
To me, this story illustrates one of the biggest problems we have in software security—educating our users on threats.
Developers, designers and architects all tend to assume that implementers, administrators and users have a certain level of familiarity with the products they deploy and use. Yet information to help customers deploy and use solutions securely isn’t conveyed. Secure deployment may be about ports, protocols, passwords and backups, but it also requires educating users about security.
The CSI/FBI Computer Crime and Security Survey estimates that more than 70% of information security incidents originate from insiders, but the fact is that most offending insiders aren’t malicious. They may, however, be enablers.
Take the employee who clicks on the obviously virus-laden email or leaves a password sticky note on his monitor. These are classic human-element security issues. Some organizations have attempted to address them with endless policies and procedures, but the key failing is that telling smart people what to do without explaining why is likely to provoke frustration and policy violations. So policies meant to promote security are defanged by ignorance or confusion.
The key is to build a culture of security (not paranoia!) and to let it spread to everyone in the organization. It may start with an awareness email or an e-learning module on threats and attackers. It might take the form of an extra half-hour of IT security training as part of new-employee orientation.
The reality is most people will do the right thing if they understand what the right thing is and why they’re doing it.
Until we couple user security awareness and deployment guidance with technical security, and address vulnerabilities in the way applications are used as well as in the way software is manufactured, we will make little impact on true operational risk.
Herbert H. Thompson is chief security strategist at Security Innovation. Thompson trains software developers and testers at the world’s largest software companies on security techniques.