Among the responsibilities of the National Security Agency’s Information Assurance Directorate is to discover and mitigate technical vulnerabilities that put information systems at risk. But another class of vulnerability also poses a threat to our nation’s secrets: user experience.
When a product or process is too hard to use or simply unreliable, people naturally tend to avoid it. This tendency creates a significant problem when people avoid using their security products. Calls are made on unsecured lines, e-mails go unencrypted, files are moved across domains without being scanned for malware, and network configurations are changed to allow unauthorized applications to function. The impact of these small actions can be significant.
The greatest security measure in the world is worthless if it is not used.
The internal system of secure phones at NSA is an example of a system that works because users employ it. We have good operational discipline and do not discuss classified topics on commercial phones. Why? The internal secure phones work just like home phones. Their audio quality is great, they operate by dialing just like a normal phone — without any difficult password or login requirements — and getting a connection is quick and easy. Simply put, users make classified calls on the secure phones because the system is transparent and delivers the experience users expect.
Many in military, intelligence and law-enforcement circles do not practice the same discipline with secure telephones, because their phone systems aren’t as transparent and functional as ours. Wouldn’t it be great if every call were secure without having to use a special phone, regardless of whether it was intended to be a classified discussion? And shouldn’t all e-mails be encrypted by default? Building in security that operates transparently in the background is a tremendous enabler.
A tough consideration in this discussion is deciding what constitutes an “acceptable risk.” We can raise the bar in a security solution to mitigate extreme exploitation scenarios, but that comes at a cost, both in development resources and operational efficiencies.
There is room to balance risk against usability within a solution, again noting that too heavy a burden can drive our customers away from security. Acceptable risk will vary significantly by scenario, and we need products that address the full spectrum of operations, including some that accept more risk in exchange for an easier user experience.
This message must be carefully parsed. Poor security solutions must not be encouraged or endorsed simply for expediency. There is, however, a place for deliberate, educated risk decisions regarding security. Developers must design with the intent to deliver a quality user experience that doesn’t get in the way of operations. Foundational to that are the assumptions that security practices and policies must be followed, and that the cost to achieve solid security is necessary and must be accepted. We need to ensure the operational impact is as low as possible for the given mission.
The total or “real” security risk includes a trade-off between security and user experience. Users are clever and expedient, and we know where their default reaction lies. Security competes for their attention, and although it doesn’t have to be fully transparent, it should be nearly automatic for greatest effectiveness.
Clearly, the more developers and implementers live like our customers (in the same communications environment, using the same devices and procedures), the more crisp, realistic and usable their decision-making and advice will be. Developers using the same technology and living in the same world take these real-life trade-offs much more seriously.
As we develop security solutions, we need to examine the user experience as deeply as we inspect the technical architecture. The good news is that user experience problems are just as much within our control to mitigate as technical vulnerabilities are.