I want you to think for a second about the key you use most. Whether it’s for your house, your apartment, your car, or your office, just think about it for a moment.
Now, this key you’re thinking of has a few basic properties. It’s made of metal, with a blade extending out of it that has grooves along one or both sides, and either a single set of teeth cut into the bottom or two identical sets of teeth cut into both the top and bottom.
If it is a car key, it might be slightly different; as car theft has increased, car keys have gotten more complex, so you might be thinking of a car key that is just a wireless fob that unlocks and/or starts the car based on proximity, or it might be an inner-cut key, as is common with many Asian and European cars today.
Aside from the description I just gave you, when was the last time you thought about that key? When did you actually last look at the ridges on it?
It’s been a while, hasn’t it? That’s because that key and the lock it works with provide the level of security you feel you need to protect that place or car, yet they don’t get in your way, as long as the key and the lock are behaving properly.
Earlier this week, I was on a chat on Twitter, and we were discussing aspects of security as they relate to mobile devices. In particular, the question was asked, “Why do users elect to not put a PIN/passcode/password on their mobile devices?” While I’ve mocked the idea of considering security and usability in the same sentence, let alone the same train of thought while developing technology, I was wrong. Yes, I said it. I was wrong. Truth be told, Apple’s Touch ID is what finally schooled me on it. Security and usability should be peers today.
When Apple shipped the iPhone 5s and added the Touch ID fingerprint sensor, it was derided by some as not secure enough, not well designed, not a 100% replacement for the passcode, or simply too easy to defeat. But Touch ID does what it needs to do. It works with the user’s existing passcode – which Apple wisely tries to coax users into setting up on iOS 7, regardless of whether they have a 5s or not – to make day-to-day use of the device easier while maintaining a modicum of security, and to secure the data, the device, and the credentials stored on it and in iCloud better than most users did before the 5s.
That last part is important. When we shipped Windows XP, I like to think we tried to build security into it to begin with. But the reality is, security wasn’t pervasive. It took setting aside a lot of dedicated time (two solid months of security training, threat modeling, and standing down on new feature work) for the Windows Security Push. We had to completely shift our internal mindset to think about security from end to end. Unlike the way we had lived before, security wasn’t to be a checkbox, it wasn’t a developer saying, “I used the latest cryptographic APIs”, and it wasn’t something added on at the last minute.
Security is like yeast in bread. If you add it when you’re done, you simply don’t have bread – well, at least you don’t have leavened bread. So it took us shipping Windows XP SP2 – an OS update so big and so significant that many people said it should have been called a new OS release – before we ever shipped a Windows release where security was baked in from the beginning of the project, across the entirety of the project.
When it comes to design, I’ve mentioned this video before, but I think two of Jonathan Ive’s quotes in it are really important to have in your mind here. Firstly:
“A lot of what we seem to be doing in a product like that (the iPhone) is getting design out of the way.”
And secondly:

“It’s really important in a product to have a sense of the hierarchy of what’s important and what’s not important by removing those things that are all vying for your attention.”
I believe that this model of thought is critical to have in mind when considering usability, and in particular where security runs smack dab into usability (or more often, un-usability). I’ve said for a long time that solutions like two-factor security won’t take off until they’re approachable by, and effectively invisible to, normal people. Heck, too much of the world never set their VCR clocks for the better part of a decade because it was too hard, and it was a pain in the ass to do it again every time the power went out. And you really don’t understand why they don’t set a good PIN, let alone a good passcode, on their phone?
What I’m about to say isn’t meant to imply that usability isn’t important to many companies, including Microsoft. But I believe many companies are run, and many software, hardware, and technology projects are started, run, and finished, with usability treated as just a checkbox. Just as security is today at Microsoft, usability should be embraced, taught, and rewarded across the organization.
One can imagine an alternate universe where a software project the world uses was stopped in its tracks for months, redesigned, and updated around the world because a user interface element was so poorly designed for mortals that it led them to make a bad security decision. But this alternate universe is just that, an alternate universe. As you read the above, it sounds wacky to you – but it shouldn’t! As technologists, it is our duty to build hardware, software, and devices where the experience, including the approach to security, works with the user, not against them. Any move that takes the status quo of “security that users self-select to opt into” and moves it forward a notch is a positive move. But any move here also has to just work. You can’t implement nerd porn like facial recognition if it doesn’t work all of the time or fails to provide an alternative for when it breaks down.
Projects that build innovative solutions where usability and security intersect should be rewarded by technologists. Sure, they should be critiqued and criticized, especially if designing in a usable approach really compromises the security fundamentals of the – ideally threat-modeled – implementation. But critics should also understand where their criticism falls down in light of the practical security choices most end users make in daily life.
Touch ID, with as much poking, prodding, questioning, and hacking as it received when it was announced, is a very good thing. It’s not perfect, and I’m sure it’ll get better in future iterations of the software and hardware, and perhaps as competitors come up with alternatives or better implementations, Apple will have to make it ever more reliable. But a solution that allows that bar to be moved forward, from a place where most users don’t elect to set a pin or passcode to a place where they do? That’s a net positive, in my book.
As Internet-borne exploits continue to grow in both intensity and severity, it is so critical that we all start taking the usability of security implementations by normal people seriously. If you make bad design decisions about the intersection where security and usability collide, your end users will find their own desire path through the mayhem, likely making the easiest, and not usually the best, security decisions.