Has Johnny learnt to encrypt by now? Examining the troubled relationship between a security solution and its users
M. Angela Sasse, Professor of Human-Centred Technology
Department of Computer Science, University College London, UK
a.sasse@cs.ucl.ac.uk • www.ucl.cs.ac.uk/staff/A.Sasse
Overview • Usable security • History – Johnny & the Enemies • A framework for thinking about security • Usable encryption: what has been tried, and how successful it was • Concluding thoughts • Who has to learn what • Possible technology pieces towards a solution
Two of the three “classics” are reprinted in a recent book: Security and Usability: Designing Secure Systems That People Can Use. Edited by Lorrie Faith Cranor & Simson Garfinkel. O’Reilly, 2005.
“Why Johnny Can’t Encrypt” • Whitten & Tygar, Proc. USENIX Security Symposium 1999 • Graphical user interface to PGP 5.0 • Even after a detailed introduction, only 3 out of 12 participants could encrypt their email successfully • You need more than a pretty face: graphical ≠ usable • Problems: • User tasks not represented • Misleading labels • Lack of feedback
“Users Are Not The Enemy” • Adams & Sasse, Communications of the ACM, 1999 • Many users’ knowledge about security is inadequate • Users will shortcut security mechanisms that get in the way of their goals/tasks • Security policies often make impossible demands of users • Users lose respect for security, leading to a downward spiral in behaviour
How do we design a usable system? • Consider users and their characteristics • Minimize physical and mental workload • Consider users’ goals and tasks • Functionality must support these; the user interface must signpost them in those terms • Conflicting goal structures are always bad news • Consider context of use • Physical and social environment
The path to usable security (according to Whitten, 2004) “… the usability problem for security is difficult to solve precisely because security presents qualitatively different types of usability challenges from those of other types of software […] making security usable will require the creation of user interface design methods that address those challenges.”
The path to usable security (according to Sasse et al., 2001) • Most security mechanisms are downright unusable – apply key usability principles • Identify users and relevant characteristics • Minimize their physical & mental workload • Security is an enabling task, so fit in with production tasks and context of use • Policies and mechanisms • When extra effort is needed, educate and motivate
What we agreed on … and what not
Agreed:
• Development of usable security systems is similar to safety-critical systems development
• Security is a secondary goal for most users
• Underlying security systems are complex
• Education and behaviour modification are needed
Not agreed:
• Designing a user interface vs. designing a socio-technical system
• Security UIs should prevent errors and teach users about the underlying security systems vs. simplify the underlying systems
• Security must remain visible vs. simplify & automate wherever possible
A telling footnote … “… when presented with a software programme incorporating visible public key cryptography, users often complained during the first 10-15 minutes of the testing that they would expect ‘that sort of thing’ to be handled invisibly. As their exposure to the software continued and their understanding of the security mechanism grew, they generally ceased to make that complaint.” Alma Whitten’s thesis, 2004
… but we only want what’s best for them! “There are significant benefits to supporting users in developing a certain base level in generalizable security knowledge. A user who knows that, regardless of what application is in use, one kind of tool protects the privacy of transmission, a second kind protects the integrity of transmission, and a third kind protects the access to local resources, is much more empowered than one who must start afresh with each application.” Alma Whitten’s thesis, 2004
So … what would Johnny have to learn? The following lists were posted by Eric Norman (University of Wisconsin) to the Yahoo HCISec mailing group last year, and are reproduced with his kind permission
“Those of us who grew up on the north side of Indianapolis have this thing for top 10 lists. At least one of us (me) believes the following: when it comes to PKI and security, users are going to have to learn something. I'm not sure just what that something is; I know it's not the mathematics of the RSA algorithm, but I believe that no matter what, there's something that they are just going to have to learn. It's like being able to drive down the concrete highway safely.”
“You don't have to learn about spark plugs and distributors, but you do have to learn how to drive, something about what the signs mean, what lines painted on the road mean, and so forth. Nobody can do this for you; each user (driver) is going to have to learn it for themselves. In order to get a better handle on just what it is that folks are going to have to learn, I'm trying to come up with a top 10 list of things that must be learned. Here's what I have so far with some help from some other folks I know who are more technophiles than human factors people. There are two lists: one for users and the other for administrators, developers, etc.”
Things PKI users have to learn • How to import a trust anchor. • How to import a certificate. • How to protect your privates (private keys, that is). • How to apply for a certificate in your environment. • Why you shouldn’t ignore PKI warnings. • How to interpret PKI error messages. • How to turn on digital signing. • How to install someone’s public key in your address book. • How to get someone’s public key. • How to export a certificate.
… and • Risks of changing encryption keys. • How to interpret security icons in sundry browsers. • How to turn on encryption. • The difference between digital signatures and .signature files. • What happens if a key is revoked. • What does the little padlock really mean? • What does it mean to check the three boxes in Netscape/Mozilla? • What does “untrusted CA” mean in Netscape/Mozilla? • How to move and install certificates and private keys.
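To make one item from the list above concrete, here is a minimal Python sketch of what “importing” and inspecting a certificate amounts to, and of the fields behind typical PKI warnings. It uses the widely available `cryptography` package; the file name is hypothetical.

```python
# A minimal sketch of "how to import a certificate" and of the fields
# behind typical PKI warnings. Uses the Python "cryptography" package;
# the file name is hypothetical.
from datetime import datetime

from cryptography import x509

with open("colleague_cert.pem", "rb") as f:        # hypothetical file
    cert = x509.load_pem_x509_certificate(f.read())

# The three fields most "PKI warnings" are actually about:
print("Subject:", cert.subject.rfc4514_string())   # whom the cert names
print("Issuer: ", cert.issuer.rfc4514_string())    # who vouches for it
print("Expires:", cert.not_valid_after)            # end of validity window

if cert.not_valid_after < datetime.utcnow():
    print("Warning: this certificate has expired")
```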
Developers, administrators, etc. • What does the little padlock really mean? • How to properly configure mod_ssl. • How to move and install certificates and private keys. • What .pem, .cer, .crt, .der, .p12, .p7s, .p7c, .p7m, etc. mean. • How to reformat PKI files. • How to enable client authentication during mod_ssl configuration. • How to dump BER formatted ASN.1 stuff. • How to manually follow a certificate chain. • The risks of configuring SSL stuff such that it automatically starts during reboot. • How to extract certificates from PKCS7 files, etc.
… and • How to make PKCS12 files. • How to use the OpenSSL utilities. • What happens if a key is revoked.
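As an illustration of one developer item above, “how to manually follow a certificate chain”, here is a rough Python sketch that walks upwards from a leaf certificate by matching each issuer name to a candidate’s subject name. File names are hypothetical, and real chain validation also checks signatures and validity periods, which are omitted here.

```python
# A rough sketch of manually following a certificate chain: walk upwards
# from the leaf by matching each certificate's issuer to a candidate's
# subject. Signature and validity checks are deliberately omitted.
from cryptography import x509

def load(path: str) -> x509.Certificate:
    with open(path, "rb") as f:
        return x509.load_pem_x509_certificate(f.read())

cert = load("server.pem")                                   # the leaf
pool = [load(p) for p in ("intermediate.pem", "root.pem")]  # candidates

while True:
    print(cert.subject.rfc4514_string(),
          " <- issued by:", cert.issuer.rfc4514_string())
    if cert.issuer == cert.subject:      # self-signed: reached a root
        break
    parents = [c for c in pool if c.subject == cert.issuer]
    if not parents:
        print("Chain broken: no certificate found for", cert.issuer)
        break
    cert = parents[0]
```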
Can a nice UI make security tools easy to use? • Problem lies deeper: • “key” cues the wrong mental model • Meaning of “public” and “private” is different from everyday language • Underlying model too complex • Whitten produced a tutorial on public key cryptography that takes 1.5 days • Solutions? • Automatic en/decryption where encryption is needed (see the sketch below) • Simplify model/language
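A minimal sketch of the “automatic encryption” idea: the user just calls an ordinary send function, and the message is encrypted whenever a key for the recipient is known. The names (`send`, `KEYRING`) are invented for illustration, and Fernet merely stands in for whatever cipher a real mail client would use.

```python
# A minimal sketch of automatic encryption: send() encrypts whenever a
# key for the recipient is known. KEYRING and send are hypothetical
# names; Fernet stands in for a real client's cipher.
from cryptography.fernet import Fernet

KEYRING = {"alice@example.org": Fernet.generate_key()}   # demo keyring

def send(recipient: str, message: str) -> bytes:
    key = KEYRING.get(recipient)
    if key is None:
        # A real client must make this a visible, understandable choice,
        # not a silent fallback -- cf. the Desmedt anecdote later on.
        print("No key for", recipient, "- sending in the clear")
        return message.encode()
    return Fernet(key).encrypt(message.encode())

wire = send("alice@example.org", "Meet at noon")  # encrypted transparently
```

The design question is what to do when no key is known: a silent fallback to plaintext is exactly the failure mode described in the Desmedt example later in this talk.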
Results from Grid security survey, 2005 • Survey of security issues in the UK eScience (Grid) programme • Most frequently mentioned issue: certificates • Many users complained about the effort involved in obtaining certificates, and the complexity involved in using them
How to get an eScience certificate
1. User gets notified that they need a certificate to use a Grid application, with an instruction sheet on how to get one
2. Point browser at the national CA; the CA sends a notification to the local CA
3. Go to the local CA with proof of identity/authorization (what if the user does not have one?)
4. The local CA person releases the certificate to the CA for a specific machine, and gives the user a password for certificate release
5. The person can download the certificate from the CA via a browser to the local machine
6. Export the certificate from the browser to the directory where the application will look for it
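That final export step is where many users stumble: browsers hand out a PKCS#12 bundle, while Grid middleware typically expects a separate PEM certificate and private key (Globus, for example, conventionally reads them from ~/.globus/). A rough Python sketch of the conversion, with illustrative file names and prompts:

```python
# A rough sketch of converting a browser-exported PKCS#12 bundle into the
# PEM certificate and key that Grid middleware conventionally expects.
# Paths are illustrative; uses the Python "cryptography" package.
from getpass import getpass
from pathlib import Path

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.serialization import pkcs12

bundle = Path("browser_export.p12").read_bytes()      # hypothetical file
password = getpass("Export password: ").encode()

key, cert, _chain = pkcs12.load_key_and_certificates(bundle, password)

out = Path.home() / ".globus"
out.mkdir(exist_ok=True)

(out / "usercert.pem").write_bytes(
    cert.public_bytes(serialization.Encoding.PEM))

keyfile = out / "userkey.pem"
keyfile.write_bytes(key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.TraditionalOpenSSL,
    serialization.BestAvailableEncryption(password)))
keyfile.chmod(0o600)   # most Grid tools refuse a key that others can read
```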
Obtaining a certificate was perceived to require too much time and effort; many projects would share certificates obtained by one project member to “make it worth it.” • The security people’s defence: “People should regard it as the price of admission you have to pay for using the Grid.”
Problems in using certificates • Certificate has to be stored in right application directory • Will not work on a different machine, but • … anyone using my machine can use it (not uncommon in Grid projects). • To users, it’s just another file on their computer – nothing that marks it out as something they should look after like a bank statement • Problems understanding terminology • “doesn’t work like a key” • “there is no such thing as half a secret”
Ironic twist … • Users actually do have security requirements – and one of them is the availability of their data • They are terrified that their keys and/or certificates will stop working
Security metaphors • Metaphors used by security experts as shorthand for communicating with each other do not work for a wider audience • “key” cues the wrong mental model – cryptographic keys do not behave like physical locks and keys • Meaning of “public” and “private” is different from everyday language • Not clear why a “digitally signed” message = hasn’t been tampered with – most users think it means the message is from who it says it is …
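The signature confusion is easy to demonstrate. In the sketch below (Ed25519 via the Python `cryptography` package), verification proves only that the message is untampered and was signed with a particular key; the binding between that key and a real-world identity is exactly the part users assume for free and certificates are supposed to supply.

```python
# A sketch of what "digitally signed" actually guarantees: integrity plus
# possession of a KEY, not the sender's real-world identity.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key = Ed25519PrivateKey.generate()
message = b"Pay invoice 42"
signature = key.sign(message)

public = key.public_key()
public.verify(signature, message)        # passes: message is untampered

try:
    public.verify(signature, b"Pay invoice 43")   # any change breaks it
except InvalidSignature:
    print("Tampered message rejected")

# Nothing above says WHO owns `public` -- binding keys to identities is
# precisely the job of certificates and CAs.
```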
Improving Johnny’s performance • Garfinkel & Miller: the overhead of obtaining certificates is a barrier to adoption • Solution: • Key Continuity Management (KCM) • Colour-coding messages according to whether the message was signed, and whether the signer was previously known (see the sketch below) • Remaining problems: users did not realise that encrypted ≠ secret if you send the message to the attacker
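A minimal sketch of the KCM idea, in the spirit of (not copied from) Garfinkel & Miller’s modified client: remember the key fingerprint first seen for each sender, then classify later messages by continuity rather than by CA-issued certificates. The colour labels and in-memory store are illustrative.

```python
# A minimal Key Continuity Management sketch: classify messages by
# whether the sender's key matches the one seen before, like SSH's
# known_hosts. Colour labels and the in-memory store are illustrative.
import hashlib

known_keys: dict[str, str] = {}    # sender address -> key fingerprint

def classify(sender: str, key_bytes: bytes) -> str:
    fingerprint = hashlib.sha256(key_bytes).hexdigest()
    if sender not in known_keys:
        known_keys[sender] = fingerprint
        return "YELLOW: first message signed with this key"
    if known_keys[sender] == fingerprint:
        return "GREEN: same key as earlier messages from this sender"
    return "RED: key has CHANGED for this sender - possible impersonation"

print(classify("bob@example.org", b"key-1"))   # YELLOW (new sender/key)
print(classify("bob@example.org", b"key-1"))   # GREEN  (continuity)
print(classify("bob@example.org", b"key-2"))   # RED    (changed key)
```

Note that KCM deliberately trades CA-backed identity for continuity: it cannot say who a new correspondent is, only whether an existing one has changed keys, which is why the “encrypted ≠ secret” problem above remains.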
It’s not just end-users who struggle … • Case studies with eScience (Grid) software developers found: • Many developers have difficulty understanding how to implement PKI • A tendency to avoid PKI because it was seen as too complex and likely to put off potential users • The cost of implementation and operation was considered too high • Zurko & Simon pointed out in 1996 that not only users, but also developers & system managers struggle with the complexity of security
Even cryptographers can get it wrong … • In a recent paper, Yvo Desmedt describes his failure to encrypt a wireless link … and blames it all on network/system managers … • “… system managers do not understand the consequences of their actions and may not know of, for example, man-in-the-middle attacks or understand these correctly.” • Example of a colleague whose encrypted link to a firewall defaulted to unencrypted when he briefly closed the lid on his PowerBook …
“Too many engineers consider cryptography to be a sort of magic security dust they can sprinkle over their hardware and software […].” “The fundamentals of cryptography are important, but far more important is how those fundamentals are implemented and used.” “Book after book presented complicated protocols for this or that, without any mention of the business and social constraints within which those protocols would have to work.” N. Ferguson & B. Schneier: Practical Cryptography, 2003
“Humans are incapable of storing high-quality cryptographic keys, and they have unacceptable speed and accuracy when performing cryptographic operations. (They are also large, expensive to maintain, difficult to manage, and they pollute the environment.) It is astonishing that these devices continue to be manufactured and deployed. But they are sufficiently pervasive that we must design our protocols around their limitations.” [C. Kaufman, R. Perlman & M. Speciner: Network Security]
To get on the network today: • SSID: PKI2005 • WEP Key (HEX): 12E9CEA5381354FD6FE23234EA
Summary (1) – lessons from Johnny • Make it as easy as possible for Johnny to do the right thing • minimize physical and mental workload, and • consider his goals and context of use. • If you want to educate Johnny • get your terminology in order first • Motivate him by linking security to things he cares about • Less complexity and more integration would help all users (not just Johnny).
Summary (2) – strategy • Application solutions – S/MIME • Design to secure things people care about • Felten & Friedman’s value-based design • Secure delete • Better integration of encryption solutions • Better and faster administrative support • Technologies that might help • Shibboleth – probably • Token-based systems – maybe • Biometrics – maybe
References • A. Adams & M. A. Sasse (1999): Users Are Not The Enemy: Why users compromise security mechanisms and how to take remedial measures. Communications of the ACM, 42(12), pp. 40-46, December 1999. • Y. Desmedt (in press): Why some network protocols are so user-unfriendly. Security Protocols, Springer LNCS. • N. Ferguson & B. Schneier (2003): Practical Cryptography. Wiley. • I. Flechais (2005): Designing Secure and Usable Systems. PhD thesis, Department of Computer Science, UCL. • S. Garfinkel & R. C. Miller (2005): Johnny 2: A user test of key continuity management with S/MIME and Outlook Express. Proc. SOUPS 2005. • M. A. Sasse, S. Brostoff & D. Weirich (2001): Transforming the “weakest link”: a human-computer interaction approach to usable and effective security. BT Technology Journal, 19(3), pp. 122-131, July 2001. • A. Whitten (2004): Making Security Usable. Doctoral thesis CMU-CS-04-135, Carnegie Mellon University. • A. Whitten & J. D. Tygar (1999): Why Johnny can’t encrypt: A usability evaluation of PGP 5.0. Proc. 8th USENIX Security Symposium, 1999. • M. E. Zurko & R. T. Simon (1996): User-centered security. Proc. New Security Paradigms Workshop, pp. 27-33, 1996.