Human Aspects in Security: Usability and Social Engineering
Source: kursawe/SiO2011/Slides/humanasp.pdf (posted 24-May-2020)

TRANSCRIPT

Page 1

Human aspects in Security

Usability

Social Engineering

Page 2

Follow-on: EULA

8. Exclusion and Limitation of Liability TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, IN NO EVENT SHALL THE RIGHTHOLDER OR ITS PARTNERS BE LIABLE FOR ANY SPECIAL, INCIDENTAL, PUNITIVE, INDIRECT, OR CONSEQUENTIAL DAMAGES WHATSOEVER (INCLUDING, BUT NOT LIMITED TO, DAMAGES FOR LOSS OF PROFITS OR CONFIDENTIAL OR OTHER INFORMATION, FOR BUSINESS INTERRUPTION, FOR LOSS OF PRIVACY, FOR CORRUPTION, DAMAGE AND LOSS OF DATA OR PROGRAMS, FOR FAILURE TO MEET ANY DUTY INCLUDING ANY STATUTORY DUTY, DUTY OF GOOD FAITH OR DUTY OF REASONABLE CARE, FOR NEGLIGENCE, FOR ECONOMIC LOSS, AND FOR ANY OTHER PECUNIARY OR OTHER LOSS WHATSOEVER) ARISING OUT OF OR IN ANY WAY RELATED TO THE USE OF OR INABILITY TO USE THE SOFTWARE, THE PROVISION OF OR FAILURE TO PROVIDE SUPPORT OR OTHER SERVICES, INFORMATION, SOFTWARE, AND RELATED CONTENT THROUGH THE SOFTWARE OR OTHERWISE ARISING OUT OF THE USE OF THE SOFTWARE, OR OTHERWISE UNDER OR IN CONNECTION WITH ANY PROVISION OF THIS AGREEMENT, OR ARISING OUT OF ANY BREACH OF CONTRACT OR ANY TORT (INCLUDING NEGLIGENCE, MISREPRESENTATION, ANY STRICT LIABILITY OBLIGATION OR DUTY), OR ANY BREACH OF STATUTORY DUTY, OR ANY BREACH OF WARRANTY OF THE RIGHTHOLDER OR ANY OF ITS PARTNERS, EVEN IF THE RIGHTHOLDER OR ANY PARTNER HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. YOU AGREE THAT IN THE EVENT THE RIGHTHOLDER AND/OR ITS PARTNERS ARE FOUND LIABLE, THE LIABILITY OF THE RIGHTHOLDER AND/OR ITS PARTNERS SHALL BE LIMITED BY THE COSTS OF THE SOFTWARE. IN NO CASE SHALL THE LIABILITY OF THE RIGHTHOLDER AND/OR ITS PARTNERS EXCEED THE FEES PAID FOR THE SOFTWARE TO THE RIGHTHOLDER OR THE PARTNER (AS MAY BE APPLICABLE). NOTHING IN THIS AGREEMENT EXCLUDES OR LIMITS ANY CLAIM FOR DEATH AND PERSONAL INJURY.
FURTHER IN THE EVENT ANY DISCLAIMER, EXCLUSION OR LIMITATION IN THIS AGREEMENT CANNOT BE EXCLUDED OR LIMITED ACCORDING TO APPLICABLE LAW THEN ONLY SUCH DISCLAIMER, EXCLUSION OR LIMITATION SHALL NOT APPLY TO YOU AND YOU CONTINUE TO BE BOUND BY ALL THE REMAINING DISCLAIMERS, EXCLUSIONS AND LIMITATIONS.

Page 3

Incident of the Week

Page 4

“You could spend a fortune purchasing technology and services... and your network infrastructure could still remain vulnerable to old-fashioned manipulation.” – Kevin Mitnick

“Amateurs hack systems. Professionals hack people.” – Bruce Schneier

Page 5

What is Social Engineering?

• Uses Psychological Methods

• Exploits human tendency to trust

• Goals are the Same as Hacking

• Our definition: manipulation of human beings to obtain information or confidence pertaining to the security of networked computer systems (with malicious intent)

Page 6

Social Engineering

• Use a plausible story, or just bully the target

– ‘What’s your PIN so I can cancel your card?’

– ‘You have a virus, you need to install our patch’

• Frank Abagnale ‘Catch me if you can’

• Kevin Mitnick ‘Art of Deception’

• Traditional responses:

– mandatory access control

– operational security

Page 7

Social Engineering: Why it works

• Social psychology:

– Solomon Asch, 1951: two-thirds of subjects would deny obvious facts to conform to the group

  • Line experiment

– Stanley Milgram, 1964: a similar number will administer torture if instructed by an authority figure

  • “Teaching by torture” experiment

– Philip Zimbardo, 1971: you don’t need authority; the subjects’ situation/context is enough

  • “Prison Guard Experiment”

– Simons & Chabris, 2010: Selective Attention Test

Page 8

The Mind of a Social Engineer

• More like actors than hackers

• Learn how people feel by observing their actions

• Can alter these feelings by changing what they say and do

• Make the victim want to give them the information they need

Page 9

Approaches

• Carelessness

• Comfort Zone

• Helpfulness

• Fear

Page 10

Careless Approach

• Victim is Careless

– Does not implement, use, or enforce proper countermeasures

– Does not understand value of assets they reveal

• Used for Reconnaissance

• Looking for what is lying around

• Especially easy since the rise of social media

– Facebook, LinkedIn, Twitter, IMDb, …

Page 11

Careless Examples

• Dumpster Diving/Trashing

– Huge amount of information in the trash

– Most of it does not seem to be a threat

– The who, what and where of an organization

– Knowledge of internal systems

– Materials for greater authenticity

– Intelligence Agencies have done this for years

Page 12

Careless Examples (cont.)

• Building/Password Theft

– Requires physical access

– Looking for passwords or other information left out in the open

– Little more information than dumpster diving

Page 13

Careless Examples (cont.)

• Password Harvesting

– Internet sweepstakes, simple services

– Based on the assumption that people reuse the same password across different accounts

– Given the number of accounts normal users have, that’s a fairly good assumption

Page 14

Careless Examples (cont.)

• Give out small goodies to get malware onto the target’s machines

– Free USB sticks

– Games, corrupted websites

Page 15

Comfort Zone Approach

• Victim organization members are in a comfortable environment

– Lower threat perception

– Implicit trust

• People already in the building belong there

• Other members of my Rabbit-Breeding club can’t be bad

• Usually requires the use of another approach

Page 16

Comfort Zone Examples

• Impersonation – Could be anyone

• Tech Support

• Co-Worker

• Boss

• CEO

• User

• Maintenance Staff

– Generally two goals

  • Asking for a password or other authentication information

  • Building access (see the careless approach)

Page 17

Comfort Examples (cont.)

• Shoulder Surfing

• Direct Theft

– Outside the workplace

– Wallet, ID badge, or purse stolen

• Smoking Zone

– Attacker sits out in the smoking area

– Piggybacks into the office when users go back to work

Page 18

Comfort Examples (cont)

• Insider Threats

– Legitimate employee

– Could sell or use data found by “accident”

– Result of poor access control

– Asking for favors from IT staff for information

• Usually spread out over a long period of time

Page 19

Helpful Approach

• People generally try to help even if they do not know who they are helping

• Usually involves being in a position of obvious need

• Attacker generally does not even ask for the help they receive

Page 20

Helpful Examples

• Piggybacking

– Attacker trails an employee entering the building

– More effective:

  • Carry something large so they hold the door open for you

  • Go in when a large group of employees is going in

– Pretend to be unable to find your door key

Page 21

Helpful Examples (cont.)

• Troubled user

– Calling organization numbers asking for help

– Getting a username and asking to have a password reset

Page 22

Fear Approach

• Usually draws from the other approaches

• Puts the user in a state of fear and anxiety

• Very aggressive

Page 23

Fear Examples

• Conformity

– The attacker claims the user is the only one who has not helped out with this request in the past

– Personal responsibility is diffused

– The user is violating social norms by not helping

– The user gets a justification for granting the attacker’s request

Page 24

Fear Examples (cont)

• Time Frame

– Fictitious deadline

– Attacker impersonates a payroll bookkeeper or proposal coordinator

– Claims the account will be deactivated immediately if not fixed

– Asks for password change

Page 25

Fear Examples (cont)

• Importance

– Classic boss or director needs routine password reset

– Posing as a utility worker showing up after a natural event (thunderstorm, tornado, etc.)

– User has already messed up (e.g., gotten a virus) and now needs to help fix it

• e.g., a new virus that is recognized by having the file /windows/system32/igfxsrvc.dll

Page 26

Advanced Attacks

• Attacker takes a lot of time to gain confidence

– Gather information about target victims

– Understand processes in target organization

– Combination of social engineering with real hacking

– Patient attack (e.g., wait until key people are on vacation)

Page 27

Human “Authentication”

• Sum of many pieces

– Appearance

  • Well-dressed people don’t lie

– Language style

  • Speak the lingo of the people you interact with

– Pieces of knowledge

  • Birthday, telephone number

– Context

  • E.g., an internal telephone number

• All of these individually can be faked

Page 28

Escalation of Authentication

• Slow escalation of access & information

– Use information about victims to get more information

• E.g., impersonate a high-school acquaintance to become a Facebook friend

• Use building access to get to elevator phone (internal phone number)

Page 29

There is no innocent knowledge

• Every piece of knowledge can be used to “authenticate” to get more knowledge

– E.g., knowing your name and birthday, I can convince you we’ve met before

– Most people will be too embarrassed to admit they have no clue who you are

• Information you consider harmless may be used to harm other people

• Who remembers what question they put into their security questions?

– The whole point is that it’s a question you can forget about

Page 30

Yahoo Mail Security Questions

Page 31

• Indirection: Other people may know that too

– Your siblings know your childhood questions

– Your classmates know your school teachers

• Prominent Cases

– Sarah Palin: Where did you meet your husband?

– Mass attack on 3000 mail accounts

Page 32

Abusing Complexity of Processes

• Most users have no idea how exactly their organization or its services work

• An attacker with that understanding can perform innocent-looking actions whose consequences no one foresees

Page 33

Abusing Complexity Of Processes

• Frank Abagnale: advanced check fraud

– Exploit the priority of the machine-readable routing information over the human-readable text

– Cause errors in the processing chain to buy time

Page 34

Combating Social Engineers

• User Education and Training

• Identifying Areas of Risk

– Tactics correspond to the area of risk

• Strong, Enforced, and Tested Security Policy

Page 35

User Education and Training

• Security Orientation for new employees

• Yearly security training for all employees

• Weekly newsletters, videos, brochures, games and booklets detailing incidents and how they could have been prevented

• Signs, posters, coffee mugs, pens, pencils, mouse pads, screen savers, etc. with security slogans (e.g., “Loose lips sink ships”)

• This doesn’t help much for external, high-turnover personnel (e.g., cleaning staff)

Page 36

K. Salah 36

Warning Signs of an Attack

• Refusal to give a callback number

• Out-of-the-ordinary request

• Claim of authority

• Stresses urgency

• Threatens negative consequences of noncompliance

• Shows discomfort when questioned

• Name dropping

• Compliments or flattery

• Flirting
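The warning signs above can be turned into a crude triage heuristic. A toy sketch in Python; the keyword lists, the function name, and the scoring rule are invented for illustration, not a real detector:

```python
# Toy triage heuristic over the warning signs listed above.
# Keyword lists and categories are illustrative assumptions.
WARNING_SIGNS = {
    "refuses_callback": ["no callback", "don't call back", "can't be reached"],
    "urgency": ["immediately", "right now", "urgent", "asap"],
    "authority": ["the ceo", "your boss", "head office", "on behalf of"],
    "threat": ["or else", "account will be deactivated", "disciplinary"],
    "flattery": ["so helpful", "great job", "you're the best"],
}

def suspicion_score(request_text: str) -> int:
    """Count how many warning-sign categories the request triggers."""
    text = request_text.lower()
    return sum(
        any(phrase in text for phrase in phrases)
        for phrases in WARNING_SIGNS.values()
    )

msg = ("This is urgent, I'm calling on behalf of the CEO. "
       "Reset the password immediately or your account will be deactivated.")
print(suspicion_score(msg))  # triggers urgency, authority, threat -> 3
```

A score like this could at most flag a request for the kind of verification the later slides recommend (call back, check with qualified personnel); it cannot replace it.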

Page 37

Train users how to combine politeness with security

• Offer to call back on fishy requests

• Strong policies to give users an excuse to say No

• A readily available list of phone numbers of qualified personnel to deal with a request

• Prepare verification questions

– “Isn’t John normally in charge of this?”

Page 38

Penetration Testing

• Have regular security tests involving social engineering

• Users should expect there are consequences if they fail such tests

• Ensure every action leaves an audit trail so attacks can be detected

• Embedded testing: Automatically send test mails to keep phishing awareness high

Page 39

Paternalistic Security

• Protect users from themselves: The user is the enemy!

• If the system assumes all users are malicious, users also cannot involuntarily collaborate with an attacker

• Disadvantages

– No user buy-in

– Users may be tempted to fight the security measure to get work done

Page 40

Interaction between the real and IT worlds

• Personal information should not be used to identify users to computer systems

• Give users a chance to set their own security

– “I need to change my mother’s maiden name”

Page 41

Attacker Interaction

Page 42

Attacker Interaction

All security mechanisms will influence the attacker, who may:

• Stop it all and become a good citizen (unlikely)

• Go attack someone else

• Shift attack patterns somewhere else

Page 43

Bad Example: South African car theft

In the late 90s, the number of car thefts in South Africa increased dramatically.

Security response: car immobilizers made it impossible to hot-wire a car.

Page 44

Bad Example: South African car theft

Response of the thieves: carjacking, armed assaults.

Security response: remotely activated theft protection.

Page 45

Bad Example: South African car theft

Response of the thieves: carjacking turned into kidnapping.

Security response: armed cars.

Due to “enhanced” security, people now die when their cars are stolen

Page 46

Xbox Attacks

The main attack on the Xbox was done by people trying to run Linux on it.

As a side effect, the copy protection got hacked.

Page 47

Why did the PS3 never get hacked that way?

Sony released Linux for the PS3.

The most talented hackers stayed away, as there was no glory to get here.

The PS3 was hacked weeks after it was locked down.

Page 48

Training your attackers: Set Top Box

The first set-top boxes had rather poor security.

Mildly trained and equipped hackers quickly hacked them: for themselves, for friends, for friends of friends.

Security increased slowly enough for the hackers to learn.

Now they are well trained, well equipped, and have a stable distribution network.

Page 49

Lessons in Attacker Interaction

• Consider where the attackers may go to circumvent your measures. This may be worse than the initial attack

• Don’t motivate attackers to join forces. Also, sometimes one can sacrifice low value assets to protect high value ones

• Don’t train your attacker. Bad security can be worse than none.

Page 50

Usability

Page 51

Why Johnny Can’t Encrypt

• User study of the encryption/authentication program PGP 5.0

– 90% of users could not get it right given 90 minutes

– Private/public keys, encryption/signing keys, plus trust labels were too much; people would delete or publish their private keys

Page 52

Why Johnny Can’t Encrypt

Usability Evaluation (12 users)

• 3 users accidentally sent the message in clear text

• 7 users used their public key to encrypt and only 2 of the 7 figured out how to correct the problem

• Only 2 users were able to decrypt without problems

• Only 1 user figured out how to deal with RSA keys correctly.

• A total of 3 users were able to successfully complete the basic process of sending and receiving encrypted emails.

• One user was not able to encrypt at all
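The dominant failure above, encrypting with the wrong key, can be made concrete with a toy model of keypairs. This is not real cryptography and not PGP's API; the `KeyPair` model and all names are illustrative:

```python
# Toy model of public-key mail, illustrating the error most study
# subjects made. No real cryptography: "encryption" just records
# whose public key was used, so only that owner can "decrypt".
from dataclasses import dataclass

@dataclass(frozen=True)
class KeyPair:
    owner: str  # the only person holding the private half

def encrypt(message: str, public_key_of: KeyPair) -> tuple:
    return (message, public_key_of.owner)

def decrypt(ciphertext: tuple, reader: KeyPair) -> str:
    message, intended_reader = ciphertext
    if reader.owner != intended_reader:
        raise PermissionError("wrong private key: cannot decrypt")
    return message

alice, bob = KeyPair("alice"), KeyPair("bob")

# Correct: encrypt to the *recipient's* public key.
ct = encrypt("meet at noon", public_key_of=bob)
print(decrypt(ct, reader=bob))  # bob can read it

# The classic mistake: encrypting with your *own* public key.
ct_wrong = encrypt("meet at noon", public_key_of=alice)
try:
    decrypt(ct_wrong, reader=bob)
except PermissionError as e:
    print(e)  # bob cannot read it
```

The point of the toy model is the asymmetry the PGP 5.0 interface failed to convey: which key you pick determines who can read the message, and picking your own is silently wrong until the recipient complains.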

Page 53

Why Johnny Can’t Encrypt

What’s the difference between ‘validity’ and ‘trust’?

Different key types for compatibility, thus confusing symbols

Page 54

Why Johnny Can’t Encrypt

Defining Usable Security Software (Whitten & Tygar)

Security software is usable if the people who are expected to use it:

1. are reliably made aware of the security tasks they need to perform;

2. are able to figure out how to successfully perform those tasks;

3. don't make dangerous errors;

4. are sufficiently comfortable with the interface to continue using it.

Page 55

Psychological Acceptability

Saltzer & Schroeder: “The Protection of Information in Computer Systems”

• Psychological acceptability: It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly. Also, to the extent that the user's mental image of his protection goals matches the mechanisms he must use, mistakes will be minimized. If he must translate his image of his protection needs into a radically different specification language, he will make errors.

Page 56

Psychological Acceptability Means

• Users won't jump through hoops if they don't understand why such measures are necessary

• Users will take advantage of security that doesn't impede their work, and will undermine it otherwise

Page 57

Example Undermining

A hospital requires all personnel to authenticate with a password to access a computer.

Result: the first person to come in in the morning logged in, and never logged out.

Typical password policies require a password change every month, to a different password each time.

Result: most user passwords then end in a number indicating the version.
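The version-number pattern is easy to spot mechanically at password-change time, when both plaintexts are briefly available (stored hashes would not allow such a comparison). A sketch; the function name and the heuristic itself are invented for illustration:

```python
import re

def looks_like_version_bump(old: str, new: str) -> bool:
    """Heuristic: the new password is the old one with a trailing
    counter incremented (Winter03 -> Winter04), which defeats the
    intent of a monthly-change policy."""
    m_old = re.fullmatch(r"(.*?)(\d+)", old)
    m_new = re.fullmatch(r"(.*?)(\d+)", new)
    if not (m_old and m_new):
        return False
    return (m_old.group(1) == m_new.group(1)
            and int(m_new.group(2)) == int(m_old.group(2)) + 1)

print(looks_like_version_bump("Winter03", "Winter04"))  # True
print(looks_like_version_bump("Winter03", "Summer17"))  # False
```

Rejecting such changes, of course, only treats the symptom; the slide's point is that the policy itself pushes users into this behavior.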

Page 58

Usability Design Principles

• Designing usable security is a new science

• This is not an excuse not to try

Page 59

Page 60

Security by default

• Provide basic installation and maintenance for all security critical services

• Perform compatibility tests to ensure they do not get in the user’s way

• Ideally, users are aware of what is in place, but don’t need to operate it at all

Page 61

Decisions

• Don’t push unnecessary decisions on the user.

– Designers are usually more competent to decide

– Users need to get work done, thus will usually say yes

Page 62

Decisions

Understandable choices instead of dilemmas

Page 63

User self-auditing

• Provide simple mechanisms for users to assist with security

• Users can audit their own activity:

– Your last login was at 12:29 PM on Dec 12, 2010 from yourmachine.cs.ru.nl; you logged in 17 times from there last month

• Users will audit their own activity a lot more aggressively than you will
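A banner like the one quoted above can be generated from a login log. A minimal sketch, assuming a hypothetical in-memory log; a real system would read wtmp or an authentication database:

```python
# Sketch of a self-audit banner. The log structure and its sample
# entries are illustrative assumptions.
from datetime import datetime

login_log = [  # (timestamp, source host)
    (datetime(2010, 11, 20, 9, 5), "yourmachine.cs.ru.nl"),
    (datetime(2010, 12, 1, 8, 57), "yourmachine.cs.ru.nl"),
    (datetime(2010, 12, 12, 12, 29), "yourmachine.cs.ru.nl"),
]

def audit_banner(log):
    """Summarize the most recent login and how often that host appears."""
    last_time, last_host = log[-1]
    n_from_host = sum(1 for _, host in log if host == last_host)
    return (f"Your last login was at {last_time:%I:%M %p on %b %d, %Y} "
            f"from {last_host}; you logged in {n_from_host} times "
            f"from there recently.")

print(audit_banner(login_log))
```

The design point is that the banner costs the system almost nothing, while the user is the one party who can instantly tell whether that login was really theirs.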

Page 64

Abandon pre-historic technologies

• Text passwords were designed for mainframe operators

• Random text strings are about the worst thing for a human brain to remember

– Graphic passwords

– Key items

– Biometrics

Page 65

Don’t undermine users that got it right

Your own sites shouldn’t require poor security behavior from users in order to be usable

Page 66

• Don’t contradict your own messages

Page 67

Indications that this is a phishing site

• No HTTPS for login/payment

• Wrong domain (not .mcafee.com)

• The link was sent in an email
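The first two indicators are mechanical and can be checked in code; the third (the link arrived by email) is context a checker cannot see. A sketch; the function name and rule set are illustrative assumptions, not a complete phishing detector:

```python
# Check a URL against the two mechanical indicators above:
# missing HTTPS, and a hostname outside the expected domain.
from urllib.parse import urlparse

def phishing_indicators(url: str, expected_domain: str = "mcafee.com"):
    parts = urlparse(url)
    findings = []
    if parts.scheme != "https":
        findings.append("no https")
    host = parts.hostname or ""
    # Accept the domain itself or any subdomain of it; anything else
    # (including lookalikes such as mcafee.secure-payments.example) fails.
    if host != expected_domain and not host.endswith("." + expected_domain):
        findings.append("wrong domain")
    return findings

print(phishing_indicators("http://mcafee.secure-payments.example/login"))
# -> ['no https', 'wrong domain']
print(phishing_indicators("https://www.mcafee.com/login"))
# -> []
```

Note the suffix check is on the parsed hostname, not the raw string: a naive `"mcafee.com" in url` test would pass the lookalike domain above.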

Page 68

Security Compromise

Password protection in an industrial control unit:

Client: Send_Command

Unit: Password required. Password is “Topsecret”

Client: Topsecret

Unit: Password accepted, awaiting command
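The flaw in this exchange is that the rejection message itself leaks the password. A minimal simulation; the message formats and the `device` function are assumptions, since the slide only sketches the dialogue:

```python
# Simulate the leaky exchange: the unit's "password required" reply
# includes the very password it is protecting.
DEVICE_PASSWORD = "Topsecret"

def device(request: str, authenticated: bool):
    """Return (response, new auth state) for one request."""
    if request == DEVICE_PASSWORD:
        return "Password accepted, awaiting command", True
    if not authenticated:
        # The fatal flaw: the rejection message reveals the password.
        return f'Password required. Password is "{DEVICE_PASSWORD}"', False
    return f"Executing: {request}", True

auth = False
for msg in ["Send_Command", "Topsecret", "Send_Command"]:
    reply, auth = device(msg, auth)
    print(f"> {msg}")
    print(f"< {reply}")
```

Any client that can talk to the unit at all can authenticate after one failed attempt, so the "protection" only stops operators who mistype nothing.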

Page 69

Security Compromise

Users are usually more concerned with availability than with security.

Sometimes this is actually correct.

System designers need a deep understanding of the workflows to design security that does not get in the way.