TRANSCRIPT
Camp, presenting for Camp, McGrath, Genkina
Security & Morality: A Tale of User Deceit
from: Models of Trust on the Web
Edinburgh, UK
May 2006
L. Jean Camp, C. McGrath, A. Genkina
abbreviated for the
PETS Workshop
Rump Session
29 June 2006
Security & Morality: A Tale of User Deceit?
• Hypotheses about human trust behavior developed from social science
• Compared with implicit assumptions in common technical mechanisms
• Test computer-human trust behaviors
• FOCUS
– Exploration of security and privacy as human trust technologies
Experimental Definition of Trust
• Coleman’s Three-Part Test
– enables something not otherwise possible
– the individual who trusts is worse off if the trusted party acts in an untrustworthy manner
– individuals who trust are better off if the trusted party acts in a trustworthy manner
– there is no constraint placed on the trusted party
– a time lag exists between the decision to trust and the outcome
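As an illustrative sketch (not part of the talk), the conditions above can be encoded as a predicate over a candidate interaction; all names and the examples are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TrustInteraction:
    enables_otherwise_impossible: bool   # trust enables something not otherwise possible
    truster_worse_off_if_betrayed: bool  # truster loses if the trusted party is untrustworthy
    truster_better_off_if_honored: bool  # truster gains if the trusted party is trustworthy
    trustee_unconstrained: bool          # no constraint is placed on the trusted party
    time_lag_before_outcome: bool        # delay between the decision to trust and the outcome

def is_trust(i: TrustInteraction) -> bool:
    """True only when every condition of the experimental definition holds."""
    return all([
        i.enables_otherwise_impossible,
        i.truster_worse_off_if_betrayed,
        i.truster_better_off_if_honored,
        i.trustee_unconstrained,
        i.time_lag_before_outcome,
    ])

# Sharing personal data with an unfamiliar website meets every condition:
print(is_trust(TrustInteraction(True, True, True, True, True)))   # True
# A fully constrained, enforceable exchange fails the "no constraint" condition:
print(is_trust(TrustInteraction(True, True, True, False, True)))  # False
```

Under this definition, any constrained or instantaneous exchange is risk management rather than trust.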
Two Hypotheses
• Do humans respond differently to human or computer "betrayals" in terms of forgiveness?
• Do people interacting with a computer distinguish between computers as individuals, or respond to their experience with "computers" in general?
– Does the tendency to differentiate between remote machines increase with computer experience?
H1: Response to Failure
• Do humans respond differently to human or computer "betrayals" in terms of forgiveness?
– Attacks that are viewed as failures are ‘ignored’ or forgiven
– Technical failures are seen as accidents rather than design decisions
• May explain why people tolerate repeated security failures
H2: Differentiation
• When people interact with networked computers, they discriminate among distinct computers (hosts, websites), treating them as distinct entities, particularly in their readiness to extend trust and secure themselves from possible harms.
• People become more trusting over time
• People differentiate more, not less, with experience
• Do people learn to differentiate or to trust?
– “educate the user” may not work
The Experiment
• Developed three “life management” websites
• Elephantmine.com
• Reminders.name
• MemoryMinder.us
Initial Tests
• What information would you share with each site?
• Do you trust the site?
– user-defined trust; no macro definition given
• Rejected MemoryMinder.us
– people dislike lime green?
• Other two designs had similar evaluations
Two “Betrayal” Types
• One group faced a technical betrayal
– Another person’s data is displayed: “John Q. Wilson”
– DoB, credit card number, social network data
• One group faced a moral betrayal
– Change in privacy policy announced
– Collection of third-party information correlated with compiled data
• a very common policy
• eBay, Facebook, MySpace
Three Step Process
• Users introduced to the first site
– Sites presented in the same order
• Users experience a betrayal
– Half the users had a technical failure
– Half had a privacy policy change
– Both sets of users experience the failure upon departure from the first site
• Then users go to the second site
Findings: Differentiation
• Users respond to the first site’s betrayal with a significant change in behavior with respect to the second site
– users had, on average, seven years of Internet experience
– computer experience was not at all significant
– the second site was not seen as a “new” entity
• Cannot support the hypothesis that users differentiate
– users do not enter each transaction with a new calculation of risk
Findings: Betrayal Type
• Stronger reaction to the privacy change
– Yet the technical failure indicated an inability to protect privacy
Willingness to share, before/after betrayal:

                             “Malevolence” (Privacy)      “Incompetence” (Security)
                             Before   After               Before   After
Your IM Buddy List           22%      9%    (p < .001)    16%      13%   (p < .001)
Coworkers’ Names & Contact   44%      31%   (p < .01)     42%      52%
Friends’ Names & Contact     53%      34%   (p < .001)    65%      68%
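A hedged sketch of how before/after significance levels like those in the table can be checked with a two-proportion z-test. The sample size n is hypothetical (the slides do not report it); the percentages are the “Friends’ Names & Contact” privacy row (53% before, 34% after):

```python
import math

def two_proportion_z(p1: float, p2: float, n1: int, n2: int):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal CDF via the error function; two-tailed p-value
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# n = 100 per group is an ASSUMPTION for illustration only
z, p = two_proportion_z(0.53, 0.34, 100, 100)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at p < .01 with this hypothetical n
```

The reported significance level depends on the actual sample size, so the computed p-value here only illustrates the mechanics, not the study’s result.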
Differentiation
• The tendency to differentiate between remote machines decreases with computer experience
– More use results in more lumping
• Design for better lumping
– Explains common logins/passwords
• along with cognitive limits
• “My Internet is Down”
• Need explicit DO NOT TRUST signals
• If you want security to work for users, study privacy!