Point-Based Trust: Define How Much Privacy is Worth


Page 1: Point-Based Trust: Define How Much Privacy is Worth

Point-Based Trust: Define How Much Privacy is Worth

Danfeng Yao (Brown University), Keith B. Frikken (Miami University)
Mikhail J. Atallah (Purdue University), Roberto Tamassia (Brown University)

ICICS, December 2006, Raleigh, NC

Funded by NSF IIS-0325345, IIS-0219560, IIS-0312357, and IIS-0242421, ONR N00014-02-1-0364, CERIAS, and Purdue Discovery Park

Page 2: Point-Based Trust: Define How Much Privacy is Worth

Outline of the talk

1. Introduction to privacy protection in authorization
2. Point-based authorization and optimal credential selection
   2.1 New York State Division of Motor Vehicles 6-Point Authentication System
   2.2 Knapsack problem
3. Secure 2-party protocol for the knapsack problem
4. Applications

Page 3: Point-Based Trust: Define How Much Privacy is Worth

Protecting private information

Example: Alice requests a student discount from a service provider.

- Alice holds the credential UID (student ID). Her policy: releasing UID requires the provider's BBB (Better Business Bureau) credential.
- The provider's policy: granting the discount requires UID.
- Message flow: Alice requests the discount; the provider requests UID; Alice requests BBB; the provider sends BBB; Alice sends UID; the provider grants the discount.

Trust negotiation protocols [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03]

Page 4: Point-Based Trust: Define How Much Privacy is Worth

Our goals

- Prevent premature information leaking by both parties
  - Credentials should be exchanged only if the service can be established
- Support cumulative privacy quantitatively
  - Disclosing more credentials should incur a higher privacy loss
- Support a flexible service model
  - Allow customized (or personalized) access policies
  - Adjustable services based on qualifications
- Our ultimate goal is to encourage users to participate in e-commerce

Page 5: Point-Based Trust: Define How Much Privacy is Worth

What can we learn from the New York State DMV?

6-point proof-of-identity requirement for getting a NY driver's license:

Credential             Points
Passport               5
Utility bill           1
Birth certificate      4
Social security card   3

Page 6: Point-Based Trust: Define How Much Privacy is Worth

Another motivation: adjustable services

Membership / Credential   Discount
Mastercard                2%
Airline frequent flier    1%
AAA                       0.5%
Veteran                   0.5%

Adjustable services based on the private information revealed

Page 7: Point-Based Trust: Define How Much Privacy is Worth

Point-based authorization model

- Credential types C_1, C_2, ..., C_n
- The service provider defines:
  - Point values p_1, p_2, ..., p_n of the credentials (private)
  - Threshold T for accessing a resource (private)
- The user defines:
  - Sensitivity scores a_1, a_2, ..., a_n of the credentials (private)

Credential selection problem: the user (or client) wants to satisfy threshold T with the minimum disclosure of privacy.

  Minimize    Σ_{i=1..n} a_i x_i
  Subject to  Σ_{i=1..n} p_i x_i ≥ T

where x_i = 0 means C_i is not disclosed and x_i = 1 means C_i is disclosed.

This can be converted to a knapsack problem.

Page 8: Point-Based Trust: Define How Much Privacy is Worth

Example

The user's sensitivity scores (private to the user):

Credential         Sensitivity score
College ID         10
Driver's license   30
Credit card        50
SSN                100

The provider's point values (private to the provider):

Credential         Point value
College ID         3
Driver's license   6
Credit card        8
SSN                10

Threshold for accessing the resource: 10

Alice's options:

Option                          Point value   Sensitivity score
SSN                             10            100
College ID, Credit card         11            60
Driver's license, Credit card   14            80
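To make the example concrete, here is a minimal non-private sketch (Python, with hypothetical variable names) that enumerates Alice's options by brute force and confirms that disclosing the College ID and Credit card is the cheapest way to reach the threshold. The point of the protocols in the rest of the talk is to obtain this same answer without either party revealing its private values.

```python
from itertools import combinations

# Plain brute-force reference for the credential selection problem:
# minimize disclosed sensitivity subject to reaching the point threshold.
credentials = ["College ID", "Driver's license", "Credit card", "SSN"]
points      = [3, 6, 8, 10]      # provider's private point values p_i
sensitivity = [10, 30, 50, 100]  # user's private sensitivity scores a_i
T = 10                           # provider's private threshold

best = None
for r in range(len(credentials) + 1):
    for subset in combinations(range(len(credentials)), r):
        if sum(points[i] for i in subset) >= T:
            cost = sum(sensitivity[i] for i in subset)
            if best is None or cost < best[0]:
                best = (cost, subset)

cost, subset = best
print("optimal disclosure:", [credentials[i] for i in subset], "score", cost)
# -> optimal disclosure: ['College ID', 'Credit card'] score 60
```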

Page 9: Point-Based Trust: Define How Much Privacy is Worth

Where do points come from?

- Reputation systems [Beth Borcherding Klein 94, Tran Hitchens Varadharajan Watters 05, Zouridaki Mark Hejmo Thomas 05]
- This is future work, but here is an idea

(Diagram: entities that are members of a group evaluate one another to derive point values)

Page 10: Point-Based Trust: Define How Much Privacy is Worth

Converting the credential selection problem into a knapsack problem

Define the binary vector y_1, y_2, ..., y_n, where y_i = 1 - x_i.

  {a_i}: private to the user
  {p_i}: private to the provider

  Maximize    Σ_{i=1..n} a_i y_i
  Subject to  Σ_{i=1..n} p_i y_i ≤ T',  where T' = Σ_{i=1..n} p_i - T

This is a 0/1 knapsack instance with a bag of size T' (pictured for n = 6 items): what to pick and steal?
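Continuing with the same example values, a small sketch (again hypothetical and non-private) of the conversion: T' = 3 + 6 + 8 + 10 - 10 = 17, and maximizing the sensitivity of the credentials kept undisclosed over a bag of size T' yields the same minimum disclosure of 60.

```python
from itertools import combinations

points      = [3, 6, 8, 10]
sensitivity = [10, 30, 50, 100]
T = 10
T_prime = sum(points) - T   # 17, the "marginal threshold"

# Keep the most sensitive credentials undisclosed, subject to their points fitting in T'.
best_kept = max(
    (s for r in range(len(points) + 1) for s in combinations(range(len(points)), r)
     if sum(points[i] for i in s) <= T_prime),
    key=lambda s: sum(sensitivity[i] for i in s),
)
min_disclosed_score = sum(sensitivity) - sum(sensitivity[i] for i in best_kept)
print(T_prime, sorted(best_kept), min_disclosed_score)   # 17 [1, 3] 60
```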

Page 11: Point-Based Trust: Define How Much Privacy is Worth

Dynamic programming for the knapsack problem

Dynamic programming for the 0/1 knapsack problem: construct an n-by-T' table M, where

  M[i, j] = M[i-1, j]                                  if j < p_i
  M[i, j] = max{ M[i-1, j], M[i-1, j - p_i] + a_i }    if j ≥ p_i

with T' = Σ_{i=1..n} p_i - T. Each entry M[i, j] depends only on M[i-1, j] and M[i-1, j - p_i] in the previous row.

  {a_i}: private to the user
  {p_i}: private to the provider
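For reference, a plain single-party implementation of this recurrence (hypothetical function name, reusing the earlier example values); the protocol described on the following slides computes the same table jointly, with the entries kept encrypted.

```python
def knapsack_table(points, sensitivity, T_prime):
    n = len(points)
    # M[i][j] = best total sensitivity of undisclosed credentials among the
    # first i credentials, using at most j points of capacity.
    M = [[0] * (T_prime + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        p, a = points[i - 1], sensitivity[i - 1]
        for j in range(T_prime + 1):
            if j < p:
                M[i][j] = M[i - 1][j]
            else:
                M[i][j] = max(M[i - 1][j], M[i - 1][j - p] + a)
    return M

points, sensitivity, T = [3, 6, 8, 10], [10, 30, 50, 100], 10
T_prime = sum(points) - T
M = knapsack_table(points, sensitivity, T_prime)
print(M[len(points)][T_prime])   # 130 -> disclosed score is 190 - 130 = 60
```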

Page 12: Point-Based Trust: Define How Much Privacy is Worth

Overview of the privacy-preserving knapsack computation

- Uses a 2-party maximization protocol [Frikken Atallah 04]
- Uses a homomorphic encryption scheme: E(x)E(y) = E(x + y) and E(x)^c = E(xc)
- Preserves privacy for both parties
- Two phases: table-filling and traceback

  M[i, j] = max{ M[i-1, j], -∞ + a_i }                 if j < p_i
  M[i, j] = max{ M[i-1, j], M[i-1, j - p_i] + a_i }    if j ≥ p_i

A maximization and an addition of a_i are performed in both cases, to make the two computation procedures indistinguishable.
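The protocol only needs the two properties listed above from the encryption scheme. As an illustration only, here is a toy Paillier-style scheme (tiny fixed primes, no security, not the implementation used in the paper) that exhibits E(x)E(y) = E(x + y) and E(x)^c = E(xc).

```python
import math, random

# Toy additively homomorphic (Paillier-style) encryption. Insecure demo only:
# the primes are tiny and fixed; a real deployment would use a vetted library.
p, q = 1009, 1013
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    # with g = n + 1, g^m = 1 + m*n (mod n^2)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    u = pow(c, lam, n2)
    return ((u - 1) // n) * mu % n

x, y, c = 42, 1000, 7
assert decrypt(encrypt(x) * encrypt(y) % n2) == x + y   # E(x)E(y) = E(x + y)
assert decrypt(pow(encrypt(x), c, n2)) == x * c         # E(x)^c = E(xc)
print("homomorphic properties hold")
```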

Page 13: Point-Based Trust: Define How Much Privacy is Worth

Preliminary: 2-party maximization protocol in a split format

Each candidate value is additively split between the two parties: the first candidate is Alice1 + Amazon1 and the second is Alice2 + Amazon2.

Player   Input              Output                  Privacy
Alice    Alice1, Alice2     Alice's share of max*   Neither party learns which candidate is the max
Amazon   Amazon1, Amazon2   Amazon's share of max*

* Alice's share + Amazon's share = max(Alice1 + Amazon1, Alice2 + Amazon2)

Comparison can be done similarly [Frikken Atallah 04]
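To show just the input/output behavior of this building block (not its cryptography), here is a hypothetical sketch in which a stand-in "ideal functionality" computes the split maximum; the Frikken-Atallah protocol achieves the same result interactively, without any party seeing both inputs.

```python
import random

def ideal_split_max(alice_shares, amazon_shares):
    """Stand-in for the 2-party maximization protocol: returns fresh additive
    shares of max over k of (alice_shares[k] + amazon_shares[k]), without
    revealing which k attained the maximum. In the real protocol no single
    party ever sees both input vectors like this."""
    m = max(a + b for a, b in zip(alice_shares, amazon_shares))
    alice_out = random.randrange(0, 1 << 32)   # random share for Alice
    amazon_out = m - alice_out                 # Amazon's share completes the sum
    return alice_out, amazon_out

alice = [17, -4]    # Alice's shares of the two candidates
amazon = [3, 40]    # Amazon's shares of the two candidates
a_out, b_out = ideal_split_max(alice, amazon)
assert a_out + b_out == max(17 + 3, -4 + 40)   # == 36
print("shares recombine to the maximum:", a_out + b_out)
```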

Page 14: Point-Based Trust: Define How Much Privacy is Worth

Our protocol for dynamic programming of the 0/1 knapsack problem

- Computed entries are encrypted and stored by the provider
- For each entry, the provider splits the two candidates of M[i, j] (derived from E(M[i-1, j]) and E(M[i-1, j - p_i]), with a_i added to the second)
- The client and the provider engage in a 2-party private maximization protocol to compute the maximum
- The client encrypts her share of the maximum and sends it to the provider
- The provider computes and stores the encrypted M[i, j]

  M[i, j] = max{ M[i-1, j], -∞ + a_i }                 if j < p_i
  M[i, j] = max{ M[i-1, j], M[i-1, j - p_i] + a_i }    if j ≥ p_i

Page 15: Point-Based Trust: Define How Much Privacy is Worth

Our protocol for knapsack (cont'd)

At the end of the 2-party dynamic programming, the provider has an n-by-T' table of encrypted entries E(M[i, j]) (pictured for n = 4 credentials and T' = 5), where T' = Σ_{i=1..n} p_i - T.

How does the client find out the optimal selection of credentials?

Page 16: Point-Based Trust: Define How Much Privacy is Worth

Traceback protocol: get the optimal credential selection

- Security in a semi-honest (honest-but-curious) model
- Each table entry stores a pair (E(M[i, j]), E(F[i, j])), where the flag F[i, j] ∈ {0, 1} records whether credential i is used to obtain the value M[i, j]
- Following the flags backward from the last entry recovers the optimal credential selection
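For intuition, the plaintext analogue of this traceback (hypothetical function name, reusing the earlier example values): a flag table F is filled alongside M, and a backward walk over the flags recovers the optimal selection. The protocol performs the equivalent walk over the encrypted pairs.

```python
def knapsack_with_flags(points, sensitivity, T_prime):
    n = len(points)
    M = [[0] * (T_prime + 1) for _ in range(n + 1)]
    F = [[0] * (T_prime + 1) for _ in range(n + 1)]  # F[i][j] = 1 if item i is used
    for i in range(1, n + 1):
        p, a = points[i - 1], sensitivity[i - 1]
        for j in range(T_prime + 1):
            M[i][j] = M[i - 1][j]
            if j >= p and M[i - 1][j - p] + a > M[i][j]:
                M[i][j] = M[i - 1][j - p] + a
                F[i][j] = 1
    # Traceback: walk backward from the last entry, following the flags.
    kept, j = [], T_prime
    for i in range(n, 0, -1):
        if F[i][j]:
            kept.append(i - 1)
            j -= points[i - 1]
    return M[n][T_prime], sorted(kept)

points, sensitivity, T = [3, 6, 8, 10], [10, 30, 50, 100], 10
value, kept = knapsack_with_flags(points, sensitivity, sum(points) - T)
disclosed = [i for i in range(len(points)) if i not in kept]
print(value, kept, disclosed)   # 130 [1, 3] [0, 2] -> disclose College ID and Credit card
```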

Page 17: Point-Based Trust: Define How Much Privacy is Worth

Security and efficiency of our privacy-preserving knapsack computation

- Informally, security means that private information is not leaked
- Security definitions:
  - Semi-honest adversarial model
  - A protocol securely implements a function f if the views of the participants are simulatable from an ideal implementation of the protocol

Theorem. The basic protocol of the private two-party dynamic programming computation in the point-based trust management model is secure in the semi-honest adversarial model.

Theorem. The communication complexity between the provider and the client of our basic secure dynamic programming protocol is O(nT'), where n is the total number of credentials and T' is the marginal threshold.

Page 18: Point-Based Trust: Define How Much Privacy is Worth

Fingerprint protocol: an improved traceback protocol

We want to exclude the provider from the traceback, to prevent tampering and reduce costs.

1. Fill the knapsack table
2. The provider sends the (encrypted) last entry to the client
3. The client decrypts it and identifies the optimal credential selection

The fingerprint protocol is a general solution for traceback in dynamic programming.

Page 19: Point-Based Trust: Define How Much Privacy is Worth

Fingerprint protocol (cont'd)

Each privacy score is transformed by appending an n-bit item indicator (one bit per item, with item i's bit set) to its binary representation:

Item No.   Privacy score (decimal)   Privacy score (binary)   Transformed score
1          2                         010                      010 0001
2          3                         011                      011 0010
3          5                         101                      101 0100
4          8                         1000                     1000 1000

The low n bits of the knapsack result then reveal which items are in the optimal knapsack:

Knapsack result (decimal)   Knapsack result (binary)   Item numbers in the knapsack
3                           ... 0010                   2
20                          ... 1111                   1, 2, 3, 4
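A sketch of the transformation (the pairing with the earlier example's point values is illustrative, not taken from this slide): each score is shifted left by n bits and tagged with a one-hot item indicator, so the low n bits of the decrypted optimum directly name the selected items and the high bits carry the original optimum value.

```python
def transform(scores):
    # Shift each score left by n bits and set a one-hot indicator for its item.
    n = len(scores)
    return [(a << n) | (1 << i) for i, a in enumerate(scores)]

def knapsack_max(points, values, cap):
    # Standard single-row 0/1 knapsack maximization.
    row = [0] * (cap + 1)
    for p, v in zip(points, values):
        for j in range(cap, p - 1, -1):
            row[j] = max(row[j], row[j - p] + v)
    return row[cap]

points      = [3, 6, 8, 10]
sensitivity = [10, 30, 50, 100]
T = 10
n = len(points)

result = knapsack_max(points, transform(sensitivity), sum(points) - T)
kept_mask  = result & ((1 << n) - 1)   # low n bits: which items are kept private
best_value = result >> n               # high bits: optimal original value
kept = [i for i in range(n) if kept_mask >> i & 1]
print(best_value, kept)                # 130 [1, 3] -> disclose items 0 and 2
```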

Page 20: Point-Based Trust: Define How Much Privacy is Worth

Application of point-based authorization: fuzzy location queries in presence systems

(Diagram: Alice's boss, Alice's mom, and Alice's ex each ask "Where is Alice?")

Page 21: Point-Based Trust: Define How Much Privacy is Worth

Related work

- Hidden credentials [Bradshaw Holt Seamons 04, Frikken Li Atallah 06]
- Private policy negotiation [Kursawe Neven Tuyls 06]; optimizing trust negotiation [Chen Clarke Kurose Towsley 05]; trust negotiation protocols/frameworks [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03, Li Li Winsborough 05]
- Anonymous credential approaches [Chaum 85, Camenisch Lysyanskaya 01]
- Secure multiparty computation [Atallah Li 04, Atallah Du 01]
- OCBE [Li Li 06]
- MANET [Zouridaki Mark Hejmo Thomas 05]
- Platform for Privacy Preferences (P3P) [W3C]

Page 22: Point-Based Trust: Define How Much Privacy is Worth

Conclusions and future work

- Our point-based model allows a client to choose the optimal selection of credentials
- We presented a private 2-party protocol for the knapsack problem
- Our fingerprint protocol is a general solution for traceback in dynamic programming
- Future work
  - Add typing to credentials
  - Reputation systems and points