
Page 1: 6. Trust  Negotiations and Trading Privacy for Trust *

6. Trust Negotiations and Trading Privacy for Trust*

Presented by: Prof. Bharat Bhargava
Department of Computer Sciences and Center for Education and Research in Information Assurance and Security (CERIAS), Purdue University

with contributions from Prof. Leszek Lilien
Western Michigan University and CERIAS, Purdue University

* Supported in part by NSF grants IIS-0209059, IIS-0242840, ANI-0219110, and Cisco URP grant.

Page 2: 6. Trust  Negotiations and Trading Privacy for Trust *

Trust Negotiations and Trading Privacy for Trust

Outline
1) Introduction
   1.1) Privacy and Trust
   1.2) The Paradigm of Trust
        Small-World Phenomenon
2) Trust Negotiations
   2.1) Symmetric Trust Negotiations
        a) Privacy-revealing
        b) Privacy-preserving
   2.2) Asymmetric Trust Negotiations
        a) Weaker Building Trust in Stronger
        b) Stronger Building Trust in Weaker
   2.3) Summary: Trading Information for Trust in Symmetric and Asymmetric Trust Negotiations
3) Trading Privacy Loss for Trust Gain
   3.1) Privacy-Trust Tradeoff
   3.2) Proposed Approach
   3.3) PRETTY Prototype for Experimental Studies

Page 3: 6. Trust  Negotiations and Trading Privacy for Trust *

1) Introduction (1)

1.1) Privacy and Trust

Privacy problem: consider computer-based interactions,
- from a simple transaction to a complex collaboration.
- Interactions involve dissemination of private data; it is voluntary, "pseudo-voluntary," or required by law.
- Threats of privacy violations result in lower trust; lower trust leads to isolation and lack of collaboration.

Trust must be established for:
- Data: provide quality and integrity
- End-to-end communication: sender authentication, message integrity
- Network routing algorithms: deal with malicious peers, intruders, security attacks

Page 4: 6. Trust  Negotiations and Trading Privacy for Trust *

1) Introduction (2)

1.2) The Paradigm of Trust

- Trust: a paradigm of security for open computing environments (such as the Web)
  - Replaces or enhances CIA (confidentiality/integrity/availability) as one of the means for achieving security
  - But not as one of the goals of security (as CIA are)
- Trust is a powerful paradigm
  - Well tested in social models of interaction and systems
  - Trust is pervasive: constantly (if often unconsciously) applied in interactions between people / businesses / institutions / animals (e.g., a guide dog) / artifacts (sic! e.g., "Can I rely on my car for this long trip?")
- Trust is able to simplify security solutions by reducing the complexity of interactions among human and artificial system components

Page 5: 6. Trust  Negotiations and Trading Privacy for Trust *

1) Introduction (3)

Small-World Phenomenon

- Small-world phenomenon [Milgram, 1967]
  - Task: find chains of acquaintances linking any two randomly chosen people in the United States who do not know one another (remember the Erdős number?)
  - Result: the average number of intermediate steps in a successful chain is between 5 and 6, hence the "six degrees of separation" principle
- Relevance to security research [Čapkun et al., 2002]
  - A graph exhibits the small-world phenomenon if (roughly speaking) any two vertices in the graph are likely to be connected through a short sequence of intermediate vertices
- Trust is useful because it inherently incorporates the small-world phenomenon

Page 6: 6. Trust  Negotiations and Trading Privacy for Trust *

2) Trust Negotiations

- Trust negotiations establish mutual trust between interacting parties
- Types of trust negotiations [L. Lilien and B. Bhargava, 2006]:
  2.1) Symmetric trust negotiations: partners of "similar strength"
       - Overwhelmingly popular in the literature
  2.2) Asymmetric trust negotiations: a "weaker" and a "stronger" partner
       - Identified by us (as far as we know)

Page 7: 6. Trust  Negotiations and Trading Privacy for Trust *

Symmetric and Asymmetric Trust Negotiations

2.1) Symmetric trust negotiations. Two types:
     a) Symmetric "privacy-revealing" negotiations: disclose certificates or policies to the partner
     b) Symmetric "privacy-preserving" negotiations: preserve the privacy of certificates and policies
     Examples: individual to individual / most B2B / ...

2.2) Asymmetric trust negotiations. Two types:
     a) Weaker Building Trust in Stronger
     b) Stronger Building Trust in Weaker
     Examples: individual to institution / small business to large business / ...

Page 8: 6. Trust  Negotiations and Trading Privacy for Trust *

2.1) Symmetric Trust Negotiations (1)

Two types of symmetric trust negotiations:
a) "Privacy-revealing"
b) "Privacy-preserving"

a) Privacy-revealing symmetric trust negotiations
- Both partners reveal certificates or policies to each other
- Growth of trust:
  - An initial degree of trust is necessary: each partner must trust the other enough to reveal (some) certificates or policies right away
  - Stepwise trust growth in each other as more (possibly private) information about each other is revealed; trust grows in proportion to the number of certificates revealed to each other (a minimal sketch of such an exchange follows below)
  - Eventually "full" mutual trust is established (when the negotiation succeeds); "full" for the task at hand
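To make the stepwise dynamics concrete, here is a minimal sketch of a privacy-revealing exchange. It is our illustration, not the authors' implementation: the Partner class, the certificate names, and the trust measure (fraction of required certificates seen) are all assumptions.

```python
# Minimal sketch (our illustration) of a privacy-revealing symmetric
# negotiation: partners alternately disclose certificates, and each side's
# trust grows in proportion to how many required certificates it has seen.

class Partner:
    def __init__(self, name, certificates, required):
        self.name = name
        self.certificates = list(certificates)  # own, still-undisclosed certificates
        self.required = set(required)           # what we must see from the other side
        self.seen = set()                       # what the other side has revealed

    def receive(self, cert):
        self.seen.add(cert)

    def trust_in_partner(self):
        # Stepwise trust: proportional to the number of required certificates seen.
        return len(self.seen & self.required) / len(self.required)

    def satisfied(self):
        return self.required <= self.seen

def negotiate(a, b):
    """Alternate disclosures until both reach 'full' trust or one runs out."""
    turn, other = a, b
    while not (a.satisfied() and b.satisfied()):
        if not turn.certificates:
            return False                        # negotiation fails
        other.receive(turn.certificates.pop(0))
        print(f"{other.name} trusts {turn.name} at {other.trust_in_partner():.2f}")
        turn, other = other, turn
    return True                                 # 'full' mutual trust for the task

alice = Partner("Alice", ["student_id", "employer_letter"], required={"license"})
bob   = Partner("Bob", ["license"], required={"student_id", "employer_letter"})
negotiate(alice, bob)
```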

Page 9: 6. Trust  Negotiations and Trading Privacy for Trust *

2.1) Symmetric Trust Negotiations (2)

b) Privacy-preserving symmetric trust negotiations
- Both partners preserve the privacy of their certificates and policies
- Growth of trust:
  - Initial distrust: no one wants to reveal any information to the partner
  - No intermediate trust growth (no intermediate degrees of trust are established); instead, a jump "from distrust to trust"
  - Eventually "full" mutual trust is established (when the negotiation succeeds); "full" for the task at hand

Page 10: 6. Trust  Negotiations and Trading Privacy for Trust *

2.2) Asymmetric Trust Negotiations (1)

Weaker and Stronger build trust in each other:
a) Weaker building trust in Stronger "a priori"
   - E.g., a customer looking for a mortgage loan first selects a reputable bank, and only then starts negotiations
b) Stronger building trust in Weaker "in real time"
   - E.g., the bank asks the customer for a lot of private information (incl. personal income and tax data, ...) to establish trust in her

Page 11: 6. Trust  Negotiations and Trading Privacy for Trust *

2.2) Asymmetric Trust Negotiations (2)

a) Weaker Building Trust in Stronger

Means of building trust by Weaker in Stronger (a priori):
- Ask around: family, friends, co-workers, ...
- Check the partner's history and stated philosophy: accomplishments, failures and associated recoveries, ...; mission, goals, policies (incl. privacy policies), ...
- Observe the partner's behavior: trustworthy or not, stable or not, ...
  - Problem: needs time for a fair judgment
- Check reputation databases: Better Business Bureau, consumer advocacy groups, ...
- Verify the partner's credentials: certificates and awards, memberships in trust-building organizations (e.g., BBB), ...
- Protect yourself against the partner's misbehavior: trusted third party, security deposit, prepayment, buying insurance, ...

Page 12: 6. Trust  Negotiations and Trading Privacy for Trust *

2.2) Asymmetric Trust Negotiations (3)

b) Stronger Building Trust in Weaker

Means of building trust by Stronger in Weaker ("in real time"):
- Ask the partner for an anonymous payment for goods or services: cash / digital cash / other
- Ask the partner for a non-anonymous payment for goods or services: credit card / traveler's checks / other
- Ask the partner for specific private information; check the partner's credit history
- The computer authorization subsystem observes the partner's behavior: trustworthy or not, stable or not, ...
  - Problem: needs time for a fair judgment
- The computerized trading system checks the partner's records in reputation databases: eBay, PayPal, ...
- The computer system verifies the partner's digital credentials: passwords, magnetic and chip cards, biometrics, ...
- The business protects itself against the partner's misbehavior: trusted third party, security deposit, prepayment, buying insurance, ...

Note: On the original slide, the means above a blue line preserve anonymity; those below it reveal identity.

Page 13: 6. Trust  Negotiations and Trading Privacy for Trust *

Trust Growth in Asymmetric Trust Negotiations

When/how can the partners trust each other?
- Initially, Weaker has "full" trust in Stronger: Weaker must trust Stronger "fully" to be ready to reveal all the private information required to gain Stronger's "full" trust
- Weaker trades a (degree of) privacy loss for a (degree of) trust gain as perceived by Stronger: the next degree of privacy is "lost" when the next certificate is revealed to Stronger
  - Exception: no privacy loss in the anonymity-preserving example in "Stronger Building Trust in Weaker"
- Eventually, "full" trust of Stronger in Weaker is established when the negotiation completes

Page 14: 6. Trust  Negotiations and Trading Privacy for Trust *

2.3) Summary: Trading Information for Trust in Symmetric and Asymmetric Negotiations

When/how can the partners trust each other?
- Symmetric "revealing": initial degree of trust / stepwise trust growth / establishes "full" mutual trust
  - Trades private information for trust (degree of information privacy varies from 0 to 100%)
- Symmetric "preserving" ("from distrust to trust"): initial distrust / no stepwise trust growth / establishes "full" mutual trust
  - No trading of private information for trust (degree of information privacy varies from 0 to 100%)
- Asymmetric: initial "full" trust of Weaker in Stronger and no trust of Stronger in Weaker / stepwise trust growth of Stronger / establishes "full" trust of Stronger in Weaker
  - Trades private information for trust (degree of information privacy varies from 0 to 100%)

Page 15: 6. Trust  Negotiations and Trading Privacy for Trust *

3) Trading Privacy Loss for Trust Gain

We focus on asymmetric trust negotiations: trading privacy for trust.

Approach to trading privacy for trust [Zhong and Bhargava, Purdue]:
- Formalize the privacy-trust tradeoff problem
- Estimate the privacy loss due to disclosing a credential set
- Estimate the trust gain due to disclosing a credential set
- Develop algorithms that minimize privacy loss for a required trust gain (because nobody likes losing more privacy than necessary)

More details are available.

Page 16: 6. Trust  Negotiations and Trading Privacy for Trust *

Related Work

- Automated trust negotiation (ATN) [Yu, Winslett, and Seamons, 2003]: tradeoff between the length of the negotiation, the amount of information disclosed, and the computation effort
- Trust-based decision making [Wegella et al., 2003]: trust lifecycle management, with considerations of both trust and risk assessments
- Trading privacy for trust [Seigneur and Jensen, 2004]: privacy as the linkability of pieces of evidence to a pseudonym; measured using nymity [Goldberg, thesis, 2000]

Page 17: 6. Trust  Negotiations and Trading Privacy for Trust *


Proposed Approach

A. Formulate the privacy-trust tradeoff problem

B. Estimate privacy loss due to disclosing a set of credentials

C. Estimate trust gain due to disclosing a set of credentials

D. Develop algorithms that minimize privacy loss for required trust gain

Page 18: 6. Trust  Negotiations and Trading Privacy for Trust *

A. Formulate Tradeoff Problem

- Set of private attributes that the user wants to conceal
- Set of credentials:
  - subset of revealed credentials R
  - subset of unrevealed credentials U
- Choose a subset of credentials NC from U such that:
  - NC satisfies the requirements for trust building
  - PrivacyLoss(NC ∪ R) - PrivacyLoss(R) is minimized
(A brute-force sketch of this selection follows below.)
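The sketch below enumerates subsets of U and keeps the cheapest one that meets the trust requirement. It is our illustration, not the project's algorithm: satisfies_trust and privacy_loss are hypothetical callbacks standing in for the metrics defined later, and exhaustive enumeration only fits small credential sets.

```python
# Brute-force sketch (our illustration) of the selection problem above.
from itertools import combinations

def choose_credentials(U, R, satisfies_trust, privacy_loss):
    """Find NC ⊆ U meeting the trust requirement while minimizing
    PrivacyLoss(NC ∪ R) - PrivacyLoss(R)."""
    base = privacy_loss(frozenset(R))
    best_nc, best_cost = None, float("inf")
    for size in range(len(U) + 1):              # try smaller subsets first
        for nc in combinations(U, size):
            if not satisfies_trust(set(nc)):
                continue                        # trust requirement not met
            cost = privacy_loss(frozenset(nc) | frozenset(R)) - base
            if cost < best_cost:
                best_nc, best_cost = set(nc), cost
    return best_nc, best_cost
```

Enumeration is exponential in |U|, so a real implementation would need a smarter search; the sketch only fixes the objective being minimized.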

Page 19: 6. Trust  Negotiations and Trading Privacy for Trust *


Steps B – D of the Approach

B. Estimate privacy loss due to disclosing a set of credentials

Requires defining privacy metrics

C. Estimate trust gain due to disclosing a set of credentials

Requires defining trust metrics

D. Develop algorithms that minimize privacy loss for required trust gain

Includes prototyping and experimentation

-- Details in another lecture of the series --

Page 20: 6. Trust  Negotiations and Trading Privacy for Trust *

3.3) PRETTY Prototype for Experimental Studies

[Architecture diagram: the user application, server application, TERA server, privacy negotiators, trust gain and privacy loss evaluator, and data disseminator, connected by numbered paths. TERA = Trust-Enhanced Role Assignment; (<nr>) marks an unconditional path, [<nr>] a conditional path.]

Page 21: 6. Trust  Negotiations and Trading Privacy for Trust *

Information Flow in PRETTY

1) The user application sends a query to the server application.
2) The server application sends user information to the TERA server for trust evaluation and role assignment.
   a) If a higher trust level is required for the query, the TERA server sends a request for more user credentials to the privacy negotiator.
   b) Based on the server's privacy policies and the credential requirements, the privacy negotiator interacts with the user's privacy negotiator to build a higher level of trust.
   c) The trust gain and privacy loss evaluator selects credentials that will increase trust to the required level with the least privacy loss. The calculation considers the credential requirements and the credentials disclosed in previous interactions.
   d) According to the privacy policies and the calculated privacy loss, the user's privacy negotiator decides whether or not to supply the credentials to the server.
3) Once the trust level meets the minimum requirements, appropriate roles are assigned to the user for execution of the query.
4) Based on the query results, the user's trust level, and the privacy policies, the data disseminator determines: (i) whether to distort the data and, if so, to what degree, and (ii) what privacy-enforcement metadata should be associated with it. (A condensed sketch of this loop follows below.)
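A condensed sketch of steps 1-4 as control flow. This is our paraphrase of the slide, not PRETTY code; every object and method name (tera.required_trust, user_negotiator.select_credentials, disseminator.release, ...) is hypothetical.

```python
# Our paraphrase of PRETTY's information flow; all APIs are hypothetical.
def handle_query(query, user, tera, user_negotiator, disseminator):
    required = tera.required_trust(query)                  # step 2
    while tera.trust_level(user) < required:               # step 2a
        wanted = tera.request_credentials(user, required)
        offered = user_negotiator.select_credentials(wanted)  # steps 2b-2d
        if not offered:
            return None                                    # user declines to trade
        tera.accept_credentials(user, offered)
    role = tera.assign_role(user, query)                   # step 3
    results = role.execute(query)
    return disseminator.release(results, tera.trust_level(user))  # step 4
```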

Page 22: 6. Trust  Negotiations and Trading Privacy for Trust *

References

- L. Lilien and B. Bhargava, "A Scheme for Privacy-Preserving Data Dissemination," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 36, no. 3, May 2006, pp. 503-506.
- B. Bhargava, L. Lilien, A. Rosenthal, and M. Winslett, "Pervasive Trust," IEEE Intelligent Systems, Sept./Oct. 2004, pp. 74-77.
- B. Bhargava and L. Lilien, "Private and Trusted Collaborations," Secure Knowledge Management (SKM 2004): A Workshop, 2004.
- B. Bhargava, C. Farkas, L. Lilien, and F. Makedon, "Trust, Privacy, and Security. Summary of a Workshop Breakout Session at the National Science Foundation Information and Data Management (IDM) Workshop held in Seattle, Washington, September 14-16, 2003," CERIAS Tech Report 2003-34, CERIAS, Purdue University, Nov. 2003. http://www2.cs.washington.edu/nsf2003 or https://www.cerias.purdue.edu/tools_and_resources/bibtex_archive/archive/2003-34.pdf
- "Internet Security Glossary," The Internet Society, Aug. 2004; www.faqs.org/rfcs/rfc2828.html.
- "Sensor Nation: Special Report," IEEE Spectrum, vol. 41, no. 7, 2004.
- R. Khare and A. Rifkin, "Trust Management on the World Wide Web," First Monday, vol. 3, no. 6, 1998; www.firstmonday.dk/issues/issue3_6/khare.
- M. Richardson, R. Agrawal, and P. Domingos, "Trust Management for the Semantic Web," Proc. 2nd Int'l Semantic Web Conf., LNCS 2870, Springer-Verlag, 2003, pp. 351-368.
- P. Schiegg et al., "Supply Chain Management Systems: A Survey of the State of the Art," Collaborative Systems for Production Management: Proc. 8th Int'l Conf. Advances in Production Management Systems (APMS 2002), IFIP Conf. Proc. 257, Kluwer, 2002.
- N.C. Romano Jr. and J. Fjermestad, "Electronic Commerce Customer Relationship Management: A Research Agenda," Information Technology and Management, vol. 4, nos. 2-3, 2003, pp. 233-258.

Page 23: 6. Trust  Negotiations and Trading Privacy for Trust *

End

Page 24: 6. Trust  Negotiations and Trading Privacy for Trust *

Using Entropy to Trade Privacy for Trust

Yuhui Zhong and Bharat Bhargava
{zhong, bb}@cs.purdue.edu
Department of Computer Sciences, Purdue University

This work is supported by NSF grant IIS-0209059.

Page 25: 6. Trust  Negotiations and Trading Privacy for Trust *

Problem Motivation

- Privacy and trust form an adversarial relationship
  - Internet users worry about revealing personal data; this fear held back $15 billion in online revenue in 2001
  - Users have to provide digital credentials that contain private information in order to build trust in open environments like the Internet
- Research is needed to quantify the tradeoff between privacy and trust

Page 26: 6. Trust  Negotiations and Trading Privacy for Trust *

Subproblems

- How much privacy is lost by disclosing a credential?
- How much does a user benefit from having a higher level of trust?
- How much privacy is a user willing to sacrifice for a certain amount of trust gain?

Page 27: 6. Trust  Negotiations and Trading Privacy for Trust *

Proposed Approach

- Formulate the privacy-trust tradeoff problem
- Design metrics and algorithms to evaluate privacy loss, considering:
  - the information receiver
  - the information usage
  - the information disclosed in the past
- Estimate the trust gain due to disclosing a set of credentials
- Develop mechanisms empowering users to trade privacy for trust
- Design a prototype and conduct experimental studies

Page 28: 6. Trust  Negotiations and Trading Privacy for Trust *

Related Work

- Privacy metrics:
  - Anonymity set without accounting for the probability distribution [Reiter and Rubin, '99]
  - Differential entropy to measure how well an attacker estimates an attribute value [Agrawal and Aggarwal, '01]
- Automated trust negotiation (ATN) [Yu, Winslett, and Seamons, '03]: tradeoff between the length of the negotiation, the amount of information disclosed, and the computation effort
- Trust-based decision making [Wegella et al., '03]: trust lifecycle management, with considerations of both trust and risk assessments
- Trading privacy for trust [Seigneur and Jensen, '04]: privacy as the linkability of pieces of evidence to a pseudonym; measured using nymity [Goldberg, thesis, '00]

Page 29: 6. Trust  Negotiations and Trading Privacy for Trust *

Formulation of Tradeoff Problem (1)

- Set of private attributes that the user wants to conceal
- Set of credentials:
  - R(i): subset of credentials revealed to receiver i
  - U(i): credentials unrevealed to receiver i
- Credential set with minimal privacy loss: a subset of credentials NC from U(i) such that:
  - NC satisfies the requirements for trust building
  - PrivacyLoss(NC ∪ R(i)) - PrivacyLoss(R(i)) is minimized

Page 30: 6. Trust  Negotiations and Trading Privacy for Trust *

Formulation of Tradeoff Problem (2)

- Decision problem: decide whether or not to trade privacy for trust
  - Determine the minimal privacy damage; it is a function of the minimal privacy loss, the information usage, and the trustworthiness of the information receiver
  - Compute the trust gain
  - Trade privacy for trust if trust gain > minimal privacy damage (see the sketch of this rule below)
- Selection problem: choose the credential set with minimal privacy loss
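The decision rule reduces to a comparison. A one-function sketch, ours rather than the authors', with privacy_damage standing in for the estimator defined a few slides later:

```python
# Our sketch of the decision problem: trade privacy for trust iff the
# trust gain outweighs the minimal privacy damage.
def should_trade(min_privacy_loss, usage, receiver_trust,
                 trust_gain, privacy_damage):
    damage = privacy_damage(min_privacy_loss, usage, receiver_trust)
    return trust_gain > damage
```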

Page 31: 6. Trust  Negotiations and Trading Privacy for Trust *

Formulation of Tradeoff Problem (3)

- Collusion among information receivers: use a global version Rg instead of R(i)
- Minimal privacy loss for multiple private attributes:
  - nc1 may be better for attr1 but worse for attr2 than nc2
  - A weight vector {w1, w2, ..., wm} corresponds to the sensitivity of the attributes (e.g., salary is more sensitive than a favorite TV show)
  - Privacy loss can be evaluated using:
    - the weighted sum of the privacy losses for all attributes, or
    - the privacy loss for the attribute with the highest weight

Page 32: 6. Trust  Negotiations and Trading Privacy for Trust *

Two Types of Privacy Loss (1)

- Query-independent privacy loss
  - The user determines her private attributes
  - Query-independent loss characterizes how much the provided credentials help an adversary determine the probability density or probability mass function of a private attribute

Page 33: 6. Trust  Negotiations and Trading Privacy for Trust *

Two Types of Privacy Loss (2)

- Query-dependent privacy loss
  - The user determines a set of potential queries Q that she is reluctant to answer
  - The provided credentials reveal information about an attribute set A; Q is a function of A
  - Query-dependent loss characterizes how much the provided credentials help an adversary determine the probability density or probability mass function of Q

Page 34: 6. Trust  Negotiations and Trading Privacy for Trust *

Observation 1

High query-independent loss does not necessarily imply high query-dependent loss (the slide gives an abstract example).

Page 35: 6. Trust  Negotiations and Trading Privacy for Trust *

Observation 2

Privacy loss is affected by the order of disclosure.

Example:
- Private attribute: age
- Potential queries:
  (Q1) Is Alice an elementary school student?
  (Q2) Is Alice older than 50, so she can join a silver insurance plan?
- Credentials:
  (C1) driver's license
  (C2) Purdue undergraduate student ID

Page 36: 6. Trust  Negotiations and Trading Privacy for Trust *

Example (1)

[Figure: graphical version of the example, not reproduced in the transcript.]

Page 37: 6. Trust  Negotiations and Trading Privacy for Trust *

Example (2)

Order C1 → C2:
- Disclosing C1:
  - low query-independent loss (wide range for age)
  - 100% loss for Query 1 (elementary school student)
  - low loss for Query 2 (silver plan)
- Then disclosing C2:
  - high query-independent loss (narrow range for age)
  - zero loss for Query 1 (that privacy was already lost by disclosing the license)
  - high loss for Query 2 ("not sure" becomes "no, with high probability")

Order C2 → C1:
- Disclosing C2:
  - low query-independent loss (wide range for age)
  - 100% loss for Query 1 (elementary school student)
  - high loss for Query 2 (silver plan)
- Then disclosing C1:
  - high query-independent loss (narrow range of age)
  - zero loss for Query 1 (that privacy was already lost by disclosing the ID)
  - zero loss for Query 2

(A toy numeric version of this order effect follows below.)
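To see the order effect numerically, here is a toy computation with made-up probability distributions over age buckets; the numbers are ours, chosen only so that each credential narrows the distribution roughly as the example describes.

```python
# Toy illustration (made-up distributions) of Observation 2: the
# *incremental* entropy loss of a credential depends on what was
# already disclosed.
from math import log2

def entropy(pmf):
    return -sum(p * log2(p) for p in pmf if p > 0)

# Probability mass over age buckets: [<10, 10-17, 18-23, 24-50, >50]
prior      = [0.15, 0.15, 0.20, 0.35, 0.15]
after_c1   = [0.00, 0.10, 0.25, 0.45, 0.20]   # license: rules out young children
after_c2   = [0.00, 0.05, 0.80, 0.10, 0.05]   # student ID: mostly 18-23
after_both = [0.00, 0.00, 0.85, 0.10, 0.05]   # both credentials disclosed

h = {"none": entropy(prior), "c1": entropy(after_c1),
     "c2": entropy(after_c2), "both": entropy(after_both)}
print("C1 then C2:", h["none"] - h["c1"], "then", h["c1"] - h["both"])
print("C2 then C1:", h["none"] - h["c2"], "then", h["c2"] - h["both"])
```

In this toy version the total query-independent loss is the same for both orders; what the order changes is how the loss is split across the two disclosures, and the query-dependent losses built on those increments.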

Page 38: 6. Trust  Negotiations and Trading Privacy for Trust *

Entropy-Based Privacy Loss

- Entropy measures the randomness, or uncertainty, in private data
- When an adversary gains more information, entropy decreases
- The difference shows how much information has been leaked (a worked instance follows below)
- Conditional probabilities are needed for entropy evaluation; Bayesian networks, kernel density estimation, or subjective estimation can be adopted
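A quick worked instance of the entropy difference, with numbers of our choosing:

```latex
% Worked instance (numbers ours): an attribute uniform over 8 values has
% H = log2(8) = 3 bits; a credential narrowing it to 2 equally likely
% values leaves H' = 1 bit, so 2 bits of privacy were lost.
H  = -\sum_{i=1}^{8} \tfrac{1}{8}\log_2\tfrac{1}{8} = 3 \text{ bits}, \qquad
H' = -\sum_{i=1}^{2} \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit}, \qquad
\mathrm{PrivacyLoss} = H - H' = 2 \text{ bits}.
```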

Page 39: 6. Trust  Negotiations and Trading Privacy for Trust *

Estimation of Query-Independent Privacy Loss

- Single attribute:
  - Domain of attribute a: {v1, v2, ..., vk}
  - P_i and P_i* are the probability mass functions before and after disclosing nc, given the already-revealed credential set R:

    $P_i = \Pr(a = v_i \mid R), \qquad P_i^{*} = \Pr(a = v_i \mid R \cup nc)$

    $\mathrm{PrivacyLoss}_a(nc \mid R) = -\sum_{i=1}^{k} P_i \log_2 P_i + \sum_{i=1}^{k} P_i^{*} \log_2 P_i^{*}$

- Multiple attributes:
  - Attribute set {a1, a2, ..., an} with sensitivity vector {w1, w2, ..., wn}:

    $\mathrm{PrivacyLoss}_A(nc \mid R) = \sum_{i=1}^{n} w_i \, \mathrm{PrivacyLoss}_{a_i}(nc \mid R)$

(An implementation sketch follows below.)
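A direct transcription of these formulas into Python, assuming the caller supplies the conditional distributions (e.g., from a Bayesian network, per the previous slide):

```python
# Sketch of the query-independent loss formulas above; the conditional
# pmfs are supplied by the caller.
from math import log2

def entropy(pmf):
    return -sum(p * log2(p) for p in pmf if p > 0)

def privacy_loss_attr(pmf_given_R, pmf_given_R_and_nc):
    """PrivacyLoss_a(nc | R) = H(a | R) - H(a | R ∪ nc)."""
    return entropy(pmf_given_R) - entropy(pmf_given_R_and_nc)

def privacy_loss_attrs(pmfs_R, pmfs_Rnc, weights):
    """Weighted sum over attributes a_1..a_n with sensitivities w_1..w_n."""
    return sum(w * privacy_loss_attr(p, q)
               for w, p, q in zip(weights, pmfs_R, pmfs_Rnc))
```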

Page 40: 6. Trust  Negotiations and Trading Privacy for Trust *

Estimation of Query-Dependent Privacy Loss

- Single query Q:
  - Q is a function f of the attribute set A
  - Domain of f(A): {qv1, qv2, ..., qvk}

    $P_i = \Pr(f(A) = qv_i \mid R), \qquad P_i^{*} = \Pr(f(A) = qv_i \mid R \cup nc)$

    $\mathrm{PrivacyLoss}_q(nc \mid R) = -\sum_{i=1}^{k} P_i \log_2 P_i + \sum_{i=1}^{k} P_i^{*} \log_2 P_i^{*}$

- Multiple queries:
  - Query set {q1, q2, ..., qn} with sensitivity vector {w1, w2, ..., wn}; Pr_i is the probability that q_i is asked:

    $\mathrm{PrivacyLoss}_Q(nc \mid R) = \sum_{i=1}^{n} \mathrm{PrivacyLoss}_{q_i}(nc \mid R) \cdot \Pr{}_i \cdot w_i$
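Under the same assumptions, the query-dependent version applies the attribute-level helper from the previous sketch to the distribution of f(A):

```python
# Query-dependent loss is the same entropy difference, applied to the pmf
# of f(A); reuses privacy_loss_attr from the sketch above. ask_probs[i]
# is Pr_i, the probability that query q_i is asked.
def privacy_loss_queries(pmfs_R, pmfs_Rnc, ask_probs, weights):
    """PrivacyLoss_Q = sum_i PrivacyLoss_{q_i} * Pr_i * w_i."""
    return sum(w * pr * privacy_loss_attr(p, q)
               for w, pr, p, q in zip(weights, ask_probs, pmfs_R, pmfs_Rnc))
```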

Page 41: 6. Trust  Negotiations and Trading Privacy for Trust *

Estimate Privacy Damage

- Assume the user provides one damage function d_usage(PrivacyLoss) for each information usage
- PrivacyDamage(PrivacyLoss, Usage, Receiver) =
  Dmax(PrivacyLoss) × (1 - Trust_receiver) + d_usage(PrivacyLoss) × Trust_receiver
- Trust_receiver is a number ∈ [0, 1] representing the trustworthiness of the information receiver
- Dmax(PrivacyLoss) = Max(d_usage(PrivacyLoss)) over all usages
(A sketch of this formula follows below.)
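A direct sketch of the damage formula; damage_fns, mapping each usage to its user-supplied d_usage function, is our assumed representation:

```python
# Sketch of PrivacyDamage(PrivacyLoss, Usage, Receiver) from the slide.
def privacy_damage(privacy_loss, usage, receiver_trust, damage_fns):
    d_max = max(d(privacy_loss) for d in damage_fns.values())  # worst-case usage
    return (d_max * (1 - receiver_trust)
            + damage_fns[usage](privacy_loss) * receiver_trust)
```

The weighting reads naturally: a receiver with trust 0 is charged the worst-case usage Dmax, while a fully trusted receiver (trust 1) is charged only the declared usage.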

Page 42: 6. Trust  Negotiations and Trading Privacy for Trust *

Estimate Trust Gain

- Increasing the trust level: adopt existing research on trust establishment and management
- Benefit function TB(trust_level): provided by the service provider or derived from the user's utility function
- Trust gain = TB(trust_level_new) - TB(trust_level_prev)
(See the sketch below.)
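A minimal sketch of trust gain; the concave benefit function TB below is purely illustrative, since the slide leaves TB to the service provider or the user's utility function:

```python
# Trust gain as a difference of a benefit function; TB here is a
# hypothetical concave choice (diminishing returns on extra trust).
from math import log

def TB(trust_level):
    return log(1 + 9 * trust_level, 10)   # maps 0.0 -> 0.0 and 1.0 -> 1.0

def trust_gain(new_level, prev_level):
    return TB(new_level) - TB(prev_level)

print(trust_gain(0.8, 0.5))               # benefit of moving from 0.5 to 0.8
```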

Page 43: 6. Trust  Negotiations and Trading Privacy for Trust *

PRETTY: Prototype for Experimental Studies

[Architecture diagram, repeated from the first part of the lecture: user application, server application, TERA server, privacy negotiators, trust gain and privacy loss evaluator, and data disseminator. TERA = Trust-Enhanced Role Assignment; (<nr>) marks an unconditional path, [<nr>] a conditional path.]

Page 44: 6. Trust  Negotiations and Trading Privacy for Trust *

Information Flow for PRETTY

1) The user application sends a query to the server application.
2) The server application sends user information to the TERA server for trust evaluation and role assignment.
   a) If a higher trust level is required for the query, the TERA server sends a request for more user credentials to the privacy negotiator.
   b) Based on the server's privacy policies and the credential requirements, the privacy negotiator interacts with the user's privacy negotiator to build a higher level of trust.
   c) The trust gain and privacy loss evaluator selects credentials that will increase trust to the required level with the least privacy loss. The calculation considers the credential requirements and the credentials disclosed in previous interactions.
   d) According to the privacy policies and the calculated privacy loss, the user's privacy negotiator decides whether or not to supply the credentials to the server.
3) Once the trust level meets the minimum requirements, appropriate roles are assigned to the user for execution of the query.
4) Based on the query results, the user's trust level, and the privacy policies, the data disseminator determines: (i) whether to distort the data and, if so, to what degree, and (ii) what privacy-enforcement metadata should be associated with it.

Page 45: 6. Trust  Negotiations and Trading Privacy for Trust *

Conclusion

- This research addresses the tradeoff between privacy and trust
- The tradeoff problems are formally defined
- An entropy-based approach is proposed to estimate privacy loss
- A prototype is under development for experimental studies