
Page 1

Anonymity, Strong Privacy, & Transparency

Too Little or Too Much?

To this point, we’ve been discussing various threats to privacy posed by ICT (government and private sector data collection, data mining, transaction tracking, employee monitoring, etc.).

As Peter Singer suggests, our situation with respect to privacy might be likened to the “inspection principle” proposed by Jeremy Bentham…


Page 2

The Panopticon (1787)

Bentham reckoned, on utilitarian grounds, that prison conditions could and should be improved, for instance through an improved prison design…

• All prisoners may be observed at all times by unseen guards in the central “inspection lodge.”

• Prisoners will come to behave as if they are being watched even when they are not.

• Bentham: So there is no need always to have a guard on duty.

The Threat View

Viewed from the perspective of valuing privacy, the contemporary technological Panopticon may appear truly frightening.

Michel Foucault: “the perfection of power” (Singer, 33)

On the threat view, this seems to be exacerbated by the fact that we voluntarily give away so much personal information (on social media, in our browsing and purchasing habits, etc.).

In short, things look pretty grim…

Page 3

David Friedman: Strong Privacy & Encryption

According to David D. Friedman, however: “The truth is precisely the opposite” (212).

Public Key Encryption (PKE) – used in the context of a highly networked world and in conjunction with technologies such as virtual reality (VR) communications – may promise “a level of privacy never known before.”

Governments and Privacy

Governments are one of the potential privacy villains that recur in Friedman’s account.

Governments in general (and the U.S. government in particular) are increasingly concerned about the possibility of strong, technologically mediated privacy.

Item: The so-called Clipper Chip (1993-97), about which Friedman asserts that “The US government is currently intervening in an attempt not to protect privacy but to prevent it” (212).

Page 4

Crude Data Privacy Technology

If you want to keep information private, one way to do so is to make sure that no one can physically access it: You can put the information in a safe, rent a safe deposit box, or bury it in the ground.

If the information needs to be communicated, you can make sure that no one has physical access to the communication:

“If you are worried about eavesdroppers, check under your eaves—or hold your conversations in the middle of large open spaces” (212)

As we’ve seen, however, relying on such ‘brute force’ (de facto) data privacy protection has become increasingly problematic and, for some purposes, impossible:

Telephones can be tapped, e-mail and text messages can be intercepted or recovered from a server back-up; data records may be created and RFIDs read…all without data subjects being aware.

Page 5

More Sophisticated Approaches

Another (de jure) solution is legislation limiting the ways in which information can be obtained or used. But, of course, rules may not be enforced or, if they are enforced, may not be enforced equitably or fairly.

Yet another way, in use since ancient times: encryption.

Encryption’s (considerable) advantages:

1) Even if the protected information is intercepted, it cannot be read (at least not without a lot of work).

2) It need not require the cooperation or good-will of anyone besides the communicator and the communicatee.

Public Key Encryption (PKE): A Simplified Overview

Public key encryption involves the use of keys (e.g., prime numbers).

Two keys (a key pair) are used: a secret private key and a public key accessible by anyone (which you would publish in some way if you want others to be able to send encrypted communications to you).

The two keys are used together such that a message encrypted with the public key can only be decrypted with the private key (and vice versa).

In the simplest terms, the public key is mathematically related to the private key such that it is very difficult (in terms of time and computing cycles) to determine a private key from its matching public key.
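To make the key-pair mechanics concrete, here is a minimal sketch in Python. It assumes the third-party cryptography package (the slides name no particular library): generate an RSA key pair, encrypt a message with the public key, decrypt it with the private key.

```python
# A minimal sketch of public key encryption with RSA, using the Python
# "cryptography" package (pip install cryptography). Illustrative only;
# real systems wrap this in protocols such as TLS or PGP.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Generate a key pair: the private key stays secret; the public key is published.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Anyone holding the public key can encrypt a message to the key's owner...
message = b"meet me at noon"
ciphertext = public_key.encrypt(
    message,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# ...but only the matching private key can decrypt it.
plaintext = private_key.decrypt(
    ciphertext,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
assert plaintext == message
```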

Page 6

PKE (con’t)

The larger the keys, the more secure the process.

(E.g., in recent years online banking has required a browser that supports 256- or 128-bit SSL encryption vs. the old 56-bit standard. This is essentially a direct reflection of key size: 64 or 32 hexadecimal digits vs. 14.)

Examples of open source / open license PKE:

Phil Zimmermann’s “Pretty Good Privacy” (PGP); GNU Privacy Guard (GnuPG); RetroShare

Besides encrypting communications, PKE technologies can also be used to protect files and to create digital signatures (by means of a certificate authority, a trusted third party which, in effect, guarantees the authenticity of a public key by including it as part of a certificate signed with the authority’s own private key).
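As a companion to the encryption sketch above, here is a hedged illustration of the signature side: the private key signs, anyone with the public key verifies, and a certificate authority performs essentially the same operation when it signs a certificate containing someone else’s public key. Again, the library choice is an assumption.

```python
# A minimal sketch of digital signatures with RSA-PSS, using the Python
# "cryptography" package. Illustrative only; real certificates (X.509) add
# metadata such as names and validity dates around the signed public key.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

signer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
document = b"hypothetical statement to be signed (e.g., someone's public key)"

# Sign with the private key...
signature = signer_key.sign(
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# ...and verify with the matching public key; verify() raises on tampering.
try:
    signer_key.public_key().verify(
        signature, document,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```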

Major political consideration: Should governments be able to access private keys for law enforcement or national security purposes?

The Clipper Chip initiative proposed, essentially, that a “back door” into PKE (in telephones) ought to be required by law.

Page 7

Privacy Beyond Data

As Friedman notes, encryption can make the content of any information exchange effectively impenetrable, but the fact that such an exchange took place can still be monitored.

If you are, say, a criminal, or a human rights advocate working in a hostile political climate, or a corporate whistleblower, this can amount to an effective restriction on your behaviour. (Think, e.g.: Edward Snowden.)

Two additional privacy technologies can help to mitigate this, however: Anonymous Digital Cash and Anonymous Remailers.

Anonymous Digital Cash

First proposed by David Chaum (DigiCash) in the early 1980s. Tried in various forms by a number of firms (NetCash, First Virtual), none of which have survived. (Bitcoin is not anonymous in its basic operation but can be used pseudonymously.)

Digital cash: A payment message bearing a digital signature which, like all money, is both a medium of exchange and a store of value – like a cheque, only with bits instead of paper as a medium.

Unlike regular cash, digital cash is not (yet) legal tender. It is a liability of a bank or a private company (or, in the case of Bitcoin, a community of users), not a state central bank…

Page 8

Non-anonymous digital cash: Easy to store or move large sums, lower transaction costs (as compared to counting, storing physical cash). But it offers no special degree of privacy protection.

Anonymous digital cash, by contrast, is untraceable (a digital cash withdrawal cannot be associated with its subsequent deposit) and unlinkable (it is impossible to associate two different digital cash transactions made by the same person with each other).

Strong privacy perspective: Anonymous digital cash uses encryption to permit payments where neither the payer nor the payee can identify each other.
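How can a bank sign a coin without being able to trace it later? The classic mechanism behind Chaum-style anonymous cash is the blind signature. The toy Python sketch below (tiny illustrative numbers, not a secure implementation, and not necessarily the exact scheme Friedman has in mind) shows the idea: the user blinds the coin, the bank signs the blinded value, and the user unblinds the result, leaving a valid bank signature on a coin the bank has never seen.

```python
# A toy sketch of an RSA blind signature, the building block of Chaum-style
# anonymous digital cash. Parameters are tiny and purely illustrative.
from math import gcd
import random

# Toy bank RSA key (real systems use 2048-bit keys or larger)
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # bank's private exponent

m = 42  # the "coin" the user wants signed (represented as a number mod n)

# 1. User blinds the coin with a random factor r before sending it to the bank.
while True:
    r = random.randrange(2, n)
    if gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# 2. Bank signs the blinded value without ever seeing the coin itself.
blind_sig = pow(blinded, d, n)

# 3. User unblinds: (m * r^e)^d = m^d * r, so dividing out r yields m^d.
sig = (blind_sig * pow(r, -1, n)) % n

# 4. Anyone can check the signature with the bank's public key, but the bank
#    cannot link this signed coin back to the blinded value it signed.
assert pow(sig, e, n) == m % n
print("coin", m, "carries a valid bank signature:", sig)
```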

Banking Jurisdiction

As Friedman notes, a less sophisticated measure – but possibly almost as good as digital cash in practice – would be to deal only with banks in “trustworthy” jurisdictions.

For the truly wealthy this is nothing new (e.g., numbered Swiss bank accounts, which have been around for many years)…

Page 9

Data Havens

For the rest of us, ICT has made off-shore banking much easier to achieve.

In fact, some sovereign states (e.g., Anguilla, Bermuda) have structured their financial regulations in order to take advantage of a growing market for financial “safe havens” and data havens, reachable through the internet.

E.g., Principality of Sealand

Anonymous Remailers

An e-mail server which receives incoming messages, strips off the identity of the sender, and then redirects the message to the recipient indicated.

The recipient sees a message from an arbitrary ID associated with the anonymous remailer, but cannot see the e-mail address of the original sender.

Traffic to and from the server can be monitored, of course, but without access to the server itself there is no easy way to match senders with recipients.
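The core strip-and-forward behaviour is simple enough to sketch. The snippet below is a hypothetical Python illustration (the addresses, the X-Remail-To header, and the local mail server are all assumptions, not details from the slides): it parses an incoming message, discards the sender’s identity, and forwards the content to the intended recipient.

```python
# A hypothetical sketch of an anonymous remailer's strip-and-forward step,
# using Python's standard email and smtplib modules. Real remailers also
# batch, delay, and re-encrypt traffic, and keep no logs.
import smtplib
from email.message import EmailMessage
from email.parser import Parser

REMAILER_ADDRESS = "anon@remailer.example"   # hypothetical ID shown to recipients

def remail(raw_message: str) -> None:
    incoming = Parser().parsestr(raw_message)

    outgoing = EmailMessage()
    outgoing["From"] = REMAILER_ADDRESS        # the original sender's identity is stripped
    outgoing["To"] = incoming["X-Remail-To"]   # hypothetical header naming the real recipient
    outgoing["Subject"] = incoming["Subject"]
    outgoing.set_content(incoming.get_payload())

    with smtplib.SMTP("localhost") as smtp:    # assumes a local mail transfer agent
        smtp.send_message(outgoing)
```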

Page 10

Perhaps the first famous pseudonymous remailer was anon.penet.fi run by Johan Helsingius in Finland (1993-1996).

There have been many others, however, normally run on a voluntary basis by individuals interested in data privacy. This may leave operators vulnerable to police or government pressure to allow access to their server (e.g., via a subpoena), as Helsingius found out.

Friedman: Some weaknesses of anonymous remailers can be overcome by PKE (encrypting the contents of messages and/or nesting the message in encrypted layers, one for each remailer in a chain) and by using multiple remailers. (Cf. Cypherpunk and Mixmaster remailers.)

Social Consequences of Strong Privacy Technologies

1) One problem with anonymous activities in general is that anonymous agents have no reputation to protect.

Friedman: this could be mitigated by the use of cryptographic authentication (digital signatures).

I.e., instead of their actual identity, anonymous agents are known to the internet by their public key. As with brand identity in the marketplace, agents have a disincentive to do anything that would make their public key untrustworthy…

Page 11

But: There is a very low “barrier to re-entry” for agents known only by a pseudonym (e.g., their public key)

Ordinarily, to lose your reputation as a trustworthy member of some community is to ensure that, inter alia, you may be shunned by or excluded from that community.

Moreover, loss of reputation can be a retributive and/or deterrent punishment. You will suffer, to some degree, from the loss of trust and have to work to regain it.

Obtaining a new public key, by contrast, is trivially easy.

2) An advantage of a world of strong privacy, says Friedman, is that in such a world freedom of speech (and, to be sure, economic and political freedom as well) is guaranteed by the technology, not by fallible, corruptible governments and courts.

For some purposes (reporting of human rights abuses, organization of dissident groups) there is evidence that this “freedom defending” role for such technologies has been used to good effect…

Page 12

Privacy vs. Transparency

In fact, a somewhat broader package of ICT phenomena arguably can be said to have made the world a safer, more reasonable, more moral place over the last few decades.

E.g.: The role of satellite television in helping to bring about the breakup of the Soviet Union, Tiananmen Square, the “Arab Spring,” video-taped police interrogations, CCTV, WikiLeaks, etc.

Note, however, that this sort of technology is politically helpful by doing the opposite of what strong privacy technologies do, namely by exposing potential or actual wrongdoing that might otherwise have gone undetected.

3) Perhaps the most obvious (and, for some, the most desirable) consequence of a world of strong privacy is a radical restriction of the powers of government.

For one thing, digital cash, especially when combined with “jurisdiction shopping” and PKE communications, severely constrains the ability of governments to collect taxes…

Page 13

Friedman suggests, plausibly enough, that as strong privacy technologies become more common, the focus of taxation efforts will shift toward goods and services that can be physically observed: food, fuel, housing.

Note that this effectively forces taxation to become more regressive.

A Cryptographic Divide: In such conditions proportionally more taxes will be paid by people who do not have the skills, the tools or the sort of job that would allow them to receive their income as digital cash deposited in some data haven. This could easily exacerbate inequality and/or repression.

Relatedly, strong privacy technologies also “make certain sorts of legal regulation impractical.”

• Regulation pertaining to such things as censorship, trade barriers, etc. might not be missed.

• Copyright law, and restrictions on the use of intellectual property generally, might be a different matter.

• (Regulation of, say, the drug trade or sex work may lie somewhere in the middle.)

Page 14

Friedman describes some schemes for the technological protection of copyright in digital information involving anti-pirating labeling and enhanced contract protection for sellers.

The effectiveness of such schemes might turn out to be extremely limited, however.

E.g.: The MPAA vs. groups like the Masters of Reverse Engineering (MoRE, Norway) concerning the Content Scrambling System (CSS) used in DVD players.