
Robert H. Sloan

1.  General overview
2.  General Principles and Security Policies
3.  Overview of common attack types
◦  Technical and other traditional
◦  Psychology-based

•  Name of single most prestigious conference in the technical area also known as computer security:
•  IEEE Symposium on Security and Privacy
•  Held annually in Oakland since 1980
•  I’ve verified it’s had this name since at least 1988, and I think from the start.

•  What we expect at home. Our issues there:
•  Burglars &c. MALICE
•  Natural disaster MISCHANCE
•  Set house on fire cooking ERROR
•  Police visit GOVERNMENT
•  Unwanted commercial intrusion

•  Grew up w/crypto; now related but distinct

 Opaque walls
 Window coverings
 Locks
 Child-safe stove knobs
 Many special issues vis-à-vis government (4th Amendment)
 Some special issues vis-à-vis commercial speech (1st Amendment?)

 Many security issues have economic tradeoffs:

 Almost always want medium-plus strength deadbolt lock on your doors

 Whether more expensive superb door lock depends on existence of 1st floor window locks, alarms, etc., and neighborhood

 Wherever we are critically dependent on computer systems (working correctly)
◦  Business environment: legal compliance, profitability, IP, cash flow, image
◦  Military environment
◦  Medical environment
◦  Households: privacy, burglar alarms, correct billing
◦  Society at large: utilities, tax system, etc.

 Many domains have their own special security issues:
◦  Banks
◦  Military base
  Military base with nukes
◦  Hospitals
◦  Installations w/classified materials

 Anderson’s Security Engineering
◦  Overview in Chapter 1
◦  Intro to psychological/human factors/social engineering in Chapter 2 (attacks)
◦  Sundry management issues including risk management in 25.1–25.2
◦  Missing: nuts-and-bolts stuff of entry business technology major with a certificate

 Key players:
◦  Organization: entity whose security is being protected
◦  Attackers: entities intentionally trying to “get past” security

 Colors:
◦  White hats: organization’s security guardians
◦  Black hats: attackers
◦  Red team: simulated attack to test defenses

“A computer is secure if you can depend on it and its software to behave as you expect.”

—Garfinkel and Spafford

Though a computer scientist probably wants to pull out correctness issues:

  Bugs in software
  Misbehavior by trusted individuals beyond what was contemplated (i.e., untrustworthy trusted individuals)

 Computer Security: the science of managing malicious intent and behavior that involves information and communications technology.

 Malicious behavior can include:
◦  Fraud, theft
◦  Vandalism
◦  Terrorism, warfare, espionage, sabotage
◦  Spam

1.  People
2.  Trust
3.  Limiting Trust
  Traditional names of parties: Alice, Bob, Charlie
  People are very complicated, hard/impossible to model:
◦  What will Alice do?
◦  How does Bob constrain Alice’s behavior?

  Confidentiality—secrecy, control of information flow. Preventing the unauthorized release of information.

  Integrity—Preventing the unauthorized alteration of information.

  Availability—Keep system available for use. Preventing denial of use to those authorized to use system.

  Inverse: Disclosure, Alteration (by hacker or head crash), Denial

  My credit card: confidentiality—but not my photos
  My files: integrity for all; confidentiality for many
  Ability to append to class blog limited to those in the class
  System available 99.8% of time 6:30 a.m. Eastern to 1:00 a.m. Pacific, and at least 70% rest of week; mean response time < 1 sec; 99% of responses < 5 sec
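An availability target like the one above is mechanically checkable. A minimal Python sketch (the thresholds come from the slide; the function name and the nearest-rank percentile choice are my own assumptions):

```python
import math

# Check measured response times against the example SLA:
# mean response time < 1 sec and 99% of responses < 5 sec.

def meets_response_sla(samples_sec, mean_limit=1.0, p99_limit=5.0):
    """Return True if the samples satisfy both response-time targets."""
    if not samples_sec:
        return True  # no traffic: vacuously satisfied
    mean = sum(samples_sec) / len(samples_sec)
    ordered = sorted(samples_sec)
    # 99th percentile by the nearest-rank method
    idx = min(len(ordered) - 1, math.ceil(0.99 * len(ordered)) - 1)
    return mean < mean_limit and ordered[idx] < p99_limit
```

The point is simply that a policy stated this precisely can be monitored automatically, which vaguer policies cannot.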

 Data protection/personal data privacy: fair use and collection of personal info

 Anonymity/untraceability
 Copy protection, information flow control
 Unlinkability: ability to use a resource multiple times without others being able to link the uses together (cookies work against this)

 Rollback: ability to return to a well-defined earlier state (backup)

 Authenticity: verification of the claimed identity of a communication partner

 Non-repudiation: origin and/or receipt of message cannot be denied in front of third party

 Audit: monitoring user-initiated events
  Intrusion detection

  Vulnerability: System property—weakness.
◦  E.g., Windows machine w/no antivirus software.
  Threat: External agent (or malicious insider?) that might exploit a vulnerability.
◦  E.g., any given virus.
  Vulnerability + Threat = RISK
  Risk = potential; security failure = actual violation of security policy

  Much of literature speaks blithely of “subjects,” the active entities.
  But 2 sets of thorny issues re exactly what a subject is:
1.  Do I mean Alice the human I trust, or a logon session of Alice, or a process of Alice?
2.  And if Alice gave Bob her UIC & password and he logged in as Alice?
  Rich source of subtle bugs

  Subject = Physical person
  Person = Physical or legal (i.e., company/corp)
  Principal = Any one entity
  Group = Set of principals
  Role = Function assumed by 1 or more persons

1.  Single component or product
a)  Stand-alone computer
  Historically important; still a (the?) basic building block requiring high trustworthiness
b)  Any other single component or product
  E.g., one smart card or one cryptographic protocol
2.  Networked computers
a)  Where one still knows all entities & connections
  Organization’s LAN
b)  Distributed systems: ≥2 entities with widely varying levels of trust

  IT staff, internal users (staff, management), external users (public)

 Then broad environment:
◦  Media, regulators, politicians
◦  Competitors
◦  Malicious teens from Romania
◦  Malicious profit-driven mobsters based in Russia
◦  Chinese government military/intelligence
◦  . . .

 Access control supports confidentiality and/or integrity by putting specific barriers in place to limit certain principals’ (but maybe the goal is persons??) access to specific objects (data).
◦  E.g., who can read/write what file

 More later.
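As a preview, access control is often pictured as an access matrix mapping (principal, object) pairs to permitted operations. A minimal Python sketch (principals, objects, and permissions here are invented examples, not from the slides):

```python
# Toy access matrix: (principal, object) -> set of permitted operations.
ACCESS = {
    ("alice", "grades.txt"): {"read", "write"},
    ("bob",   "grades.txt"): {"read"},
}

def is_allowed(principal, obj, operation):
    """Return True iff the principal may perform operation on obj.
    Unknown (principal, object) pairs default to no access."""
    return operation in ACCESS.get((principal, obj), set())
```

Note the default-deny design: any pair not listed gets the empty permission set, which matches the principle of least privilege discussed below.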

  Security is never absolute, and security costs money

 Goal is enough security that:
◦  Cost-benefit ratio makes sense
◦  Always achieved once info sec is no longer the weakest link

 Principle of least privilege: subject should have minimum permissions necessary to accomplish subject’s tasks.

  Idea: physical security of a high-value destination, e.g., a military installation with nukes, will have:

•  Fences •  Alarms •  Locks •  Armed Guards

•  Big organization such as a university has (we hope!) something like:
•  One connection to outside world (“border router”), leading directly to a firewall
•  Split of traffic into traffic for sensitive systems versus non-sensitive systems (DMZ), each with its own intrusion detection systems

•  (AKA separation of privileges)
•  Examples:
•  Protocol for 2 physically separated people to turn keys to launch nukes
•  Creation of account and assigning administrative privileges to new account must be done by two different users

•  Finalizing transaction above threshold dollar amount requires two to authorize it

•  Access to physical location of sensitive servers requires 2 trusted subjects present
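The account-creation example above reduces to checking that two distinct users participate. A minimal sketch (function and user names are hypothetical):

```python
# Two-person ("separation of duties") rule for granting admin rights:
# the creator and the approver must be different users.

def grant_admin(created_by, approved_by):
    """Grant admin rights only if two distinct users participate."""
    if created_by == approved_by:
        raise PermissionError("two different users required")
    return f"admin granted (created by {created_by}, approved by {approved_by})"
```

The check is trivial, but it forces collusion: no single insider can quietly mint a privileged account on their own.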

 Kerckhoffs’ law: crypto systems should be designed in such a way that they are not compromised if the opponents learn the technique being used.
—Kerckhoffs, “La Cryptographie Militaire,” Journal des sciences militaires, Jan. 1883

  Empirically, security through obscurity never worked well

  Security is a game where the bad guys need to find only 1 weakness, so having 100 good-guy academics searching for weaknesses helps discover whether there are any

  Especially true today, with much open-source code, many highly trained individuals around the world, etc.

  In common use, something like “Written documents that outline an organization’s security requirements and expectations of users, administrators, security professionals, and managers” (Solomon and Chapple, Information Security Illuminated)
 Anderson argues this too often gives us useless (except for CYA?) mush

 A formal, brief, and high-level statement or plan that embraces an organization’s general beliefs, goals, objectives, and acceptable procedures for a specified subject area.

  Policy attributes include the following:
◦  Require compliance (mandatory)
◦  Focus on desired results, not on means of implementation
—SANS Policy Primer

  Security policies are the goals.
  Independent of mechanism/implementation
  Intended to ensure sufficient confidentiality, integrity, and availability of organization’s information assets.

 Those assets = information (data) + hardware (computers, networks)

 Protection mechanism/implementation and attacks vary by system

 Acceptable Use Policy
◦  What types of activities are acceptable?
◦  What types of activities are clearly unacceptable (“including but not limited to”)?
◦  Where should users go for AUP clarifications?
◦  What procedure should be followed if violation suspected?
◦  What are consequences for violations?

  For integrity and availability
  Such things as:
◦  What should be backed up and how?
◦  Where should backup media be stored?
◦  Who should have access to backups?
◦  How long should backups be retained?

 Probably want to put data into several broad categories

 Minimum retention time for each category, sometimes specified or suggested by law (e.g., certain tax records 7 years)

 Maximum retention time to promote personal privacy and/or avoid discovery

 Types of equipment that may be purchased by organizational entities

 Types of personal equipment (if any) that can be brought on site

 Types of wireless services that organizational entities can buy

 Approval method/authorities for exceptions

 Lots of frequent staff education
  Security checklists and matrices both useful albeit simple-minded tools for admins
 Matrix: choose low, moderate, or high importance for each of C, I, A; guides resource allocation
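That matrix can be sketched as a table of low/moderate/high ratings per asset, with a crude score to rank where resources go. The assets, ratings, and scoring rule below are invented illustrations, not from the slides:

```python
# Low/moderate/high C-I-A rating matrix, as a guide to resource allocation.
LEVELS = {"low": 1, "moderate": 2, "high": 3}

assets = {
    "payroll db":  {"C": "high", "I": "high",     "A": "moderate"},
    "public site": {"C": "low",  "I": "moderate", "A": "high"},
}

def priority(ratings):
    """Crude score: sum the three levels; higher means more resources."""
    return sum(LEVELS[ratings[dim]] for dim in ("C", "I", "A"))
```

Even this crude sum captures the matrix's purpose: it forces an explicit, comparable judgment per asset instead of ad hoc spending.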

1.  Physical security—Limiting (or preventing) physical access to computer hardware as well as fire control, electronic emanations issues

2.  Personnel Security—Vetting at hiring, role change. Also training, user monitoring. Key issue: How do you decide whom to trust, and how much? Sufficiently serious position: oaths, investigations, lie detectors, references, credit reports. . . .

  3. Logical or Technical or Procedural controls. Typically implemented via software/OS, and the main thing that most computer scientists think about as security.
  E.g., user ID and authentication, crypto.
  Based on organization’s rules and/or laws, professional standards.

  If technical security is too onerous, it will be bypassed.

  If technical security is too lax, it will be ineffective.

•  Prevent certain types of attacks
•  Detect security breaches
•  Stop further occurrences
•  Identify the bad guy(s)
•  Punish the bad guy(s)

1.  Hackers/crackers/malicious outsiders
◦  Corporate spies searching for trade secrets
◦  Intelligence agencies seeking military secrets or info on domestic opponents
◦  Organized crime recruiting botnet elements or attacking enemies/setting up blackmail
◦  Teens, 20-somethings seeking thrills
2.  Malware/malicious code objects
3.  Malicious insiders

•  Stand-alone: Buffer overflow, authentication attacks

•  Networked: Injecting bogus packets, reading packets not intended for the node

•  Distributed System: Distributed Denial of Service (DDoS)

 Malicious code, e.g., viruses, worms, Trojans
◦  Buffer overflow: technical error by developer, exploited by malicious code
 Back doors: special path to higher authorization left by developer
 Brute force: e.g., try all possible passwords
 Denial of service: attack availability
 Man-in-the-middle
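The brute-force entry above ("try all possible passwords") can be shown in a few lines over a toy alphabet. Purely illustrative: real attackers use wordlists and GPUs, and real systems defend with salted slow hashes and lockouts, which make exhaustive search like this infeasible.

```python
import hashlib
from itertools import product

def brute_force(target_hash, alphabet="ab", max_len=4):
    """Try every string over the alphabet, shortest first; return the
    one whose SHA-256 hex digest matches target_hash, else None."""
    for length in range(1, max_len + 1):
        for combo in product(alphabet, repeat=length):
            guess = "".join(combo)
            if hashlib.sha256(guess.encode()).hexdigest() == target_hash:
                return guess
    return None
```

With a 2-letter alphabet and length ≤ 4 this is only 30 guesses; the defender's job is to make the real search space, and the cost per guess, astronomically larger.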

  “Hello, I’m Bob in Tech Support and we’ve gotten a call from the Provost that he can’t get this XXX.doc file on his account to print right, and he wants us to print off 20 copies for the Board of Trustees meeting tomorrow. Can you give me the Provost’s password to YYY?”

  Famous use by investigators hired by HP Board Chair Patricia Dunn re leaks

Dear customer, Our records show that your online session has been locked due to the following reason.

1. Log on attempts with invalid information. 2. Inadequate update on your cc online account.

We urge you to restore your NetSpend online account immediately to avoid final shut down of your account. Click the link below to restore your NetSpend online account: NetSpend Online Account

© 2010 NetSpend. All rights reserved.

 Typically an email claiming to come from legit company requesting verification of some information
 Quality of the phishers’ work is improving
 Targets company’s customers, not staff; hard to defend against
 Loss estimates vary wildly: US per year $60 million to $3 billion

--Hackers Try to Steal Carbon Credits (February 3 & 4, 2010)

Companies in Europe, Japan, and New Zealand received phishing emails that appeared to come from the German Emissions Trading Authority.  The messages told the recipients that they needed to re-register their accounts and were directed to a phony web page where the login credentials were stolen.  The information was used to access the companies' accounts. The thieves stole the credits and resold them to unsuspecting buyers.  In all, they stole an estimated 250,000 carbon credit permits from six companies; the credits were worth a total of more than US $4 million.  The attack caused emissions trading registries in several countries to be shut down temporarily.  The carbon trading program allows companies to sell permission to produce greenhouse gases.

1.  Attack on applications (browsers, especially IE, Flash, Adobe Reader, Office) rather than OS
◦  Often via targeted emails: spear phishing
2.  Compromising trusted web sites


Source: SANS semi-annual Top Cyber-security risks report
