Introduction to Security
========================

> *All satisfied with their seats? O.K. No talking, no smoking, no
> knitting, no newspaper reading, no sleeping, and for God's sake take
> notes.* —Vladimir Nabokov

**November 2, 1988:** Robert Tappan Morris, Jr. released the "great worm." (His dad, Bob Morris, was chief scientist at NSA's National Computer Security Center.) This was the first worm, the first malware to get media attention, and the first conviction under the Computer Fraud and Abuse Act. Morris was 23 years old and a first-year grad student at Cornell. (He released the worm from MIT, though.) He later claimed the worm's purpose was to measure the size of the Internet, but its immediate effect was denial of service (DoS). The Internet "came apart" as hosts were overloaded by invisible processes. System admins had to disconnect from the network to isolate their systems from infection. The US GAO later estimated the cost of recovery at somewhere between $100k and $10m. Morris was tried in US District Court. His sentence: 3 years probation, 400 hours community service, and a $13k fine. In 1999, he received a PhD from Harvard. And now he's a professor at MIT.

**June 1, 2012:** The *New York Times* reports that the US and Israel created Stuxnet, the first (publicly known) cyberweapon. Its provenance was initially unknown. The weapon first infects Windows systems, then subsequently infects an industrial control device, causing it to vary the frequency of its motor and do physical harm. But the weapon hides that frequency change from the device's monitoring system, so that the harm won't be noticed until it's too late. The purpose of the weapon seemed to be the destruction of centrifuges in Iranian uranium enrichment facilities.

**Today,** security is

- *hard:* we don't (fully) know how to accomplish it,
- *interesting:* it involves lots of cool ideas, and
- *important:* society depends on computers, hence on their security.

That's what makes this such a fun field of study.

### Defining security

A computer system is *secure* when it

- does what it should
- and nothing more.

A *security policy* stipulates what should and should not be done. Policies can be long English documents, mathematical axioms, etc. But almost everyone agrees that security policies are formulated in terms of three basic kinds of *security properties*: confidentiality, integrity, and availability.

**Confidentiality:** Protection of assets from unauthorized disclosure. Assets could be information or resources. Disclosure must be to someone; that might be a person, a program, another computer system, etc. To generalize those entities, define a *principal* to be any entity that can take actions. So confidentiality is about which principals are allowed to learn what. *Secrecy* is synonymous with confidentiality. Examples:

- Keep the contents of a file from being read. (Access control.)
- Keep the value of a variable secret. That's harder; it needs compiler/runtime support. (Information flow.)
- Keep the behavior of a system secret. Even harder: suppose system load goes up when the system is processing secret information. Anyone on the system can observe the load. (Covert channel.)
- Keep information about an individual secret. Hard these days, with lots of databases that can be linked against one another and new information extracted. Gender, DOB, and ZIP uniquely identify about 99% of people in Cambridge, MA; a sketch of such a linkage attack follows this list. (Database privacy.)
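To make that last point concrete, here is a minimal sketch of a linkage attack, assuming a hypothetical "anonymized" medical file and a public voter roll that both carry sex, date of birth, and ZIP code. The file names and column names are invented for illustration; the join on quasi-identifiers is the whole trick.

```python
# Hypothetical file and column names; only the join technique is the point.
import pandas as pd

# "Anonymized" hospital records: names removed, quasi-identifiers kept.
medical = pd.read_csv("anonymized_medical_records.csv")  # sex, dob, zip, diagnosis

# Public voter roll: names present alongside the same quasi-identifiers.
voters = pd.read_csv("public_voter_roll.csv")  # name, sex, dob, zip

# Join on (sex, dob, zip). Because that combination is unique for most
# people, the merge re-attaches a name to each "anonymous" diagnosis.
reidentified = medical.merge(voters, on=["sex", "dob", "zip"], how="inner")
print(reidentified[["name", "diagnosis"]].head())
```

Nothing here requires breaking into either database; the disclosure comes entirely from linking records that are individually harmless.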
*Privacy* is the confidentiality of identifying information about individuals, which could be people, organizations, etc. Sometimes privacy is construed as a legal right. Don't say "keep information private" unless you really mean that the information is about an individual and is identifying. (**All your vocabulary are belong to us.**)

**Integrity:** Protection of assets from unauthorized modification. That is, what changes are allowed to the system and its environment. Changes can include initial sources, hence provenance. The environment can include outputs. Examples:

- Output is correct according to a mathematical specification.
- No exceptions are thrown.
- Resource consumption is bounded.
- Only certain principals are allowed to write a file. (Access control.)
- Data are not corrupted or tainted by downloaded programs. (Information flow.)

**Availability:** Protection of assets from loss of use. Denial of service (DoS) attacks typically cause violations of availability properties. Examples:

- The system must accept inputs periodically (the OS doesn't terminate).
- The system must produce output by a specified time (the program terminates); see the sketch after this list.
- The system must process requests fairly (by order received, by priority, etc.).
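As a small illustration of the second availability bullet, here is a sketch that bounds how long a caller waits for a response, assuming a hypothetical `handle_request` function and a made-up 2-second deadline.

```python
# Minimal sketch: bound the time a caller waits for a response.
# handle_request and DEADLINE_SECONDS are invented for illustration.
from concurrent.futures import ThreadPoolExecutor, TimeoutError

DEADLINE_SECONDS = 2.0          # imagined bound taken from the policy
pool = ThreadPoolExecutor(max_workers=4)

def handle_request(request):
    """Stand-in for real processing; might be slow or even hang."""
    return "response for " + request

def serve(request):
    """Return the handler's output, or report failure once the deadline passes."""
    future = pool.submit(handle_request, request)
    try:
        return future.result(timeout=DEADLINE_SECONDS)
    except TimeoutError:
        # The availability property is violated: no output in time.
        return "error: deadline exceeded"

print(serve("ping"))
```

Note that the timeout only bounds the caller's wait; a hung worker thread keeps running and consuming resources, which is part of why availability is hard to guarantee in practice.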