Important principles may, and must, be inflexible. —Abraham Lincoln
We'll talk a lot about vulnerabilities and countermeasures, about policies and mechanisms, and about securing software systems throughout the semester. Here are some underlying principles for building secure systems. We'll continue to see examples of these principles throughout the semester, so don't worry if they seem a bit abstract now.
This is a principle behind real-world security, and it holds for software security, too. Consider a bank vault: it has a lock, keys, and a video camera.
In the real world, we don't make perfect locks or keys or cameras. Instead, we do risk management. We buy insurance. (It's cheaper than building perfect locks, etc.)
Mechanisms for accountability are separated into three classes: authentication, authorization, and audit.
Together these are known as the Gold Standard, because they all begin with Au, the atomic symbol for gold. Use these terms carefully! People frequently confuse authorization and authentication.
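The three classes can be sketched in code. This is a minimal illustration with hypothetical names and in-memory stores, not a real design; in particular, plain SHA-256 stands in for the salted, slow password hash a production system would use.

```python
import hashlib
import hmac

# Hypothetical in-memory stores, for illustration only.
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}
ACL = {("alice", "report.txt"): {"read"}}
AUDIT_LOG = []

def authenticate(user, password):
    # Authentication: verify the principal's claimed identity.
    # (Plain SHA-256 stands in for a real salted password hash.)
    expected = USERS.get(user)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    return expected is not None and hmac.compare_digest(expected, supplied)

def authorize(user, obj, op):
    # Authorization: decide whether this principal may perform this
    # operation on this object.
    return op in ACL.get((user, obj), set())

def access(user, password, obj, op):
    ok = authenticate(user, password) and authorize(user, obj, op)
    # Audit: record the decision so it can be reviewed later.
    AUDIT_LOG.append((user, obj, op, ok))
    return ok
```

Note that authentication and authorization answer different questions ("who are you?" versus "may you do this?"), which is exactly the distinction people frequently blur.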
For example, your bank might use both a password and a hardware token to authenticate customers. Your apartment building might have multiple door locks and a security system. And your university IT department might use firewalls and virus scanners to prevent the spread of malware.
Assume the enemy knows the system. For example, assume the enemy knows the encryption algorithm, but not the key. Or, assume the enemy knows the model of a lock, but not the cuts made in the key. We saw this principle appear in cryptography, where it's called "Kerckhoffs's Principle."
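A small sketch of the principle in code: with an HMAC, the algorithm (HMAC-SHA256) is completely public, and security rests entirely in the key.

```python
import hashlib
import hmac
import secrets

# The algorithm is public, per Kerckhoffs's Principle;
# the key is the only secret.
key = secrets.token_bytes(32)

def tag(message: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, t: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(tag(message), t)
```

An enemy who knows every line of this code, but not `key`, still cannot forge a valid tag.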
This principle of non-secrecy is frequently violated by neophytes. Note that there's nothing wrong with keeping a design secret, provided security can be established by other means; that's just defense in depth.
The opposite of this principle is "security by obscurity".
Small, simple mechanisms are easier to understand and easier to get right, and it's easier to construct evidence of trustworthiness for small, simple things.
In any system, there's some set of mechanisms that implement the core, critical security functionality and hence must be trusted. That set is called the trusted computing base (TCB). Economy of Mechanism says to keep the TCB small.
It's safer to forget to grant a privilege (in which case a principal complains) than to accidentally grant privileges (in which case a principal has an opportunity to exploit them). For example,
Experienced system administrators know not to log in as root for routine operations; otherwise, they might accidentally misuse their root privileges and wreak havoc. Likewise, a web browser doesn't need full access to all files on the local filesystem. And a web front-end doesn't need full write access to a database back-end for most of its operation.
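One way to realize the principle in code is to hand a component only the capability it needs, rather than broad ambient authority. A minimal sketch (the file name and function are illustrative assumptions):

```python
import os
import tempfile

def word_count(readable) -> int:
    # This function receives only a readable stream, not a path plus
    # general filesystem access: the least privilege it needs.
    return len(readable.read().split())

# The caller grants read access to exactly one file.
path = os.path.join(tempfile.mkdtemp(), "notes.txt")
with open(path, "w") as f:
    f.write("least privilege limits damage")

with open(path, "r") as f:   # read-only handle
    n = word_count(f)
```

If `word_count` is buggy or compromised, the damage is bounded by what the handle permits: reading one file.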
Every access to every object should be mediated, that is, checked for authority. The component that does the mediation is called a reference monitor. Reference monitors should be tamperproof and transparently correct.
Time-of-check to time-of-use (TOCTOU) attacks exploit vulnerabilities arising from failure to adhere to this principle.
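A classic TOCTOU pattern, sketched in Python with hypothetical function names. The first version checks the path and then uses it, leaving a window for an attacker; the second checks the already-open file descriptor, which cannot be swapped out from under us.

```python
import os
import stat
import tempfile

def read_if_regular_vulnerable(path):
    # TOCTOU bug: between the isfile check and the open, an attacker
    # could replace `path` with, say, a symlink to a sensitive file.
    if os.path.isfile(path):              # time of check
        with open(path) as f:             # time of use
            return f.read()
    return None

def read_if_regular_safer(path):
    # Open first, then check properties of the open descriptor;
    # no window remains between check and use.
    fd = os.open(path, os.O_RDONLY | getattr(os, "O_NOFOLLOW", 0))
    try:
        info = os.fstat(fd)
        if not stat.S_ISREG(info.st_mode):
            raise ValueError("not a regular file")
        return os.read(fd, info.st_size).decode()
    finally:
        os.close(fd)

# Demo on a throwaway file.
path = os.path.join(tempfile.mkdtemp(), "data.txt")
with open(path, "w") as f:
    f.write("checked and used atomically")
contents = read_if_regular_safer(path)
```

The safer version mediates the access on the object actually being used (the descriptor), not on a name that may point somewhere else by the time it is used.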
Financial institutions frequently employ this principle. Two tellers might be required to open a vault or to disburse a large amount of cash.
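The two-teller rule can be sketched as a check that demands two distinct approvers before a sensitive operation proceeds (the threshold and names here are illustrative assumptions):

```python
# Hypothetical two-person rule: large disbursements require approval
# from two *distinct* tellers.
APPROVAL_THRESHOLD = 10_000

def disburse(amount, approvers):
    approvers = set(approvers)          # duplicates don't count twice
    if amount >= APPROVAL_THRESHOLD and len(approvers) < 2:
        raise PermissionError("two distinct approvers required")
    return f"disbursed {amount}"
```

Requiring distinct approvers is the point: a single compromised principal cannot authorize the operation alone.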
In practice, this principle is difficult to implement. Do you really want to manage rights for every object, operation, and principal in a software system? There are millions of them, so you'll get something wrong. So we naturally do some bundling of privileges.
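Role-based access control is one common form of that bundling. A minimal sketch, with hypothetical roles and permissions: principals map to roles, and roles bundle permissions, instead of managing a separate right for every (principal, object, operation) triple.

```python
# Hypothetical role and permission names, for illustration only.
ROLES = {
    "teller":  {"read_account", "post_deposit"},
    "manager": {"read_account", "post_deposit", "approve_loan"},
}
USER_ROLES = {"alice": {"teller"}, "bob": {"manager"}}

def allowed(user, operation):
    # A principal may perform an operation if any of its roles
    # bundles that permission.
    return any(operation in ROLES[r] for r in USER_ROLES.get(user, ()))
```

Bundling trades precision for manageability: each role is coarser than strict least privilege, but there are few enough roles to audit.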