"Computer scientists, as all scientists, seek a common framework in which to link and organize many levels of explanation; moreover, this common framework must be semantic, since our explanations (including programs) are typically in formal language." --Robin Milner
From the beginning, Cornell has been known for its research in programming languages. Our strength in this area spans a wide range: foundational work on type theory, automated theorem proving, and language semantics; a more recent theme of language-based solutions to important problems such as computer security and distributed programming; and contributions to language implementation, program analysis and optimization, domain-specific languages, and software engineering. We are proud of both our breadth and depth in this core discipline. Programming languages is a lively area at Cornell, with five faculty and over a dozen Ph.D. students.
Faculty
Robert Constable does research on programming languages and formal methods in the context of type theory. The Nuprl proof assistant, developed by Constable and his group, is a dependently-typed language that can be used to describe distributed computing, to formally specify computing tasks, and to formalize topics in constructive and intuitionistic mathematics (of which classical mathematics can usually be seen as a special case). Constable is also interested in synthesizing programs and concurrent processes from proofs, developing systems that can be shown to be secure by construction, and exploring the deep connections between programming and logic.
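The connection between programming and logic that Constable studies is often introduced through the Curry-Howard correspondence: a type can be read as a proposition, and a program of that type is a constructive proof of it. A minimal sketch of the idea in Python (illustrative only; Nuprl's dependent types are far more expressive, and the names here are our own):

```python
from typing import Callable, Tuple, TypeVar

A = TypeVar("A")
B = TypeVar("B")

# Proposition: "A and B implies B and A."
# A total function of this type is a constructive proof.
def swap(pair: Tuple[A, B]) -> Tuple[B, A]:
    a, b = pair
    return (b, a)

# Proposition: "(A implies B) and A, together, imply B" (modus ponens).
def modus_ponens(f: Callable[[A], B], a: A) -> B:
    return f(a)

print(swap((1, "x")))  # -> ('x', 1)
```

Under this reading, program synthesis from proofs (as in Nuprl) runs the correspondence in the other direction: a formal proof of a specification yields a program that provably meets it.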
Nate Foster works on language design, semantics, and implementation. In the past, he worked on languages and type systems for data processing, including bidirectional languages and data provenance. More recently, he has been developing the Frenetic language, which provides high-level constructs for specifying the behavior of networks. Frenetic makes it possible for programmers to specify the behavior of an entire network using a single program that a compiler translates to low-level code for the underlying devices. This provides opportunities for enforcing security, reliability, and performance guarantees using language-based techniques.
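The core idea behind Frenetic, that a whole network's behavior is one composable program, can be sketched by modeling a policy as a function from a packet to a set of (possibly rewritten) packets. This is a toy illustration of the programming model, not Frenetic's actual API:

```python
# Toy network policies as functions: packet -> list of output packets,
# in the spirit of Frenetic/NetKAT (illustrative, not the real API).

def match(field, value):
    """Policy that passes a packet through only if field == value."""
    return lambda pkt: [pkt] if pkt.get(field) == value else []

def modify(field, value):
    """Policy that rewrites one header field."""
    def policy(pkt):
        out = dict(pkt)
        out[field] = value
        return [out]
    return policy

def seq(p, q):
    """Sequential composition: feed each output of p into q."""
    return lambda pkt: [r for mid in p(pkt) for r in q(mid)]

def par(p, q):
    """Parallel composition: union of both policies' outputs."""
    return lambda pkt: p(pkt) + q(pkt)

# Send web traffic out port 1 and SSH traffic out port 2; drop the rest.
route = par(seq(match("dst_port", 80), modify("out", 1)),
            seq(match("dst_port", 22), modify("out", 2)))

print(route({"dst_port": 80}))  # -> [{'dst_port': 80, 'out': 1}]
```

A compiler for such a language can translate the single high-level policy into per-switch forwarding rules, which is what makes network-wide guarantees tractable to enforce.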
Dexter Kozen has interests that span a variety of topics at the boundary of computer science and mathematics including the design and analysis of algorithms, computational complexity, decision problems in logic and algebra, and logics and semantics of programming languages. Kozen obtained a number of foundational results for Kleene algebras with tests, and developed applications to efficient code certification and compiler verification. Recently he has been investigating capsules, which provide a clean algebraic representation of state in higher-order functional and imperative languages with mutable bindings, and coalgebraic techniques in verification.
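To give a flavor of Kleene algebra with tests: tests are embedded alongside programs, conditionals and loops get algebraic encodings, and program equivalences become short equational proofs. The standard encodings, and one such proof:

```latex
% Control flow in KAT (p, q programs; b a test; \bar{b} its negation):
\[
  \mathbf{if}\ b\ \mathbf{then}\ p\ \mathbf{else}\ q \;=\; b\,p + \bar{b}\,q,
  \qquad
  \mathbf{while}\ b\ \mathbf{do}\ p \;=\; (b\,p)^{*}\,\bar{b}.
\]
% Dead-branch elimination, using the test laws bb = b and b\bar{b} = 0:
% after b is known to hold, the else-branch vanishes.
\[
  b\,(b\,p + \bar{b}\,q) \;=\; b\,b\,p + b\,\bar{b}\,q \;=\; b\,p + 0 \;=\; b\,p.
\]
```

Reasoning of this kind is what makes KAT useful for compiler verification: an optimization is correct exactly when the two sides are provably equal in the algebra.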
Andrew Myers works on languages for secure programming, such as Jif, which integrates information flow into Java; Fabric, which extends Jif for building secure distributed systems; and Swift, which automatically partitions web applications securely between the client and the web server. The challenges posed by Fabric and Swift have also led to work on language-based methods for controlling timing channels, and on language support for the extension and evolution of large software systems, such as the Polyglot extensible Java compiler.
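The kind of information-flow checking Jif performs can be sketched with a two-point security lattice: data labeled High must never reach a Low sink. This toy checker illustrates the idea only; it is not Jif's actual label syntax or type system:

```python
# Toy information-flow check over the two-point lattice LOW <= HIGH,
# illustrating the idea behind Jif's labels (not Jif's real syntax).
LOW, HIGH = 0, 1

def join(l1, l2):
    """Label of an expression combining two values: their upper bound."""
    return max(l1, l2)

def check_assign(target_label, source_label):
    """An assignment is legal only if information flows upward in the
    lattice; secret (HIGH) data may never reach a public (LOW) sink."""
    return source_label <= target_label

secret, public = HIGH, LOW
assert check_assign(HIGH, join(secret, public))  # mixed data into a secret: ok
assert not check_assign(LOW, secret)             # secret into a public sink: rejected
```

A compiler that enforces this rule on every assignment (and on implicit flows through control) rejects leaky programs before they run, which is the essence of language-based security.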
Fred Schneider has applied programming logics, semantics, and language design to great effect in his research on cyber-security. Recently, he has been working on a "logic of belief" for characterizing authorization policies; this approach is implemented in the Nexus operating system, developed here at Cornell. Other examples of his recent work include his characterization of classes of security policies as hyperproperties and his proof that program obfuscation (e.g., address-space re-ordering) can be as effective as type checking for defending against cyber-attacks.
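Hyperproperties generalize ordinary trace properties: a property is a set of traces, while a hyperproperty is a set of sets of traces, so it can relate multiple executions of the same system. Sketched formally:

```latex
% Let \Psi denote the set of all execution traces.
\[
  \text{property: } P \subseteq \Psi,
  \qquad
  \text{hyperproperty: } \mathit{HP} \subseteq 2^{\Psi}.
\]
% A system S, viewed as its set of traces, satisfies HP iff S \in HP.
```

Noninterference is the canonical example: it requires any two traces that agree on public inputs to agree on public outputs, a condition on pairs of traces that no single set of traces can express.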
Ross Tate works on problems related to language design and formalization including type systems, optimizations, and domain-specific extensions. His work draws on fields such as category theory and constructive type theory to develop powerful and flexible solutions. His research is being put into practice through industry collaborations with Red Hat and JetBrains on the design of the Ceylon and Kotlin languages.