A reading list

Resources spanning AI, cognitive science, and program synthesis that might be useful for anyone wanting to get started in program induction, neural-symbolic models, or related areas. This reading list is slanted toward my own interests; if you think it's missing something, shoot me an e-mail: kellis@cornell.edu

Textbooks

Information Theory, Inference, and Learning Algorithms

The Deep Learning Textbook

Types and Programming Languages (the text is not freely available, but the accompanying source code is free to study)

The Program Synthesis Textbook

Probabilistic Models of Cognition

Program synthesis

Automating String Processing in Spreadsheets Using Input-Output Examples

FlashMeta: A Framework for Inductive Program Synthesis

Scaling Enumerative Program Synthesis via Divide and Conquer

Program Synthesis by Sketching

Synthesizing Data Structure Transformations from Input-Output Examples
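
The programming-by-example setting that runs through these papers can be made concrete with a toy enumerative synthesizer: given input-output examples, search over compositions of a small set of primitives until one is consistent with every example. Everything here (the primitive set, the depth bound) is an illustrative assumption, not any single paper's algorithm — a minimal sketch of the idea only:

```python
# Toy enumerative programming-by-example: search compositions of
# string primitives until one matches every input-output example.
from itertools import product

PRIMITIVES = {
    "upper": str.upper,
    "lower": str.lower,
    "strip": str.strip,
    "first3": lambda s: s[:3],
}

def run(names, s):
    """Apply a pipeline of primitives, in order, to the input string."""
    for n in names:
        s = PRIMITIVES[n](s)
    return s

def synthesize(examples, max_depth=3):
    """Return the first primitive pipeline consistent with all examples."""
    for depth in range(1, max_depth + 1):
        for names in product(PRIMITIVES, repeat=depth):
            if all(run(names, i) == o for i, o in examples):
                return names
    return None

prog = synthesize([("  Hello", "HEL"), ("  world", "WOR")])
```

Real systems (FlashFill, FlashMeta, EUSolver) tame the combinatorial blow-up this brute-force loop suffers from — via version-space algebras, deduction, or divide-and-conquer — but the specification-by-examples setup is the same.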

Program induction

Applications

Distilling Free-Form Natural Laws from Experimental Data

AI Feynman 2.0: Pareto-optimal symbolic regression exploiting graph modularity

Learning to Infer Graphics Programs from Hand-Drawn Images

Making sense of sensory input

Programmatically Interpretable Reinforcement Learning

Symbolic methods

Bootstrap Learning via Modular Concept Discovery

Meta-interpretive learning of higher-order dyadic datalog: predicate invention revisited (code and a web interface are available)

Neurally-guided program induction

DeepCoder: Learning to Write Programs

RobustFill: Neural Program Learning under Noisy I/O

Execution-guided Neural Program Synthesis

Write, Execute, Assess: Program Synthesis with a REPL

Accelerating search-based program synthesis using learned probabilistic models

Learning Differentiable Programs with Admissible Neural Heuristics (This paper is hard to categorize: neural networks are used as a continuous relaxation of the space of programs, and that relaxation is used to construct a search heuristic. Definitely worth reading.)
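
The relaxation idea can be illustrated with a deliberately tiny example — the two-operator program space, the synthetic data, and the plain gradient-descent loop below are all my own illustrative assumptions, not the paper's actual method: a discrete choice between `x + theta` and `x * theta` is softened into a convex combination, so the fit of the relaxed program can be optimized by gradient descent and then used to score the discrete candidates.

```python
# Toy continuous relaxation of a discrete program space.
# Program space: f(x) = x + theta  OR  f(x) = x * theta  (one discrete choice).
# Relaxation:    f(x) = a*(x + theta) + (1 - a)*(x * theta),  a in [0, 1].
# Illustrative only; this is not the algorithm from the paper.

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.0 * x for x in xs]  # ground truth: the multiplicative program, theta = 3

def loss(a, theta):
    return sum((a * (x + theta) + (1 - a) * x * theta - y) ** 2
               for x, y in zip(xs, ys)) / len(xs)

def step(a, theta, lr=0.002):
    """One step of plain gradient descent on the relaxed objective."""
    ga = gt = 0.0
    for x, y in zip(xs, ys):
        f = a * (x + theta) + (1 - a) * x * theta
        r = 2 * (f - y) / len(xs)
        ga += r * ((x + theta) - x * theta)  # d f / d a
        gt += r * (a + (1 - a) * x)          # d f / d theta
    a = min(1.0, max(0.0, a - lr * ga))      # keep the mixture weight in [0, 1]
    return a, theta - lr * gt

a, theta = 0.5, 1.0
initial = loss(a, theta)
for _ in range(5000):
    a, theta = step(a, theta)
final = loss(a, theta)
```

After optimization the relaxed loss is lower than at initialization, and rounding `a` to 0 or 1 recovers a discrete program; in the paper this kind of relaxed objective supplies an admissible heuristic for search over program structure.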

Neural-symbolic programs

Making Sense of Raw Input

Learning to learn generative programs with Memoised Wake-Sleep

Imitation-Projected Programmatic Reinforcement Learning

HOUDINI: Lifelong Learning as Program Synthesis

Modular meta-learning

Programs and gradient descent

Neural Turing Machines

Learning Explanatory Rules from Noisy Data

TerpreT: A Probabilistic Programming Language for Program Induction

Computational Cognitive Science

On the Measure of Intelligence

Human-level concept learning through probabilistic program induction

Bootstrapping in a language of thought: A formal model of numerical concept learning

The Computational Origin of Representation

Theory Learning as Stochastic Search in a Language of Thought

The Copycat project: A model of mental fluidity and analogy-making

Natural language

Learning dependency-based compositional semantics

Deep Compositional Question Answering with Neural Module Networks

NL2Bash: A Corpus and Semantic Parser for Natural Language Interface to the Linux Operating System