Books

Programming Language Theory and Design

  • Practical Foundations for Programming Languages (2016, Robert Harper)
  • Essentials of Programming Languages (2012, Friedman)
  • Design Concepts in Programming Languages (2008, Turbak)
  • Concepts, Techniques and Models of Computer Programming (2004, Van Roy)
  • Concepts of Programming Languages (2016, Sebesta)
  • Concepts in Programming Languages (2010, Mitchell)
  • Programming Language Concepts (1997, Ghezzi)
  • Programming Language Pragmatics (2015, Scott)
  • Programming Languages and Operational Semantics: An Introduction (2004, Fernandez)
  • The Formal Semantics of Programming Languages: An Introduction (1993, Winskel)
  • Types and Programming Languages (2002, Benjamin Pierce)
  • Advanced Topics in Types and Programming Languages (2004, Benjamin Pierce)
  • An Introduction to Functional Programming Through Lambda Calculus (2011, Greg Michaelson)
  • Lambda Calculus with Types (2014, Henk Barendregt)
  • Lambda-calculus, Combinators and Functional Programming (2009, G. E. Revesz)
  • Type Theory and Formal Proof: An Introduction (2016, Rob Nederpelt)

Compilers

  • Engineering A Compiler (2022, Cooper, Torczon)
  • Crafting Interpreters (2020, Nystrom)
  • Introduction to Compilers and Language Design (2020, Douglas Thain)
  • Compiling to Assembly from Scratch (2020, Vladimir Keleshev)
  • Writing an Interpreter in Go (2017, Thorsten Ball)
  • Writing a Compiler in Go (2018, Thorsten Ball)
  • Modern Compiler Design (2012, Grune)
  • Implementing Programming Languages. An Introduction to Compilers and Interpreters (2012, Ranta)
  • Language Implementation Patterns (2010, Parr)
  • Writing Compilers and Interpreters: A Software Engineering Approach (2009, Mak)
  • Compiler Design. Theory, Tools, Examples (2010, Bergmann)
  • Basics of Compiler Design (2010, Torben Mogensen)
  • Compilers. Principles, Techniques and Tools (2006, 2nd Ed, Aho, Lam, Sethi, Ullman)
  • Optimizing Compilers for Modern Architectures (2005, Allen, Kennedy)
  • Compiler Design Handbook. Optimizations and Machine Code Generation (2003, Srikant, Shankar)
  • The Optimal Implementation of Functional Programming Languages (1999, Asperti, Guerrini)
  • Modern Compiler Implementation in C (1998, Appel)
  • Modern Compiler Implementation in ML (1998, Appel)
  • Modern Compiler Implementation in Java (1998, Appel)
  • Building an Optimizing Compiler (1997, Morgan)
  • Compiler Construction (2005, Wirth)
  • Compiler Construction (1995, Waite, Goos)
  • The Functional Treatment of Parsing (1993, Leermakers)
  • Parsing Techniques. A Practical Guide (1990, Grune, Jacobs)
  • Compiler Design in C (1990, Holub)
  • The Implementation of Functional Programming Languages (1987, Peyton Jones)
  • The Design and Construction of Compilers (1982, Hunter)
  • Writing Interactive Compilers and Interpreters (1979, Brown)
  • Assemblers, Compilers, and Program Translation (1979, Calingaert)
  • Compiler Construction: An Advanced Course (1974, Bauer, Eickel)
  • The Theory of Parsing, Translation, and Compiling. Volume 1: Parsing (1972, Aho, Ullman)
  • The Theory of Parsing, Translation, and Compiling. Volume 2: Compiling (1972, Aho, Ullman)


Practical Foundations for Programming Languages

2nd Ed, 2016, Robert Harper

Types are the central organizing principle of the theory of programming languages. In this innovative book, Professor Robert Harper offers a fresh perspective on the fundamentals of these languages through the use of type theory. Whereas most textbooks on the subject emphasize taxonomy, Harper instead emphasizes genetics, examining the building blocks from which all programming languages are constructed. Language features are manifestations of type structure. The syntax of a language is governed by the constructs that define its types, and its semantics is determined by the interactions among those constructs. The soundness of a language design – the absence of ill-defined programs – follows naturally. Professor Harper's presentation is simultaneously rigorous and intuitive, relying on only elementary mathematics. The framework he outlines scales easily to a rich variety of language concepts and is directly applicable to their implementation. The result is a lucid introduction to programming theory that is both accessible and practical.


Essentials of Programming Languages

2012, Friedman

A new edition of a textbook that provides students with a deep, working understanding of the essential concepts of programming languages, completely revised, with significant new material. This book provides students with a deep, working understanding of the essential concepts of programming languages. Most of these essentials relate to the semantics, or meaning, of program elements, and the text uses interpreters (short programs that directly analyze an abstract representation of the program text) to express the semantics of many essential language elements in a way that is both clear and executable. The approach is both analytical and hands-on. The book provides views of programming languages using widely varying levels of abstraction, maintaining a clear connection between the high-level and low-level views. Exercises are a vital part of the text and are scattered throughout; the text explains the key concepts, and the exercises explore alternative designs and other issues. The complete Scheme code for all the interpreters and analyzers in the book can be found online through The MIT Press web site. For this new edition, each chapter has been revised and many new exercises have been added. Significant additions have been made to the text, including completely new chapters on modules and continuation-passing style. Essentials of Programming Languages can be used for both graduate and undergraduate courses, and for continuing education courses for programmers.
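The interpreter-based approach described above can be made concrete in a few lines. The sketch below (in Python rather than the book's Scheme; all names and the tagged-tuple syntax are illustrative, not the book's code) evaluates an abstract representation of a tiny let-language directly, which is the sense in which an interpreter "expresses the semantics":

```python
# A minimal EOPL-style interpreter: the semantics of a tiny
# let-language, expressed directly as a program over its abstract
# syntax. Expressions are tagged tuples; environments are dicts.

def eval_expr(expr, env):
    """Evaluate an abstract-syntax tree in an environment."""
    tag = expr[0]
    if tag == "const":
        return expr[1]
    if tag == "var":
        return env[expr[1]]
    if tag == "diff":                     # e1 - e2
        return eval_expr(expr[1], env) - eval_expr(expr[2], env)
    if tag == "let":                      # let x = e1 in e2
        _, x, e1, e2 = expr
        return eval_expr(e2, {**env, x: eval_expr(e1, env)})
    raise ValueError("unknown expression: %r" % (expr,))

# let x = 7 in x - 2  evaluates to 5
program = ("let", "x", ("const", 7),
           ("diff", ("var", "x"), ("const", 2)))
```

Because the interpreter is itself a short program, each clause of `eval_expr` doubles as a readable, executable semantic rule for the construct it handles.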


Design Concepts in Programming Languages

2008, Turbak

Key ideas in programming language design and implementation explained using a simple and concise framework; a comprehensive introduction suitable for use as a textbook or a reference for researchers.

Hundreds of programming languages are in use today--scripting languages for Internet commerce, user interface programming tools, spreadsheet macros, page format specification languages, and many others. Designing a programming language is a metaprogramming activity that bears certain similarities to programming in a regular language, with clarity and simplicity even more important than in ordinary programming. This comprehensive text uses a simple and concise framework to teach key ideas in programming language design and implementation.

The book's unique approach is based on a family of syntactically simple pedagogical languages that allow students to explore programming language concepts systematically. It takes as its premise and starting point the idea that when language behaviors become incredibly complex, the description of the behaviors must be incredibly simple. The book presents a set of tools (a mathematical metalanguage, abstract syntax, operational and denotational semantics) and uses it to explore a comprehensive set of programming language design dimensions, including dynamic semantics (naming, state, control, data), static semantics (types, type reconstruction, polymorphism, effects), and pragmatics (compilation, garbage collection).

The many examples and exercises offer students opportunities to apply the foundational ideas explained in the text. Specialized topics and code that implements many of the algorithms and compilation methods in the book can be found on the book's Web site, along with such additional material as a section on concurrency and proofs of the theorems in the text. The book is suitable as a text for an introductory graduate or advanced undergraduate programming languages course; it can also serve as a reference for researchers and practitioners.


Concepts, Techniques and Models of Computer Programming

2004, Van Roy

  • Table of Contents
    • 1 Introduction to Programming Concepts
    • Part I General Computation Models
      • 2 Declarative Computation Model
      • 3 Declarative Programming Techniques
      • 4 Declarative Concurrency
      • 5 Message-Passing Concurrency
      • 6 Explicit State
      • 7 Object-Oriented Programming
      • 8 Shared-State Concurrency
      • 9 Relational Programming
    • Part II Specialized Computation Models
      • 10 Graphical User Interface Programming
      • 11 Distributed Programming
      • 12 Constraint Programming
    • Part III Semantics
      • 13 Language Semantics
    • Part IV Appendixes
      • A Mozart System Development Environment
      • B Basic Data Types
      • C Language Syntax
      • D General Computation Model

Teaching the science and the technology of programming as a unified discipline that shows the deep relationships between programming paradigms.

This innovative text presents computer programming as a unified discipline in a way that is both practical and scientifically sound. The book focuses on techniques of lasting value and explains them precisely in terms of a simple abstract machine. The book presents all major programming paradigms in a uniform framework that shows their deep relationships and how and where to use them together. After an introduction to programming concepts, the book presents both well-known and lesser-known computation models (programming paradigms). Each model has its own set of techniques and each is included on the basis of its usefulness in practice. The general models include declarative programming, declarative concurrency, message-passing concurrency, explicit state, object-oriented programming, shared-state concurrency, and relational programming. Specialized models include graphical user interface programming, distributed programming, and constraint programming. Each model is based on its kernel language--a simple core language that consists of a small number of programmer-significant elements. The kernel languages are introduced progressively, adding concepts one by one, thus showing the deep relationships between different models. The kernel languages are defined precisely in terms of a simple abstract machine. Because a wide variety of languages and programming paradigms can be modeled by a small set of closely related kernel languages, this approach allows programmer and student to grasp the underlying unity of programming. The book has many program fragments and exercises, all of which can be run on the Mozart Programming System, an Open Source software package that features an interactive incremental development environment.


Concepts of Programming Languages

2016, Sebesta

Concepts of Computer Programming Languages introduces students to the fundamental concepts of computer programming languages and provides them with the tools necessary to evaluate contemporary and future languages. An in-depth discussion of programming language structures, such as syntax and lexical and syntactic analysis, also prepares readers to study compiler design. The Eleventh Edition maintains an up-to-date discussion on the topic with the removal of outdated languages such as Ada and Fortran. The addition of relevant new topics and examples, such as reflection and exception handling in Python and Ruby, adds to the currency of the text. Through a critical analysis of design issues of various program languages, Concepts of Computer Programming Languages teaches programmers the essential differences between computing with specific languages.


Concepts in Programming Languages

2010, Mitchell

For undergraduate and beginning graduate students, this textbook explains and examines the central concepts used in modern programming languages, such as functions, types, memory management, and control. The book is unique in its comprehensive presentation and comparison of major object-oriented programming languages. Separate chapters examine the history of objects, Simula and Smalltalk, and the prominent languages C++ and Java. The author presents foundational topics, such as lambda calculus and denotational semantics, in an easy-to-read, informal style, focusing on the main insights provided by these theories. Advanced topics include concurrency, concurrent object-oriented programming, program components, and inter-language interoperability. A chapter on logic programming illustrates the importance of specialized programming methods for certain kinds of problems. This book will give the reader a better understanding of the issues and tradeoffs that arise in programming language design, and a better appreciation of the advantages and pitfalls of the programming languages they use.


Programming Language Concepts

1997, Ghezzi


Programming Language Pragmatics

2015, Scott

Programming Language Pragmatics, Fourth Edition, is the most comprehensive programming language textbook available today. It is distinguished and acclaimed for its integrated treatment of language design and implementation, with an emphasis on the fundamental tradeoffs that continue to drive software development.

The book provides readers with a solid foundation in the syntax, semantics, and pragmatics of the full range of programming languages, from traditional languages like C to the latest in functional, scripting, and object-oriented programming. This fourth edition has been heavily revised throughout, with expanded coverage of type systems and functional programming, a unified treatment of polymorphism, highlights of the newest language standards, and examples featuring the ARM and x86 64-bit architectures.


Programming Languages and Operational Semantics: An Introduction

2004, Fernandez

This book provides a concise introduction to the essential concepts in programming languages, using techniques from operational semantics. It is addressed to undergraduate students, as a complement to programming languages or operational semantics courses. There are three parts in the book, highlighting three major programming paradigms:

  • imperative languages: the main features of these languages are illustrated using Java, C, Pascal
  • functional languages: modern languages such as ML and Haskell are used to describe the functional style of programming
  • logic languages: the last part of the book gives an overview of logic programming using Prolog

After a general description of each family of languages, their semantics are studied using abstract machines and structural operational semantics. The book gives an in-depth analysis of the basic concepts in programming languages instead of a mere survey of languages, privileging the understanding of the basic techniques underlying the semantics of languages over simply describing their properties.


The Formal Semantics of Programming Languages: An Introduction

1993, Winskel

The Formal Semantics of Programming Languages provides the basic mathematical techniques necessary for those who are beginning a study of the semantics and logics of programming languages. These techniques will allow students to invent, formalize, and justify rules with which to reason about a variety of programming languages. Although the treatment is elementary, several of the topics covered are drawn from recent research, including the vital area of concurrency. The book contains many exercises ranging from simple problems to miniprojects.

Starting with basic set theory, structural operational semantics is introduced as a way to define the meaning of programming languages along with associated proof techniques. Denotational and axiomatic semantics are illustrated on a simple language of while-programs, and full proofs are given of the equivalence of the operational and denotational semantics and of the soundness and relative completeness of the axiomatic semantics. A proof of Gödel's incompleteness theorem, which emphasizes the impossibility of achieving a fully complete axiomatic semantics, is included. It is supported by an appendix providing an introduction to the theory of computability based on while-programs.

Following a presentation of domain theory, the semantics and methods of proof for several functional languages are treated. The simplest language is that of recursion equations with both call-by-value and call-by-name evaluation. This work is extended to languages with higher and recursive types, including a treatment of the eager and lazy lambda-calculi. Throughout, the relationship between denotational and operational semantics is stressed, and proofs of the correspondence between the operational and denotational semantics are provided. The treatment of recursive types - one of the more advanced parts of the book - relies on the use of information systems to represent domains.
The book concludes with a chapter on parallel programming languages, accompanied by a discussion of methods for specifying and verifying nondeterministic and parallel programs.
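The while-language at the heart of such a treatment fits in a few lines. The sketch below (Python; the tagged-tuple constructors are my own shorthand, not Winskel's notation) is a big-step evaluator for a minimal IMP-like language, the executable counterpart of the operational semantics the book develops:

```python
# A big-step evaluator for a tiny IMP-like while-language.
# States are dicts from locations to integers; unset locations
# read as 0. Constructor names are illustrative.

def aeval(a, s):
    """Arithmetic expressions: numbers, locations, addition."""
    tag = a[0]
    if tag == "num":
        return a[1]
    if tag == "loc":
        return s.get(a[1], 0)
    if tag == "add":
        return aeval(a[1], s) + aeval(a[2], s)
    raise ValueError(a)

def beval(b, s):
    """Boolean expressions: here just  a1 <= a2."""
    _, a1, a2 = b
    return aeval(a1, s) <= aeval(a2, s)

def ceval(c, s):
    """Commands: the judgment <c, s> => s'."""
    tag = c[0]
    if tag == "skip":
        return s
    if tag == "assign":
        return {**s, c[1]: aeval(c[2], s)}
    if tag == "seq":
        return ceval(c[2], ceval(c[1], s))
    if tag == "while":
        _, b, body = c
        while beval(b, s):
            s = ceval(body, s)
        return s
    raise ValueError(c)

# x := 1; while x <= 3 do (sum := sum + x; x := x + 1)
prog = ("seq", ("assign", "x", ("num", 1)),
        ("while", ("le", ("loc", "x"), ("num", 3)),
         ("seq",
          ("assign", "sum", ("add", ("loc", "sum"), ("loc", "x"))),
          ("assign", "x", ("add", ("loc", "x"), ("num", 1))))))
```

Running `ceval(prog, {})` computes the sum 1 + 2 + 3 in `sum`; each clause of `ceval` mirrors one rule of a structural operational semantics.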


Types and Programming Languages

2002, Benjamin Pierce

A comprehensive introduction to type systems and programming languages. A type system is a syntactic method for automatically checking the absence of certain erroneous behaviors by classifying program phrases according to the kinds of values they compute. The study of type systems--and of programming languages from a type-theoretic perspective--has important applications in software engineering, language design, high-performance compilers, and security.

This text provides a comprehensive introduction both to type systems in computer science and to the basic theory of programming languages. The approach is pragmatic and operational; each new concept is motivated by programming examples and the more theoretical sections are driven by the needs of implementations. Each chapter is accompanied by numerous exercises and solutions, as well as a running implementation, available via the Web. Dependencies between chapters are explicitly identified, allowing readers to choose a variety of paths through the material.

The core topics include the untyped lambda-calculus, simple type systems, type reconstruction, universal and existential polymorphism, subtyping, bounded quantification, recursive types, kinds, and type operators. Extended case studies develop a variety of approaches to modeling the features of object-oriented languages.
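The flavor of the core material can be seen in a checker for the simply typed lambda-calculus with booleans, one of the book's basic systems. This is an illustrative sketch in Python (the book's running implementations are in OCaml; the names here are not the book's code):

```python
# A checker for the simply typed lambda-calculus with booleans.
# Types are the string "Bool" or ("arrow", t1, t2); terms are
# tagged tuples; contexts are dicts from variables to types.

class IllTyped(Exception):
    pass

def typeof(t, ctx):
    tag = t[0]
    if tag in ("true", "false"):
        return "Bool"
    if tag == "if":
        _, t1, t2, t3 = t
        if typeof(t1, ctx) != "Bool":
            raise IllTyped("guard of conditional is not a Bool")
        ty2 = typeof(t2, ctx)
        if typeof(t3, ctx) != ty2:
            raise IllTyped("arms of conditional have different types")
        return ty2
    if tag == "var":
        if t[1] not in ctx:
            raise IllTyped("unbound variable: " + t[1])
        return ctx[t[1]]
    if tag == "abs":                      # lambda x: T. body
        _, x, ty1, body = t
        return ("arrow", ty1, typeof(body, {**ctx, x: ty1}))
    if tag == "app":
        ty_fun = typeof(t[1], ctx)
        ty_arg = typeof(t[2], ctx)
        if ty_fun[0] == "arrow" and ty_fun[1] == ty_arg:
            return ty_fun[2]
        raise IllTyped("ill-typed application")
    raise ValueError(t)

# (lambda x: Bool. if x then false else true) : Bool -> Bool
not_term = ("abs", "x", "Bool",
            ("if", ("var", "x"), ("false",), ("true",)))
```

Each `if tag == ...` branch transcribes one typing rule, which is why such checkers stay small even as the type system grows.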


Advanced Topics in Types and Programming Languages

2004, Benjamin Pierce

A thorough and accessible introduction to a range of key ideas in type systems for programming languages.

The study of type systems for programming languages now touches many areas of computer science, from language design and implementation to software engineering, network security, databases, and analysis of concurrent and distributed systems. This book offers accessible introductions to key ideas in the field, with contributions by experts on each topic.

The topics covered include precise type analyses, which extend simple type systems to give them a better grip on the run time behavior of systems; type systems for low-level languages; applications of types to reasoning about computer programs; type theory as a framework for the design of sophisticated module systems; and advanced techniques in ML-style type inference.

Advanced Topics in Types and Programming Languages builds on Benjamin Pierce's Types and Programming Languages (MIT Press, 2002); most of the chapters should be accessible to readers familiar with basic notations and techniques of operational semantics and type systems--the material covered in the first half of the earlier book.

Advanced Topics in Types and Programming Languages can be used in the classroom and as a resource for professionals. Most chapters include exercises, ranging in difficulty from quick comprehension checks to challenging extensions, many with solutions.


An Introduction to Functional Programming Through Lambda Calculus

2011, Greg Michaelson

  • Table of Contents
    • Preface
    • 1 Introduction
    • 2 Lambda Calculus
    • 3 Conditions, booleans, and numbers
    • 4 Recursion and arithmetic
    • 5 Types
    • 6 Lists and Strings
    • 7 Composite values and trees
    • 8 Evaluation
    • 9 Functional programming in Standard ML
    • 10 Functional programming and LISP

Functional programming is rooted in lambda calculus, which constitutes the world's smallest programming language. This well-respected text offers an accessible introduction to functional programming concepts and techniques for students of mathematics and computer science. The treatment is as nontechnical as possible, and it assumes no prior knowledge of mathematics or functional programming. Cogent examples illuminate the central ideas, and numerous exercises appear throughout the text, offering reinforcement of key concepts. All problems feature complete solutions.
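The Church encodings at the core of such a treatment fit in a few lines. Below, each pure lambda-term is written as an ordinary Python function (an illustrative sketch, not the book's ML or Lisp code):

```python
# Church encodings: booleans and numerals built from nothing but
# one-argument functions, mirroring pure lambda-terms.

# Booleans: true selects its first argument, false its second.
ctrue  = lambda x: lambda y: x
cfalse = lambda x: lambda y: y
cond   = lambda b: lambda t: lambda e: b(t)(e)

# Numerals: the numeral n means "apply f n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

def to_int(n):
    """Convert a Church numeral to an ordinary int."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
```

That conditionals and arithmetic can be recovered from functions alone is the sense in which the lambda calculus is "the world's smallest programming language".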


Lambda Calculus with Types

2014, Henk Barendregt

This handbook with exercises reveals unexpected mathematical beauty in formalisms hitherto mainly used for hardware and software design and verification. The lambda calculus forms a prototype universal programming language, which in its untyped version is related to Lisp, and was treated in the first author's classic The Lambda Calculus (1984). The formalism has since been extended with types and used in functional programming (Haskell, Clean) and in proof assistants (Coq, Isabelle, HOL), which are used in designing and verifying IT products and mathematical proofs. In this book, the authors focus on three classes of typing for lambda terms: simple types, recursive types and intersection types. It is in these three formalisms of terms and types that the unexpected mathematical beauty is revealed. The treatment is authoritative and comprehensive, complemented by an exhaustive bibliography, and numerous exercises are provided to deepen the readers' understanding and increase their confidence using types.


Lambda-calculus, Combinators and Functional Programming

2009, G. E. Revesz

Originally published in 1988, this book presents an introduction to lambda-calculus and combinators without getting lost in the details of the mathematical aspects of their theory. Lambda-calculus is treated here as a functional language and its relevance to computer science is clearly demonstrated. The main purpose of the book is to provide computer science students and researchers with a firm background in lambda-calculus and combinators and to show the applicability of these theories to functional programming. The presentation of the material is self-contained. It can be used as a primary text for a course on functional programming. It can also be used as a supplementary text for courses on the structure and implementation of programming languages, theory of computing, or semantics of programming languages.


Type Theory and Formal Proof: An Introduction

2016, Rob Nederpelt

Type theory is a fast-evolving field at the crossroads of logic, computer science and mathematics. This gentle step-by-step introduction is ideal for graduate students and researchers who need to understand the ins and outs of the mathematical machinery, the role of logical rules therein, the essential contribution of definitions and the decisive nature of well-structured proofs. The authors begin with untyped lambda calculus and proceed to several fundamental type systems, including the well-known and powerful Calculus of Constructions. The book also covers the essence of proof checking and proof development, and the use of dependent type theory to formalise mathematics. The only prerequisite is a basic knowledge of undergraduate mathematics. Carefully chosen examples illustrate the theory throughout. Each chapter ends with a summary of the content, some historical context, suggestions for further reading and a selection of exercises to help readers familiarise themselves with the material.


Engineering A Compiler

Cooper, Torczon

3rd Ed, 2022

Engineering a Compiler, Third Edition covers the latest developments in compiler technology, with new chapters focusing on semantic elaboration (the problems that arise in generating code from the ad-hoc syntax-directed translation schemes in a generated parser), on runtime support for naming and addressability, and on code shape for expressions, assignments and control-structures. Leading educators and researchers, Keith Cooper and Linda Torczon, have revised this popular text with a fresh approach to learning important techniques for constructing a modern compiler, combining basic principles with pragmatic insights from their own experience building state-of-the-art compilers.

2nd Ed, 2011

  • Table of Contents
  • 1 Overview of Compilation
    • 1.1 Introduction
    • 1.2 Compiler Structure
    • 1.3 Overview of Translation
  • 2 Scanners
    • 2.1 Introduction
    • 2.2 Recognizing words
    • 2.3 Regular Expressions
    • 2.4 From Regular Expression to Scanner
    • 2.5 Implementing Scanners
    • 2.6 Advanced Topics
  • 3 Parsers
    • 3.1 Introduction
    • 3.2 Expressing Syntax
    • 3.3 Top-Down Parsing
    • 3.4 Bottom-Up Parsing
    • 3.5 Practical Issues
    • 3.6 Advanced Topics
  • 4 Context-sensitive Analysis
    • 4.1 Introduction
    • 4.2 An Introduction to Type Systems
    • 4.3 An Attribute-Grammar Framework
    • 4.4 Ad Hoc Syntax-Directed Translation
    • 4.5 Advanced Topics
  • 5 Intermediate Representations
    • 5.1 Introduction
    • 5.2 Graphical IRs
    • 5.3 Linear IRs
    • 5.4 Mapping Values to Names
    • 5.5 Symbol Tables
  • 6 Procedure Abstraction
    • 6.1 Introduction
    • 6.2 Procedure Calls
    • 6.3 Name Spaces
    • 6.4 Communicating Values Between Procedures
    • 6.5 Advanced Topics
  • 7 Code Shape
    • 7.1 Introduction
    • 7.2 Assigning Storage Locations
    • 7.3 Arithmetic Operators
    • 7.4 Boolean and Relational Operators
    • 7.5 Storing and Accessing Arrays
    • 7.6 Character Strings
    • 7.7 Structure References
    • 7.8 Control Flow Constructs
    • 7.9 Procedure Calls
  • 8 Introduction to Optimization
    • 8.1 Introduction
    • 8.2 Background
    • 8.3 Scope of Optimizations
    • 8.4 Local Optimization
    • 8.5 Regional Optimization
    • 8.6 Global Optimization
    • 8.7 Interprocedural Optimization
  • 9 Data-Flow Analysis
    • 9.1 Introduction
    • 9.2 Iterative Data-Flow Analysis
    • 9.3 Static Single-Assignment Form
    • 9.4 Interprocedural Analysis
    • 9.5 Advanced Topics
  • 10 Scalar Optimization
    • 10.1 Introduction
    • 10.2 Eliminating Useless and Unreachable Code
    • 10.3 Code Motion
    • 10.4 Specialization
    • 10.5 Redundancy Elimination
    • 10.6 Finishing Other Transformations
    • 10.7 Advanced Topics
  • 11 Instruction Selection
    • 11.1 Introduction
    • 11.2 Code Generation
    • 11.3 Extending the Simple Treewalk Scheme
    • 11.4 Instruction Selection via Tree Pattern Matching
    • 11.5 Instruction Selection via Peephole Optimization
    • 11.6 Advanced Topics
  • 12 Instruction Scheduling
    • 12.1 Introduction
    • 12.2 The Instruction-Scheduling Problem
    • 12.3 Local List Scheduling
    • 12.4 Regional Scheduling
    • 12.5 Advanced Topics
  • 13 Register Allocation
    • 13.1 Introduction
    • 13.2 Background Issues
    • 13.3 Local Register Allocation and Assignment
    • 13.4 Global Register Allocation and Assignment
    • 13.5 Advanced Topics

This entirely revised second edition of Engineering a Compiler is full of technical updates and new material covering the latest developments in compiler technology. In this comprehensive text you will learn important techniques for constructing a modern compiler. Leading educators and researchers Keith Cooper and Linda Torczon combine basic principles with pragmatic insights from their experience building state-of-the-art compilers. They will help you fully understand important techniques such as compilation of imperative and object-oriented languages, construction of static single-assignment forms, instruction scheduling, and graph-coloring register allocation.


Crafting Interpreters

2020, Bob Nystrom


Introduction to Compilers and Language Design

2020, Douglas Thain


Compiling to Assembly from Scratch

2020, Vladimir Keleshev


Writing an Interpreter in Go / Writing a Compiler in Go

2017 / 2018, Thorsten Ball


Modern Compiler Design

2012, 2nd Ed, Grune

  • Table of Contents
    • 1 Introduction
    • Part I
      • 2 Program Text to Tokens - Lexical Analysis
      • 3 Tokens to Syntax Tree - Syntax Analysis
    • Part II
      • 4 Annotating the Abstract Syntax Tree
      • 5 Manual Context Handling
    • Part III
      • 6 Interpretation
      • 7 Code Generation
      • 8 Assemblers, Disassemblers, Linkers and Loaders
      • 9 Optimization Techniques
    • Part IV Memory Management
      • 10 Explicit and Implicit Memory Management
    • Part V From Abstract Syntax Tree to Intermediate Code
      • 11 Imperative and Object-Oriented Programs
      • 12 Functional Programs
      • 13 Logic Programs
      • 14 Parallel and Distributed Programs

"Modern Compiler Design" makes the topic of compiler design more accessible by focusing on principles and techniques of wide application. By carefully distinguishing between the essential (material that has a high chance of being useful) and the incidental (material that will be of benefit only in exceptional cases) much useful information was packed in this comprehensive volume. The student who has finished this book can expect to understand the workings of and add to a language processor for each of the modern paradigms, and be able to read the literature on how to proceed. The first provides a firm basis, the second potential for growth.


Implementing Programming Languages. An Introduction to Compilers and Interpreters

2012, Ranta

  • Table of Contents
    • 1 Compilation Phases
    • 2 Grammars
    • 3 Lexing and Parsing
    • 4 Type Checking
    • 5 Interpreters
    • 6 Code Generation
    • 7 Functional Programming Languages
    • 8 The Language Design Space
    • A BNFC Quick Reference
    • B Some JVM instructions
    • C Summary of the Assignments
    • D Further Reading

Implementing a programming language means bridging the gap from the programmer's high-level thinking to the machine's zeros and ones. If this is done in an efficient and reliable way, programmers can concentrate on the actual problems they have to solve, rather than on the details of machines. But understanding the whole chain from languages to machines is still an essential part of the training of any serious programmer. It will result in a more competent programmer, who will moreover be able to develop new languages. A new language is often the best way to solve a problem, and less difficult than it may sound. This book follows a theory-based practical approach, where theoretical models serve as blueprint for actual coding. The reader is guided to build compilers and interpreters in a well-understood and scalable way. The solutions are moreover portable to different implementation languages. Much of the actual code is automatically generated from a grammar of the language, by using the BNF Converter tool. The rest can be written in Haskell or Java, for which the book gives detailed guidance, but with some adaptation also in C, C++, C#, or OCaml, which are supported by the BNF Converter. The main focus of the book is on standard imperative and functional languages: a subset of C++ and a subset of Haskell are the source languages, and Java Virtual Machine is the main target. Simple Intel x86 native code compilation is shown to complete the chain from language to machine. The last chapter leaves the standard paths and explores the space of language design ranging from minimal Turing-complete languages to human-computer interaction in natural language.
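The code-generation step of such a pipeline can be sketched briefly. The fragment below (Python; the instruction names are made-up stand-ins for JVM bytecode, not BNFC output) flattens expression trees to stack-machine code by post-order traversal, plus a tiny machine to run the result:

```python
# Code generation for arithmetic expressions: a tree is flattened
# to stack-machine instructions in the style of JVM bytecode.
# Operands are emitted first, then the operator (post-order).

def compile_expr(e):
    tag = e[0]
    if tag == "num":
        return [("push", e[1])]
    if tag == "add":
        return compile_expr(e[1]) + compile_expr(e[2]) + [("iadd",)]
    if tag == "mul":
        return compile_expr(e[1]) + compile_expr(e[2]) + [("imul",)]
    raise ValueError(e)

def exec_code(code):
    """A tiny stack machine, enough to check the generated code."""
    stack = []
    for instr in code:
        if instr[0] == "push":
            stack.append(instr[1])
        elif instr[0] == "iadd":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif instr[0] == "imul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    (result,) = stack
    return result

# (1 + 2) * 3
code = compile_expr(("mul", ("add", ("num", 1), ("num", 2)),
                     ("num", 3)))
```

Post-order emission is exactly what makes stack machines such as the JVM convenient compilation targets: no register assignment is needed for expression evaluation.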


Language Implementation Patterns

2010, Parr

  • Table of Contents
    • Part I Getting Started with Parsing
      • 1 Language Applications Cracked Open
      • 2 Basic Parsing Patterns
      • 3 Enhanced Parsing Patterns
    • Part II Analyzing Languages
      • 4 Building Intermediate Form Trees
      • 5 Walking and Rewriting Trees
      • 6 Tracking and Identifying Program Symbols
      • 7 Managing Symbol Tables for Data Aggregates
      • 8 Enforcing Static Typing Rules
    • Part III Building Interpreters
      • 9 Building High-Level Interpreters
      • 10 Building Bytecode Interpreters
    • Part IV Translating and Generating Languages
      • 11 Translating Computer Languages
      • 12 Generating DSLs with Templates
      • 13 Putting It All Together

Knowing how to create domain-specific languages (DSLs) can give you a huge productivity boost. Instead of writing code in a general-purpose programming language, you can first build a custom language tailored to make you efficient in a particular domain. The key is understanding the common patterns found across language implementations. "Language Implementation Patterns" identifies and condenses the most common design patterns, providing a sample implementation of each. The pattern implementations use Java, but the patterns themselves are completely general. Some of the implementations use the well-known ANTLR parser generator, so readers will find this book an excellent source of ANTLR examples as well. But this book will benefit anyone interested in implementing languages, regardless of their tool of choice. Other language implementation books focus on compilers, which you rarely need in your daily life. Instead, "Language Implementation Patterns" shows you patterns you can use for all kinds of language applications. You'll learn to create configuration file readers, data readers, model-driven code generators, source-to-source translators, source analyzers, and interpreters. Each chapter groups related design patterns and, for each pattern, you'll get hands-on experience by building a complete sample implementation. By the time you finish the book, you'll know how to solve most common language implementation problems.
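
The recursive-descent technique behind the book's basic parsing patterns can be sketched in a few lines of Java. This is an illustrative example, not code from the book: the `ExprParser` class and its tiny expression grammar are invented here, and a real implementation (or an ANTLR-generated parser) would add proper tokenization and error reporting.

```java
// Minimal LL(1) recursive-descent parser/evaluator for arithmetic
// expressions: one method per grammar rule, one token of lookahead.
public class ExprParser {
    private final String src;
    private int pos;

    public ExprParser(String src) { this.src = src; }

    public int parse() {
        int v = expr();
        if (pos < src.length()) throw new RuntimeException("trailing input at " + pos);
        return v;
    }

    // expr := term (('+' | '-') term)*
    private int expr() {
        int v = term();
        while (peek() == '+' || peek() == '-') {
            char op = next();
            int r = term();
            v = (op == '+') ? v + r : v - r;
        }
        return v;
    }

    // term := factor (('*' | '/') factor)*
    private int term() {
        int v = factor();
        while (peek() == '*' || peek() == '/') {
            char op = next();
            int r = factor();
            v = (op == '*') ? v * r : v / r;
        }
        return v;
    }

    // factor := NUMBER | '(' expr ')'
    private int factor() {
        if (peek() == '(') {
            next();
            int v = expr();
            if (next() != ')') throw new RuntimeException("expected ')'");
            return v;
        }
        int start = pos;
        while (Character.isDigit(peek())) pos++;
        return Integer.parseInt(src.substring(start, pos));
    }

    private char peek() { return pos < src.length() ? src.charAt(pos) : '\0'; }
    private char next() { return src.charAt(pos++); }

    public static void main(String[] args) {
        System.out.println(new ExprParser("2*(3+4)").parse()); // prints 14
    }
}
```

Each nonterminal becomes a method, and a single character of lookahead decides which rule to apply; this is the shape that the pattern-oriented chapters generalize to table-driven and backtracking variants.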


Writing Compilers and Interpreters: A Software Engineering Approach

2009, Mak, 3rd Ed

  • Github
  • Table of Contents
    • Chapter 1: Introduction
    • Chapter 2: Framework I: Compiler and Interpreter
    • Chapter 3: Scanning
    • Chapter 4: The Symbol Table
    • Chapter 5: Parsing Expressions and Assignment Statements
    • Chapter 6: Interpreting Expressions and Assignment Statements
    • Chapter 7: Parsing Control Statements
    • Chapter 8: Interpreting Control Statements
    • Chapter 9: Parsing Declarations
    • Chapter 10: Type Checking
    • Chapter 11: Parsing Programs, Procedures, and Functions
    • Chapter 12: Interpreting Pascal Programs
    • Chapter 13: An Interactive Source-Level Debugger
    • Chapter 14: Framework II: An Integrated Development Environment (IDE)
    • Chapter 15: Jasmin Assembly Language and Code Generation for the Java Virtual Machine
    • Chapter 16: Compiling Programs, Assignment Statements, and Expressions
    • Chapter 17: Compiling Procedure and Function Calls and String Operations
    • Chapter 18: Compiling Control Statements, Arrays, and Records
    • Chapter 19: Additional Topics

Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this latest edition teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine and taking advantage of the Java Collections Framework to shorten and simplify the code. Coverage also includes UML modeling, object-oriented programming with design patterns, working with XML intermediate code, and more. Compilers and interpreters are very difficult programs to write, but modern software engineering tames that complexity: design patterns and other object-oriented techniques guide you toward well-structured code built in incremental, understandable steps, skills that carry over to any complex software project. You'll learn to:

  • Use Java to develop scanners and parsers for programming languages
  • Employ UML to model software components
  • Manage symbol tables with the Java Collections Framework
  • Use XML to represent the generated intermediate code
  • Develop an interpreter to execute programs, including a powerful interactive source-level debugger
  • Implement an integrated development environment (IDE) that animates the execution of programs
  • Use the IDE's graphical user interface to set breakpoints and single-step programs statement by statement with mouse clicks
  • Develop a code generator that emits object code for the Java Virtual Machine (JVM), and run the compiled code on multiple platforms
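
To give a flavor of the scanning stage covered in Chapter 3, here is a minimal hand-written scanner sketch in Java. It is a hypothetical illustration, not the book's code: the `TokenScanner` class and its three ad-hoc token categories are invented here, and the book's scanner additionally classifies tokens and tracks source positions for error reporting.

```java
import java.util.ArrayList;
import java.util.List;

// A minimal scanner: split source text into identifier, number,
// and single-character symbol tokens, skipping whitespace.
public class TokenScanner {
    public static List<String> scan(String src) {
        List<String> tokens = new ArrayList<>();
        int i = 0;
        while (i < src.length()) {
            char c = src.charAt(i);
            if (Character.isWhitespace(c)) { i++; continue; }
            int start = i;
            if (Character.isLetter(c)) {        // identifier or keyword
                while (i < src.length() && Character.isLetterOrDigit(src.charAt(i))) i++;
            } else if (Character.isDigit(c)) {  // integer literal
                while (i < src.length() && Character.isDigit(src.charAt(i))) i++;
            } else {                            // single-character symbol
                i++;
            }
            tokens.add(src.substring(start, i));
        }
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(scan("x := x + 42")); // prints [x, :, =, x, +, 42]
    }
}
```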

Compiler Design. Theory, Tools, Examples

2010, Bergmann

  • Homepage
  • Table of Contents
    • 1 Introduction
    • 2 Lexical Analysis
    • 3 Syntax Analysis
    • 4 Top-Down Parsing
    • 5 Bottom-Up Parsing
    • 6 Code Generation
    • 7 Optimization

Basics of Compiler Design

2010, Torben Mogensen

  • Homepage
  • Table of Contents
    • 1 Introduction
    • 2 Lexical Analysis
    • 3 Syntax Analysis
    • 4 Scopes and Symbol Tables
    • 5 Interpretation
    • 6 Type Checking
    • 7 Intermediate-Code Generation
    • 8 Machine Code Generation
    • 9 Register Allocation
    • 10 Function Calls
    • 11 Analysis and Optimization
    • 12 Memory Management
    • 13 Bootstrapping a Compiler

Compilers. Principles, Techniques and Tools

2006, 2nd Ed, Aho, Lam, Sethi, Ullman

  • Table of Contents
    • 1 Introduction
      • 1.1 Language Processors
      • 1.2 The structure of a compiler
        • 1.2.1 Lexical analysis
        • 1.2.2 Syntax analysis
        • 1.2.3 Semantic analysis
        • 1.2.4 Intermediate code generation
        • 1.2.5 Code optimization
        • 1.2.6 Code generation
        • 1.2.7 Symbol-table management
        • 1.2.8 The grouping of phases into passes
        • 1.2.9 Compiler-construction tools
      • 1.3 The evolution of programming languages
    • 2 A simple syntax-directed translator
    • 3 Lexical analysis
    • 4 Syntax analysis
    • 5 Syntax-directed translation
    • 6 Intermediate code generation
    • 7 Run-time environments
      • Storage organization
        • Static versus dynamic storage allocation
      • Stack allocation of space
        • Activation trees
        • Activation records
        • Calling sequences
        • Variable-length data on the stack
      • Access to nonlocal data on the stack
        • Data access without nested procedures
        • Issues with nested procedures
        • A language with nested procedure declarations
        • Nesting depth
        • Access links
        • Manipulating access links
        • Displays
      • Heap management
        • The memory manager
        • The memory hierarchy of a computer
        • Locality in programs
          • Optimization using the memory hierarchy
        • Reducing fragmentation
          • Best-fit and next-fit object placement
          • Managing and coalescing free space
        • Manual deallocation requests
          • Problems with manual deallocation
          • Programming conventions and tools
      • Introduction to garbage collection
        • Design goals for garbage collection
          • Basic requirement: type safety
          • Performance metrics
        • Reachability
        • Reference counting garbage collection
      • Introduction to trace-based collection
        • A basic mark-and-sweep collector
        • Basic abstraction
        • Optimizing mark-and-sweep
        • Mark-and-compact garbage collection
        • Copying collectors
        • Comparing costs
      • Short-pause garbage collection
        • Incremental garbage collection
          • Precision of incremental collection
        • Incremental reachability analysis
          • Implementing write barriers
          • Combining incremental and copying techniques
        • Partial-collection basics
        • Generational garbage collection
        • The train algorithm
          • Remembered sets
          • Managing trains
          • Garbage collecting a car
          • Panic mode
      • Advanced topics in garbage collection
        • Parallel and concurrent garbage collection
        • Partial object relocation
        • Conservative collection for unsafe languages
        • Weak references
      • Summary
    • 8 Code generation
      • Issues in the design of a code generator
        • Input to the code generator
        • The target program
        • Instruction selection
        • Register allocation
      • The target language
        • A simple target machine model
        • Program and instruction costs
      • Addresses in the target code
        • Static allocation
        • Stack allocation
        • Run-time addresses for names
      • Basic blocks and flow graphs
        • Basic blocks
        • Next-use information
        • Flow graphs
        • Representation of flow graphs
        • Loops
      • Optimization of basic blocks
        • The DAG representation of basic blocks
        • Finding local common subexpressions
        • Dead code elimination
        • The use of algebraic identities
        • Representation of array references
        • Pointer assignments and procedure calls
        • Reassembling basic blocks from DAGs
      • A simple code generator
        • Register and address descriptors
        • The code-generation algorithm
          • Machine instructions for copy statements
          • Ending the basic block
          • Managing register and address descriptors
          • Design of the function getReg
      • Peephole optimization
        • Eliminating redundant loads and stores
        • Eliminating unreachable code
        • Flow-of-control optimizations
        • Algebraic simplification and reduction in strength
        • Use of machine idioms
      • Register allocation and assignment
        • Global register allocation
        • Usage counts
        • Register assignment for outer loops
        • Register allocation by graph coloring
      • Instruction selection by tree-rewriting
        • Tree-translation scheme
        • Code generation by tiling an input tree
        • Pattern matching by parsing
        • Routines for semantic checking
        • General tree matching
      • Code generation for expressions
        • Ershov numbers
        • Generating code from labeled expression trees
        • Evaluating expressions with an insufficient supply of registers
      • Dynamic programming code-generation
        • Contiguous evaluation
        • The dynamic programming algorithm
      • Summary
    • 9 Machine-independent optimizations
    • 10 Instruction-level parallelism
    • 11 Optimizing for parallelism and locality
    • Appendix A A Complete front end
    • Appendix B Finding linearly independent solutions

Optimizing Compilers for Modern Architectures

2005, Allen, Kennedy

  • Table of Contents
    • 1 Compiler Challenges for High-Performance Architectures
    • 2 Dependence: Theory and Practice
    • 3 Dependence Testing
    • 4 Preliminary Transformations
    • 5 Enhancing Fine-Grained Parallelism
    • 6 Creating Coarse-Grained Parallelism
    • 7 Control Dependence
    • 8 Compiler Improvement of Register Usage
    • 9 Cache Management
    • 10 Scheduling
    • 11 Interprocedural Analysis and Optimization
    • 12 Other Applications of Dependence
    • 13 Compiling Array Assignments
    • 14 Compiling High-Performance Fortran

Compiler Design Handbook. Optimizations and Machine Code Generation

2003, Srikant, Shankar

  • Table of Contents
    • 1 Data Flow Analysis
    • 2 Automatic Generation of Code Optimizers from Formal Specifications
    • 3 Scalar Compiler Optimizations on the Static Single Assignment Form
    • 4 Profile-Guided Compiler Optimizations
    • 5 Shape Analysis and Applications
    • 6 Optimizations for Object-Oriented Languages
    • 7 Data Flow Testing
    • 8 Program Slicing
    • 9 Debuggers for Programming Languages
    • 10 Dependence Analysis and Parallelization Transformations
    • 11 Compilation for Distributed Memory Architectures
    • 12 Automatic Data Distribution
    • 13 Register Allocation
    • 14 Architecture Description Language for Retargetable Compilation
    • 15 Instruction Selection Using Tree Parsing
    • 16 Retargetable Very Long Instruction Word Compiler Framework
    • 17 Instruction Scheduling
    • 18 Software Pipelining
    • 19 Dynamic Compilation
    • 20 Compiling Safe Mobile Code
    • 21 Type Systems in Programming Languages
    • 22 Introduction to Operational Semantics

The Optimal Implementation of Functional Programming Languages

1999, Asperti, Guerrini

  • Table of Contents
    • 1 Introduction
    • 2 Optimal Reduction
    • 3 The Full Algorithm
    • 4 Optimal Reductions and Linear Logic
    • 5 Redex Families and Optimality
    • 6 Paths
    • 7 Read-Back
    • 8 Other Translations in Sharing Graphs
    • 9 Safe Nodes
    • 10 Complexity
    • 11 Functional Programming
    • 12 The Bologna Optimal Higher-Order Machine

Modern Compiler Implementation in C

1998, Appel

  • Table of Contents
    • Part I Fundamentals of Compilation
      • 1 Introduction
      • 2 Lexical Analysis
      • 3 Parsing
      • 4 Abstract Syntax
      • 5 Semantic Analysis
      • 6 Activation Records
      • 7 Translation to Intermediate Code
      • 8 Basic Blocks and Traces
      • 9 Instruction Selection
      • 10 Liveness Analysis
      • 11 Register Allocation
      • 12 Putting it All Together
    • Part II Advanced Topics
      • 13 Garbage Collection
      • 14 Object-Oriented Languages
      • 15 Functional Programming Languages
      • 16 Polymorphic Types
      • 17 Dataflow Analysis
      • 18 Loop Optimization
      • 19 Static Single Assignment Form
      • 20 Pipelining and Scheduling
      • 21 The Memory Hierarchy

Modern Compiler Implementation in ML

1998, Appel

  • Table of Contents
    • Part I Fundamentals of Compilation
      • 1 Introduction
      • 2 Lexical Analysis
      • 3 Parsing
      • 4 Abstract Syntax
      • 5 Semantic Analysis
      • 6 Activation Records
      • 7 Translation to Intermediate Code
      • 8 Basic Blocks and Traces
      • 9 Instruction Selection
      • 10 Liveness Analysis
      • 11 Register Allocation
      • 12 Putting it All Together
    • Part II Advanced Topics
      • 13 Garbage Collection
      • 14 Object-Oriented Languages
      • 15 Functional Programming Languages
      • 16 Polymorphic Types
      • 17 Dataflow Analysis
      • 18 Loop Optimization
      • 19 Static Single Assignment Form
      • 20 Pipelining and Scheduling
      • 21 The Memory Hierarchy

Modern Compiler Implementation in Java

1998, Appel


Building an Optimizing Compiler

1997, Morgan

  • Table of Contents
    • 1 Overview
    • 2 Compiler Structure
    • 3 Graphs
    • 4 Flow Graph
    • 5 Local Optimization
    • 6 Alias Analysis
    • 7 Static Single Assignment
    • 8 Dominator-Based Optimization
    • 9 Advanced Techniques
    • 10 Global Optimization
    • 11 Limiting Resources
    • 12 Scheduling and Rescheduling
    • 13 Register Allocation
    • 14 The Object Module
    • 15 Completion and Futures

Compiler Construction (Wirth)

1996, Wirth

  • Homepage
  • Table of Contents
    • 1 Introduction
    • 2 Language and Syntax
    • 3 Regular Languages
    • 4 Analysis of Context-Free Languages
    • 5 Attributed Grammars and Semantics
    • 6 The Programming Language Oberon-0
    • 7 A Parser for Oberon-0
    • 8 Consideration of Context Specified by Declarations
    • 9 A RISC Architecture as a Target
    • 10 Expressions and Assignments
    • 11 Conditional and Repeated Statements and Boolean Expressions
    • 12 Procedures and the Concept of Locality
    • 13 Elementary Data Types
    • 14 Open Arrays, Pointers, and Procedure Types
    • 15 Modules and Separate Compilation
    • 16 Code Optimizations and the Frontend/Backend Structure

Compiler Construction (Waite, Goos)

1995, Waite, Goos

  • Table of Contents
    • 1 Introduction and Overview
    • 2 Properties of Programming Languages
    • 3 Properties of Real and Abstract Machines
    • 4 Abstract Program Representation
    • 5 Elements of Formal Systems
    • 6 Lexical Analysis
    • 7 Parsing
    • 8 Attribute Grammars
    • 9 Semantic Analysis
    • 10 Code Generation
    • 11 Assembly
    • 12 Error Handling
    • 13 Optimization
    • 14 Implementation

Let's Build a Compiler

1995, Jack Crenshaw


The Functional Treatment of Parsing

1993, Leermakers

  • Table of Contents
    • 1 Context-Free Grammars
    • 2 Bunch Notation
    • 3 Grammar Interpretations
    • 4 Recursive Descent
    • 5 Grammar Transformations
    • 6 Recursive Ascent
    • 7 Parse Forest
    • 8 Attribute Grammars
    • 9 LR Parsers
    • 10 Some Notes

Parsing Techniques. A Practical Guide

1990, Grune, Jacobs

  • Table of Contents
    • 1 Introduction
      • Parsing as a craft
      • The approach used
      • Outline of the contents
      • The annotated bibliography
    • 2 Grammars as a generating device
      • Languages as infinite sets
      • Formal grammars
      • The Chomsky hierarchy of grammars and languages
      • VW grammars
      • Actually generating sentences from a grammar
      • To shrink or not to shrink
      • A characterization of the limits of CF and FS grammars
      • Hygiene in grammars
      • The semantic connection
      • A metaphorical comparison of grammar types
    • 3 Introduction to parsing
      • Various kinds of ambiguity
      • Linearization of the parse tree
      • Two ways to parse a sentence
      • Non-deterministic automata
      • Recognition and parsing for type 0 to type 4 grammars
      • An overview of parsing methods
    • 4 General one-directional methods
      • Unger's parsing method
      • The CYK parsing method
    • 5 Regular grammars and finite state machines
      • Applications of regular grammars
      • Producing from a regular grammar
      • Parsing with a regular grammar
    • 6 General directional top-down methods
      • Imitating leftmost productions
      • The pushdown automaton
      • Breadth-first top-down parsing
      • Eliminating left-recursion
      • Depth-first (backtracking) parsers
      • Recursive descent
      • Definite clause grammars
    • 7 General bottom-up parsing
      • Parsing by searching
      • Top-down restricted breadth-first bottom-up parsing
    • 8 Deterministic top-down methods
      • Replacing search by table look-up
      • LL(1) grammars
      • LL(k) grammars
      • Extended LL(1) grammars
    • 9 Deterministic bottom-up parsing
      • Simple handle-isolating techniques
      • Precedence parsing
      • Bounded-context parsing
      • LR methods
      • LR(1)
      • LALR(1) parsing
      • Further developments of LR methods
      • Tomita's parser
      • Non-canonical parsers
      • LR(k) as an ambiguity test
    • 10 Error handling
      • Detection versus recovery versus correction
      • Parsing techniques and error detection
      • Recovering from errors
      • Global error handling
      • Ad hoc methods
      • Regional error handling
      • Local error handling
      • Suffix parsing
    • 11 Comparative survey
      • Considerations
      • General parsers
      • Linear-time parsers
    • 12 A simple general context-free parser
      • Principles of the parser
      • The program
      • Parsing in polynomial time
    • 13 Annotated bibliography

Compiler Design in C

1990, Holub

  • Table of Contents
    • 1 Basic Concepts
      • 1.1 The Parts of the Compiler
      • 1.2 Representing Computer Languages
      • 1.3 A Recursive-Descent Expression Compiler
    • 2 Input and Lexical Analysis
      • 2.1 The Lexical Analyzer as Part of a Compiler
      • 2.2 Error Recovery in Lexical Analysis
      • 2.3 Input Systems
      • 2.4 Lexical Analysis
      • 2.5 Lex - a Lexical-Analyzer Generator
    • 3 Context-Free Grammars
      • 3.1 Sentences, Phrases, and Context-Free Grammars
      • 3.2 Derivations and Sentential Forms
      • 3.3 Parse Trees and Semantic Difficulties
      • 3.4 Epsilon Productions
      • 3.5 The End-of-Input Marker
      • 3.6 Right-Linear Grammars
      • 3.7 Lists, Recursion, and Associativity
      • 3.8 Expressions
      • 3.9 Ambiguous Grammars
      • 3.10 Syntax-Directed Translation
      • 3.11 Representing Generic Grammars
    • 4 Top-Down Parsing
      • 4.1 Push-Down Automata
      • 4.2 Using a PDA for a Top-Down Parse
      • 4.3 Error Recovery in a Top-Down Parser
      • 4.4 Augmented Grammars and Table-Driven Parsers
      • 4.5 Automating the Top-Down Parse Process
      • 4.6 LL(1) Grammars and Their Limitations
      • 4.7 Making the Parse Tables
      • 4.8 Modifying Grammars
      • 4.9 Implementing LL(1) Grammars
      • 4.10 LLama - Implementing an LL(1) Parser-Generator
    • 5 Bottom-Up Parsing
      • 5.1 How Bottom-Up Parsing Works
      • 5.2 Recursion in Bottom-Up Parsing
      • 5.3 Implementing the Parser as a State Machine
      • 5.4 Error Recovery in an LR Parser
      • 5.5 The Value Stack and Attribute Processing
      • 5.6 Creating LR Parse Tables - Theory
      • 5.7 Representing LR State Tables
      • 5.8 Eliminating Single-Reduction States
      • 5.9 Using Ambiguous Grammars
      • 5.10 Implementing an LALR(1) Parser - The Occs Output File
      • 5.11 Implementing an LALR(1) Parser Generator - Occs Internals
      • 5.12 Parser-File Generation
      • 5.13 Generating LALR(1) Parse Tables
    • 6 Code Generation
      • 6.1 Intermediate Languages
      • 6.2 C-Code: An Intermediate Language and Virtual Machine
      • 6.3 The Symbol Table
      • 6.4 The Parser: Configuration
      • 6.5 The Lexical Analyzer
      • 6.6 Declarations
      • 6.7 The gen() Subroutine
      • 6.8 Expressions
      • 6.9 Statements and Control Flow
    • 7 Optimization Strategies
      • 7.1 Parser Optimizations
      • 7.2 Linear (Peephole) Optimizations
      • 7.3 Structural Optimizations
      • 7.4 Aliasing Problems

The Implementation of Functional Programming Languages

1987, Peyton Jones

  • Homepage
  • Table of Contents
    • 1 Introduction
    • Part I Compiling High-Level Functional Languages
      • 2 The Lambda Calculus
      • 3 Translating a High-Level Functional Language into the Lambda Calculus
      • 4 Structured Types and the Semantics of Pattern-Matching
      • 5 Efficient Compilation of Pattern-Matching
      • 6 Transforming The Enriched Lambda Calculus
      • 7 List Comprehensions
      • 8 Polymorphic Type-Checking
      • 9 A Type-Checker
    • Part II Graph Reduction
      • 10 Program Representation
      • 11 Selecting the Redex
      • 12 Graph reduction of Lambda Expressions
      • 13 Supercombinators and Lambda-lifting
      • 14 Recursive Supercombinators
      • 15 Fully-Lazy Lambda-Lifting
      • 16 SK Combinators
      • 17 Storage Management and Garbage Collections
    • Part III Advanced Graph Reduction
      • 18 The G-Machine
      • 19 G-Code Definition and Implementation
      • 20 Optimizations to the G-Machine
      • 21 Optimizing Generalized Tail Calls
      • 22 Strictness Analysis

The Design and Construction of Compilers

1982, Hunter

  • Table of Contents
    • 1 The compilation process
    • 2 Language definition
    • 3 Lexical analysis
    • 4 Context-free grammars and top-down syntax analysis
    • 5 Bottom-up syntax analysis
    • 6 Embedding actions in syntax
    • 7 Compiler design
    • 8 Symbol and mode tables
    • 9 Storage allocation
    • 10 Code generation
    • 11 Generation of machine code
    • 12 Error recovery and diagnostics
    • 13 Writing reliable compilers

Writing Interactive Compilers and Interpreters

1979, Brown

  • Table of Contents
    • 1 Planning
      • 1.1 Why interactive?
      • 1.2 Planning use of resources
      • 1.3 Documentation
      • 1.4 Designing the source language and the user interface
      • 1.5 Encoding the compiler
    • 2 The Structure of the Compiler
      • 2.1 Filling the gaps
      • 2.2 Description of terminology and environment
      • 2.3 Source and internal languages
      • 2.4 Incremental compiling
      • 2.5 Re-creating the source program
      • 2.6 Levels of internal language
      • 2.7 True compilers
      • 2.8 Error checking
      • 2.9 Error messages
      • 2.10 Names, scope, and data type
      • 2.11 Dictionaries and tables
      • 2.12 Storage management
      • 2.13 The editor
      • 2.14 Input and output
      • 2.15 Break-ins
      • 2.16 Summary of design
    • 3 The Design of an Internal Language
      • 3.1 Reverse polish notation
      • 3.2 Operators
      • 3.3 Encoding reverse polish
      • 3.4 A brief summary
    • 4 The Translator
      • 4.1 Overall translator organization
      • 4.2 Lexical analysis
      • 4.3 Grammars
      • 4.4 Using grammars for parsing
      • 4.5 Checking and resolving data types
      • 4.6 Semantic actions
    • 5 The Run-Time System
      • 5.1 Error detection and diagnosis
      • 5.2 Executing reverse polish
      • 5.3 Allocating and referencing user variables
      • 5.4 Execution of statements
      • 5.5 String temporaries
    • 6 Other Modules
      • 6.1 The pre-run module
      • 6.2 The re-creator module
      • 6.3 The command module
    • 7 Testing and Issuing
      • 7.1 Testing the compiler
      • 7.2 Issuing
    • 8 Some Advanced and Specialized Topics
      • 8.1 Some special compilers
      • 8.2 Dynamic compiling

Assemblers, Compilers, and Program Translation

1979, Calingaert

  • Table of Contents
    • 1 Overview
    • 2 Assembly
    • 3 Program Modules
    • 4 Macro Processing
    • 5 Program Translation
    • 6 Initial Source-Language Processing
    • 7 Compilation
    • 8 Preparation for Execution

Compiler Construction: An Advanced Course

1974, Bauer, Eickel

  • Table of Contents
    • 1 Introduction
    • 2 Analysis
      • Review of formalisms and notation
      • LL(1) grammars and analysers
      • LR grammars and analysers
      • Lexical analysis
      • Transformational grammars
      • Two-level grammars
      • Semantic analysis
    • 3 Synthesis
      • Relationship of languages to machines
      • Run-time storage management
      • Special run-time organization techniques for Algol 68
      • Symbol table access
      • Code generation
      • Assembly and linkage
    • 4 Compiler-Compiler
      • Introduction to compiler-compilers
      • Using the CDL compiler-compiler
    • 5 Engineering a compiler
      • Portable and adaptable compilers
      • Structuring compiler development
      • Programming language design
      • What the compiler should tell the user
      • Optimization
    • 6 Appendix
      • Historical remarks on compiler construction

The Theory of Parsing, Translation, and Compiling

1972, Aho, Ullman

Volume 1 Parsing

  • Table of Contents
    • 0 Mathematical Preliminaries
      • Concepts from Set Theory
      • Sets of Strings
      • Concepts from Logic
      • Procedures and Algorithms
      • Concepts from Graph Theory
    • 1 An Introduction to Compiling
      • Programming Languages
      • An Overview of Compiling
      • Other Applications of Parsing and Translating Algorithms
    • 2 Elements of Language Theory
      • Representations for Languages
      • Regular Sets, Their Generators, and Their Recognizers
      • Properties of Regular Sets
      • Context-Free Languages
      • Pushdown Automata
      • Properties of Context-Free Languages
    • 3 Theory of Translation
      • Formalisms for Translations
      • Properties of Syntax-Directed Translations
      • Lexical Analysis
      • Parsing
    • 4 General Parsing Methods
      • Backtrack Parsing
      • Tabular Parsing Methods
    • 5 One-pass No Backtrack Parsing
      • LL(k) Grammars
      • Deterministic Bottom-Up Parsing
      • Precedence Grammars
      • Other Classes of Shift-Reduce Parsable Grammars
    • 6 Limited Backtrack Parsing Algorithms
      • Limited Backtrack Top-Down Parsing
      • Limited Backtrack Bottom-Up Parsing

Volume 2 Compiling

  • Table of Contents
    • 7 Techniques for Parser Optimization
      • Linear Precedence Functions
      • Optimization of Floyd-Evans Parsers
      • Transformations on Sets of LR(k) Tables
      • Techniques for Constructing LR(k) Parsers
      • Parsing Automata
    • 8 Theory of Deterministic Parsing
      • Theory of LL Languages
      • Classes of Grammars Generating the Deterministic Languages
      • Theory of Simple Precedence Languages
    • 9 Translation and Code Generation
      • The Role of Translation in Compiling
      • Syntax-Directed Translation
      • Generalized Translation Schemes
    • 10 Bookkeeping
      • Symbol Tables
      • Property Grammars
    • 11 Code Optimization
      • Optimization of Straight-Line Code
      • Arithmetic Expressions
      • Programs With Loops
      • Data Flow Analysis