Note: this is a truncated preview (455K chars total); the extract below ends mid-file.
Repository: lazear/types-and-programming-languages
Branch: master
Commit: 0787493713b4
Files: 97
Total size: 427.6 KB

Directory structure:
gitextract_94czkswh/

├── .gitattributes
├── .github/
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.md
│   │   └── feature_request.md
│   └── workflows/
│       └── rust.yml
├── .gitignore
├── .rustfmt.toml
├── .travis.yml
├── 01_arith/
│   ├── Cargo.toml
│   └── src/
│       ├── lexer.rs
│       ├── main.rs
│       └── parser.rs
├── 02_lambda/
│   ├── Cargo.toml
│   └── src/
│       ├── context.rs
│       ├── lexer.rs
│       ├── main.rs
│       └── parser.rs
├── 03_typedarith/
│   ├── Cargo.toml
│   └── src/
│       ├── ast.rs
│       ├── lexer.rs
│       ├── main.rs
│       └── parser.rs
├── 04_stlc/
│   ├── .gitignore
│   ├── Cargo.toml
│   └── src/
│       ├── eval.rs
│       ├── lexer.rs
│       ├── main.rs
│       ├── parser.rs
│       ├── term.rs
│       ├── typing.rs
│       └── visitor.rs
├── 05_recon/
│   ├── Cargo.toml
│   └── src/
│       ├── disjoint.rs
│       ├── main.rs
│       ├── mutation/
│       │   ├── mod.rs
│       │   └── write_once.rs
│       ├── naive.rs
│       ├── parser.rs
│       └── types.rs
├── 06_system_f/
│   ├── Cargo.toml
│   ├── README.md
│   ├── src/
│   │   ├── diagnostics.rs
│   │   ├── eval.rs
│   │   ├── macros.rs
│   │   ├── main.rs
│   │   ├── patterns/
│   │   │   └── mod.rs
│   │   ├── syntax/
│   │   │   ├── lexer.rs
│   │   │   ├── mod.rs
│   │   │   └── parser.rs
│   │   ├── terms/
│   │   │   ├── mod.rs
│   │   │   └── visit.rs
│   │   ├── types/
│   │   │   ├── mod.rs
│   │   │   ├── patterns.rs
│   │   │   └── visit.rs
│   │   └── visit.rs
│   └── test.sf
├── 07_system_fw/
│   ├── Cargo.toml
│   ├── README.md
│   ├── src/
│   │   ├── diagnostics.rs
│   │   ├── elaborate.rs
│   │   ├── functor.rs
│   │   ├── hir/
│   │   │   ├── bidir.rs
│   │   │   └── mod.rs
│   │   ├── macros.rs
│   │   ├── main.rs
│   │   ├── stack.rs
│   │   ├── syntax/
│   │   │   ├── ast.rs
│   │   │   ├── lexer.rs
│   │   │   ├── mod.rs
│   │   │   ├── parser/
│   │   │   │   ├── README.md
│   │   │   │   ├── decls.rs
│   │   │   │   ├── exprs.rs
│   │   │   │   ├── infix.rs
│   │   │   │   ├── mod.rs
│   │   │   │   ├── patterns.rs
│   │   │   │   └── types.rs
│   │   │   ├── tokens.rs
│   │   │   └── visit/
│   │   │       ├── mod.rs
│   │   │       └── types.rs
│   │   ├── terms.rs
│   │   ├── typecheck.rs
│   │   └── types.rs
│   └── test.fw
├── Cargo.toml
├── LICENSE
├── README.md
├── util/
│   ├── .gitignore
│   ├── Cargo.toml
│   └── src/
│       ├── arena.rs
│       ├── diagnostic.rs
│       ├── lib.rs
│       ├── span.rs
│       └── unsafe_arena.rs
├── x1_bidir/
│   ├── Cargo.toml
│   └── src/
│       ├── helpers.rs
│       └── main.rs
└── x2_dependent/
    ├── Cargo.toml
    └── src/
        └── main.rs

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitattributes
================================================
*	text=auto

================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.md
================================================
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Additional context**
Add any other context about the problem here.


================================================
FILE: .github/ISSUE_TEMPLATE/feature_request.md
================================================
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.


================================================
FILE: .github/workflows/rust.yml
================================================
name: Rust

on: [push, pull_request]

jobs:
  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v1
    - name: Build
      run: cargo build --verbose
    - name: Run tests
      run: |
        cargo test --verbose
        cargo run --bin system_f ./06_system_f/test.sf


================================================
FILE: .gitignore
================================================
/target
**/*.rs.bk
.vscode/

================================================
FILE: .rustfmt.toml
================================================
wrap_comments = true
max_width = 120

================================================
FILE: .travis.yml
================================================
language: rust
rust:
  - stable
  - nightly
matrix:
  allow_failures:
    - rust: nightly
script:
  - cargo build --verbose --all
  - cargo test --lib
notifications:
  email: false


================================================
FILE: 01_arith/Cargo.toml
================================================
[package]
name = "arith"
version = "0.1.0"
authors = ["Michael Lazear <lazear@scripps.edu>"]
edition = "2018"

[dependencies]
util = { path = "../util" }

================================================
FILE: 01_arith/src/lexer.rs
================================================
use util::span::{Location, Span, Spanned};

use std::char;
use std::iter::Peekable;
use std::str::Chars;

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub enum Token {
    Int(u32),
    Succ,
    Pred,
    If,
    Then,
    Else,
    True,
    False,
    IsZero,

    Semicolon,
    LParen,
    RParen,

    Invalid,
}

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub struct TokenSpan {
    pub kind: Token,
    pub span: Span,
}

impl std::ops::Deref for TokenSpan {
    type Target = Token;
    fn deref(&self) -> &Self::Target {
        &self.kind
    }
}

#[derive(Clone)]
pub struct Lexer<'s> {
    input: Peekable<Chars<'s>>,
    current: Location,
}

impl<'s> Lexer<'s> {
    pub fn new(input: Chars<'s>) -> Lexer<'s> {
        Lexer {
            input: input.peekable(),
            current: Location {
                line: 0,
                col: 0,
                abs: 0,
            },
        }
    }

    fn peek(&mut self) -> Option<char> {
        self.input.peek().cloned()
    }

    /// Consume the next [`char`] and advance internal source position
    fn consume(&mut self) -> Option<char> {
        match self.input.next() {
            Some('\n') => {
                self.current.line += 1;
                self.current.col = 0;
                self.current.abs += 1;
                Some('\n')
            }
            Some(ch) => {
                self.current.col += 1;
                self.current.abs += 1;
                Some(ch)
            }
            None => None,
        }
    }

    fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> Spanned<String> {
        let mut s = String::new();
        let start = self.current;
        while let Some(n) = self.peek() {
            if pred(n) {
                match self.consume() {
                    Some(ch) => s.push(ch),
                    None => break,
                }
            } else {
                break;
            }
        }
        Spanned::new(Span::new(start, self.current), s)
    }

    /// Eat whitespace
    fn consume_delimiter(&mut self) {
        let _ = self.consume_while(char::is_whitespace);
    }

    fn number(&mut self) -> Option<TokenSpan> {
        let Spanned { data, span } = self.consume_while(char::is_numeric);
        let kind = Token::Int(data.parse::<u32>().expect("only numeric chars"));
        Some(TokenSpan { kind, span })
    }

    fn keyword(&mut self) -> Option<TokenSpan> {
        let Spanned { data, span } = self.consume_while(|ch| ch.is_ascii_alphanumeric());
        let kind = match data.as_ref() {
            "if" => Token::If,
            "then" => Token::Then,
            "else" => Token::Else,
            "true" => Token::True,
            "false" => Token::False,
            "succ" => Token::Succ,
            "pred" => Token::Pred,
            "iszero" => Token::IsZero,
            "zero" => Token::Int(0),
            _ => Token::Invalid,
        };
        Some(TokenSpan { kind, span })
    }

    fn eat(&mut self, ch: char, token: Token) -> Option<TokenSpan> {
        let loc = self.current;
        let n = self.consume()?;
        let kind = if n == ch { token } else { Token::Invalid };
        Some(TokenSpan {
            span: Span::new(loc, self.current),
            kind,
        })
    }

    fn lex(&mut self) -> Option<TokenSpan> {
        self.consume_delimiter();
        match self.peek()? {
            x if x.is_ascii_alphabetic() => self.keyword(),
            x if x.is_numeric() => self.number(),
            '(' => self.eat('(', Token::LParen),
            ')' => self.eat(')', Token::RParen),
            ';' => self.eat(';', Token::Semicolon),
            // Any other character: consume it and emit Token::Invalid.
            // `eat(' ', ..)` can never match here, since whitespace was already skipped.
            _ => self.eat(' ', Token::Invalid),
        }
    }
}

impl<'s> Iterator for Lexer<'s> {
    type Item = TokenSpan;
    fn next(&mut self) -> Option<Self::Item> {
        self.lex()
    }
}

#[cfg(test)]
mod test {
    use super::*;
    use Token::*;
    #[test]
    fn valid() {
        let input = "succ(succ(succ(0)))";
        let expected = vec![Succ, LParen, Succ, LParen, Succ, LParen, Int(0), RParen, RParen, RParen];
        let output = Lexer::new(input.chars())
            .into_iter()
            .map(|t| t.kind)
            .collect::<Vec<Token>>();
        assert_eq!(expected, output);
    }

    #[test]
    fn invalid() {
        let input = "succ(succ(succ(xyz)))";
        let expected = vec![
            Succ, LParen, Succ, LParen, Succ, LParen, Invalid, RParen, RParen, RParen,
        ];
        let output = Lexer::new(input.chars())
            .into_iter()
            .map(|t| t.kind)
            .collect::<Vec<Token>>();
        assert_eq!(expected, output);
    }
}


================================================
FILE: 01_arith/src/main.rs
================================================
mod lexer;
mod parser;
use parser::{Parser, Term};

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub enum RuntimeError {
    NoRuleApplies,
}

impl Term {
    pub fn is_numeric(&self) -> bool {
        match self {
            Term::TmZero => true,
            Term::TmSucc(t) => t.is_numeric(),
            _ => false,
        }
    }

    pub fn is_normal(&self) -> bool {
        match self {
            Term::TmZero | Term::TmTrue | Term::TmFalse => true,
            _ => false,
        }
    }
}

pub fn eval1(t: Term) -> Result<Term, RuntimeError> {
    use Term::*;
    let res = match t {
        TmIf(cond, csq, alt) => match *cond {
            TmFalse => *alt,
            TmTrue => *csq,
            _ => TmIf(Box::new(eval1(*cond)?), csq, alt),
        },
        TmSucc(term) => TmSucc(Box::new(eval1(*term)?)),
        TmPred(term) => match *term {
            TmZero => TmZero,
            TmSucc(nv) => {
                if nv.is_numeric() {
                    *nv
                } else {
                    return Err(RuntimeError::NoRuleApplies);
                }
            }
            _ => TmPred(Box::new(eval1(*term)?)),
        },
        TmIsZero(term) => match *term {
            TmZero => TmTrue,
            TmSucc(nv) => {
                if nv.is_numeric() {
                    TmFalse
                } else {
                    return Err(RuntimeError::NoRuleApplies);
                }
            }
            _ => TmIsZero(Box::new(eval1(*term)?)),
        },
        _ => return Err(RuntimeError::NoRuleApplies),
    };
    Ok(res)
}

pub fn eval(t: Term) -> Term {
    let mut r = t;
    while let Ok(tprime) = eval1(r.clone()) {
        r = tprime;
        if r.is_normal() {
            break;
        }
    }
    r
}

fn main() {
    println!("λ");
    let input = "if iszero(succ(zero)) then false else succ(4)";
    let mut p = Parser::new(input);
    while let Some(tm) = p.parse_term() {
        print!("{:?} ==> ", tm);
        println!("{:?}", eval(tm));
    }

    let diag = p.diagnostic();
    if diag.error_count() > 0 {
        println!("\n{} error(s) detected while parsing!", diag.error_count());
        println!("{}", diag.emit());
    }
}
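As a self-contained illustration of the small-step machinery above, the sketch below reproduces just the `iszero` fragment (variant names shortened, spans and `RuntimeError` replaced by `Option` for brevity) and shows one application of the E-IsZeroSucc rule:

```rust
// Pared-down slice of eval1: only the iszero rules, over a reduced Term.
#[derive(Clone, Debug, PartialEq)]
enum Term {
    True,
    False,
    Zero,
    Succ(Box<Term>),
    IsZero(Box<Term>),
}

// A numeric value is zero or a successor of a numeric value.
fn is_numeric(t: &Term) -> bool {
    match t {
        Term::Zero => true,
        Term::Succ(inner) => is_numeric(inner),
        _ => false,
    }
}

// One small-step reduction; None plays the role of RuntimeError::NoRuleApplies.
fn eval1(t: &Term) -> Option<Term> {
    match t {
        Term::IsZero(inner) => match inner.as_ref() {
            Term::Zero => Some(Term::True),                        // E-IsZeroZero
            Term::Succ(nv) if is_numeric(nv) => Some(Term::False), // E-IsZeroSucc
            other => Some(Term::IsZero(Box::new(eval1(other)?))),  // E-IsZero
        },
        _ => None,
    }
}

fn main() {
    use Term::*;
    // iszero(succ(zero)) steps to false in one application of E-IsZeroSucc.
    let t = IsZero(Box::new(Succ(Box::new(Zero))));
    assert_eq!(eval1(&t), Some(False));
}
```

The full `eval1` in `main.rs` behaves the same way on this fragment; the other arms (`if`, `succ`, `pred`) follow the same congruence-plus-computation pattern.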


================================================
FILE: 01_arith/src/parser.rs
================================================
use crate::lexer::{Lexer, Token};
use std::iter::Peekable;
use util::diagnostic::Diagnostic;
use util::span::Span;

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum Term {
    TmTrue,
    TmFalse,
    TmIf(Box<Term>, Box<Term>, Box<Term>),
    TmZero,
    TmSucc(Box<Term>),
    TmPred(Box<Term>),
    TmIsZero(Box<Term>),
}

pub struct Parser<'s> {
    diagnostic: Diagnostic<'s>,
    /// [`Lexer`] impls [`Iterator`] over [`TokenSpan`],
    /// so we can just directly wrap it in a [`Peekable`]
    lexer: Peekable<Lexer<'s>>,
    span: Span,
}

impl<'s> Parser<'s> {
    /// Create a new [`Parser`] for the input `&str`
    pub fn new(input: &'s str) -> Parser<'s> {
        Parser {
            diagnostic: Diagnostic::new(input),
            lexer: Lexer::new(input.chars()).peekable(),
            span: Span::default(),
        }
    }

    fn consume(&mut self) -> Option<Token> {
        let ts = self.lexer.next()?;
        self.span = ts.span;
        Some(ts.kind)
    }

    fn expect(&mut self, token: Token) -> Option<Token> {
        match self.consume()? {
            t if t == token => Some(t),
            _ => None,
        }
    }

    fn parse_paren(&mut self) -> Option<Term> {
        let e = self.parse_term();
        self.expect(Token::RParen);
        e
    }

    fn parse_if(&mut self) -> Option<Term> {
        let cond = self.parse_term()?;
        let _ = self.expect(Token::Then)?;
        let csq = self.parse_term()?;
        let _ = self.expect(Token::Else)?;
        let alt = self.parse_term()?;
        Some(Term::TmIf(Box::new(cond), Box::new(csq), Box::new(alt)))
    }

    pub fn parse_term(&mut self) -> Option<Term> {
        let kind = match self.consume()? {
            Token::False => Term::TmFalse,
            Token::True => Term::TmTrue,
            Token::Succ => Term::TmSucc(Box::new(self.parse_term()?)),
            Token::Pred => Term::TmPred(Box::new(self.parse_term()?)),
            Token::IsZero => Term::TmIsZero(Box::new(self.parse_term()?)),
            Token::If => return self.parse_if(),
            Token::LParen => return self.parse_paren(),
            Token::Semicolon => return self.parse_term(),
            Token::Int(x) => baptize(x),
            Token::Then | Token::Else | Token::RParen => {
                self.diagnostic.push("Out of place token", self.span);
                return self.parse_term();
            }
            Token::Invalid => {
                self.diagnostic.push("Invalid token", self.span);
                return self.parse_term();
            }
        };
        Some(kind)
    }

    pub fn diagnostic(self) -> Diagnostic<'s> {
        self.diagnostic
    }
}

/// Lower a machine integer to the language's unary (Peano-style) numeral:
/// `int` nested applications of `TmSucc` around `TmZero`
fn baptize(int: u32) -> Term {
    let mut num = Term::TmZero;
    for _ in 0..int {
        num = Term::TmSucc(Box::new(num));
    }
    num
}
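A standalone sketch of the numeral-lowering idea, keeping only the two numeric constructors from the `Term` enum above:

```rust
// Minimal reproduction of the numeric constructors used by baptize.
#[derive(Clone, Debug, PartialEq)]
enum Term {
    TmZero,
    TmSucc(Box<Term>),
}

/// Build the unary numeral for `int` by wrapping TmZero in `int` layers of TmSucc.
fn baptize(int: u32) -> Term {
    let mut num = Term::TmZero;
    for _ in 0..int {
        num = Term::TmSucc(Box::new(num));
    }
    num
}

fn main() {
    use Term::*;
    // 2 lowers to succ(succ(zero))
    assert_eq!(baptize(2), TmSucc(Box::new(TmSucc(Box::new(TmZero)))));
}
```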


================================================
FILE: 02_lambda/Cargo.toml
================================================
[package]
name = "lambda"
version = "0.1.0"
authors = ["Michael Lazear <lazear@scripps.edu>"]
edition = "2018"

[dependencies]
util = { path = "../util" }

================================================
FILE: 02_lambda/src/context.rs
================================================
use std::collections::VecDeque;

#[derive(Clone, Debug, Default)]
pub struct Context {
    inner: VecDeque<String>,
}

impl Context {
    pub fn bind(&mut self, hint: String) -> (Context, usize) {
        if self.inner.contains(&hint) {
            self.bind(format!("{}'", hint))
        } else {
            let mut ctx = self.clone();
            let idx = ctx.size();
            ctx.inner.push_front(hint);
            (ctx, idx)
        }
    }

    pub fn lookup(&self, key: String) -> Option<usize> {
        for (idx, s) in self.inner.iter().enumerate() {
            if key == *s {
                return Some(idx);
            }
        }
        None
    }

    pub fn size(&self) -> usize {
        self.inner.len()
    }
}
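A self-contained sketch of how this `Context` turns names into de Bruijn-style indices (the struct is reproduced here with `lookup` written via `position`; behavior matches the file above). Note that `bind` clones rather than mutates: the receiver is left untouched and a new `Context` is returned, which is what lets the parser restore the enclosing scope after a lambda body.

```rust
use std::collections::VecDeque;

#[derive(Clone, Debug, Default)]
struct Context {
    inner: VecDeque<String>,
}

impl Context {
    // Returns a NEW context with the name pushed to the front
    // (most recent binder first); `self` is unchanged.
    fn bind(&mut self, hint: String) -> (Context, usize) {
        if self.inner.contains(&hint) {
            // Rename on collision: x becomes x'
            self.bind(format!("{}'", hint))
        } else {
            let mut ctx = self.clone();
            let idx = ctx.size();
            ctx.inner.push_front(hint);
            (ctx, idx)
        }
    }

    // Distance from the front = de Bruijn index of the binder.
    fn lookup(&self, key: String) -> Option<usize> {
        self.inner.iter().position(|s| *s == key)
    }

    fn size(&self) -> usize {
        self.inner.len()
    }
}

fn main() {
    let mut root = Context::default();
    let (mut outer, _) = root.bind("x".into());
    let (inner, _) = outer.bind("y".into());
    // Innermost binder gets index 0; enclosing binders count outward.
    assert_eq!(inner.lookup("y".into()), Some(0));
    assert_eq!(inner.lookup("x".into()), Some(1));
    // The original context never saw either binding.
    assert_eq!(root.lookup("x".into()), None);
}
```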


================================================
FILE: 02_lambda/src/lexer.rs
================================================
use util::span::{Location, Span, Spanned};

use std::char;
use std::iter::Peekable;
use std::str::Chars;

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub enum Token {
    Var(char),
    LParen,
    RParen,
    Lambda,
    Dot,
    Invalid,
}

#[derive(Clone)]
pub struct Lexer<'s> {
    input: Peekable<Chars<'s>>,
    current: Location,
}

impl<'s> Lexer<'s> {
    pub fn new(input: Chars<'s>) -> Lexer<'s> {
        Lexer {
            input: input.peekable(),
            current: Location {
                line: 0,
                col: 0,
                abs: 0,
            },
        }
    }

    fn peek(&mut self) -> Option<char> {
        self.input.peek().copied()
    }

    /// Consume the next [`char`] and advance internal source position
    fn consume(&mut self) -> Option<char> {
        match self.input.next() {
            Some('\n') => {
                self.current.line += 1;
                self.current.col = 0;
                self.current.abs += 1;
                Some('\n')
            }
            Some(ch) => {
                self.current.col += 1;
                self.current.abs += 1;
                Some(ch)
            }
            None => None,
        }
    }

    fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> Spanned<String> {
        let mut s = String::new();
        let start = self.current;
        while let Some(n) = self.peek() {
            if pred(n) {
                match self.consume() {
                    Some(ch) => s.push(ch),
                    None => break,
                }
            } else {
                break;
            }
        }
        Spanned::new(Span::new(start, self.current), s)
    }

    /// Eat whitespace
    fn consume_delimiter(&mut self) {
        let _ = self.consume_while(char::is_whitespace);
    }

    fn eat(&mut self, ch: char, token: Token) -> Option<Spanned<Token>> {
        let loc = self.current;
        let n = self.consume()?;
        let kind = if n == ch { token } else { Token::Invalid };
        Some(Spanned::new(Span::new(loc, self.current), kind))
    }

    fn lex(&mut self) -> Option<Spanned<Token>> {
        self.consume_delimiter();
        match self.peek()? {
            '(' => self.eat('(', Token::LParen),
            ')' => self.eat(')', Token::RParen),
            'λ' => self.eat('λ', Token::Lambda),
            '.' => self.eat('.', Token::Dot),
            ch => self.eat(ch, Token::Var(ch)),
        }
    }
}

impl<'s> Iterator for Lexer<'s> {
    type Item = Spanned<Token>;
    fn next(&mut self) -> Option<Self::Item> {
        self.lex()
    }
}


================================================
FILE: 02_lambda/src/main.rs
================================================
mod context;
mod lexer;
mod parser;
use parser::Parser;

use context::Context;
use parser::{RcTerm, Term};

fn shift1(d: isize, c: isize, tm: RcTerm) -> RcTerm {
    match &tm as &Term {
        Term::TmVar(sp, x) => {
            if *x as isize >= c {
                Term::TmVar(*sp, *x + d as usize).into()
            } else {
                Term::TmVar(*sp, *x).into()
            }
        }
        Term::TmAbs(sp, x) => Term::TmAbs(*sp, shift1(d, c + 1, x.clone())).into(),
        Term::TmApp(sp, a, b) => Term::TmApp(*sp, shift1(d, c, a.clone()), shift1(d, c, b.clone())).into(),
    }
}

fn shift(d: isize, tm: RcTerm) -> RcTerm {
    shift1(d, 0, tm)
}

fn subst_walk(j: isize, s: RcTerm, c: isize, t: RcTerm) -> RcTerm {
    match &t as &Term {
        Term::TmVar(_, x) => {
            if *x as isize == j + c {
                shift(c, s)
            } else {
                t
            }
        }
        Term::TmAbs(sp, tm) => Term::TmAbs(*sp, subst_walk(j, s, c + 1, tm.clone())).into(),
        Term::TmApp(sp, lhs, rhs) => Term::TmApp(
            *sp,
            subst_walk(j, s.clone(), c, lhs.clone()),
            subst_walk(j, s, c, rhs.clone()),
        )
        .into(),
    }
}

fn subst(j: isize, s: RcTerm, tm: RcTerm) -> RcTerm {
    subst_walk(j, s, 0, tm)
}

fn term_subst_top(s: RcTerm, tm: RcTerm) -> RcTerm {
    shift(-1, subst(0, shift(1, s), tm))
}

fn isval(_ctx: &Context, tm: RcTerm) -> bool {
    match &tm as &Term {
        Term::TmAbs(_, _) => true,
        _ => false,
    }
}

fn eval1(ctx: &Context, tm: RcTerm) -> RcTerm {
    match &tm as &Term {
        Term::TmApp(_, t, v) if isval(ctx, v.clone()) => {
            if let Term::TmAbs(_, t2) = &t as &Term {
                term_subst_top(v.clone(), t2.clone())
            } else {
                panic!("No rule applies!")
            }
        }
        Term::TmApp(sp, v, t) if isval(ctx, v.clone()) => {
            let t_prime = eval1(ctx, t.clone());
            Term::TmApp(*sp, v.clone(), t_prime).into()
        }
        Term::TmApp(sp, t1, t2) => {
            let t_prime = eval1(ctx, t1.clone());
            Term::TmApp(*sp, t_prime, t2.clone()).into()
        }
        _ => panic!("No rule applies!"),
    }
}

fn main() {
    // let input = "(λ x. x x) (λ x. x x) λ x. λ y. y λ x. λ x. x";
    //
    let input = "(λ x. (λ y. y) x) (λ x. x)";
    let mut p = Parser::new(input);
    while let Some(tm) = p.parse_term() {
        println!("{:?}", tm);
        dbg!(eval1(p.ctx(), tm));
        // dbg!(term_subst_top(Term::TmVar(Span::default(), 0).into(), tm));
    }

    dbg!(p.ctx());

    let diag = p.diagnostic();
    if diag.error_count() > 0 {
        println!("\n{} error(s) detected while parsing!", diag.error_count());
        println!("{}", diag.emit());
    }
}
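The shifting logic in `shift1` above is easiest to see on a concrete term. The sketch below mirrors it over a plain boxed `Term` (spans and `Rc` dropped, names shortened): indices at or above the cutoff `c` are free relative to the current term and get moved by `d`; indices below `c` point at binders inside the term and stay put.

```rust
// Pared-down de Bruijn shifting, mirroring shift1 from main.rs.
#[derive(Clone, Debug, PartialEq)]
enum Term {
    Var(usize),
    Abs(Box<Term>),
    App(Box<Term>, Box<Term>),
}

fn shift_walk(d: isize, c: isize, t: &Term) -> Term {
    match t {
        Term::Var(x) => {
            // Free variables (index >= cutoff) move by d; bound ones stay put.
            if *x as isize >= c {
                Term::Var((*x as isize + d) as usize)
            } else {
                Term::Var(*x)
            }
        }
        // Crossing a binder raises the cutoff by one.
        Term::Abs(body) => Term::Abs(Box::new(shift_walk(d, c + 1, body))),
        Term::App(f, a) => Term::App(
            Box::new(shift_walk(d, c, f)),
            Box::new(shift_walk(d, c, a)),
        ),
    }
}

fn shift(d: isize, t: &Term) -> Term {
    shift_walk(d, 0, t)
}

fn main() {
    use Term::*;
    // λ.(0 1): index 0 is bound by the λ, index 1 is free.
    let t = Abs(Box::new(App(Box::new(Var(0)), Box::new(Var(1)))));
    // Shifting by 1 bumps only the free variable: λ.(0 2).
    let expected = Abs(Box::new(App(Box::new(Var(0)), Box::new(Var(2)))));
    assert_eq!(shift(1, &t), expected);
}
```

`term_subst_top` composes exactly these pieces: shift the argument up by one, substitute it for index 0, then shift the result back down.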


================================================
FILE: 02_lambda/src/parser.rs
================================================
use crate::context::Context;
use crate::lexer::{Lexer, Token};
use std::iter::Peekable;
use std::ops::Deref;
use std::rc::Rc;
use util::diagnostic::Diagnostic;
use util::span::*;

#[derive(Clone, PartialEq, PartialOrd)]
pub struct RcTerm(pub Rc<Term>);

impl From<Term> for RcTerm {
    fn from(term: Term) -> RcTerm {
        RcTerm(Rc::new(term))
    }
}

impl std::fmt::Debug for RcTerm {
    fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
        write!(f, "{:?}", self.0)
    }
}

impl std::fmt::Debug for Term {
    fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
        match self {
            Term::TmVar(_, v) => write!(f, "{}", v),
            Term::TmAbs(_, tm) => write!(f, "λ.{:?}", tm),
            Term::TmApp(_, t, b) => write!(f, "{:?} {:?}", t, b),
        }
    }
}

impl Deref for RcTerm {
    type Target = Term;
    fn deref(&self) -> &Self::Target {
        &self.0
    }
}

#[derive(Clone, PartialEq, PartialOrd)]
pub enum Term {
    TmVar(Span, usize),
    TmAbs(Span, RcTerm),
    TmApp(Span, RcTerm, RcTerm),
}

pub struct Parser<'s> {
    ctx: Context,
    diagnostic: Diagnostic<'s>,
    /// [`Lexer`] impls [`Iterator`] over [`Spanned<Token>`],
    /// so we can just directly wrap it in a [`Peekable`]
    lexer: Peekable<Lexer<'s>>,
    span: Span,
}

impl<'s> Parser<'s> {
    /// Create a new [`Parser`] for the input `&str`
    pub fn new(input: &'s str) -> Parser<'s> {
        Parser {
            ctx: Context::default(),
            diagnostic: Diagnostic::new(input),
            lexer: Lexer::new(input.chars()).peekable(),
            span: Span::default(),
        }
    }

    fn consume(&mut self) -> Option<Spanned<Token>> {
        let ts = self.lexer.next()?;
        self.span = ts.span;
        Some(ts)
    }

    fn expect(&mut self, token: Token) -> Option<Spanned<Token>> {
        let spanned = self.consume()?;
        match spanned.data {
            t if t == token => Some(spanned),
            t => {
                self.diagnostic
                    .push(format!("Expected token {:?}, found {:?}", token, t), spanned.span);
                None
            }
        }
    }

    fn peek(&mut self) -> Option<Token> {
        self.lexer.peek().map(|s| s.data)
    }

    fn lambda(&mut self) -> Option<RcTerm> {
        let start = self.expect(Token::Lambda)?.span;

        let var = self.consume()?;

        // Bind variable into a new context before parsing the body
        // of the lambda abstraction
        let prev_ctx = self.ctx.clone();
        let (ctx, _) = match var.data {
            Token::Var(ch) => {
                let (ctx, idx) = self.ctx.bind(format!("{}", ch));
                (ctx, Term::TmVar(var.span, idx))
            }
            x => {
                self.diagnostic
                    .push(format!("Expected variable, found {:?}", x), var.span);
                return None;
            }
        };

        self.ctx = ctx;

        let _ = self.expect(Token::Dot)?;
        let body = self.term()?;
        let end = self.span;

        // Return to previous context
        self.ctx = prev_ctx;
        Some(Term::TmAbs(start + end, body).into())
    }

    fn term(&mut self) -> Option<RcTerm> {
        match self.peek()? {
            Token::Lambda => self.lambda(),
            _ => self.application(),
        }
    }

    /// Parse an application of form:
    /// application = atom application' | atom
    /// application' = atom application' | empty
    fn application(&mut self) -> Option<RcTerm> {
        let mut lhs = self.atom()?;
        let span = self.span;
        while let Some(rhs) = self.atom() {
            lhs = Term::TmApp(span + self.span, lhs, rhs).into();
        }
        Some(lhs)
    }

    /// Parse an atomic term
    /// LPAREN term RPAREN | var
    fn atom(&mut self) -> Option<RcTerm> {
        match self.peek()? {
            Token::LParen => {
                self.expect(Token::LParen)?;
                let term = self.term()?;
                self.expect(Token::RParen)?;
                Some(term)
            }
            Token::Var(ch) => {
                let sp = self.consume()?.span;
                match self.ctx.lookup(format!("{}", ch)) {
                    Some(idx) => Some(Term::TmVar(sp, idx).into()),
                    None => {
                        self.diagnostic.push(format!("Unbound variable {}", ch), sp);
                        None
                    }
                }
            }

            _ => None,
        }
    }

    pub fn parse_term(&mut self) -> Option<RcTerm> {
        self.term()
    }

    pub fn ctx(&self) -> &Context {
        &self.ctx
    }

    pub fn diagnostic(self) -> Diagnostic<'s> {
        self.diagnostic
    }
}


================================================
FILE: 03_typedarith/Cargo.toml
================================================
[package]
name = "typedarith"
version = "0.1.0"
authors = ["Michael Lazear <lazear@scripps.edu>"]
edition = "2018"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
util = { path = "../util" }

================================================
FILE: 03_typedarith/src/ast.rs
================================================
use std::ops::Deref;
use std::rc::Rc;

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub enum Type {
    Nat,
    Bool,
}

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum Term {
    TmTrue,
    TmFalse,
    TmIf(RcTerm, RcTerm, RcTerm),
    TmZero,
    TmSucc(RcTerm),
    TmPred(RcTerm),
    TmIsZero(RcTerm),
}

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub enum TyError {
    TypingError,
}

pub fn typing(tm: RcTerm) -> Result<Type, TyError> {
    match &tm as &Term {
        Term::TmTrue => Ok(Type::Bool),
        Term::TmFalse => Ok(Type::Bool),
        Term::TmZero => Ok(Type::Nat),
        Term::TmSucc(t) => match typing(t.clone()) {
            Ok(Type::Nat) => Ok(Type::Nat),
            _ => Err(TyError::TypingError),
        },
        Term::TmPred(t) => match typing(t.clone()) {
            Ok(Type::Nat) => Ok(Type::Nat),
            _ => Err(TyError::TypingError),
        },
        Term::TmIsZero(t) => match typing(t.clone()) {
            Ok(Type::Nat) => Ok(Type::Bool),
            _ => Err(TyError::TypingError),
        },
        Term::TmIf(a, b, c) => match typing(a.clone()) {
            Ok(Type::Bool) => {
                let ty_b = typing(b.clone())?;
                let ty_c = typing(c.clone())?;
                if ty_b == ty_c {
                    Ok(ty_b)
                } else {
                    Err(TyError::TypingError)
                }
            }
            _ => Err(TyError::TypingError),
        },
    }
}

#[derive(Clone, PartialEq, PartialOrd)]
pub struct RcTerm(pub Rc<Term>);

impl std::fmt::Debug for RcTerm {
    fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
        write!(f, "{:?}", self.0)
    }
}

impl From<Term> for RcTerm {
    fn from(term: Term) -> RcTerm {
        RcTerm(Rc::new(term))
    }
}

impl Deref for RcTerm {
    type Target = Term;
    fn deref(&self) -> &Self::Target {
        &self.0
    }
}
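A self-contained sketch of the `typing` judgment above, with `RcTerm` replaced by `Box` and variant names shortened for brevity; the T-If rule requires a `Bool` guard and arms that agree on a single type, exactly as in the file:

```rust
#[derive(Copy, Clone, Debug, PartialEq)]
enum Type {
    Nat,
    Bool,
}

#[derive(Clone, Debug, PartialEq)]
enum Term {
    True,
    False,
    Zero,
    Succ(Box<Term>),
    IsZero(Box<Term>),
    If(Box<Term>, Box<Term>, Box<Term>),
}

// Mirrors typing() above; () stands in for TyError::TypingError.
fn typing(t: &Term) -> Result<Type, ()> {
    match t {
        Term::True | Term::False => Ok(Type::Bool),
        Term::Zero => Ok(Type::Nat),
        Term::Succ(inner) => match typing(inner)? {
            Type::Nat => Ok(Type::Nat),
            _ => Err(()),
        },
        Term::IsZero(inner) => match typing(inner)? {
            Type::Nat => Ok(Type::Bool),
            _ => Err(()),
        },
        Term::If(c, a, b) => {
            // Guard must be Bool and both arms must agree on a type.
            if typing(c)? != Type::Bool {
                return Err(());
            }
            let (ta, tb) = (typing(a)?, typing(b)?);
            if ta == tb { Ok(ta) } else { Err(()) }
        }
    }
}

fn main() {
    use Term::*;
    // if iszero(0) then 0 else succ(0)  :  Nat
    let good = If(
        Box::new(IsZero(Box::new(Zero))),
        Box::new(Zero),
        Box::new(Succ(Box::new(Zero))),
    );
    assert_eq!(typing(&good), Ok(Type::Nat));
    // succ(true) is ill-typed.
    assert_eq!(typing(&Succ(Box::new(True))), Err(()));
}
```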


================================================
FILE: 03_typedarith/src/lexer.rs
================================================
use util::span::{Location, Span, Spanned};

use std::char;
use std::iter::Peekable;
use std::str::Chars;

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub enum Token {
    Int(u32),
    Succ,
    Pred,
    If,
    Then,
    Else,
    True,
    False,
    IsZero,
    Semicolon,
    LParen,
    RParen,
    Invalid,
}

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub struct TokenSpan {
    pub kind: Token,
    pub span: Span,
}

impl std::ops::Deref for TokenSpan {
    type Target = Token;
    fn deref(&self) -> &Self::Target {
        &self.kind
    }
}

#[derive(Clone)]
pub struct Lexer<'s> {
    input: Peekable<Chars<'s>>,
    current: Location,
}

impl<'s> Lexer<'s> {
    pub fn new(input: Chars<'s>) -> Lexer<'s> {
        Lexer {
            input: input.peekable(),
            current: Location {
                line: 0,
                col: 0,
                abs: 0,
            },
        }
    }

    fn peek(&mut self) -> Option<char> {
        self.input.peek().cloned()
    }

    /// Consume the next [`char`] and advance internal source position
    fn consume(&mut self) -> Option<char> {
        match self.input.next() {
            Some('\n') => {
                self.current.line += 1;
                self.current.col = 0;
                self.current.abs += 1;
                Some('\n')
            }
            Some(ch) => {
                self.current.col += 1;
                self.current.abs += 1;
                Some(ch)
            }
            None => None,
        }
    }

    fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> Spanned<String> {
        let mut s = String::new();
        let start = self.current;
        while let Some(n) = self.peek() {
            if pred(n) {
                match self.consume() {
                    Some(ch) => s.push(ch),
                    None => break,
                }
            } else {
                break;
            }
        }
        Spanned::new(Span::new(start, self.current), s)
    }

    /// Eat whitespace
    fn consume_delimiter(&mut self) {
        let _ = self.consume_while(char::is_whitespace);
    }

    fn number(&mut self) -> Option<TokenSpan> {
        let Spanned { data, span } = self.consume_while(char::is_numeric);
        let kind = Token::Int(data.parse::<u32>().expect("only numeric chars"));
        Some(TokenSpan { kind, span })
    }

    fn keyword(&mut self) -> Option<TokenSpan> {
        let Spanned { data, span } = self.consume_while(|ch| ch.is_ascii_alphanumeric());
        let kind = match data.as_ref() {
            "if" => Token::If,
            "then" => Token::Then,
            "else" => Token::Else,
            "true" => Token::True,
            "false" => Token::False,
            "succ" => Token::Succ,
            "pred" => Token::Pred,
            "iszero" => Token::IsZero,
            "zero" => Token::Int(0),
            _ => Token::Invalid,
        };
        Some(TokenSpan { kind, span })
    }

    fn eat(&mut self, ch: char, token: Token) -> Option<TokenSpan> {
        let loc = self.current;
        let n = self.consume()?;
        let kind = if n == ch { token } else { Token::Invalid };
        Some(TokenSpan {
            span: Span::new(loc, self.current),
            kind,
        })
    }

    fn lex(&mut self) -> Option<TokenSpan> {
        self.consume_delimiter();
        match self.peek()? {
            x if x.is_ascii_alphabetic() => self.keyword(),
            x if x.is_numeric() => self.number(),
            '(' => self.eat('(', Token::LParen),
            ')' => self.eat(')', Token::RParen),
            ';' => self.eat(';', Token::Semicolon),
            _ => self.eat(' ', Token::Invalid),
        }
    }
}

impl<'s> Iterator for Lexer<'s> {
    type Item = TokenSpan;
    fn next(&mut self) -> Option<Self::Item> {
        self.lex()
    }
}

#[cfg(test)]
mod test {
    use super::*;
    use Token::*;
    #[test]
    fn valid() {
        let input = "succ(succ(succ(0)))";
        let expected = vec![Succ, LParen, Succ, LParen, Succ, LParen, Int(0), RParen, RParen, RParen];
        let output = Lexer::new(input.chars())
            .into_iter()
            .map(|t| t.kind)
            .collect::<Vec<Token>>();
        assert_eq!(expected, output);
    }

    #[test]
    fn invalid() {
        let input = "succ(succ(succ(xyz)))";
        let expected = vec![
            Succ, LParen, Succ, LParen, Succ, LParen, Invalid, RParen, RParen, RParen,
        ];
        let output = Lexer::new(input.chars())
            .into_iter()
            .map(|t| t.kind)
            .collect::<Vec<Token>>();
        assert_eq!(expected, output);
    }
}


================================================
FILE: 03_typedarith/src/main.rs
================================================
mod ast;
mod lexer;
mod parser;
use ast::*;
use parser::Parser;

fn main() {
    let input = "if iszero(succ(zero)) then pred(0) else succ(4)";
    let mut p = Parser::new(input);
    while let Some(tm) = p.parse_term() {
        print!("{:?} ==> ", tm);
        println!("{:?}", typing(tm));
    }

    let diag = p.diagnostic();
    if diag.error_count() > 0 {
        println!("\n{} error(s) detected while parsing!", diag.error_count());
        println!("{}", diag.emit());
    }
}


================================================
FILE: 03_typedarith/src/parser.rs
================================================
use crate::ast::{RcTerm, Term};
use crate::lexer::{Lexer, Token};
use std::iter::Peekable;
use util::diagnostic::Diagnostic;
use util::span::Span;

pub struct Parser<'s> {
    diagnostic: Diagnostic<'s>,
    /// [`Lexer`] impls [`Iterator`] over [`TokenSpan`],
    /// so we can just directly wrap it in a [`Peekable`]
    lexer: Peekable<Lexer<'s>>,
    span: Span,
}

impl<'s> Parser<'s> {
    /// Create a new [`Parser`] for the input `&str`
    pub fn new(input: &'s str) -> Parser<'s> {
        Parser {
            diagnostic: Diagnostic::new(input),
            lexer: Lexer::new(input.chars()).peekable(),
            span: Span::default(),
        }
    }

    fn consume(&mut self) -> Option<Token> {
        let ts = self.lexer.next()?;
        self.span = ts.span;
        Some(ts.kind)
    }

    fn expect(&mut self, token: Token) -> Option<Token> {
        match self.consume()? {
            t if t == token => Some(t),
            _ => None,
        }
    }

    fn parse_paren(&mut self) -> Option<RcTerm> {
        let e = self.parse_term();
        self.expect(Token::RParen);
        e
    }

    fn parse_if(&mut self) -> Option<RcTerm> {
        let cond = self.parse_term()?;
        let _ = self.expect(Token::Then)?;
        let csq = self.parse_term()?;
        let _ = self.expect(Token::Else)?;
        let alt = self.parse_term()?;
        Some(Term::TmIf(cond, csq, alt).into())
    }

    pub fn parse_term(&mut self) -> Option<RcTerm> {
        let kind = match self.consume()? {
            Token::False => Term::TmFalse,
            Token::True => Term::TmTrue,
            Token::Succ => Term::TmSucc(self.parse_term()?),
            Token::Pred => Term::TmPred(self.parse_term()?),
            Token::IsZero => Term::TmIsZero(self.parse_term()?),
            Token::If => return self.parse_if(),
            Token::LParen => return self.parse_paren(),
            Token::Semicolon => return self.parse_term(),
            Token::Int(x) => baptize(x),
            Token::Then | Token::Else | Token::RParen => {
                self.diagnostic.push("Out of place token", self.span);
                return self.parse_term();
            }
            Token::Invalid => {
                self.diagnostic.push("Invalid token", self.span);
                return self.parse_term();
            }
        };
        Some(kind.into())
    }

    pub fn diagnostic(self) -> Diagnostic<'s> {
        self.diagnostic
    }
}

/// Unroll a natural number literal into its Peano encoding
/// (a chain of TmSucc constructors around TmZero)
fn baptize(int: u32) -> Term {
    let mut num = Term::TmZero;
    for _ in 0..int {
        num = Term::TmSucc(num.into());
    }
    num
}


================================================
FILE: 04_stlc/.gitignore
================================================
/target
**/*.rs.bk
.vscode/

================================================
FILE: 04_stlc/Cargo.toml
================================================
[package]
name = "stlc"
version = "0.1.0"
authors = ["Michael Lazear <lazear@scripps.edu>"]
edition = "2018"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
util = { path = "../util" }

================================================
FILE: 04_stlc/src/eval.rs
================================================
use super::term::*;
use super::typing::Context;
use super::visitor::{Direction, MutVisitor, Shifting, Substitution};

#[derive(Debug)]
pub enum Error {
    NoRuleApplies,
}

#[inline]
fn subst(mut val: Term, body: &mut Term) {
    Shifting::new(Direction::Up).visit_term(&mut val);
    Substitution::new(val).visit_term(body);
    Shifting::new(Direction::Down).visit_term(body);
}

fn value(ctx: &Context, term: &Term) -> bool {
    match term {
        Term::Unit | Term::True | Term::False | Term::Abs(_, _) | Term::Zero => true,
        Term::Succ(t) | Term::Pred(t) | Term::IsZero(t) => value(ctx, t),
        Term::Record(fields) => {
            for field in fields {
                if !value(ctx, &field.term) {
                    return false;
                }
            }
            true
        }
        _ => false,
    }
}

fn eval1(ctx: &Context, term: Term) -> Result<Box<Term>, Error> {
    match term {
        Term::App(t1, t2) => {
            if value(ctx, &t2) {
                match *t1 {
                    Term::Abs(_, mut abs) => {
                        subst(*t2, abs.as_mut());
                        Ok(abs)
                    }
                    _ => {
                        let t_prime = eval1(ctx, *t1)?;
                        Ok(Term::App(t_prime, t2).into())
                    }
                }
            } else if value(ctx, &t1) {
                let t_prime = eval1(ctx, *t2)?;
                Ok(Term::App(t1, t_prime).into())
            } else {
                let t_prime = eval1(ctx, *t1)?;
                Ok(Term::App(t_prime, t2).into())
            }
        }
        Term::If(guard, csq, alt) => match &*guard {
            Term::True => Ok(csq),
            Term::False => Ok(alt),
            _ => {
                let t_prime = eval1(ctx, *guard)?;
                Ok(Term::If(t_prime, csq, alt).into())
            }
        },
        Term::Let(bind, mut body) => {
            if value(ctx, &bind) {
                subst(*bind, body.as_mut());
                Ok(body)
            } else {
                let t = eval1(ctx, *bind)?;
                Ok(Term::Let(t, body).into())
            }
        }
        Term::Succ(t) => {
            let t_prime = eval1(ctx, *t)?;
            Ok(Term::Succ(t_prime).into())
        }

        Term::Pred(t) => match t.as_ref() {
            Term::Zero => Ok(t.clone()),
            Term::Succ(n) => Ok(n.clone()),
            _ => Ok(Term::Pred(eval1(ctx, *t)?).into()),
        },

        Term::IsZero(t) => match t.as_ref() {
            Term::Zero => Ok(Term::True.into()),
            Term::Succ(_) => Ok(Term::False.into()),
            _ => Ok(Term::IsZero(eval1(ctx, *t)?).into()),
        },

        Term::Projection(rec, proj) => {
            if value(ctx, &rec) {
                match rec.as_ref() {
                    Term::Record(rec) => crate::term::record_access(rec, &proj).ok_or(Error::NoRuleApplies),
                    _ => Ok(Term::Projection(eval1(ctx, *rec)?, proj).into()),
                }
            } else {
                Ok(Term::Projection(eval1(ctx, *rec)?, proj).into())
            }
        }

        _ => Err(Error::NoRuleApplies),
    }
}

pub fn eval(ctx: &Context, term: Term) -> Result<Term, Error> {
    let mut tp = term;
    loop {
        println!("  -> {}", &tp);
        match eval1(ctx, tp.clone()) {
            Ok(r) => tp = *r,
            Err(_) => return Ok(tp),
        }
    }
}


================================================
FILE: 04_stlc/src/lexer.rs
================================================
use util::span::{Location, Span};

use std::char;
use std::iter::Peekable;
use std::str::Chars;

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum TokenKind {
    Ident(String),
    Nat(u32),
    TyNat,
    TyBool,
    TyArrow,
    TyUnit,
    TypeDecl,
    Unit,
    True,
    False,
    Lambda,
    Succ,
    Pred,
    If,
    Then,
    Else,
    Let,
    In,
    IsZero,
    Semicolon,
    Colon,
    Comma,
    Proj,
    LParen,
    RParen,
    LBrace,
    RBrace,
    Equals,
    Bar,
    Invalid(char),
    Eof,
}

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub struct Token {
    pub kind: TokenKind,
    pub span: Span,
}

impl Token {
    pub const fn new(kind: TokenKind, span: Span) -> Token {
        Token { kind, span }
    }
}

#[derive(Clone)]
pub struct Lexer<'s> {
    input: Peekable<Chars<'s>>,
    current: Location,
}

impl<'s> Lexer<'s> {
    pub fn new(input: Chars<'s>) -> Lexer<'s> {
        Lexer {
            input: input.peekable(),
            current: Location {
                line: 0,
                col: 0,
                abs: 0,
            },
        }
    }

    /// Peek at the next [`char`] in the input stream
    fn peek(&mut self) -> Option<char> {
        self.input.peek().cloned()
    }

    /// Consume the next [`char`] and advance internal source position
    fn consume(&mut self) -> Option<char> {
        match self.input.next() {
            Some('\n') => {
                self.current.line += 1;
                self.current.col = 0;
                self.current.abs += 1;
                Some('\n')
            }
            Some(ch) => {
                self.current.col += 1;
                self.current.abs += 1;
                Some(ch)
            }
            None => None,
        }
    }

    /// Consume characters from the input stream while pred(peek()) is true,
    /// collecting the characters into a string.
    fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> (String, Span) {
        let mut s = String::new();
        let start = self.current;
        while let Some(n) = self.peek() {
            if pred(n) {
                match self.consume() {
                    Some(ch) => s.push(ch),
                    None => break,
                }
            } else {
                break;
            }
        }
        (s, Span::new(start, self.current))
    }

    /// Eat whitespace
    fn consume_delimiter(&mut self) {
        let _ = self.consume_while(char::is_whitespace);
    }

    /// Lex a natural number
    fn number(&mut self) -> Token {
        // Since we peeked at least one numeric char, the collected string
        // contains at least one digit, so str::parse::<u32> can only fail
        // if the literal overflows a u32
        let (data, span) = self.consume_while(char::is_numeric);
        let n = data.parse::<u32>().unwrap();
        Token::new(TokenKind::Nat(n), span)
    }

    /// Lex a reserved keyword or an identifier
    fn keyword(&mut self) -> Token {
        let (data, span) = self.consume_while(|ch| ch.is_ascii_alphanumeric());
        let kind = match data.as_ref() {
            "if" => TokenKind::If,
            "then" => TokenKind::Then,
            "else" => TokenKind::Else,
            "true" => TokenKind::True,
            "false" => TokenKind::False,
            "succ" => TokenKind::Succ,
            "pred" => TokenKind::Pred,
            "iszero" => TokenKind::IsZero,
            "zero" => TokenKind::Nat(0),
            "Bool" => TokenKind::TyBool,
            "Nat" => TokenKind::TyNat,
            "Unit" => TokenKind::TyUnit,
            "unit" => TokenKind::Unit,
            "let" => TokenKind::Let,
            "in" => TokenKind::In,
            "type" => TokenKind::TypeDecl,
            _ => TokenKind::Ident(data),
        };
        Token::new(kind, span)
    }

    /// Consume the next input character, expecting to match `ch`.
    /// Return a [`TokenKind::Invalid`] if the next character does not match,
    /// or the argument `kind` if it does
    fn eat(&mut self, ch: char, kind: TokenKind) -> Token {
        let loc = self.current;
        // Lexer::eat() should only be called internally after calling peek()
        // so we know that it's safe to unwrap the result of Lexer::consume()
        let n = self.consume().unwrap();
        let kind = if n == ch { kind } else { TokenKind::Invalid(n) };
        Token::new(kind, Span::new(loc, self.current))
    }

    /// Return the next lexeme in the input as a [`Token`]
    pub fn lex(&mut self) -> Token {
        self.consume_delimiter();
        let next = match self.peek() {
            Some(ch) => ch,
            None => return Token::new(TokenKind::Eof, Span::dummy()),
        };
        match next {
            x if x.is_ascii_alphabetic() => self.keyword(),
            x if x.is_numeric() => self.number(),
            '(' => self.eat('(', TokenKind::LParen),
            ')' => self.eat(')', TokenKind::RParen),
            ';' => self.eat(';', TokenKind::Semicolon),
            ':' => self.eat(':', TokenKind::Colon),
            ',' => self.eat(',', TokenKind::Comma),
            '{' => self.eat('{', TokenKind::LBrace),
            '}' => self.eat('}', TokenKind::RBrace),
            '\\' => self.eat('\\', TokenKind::Lambda),
            'λ' => self.eat('λ', TokenKind::Lambda),
            '.' => self.eat('.', TokenKind::Proj),
            '=' => self.eat('=', TokenKind::Equals),
            '|' => self.eat('|', TokenKind::Bar),
            '-' => {
                self.consume();
                self.eat('>', TokenKind::TyArrow)
            }
            ch => self.eat(' ', TokenKind::Invalid(ch)),
        }
    }
}

impl<'s> Iterator for Lexer<'s> {
    type Item = Token;
    fn next(&mut self) -> Option<Self::Item> {
        match self.lex() {
            Token {
                kind: TokenKind::Eof, ..
            } => None,
            tok => Some(tok),
        }
    }
}

#[cfg(test)]
mod test {
    use super::*;
    use TokenKind::*;
    #[test]
    fn valid() {
        let input = "succ(succ(succ(0)))";
        let expected = vec![Succ, LParen, Succ, LParen, Succ, LParen, Nat(0), RParen, RParen, RParen];
        let output = Lexer::new(input.chars())
            .into_iter()
            .map(|t| t.kind)
            .collect::<Vec<TokenKind>>();
        assert_eq!(expected, output);
    }

    #[test]
    fn invalid() {
        let input = "succ(succ(succ(xyz)))";
        let expected = vec![
            Succ,
            LParen,
            Succ,
            LParen,
            Succ,
            LParen,
            Ident("xyz".into()),
            RParen,
            RParen,
            RParen,
        ];
        let output = Lexer::new(input.chars())
            .into_iter()
            .map(|t| t.kind)
            .collect::<Vec<TokenKind>>();
        assert_eq!(expected, output);
    }
}


================================================
FILE: 04_stlc/src/main.rs
================================================
#![allow(unused_variables)]
mod eval;
mod lexer;
mod parser;
mod term;
mod typing;
mod visitor;

use term::Term;
use typing::{Context, Type};

fn ev(ctx: &mut Context, term: Term) -> Result<Term, eval::Error> {
    let ty = match ctx.type_of(&term) {
        Ok(ty) => ty,
        Err(err) => {
            println!("Mistyped term {} => {:?}", term, err);
            return Err(eval::Error::NoRuleApplies);
        }
    };
    let r = eval::eval(&ctx, term)?;

    // This is safe by the preservation theorem: any well-typed term t
    // (checked above) that evaluates to a term t' [ t -> t' ] is also
    // well typed.
    //
    // Formally: if Γ ⊢ t : T and t ->* t', then Γ ⊢ t' : T
    let ty_ = ctx.type_of(&r);
    // assert_eq!(ty_, ty);
    println!("===> {} -- {:?}\n", r, ty_);

    Ok(r)
}

fn parse(ctx: &mut Context, input: &str) {
    let mut p = parser::Parser::new(input);
    while let Some(tok) = p.parse_term() {
        let _ = ev(ctx, *tok);
    }

    let diag = p.diagnostic();
    if diag.error_count() > 0 {
        println!("\n{} error(s) detected while parsing!", diag.error_count());
        println!("{}", diag.emit());
    }
}

fn main() {
    let mut root: Context = Context::default();
    // parse(
    //     &mut root,
    //     "let not = (\\x: Bool. if x then false else true) in
    //      let x = not false in
    //      let y = not x in
    //      if y then succ 0 else succ succ 0",
    // );

    parse(&mut root, "let x = (\\y: Nat. y) in x");

    parse(&mut root, "(\\x: Nat. (\\y: Nat. iszero x)) (succ 0) 0");

    parse(
        &mut root,
        "(\\x: {a: Bool, b: Bool, c: Nat}. x.b) {a: true, b: false, c: 0}",
    );

    // parse(&mut root, "let not = \\x: Bool. if x then false else true in {a:
    // 0, b: \\x: Bool. not x, c: unit}.b "); parse(&mut root, "type Struct
    // = {valid: Bool, number: Nat}"); parse(&mut root, "(\\x: Struct.
    // x.number) {valid: true, number: succ 0}"); parse(
    //     &mut root,
    //     "(\\x: Struct. x.number) {valid: false, number: succ 0}",
    // )
    // dbg!(root);
}


================================================
FILE: 04_stlc/src/parser.rs
================================================
use crate::lexer::{Lexer, Token, TokenKind};
use crate::term::{Field, Term};
use crate::typing::{Record, RecordField, Type};
use std::collections::VecDeque;
use std::iter::Peekable;
use util::diagnostic::Diagnostic;
use util::span::*;

#[derive(Clone, Debug, Default)]
pub struct DeBruijnIndexer {
    inner: VecDeque<String>,
}

impl DeBruijnIndexer {
    pub fn push(&mut self, hint: String) -> usize {
        if self.inner.contains(&hint) {
            self.push(format!("{}'", hint))
        } else {
            let idx = self.inner.len();
            self.inner.push_front(hint);
            idx
        }
    }

    pub fn pop(&mut self) {
        self.inner.pop_front();
    }

    pub fn lookup(&self, key: &str) -> Option<usize> {
        for (idx, s) in self.inner.iter().enumerate() {
            if key == s {
                return Some(idx);
            }
        }
        None
    }
}

pub struct Parser<'s> {
    ctx: DeBruijnIndexer,
    diagnostic: Diagnostic<'s>,
    /// [`Lexer`] impls [`Iterator`] over [`Token`],
    /// so we can just directly wrap it in a [`Peekable`]
    lexer: Peekable<Lexer<'s>>,
    span: Span,
}

impl<'s> Parser<'s> {
    /// Create a new [`Parser`] for the input `&str`
    pub fn new(input: &'s str) -> Parser<'s> {
        Parser {
            ctx: DeBruijnIndexer::default(),
            diagnostic: Diagnostic::new(input),
            lexer: Lexer::new(input.chars()).peekable(),
            span: Span::dummy(),
        }
    }

    fn consume(&mut self) -> Option<Token> {
        let ts = self.lexer.next()?;
        self.span = ts.span;
        Some(ts)
    }

    fn expect(&mut self, kind: TokenKind) -> Option<Token> {
        let tk = self.consume()?;
        match &tk.kind {
            t if t == &kind => Some(tk),
            _ => {
                self.diagnostic
                    .push(format!("Expected token {:?}, found {:?}", kind, tk.kind), tk.span);
                None
            }
        }
    }

    fn expect_term(&mut self) -> Option<Box<Term>> {
        match self.term() {
            Some(term) => Some(term),
            None => {
                let sp = self.peek_span();
                self.diagnostic.push("Expected term".to_string(), sp);
                None
            }
        }
    }

    fn peek(&mut self) -> Option<TokenKind> {
        self.lexer.peek().map(|tk| tk.kind.clone())
    }

    fn peek_span(&mut self) -> Span {
        self.lexer.peek().map(|s| s.span).unwrap_or(self.span)
    }

    fn lambda(&mut self) -> Option<Box<Term>> {
        let start = self.expect(TokenKind::Lambda)?;

        // Bind variable into a new context before parsing the body
        let var = self.ident()?;
        self.ctx.push(var);

        let _ = self.expect(TokenKind::Colon)?;
        let ty = self.ty()?;
        let _ = self.expect(TokenKind::Proj)?;
        let body = self.term()?;

        // Return to previous context
        self.ctx.pop();
        Some(Term::Abs(ty, body).into())
    }

    fn let_expr(&mut self) -> Option<Box<Term>> {
        let start = self.expect(TokenKind::Let)?;
        let var = self.ident()?;
        self.ctx.push(var);
        let _ = self.expect(TokenKind::Equals)?;
        let bind = self.expect_term()?;
        let _ = self.expect(TokenKind::In)?;
        let body = self.expect_term()?;
        self.ctx.pop();
        Some(Term::Let(bind, body).into())
    }

    fn ty_record_field(&mut self) -> Option<RecordField> {
        let ident = self.ident()?;
        self.expect(TokenKind::Colon)?;
        let ty = self.ty()?;
        Some(RecordField {
            ident,
            ty: Box::new(ty),
        })
    }

    fn ty_atom(&mut self) -> Option<Type> {
        match &self.peek()? {
            TokenKind::TyBool => {
                self.consume()?;
                Some(Type::Bool)
            }
            TokenKind::TyNat => {
                self.consume()?;
                Some(Type::Nat)
            }
            TokenKind::TyUnit => {
                self.consume()?;
                Some(Type::Unit)
            }
            TokenKind::LBrace => {
                self.consume()?;
                let mut fields = vec![self.ty_record_field()?];
                while let Some(TokenKind::Comma) = self.peek() {
                    self.expect(TokenKind::Comma)?;
                    fields.push(self.ty_record_field()?);
                }
                self.expect(TokenKind::RBrace)?;
                Some(Type::Record(Record {
                    // span,
                    ident: String::new(),
                    fields,
                }))
            }
            TokenKind::LParen => {
                self.consume()?;
                let r = self.ty()?;
                self.expect(TokenKind::RParen)?;
                Some(r)
            }
            _ => None,
        }
    }

    fn ty(&mut self) -> Option<Type> {
        let span = self.span;
        let mut lhs = match self.ty_atom() {
            Some(ty) => ty,
            None => {
                let sp = self.peek_span();
                self.diagnostic.push("Expected type".to_string(), sp);
                return None;
            }
        };

        if let Some(TokenKind::TyArrow) = self.peek() {
            self.consume()?;
        }
        while let Some(rhs) = self.ty_atom() {
            lhs = Type::Arrow(Box::new(lhs), Box::new(rhs));
            if let Some(TokenKind::TyArrow) = self.peek() {
                self.consume()?;
            } else {
                break;
            }
        }
        Some(lhs)
    }

    /// Parse an application of form:
    /// application = atom application' | atom
    /// application' = atom application' | empty
    fn application(&mut self) -> Option<Box<Term>> {
        let mut lhs = self.atom()?;
        let span = self.span;
        while let Some(rhs) = self.atom() {
            lhs = Term::App(lhs, rhs).into();
        }

        if let Some(TokenKind::Proj) = self.peek() {
            self.expect(TokenKind::Proj)?;
            let accessor = self.ident()?;
            lhs = Term::Projection(lhs, accessor.into()).into();
        }
        Some(lhs)
    }

    fn ident(&mut self) -> Option<String> {
        let Token { kind, span } = self.consume()?;
        match kind {
            TokenKind::Ident(s) => Some(s),
            _ => {
                self.diagnostic
                    .push(format!("Expected identifier, found {:?}", kind), span);
                None
            }
        }
    }

    fn record_field(&mut self) -> Option<Field> {
        let span = self.span;
        let ident = self.ident()?;
        self.expect(TokenKind::Colon)?;
        let term = self.expect_term()?;

        Some(Field {
            span: span + self.span,
            ident,
            term,
        })
    }

    fn record(&mut self) -> Option<Box<Term>> {
        let mut fields = vec![self.record_field()?];
        let span = self.span;
        while let Some(TokenKind::Comma) = self.peek() {
            self.expect(TokenKind::Comma)?;
            fields.push(self.record_field()?);
        }
        Some(Term::Record(fields).into())
    }

    fn if_expr(&mut self) -> Option<Box<Term>> {
        let _ = self.expect(TokenKind::If)?;
        let guard = self.expect_term()?;
        let _ = self.expect(TokenKind::Then)?;
        let csq = self.expect_term()?;
        let _ = self.expect(TokenKind::Else)?;
        let alt = self.expect_term()?;
        Some(Term::If(guard, csq, alt).into())
    }

    /// Parse an atomic term
    /// LPAREN term RPAREN | var
    fn atom(&mut self) -> Option<Box<Term>> {
        match self.peek()? {
            TokenKind::True => {
                self.expect(TokenKind::True)?;
                Some(Term::True.into())
            }
            TokenKind::False => {
                self.expect(TokenKind::False)?;
                Some(Term::False.into())
            }
            TokenKind::If => self.if_expr(),
            TokenKind::Let => self.let_expr(),
            TokenKind::Nat(n) => {
                self.consume()?;
                // Unroll the literal into its Peano encoding rather than
                // silently discarding its value
                let mut num = Term::Zero;
                for _ in 0..n {
                    num = Term::Succ(Box::new(num));
                }
                Some(num.into())
            }
            TokenKind::Succ => {
                self.expect(TokenKind::Succ)?;
                Some(Term::Succ(self.term()?).into())
            }
            TokenKind::Pred => {
                self.expect(TokenKind::Pred)?;
                Some(Term::Pred(self.term()?).into())
            }
            TokenKind::IsZero => {
                self.expect(TokenKind::IsZero)?;
                Some(Term::IsZero(self.term()?).into())
            }
            TokenKind::LParen => {
                self.expect(TokenKind::LParen)?;
                let term = self.term()?;
                self.expect(TokenKind::RParen)?;
                Some(term)
            }
            TokenKind::LBrace => {
                self.expect(TokenKind::LBrace)?;
                let term = self.record()?;
                self.expect(TokenKind::RBrace)?;
                Some(term)
            }
            TokenKind::Unit => {
                self.expect(TokenKind::Unit)?;
                Some(Term::Unit.into())
            }
            TokenKind::Lambda => self.lambda(),
            TokenKind::Ident(s) => {
                let sp = self.consume()?.span;
                match self.ctx.lookup(&s) {
                    Some(idx) => Some(Term::Var(idx).into()),
                    None => {
                        self.diagnostic.push(format!("Unbound variable {}", s), sp);
                        None
                    }
                }
            }
            _ => None,
        }
    }

    fn term(&mut self) -> Option<Box<Term>> {
        match self.peek()? {
            // TokenKind::Lambda => self.lambda(),
            _ => self.application(),
        }
    }

    pub fn parse_term(&mut self) -> Option<Box<Term>> {
        self.term()
    }

    pub fn diagnostic(self) -> Diagnostic<'s> {
        self.diagnostic
    }
}


================================================
FILE: 04_stlc/src/term.rs
================================================
use crate::typing::Type;
use std::fmt;
use util::span::Span;

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub struct Field {
    pub span: Span,
    pub ident: String,
    pub term: Box<Term>,
}

// pub enum Item {
//     Variant(VariantDecl),
//     Record(RecordDecl)
// }

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum Term {
    Unit,
    True,
    False,
    Zero,
    Succ(Box<Term>),
    Pred(Box<Term>),
    IsZero(Box<Term>),
    // De Bruijn index
    Var(usize),
    // Type of bound variable, and body of abstraction
    Abs(Type, Box<Term>),
    // Application (t1 t2)
    App(Box<Term>, Box<Term>),
    If(Box<Term>, Box<Term>, Box<Term>),
    Let(Box<Term>, Box<Term>),
    Record(Vec<Field>),
    Projection(Box<Term>, Box<String>),
}

pub fn record_access(fields: &[Field], projection: &str) -> Option<Box<Term>> {
    for f in fields {
        if f.ident == projection {
            return Some(f.term.clone());
        }
    }
    None
}

impl fmt::Display for Term {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match self {
            Term::Unit => write!(f, "unit"),
            Term::True => write!(f, "true"),
            Term::False => write!(f, "false"),
            Term::Zero => write!(f, "Z"),
            Term::Succ(t) => write!(f, "S({})", t),
            Term::Pred(t) => write!(f, "P({})", t),
            Term::IsZero(t) => write!(f, "IsZero({})", t),
            Term::Var(idx) => write!(f, "#{}", idx),
            Term::Abs(ty, body) => write!(f, "λ_:{:?}. {}", ty, body),
            Term::App(t1, t2) => write!(f, "({}) {}", t1, t2),
            Term::If(a, b, c) => write!(f, "if {} then {} else {}", a, b, c),
            Term::Let(bind, body) => write!(f, "let x={} in {}", bind, body),
            Term::Record(rec) => write!(
                f,
                "{{{}}}",
                rec.iter()
                    .map(|x| format!("{}:{}", x.ident, x.term))
                    .collect::<Vec<String>>()
                    .join(",")
            ),
            Term::Projection(rec, idx) => write!(f, "{}.{}", rec, idx),
        }
    }
}
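`Term::Var` carries a De Bruijn index rather than a name. As a standalone illustration (the `Named` and `DeBruijn` types and `remove_names` helper below are hypothetical, not part of this crate), here is a sketch of how a named term maps onto that representation:

```rust
#[derive(Debug, PartialEq)]
enum Named {
    Var(String),
    Abs(String, Box<Named>),
    App(Box<Named>, Box<Named>),
}

#[derive(Debug, PartialEq)]
enum DeBruijn {
    Var(usize),
    Abs(Box<DeBruijn>),
    App(Box<DeBruijn>, Box<DeBruijn>),
}

// ctx holds binder names, innermost last; a variable's index is its
// distance (in binders) from its own binder.
fn remove_names(t: &Named, ctx: &mut Vec<String>) -> DeBruijn {
    match t {
        Named::Var(x) => {
            let idx = ctx.iter().rev().position(|b| b == x).expect("unbound variable");
            DeBruijn::Var(idx)
        }
        Named::Abs(x, body) => {
            ctx.push(x.clone());
            let body = remove_names(body, ctx);
            ctx.pop();
            DeBruijn::Abs(Box::new(body))
        }
        Named::App(a, b) => DeBruijn::App(
            Box::new(remove_names(a, ctx)),
            Box::new(remove_names(b, ctx)),
        ),
    }
}
```

For example, `λx. λy. x y` becomes `Abs(Abs(App(Var(1), Var(0))))`: `x` is one binder away, `y` zero.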


================================================
FILE: 04_stlc/src/typing.rs
================================================
use crate::term::Term;
use std::fmt;

#[derive(Clone, PartialEq, PartialOrd)]
pub enum Type {
    Unit,
    Bool,
    Nat,
    Arrow(Box<Type>, Box<Type>),
    Record(Record),
}

#[derive(Clone, PartialEq, PartialOrd)]
pub struct Record {
    // pub span: Span,
    pub ident: String,
    pub fields: Vec<RecordField>,
}

#[derive(Clone, PartialEq, PartialOrd)]
pub struct RecordField {
    // pub span: Span,
    pub ident: String,
    pub ty: Box<Type>,
}

impl fmt::Debug for Type {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match self {
            Type::Unit => write!(f, "Unit"),
            Type::Bool => write!(f, "Bool"),
            Type::Nat => write!(f, "Nat"),
            Type::Arrow(a, b) => write!(f, "({:?}->{:?})", a, b),
            Type::Record(r) => write!(
                f,
                "{} {{{}}}",
                r.ident,
                r.fields
                    .iter()
                    .map(|x| format!("{}:{:?}", x.ident, x.ty))
                    .collect::<Vec<String>>()
                    .join(",")
            ),
        }
    }
}

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum TypeError {
    Guard,
    ArmMismatch,
    ParameterMismatch,
    UnknownVariable(usize),
    ExpectedArrow,
    InvalidProjection,
    NotRecordType,
}

/// A typing context, Γ
///
/// Much simpler than the binding list suggested in the book (and used
/// in the other directories), but it should also be more efficient:
/// a Vec is overkill when each scope introduces exactly one binding.
#[derive(Clone, Debug, Default)]
pub struct Context<'a> {
    parent: Option<&'a Context<'a>>,
    ty: Option<Type>,
}

impl<'a> Context<'a> {
    pub fn add(&self, ty: Type) -> Context {
        if self.ty.is_none() {
            Context {
                parent: None,
                ty: Some(ty),
            }
        } else {
            Context {
                parent: Some(self),
                ty: Some(ty),
            }
        }
    }

    pub fn get(&self, idx: usize) -> Option<&Type> {
        if idx == 0 {
            self.ty.as_ref()
        } else if let Some(ctx) = self.parent {
            ctx.get(idx - 1)
        } else {
            None
        }
    }

    pub fn type_of(&self, term: &Term) -> Result<Type, TypeError> {
        use Term::*;
        match term {
            Unit => Ok(Type::Unit),
            True => Ok(Type::Bool),
            False => Ok(Type::Bool),
            Zero => Ok(Type::Nat),
            Record(fields) => {
                let fields: Vec<RecordField> = fields
                    .iter()
                    .map(|f| {
                        self.type_of(&f.term).map(|ty| {
                            RecordField {
                                // span: f.span,
                                ident: f.ident.clone(),
                                ty: Box::new(ty),
                            }
                        })
                    })
                    .collect::<Result<Vec<RecordField>, TypeError>>()?;

                Ok(Type::Record(crate::typing::Record {
                    // span: Span::dummy(),
                    ident: String::new(),
                    fields,
                }))
            }
            Projection(r, proj) => match self.type_of(r)? {
                Type::Record(self::Record { fields, .. }) => {
                    for f in &fields {
                        if &f.ident == proj.as_ref() {
                            return Ok(*f.ty.clone());
                        }
                    }
                    Err(TypeError::InvalidProjection)
                }
                _ => Err(TypeError::NotRecordType),
            },
            IsZero(t) => {
                if let Ok(Type::Nat) = self.type_of(t) {
                    Ok(Type::Bool)
                } else {
                    Err(TypeError::ParameterMismatch)
                }
            }
            Succ(t) | Pred(t) => {
                if let Ok(Type::Nat) = self.type_of(t) {
                    Ok(Type::Nat)
                } else {
                    Err(TypeError::ParameterMismatch)
                }
            }
            If(guard, csq, alt) => {
                if let Ok(Type::Bool) = self.type_of(guard) {
                    let ty1 = self.type_of(csq)?;
                    let ty2 = self.type_of(alt)?;
                    if ty1 == ty2 {
                        Ok(ty2)
                    } else {
                        Err(TypeError::ArmMismatch)
                    }
                } else {
                    Err(TypeError::Guard)
                }
            }
            Let(bind, body) => {
                let ty = self.type_of(bind)?;
                let ctx = self.add(ty);
                ctx.type_of(body)
            }
            Var(s) => match self.get(*s) {
                Some(ty) => Ok(ty.clone()),
                _ => Err(TypeError::UnknownVariable(*s)),
            },
            Abs(ty, body) => {
                let ctx = self.add(ty.clone());
                let ty_body = ctx.type_of(body)?;
                Ok(Type::Arrow(Box::new(ty.clone()), Box::new(ty_body)))
            }
            App(t1, t2) => {
                let ty1 = self.type_of(t1)?;
                let ty2 = self.type_of(t2)?;
                match ty1 {
                    Type::Arrow(ty11, ty12) => {
                        if *ty11 == ty2 {
                            Ok(*ty12)
                        } else {
                            Err(TypeError::ParameterMismatch)
                        }
                    }
                    _ => Err(TypeError::ExpectedArrow),
                }
            }
        }
    }
}

// impl<'a> Visitor for Context<'a> {
//     fn visit_var(&mut self, var: usize) {
//         self.get(var)
//             .cloned()
//             .ok_or(TypeError::UnknownVariable(var))
//     }

//     fn visit_abs(&mut self, ty: Type, body: &Term) {
//         let ty = match ty {
//             Type::Var(name) => self
//                 .types
//                 .borrow()
//                 .get(&name)
//                 .cloned()
//                 .ok_or(TypeError::Undefined(name))?,
//             x => x,
//         };
//         let mut ctx = self.add(ty.clone());
//         let ty_body: Result<Type, TypeError> = body.accept(&mut ctx);
//         Ok(Type::Arrow(Box::new(ty), Box::new(ty_body?)))
//     }

//     fn visit_app(&mut self, t1: &Term, t2: &Term) {
//         let ty1 = t1.accept(self)?;
//         let ty2 = t2.accept(self)?;
//         match ty1 {
//             Type::Arrow(ty11, ty12) => {
//                 if *ty11 == ty2 {
//                     Ok(*ty12)
//                 } else {
//                     Err(TypeError::ParameterMismatch)
//                 }
//             }
//             _ => Err(TypeError::ExpectedArrow),
//         }
//     }

//     fn visit_if(
//         &mut self,
//         guard: &Term,
//         csq: &Term,
//         alt: &Term,
//     ) {
//         if let Ok(Type::Bool) = guard.accept(self) {
//             let ty1 = csq.accept(self)?;
//             let ty2 = alt.accept(self)?;
//             if ty1 == ty2 {
//                 Ok(ty2)
//             } else {
//                 Err(TypeError::ArmMismatch)
//             }
//         } else {
//             Err(TypeError::Guard)
//         }
//     }

//     fn visit_let(&mut self, bind: &Term, body: &Term) {
//         // Dirty hack or correct behavior?
//         //
//         // We definitely need to correct var indices or how the context is
//         // working so that let binders can access names defined in an
//         // enclosing let-bound scope
//         let ty = bind
//             // .accept(&mut Shifting::new(Direction::Down))
//             .accept(self)?;
//         let mut ctx = self.add(ty);
//         body.accept(&mut ctx)
//     }

//     fn visit_succ(&mut self, t: &Term) {
//         Ok(Type::Nat)
//     }

//     fn visit_pred(&mut self, t: &Term) {
//         Ok(Type::Nat)
//     }

//     fn visit_iszero(&mut self, t: &Term) {
//         Ok(Type::Bool)
//     }

//     fn visit_const(&mut self, c: &Term) {
//         match c.as_ref() {
//             Term::Unit => Ok(Type::Unit),
//             Term::Zero => Ok(Type::Nat),
//             Term::True | Term::False => Ok(Type::Bool),
//             _ => unreachable!(),
//         }
//     }

//     fn visit_record(&mut self, rec: &[RecordField]) {
//         let tys = rec
//             .iter()
//             .map(|f| f.data.accept(self).map(|ty| (f.label.clone(), ty)))
//             .collect::<Result<Vec<(Rc<String>, Type)>, TypeError>>()?;
//         Ok(Type::Record(tys))
//     }

//     fn visit_proj(&mut self, c: &Term, proj: Rc<String>) {
//         match c.accept(self)? {
//             Type::Record(fields) => {
//                 for f in &fields {
//                     if f.0 == proj {
//                         return Ok(f.1.clone());
//                     }
//                 }
//                 Err(TypeError::InvalidProjection)
//             }
//             _ => Err(TypeError::NotRecordType),
//         }
//     }

//     fn visit_typedecl(&mut self, name: Rc<String>, ty: &Type) {
//         self.bind(name.to_string(), ty.clone());
//         Ok(Type::Unit)
//     }
// }
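The `Context` above is a parent-pointer linked list: each binder borrows its enclosing frame, and `get(n)` walks `n` links outward, so index 0 is always the innermost binding. A toy standalone sketch of that lookup discipline (the `Ctx` type here is a hypothetical stand-in, storing type names as strings):

```rust
// Each call to add borrows the parent frame rather than cloning a Vec;
// get(n) resolves De Bruijn index n by walking n parent links.
struct Ctx<'a> {
    parent: Option<&'a Ctx<'a>>,
    ty: &'static str,
}

impl<'a> Ctx<'a> {
    fn add(&'a self, ty: &'static str) -> Ctx<'a> {
        Ctx { parent: Some(self), ty }
    }

    fn get(&self, idx: usize) -> Option<&'static str> {
        if idx == 0 {
            Some(self.ty)
        } else {
            self.parent.and_then(|p| p.get(idx - 1))
        }
    }
}
```

The borrow-based design means an inner context cannot outlive its parent, which matches the scoping of binders exactly.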


================================================
FILE: 04_stlc/src/visitor.rs
================================================
use super::*;
use crate::term::{Field, Term};
use std::default::Default;

pub trait Visitor: Sized {
    fn visit_var(&mut self, var: usize);
    fn visit_abs(&mut self, ty: Type, body: &Term);
    fn visit_app(&mut self, t1: &Term, t2: &Term);
    fn visit_if(&mut self, guard: &Term, csq: &Term, alt: &Term);
    fn visit_let(&mut self, bind: &Term, body: &Term);
    fn visit_succ(&mut self, t: &Term);
    fn visit_pred(&mut self, t: &Term);
    fn visit_iszero(&mut self, t: &Term);
    fn visit_const(&mut self, c: &Term);
    fn visit_record(&mut self, c: &[Field]);
    fn visit_proj(&mut self, c: &Term, proj: &str);
    fn visit_typedecl(&mut self, name: &str, ty: &Type);
}

pub trait MutVisitor: Sized {
    fn visit_var(&mut self, var: &mut Term) {}

    fn visit_abs(&mut self, ty: &mut Type, body: &mut Term) {
        self.visit_term(body);
    }
    fn visit_app(&mut self, t1: &mut Term, t2: &mut Term) {
        self.visit_term(t1);
        self.visit_term(t2);
    }
    fn visit_if(&mut self, guard: &mut Term, csq: &mut Term, alt: &mut Term) {
        self.visit_term(guard);
        self.visit_term(csq);
        self.visit_term(alt);
    }
    fn visit_let(&mut self, bind: &mut Term, body: &mut Term) {
        self.visit_term(bind);
        self.visit_term(body);
    }
    fn visit_succ(&mut self, t: &mut Term) {
        self.visit_term(t);
    }
    fn visit_pred(&mut self, t: &mut Term) {
        self.visit_term(t);
    }
    fn visit_iszero(&mut self, t: &mut Term) {
        self.visit_term(t);
    }
    fn visit_const(&mut self, t: &mut Term) {}
    fn visit_record(&mut self, c: &mut [Field]) {
        for t in c {
            self.visit_term(t.term.as_mut());
        }
    }
    fn visit_proj(&mut self, t: &mut Term, proj: &mut String) {
        self.visit_term(t);
    }
    fn visit_typedecl(&mut self, name: &mut String, ty: &mut Type) {}

    fn visit_term(&mut self, term: &mut Term) {
        walk_mut_term(self, term);
    }
}

fn walk_mut_term<V: MutVisitor>(visitor: &mut V, var: &mut Term) {
    match var {
        Term::Unit | Term::True | Term::False | Term::Zero => visitor.visit_const(var),
        Term::Succ(t) => visitor.visit_succ(t),
        Term::Pred(t) => visitor.visit_pred(t),
        Term::IsZero(t) => visitor.visit_iszero(t),
        Term::Var(_) => visitor.visit_var(var),
        Term::Abs(ty, body) => visitor.visit_abs(ty, body),
        Term::App(t1, t2) => visitor.visit_app(t1, t2),
        Term::If(a, b, c) => visitor.visit_if(a, b, c),
        Term::Let(bind, body) => visitor.visit_let(bind, body),
        Term::Record(rec) => visitor.visit_record(rec),
        Term::Projection(rec, idx) => visitor.visit_proj(rec, idx),
    }
}

#[derive(Copy, Clone, Debug)]
pub enum Direction {
    Up,
    Down,
}

#[derive(Copy, Clone, Debug)]
pub struct Shifting {
    pub cutoff: usize,
    pub direction: Direction,
}

impl Default for Shifting {
    fn default() -> Self {
        Shifting {
            cutoff: 0,
            direction: Direction::Up,
        }
    }
}

impl Shifting {
    pub fn new(direction: Direction) -> Self {
        Shifting { cutoff: 0, direction }
    }
}

impl MutVisitor for Shifting {
    fn visit_var(&mut self, var: &mut Term) {
        let n = match var {
            Term::Var(n) => n,
            _ => unreachable!(),
        };

        if *n >= self.cutoff {
            // NB: Subtracting 1 from the usize here is safe, as long as
            // a shift Down is only performed *after* a shift/substitute cycle
            match self.direction {
                Direction::Up => *n += 1,
                Direction::Down => *n -= 1,
            }
        }
    }

    fn visit_abs(&mut self, _ty: &mut Type, body: &mut Term) {
        self.cutoff += 1;
        self.visit_term(body);
        self.cutoff -= 1;
    }

    fn visit_let(&mut self, bind: &mut Term, body: &mut Term) {
        self.cutoff += 1;
        self.visit_term(bind);
        self.visit_term(body);
        self.cutoff -= 1;
    }
}

#[derive(Debug)]
pub struct Substitution {
    pub cutoff: usize,
    pub term: Term,
}

impl Substitution {
    pub fn new(term: Term) -> Substitution {
        Substitution { cutoff: 0, term }
    }
}

impl MutVisitor for Substitution {
    fn visit_var(&mut self, var: &mut Term) {
        match var {
            Term::Var(n) if *n >= self.cutoff => {
                *var = self.term.clone();
            }
            // Variables bound below the cutoff are left untouched; this
            // arm is reachable for Term::Var, so it must not panic
            _ => {}
        }
    }

    fn visit_abs(&mut self, _ty: &mut Type, body: &mut Term) {
        self.cutoff += 1;
        walk_mut_term(self, body);
        self.cutoff -= 1;
    }

    fn visit_let(&mut self, bind: &mut Term, body: &mut Term) {
        self.cutoff += 1;
        walk_mut_term(self, bind);
        walk_mut_term(self, body);
        self.cutoff -= 1;
    }
}
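`Shifting` and `Substitution` above implement the usual De Bruijn shift/substitute discipline via the mutable visitor. The same operations can be sketched standalone and recursively (toy `T` type and helpers, hypothetical, not this crate's `Term`); note that this sketch also shifts the substituted term as binders are crossed, so free variables in it keep pointing at the right binders:

```rust
#[derive(Clone, Debug, PartialEq)]
enum T {
    Var(usize),
    Abs(Box<T>),
    App(Box<T>, Box<T>),
}

// Shift free variables (index >= cutoff) by d; cutoff grows under binders.
fn shift(t: &mut T, d: isize, cutoff: usize) {
    match t {
        T::Var(n) if *n >= cutoff => *n = (*n as isize + d) as usize,
        T::Var(_) => {}
        T::Abs(b) => shift(b, d, cutoff + 1),
        T::App(a, b) => {
            shift(a, d, cutoff);
            shift(b, d, cutoff);
        }
    }
}

// [0 ↦ s] t: replace the variable bound by the innermost enclosing binder,
// shifting s by the number of binders crossed on the way down.
fn subst(t: &mut T, s: &T, cutoff: usize) {
    match t {
        T::Var(n) if *n == cutoff => {
            let mut s = s.clone();
            shift(&mut s, cutoff as isize, 0);
            *t = s;
        }
        T::Var(_) => {}
        T::Abs(b) => subst(b, s, cutoff + 1),
        T::App(a, b) => {
            subst(a, s, cutoff);
            subst(b, s, cutoff);
        }
    }
}
```

Substituting `Var(3)` for index 0 in `0 (λ. 0 1)` leaves the bound `0` under the inner lambda alone and rewrites the occurrence of the outer variable there as `Var(4)`.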


================================================
FILE: 05_recon/Cargo.toml
================================================
[package]
name = "recon"
version = "0.1.0"
authors = ["Michael Lazear <lazear@scripps.edu>"]
edition = "2018"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
util = { path = "../util" }

================================================
FILE: 05_recon/src/disjoint.rs
================================================
//! A disjoint set using the union-find algorithm with path-compression

use std::cell::Cell;
use std::cmp::Ordering;
use std::collections::{HashMap, HashSet};

struct SetElement<T> {
    data: Option<T>,
    rank: Cell<u32>,
    parent: Cell<usize>,
}

pub struct DisjointSet<T> {
    elements: Vec<SetElement<T>>,
    components: Cell<usize>,
}

impl<T> Default for DisjointSet<T> {
    fn default() -> Self {
        DisjointSet {
            elements: Vec::new(),
            components: Cell::new(0),
        }
    }
}

#[derive(Copy, Clone, Debug, PartialEq, PartialOrd, Eq, Hash)]
pub struct Element(usize);

pub enum Choice {
    Left,
    Right,
}

impl<T> DisjointSet<T> {
    pub fn new() -> DisjointSet<T> {
        DisjointSet {
            elements: Vec::new(),
            components: Cell::new(0),
        }
    }

    pub fn singleton(&mut self, data: T) -> Element {
        let n = self.elements.len();
        let elem = SetElement {
            data: Some(data),
            rank: Cell::new(0),
            parent: Cell::new(n),
        };
        self.elements.push(elem);
        self.components.replace(self.components.get() + 1);
        Element(n)
    }

    fn find_set(&self, id: usize) -> usize {
        // locate parent set
        let mut ptr = id;
        while ptr != self.elements[ptr].parent.get() {
            ptr = self.elements[ptr].parent.get();
        }

        // id is the representative element, return
        if ptr == id {
            return id;
        }

        // perform path compression
        let parent = ptr;
        ptr = id;
        while ptr != self.elements[ptr].parent.get() {
            ptr = self.elements[ptr].parent.replace(parent);
        }
        parent
    }

    pub fn find_repr(&self, element: Element) -> Element {
        Element(self.find_set(element.0))
    }

    pub fn data(&self, element: Element) -> Option<&T> {
        self.elements[element.0].data.as_ref()
    }

    pub fn find(&self, element: Element) -> &T {
        // Invariant: the representative element's data is always Some
        self.elements[self.find_set(element.0)]
            .data
            .as_ref()
            .expect("Invariant violated")
    }

    pub fn union<F: Fn(T, T) -> T>(&mut self, f: F, a: Element, b: Element) {
        let pa = self.find_set(a.0);
        let pb = self.find_set(b.0);

        if pa == pb {
            return;
        }

        // Move data out first to appease borrowck
        let a_data = self.elements[pa].data.take().expect("Invariant violated");
        let b_data = self.elements[pb].data.take().expect("Invariant violated");

        self.components.replace(self.components.get() - 1);
        match self.elements[pa].rank.cmp(&self.elements[pb].rank) {
            Ordering::Equal => {
                self.elements[pa].data = Some(f(a_data, b_data));
                self.elements[pb].parent.replace(pa);
                self.elements[pa].rank.replace(self.elements[pa].rank.get() + 1);
            }
            Ordering::Less => {
                self.elements[pb].data = Some(f(a_data, b_data));
                self.elements[pa].parent.replace(pb);
                self.elements[pb].rank.replace(self.elements[pb].rank.get() + 1);
            }
            Ordering::Greater => {
                self.elements[pa].data = Some(f(a_data, b_data));
                self.elements[pb].parent.replace(pa);
                self.elements[pa].rank.replace(self.elements[pa].rank.get() + 1);
            }
        }
    }

    pub fn partition(&self) -> Vec<&T> {
        let mut v = HashSet::new();

        for idx in 0..self.elements.len() {
            v.insert(self.find_set(idx));
        }
        v.into_iter()
            .map(|idx| self.elements[idx].data.as_ref().unwrap())
            .collect()
    }
}

use super::*;
type Variable = Element;

#[derive(Debug, Clone)]
pub enum Unification {
    Unknown(TypeVar),
    Constr(Tycon, Vec<Variable>),
}

impl Unification {
    fn is_var(&self) -> bool {
        matches!(self, Self::Unknown(_))
    }
}

#[derive(Debug, Default)]
pub struct Unifier {
    set: disjoint::DisjointSet<Unification>,
    map: HashMap<Type, Variable>,
}

impl Unifier {
    pub fn new() -> Unifier {
        Unifier {
            set: DisjointSet::new(),
            map: HashMap::default(),
        }
    }

    pub fn occurs_check(&self, v: TypeVar, u: &Unification) -> bool {
        match u {
            Unification::Unknown(x) => *x == v,
            Unification::Constr(_, vars) => vars.iter().any(|x| self.occurs_check(v, self.set.find(*x))),
        }
    }

    pub fn decode(&self, uni: &Unification) -> Type {
        match uni {
            Unification::Unknown(x) => Type::Var(*x),
            Unification::Constr(tc, vars) => {
                Type::Con(*tc, vars.iter().map(|v| self.decode(self.set.find(*v))).collect())
            }
        }
    }

    pub fn intern(&mut self, ty: Type) -> Variable {
        if let Some(v) = self.map.get(&ty) {
            return *v;
        }

        let v = match &ty {
            Type::Var(x) => self.set.singleton(Unification::Unknown(*x)),
            Type::Con(tc, vars) => {
                let vars = vars.iter().cloned().map(|v| self.intern(v)).collect();
                self.set.singleton(Unification::Constr(*tc, vars))
            }
        };
        self.map.insert(ty, v);

        v
    }

    fn var_bind(&mut self, v: TypeVar, v_: Variable, u: &Unification, u_: Variable) -> Result<(), String> {
        if self.occurs_check(v, u) {
            return Err(format!("Failed occurs check {:?} {:?}", v, u));
        }
        self.set.union(
            |a, b| match (a, b) {
                (a @ Unification::Constr(_, _), _) => a,
                (_, b) => b,
            },
            u_,
            v_,
        );
        Ok(())
    }

    pub fn subst(&self) -> HashMap<TypeVar, Type> {
        let mut map = HashMap::new();
        for (ty, var) in &self.map {
            match ty {
                Type::Var(x) => {
                    map.insert(*x, self.decode(self.set.find(*var)));
                }
                _ => {}
            }
        }

        map
    }

    pub fn unify(&mut self, a_: Variable, b_: Variable) -> Result<(), String> {
        if a_ == b_ {
            return Ok(());
        }
        if a_ == self.set.find_repr(b_) || b_ == self.set.find_repr(a_) {
            return Ok(());
        }
        let a = self.set.find(a_).clone();
        let b = self.set.find(b_).clone();
        use Unification::*;
        match (a, b) {
            (Unknown(a), b) => self.var_bind(a, a_, &b, b_),
            (a, Unknown(b)) => self.var_bind(b, b_, &a, a_),
            (Constr(a, a_vars), Constr(b, b_vars)) => {
                if a != b {
                    return Err(format!("Can't unify constructors {:?} and {:?}", a, b));
                }
                if a_vars.len() != b_vars.len() {
                    return Err(format!("Can't unify argument lists {:?} and {:?}", a_vars, b_vars));
                }
                for (c, d) in a_vars.into_iter().zip(b_vars) {
                    // Recursively unify the argument lists, so that clashes
                    // between nested constructors are caught as well
                    self.unify(c, d)?;
                }
                Ok(())
            }
        }
    }
}

pub fn solve<I: Iterator<Item = (Type, Type)>>(iter: I) -> Result<HashMap<TypeVar, Type>, String> {
    let mut un = Unifier::new();

    for (a, b) in iter {
        let a = un.intern(a);
        let b = un.intern(b);
        un.unify(a, b)?;
    }
    // The variable -> type map is exactly what Unifier::subst computes
    Ok(un.subst())
}

impl<T: std::fmt::Debug> std::fmt::Debug for DisjointSet<T> {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        let part = self.partition();
        writeln!(f, "{{")?;
        for values in part {
            writeln!(f, "\t{:?}", values)?;
        }
        writeln!(f, "}}")
    }
}
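`DisjointSet` above layers per-set data, ranks, and interior mutability on top of union-find. The core find/union mechanics with path compression can be sketched standalone (the `Uf` type here is hypothetical, indices only, no payloads):

```rust
// Minimal union-find over indices. parent[x] == x marks a representative.
struct Uf {
    parent: Vec<usize>,
}

impl Uf {
    fn new(n: usize) -> Uf {
        Uf { parent: (0..n).collect() }
    }

    fn find(&mut self, mut x: usize) -> usize {
        while self.parent[x] != x {
            // Path halving: point x at its grandparent as we walk up,
            // flattening the tree a little on every find
            self.parent[x] = self.parent[self.parent[x]];
            x = self.parent[x];
        }
        x
    }

    fn union(&mut self, a: usize, b: usize) {
        let (ra, rb) = (self.find(a), self.find(b));
        if ra != rb {
            self.parent[ra] = rb;
        }
    }
}
```

This sketch omits union-by-rank; `DisjointSet` keeps ranks so that the shorter tree is always attached under the taller one.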


================================================
FILE: 05_recon/src/main.rs
================================================
use std::collections::{HashMap, HashSet};
pub mod disjoint;
pub mod mutation;
pub mod naive;
pub mod parser;
pub mod types;

use types::*;

#[derive(Debug)]
pub enum Term {
    Unit,
    Bool(bool),
    Int(usize),
    Var(usize, String),
    Abs(Box<Term>),
    App(Box<Term>, Box<Term>),
    Let(Box<Term>, Box<Term>),
    If(Box<Term>, Box<Term>, Box<Term>),
}

#[derive(Debug)]
pub enum TypedTerm {
    Unit,
    Bool(bool),
    Int(usize),
    Var(usize, String),
    Abs(Box<SystemF>),
    App(Box<SystemF>, Box<SystemF>),
    Let(Box<SystemF>, Box<SystemF>),
    If(Box<SystemF>, Box<SystemF>, Box<SystemF>),
}

#[derive(Debug)]
pub struct SystemF<T = Type> {
    expr: TypedTerm,
    ty: T,
}

pub enum Constraint {
    Eq(Type, Type),
    Inst(Type, Scheme),
    Gen(Type, Vec<TypeVar>, Type),
}

#[derive(Default, Debug)]
struct Elaborator {
    exist: TypeVar,
    context: Vec<Scheme>,
    constraints: Vec<(Type, Type)>,

    uni: disjoint::Unifier,
}

impl SystemF {
    fn new(expr: TypedTerm, ty: Type) -> SystemF {
        SystemF { expr, ty }
    }

    fn de(self) -> (TypedTerm, Type) {
        (self.expr, self.ty)
    }
}

impl Elaborator {
    fn fresh(&mut self) -> TypeVar {
        let ex = self.exist;
        self.exist.0 += 1;
        ex
    }

    fn ftv(&self) -> HashSet<TypeVar> {
        let mut set = HashSet::new();
        for s in &self.context {
            set.extend(s.ftv());
        }
        set
    }

    fn get_scheme(&self, index: usize) -> Option<&Scheme> {
        // De Bruijn index 0 is the most recently pushed scheme
        self.context.iter().rev().nth(index)
    }

    fn generalize(&mut self, ty: Type) -> Scheme {
        let set: HashSet<TypeVar> = ty.ftv().difference(&self.ftv()).copied().collect();

        if set.is_empty() {
            Scheme::Mono(ty)
        } else {
            Scheme::Poly(set.into_iter().collect(), ty)
        }
    }

    fn instantiate(&mut self, scheme: Scheme) -> Type {
        match scheme {
            Scheme::Mono(ty) => ty,
            Scheme::Poly(vars, ty) => {
                let freshv: Vec<TypeVar> = (0..vars.len()).map(|_| self.fresh()).collect();
                let map = vars
                    .into_iter()
                    .zip(freshv.iter())
                    .map(|(v, f)| (v, Type::Var(*f)))
                    .collect::<HashMap<TypeVar, Type>>();
                ty.apply(&map)
            }
        }
    }

    fn push(&mut self, ty: (Type, Type)) {
        let a = self.uni.intern(ty.0);
        let b = self.uni.intern(ty.1);
        self.uni.unify(a, b).unwrap();
    }

    fn elaborate(&mut self, term: &Term) -> SystemF {
        // dbg!(term);
        match term {
            Term::Unit => SystemF::new(TypedTerm::Unit, Type::Con(T_UNIT, vec![])),
            Term::Bool(b) => SystemF::new(TypedTerm::Bool(*b), Type::Con(T_BOOL, vec![])),
            Term::Int(i) => SystemF::new(TypedTerm::Int(*i), Type::Con(T_INT, vec![])),
            // x has type T iff T is an instance of the type scheme associated with x
            Term::Var(x, s) => {
                let scheme = self.get_scheme(*x).cloned().expect("Unbound variable!");
                let ty = self.instantiate(scheme.clone());
                SystemF::new(TypedTerm::Var(*x, s.clone()), ty)
            }

            Term::Abs(body) => {
                let arg = self.fresh();

                self.context.push(Scheme::Mono(Type::Var(arg)));
                let (body, ty) = self.elaborate(body).de();
                self.context.pop();
                let arrow = Type::arrow(Type::Var(arg), ty.clone());
                SystemF::new(TypedTerm::Abs(Box::new(SystemF::new(body, ty))), arrow)
            }
            // t1 t2 has type T iff for some X2, t1 has type X2 -> T and t2 has type X2
            Term::App(t1, t2) => {
                let (t1, ty1) = self.elaborate(t1).de();
                let (t2, ty2) = self.elaborate(t2).de();

                let v = self.fresh();
                self.push((ty1.clone(), Type::arrow(ty2.clone(), Type::Var(v))));

                SystemF::new(
                    TypedTerm::App(Box::new(SystemF::new(t1, ty1)), Box::new(SystemF::new(t2, ty2))),
                    Type::Var(v),
                )
            }
            Term::Let(t1, t2) => {
                let (t1, ty1) = self.elaborate(t1).de();

                // let sub = disjoint::solve(self.constraints.drain(..)).unwrap();
                // for (a, b) in self.constraints.drain(..) {
                //     let a = self.uni.intern(a);
                //     let b = self.uni.intern(b);
                //     self.uni.unify(a, b).unwrap();
                // }

                let sub = self.uni.subst();
                self.context = self.context.drain(..).map(|sch| sch.apply(&sub)).collect();
                let scheme = self.generalize(ty1.clone().apply(&sub));

                self.context.push(scheme);
                let (t2, ty2) = self.elaborate(t2).de();
                self.context.pop();
                SystemF::new(
                    TypedTerm::Let(Box::new(SystemF::new(t1, ty1)), Box::new(SystemF::new(t2, ty2.clone()))),
                    ty2,
                )
            }
            Term::If(t1, t2, t3) => {
                let (t1, ty1) = self.elaborate(t1).de();
                let (t2, ty2) = self.elaborate(t2).de();
                let (t3, ty3) = self.elaborate(t3).de();

                let fresh = self.fresh();
                self.push((ty1.clone(), Type::bool()));
                self.push((ty2.clone(), Type::Var(fresh)));
                self.push((ty3.clone(), Type::Var(fresh)));

                SystemF::new(
                    TypedTerm::If(
                        Box::new(SystemF::new(t1, ty1)),
                        Box::new(SystemF::new(t2, ty2)),
                        Box::new(SystemF::new(t3, ty3)),
                    ),
                    Type::Var(fresh),
                )
            }
        }
    }
}

impl TypedTerm {
    fn subst(self, s: &HashMap<TypeVar, Type>) -> TypedTerm {
        use TypedTerm::*;
        match self {
            Abs(a) => Abs(Box::new(a.subst(s))),
            App(a, b) => App(Box::new(a.subst(s)), Box::new(b.subst(s))),
            Let(a, b) => Let(Box::new(a.subst(s)), Box::new(b.subst(s))),
            If(a, b, c) => If(Box::new(a.subst(s)), Box::new(b.subst(s)), Box::new(c.subst(s))),
            x => x,
        }
    }
}

impl SystemF {
    fn subst(self, s: &HashMap<TypeVar, Type>) -> SystemF {
        SystemF {
            expr: self.expr.subst(s),
            ty: self.ty.apply(s),
        }
    }
}

fn main() {
    use std::io::prelude::*;
    use std::time::Instant;

    // let input = "fn m. let y = m in let x = y true in x";
    let input = "
    let id = fn x. x in 
        let g = id id in 
        let f = id true in 
        let h = (id id) 1 in 
        let j = id 10 in 
        g f";
    let tm = parser::Parser::new(input).parse_term().unwrap();

    let start = Instant::now();
    let mut gen = mutation::Elaborator::default();
    let tm = gen.elaborate(&tm);
    // let sub = gen.uni.subst();
    // let sub =  disjoint.solve(gen.constraints);
    // let sub = disjoint::solve(gen.constraints.into_iter());
    let end1 = start.elapsed().as_micros();
    println!("{:?} {:?}", end1, tm);

    loop {
        let mut buffer = String::new();
        print!("repl: ");
        std::io::stdout().flush().unwrap();
        std::io::stdin().read_to_string(&mut buffer).unwrap();
        // let mut gen = Elaborator::default();
        match parser::Parser::new(&buffer).parse_term() {
            Some(tm) => {
                // let (tm, ty) = gen.elaborate(&tm).de();

                let mut e = mutation::Elaborator::default();
                dbg!(e.elaborate(&tm));

                // let mut sub = HashMap::new();
                // println!("{:?}", gen.constraints);
                // for (a, b) in &gen.constraints {
                //     let tmp = unify(a.clone().apply(&sub), b.clone().apply(&sub)).unwrap();
                //     sub = compose(tmp, sub);
                // }
                // let sub =  disjoint::solve(gen.constraints.clone());
                // println!("{:?}", sub);
                // println!("tm {:#?} :{:?}", tm, ty);

                // println!("tm {:#?} :{:?}", tm.subst(&sub), ty.apply(&sub));

                // dbg!(sub);
            }
            None => println!("parse error!"),
        }
    }
}


================================================
FILE: 05_recon/src/mutation/mod.rs
================================================
use super::{Term, T_ARROW, T_BOOL, T_INT, T_UNIT};
use std::collections::{HashMap, HashSet, VecDeque};
use std::rc::Rc;

mod write_once;
use write_once::WriteOnce;

#[derive(Debug, Clone, PartialEq)]
pub struct TypeVar {
    exist: usize,
    data: Rc<WriteOnce<Type>>,
}

#[derive(Debug, Clone, PartialEq)]
pub enum Type {
    Var(TypeVar),
    Con(super::Tycon, Vec<Type>),
}

#[derive(Debug, Clone)]
pub enum Scheme {
    Mono(Type),
    Poly(Vec<usize>, Type),
}

#[derive(Debug)]
pub enum TypedTerm {
    Unit,
    Bool(bool),
    Int(usize),
    Var(usize, String),
    Abs(Box<SystemF>),
    App(Box<SystemF>, Box<SystemF>),
    Let(Box<SystemF>, Box<SystemF>),
    If(Box<SystemF>, Box<SystemF>, Box<SystemF>),
}

#[derive(Debug)]
pub struct SystemF {
    expr: TypedTerm,
    ty: Type,
}

impl SystemF {
    fn new(expr: TypedTerm, ty: Type) -> SystemF {
        SystemF { expr, ty }
    }
}

impl Type {
    fn ftv(&self, rank: usize) -> HashSet<usize> {
        let mut set = HashSet::new();
        let mut queue = VecDeque::new();
        queue.push_back(self);

        while let Some(ty) = queue.pop_front() {
            match ty {
                Type::Var(x) => match x.data.get() {
                    None => {
                        if x.data.get_rank() > rank {
                            set.insert(x.exist);
                        }
                    }
                    Some(link) => {
                        queue.push_back(link);
                    }
                },
                Type::Con(_, tys) => {
                    for ty in tys {
                        queue.push_back(ty);
                    }
                }
            }
        }
        set
    }

    fn apply(self, map: &HashMap<usize, Type>) -> Type {
        match self {
            Type::Var(x) => match x.data.get() {
                Some(ty) => ty.clone().apply(map),
                None => map.get(&x.exist).cloned().unwrap_or(Type::Var(x)),
            },
            Type::Con(tc, vars) => Type::Con(tc, vars.into_iter().map(|ty| ty.apply(map)).collect()),
        }
    }
}

impl Type {
    pub fn arrow(a: Type, b: Type) -> Type {
        Type::Con(T_ARROW, vec![a, b])
    }

    pub fn bool() -> Type {
        Type::Con(T_BOOL, vec![])
    }

    pub fn de_arrow(&self) -> (&Type, &Type) {
        match self {
            Type::Con(T_ARROW, v) => (&v[0], &v[1]),
            _ => panic!("Not arrow type! {:?}", self),
        }
    }
}

pub fn occurs_check(v: &TypeVar, ty: &Type) -> bool {
    match ty {
        Type::Var(x) => {
            if let Some(info) = x.data.get() {
                occurs_check(v, &info)
            } else {
                let min_rank = x.data.get_rank().min(v.data.get_rank());
                if min_rank != x.data.get_rank() {
                    println!("promoting type var {:?} {}->{}", x, x.data.get_rank(), min_rank);
                    x.data.set_rank(min_rank);
                }

                x.exist == v.exist
            }
        }
        Type::Con(_, vars) => vars.iter().any(|x| occurs_check(v, x)),
    }
}

fn var_bind(v: &TypeVar, ty: &Type) -> Result<(), String> {
    if occurs_check(v, ty) {
        return Err(format!("Failed occurs check {:?} {:?}", v, ty));
    }

    v.data.set(ty.clone()).unwrap();
    Ok(())
}

fn unify_type(a: &Type, b: &Type) -> Result<(), String> {
    match (a, b) {
        (Type::Var(a), b) => match a.data.get() {
            Some(ty) => unify_type(ty, b),
            None => var_bind(a, b),
        },
        (a, Type::Var(b)) => match b.data.get() {
            Some(ty) => unify_type(a, ty),
            None => var_bind(b, a),
        },
        (Type::Con(a, a_args), Type::Con(b, b_args)) => {
            if a != b {
                return Err(format!("Can't unify constructors {:?} and {:?}", a, b));
            }
            if a_args.len() != b_args.len() {
                return Err(format!("Can't unify argument lists {:?} and {:?}", a_args, b_args));
            }
            for (c, d) in a_args.iter().zip(b_args) {
                unify_type(c, d)?;
            }
            Ok(())
        }
    }
}

#[derive(Default, Debug)]
pub struct Elaborator {
    exist: usize,
    rank: usize,
    context: Vec<Scheme>,
}

impl Elaborator {
    fn fresh(&mut self) -> TypeVar {
        let ex = self.exist;
        self.exist += 1;
        TypeVar {
            exist: ex,
            data: Rc::new(WriteOnce::with_rank(self.rank)),
        }
    }

    fn get_scheme(&self, index: usize) -> Option<&Scheme> {
        // De Bruijn index 0 refers to the most recently pushed scheme
        self.context.iter().rev().nth(index)
    }

    fn generalize(&mut self, ty: Type) -> Scheme {
        let set: HashSet<usize> = ty.ftv(self.rank);

        if set.is_empty() {
            Scheme::Mono(ty)
        } else {
            Scheme::Poly(set.into_iter().collect(), ty)
        }
    }

    fn instantiate(&mut self, scheme: Scheme) -> Type {
        match scheme {
            Scheme::Mono(ty) => ty,
            Scheme::Poly(vars, ty) => {
                let map = vars
                    .into_iter()
                    .map(|v| (v, Type::Var(self.fresh())))
                    .collect::<HashMap<usize, Type>>();
                ty.apply(&map)
            }
        }
    }

    pub fn elaborate(&mut self, term: &Term) -> SystemF {
        match term {
            Term::Unit => SystemF::new(TypedTerm::Unit, Type::Con(T_UNIT, vec![])),
            Term::Bool(b) => SystemF::new(TypedTerm::Bool(*b), Type::Con(T_BOOL, vec![])),
            Term::Int(i) => SystemF::new(TypedTerm::Int(*i), Type::Con(T_INT, vec![])),

            Term::Var(x, s) => {
                let scheme = self.get_scheme(*x).cloned().expect("Unbound variable!");
                let ty = self.instantiate(scheme.clone());
                SystemF::new(TypedTerm::Var(*x, s.clone()), ty)
            }
            Term::Abs(body) => {
                let arg = self.fresh();

                self.context.push(Scheme::Mono(Type::Var(arg.clone())));
                let body = self.elaborate(body);
                self.context.pop();
                let arrow = Type::arrow(Type::Var(arg), body.ty.clone());
                SystemF::new(TypedTerm::Abs(Box::new(body)), arrow)
            }
            Term::App(t1, t2) => {
                let t1 = self.elaborate(t1);
                let t2 = self.elaborate(t2);

                let v = self.fresh();

                unify_type(&t1.ty, &Type::arrow(t2.ty.clone(), Type::Var(v.clone()))).unwrap();

                SystemF::new(TypedTerm::App(Box::new(t1), Box::new(t2)), Type::Var(v))
            }
            Term::Let(t1, t2) => {
                self.rank += 1;
                let t1 = self.elaborate(t1);
                self.rank -= 1;

                let scheme = self.generalize(t1.ty.clone());

                self.context.push(scheme);
                let t2 = self.elaborate(t2);
                self.context.pop();
                let ty = t2.ty.clone();
                SystemF::new(TypedTerm::Let(Box::new(t1), Box::new(t2)), ty)
            }
            Term::If(t1, t2, t3) => {
                let t1 = self.elaborate(t1);
                let t2 = self.elaborate(t2);
                let t3 = self.elaborate(t3);

                unify_type(&t1.ty, &Type::bool()).unwrap();
                unify_type(&t2.ty, &t3.ty).unwrap();

                let ty = t2.ty.clone();
                SystemF::new(TypedTerm::If(Box::new(t1), Box::new(t2), Box::new(t3)), ty)
            }
        }
    }
}
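
// NOTE: The Elaborator above unifies destructively: an unbound variable is a
// shared cell that gets linked to a type in place, rather than threading a
// substitution map through the algorithm. A minimal standalone sketch of that
// idea, using `RefCell` in place of `WriteOnce` (the `Ty` names below are
// illustrative, not this module's API):

```rust
use std::cell::RefCell;
use std::rc::Rc;

#[derive(Debug, Clone, PartialEq)]
enum Ty {
    Bool,
    Arrow(Box<Ty>, Box<Ty>),
    // An unbound variable holds None; unification writes the link in place
    Var(Rc<RefCell<Option<Ty>>>),
}

fn unify(a: &Ty, b: &Ty) -> Result<(), String> {
    match (a, b) {
        (Ty::Var(v), t) | (t, Ty::Var(v)) => {
            // Unifying a variable with itself is a no-op
            if let Ty::Var(w) = t {
                if Rc::ptr_eq(v, w) {
                    return Ok(());
                }
            }
            let link = v.borrow().clone();
            match link {
                // Follow an existing link
                Some(bound) => unify(&bound, t),
                None => {
                    // A real implementation performs an occurs check here
                    *v.borrow_mut() = Some(t.clone());
                    Ok(())
                }
            }
        }
        (Ty::Bool, Ty::Bool) => Ok(()),
        (Ty::Arrow(a1, a2), Ty::Arrow(b1, b2)) => {
            unify(a1, b1)?;
            unify(a2, b2)
        }
        _ => Err(format!("cannot unify {:?} and {:?}", a, b)),
    }
}
```

// After a successful unification, every alias of the variable observes the
// binding through the shared cell, with no substitution to compose.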


================================================
FILE: 05_recon/src/mutation/write_once.rs
================================================
use std::cell::{Cell, UnsafeCell};
use std::rc::Rc;
use std::sync::atomic::{AtomicBool, Ordering};

pub struct WriteOnce<T> {
    inner: UnsafeCell<Option<T>>,
    rank: Cell<usize>,
    init: AtomicBool,
}

pub type WriteOnceCell<T> = Rc<WriteOnce<T>>;

impl<T> Default for WriteOnce<T> {
    fn default() -> Self {
        WriteOnce {
            inner: UnsafeCell::new(None),
            rank: Cell::new(0),
            init: false.into(),
        }
    }
}

impl<T: PartialEq> PartialEq for WriteOnce<T> {
    fn eq(&self, other: &Self) -> bool {
        self.get() == other.get()
    }
}

impl<T: std::fmt::Debug> std::fmt::Debug for WriteOnce<T> {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "{:?}#{}", self.get(), self.get_rank())
    }
}

impl<T> WriteOnce<T> {
    pub fn with_rank(rank: usize) -> Self {
        WriteOnce {
            inner: UnsafeCell::new(None),
            rank: Cell::new(rank),
            init: false.into(),
        }
    }

    pub fn set(&self, data: T) -> Result<(), T> {
        // Only the first caller can flip the flag from false to true, so the
        // inner value is written at most once. `compare_exchange` replaces the
        // deprecated `compare_and_swap`.
        if self
            .init
            .compare_exchange(false, true, Ordering::AcqRel, Ordering::Acquire)
            .is_ok()
        {
            unsafe {
                let ptr = &mut *self.inner.get();
                *ptr = Some(data);
            }
            Ok(())
        } else {
            Err(data)
        }
    }

    pub fn get(&self) -> Option<&T> {
        // Reading only requires a load; `Acquire` pairs with the release
        // half of the `AcqRel` exchange in `set`
        if self.init.load(Ordering::Acquire) {
            unsafe { &*self.inner.get() }.as_ref()
        } else {
            None
        }
    }

    pub fn set_rank(&self, rank: usize) {
        self.rank.set(rank)
    }

    pub fn get_rank(&self) -> usize {
        self.rank.get()
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    #[test]
    fn smoke() {
        let cell = WriteOnce::default();
        assert_eq!(cell.get(), None);
        assert_eq!(cell.set(10), Ok(()));
        assert_eq!(cell.set(12), Err(12));
        assert_eq!(cell.get(), Some(&10));
    }

    #[test]
    fn smoke_shared() {
        let cell = Rc::new(WriteOnce::default());
        let rc1 = cell.clone();
        let rc2 = cell.clone();

        assert_eq!(rc2.get(), None);
        rc1.set(12).unwrap();
        assert_eq!(rc2.get(), Some(&12));
        assert_eq!(rc2.set(10), Err(10));
    }
}
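
// NOTE: Since Rust 1.70, `std::cell::OnceCell` in the standard library
// provides the same write-once semantics as the cell above without any
// `unsafe`. A small sketch of the equivalent usage (the function name here
// is illustrative):

```rust
use std::cell::OnceCell;

// Demonstrates that only the first write succeeds; returns the stored value.
fn first_write_wins() -> i32 {
    let cell: OnceCell<i32> = OnceCell::new();
    assert_eq!(cell.get(), None);
    assert_eq!(cell.set(10), Ok(()));
    // The losing write gets its value handed back, mirroring WriteOnce::set
    assert_eq!(cell.set(12), Err(12));
    *cell.get().unwrap()
}
```

// `OnceCell` is not thread-safe (like this `Rc`-based cell); `std::sync::OnceLock`
// is the `Sync` counterpart.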


================================================
FILE: 05_recon/src/naive.rs
================================================
use super::*;

fn var_bind(var: TypeVar, ty: Type) -> Result<HashMap<TypeVar, Type>, String> {
    if ty.occurs(var) {
        return Err(format!("Fails occurs check! {:?} {:?}", var, ty));
    }
    let mut sub = HashMap::new();
    match ty {
        Type::Var(x) if x == var => {}
        _ => {
            sub.insert(var, ty);
        }
    }
    Ok(sub)
}

pub fn unify(a: Type, b: Type) -> Result<HashMap<TypeVar, Type>, String> {
    // println!("{:?} {:?}", a, b);
    match (a, b) {
        (Type::Con(a, a_args), Type::Con(b, b_args)) => {
            if a_args.len() == b_args.len() && a == b {
                solve(a_args.into_iter().zip(b_args.into_iter()))
            } else {
                Err(format!(
                    "Can't unify types: {:?} {:?}",
                    Type::Con(a, a_args),
                    Type::Con(b, b_args)
                ))
            }
        }
        (Type::Var(tv), b) => var_bind(tv, b),
        (a, Type::Var(tv)) => var_bind(tv, a),
    }
}

pub fn solve<I: Iterator<Item = (Type, Type)>>(iter: I) -> Result<HashMap<TypeVar, Type>, String> {
    let mut sub = HashMap::new();
    for (a, b) in iter {
        let tmp = unify(a.clone().apply(&sub), b.clone().apply(&sub))?;
        sub = compose(tmp, sub);
    }
    Ok(sub)
}
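
// NOTE: `solve` folds `unify` over a constraint set, composing the resulting
// substitutions. Composition is order-sensitive: `compose(s1, s2)` means
// "apply s2 first, then s1". A standalone sketch with a toy `Ty` type
// (illustrative names, not this crate's API):

```rust
use std::collections::HashMap;

#[derive(Debug, Clone, PartialEq)]
enum Ty {
    Var(u32),
    Bool,
    Arrow(Box<Ty>, Box<Ty>),
}

// Apply a substitution to a type, replacing variables it binds
fn apply(t: Ty, s: &HashMap<u32, Ty>) -> Ty {
    match t {
        Ty::Var(v) => s.get(&v).cloned().unwrap_or(Ty::Var(v)),
        Ty::Arrow(a, b) => Ty::Arrow(Box::new(apply(*a, s)), Box::new(apply(*b, s))),
        t => t,
    }
}

// Build "s1 after s2": s1 is applied to every type in s2's range, and any
// binding of s1 whose variable s2 does not mention is carried over
fn compose(s1: HashMap<u32, Ty>, s2: HashMap<u32, Ty>) -> HashMap<u32, Ty> {
    let mut out: HashMap<u32, Ty> = s2.into_iter().map(|(k, v)| (k, apply(v, &s1))).collect();
    for (k, v) in s1 {
        out.entry(k).or_insert(v);
    }
    out
}
```

// With s2 = {0 ↦ Var(1)} and s1 = {1 ↦ Bool}, the composition sends both
// variable 0 and variable 1 to Bool, which is why `solve` stays idempotent.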


================================================
FILE: 05_recon/src/parser.rs
================================================
use super::Term;
use std::char;
use std::collections::VecDeque;
use std::iter::Peekable;
use std::str::Chars;
use util::span::{Location, Span};

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum TokenKind {
    Ident(String),
    Int(u32),
    Unit,
    Lambda,
    Let,
    Equals,
    In,
    Dot,
    If,
    Then,
    Else,
    True,
    False,
    LParen,
    RParen,
    Invalid(char),
    Eof,
}

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub struct Token {
    pub kind: TokenKind,
    pub span: Span,
}

impl Token {
    pub const fn new(kind: TokenKind, span: Span) -> Token {
        Token { kind, span }
    }
}

#[derive(Clone)]
pub struct Lexer<'s> {
    input: Peekable<Chars<'s>>,
    current: Location,
}

impl<'s> Lexer<'s> {
    pub fn new(input: Chars<'s>) -> Lexer<'s> {
        Lexer {
            input: input.peekable(),
            current: Location {
                line: 0,
                col: 0,
                abs: 0,
            },
        }
    }

    /// Peek at the next [`char`] in the input stream
    fn peek(&mut self) -> Option<char> {
        self.input.peek().cloned()
    }

    /// Consume the next [`char`] and advance internal source position
    fn consume(&mut self) -> Option<char> {
        match self.input.next() {
            Some('\n') => {
                self.current.line += 1;
                self.current.col = 0;
                self.current.abs += 1;
                Some('\n')
            }
            Some(ch) => {
                self.current.col += 1;
                self.current.abs += 1;
                Some(ch)
            }
            None => None,
        }
    }

    /// Consume characters from the input stream while pred(peek()) is true,
    /// collecting the characters into a string.
    fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> (String, Span) {
        let mut s = String::new();
        let start = self.current;
        while let Some(n) = self.peek() {
            if pred(n) {
                match self.consume() {
                    Some(ch) => s.push(ch),
                    None => break,
                }
            } else {
                break;
            }
        }
        (s, Span::new(start, self.current))
    }

    /// Eat whitespace
    fn consume_delimiter(&mut self) {
        let _ = self.consume_while(char::is_whitespace);
    }

    /// Lex a natural number
    fn number(&mut self) -> Token {
        // Since we peeked at least one numeric char, the string contains
        // at least one digit. Note that parse::<u32>() still fails on
        // literals larger than u32::MAX, so this unwrap can panic there
        let (data, span) = self.consume_while(char::is_numeric);
        let n = data.parse::<u32>().unwrap();
        Token::new(TokenKind::Int(n), span)
    }

    /// Lex a reserved keyword or an identifier
    fn keyword(&mut self) -> Token {
        let (data, span) = self.consume_while(|ch: char| ch.is_ascii_alphanumeric());
        let kind = match data.as_ref() {
            "unit" => TokenKind::Unit,
            "let" => TokenKind::Let,
            "in" => TokenKind::In,
            "fn" => TokenKind::Lambda,
            "if" => TokenKind::If,
            "then" => TokenKind::Then,
            "else" => TokenKind::Else,
            "true" => TokenKind::True,
            "false" => TokenKind::False,
            _ => TokenKind::Ident(data),
        };
        Token::new(kind, span)
    }

    /// Consume the next input character, expecting to match `ch`.
    /// Return a [`TokenKind::Invalid`] if the next character does not match,
    /// or the argument `kind` if it does
    fn eat(&mut self, ch: char, kind: TokenKind) -> Token {
        let loc = self.current;
        // Lexer::eat() should only be called internally after calling peek()
        // so we know that it's safe to unwrap the result of Lexer::consume()
        let n = self.consume().unwrap();
        let kind = if n == ch { kind } else { TokenKind::Invalid(n) };
        Token::new(kind, Span::new(loc, self.current))
    }

    /// Return the next lexeme in the input as a [`Token`]
    pub fn lex(&mut self) -> Token {
        self.consume_delimiter();
        let next = match self.peek() {
            Some(ch) => ch,
            None => return Token::new(TokenKind::Eof, Span::dummy()),
        };
        match next {
            x if x.is_ascii_alphabetic() => self.keyword(),
            x if x.is_numeric() => self.number(),
            '(' => self.eat('(', TokenKind::LParen),
            ')' => self.eat(')', TokenKind::RParen),
            '\\' => self.eat('\\', TokenKind::Lambda),
            'λ' => self.eat('λ', TokenKind::Lambda),
            '.' => self.eat('.', TokenKind::Dot),
            '=' => self.eat('=', TokenKind::Equals),
            // Consume the invalid character so the lexer always makes progress
            ch => self.eat(ch, TokenKind::Invalid(ch)),
        }
    }
}

impl<'s> Iterator for Lexer<'s> {
    type Item = Token;
    fn next(&mut self) -> Option<Self::Item> {
        match self.lex() {
            Token {
                kind: TokenKind::Eof, ..
            } => None,
            tok => Some(tok),
        }
    }
}

#[derive(Clone, Debug, Default)]
pub struct DeBruijnIndexer {
    inner: VecDeque<String>,
}

impl DeBruijnIndexer {
    pub fn push(&mut self, hint: String) -> usize {
        if self.inner.contains(&hint) {
            self.push(format!("{}'", hint))
        } else {
            let idx = self.inner.len();
            self.inner.push_front(hint);
            idx
        }
    }

    pub fn pop(&mut self) {
        self.inner.pop_front();
    }

    pub fn lookup(&self, key: &str) -> Option<usize> {
        for (idx, s) in self.inner.iter().enumerate() {
            if key == s {
                return Some(idx);
            }
        }
        None
    }
}

pub struct Parser<'s> {
    ctx: DeBruijnIndexer,
    /// [`Lexer`] impls [`Iterator`] over [`Token`],
    /// so we can just directly wrap it in a [`Peekable`]
    lexer: Peekable<Lexer<'s>>,
    span: Span,
}

impl<'s> Parser<'s> {
    /// Create a new [`Parser`] for the input `&str`
    pub fn new(input: &'s str) -> Parser<'s> {
        Parser {
            ctx: DeBruijnIndexer::default(),
            lexer: Lexer::new(input.chars()).peekable(),
            span: Span::dummy(),
        }
    }

    fn consume(&mut self) -> Option<Token> {
        let ts = self.lexer.next()?;
        self.span = ts.span;
        Some(ts)
    }

    fn expect(&mut self, kind: TokenKind) -> Option<Token> {
        let tk = self.consume()?;
        match &tk.kind {
            t if t == &kind => Some(tk),
            _ => {
                eprintln!("Expected token {:?}, found {:?}", kind, tk.kind);
                None
            }
        }
    }

    fn expect_term(&mut self) -> Option<Box<Term>> {
        match self.term() {
            Some(term) => Some(term),
            None => {
                let sp = self.peek_span();
                eprintln!("Expected term at {:?}", sp);
                None
            }
        }
    }

    fn peek(&mut self) -> Option<TokenKind> {
        self.lexer.peek().map(|tk| tk.kind.clone())
    }

    fn peek_span(&mut self) -> Span {
        self.lexer.peek().map(|s| s.span).unwrap_or(self.span)
    }

    fn lambda(&mut self) -> Option<Box<Term>> {
        self.expect(TokenKind::Lambda)?;

        // Bind variable into a new context before parsing the body
        let var = self.ident()?;
        self.ctx.push(var);
        let _ = self.expect(TokenKind::Dot)?;
        let body = self.term()?;

        // Return to previous context
        self.ctx.pop();
        Some(Term::Abs(body).into())
    }

    fn let_expr(&mut self) -> Option<Box<Term>> {
        self.expect(TokenKind::Let)?;
        let var = self.ident()?;

        let _ = self.expect(TokenKind::Equals)?;
        let bind = self.expect_term()?;
        self.ctx.push(var);
        let _ = self.expect(TokenKind::In)?;
        let body = self.expect_term()?;
        self.ctx.pop();
        Some(Term::Let(bind, body).into())
    }

    /// Parse an application of form:
    /// application = atom application' | atom
    /// application' = atom application' | empty
    fn application(&mut self) -> Option<Box<Term>> {
        let mut lhs = self.atom()?;
        while let Some(rhs) = self.atom() {
            lhs = Term::App(lhs, rhs).into();
        }
        Some(lhs)
    }

    fn ident(&mut self) -> Option<String> {
        let Token { kind, .. } = self.consume()?;
        match kind {
            TokenKind::Ident(s) => Some(s),
            _ => {
                eprintln!("Expected identifier, found {:?}", kind);
                None
            }
        }
    }

    fn if_expr(&mut self) -> Option<Box<Term>> {
        let _ = self.expect(TokenKind::If)?;
        let guard = self.expect_term()?;
        let _ = self.expect(TokenKind::Then)?;
        let csq = self.expect_term()?;
        let _ = self.expect(TokenKind::Else)?;
        let alt = self.expect_term()?;
        Some(Term::If(guard, csq, alt).into())
    }

    /// Parse an atomic term
    /// LPAREN term RPAREN | var
    fn atom(&mut self) -> Option<Box<Term>> {
        match self.peek()? {
            TokenKind::Let => self.let_expr(),
            TokenKind::Int(i) => {
                self.consume()?;
                Some(Term::Int(i as usize).into())
            }
            TokenKind::True => {
                self.consume();
                Some(Term::Bool(true).into())
            }
            TokenKind::False => {
                self.consume();
                Some(Term::Bool(false).into())
            }
            TokenKind::LParen => {
                self.expect(TokenKind::LParen)?;
                let term = self.term()?;
                self.expect(TokenKind::RParen)?;
                Some(term)
            }
            TokenKind::Unit => {
                self.expect(TokenKind::Unit)?;
                Some(Term::Unit.into())
            }
            TokenKind::If => self.if_expr(),
            TokenKind::Lambda => self.lambda(),
            TokenKind::Ident(s) => {
                self.consume()?;
                match self.ctx.lookup(&s) {
                    Some(idx) => Some(Term::Var(idx, s).into()),
                    None => {
                        eprintln!("Unbound variable {}", s);
                        None
                    }
                }
            }
            _ => None,
        }
    }

    fn term(&mut self) -> Option<Box<Term>> {
        // Lambda, let, and if expressions are all handled in atom(),
        // so a term is just an application chain
        self.application()
    }

    pub fn parse_term(&mut self) -> Option<Box<Term>> {
        self.term()
    }
}
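
// NOTE: `DeBruijnIndexer` resolves a name to its De Bruijn index, where 0
// denotes the innermost enclosing binder. A standalone sketch of that lookup
// discipline (illustrative names, not the parser's API):

```rust
use std::collections::VecDeque;

// The innermost binder lives at the front of the deque, so its index is 0
#[derive(Default)]
struct Indexer {
    scopes: VecDeque<String>,
}

impl Indexer {
    fn bind(&mut self, name: &str) {
        self.scopes.push_front(name.to_string());
    }
    fn unbind(&mut self) {
        self.scopes.pop_front();
    }
    // A variable's index is its distance from the innermost binder
    fn lookup(&self, name: &str) -> Option<usize> {
        self.scopes.iter().position(|s| s == name)
    }
}
```

// Parsing the body of `fn x. fn y. ...` with this scheme gives `y` index 0
// and `x` index 1; popping `y`'s scope shifts `x` back to index 0.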


================================================
FILE: 05_recon/src/types.rs
================================================
use std::collections::{HashMap, HashSet, VecDeque};

#[derive(Copy, Clone, Default, PartialEq, PartialOrd, Eq, Hash)]
pub struct TypeVar(pub u32, pub u32);

#[derive(Copy, Clone, PartialEq, PartialOrd, Eq, Hash)]
pub struct Tycon {
    id: usize,
    arity: usize,
}

#[derive(Clone, PartialEq, PartialOrd, Eq, Hash)]
pub enum Type {
    Var(TypeVar),
    Con(Tycon, Vec<Type>),
}

#[derive(Debug, Clone)]
pub enum Scheme {
    Mono(Type),
    Poly(Vec<TypeVar>, Type),
}

pub trait Substitution {
    fn ftv(&self) -> HashSet<TypeVar>;
    fn apply(self, s: &HashMap<TypeVar, Type>) -> Self;
}

impl Substitution for Type {
    fn ftv(&self) -> HashSet<TypeVar> {
        let mut set = HashSet::new();
        let mut queue = VecDeque::new();
        queue.push_back(self);

        while let Some(ty) = queue.pop_front() {
            match ty {
                Type::Var(x) => {
                    set.insert(*x);
                }
                Type::Con(_, tys) => {
                    for ty in tys {
                        queue.push_back(ty);
                    }
                }
            }
        }
        set
    }

    fn apply(self, map: &HashMap<TypeVar, Type>) -> Type {
        match self {
            Type::Var(x) => map.get(&x).cloned().unwrap_or(Type::Var(x)),
            Type::Con(tc, vars) => Type::Con(tc, vars.into_iter().map(|ty| ty.apply(map)).collect()),
        }
    }
}

impl Type {
    pub fn arrow(a: Type, b: Type) -> Type {
        Type::Con(T_ARROW, vec![a, b])
    }

    pub fn bool() -> Type {
        Type::Con(T_BOOL, vec![])
    }

    pub fn occurs(&self, exist: TypeVar) -> bool {
        match self {
            Type::Var(x) => *x == exist,
            Type::Con(_, tys) => tys.iter().any(|ty| ty.occurs(exist)),
        }
    }

    pub fn de_arrow(&self) -> (&Type, &Type) {
        match self {
            Type::Con(T_ARROW, v) => (&v[0], &v[1]),
            _ => panic!("Not arrow type! {:?}", self),
        }
    }
}

pub fn compose(s1: HashMap<TypeVar, Type>, s2: HashMap<TypeVar, Type>) -> HashMap<TypeVar, Type> {
    let mut s2 = s2
        .into_iter()
        .map(|(k, v)| (k, v.apply(&s1)))
        .collect::<HashMap<TypeVar, Type>>();
    for (k, v) in s1 {
        if !s2.contains_key(&k) {
            s2.insert(k, v);
        }
    }
    s2
}

impl Substitution for Scheme {
    fn ftv(&self) -> HashSet<TypeVar> {
        match self {
            Scheme::Mono(ty) => ty.ftv(),
            Scheme::Poly(vars, ty) => {
                // The quantified variables are bound by the scheme,
                // so they are not free
                let mut set = ty.ftv();
                for v in vars {
                    set.remove(v);
                }
                set
            }
        }
    }

    fn apply(self, map: &HashMap<TypeVar, Type>) -> Scheme {
        match self {
            Scheme::Mono(ty) => Scheme::Mono(ty.apply(map)),
            Scheme::Poly(vars, ty) => {
                let mut map: HashMap<TypeVar, Type> = map.clone();
                for v in &vars {
                    map.remove(v);
                }
                Scheme::Poly(vars, ty.apply(&map))
            }
        }
    }
}

pub const T_ARROW: Tycon = Tycon { id: 0, arity: 2 };
pub const T_INT: Tycon = Tycon { id: 1, arity: 0 };
pub const T_UNIT: Tycon = Tycon { id: 2, arity: 0 };
pub const T_BOOL: Tycon = Tycon { id: 3, arity: 0 };

impl std::fmt::Debug for Tycon {
    fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
        match self.id {
            0 => write!(f, "->"),
            1 => write!(f, "int"),
            2 => write!(f, "unit"),
            3 => write!(f, "bool"),
            _ => write!(f, "??"),
        }
    }
}

fn fresh_name(x: u32) -> String {
    // 0 -> "a", 25 -> "z", 26 -> "za", 52 -> "zza", ...
    let last = ((x % 26) as u8 + b'a') as char;
    (0..x / 26)
        .map(|_| 'z')
        .chain(std::iter::once(last))
        .collect::<String>()
}

impl std::fmt::Debug for TypeVar {
    fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
        f.write_str(&fresh_name(self.0))
    }
}

impl std::fmt::Debug for Type {
    fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
        match self {
            Type::Var(x) => write!(f, "{:?}", x),
            Type::Con(T_ARROW, tys) => write!(f, "({:?} -> {:?})", tys[0], tys[1]),
            Type::Con(tc, _) => write!(f, "{:?}", tc),
        }
    }
}
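
// NOTE: `Scheme::Poly` quantifies type variables at a `let`. In the classic
// formulation, the generalizable variables are exactly those free in the
// binding's type but not free in the environment; the rank mechanism in the
// mutation module approximates this set difference. A standalone sketch
// (illustrative, not this crate's API):

```rust
use std::collections::HashSet;

// Quantify the variables free in the type that the environment does not mention
fn generalizable(ty_ftv: &HashSet<u32>, env_ftv: &HashSet<u32>) -> Vec<u32> {
    let mut vars: Vec<u32> = ty_ftv.difference(env_ftv).copied().collect();
    vars.sort_unstable();
    vars
}
```

// E.g. for `let f = fn x. x in ...` the arrow type's variable is absent from
// the environment, so it is quantified and each use of `f` gets a fresh copy.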


================================================
FILE: 06_system_f/Cargo.toml
================================================
[package]
name = "system_f"
version = "0.1.0"
authors = ["Michael Lazear <lazear@scripps.edu>"]
edition = "2018"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
util = { path = "../util" }

================================================
FILE: 06_system_f/README.md
================================================
# System F

An extension of the simply typed lambda calculus with parametric polymorphism

================================================
FILE: 06_system_f/src/diagnostics.rs
================================================
use util::span::Span;
#[derive(Debug, Copy, Clone)]
pub enum Level {
    Warn,
    Error,
}

#[derive(Debug, Clone)]
pub struct Annotation {
    pub span: Span,
    pub info: String,
}

#[derive(Debug, Clone)]
pub struct Diagnostic {
    pub level: Level,
    pub primary: Annotation,
    pub info: Vec<String>,
    pub other: Vec<Annotation>,
}

impl Annotation {
    pub fn new<S: Into<String>>(span: Span, message: S) -> Annotation {
        Annotation {
            span,
            info: message.into(),
        }
    }
}

impl Diagnostic {
    pub fn error<S: Into<String>>(span: Span, message: S) -> Diagnostic {
        Diagnostic {
            level: Level::Error,
            primary: Annotation::new(span, message),
            other: Vec::new(),
            info: Vec::new(),
        }
    }

    pub fn warn<S: Into<String>>(span: Span, message: S) -> Diagnostic {
        Diagnostic {
            level: Level::Warn,
            primary: Annotation::new(span, message),
            other: Vec::new(),
            info: Vec::new(),
        }
    }

    pub fn message<S: Into<String>>(mut self, span: Span, message: S) -> Diagnostic {
        self.other.push(Annotation::new(span, message));
        self
    }

    pub fn info<S: Into<String>>(mut self, info: S) -> Diagnostic {
        self.info.push(info.into());
        self
    }

    pub fn lines(&self) -> std::ops::Range<u32> {
        let mut range = std::ops::Range {
            start: self.primary.span.start.line,
            end: self.primary.span.end.line + 1,
        };

        for addl in &self.other {
            if addl.span.start.line < range.start {
                range.start = addl.span.start.line;
            }
            if addl.span.end.line + 1 > range.end {
                range.end = addl.span.end.line + 1;
            }
        }
        range
    }
}


================================================
FILE: 06_system_f/src/eval.rs
================================================
use crate::patterns::Pattern;
use crate::terms::visit::{Shift, Subst, TyTermSubst};
use crate::terms::{Kind, Literal, Primitive, Term};
use crate::types::{Context, Type};
use crate::visit::MutTermVisitor;

pub struct Eval<'ctx> {
    _context: &'ctx Context,
}

impl<'ctx> Eval<'ctx> {
    pub fn with_context(_context: &Context) -> Eval<'_> {
        Eval { _context }
    }

    fn normal_form(&self, term: &Term) -> bool {
        match &term.kind {
            Kind::Lit(_) => true,
            Kind::Abs(_, _) => true,
            Kind::TyAbs(_) => true,
            Kind::Primitive(_) => true,
            Kind::Injection(_, tm, _) => self.normal_form(tm),
            Kind::Product(fields) => fields.iter().all(|f| self.normal_form(f)),
            Kind::Fold(_, tm) => self.normal_form(tm),
            Kind::Pack(_, tm, _) => self.normal_form(tm),
            // Kind::Unpack(pack, tm) => self.normal_form(tm),
            _ => false,
        }
    }

    fn eval_primitive(&self, p: Primitive, term: Term) -> Option<Term> {
        fn map<F: Fn(u32) -> u32>(f: F, mut term: Term) -> Option<Term> {
            match &term.kind {
                Kind::Lit(Literal::Nat(n)) => {
                    term.kind = Kind::Lit(Literal::Nat(f(*n)));
                    Some(term)
                }
                _ => None,
            }
        }

        match p {
            Primitive::Succ => map(|l| l + 1, term),
            Primitive::Pred => map(|l| l.saturating_sub(1), term),
            Primitive::IsZero => match &term.kind {
                Kind::Lit(Literal::Nat(0)) => Some(Term::new(Kind::Lit(Literal::Bool(true)), term.span)),
                _ => Some(Term::new(Kind::Lit(Literal::Bool(false)), term.span)),
            },
        }
    }

    pub fn small_step(&self, term: Term) -> Option<Term> {
        if self.normal_form(&term) {
            return None;
        }
        match term.kind {
            Kind::App(t1, t2) => {
                if self.normal_form(&t2) {
                    match t1.kind {
                        Kind::Abs(_, mut abs) => {
                            term_subst(*t2, abs.as_mut());
                            Some(*abs)
                        }
                        Kind::Primitive(p) => self.eval_primitive(p, *t2),
                        _ => {
                            let t = self.small_step(*t1)?;
                            Some(Term::new(Kind::App(Box::new(t), t2), term.span))
                        }
                    }
                } else if self.normal_form(&t1) {
                    // t1 is in normal form, but t2 is not, so we will
                    // carry out the reduction t2 -> t2', and return
                    // App(t1, t2')
                    let t = self.small_step(*t2)?;
                    Some(Term::new(Kind::App(t1, Box::new(t)), term.span))
                } else {
                    // Neither t1 nor t2 are in normal form, we reduce t1 first
                    let t = self.small_step(*t1)?;
                    Some(Term::new(Kind::App(Box::new(t), t2), term.span))
                }
            }
            Kind::Let(pat, bind, mut body) => {
                if self.normal_form(&bind) {
                    self.case_subst(&pat, &bind, body.as_mut());
                    Some(*body)
                } else {
                    let t = self.small_step(*bind)?;
                    Some(Term::new(Kind::Let(pat, Box::new(t), body), term.span))
                }
            }
            Kind::TyApp(tm, ty) => match tm.kind {
                Kind::TyAbs(mut tm2) => {
                    type_subst(*ty, &mut tm2);
                    Some(*tm2)
                }
                _ => {
                    let t_prime = self.small_step(*tm)?;
                    Some(Term::new(Kind::TyApp(Box::new(t_prime), ty), term.span))
                }
            },
            Kind::Injection(label, tm, ty) => {
                let t_prime = self.small_step(*tm)?;
                Some(Term::new(Kind::Injection(label, Box::new(t_prime), ty), term.span))
            }
            Kind::Projection(tm, idx) => {
                if self.normal_form(&tm) {
                    match tm.kind {
                        // Typechecker ensures that idx is in bounds
                        Kind::Product(terms) => terms.get(idx).cloned(),
                        _ => None,
                    }
                } else {
                    let t_prime = self.small_step(*tm)?;
                    Some(Term::new(Kind::Projection(Box::new(t_prime), idx), term.span))
                }
            }
            Kind::Product(terms) => {
                let mut v = Vec::with_capacity(terms.len());
                for term in terms {
                    if self.normal_form(&term) {
                        v.push(term);
                    } else {
                        v.push(self.small_step(term)?);
                    }
                }
                Some(Term::new(Kind::Product(v), term.span))
            }
            Kind::Fix(tm) => {
                if !self.normal_form(&tm) {
                    let t_prime = self.small_step(*tm)?;
                    return Some(Term::new(Kind::Fix(Box::new(t_prime)), term.span));
                }

                let x = Term::new(Kind::Fix(tm.clone()), term.span);
                match tm.kind {
                    Kind::Abs(_, mut body) => {
                        term_subst(x, &mut body);
                        Some(*body)
                    }
                    _ => None,
                }
            }
            Kind::Case(expr, arms) => {
                if !self.normal_form(&expr) {
                    let t_prime = self.small_step(*expr)?;
                    return Some(Term::new(Kind::Case(Box::new(t_prime), arms), term.span));
                }

                for mut arm in arms {
                    if arm.pat.matches(&expr) {
                        self.case_subst(&arm.pat, &expr, arm.term.as_mut());
                        return Some(*arm.term);
                    }
                }

                None
            }
            Kind::Fold(ty, tm) => {
                if !self.normal_form(&tm) {
                    let t_prime = self.small_step(*tm)?;
                    Some(Term::new(Kind::Fold(ty, Box::new(t_prime)), term.span))
                } else {
                    None
                }
            }

            Kind::Unfold(ty, tm) => {
                if !self.normal_form(&tm) {
                    let t_prime = self.small_step(*tm)?;
                    return Some(Term::new(Kind::Unfold(ty, Box::new(t_prime)), term.span));
                }

                match tm.kind {
                    Kind::Fold(_, inner) => Some(*inner),
                    _ => None,
                }
            }
            Kind::Pack(wit, evidence, sig) => {
                if !self.normal_form(&evidence) {
                    let t_prime = self.small_step(*evidence)?;
                    return Some(Term::new(Kind::Pack(wit, Box::new(t_prime), sig), term.span));
                }
                None
            }
            Kind::Unpack(package, mut body) => match package.kind {
                Kind::Pack(wit, evidence, _) => {
                    term_subst(*evidence, &mut body);
                    type_subst(*wit, &mut body);
                    Some(*body)
                }
                _ => {
                    if !self.normal_form(&package) {
                        let t_prime = self.small_step(*package)?;
                        return Some(Term::new(Kind::Unpack(Box::new(t_prime), body), term.span));
                    }
                    None
                }
            },

            _ => None,
        }
    }

    fn case_subst(&self, pat: &Pattern, expr: &Term, term: &mut Term) {
        use Pattern::*;
        match pat {
            Any => {}
            Literal(_) => {}
            Variable(_) => {
                term_subst(expr.clone(), term);
            }
            Product(v) => {
                if let Kind::Product(terms) = &expr.kind {
                    for (p, t) in v.iter().zip(terms.iter()) {
                        self.case_subst(p, t, term);
                    }
                } else {
                    panic!("wrong type!")
                }
            }
            Constructor(label, v) => {
                if let Kind::Injection(label_, tm, _) = &expr.kind {
                    if label == label_ {
                        self.case_subst(&v, &tm, term);
                    }
                } else {
                    panic!("wrong type!")
                }
            }
        }
    }
}

fn term_subst(mut s: Term, t: &mut Term) {
    Shift::new(1).visit(&mut s);
    Subst::new(s).visit(t);
    Shift::new(-1).visit(t);
}

fn type_subst(s: Type, t: &mut Term) {
    TyTermSubst::new(s).visit(t);
    Shift::new(-1).visit(t);
}
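
The shift/substitute/unshift dance in `term_subst` above can be illustrated on a minimal untyped de Bruijn term type. This is a standalone sketch, not the crate's API: `Tm`, `shift`, `subst`, and `beta` are hypothetical stand-ins for `Term` and its `Shift`/`Subst` visitors.

```rust
#[derive(Clone, Debug, PartialEq)]
enum Tm {
    Var(usize),
    Abs(Box<Tm>),
    App(Box<Tm>, Box<Tm>),
}

// Add `d` to every free variable whose index is at or above cutoff `c`.
fn shift(t: &Tm, d: isize, c: usize) -> Tm {
    match t {
        Tm::Var(x) => Tm::Var(if *x >= c { (*x as isize + d) as usize } else { *x }),
        Tm::Abs(b) => Tm::Abs(Box::new(shift(b, d, c + 1))),
        Tm::App(f, a) => Tm::App(Box::new(shift(f, d, c)), Box::new(shift(a, d, c))),
    }
}

// Replace variable `j` with `s`, bumping `j` and shifting `s` under binders.
fn subst(t: &Tm, j: usize, s: &Tm) -> Tm {
    match t {
        Tm::Var(x) if *x == j => s.clone(),
        Tm::Var(x) => Tm::Var(*x),
        Tm::Abs(b) => Tm::Abs(Box::new(subst(b, j + 1, &shift(s, 1, 0)))),
        Tm::App(f, a) => Tm::App(Box::new(subst(f, j, s)), Box::new(subst(a, j, s))),
    }
}

// Beta step: (λ. body) arg → shift(-1) of body[0 := shift(+1, arg)],
// the same shift/subst/unshift sequence as `term_subst` above.
fn beta(body: &Tm, arg: &Tm) -> Tm {
    shift(&subst(body, 0, &shift(arg, 1, 0)), -1, 0)
}

fn main() {
    // (λx. x) z → z, where z is the free variable Var(0) outside the binder
    assert_eq!(beta(&Tm::Var(0), &Tm::Var(0)), Tm::Var(0));
    // (λx. λy. x) z → λy. z, i.e. Abs(Var(1)) since z sits under one binder
    let konst_body = Tm::Abs(Box::new(Tm::Var(1)));
    assert_eq!(beta(&konst_body, &Tm::Var(0)), Tm::Abs(Box::new(Tm::Var(1))));
    println!("ok");
}
```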

#[cfg(test)]
mod test {
    use super::*;
    use util::span::Span;

    #[test]
    fn literal() {
        let ctx = crate::types::Context::default();
        let eval = Eval::with_context(&ctx);
        assert_eq!(eval.small_step(lit!(false)), None);
    }

    #[test]
    fn application() {
        let ctx = crate::types::Context::default();
        let eval = Eval::with_context(&ctx);
        let tm = app!(abs!(Type::Nat, app!(prim!(Primitive::Succ), var!(0))), nat!(1));

        let t1 = eval.small_step(tm);
        assert_eq!(t1, Some(app!(prim!(Primitive::Succ), nat!(1))));
        let t2 = eval.small_step(t1.unwrap());
        assert_eq!(t2, Some(nat!(2)));
        let t3 = eval.small_step(t2.unwrap());
        assert_eq!(t3, None);
    }

    #[test]
    fn type_application() {
        let ctx = crate::types::Context::default();
        let eval = Eval::with_context(&ctx);
        let tm = tyapp!(
            tyabs!(abs!(Type::Var(0), app!(prim!(Primitive::Succ), var!(0)))),
            Type::Nat
        );

        let t1 = eval.small_step(tm);
        assert_eq!(t1, Some(abs!(Type::Nat, app!(prim!(Primitive::Succ), var!(0)))));
        let t2 = eval.small_step(t1.unwrap());
        assert_eq!(t2, None);
    }

    #[test]
    fn projection() {
        let ctx = crate::types::Context::default();
        let eval = Eval::with_context(&ctx);
        let product = Term::new(Kind::Product(vec![nat!(5), nat!(6), nat!(29)]), Span::zero());
        let projection = Term::new(Kind::Projection(Box::new(product), 2), Span::zero());
        let term = app!(prim!(Primitive::Succ), projection);

        let t1 = eval.small_step(term);
        assert_eq!(t1, Some(app!(prim!(Primitive::Succ), nat!(29))));
        let t2 = eval.small_step(t1.unwrap());
        assert_eq!(t2, Some(nat!(30)));
        let t3 = eval.small_step(t2.unwrap());
        assert_eq!(t3, None);
    }
}
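
The driver pattern used with `Eval::small_step` (iterate until the term is a normal form and `small_step` returns `None`) can be sketched standalone on a tiny arithmetic language. The names here (`T`, `step`, `eval`, `is_nv`) are illustrative, not the crate's:

```rust
#[derive(Clone, Debug, PartialEq)]
enum T { Zero, Succ(Box<T>), Pred(Box<T>), IsZero(Box<T>), True, False }

// Numeric values: zero, or successor of a numeric value.
fn is_nv(t: &T) -> bool {
    match t { T::Zero => true, T::Succ(inner) => is_nv(inner), _ => false }
}

// One small-step reduction; None means the term is already in normal form.
fn step(t: &T) -> Option<T> {
    match t {
        T::Pred(inner) => match &**inner {
            T::Zero => Some(T::Zero),
            T::Succ(n) if is_nv(n) => Some((**n).clone()),
            _ => Some(T::Pred(Box::new(step(inner)?))),
        },
        T::IsZero(inner) => match &**inner {
            T::Zero => Some(T::True),
            T::Succ(n) if is_nv(n) => Some(T::False),
            _ => Some(T::IsZero(Box::new(step(inner)?))),
        },
        T::Succ(inner) if !is_nv(inner) => Some(T::Succ(Box::new(step(inner)?))),
        _ => None,
    }
}

// Multi-step evaluation: apply `step` until a normal form is reached.
fn eval(mut t: T) -> T {
    while let Some(next) = step(&t) { t = next; }
    t
}

fn main() {
    let t = T::IsZero(Box::new(T::Pred(Box::new(T::Succ(Box::new(T::Zero))))));
    assert_eq!(eval(t), T::True);
    println!("ok");
}
```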


================================================
FILE: 06_system_f/src/macros.rs
================================================
//! Macros to make writing tests easier

/// Boolean term
macro_rules! lit {
    ($x:expr) => {
        crate::terms::Term::new(
            crate::terms::Kind::Lit(crate::terms::Literal::Bool($x)),
            util::span::Span::dummy(),
        )
    };
}

/// Integer term
macro_rules! nat {
    ($x:expr) => {
        crate::terms::Term::new(
            crate::terms::Kind::Lit(crate::terms::Literal::Nat($x)),
            util::span::Span::dummy(),
        )
    };
}

/// TmVar term
macro_rules! var {
    ($x:expr) => {
        crate::terms::Term::new(crate::terms::Kind::Var($x), util::span::Span::dummy())
    };
}

/// Application term
macro_rules! app {
    ($t1:expr, $t2:expr) => {
        crate::terms::Term::new(
            crate::terms::Kind::App(Box::new($t1), Box::new($t2)),
            util::span::Span::dummy(),
        )
    };
}

/// Lambda abstraction term
macro_rules! abs {
    ($ty:expr, $t:expr) => {
        crate::terms::Term::new(
            crate::terms::Kind::Abs(Box::new($ty), Box::new($t)),
            util::span::Span::dummy(),
        )
    };
}

/// Type application term
macro_rules! tyapp {
    ($t1:expr, $t2:expr) => {
        crate::terms::Term::new(
            crate::terms::Kind::TyApp(Box::new($t1), Box::new($t2)),
            util::span::Span::dummy(),
        )
    };
}

/// Type abstraction term
macro_rules! tyabs {
    ( $t:expr) => {
        crate::terms::Term::new(crate::terms::Kind::TyAbs(Box::new($t)), util::span::Span::dummy())
    };
}

/// Primitive term
macro_rules! prim {
    ($t:expr) => {
        crate::terms::Term::new(crate::terms::Kind::Primitive($t), util::span::Span::dummy())
    };
}

macro_rules! inj {
    ($label:expr, $t:expr, $ty:expr) => {
        crate::terms::Term::new(
            crate::terms::Kind::Injection($label.to_string(), Box::new($t), Box::new($ty)),
            util::span::Span::dummy(),
        )
    };
}

/// Product term
macro_rules! tuple {
    ($($ex:expr),+) => { crate::terms::Term::new(crate::terms::Kind::Product(vec![$($ex),+]),
    util::span::Span::dummy()) }
}

/// Type arrow
macro_rules! arrow {
    ($ty1:expr, $ty2:expr) => {
        crate::types::Type::Arrow(Box::new($ty1), Box::new($ty2))
    };
}

/// Boolean pattern
macro_rules! boolean {
    ($ex:expr) => {
        crate::patterns::Pattern::Literal(crate::terms::Literal::Bool($ex))
    };
}

/// Numeric pattern
macro_rules! num {
    ($ex:expr) => {
        crate::patterns::Pattern::Literal(crate::terms::Literal::Nat($ex))
    };
}

/// Product pattern
macro_rules! prod {
    ($($ex:expr),+) => { crate::patterns::Pattern::Product(vec![$($ex),+]) }
}

/// Constructor pattern
macro_rules! con {
    ($label:expr, $ex:expr) => {
        crate::patterns::Pattern::Constructor($label.to_string(), Box::new($ex))
    };
}

/// Variant type
macro_rules! variant {
    ($label:expr, $ty:expr) => {
        crate::types::Variant {
            label: $label.to_string(),
            ty: $ty,
        }
    };
}


================================================
FILE: 06_system_f/src/main.rs
================================================
#![allow(unused_variables, unused_macros)]
#[macro_use]
pub mod macros;
pub mod diagnostics;
pub mod eval;
pub mod patterns;
pub mod syntax;
pub mod terms;
pub mod types;
pub mod visit;

use diagnostics::*;
use std::env;
use std::io::{Read, Write};
use syntax::parser::{self, Parser};
use terms::{visit::InjRewriter, Term};
use types::{Type, Variant};
use visit::MutTermVisitor;

fn test_variant() -> Type {
    Type::Variant(vec![
        Variant {
            label: "A".into(),
            ty: Type::Unit,
        },
        Variant {
            label: "B".into(),
            ty: Type::Nat,
        },
        Variant {
            label: "C".into(),
            ty: Type::Nat,
        },
    ])
}

pub fn code_format(src: &str, diag: Diagnostic) {
    let srcl = src.lines().collect::<Vec<&str>>();

    let mut msgs = diag.other.clone();
    msgs.insert(0, diag.primary.clone());

    for line in diag.lines() {
        println!("| {} {}", line + 1, &srcl[line as usize]);
        for anno in &msgs {
            if anno.span.start.line != line {
                continue;
            }
            let empty = (0..anno.span.start.col + 3).map(|_| ' ').collect::<String>();
            let tilde = (1..anno.span.end.col.saturating_sub(anno.span.start.col))
                .map(|_| '~')
                .collect::<String>();
            println!("{}^{}^ --- {}", empty, tilde, anno.info);
        }
    }
}
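
The caret/tilde underline layout printed by `code_format` above can be sketched in isolation. This is a minimal standalone rendering helper (`underline` is a hypothetical name, and the column arithmetic is simplified relative to the function above):

```rust
// Render a source line plus a `^~~^` underline covering columns
// [start, end), followed by the annotation message.
fn underline(line: &str, start: usize, end: usize, msg: &str) -> String {
    let pad: String = (0..start).map(|_| ' ').collect();
    // One '~' per interior column; the '^' markers cap both ends.
    let tilde: String = (1..end.saturating_sub(start)).map(|_| '~').collect();
    format!("| {}\n  {}^{}^ --- {}", line, pad, tilde, msg)
}

fn main() {
    let out = underline("let x = succ true;", 13, 17, "expected Nat");
    println!("{}", out);
    assert!(out.contains("^~~~^ --- expected Nat"));
}
```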

fn eval(ctx: &mut types::Context, mut term: Term, verbose: bool) -> Result<Term, Diagnostic> {
    ctx.de_alias(&mut term);
    InjRewriter.visit(&mut term);
    let ty = ctx.type_check(&term)?;
    println!("  -: {:?}", ty);

    let ev = eval::Eval::with_context(ctx);
    let mut t = term;
    let fin = loop {
        if let Some(res) = ev.small_step(t.clone()) {
            t = res;
        } else {
            break t;
        }
        if verbose {
            println!("---> {}", t);
        }
    };
    println!("===> {}", fin);
    let fty = ctx.type_check(&fin)?;
    if fty != ty {
        panic!(
            "Type of term after evaluation is different from before!\n before: {:?}\n after:  {:?}",
            ty, fty
        );
    }
    Ok(fin)
}

fn parse_and_eval(ctx: &mut types::Context, input: &str, verbose: bool) -> bool {
    let mut p = Parser::new(input);
    loop {
        let term = match p.parse() {
            Ok(term) => term,
            Err(parser::Error {
                kind: parser::ErrorKind::Eof,
                ..
            }) => break,
            Err(e) => {
                dbg!(e);
                break;
            }
        };
        if let Err(diag) = eval(ctx, term, verbose) {
            code_format(input, diag);
            return false;
        }
    }
    let diag = p.diagnostic();
    if diag.error_count() > 0 {
        println!("Parsing {}", diag.emit());
        false
    } else {
        true
    }
}

fn nat_list() -> Type {
    Type::Rec(Box::new(Type::Variant(vec![
        variant!("Nil", Type::Unit),
        variant!("Cons", Type::Product(vec![Type::Nat, Type::Var(0)])),
    ])))
}

fn nat_list2() -> Type {
    Type::Variant(vec![
        variant!("Nil", Type::Unit),
        variant!("Cons", Type::Product(vec![Type::Nat, Type::Var(0)])),
    ])
}

fn main() {
    let mut ctx = types::Context::default();

    ctx.alias("Var".into(), test_variant());
    ctx.alias("NatList".into(), nat_list());
    ctx.alias("NB".into(), nat_list2());

    let args = env::args();
    if args.len() > 1 {
        for f in args.skip(1) {
            println!("reading {}", f);
            let file = std::fs::read_to_string(&f).unwrap();
            if !parse_and_eval(&mut ctx, &file, false) {
                panic!("test failed! {}", f);
            }
        }
        return;
    }

    loop {
        let mut buffer = String::new();
        print!("repl: ");
        std::io::stdout().flush().unwrap();
        std::io::stdin().read_to_string(&mut buffer).unwrap();

        parse_and_eval(&mut ctx, &buffer, true);
    }
}


================================================
FILE: 06_system_f/src/patterns/mod.rs
================================================
use crate::terms::{Kind, Literal, Term};
use crate::types::{variant_field, Type};
use crate::visit::PatternVisitor;
use util::span::Span;

/// Patterns for case and let expressions
#[derive(Clone, Debug, PartialEq, PartialOrd, Eq, Hash)]
pub enum Pattern {
    /// Wildcard pattern; this always matches
    Any,
    /// Constant pattern
    Literal(Literal),
    /// Variable binding pattern; this always matches
    Variable(String),
    /// Tuple of pattern bindings
    Product(Vec<Pattern>),
    /// Algebraic datatype constructor, along with binding pattern
    Constructor(String, Box<Pattern>),
}

#[derive(Clone, Debug, Default)]
pub struct PatVarStack {
    pub inner: Vec<String>,
}

impl PatVarStack {
    pub fn collect(pat: &mut Pattern) -> Vec<String> {
        let mut p = Self::default();
        p.visit_pattern(pat);
        p.inner
    }
}

impl PatternVisitor for PatVarStack {
    fn visit_variable(&mut self, var: &String) {
        self.inner.push(var.clone());
    }
}

/// Visitor that simply counts the number of binders (variables) within a
/// pattern
pub struct PatternCount(usize);

impl PatternCount {
    pub fn collect(pat: &mut Pattern) -> usize {
        let mut p = PatternCount(0);
        p.visit_pattern(pat);
        p.0
    }
}

impl PatternVisitor for PatternCount {
    fn visit_variable(&mut self, var: &String) {
        self.0 += 1;
    }
}

impl Pattern {
    /// Does this pattern match the given [`Term`]?
    pub fn matches(&self, term: &Term) -> bool {
        match self {
            Pattern::Any => return true,
            Pattern::Variable(_) => return true,
            Pattern::Literal(l) => {
                if let Kind::Lit(l2) = &term.kind {
                    return l == l2;
                }
            }
            Pattern::Product(vec) => {
                if let Kind::Product(terms) = &term.kind {
                    return vec.iter().zip(terms).all(|(p, t)| p.matches(t));
                }
            }
            Pattern::Constructor(label, inner) => {
                if let Kind::Injection(label_, tm, _) = &term.kind {
                    if label == label_ {
                        return inner.matches(&tm);
                    }
                }
            }
        }
        false
    }
}

/// Helper struct to traverse a [`Pattern`] and bind variables
/// to the typing context as needed.
///
/// It is the caller's responsibility to track stack growth and pop off
/// types after calling this function
pub struct PatTyStack<'ty> {
    pub ty: &'ty Type,
    pub inner: Vec<&'ty Type>,
}

impl<'ty> PatTyStack<'ty> {
    pub fn collect(ty: &'ty Type, pat: &Pattern) -> Vec<&'ty Type> {
        let mut p = PatTyStack {
            ty,
            inner: Vec::with_capacity(16),
        };
        p.visit_pattern(pat);
        p.inner
    }
}

impl<'ty> PatternVisitor for PatTyStack<'ty> {
    fn visit_product(&mut self, pats: &Vec<Pattern>) {
        if let Type::Product(tys) = self.ty {
            let ty = self.ty;
            for (ty, pat) in tys.iter().zip(pats.iter()) {
                self.ty = ty;
                self.visit_pattern(pat);
            }
            self.ty = ty;
        }
    }

    fn visit_constructor(&mut self, label: &String, pat: &Pattern) {
        if let Type::Variant(vs) = self.ty {
            let ty = self.ty;
            self.ty = variant_field(&vs, label, Span::zero()).unwrap();
            self.visit_pattern(pat);
            self.ty = ty;
        }
    }

    fn visit_pattern(&mut self, pattern: &Pattern) {
        match pattern {
            Pattern::Any | Pattern::Literal(_) => {}
            Pattern::Variable(_) => self.inner.push(self.ty),
            Pattern::Constructor(label, pat) => self.visit_constructor(label, pat),
            Pattern::Product(pats) => self.visit_product(pats),
        }
    }
}

#[cfg(test)]
mod test {
    use super::*;
    #[test]
    fn pattern_count() {
        let mut pat = Pattern::Variable(String::new());
        assert_eq!(PatternCount::collect(&mut pat), 1);
    }

    #[test]
    fn pattern_ty_stack() {
        let mut pat = Pattern::Variable(String::new());
        let ty = Type::Nat;
        assert_eq!(PatTyStack::collect(&ty, &mut pat), vec![&ty]);
    }

    #[test]
    fn pattern_var_stack() {
        let mut pat = Pattern::Variable("x".into());
        assert_eq!(PatVarStack::collect(&mut pat), vec![String::from("x")]);
    }
}
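
The structural check performed by `Pattern::matches` above can be sketched standalone: wildcards and variables always match, literals compare by equality, and products match field-wise. `Pat`, `Val`, and `matches` are hypothetical stand-ins; note this sketch also checks tuple lengths, which the crate's version leaves to the typechecker.

```rust
#[derive(Debug, PartialEq)]
enum Val { Nat(u32), Tuple(Vec<Val>) }

enum Pat { Any, Var, Lit(u32), Tuple(Vec<Pat>) }

// Does `p` structurally match value `v`?
fn matches(p: &Pat, v: &Val) -> bool {
    match (p, v) {
        (Pat::Any, _) | (Pat::Var, _) => true,
        (Pat::Lit(a), Val::Nat(b)) => a == b,
        (Pat::Tuple(ps), Val::Tuple(vs)) =>
            ps.len() == vs.len() && ps.iter().zip(vs).all(|(p, v)| matches(p, v)),
        _ => false,
    }
}

fn main() {
    let v = Val::Tuple(vec![Val::Nat(1), Val::Nat(2)]);
    assert!(matches(&Pat::Tuple(vec![Pat::Lit(1), Pat::Any]), &v));
    assert!(!matches(&Pat::Tuple(vec![Pat::Lit(9), Pat::Any]), &v));
    println!("ok");
}
```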


================================================
FILE: 06_system_f/src/syntax/lexer.rs
================================================
use super::{Token, TokenKind};
use std::char;
use std::iter::Peekable;
use std::str::Chars;
use util::span::{Location, Span};

#[derive(Clone)]
pub struct Lexer<'s> {
    input: Peekable<Chars<'s>>,
    current: Location,
}

impl<'s> Lexer<'s> {
    pub fn new(input: Chars<'s>) -> Lexer<'s> {
        Lexer {
            input: input.peekable(),
            current: Location {
                line: 0,
                col: 0,
                abs: 0,
            },
        }
    }

    /// Peek at the next [`char`] in the input stream
    fn peek(&mut self) -> Option<char> {
        self.input.peek().cloned()
    }

    /// Consume the next [`char`] and advance internal source position
    fn consume(&mut self) -> Option<char> {
        match self.input.next() {
            Some('\n') => {
                self.current.line += 1;
                self.current.col = 0;
                self.current.abs += 1;
                Some('\n')
            }
            Some(ch) => {
                self.current.col += 1;
                self.current.abs += 1;
                Some(ch)
            }
            None => None,
        }
    }

    /// Consume characters from the input stream while pred(peek()) is true,
    /// collecting the characters into a string.
    fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> (String, Span) {
        let mut s = String::new();
        let start = self.current;
        while let Some(n) = self.peek() {
            if pred(n) {
                match self.consume() {
                    Some(ch) => s.push(ch),
                    None => break,
                }
            } else {
                break;
            }
        }
        (s, Span::new(start, self.current))
    }

    /// Eat whitespace
    fn consume_delimiter(&mut self) {
        let _ = self.consume_while(char::is_whitespace);
    }

    /// Lex a natural number
    fn number(&mut self) -> Token {
        // Since we peeked at least one numeric char, we should always
        // have a string containing at least one digit. Note that
        // str::parse::<u32>() can still fail on overflow, so unwrap()
        // assumes the literal fits in a u32
        let (data, span) = self.consume_while(char::is_numeric);
        let n = data.parse::<u32>().unwrap();
        Token::new(TokenKind::Nat(n), span)
    }

    /// Lex a reserved keyword or an identifier
    fn keyword(&mut self) -> Token {
        let (data, span) = self.consume_while(|ch| ch.is_ascii_alphanumeric());
        let kind = match data.as_ref() {
            "if" => TokenKind::If,
            "then" => TokenKind::Then,
            "else" => TokenKind::Else,
            "true" => TokenKind::True,
            "false" => TokenKind::False,
            "succ" => TokenKind::Succ,
            "pred" => TokenKind::Pred,
            "iszero" => TokenKind::IsZero,
            "zero" => TokenKind::Nat(0),
            "Bool" => TokenKind::TyBool,
            "Nat" => TokenKind::TyNat,
            "Unit" => TokenKind::TyUnit,
            "unit" => TokenKind::Unit,
            "let" => TokenKind::Let,
            "in" => TokenKind::In,
            "fix" => TokenKind::Fix,
            "case" => TokenKind::Case,
            "of" => TokenKind::Of,
            "fold" => TokenKind::Fold,
            "unfold" => TokenKind::Unfold,
            "rec" => TokenKind::Rec,
            "lambda" => TokenKind::Lambda,
            "forall" => TokenKind::Forall,
            "exists" => TokenKind::Exists,
            "pack" => TokenKind::Pack,
            "unpack" => TokenKind::Unpack,
            "as" => TokenKind::As,

            _ => {
                if data.starts_with(|ch: char| ch.is_ascii_uppercase()) {
                    TokenKind::Uppercase(data)
                } else {
                    TokenKind::Lowercase(data)
                }
            }
        };
        Token::new(kind, span)
    }

    /// Consume the next input character, expecting to match `ch`.
    /// Return a [`TokenKind::Invalid`] if the next character does not match,
    /// or the argument `kind` if it does
    fn eat(&mut self, ch: char, kind: TokenKind) -> Token {
        let loc = self.current;
        // Lexer::eat() should only be called internally after calling peek()
        // so we know that it's safe to unwrap the result of Lexer::consume()
        let n = self.consume().unwrap();
        let kind = if n == ch { kind } else { TokenKind::Invalid(n) };
        Token::new(kind, Span::new(loc, self.current))
    }

    /// Return the next lexeme in the input as a [`Token`]
    pub fn lex(&mut self) -> Token {
        self.consume_delimiter();
        let next = match self.peek() {
            Some(ch) => ch,
            None => return Token::new(TokenKind::Eof, Span::new(self.current, self.current)),
        };
        match next {
            x if x.is_ascii_alphabetic() => self.keyword(),
            x if x.is_numeric() => self.number(),
            '(' => self.eat('(', TokenKind::LParen),
            ')' => self.eat(')', TokenKind::RParen),
            ';' => self.eat(';', TokenKind::Semicolon),
            ':' => self.eat(':', TokenKind::Colon),
            ',' => self.eat(',', TokenKind::Comma),
            '{' => self.eat('{', TokenKind::LBrace),
            '}' => self.eat('}', TokenKind::RBrace),
            '[' => self.eat('[', TokenKind::LSquare),
            ']' => self.eat(']', TokenKind::RSquare),
            '\\' => self.eat('\\', TokenKind::Lambda),
            'λ' => self.eat('λ', TokenKind::Lambda),
            '∀' => self.eat('∀', TokenKind::Forall),
            '∃' => self.eat('∃', TokenKind::Exists),
            '.' => self.eat('.', TokenKind::Proj),
            '=' => self.eat('=', TokenKind::Equals),
            '|' => self.eat('|', TokenKind::Bar),
            '_' => self.eat('_', TokenKind::Wildcard),
            '>' => self.eat('>', TokenKind::Gt),
            '-' => {
                self.consume();
                self.eat('>', TokenKind::TyArrow)
            }
            ch => self.eat(' ', TokenKind::Invalid(ch)),
        }
    }
}

impl<'s> Iterator for Lexer<'s> {
    type Item = Token;
    fn next(&mut self) -> Option<Self::Item> {
        match self.lex() {
            Token {
                kind: TokenKind::Eof, ..
            } => None,
            tok => Some(tok),
        }
    }
}

#[cfg(test)]
mod test {
    use super::*;
    use TokenKind::*;
    #[test]
    fn nested() {
        let input = "succ(succ(succ(0)))";
        let expected = vec![Succ, LParen, Succ, LParen, Succ, LParen, Nat(0), RParen, RParen, RParen];
        let output = Lexer::new(input.chars()).map(|t| t.kind).collect::<Vec<TokenKind>>();
        assert_eq!(expected, output);
    }

    #[test]
    fn case() {
        let input = "case x of | A _ => true | B x => (\\y: Nat. x)";
        let expected = vec![
            Case,
            Lowercase("x".into()),
            Of,
            Bar,
            Uppercase("A".into()),
            Wildcard,
            Equals,
            Gt,
            True,
            Bar,
            Uppercase("B".into()),
            Lowercase("x".into()),
            Equals,
            Gt,
            LParen,
            Lambda,
            Lowercase("y".into()),
            Colon,
            TyNat,
            Proj,
            Lowercase("x".into()),
            RParen,
        ];
        let output = Lexer::new(input.chars()).map(|t| t.kind).collect::<Vec<TokenKind>>();
        assert_eq!(expected, output);
    }
}
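
The `consume_while` primitive that drives this lexer can be demonstrated standalone on a `Peekable<Chars>`: take characters while a predicate holds, leaving the rest of the stream untouched. This free-function sketch omits the span bookkeeping that `Lexer::consume_while` performs.

```rust
use std::iter::Peekable;
use std::str::Chars;

// Collect characters from the stream while `pred` holds on the peeked char.
// Peeking first means the non-matching character is left for the next lexeme.
fn consume_while<F: Fn(char) -> bool>(input: &mut Peekable<Chars<'_>>, pred: F) -> String {
    let mut s = String::new();
    while let Some(&c) = input.peek() {
        if pred(c) {
            s.push(c);
            input.next();
        } else {
            break;
        }
    }
    s
}

fn main() {
    let mut it = "succ42)".chars().peekable();
    assert_eq!(consume_while(&mut it, |c| c.is_ascii_alphabetic()), "succ");
    assert_eq!(consume_while(&mut it, |c| c.is_numeric()), "42");
    assert_eq!(it.next(), Some(')'));
    println!("ok");
}
```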


================================================
FILE: 06_system_f/src/syntax/mod.rs
================================================
//! Lexical analysis and recursive descent parser for System F
pub mod lexer;
pub mod parser;
use util::span::Span;

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum TokenKind {
    Uppercase(String),
    Lowercase(String),
    Nat(u32),
    TyNat,
    TyBool,
    TyArrow,
    TyUnit,
    Unit,
    True,
    False,
    Lambda,
    Forall,
    Exists,
    As,
    Pack,
    Unpack,
    Succ,
    Pred,
    If,
    Then,
    Else,
    Let,
    In,
    IsZero,
    Semicolon,
    Colon,
    Comma,
    Proj,
    LParen,
    RParen,
    LBrace,
    RBrace,
    LSquare,
    RSquare,
    Equals,
    Bar,
    Wildcard,
    Gt,
    Case,
    Of,
    Fix,
    Fold,
    Unfold,
    Rec,
    Invalid(char),
    Dummy,
    Eof,
}

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub struct Token {
    pub kind: TokenKind,
    pub span: Span,
}

impl Token {
    pub const fn dummy() -> Token {
        Token {
            kind: TokenKind::Dummy,
            span: Span::zero(),
        }
    }

    pub const fn new(kind: TokenKind, span: Span) -> Token {
        Token { kind, span }
    }
}


================================================
FILE: 06_system_f/src/syntax/parser.rs
================================================
use super::lexer::Lexer;
use super::{Token, TokenKind};

use std::collections::VecDeque;
use util::diagnostic::Diagnostic;
use util::span::*;

use crate::patterns::{PatVarStack, Pattern};
use crate::terms::*;
use crate::types::*;

#[derive(Clone, Debug, Default)]
pub struct DeBruijnIndexer {
    inner: VecDeque<String>,
}

impl DeBruijnIndexer {
    pub fn push(&mut self, hint: String) -> usize {
        let idx = self.inner.len();
        self.inner.push_front(hint);
        idx
    }

    pub fn pop(&mut self) {
        self.inner.pop_front();
    }

    pub fn lookup(&self, key: &str) -> Option<usize> {
        for (idx, s) in self.inner.iter().enumerate() {
            if key == s {
                return Some(idx);
            }
        }
        None
    }

    pub fn len(&self) -> usize {
        self.inner.len()
    }
}
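
The name-to-index scheme behind `DeBruijnIndexer` can be shown in a few lines: the most recently bound name lives at the front of the deque, so its position in a front-to-back scan is exactly its de Bruijn index. A standalone sketch (`Indexer` is a hypothetical stand-in):

```rust
use std::collections::VecDeque;

// De Bruijn name resolution: innermost binder gets index 0.
struct Indexer { inner: VecDeque<String> }

impl Indexer {
    // Entering a binder pushes its name at the front.
    fn push(&mut self, name: &str) { self.inner.push_front(name.to_string()); }
    // A variable's index is its distance from the innermost binder.
    fn lookup(&self, name: &str) -> Option<usize> {
        self.inner.iter().position(|s| s.as_str() == name)
    }
}

fn main() {
    let mut ix = Indexer { inner: VecDeque::new() };
    ix.push("x"); // λx.
    ix.push("y"); // λx. λy.
    assert_eq!(ix.lookup("y"), Some(0)); // innermost binder
    assert_eq!(ix.lookup("x"), Some(1));
    assert_eq!(ix.lookup("z"), None); // unbound
    println!("ok");
}
```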

pub struct Parser<'s> {
    tmvar: DeBruijnIndexer,
    tyvar: DeBruijnIndexer,
    diagnostic: Diagnostic<'s>,
    lexer: Lexer<'s>,
    span: Span,
    token: Token,
}

#[derive(Clone, Debug)]
pub struct Error {
    pub span: Span,
    pub tok: Token,
    pub kind: ErrorKind,
}

#[derive(Clone, Debug)]
pub enum ErrorKind {
    ExpectedAtom,
    ExpectedIdent,
    ExpectedType,
    ExpectedPattern,
    ExpectedToken(TokenKind),
    UnboundTypeVar,
    Unknown,
    Eof,
}
impl<'s> Parser<'s> {
    /// Create a new [`Parser`] for the input `&str`
    pub fn new(input: &'s str) -> Parser<'s> {
        let mut p = Parser {
            tmvar: DeBruijnIndexer::default(),
            tyvar: DeBruijnIndexer::default(),
            diagnostic: Diagnostic::new(input),
            lexer: Lexer::new(input.chars()),
            span: Span::default(),
            token: Token::dummy(),
        };
        p.bump();
        p
    }

    pub fn diagnostic(self) -> Diagnostic<'s> {
        self.diagnostic
    }
}

impl<'s> Parser<'s> {
    /// Kleene Plus combinator
    fn once_or_more<T, F>(&mut self, func: F, delimiter: TokenKind) -> Result<Vec<T>, Error>
    where
        F: Fn(&mut Parser) -> Result<T, Error>,
    {
        let mut v = vec![func(self)?];
        while self.bump_if(&delimiter) {
            v.push(func(self)?);
        }
        Ok(v)
    }

    /// Expect combinator
    ///
    /// Run `func`; if it returns `Err`, push `message` to the diagnostic
    /// before propagating the error. This method should only be called
    /// after a token has already been bumped.
    fn once<T, F>(&mut self, func: F, message: &str) -> Result<T, Error>
    where
        F: Fn(&mut Parser) -> Result<T, Error>,
    {
        match func(self) {
            Ok(t) => Ok(t),
            Err(e) => {
                self.diagnostic.push(message, self.span);
                Err(e)
            }
        }
    }
}

impl<'s> Parser<'s> {
    fn error<T>(&self, kind: ErrorKind) -> Result<T, Error> {
        Err(Error {
            span: self.token.span,
            tok: self.token.clone(),
            kind,
        })
    }

    fn bump(&mut self) -> TokenKind {
        let prev = std::mem::replace(&mut self.token, self.lexer.lex());
        self.span = prev.span;
        prev.kind
    }

    fn bump_if(&mut self, kind: &TokenKind) -> bool {
        if &self.token.kind == kind {
            self.bump();
            true
        } else {
            false
        }
    }

    fn expect(&mut self, kind: TokenKind) -> Result<(), Error> {
        if self.token.kind == kind {
            self.bump();
            Ok(())
        } else {
            self.diagnostic.push(
                format!("expected token {:?}, found {:?}", kind, self.token.kind),
                self.span,
            );
            self.error(ErrorKind::ExpectedToken(kind))
        }
    }

    fn kind(&self) -> &TokenKind {
        &self.token.kind
    }

    fn ty_variant(&mut self) -> Result<Variant, Error> {
        let label = self.uppercase_id()?;
        let ty = match self.ty() {
            Ok(ty) => ty,
            _ => Type::Unit,
        };

        Ok(Variant { label, ty })
    }

    fn ty_app(&mut self) -> Result<Type, Error> {
        if !self.bump_if(&TokenKind::LSquare) {
            return self.error(ErrorKind::ExpectedToken(TokenKind::LSquare));
        }
        let ty = self.ty()?;
        self.expect(TokenKind::RSquare)?;
        Ok(ty)
    }

    fn ty_atom(&mut self) -> Result<Type, Error> {
        match self.kind() {
            TokenKind::TyBool => {
                self.bump();
                Ok(Type::Bool)
            }
            TokenKind::TyNat => {
                self.bump();
                Ok(Type::Nat)
            }
            TokenKind::TyUnit => {
                self.bump();
                Ok(Type::Unit)
            }
            TokenKind::LParen => {
                self.bump();
                let r = self.ty()?;
                self.expect(TokenKind::RParen)?;
                Ok(r)
            }
            TokenKind::Forall => {
                self.bump();
                Ok(Type::Universal(Box::new(self.ty()?)))
            }
            TokenKind::Exists => {
                self.bump();
                let tvar = self.uppercase_id()?;
                self.expect(TokenKind::Proj)?;
                self.tyvar.push(tvar);
                let xs = Type::Existential(Box::new(self.ty()?));
                self.tyvar.pop();
                Ok(xs)
            }
            TokenKind::Uppercase(_) => {
                let ty = self.uppercase_id()?;
                match self.tyvar.lookup(&ty) {
                    Some(idx) => Ok(Type::Var(idx)),
                    None => Ok(Type::Alias(ty)),
                }
            }
            TokenKind::LBrace => {
                self.bump();
                let fields = self.once_or_more(|p| p.ty_variant(), TokenKind::Bar)?;
                self.expect(TokenKind::RBrace)?;
                Ok(Type::Variant(fields))
            }
            _ => self.error(ErrorKind::ExpectedType),
        }
    }

    fn ty_tuple(&mut self) -> Result<Type, Error> {
        if self.bump_if(&TokenKind::LParen) {
            let mut v = self.once_or_more(|p| p.ty(), TokenKind::Comma)?;
            self.expect(TokenKind::RParen)?;

            if v.len() > 1 {
                Ok(Type::Product(v))
            } else {
                Ok(v.remove(0))
            }
        } else {
            self.ty_atom()
        }
    }

    pub fn ty(&mut self) -> Result<Type, Error> {
        if self.bump_if(&TokenKind::Rec) {
            let name = self.uppercase_id()?;
            self.expect(TokenKind::Equals)?;
            self.tyvar.push(name);
            let ty = self.ty()?;
            self.tyvar.pop();
            return Ok(Type::Rec(Box::new(ty)));
        }

        let mut lhs = self.ty_tuple()?;
        if let TokenKind::TyArrow = self.kind() {
            self.bump();
            while let Ok(rhs) = self.ty() {
                lhs = Type::Arrow(Box::new(lhs), Box::new(rhs));
                if let TokenKind::TyArrow = self.kind() {
                    self.bump();
                } else {
                    break;
                }
            }
        }
        Ok(lhs)
    }

    fn tyabs(&mut self) -> Result<Term, Error> {
        let tyvar = self.uppercase_id()?;
        let sp = self.span;
        self.tyvar.push(tyvar);
        let body = self.once(|p| p.parse(), "abstraction body required")?;
        self.tyvar.pop();
        Ok(Term::new(Kind::TyAbs(Box::new(body)), sp + self.span))
    }

    fn tmabs(&mut self) -> Result<Term, Error> {
        let tmvar = self.lowercase_id()?;
        let sp = self.span;
        self.tmvar.push(tmvar);

        self.expect(TokenKind::Colon)?;
        let ty = self.once(|p| p.ty(), "type annotation required in abstraction")?;
        self.expect(TokenKind::Proj)?;
        let body = self.once(|p| p.parse(), "abstraction body required")?;
        self.tmvar.pop();
        Ok(Term::new(Kind::Abs(Box::new(ty), Box::new(body)), sp + self.span))
    }

    fn fold(&mut self) -> Result<Term, Error> {
        self.expect(TokenKind::Fold)?;
        let sp = self.span;
        let ty = self.once(|p| p.ty(), "type annotation required after `fold`")?;
        let tm = self.once(|p| p.parse(), "term required after `fold`")?;
        Ok(Term::new(Kind::Fold(Box::new(ty), Box::new(tm)), sp + self.span))
    }

    fn unfold(&mut self) -> Result<Term, Error> {
        self.expect(TokenKind::Unfold)?;
        let sp = self.span;
        let ty = self.once(|p| p.ty(), "type annotation required after `unfold`")?;
        let tm = self.once(|p| p.parse(), "term required after `unfold`")?;
        Ok(Term::new(Kind::Unfold(Box::new(ty), Box::new(tm)), sp + self.span))
    }

    fn fix(&mut self) -> Result<Term, Error> {
        let sp = self.span;
        self.expect(TokenKind::Fix)?;
        let t = self.parse()?;
        Ok(Term::new(Kind::Fix(Box::new(t)), sp + self.span))
    }

    fn letexpr(&mut self) -> Result<Term, Error> {
        let sp = self.span;
        self.expect(TokenKind::Let)?;
        let mut pat = self.once(|p| p.pattern(), "missing pattern")?;

        self.expect(TokenKind::Equals)?;

        let t1 = self.once(|p| p.parse(), "let binder required")?;
        let len = self.tmvar.len();
        for var in PatVarStack::collect(&mut pat).into_iter().rev() {
            self.tmvar.push(var);
        }
        self.expect(TokenKind::In)?;
        let t2 = self.once(|p| p.parse(), "let body required")?;
        while self.tmvar.len() > len {
            self.tmvar.pop();
        }
        Ok(Term::new(
            Kind::Let(Box::new(pat), Box::new(t1), Box::new(t2)),
            sp + self.span,
        ))
    }

    fn lambda(&mut self) -> Result<Term, Error> {
        self.expect(TokenKind::Lambda)?;
        match self.kind() {
            TokenKind::Uppercase(_) => self.tyabs(),
            TokenKind::Lowercase(_) => self.tmabs(),
            _ => {
                self.diagnostic.push(
                    format!("expected identifier after lambda, found {:?}", self.token.kind),
                    self.span,
                );
                self.error(ErrorKind::ExpectedIdent)
            }
        }
    }

    fn paren(&mut self) -> Result<Term, Error> {
        self.expect(TokenKind::LParen)?;
        let span = self.span;
        let mut n = self.once_or_more(|p| p.parse(), TokenKind::Comma)?;
        self.expect(TokenKind::RParen)?;
        if n.len() > 1 {
            Ok(Term::new(Kind::Product(n), span + self.span))
        } else {
            // once_or_more guarantees n.len() >= 1, so exactly one element remains here
            Ok(n.remove(0))
        }
    }

    fn uppercase_id(&mut self) -> Result<String, Error> {
        match self.bump() {
            TokenKind::Uppercase(s) => Ok(s),
            tk => {
                self.diagnostic
                    .push(format!("expected uppercase identifier, found {:?}", tk), self.span);
                self.error(ErrorKind::ExpectedIdent)
            }
        }
    }

    fn lowercase_id(&mut self) -> Result<String, Error> {
        match self.bump() {
            TokenKind::Lowercase(s) => Ok(s),
            tk => {
                self.diagnostic
                    .push(format!("expected lowercase identifier, found {:?}", tk), self.span);
                self.error(ErrorKind::ExpectedIdent)
            }
        }
    }

    fn literal(&mut self) -> Result<Term, Error> {
        let lit = match self.bump() {
            TokenKind::Nat(x) => Literal::Nat(x),
            TokenKind::True => Literal::Bool(true),
            TokenKind::False => Literal::Bool(false),
            TokenKind::Unit => Literal::Unit,
            _ => return self.error(ErrorKind::Unknown),
        };
        Ok(Term::new(Kind::Lit(lit), self.span))
    }

    fn primitive(&mut self) -> Result<Term, Error> {
        let p = match self.bump() {
            TokenKind::IsZero => Primitive::IsZero,
            TokenKind::Succ => Primitive::Succ,
            TokenKind::Pred => Primitive::Pred,
            _ => return self.error(ErrorKind::Unknown),
        };
        Ok(Term::new(Kind::Primitive(p), self.span))
    }

    /// Parse an atomic pattern. Note that patterns can bind variables;
    /// callers are responsible for collecting the bound names (e.g. via
    /// `PatVarStack::collect`) and for rebalancing the de Bruijn naming
    /// context afterwards
    fn pat_atom(&mut self) -> Result<Pattern, Error> {
        match self.kind() {
            TokenKind::LParen => self.pattern(),
            TokenKind::Wildcard => {
                self.bump();
                Ok(Pattern::Any)
            }
            TokenKind::Uppercase(_) => {
                let tycon = self.uppercase_id()?;
                let inner = match self.pattern() {
                    Ok(pat) => pat,
                    _ => Pattern::Any,
                };
                Ok(Pattern::Constructor(tycon, Box::new(inner)))
            }
            TokenKind::Lowercase(_) => {
                let var = self.lowercase_id()?;
                // binding is deferred; the caller collects pattern variables via PatVarStack
                Ok(Pattern::Variable(var))
            }
            TokenKind::True => {
                self.bump();
                Ok(Pattern::Literal(Literal::Bool(true)))
            }
            TokenKind::False => {
                self.bump();
                Ok(Pattern::Literal(Literal::Bool(false)))
            }
            TokenKind::Unit => {
                self.bump();
                Ok(Pattern::Literal(Literal::Unit))
            }
            TokenKind::Nat(n) => {
                // O great borrowck, may this humble offering appease thee
                let n = *n;
                self.bump();
                Ok(Pattern::Literal(Literal::Nat(n)))
            }
            _ => self.error(ErrorKind::ExpectedPattern),
        }
    }

    fn pattern(&mut self) -> Result<Pattern, Error> {
        match self.kind() {
            TokenKind::LParen => {
                self.bump();
                let mut v = self.once_or_more(|p| p.pat_atom(), TokenKind::Comma)?;
                self.expect(TokenKind::RParen)?;
                if v.len() > 1 {
                    Ok(Pattern::Product(v))
                } else {
                    // v must have length == 1, else we would have early returned
                    assert_eq!(v.len(), 1);
                    Ok(v.remove(0))
                }
            }
            _ => self.pat_atom(),
        }
    }

    fn case_arm(&mut self) -> Result<Arm, Error> {
        // We don't track the length of the de Bruijn naming context in other
        // methods, but several branches below may bind variables, and
        // recording the length up front is the easiest way to rebalance
        // the context afterwards

        let len = self.tmvar.len();
        let mut span = self.span;

        let mut pat = self.once(|p| p.pattern(), "missing pattern")?;

        for var in PatVarStack::collect(&mut pat).into_iter().rev() {
            self.tmvar.push(var);
        }

        self.expect(TokenKind::Equals)?;
        self.expect(TokenKind::Gt)?;

        let term = Box::new(self.once(|p| p.application(), "missing case term")?);

        self.bump_if(&TokenKind::Comma);

        // Unbind any variables from the parsing context
        while self.tmvar.len() > len {
            self.tmvar.pop();
        }

        span = span + self.span;

        Ok(Arm { span, pat, term })
    }

    fn case(&mut self) -> Result<Term, Error> {
        self.expect(TokenKind::Case)?;
        let span = self.span;
        let expr = self.once(|p| p.parse(), "missing case expression")?;
        self.expect(TokenKind::Of)?;

        self.bump_if(&TokenKind::Bar);
        let arms = self.once_or_more(|p| p.case_arm(), TokenKind::Bar)?;

        Ok(Term::new(Kind::Case(Box::new(expr), arms), span + self.span))
    }

    fn injection(&mut self) -> Result<Term, Error> {
        let label = self.uppercase_id()?;
        let sp = self.span;
        let term = match self.parse() {
            Ok(t) => t,
            _ => Term::new(Kind::Lit(Literal::Unit), self.span),
        };

        self.expect(TokenKind::Of)?;
        let ty = self.ty()?;
        Ok(Term::new(
            Kind::Injection(label, Box::new(term), Box::new(ty)),
            sp + self.span,
        ))
    }

    fn pack(&mut self) -> Result<Term, Error> {
        self.expect(TokenKind::Pack)?;
        let sp = self.span;
        let witness = self.ty()?;
        self.expect(TokenKind::Comma)?;
        let evidence = self.parse()?;
        self.expect(TokenKind::As)?;
        let signature = self.ty()?;

        Ok(Term::new(
            Kind::Pack(Box::new(witness), Box::new(evidence), Box::new(signature)),
            sp + self.span,
        ))
    }

    fn unpack(&mut self) -> Result<Term, Error> {
        self.expect(TokenKind::Unpack)?;
        let sp = self.span;
        let package = self.parse()?;
        self.expect(TokenKind::As)?;

        let tyvar = self.uppercase_id()?;
        self.expect(TokenKind::Comma)?;
        let name = self.lowercase_id()?;
        self.tyvar.push(tyvar);
        self.tmvar.push(name);
        self.expect(TokenKind::In)?;
        let expr = self.parse()?;
        self.tmvar.pop();
        self.tyvar.pop();
        Ok(Term::new(
            Kind::Unpack(Box::new(package), Box::new(expr)),
            sp + self.span,
        ))
    }

    fn atom(&mut self) -> Result<Term, Error> {
        match self.kind() {
            TokenKind::LParen => self.paren(),
            TokenKind::Fix => self.fix(),
            TokenKind::Fold => self.fold(),
            TokenKind::Unfold => self.unfold(),
            TokenKind::Pack => self.pack(),
            TokenKind::Unpack => self.unpack(),
            TokenKind::IsZero | TokenKind::Succ | TokenKind::Pred => self.primitive(),
            TokenKind::Uppercase(_) => self.injection(),
            TokenKind::Lowercase(_) => {
                let var = self.lowercase_id()?;
                match self.tmvar.lookup(&var) {
                    Some(idx) => Ok(Term::new(Kind::Var(idx), self.span)),
                    None => {
                        self.diagnostic.push(format!("unbound variable {}", var), self.span);
                        self.error(ErrorKind::UnboundTypeVar)
                    }
                }
            }
            TokenKind::Nat(_) | TokenKind::True | TokenKind::False | TokenKind::Unit => self.literal(),
            TokenKind::Eof => self.error(ErrorKind::Eof),
            TokenKind::Semicolon => {
                self.bump();
                self.error(ErrorKind::ExpectedAtom)
            }
            _ => self.error(ErrorKind::ExpectedAtom),
        }
    }

    /// Parse a term of the form:
    /// projection = atom `.` projection
    /// projection = atom
    fn projection(&mut self) -> Result<Term, Error> {
        let atom = self.atom()?;
        if self.bump_if(&TokenKind::Proj) {
            let idx = match self.bump() {
                TokenKind::Nat(idx) => idx,
                _ => {
                    self.diagnostic
                        .push(format!("expected integer index after {}", atom), self.span);
                    return self.error(ErrorKind::ExpectedToken(TokenKind::Proj));
                }
            };
            let sp = atom.span + self.span;
            Ok(Term::new(Kind::Projection(Box::new(atom), idx as usize), sp))
        } else {
            Ok(atom)
        }
    }

    /// Parse an application of the form:
    /// application = atom application' | atom
    /// application' = atom application' | empty
    fn application(&mut self) -> Result<Term, Error> {
        let mut app = self.projection()?;

        loop {
            let sp = app.span;
            if let Ok(ty) = self.ty_app() {
                // Full type inference for System F is undecidable
                // Additionally, even partial type reconstruction,
                // where only type application types are erased is also
                // undecidable, see TaPL 23.6.2, Boehm 1985, 1989
                //
                // Partial erasure rules:
                // erasep(x) = x
                // erasep(λx:T. t) = λx:T. erasep(t)
                // erasep(t1 t2) = erasep(t1) erasep(t2)
                // erasep(λX. t) = λX. erasep(t)
                // erasep(t T) = erasep(t) []      <--- erasure of TyApp
                app = Term::new(Kind::TyApp(Box::new(app), Box::new(ty)), sp + self.span);
            } else if let Ok(term) = self.projection() {
                app = Term::new(Kind::App(Box::new(app), Box::new(term)), sp + self.span);
            } else {
                break;
            }
        }
        Ok(app)
    }

    pub fn parse(&mut self) -> Result<Term, Error> {
        match self.kind() {
            TokenKind::Case => self.case(),
            TokenKind::Lambda => self.lambda(),
            TokenKind::Let => self.letexpr(),
            _ => self.application(),
        }
    }
}


================================================
FILE: 06_system_f/src/terms/mod.rs
================================================
//! Representation of lambda calculus terms
use crate::patterns::Pattern;
use crate::types::Type;
use std::fmt;
use util::span::Span;
pub mod visit;

#[derive(Clone, PartialEq, PartialOrd)]
pub struct Term {
    pub span: Span,
    pub kind: Kind,
}

/// Primitive functions supported by this implementation
#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]
pub enum Primitive {
    Succ,
    Pred,
    IsZero,
}

/// Abstract syntax of the parametric polymorphic lambda calculus
#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum Kind {
    /// A literal value
    Lit(Literal),
    /// A bound variable, represented by its de Bruijn index
    Var(usize),
    /// Fixpoint operator/Y combinator
    Fix(Box<Term>),

    Primitive(Primitive),

    /// Injection into a sum type
    /// fields: type constructor tag, term, and sum type
    Injection(String, Box<Term>, Box<Type>),

    /// Product type (tuple)
    Product(Vec<Term>),
    /// Projection into a term
    Projection(Box<Term>, usize),

    /// A case expr, with case arms
    Case(Box<Term>, Vec<Arm>),

    Let(Box<Pattern>, Box<Term>, Box<Term>),
    /// A lambda abstraction
    Abs(Box<Type>, Box<Term>),
    /// Application of a term to another term
    App(Box<Term>, Box<Term>),
    /// Type abstraction
    TyAbs(Box<Term>),
    /// Type application
    TyApp(Box<Term>, Box<Type>),

    Fold(Box<Type>, Box<Term>),
    Unfold(Box<Type>, Box<Term>),

    /// Introduce an existential type
    /// { *Ty1, Term } as {∃X.Ty}
    /// essentially, concrete representation as interface
    Pack(Box<Type>, Box<Term>, Box<Type>),
    /// Unpack an existential type
    /// open {∃X, bind} in body -- X is bound as a TyVar, and bind as Var(0)
    /// Eliminate an existential type
    Unpack(Box<Term>, Box<Term>),
}

/// Arm of a case expression
#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub struct Arm {
    pub span: Span,
    pub pat: Pattern,
    pub term: Box<Term>,
}

/// Constant literal expression or pattern
#[derive(Copy, Clone, Debug, PartialEq, PartialOrd, Eq, Hash)]
pub enum Literal {
    Unit,
    Bool(bool),
    Nat(u32),
}

impl Term {
    pub fn new(kind: Kind, span: Span) -> Term {
        Term { span, kind }
    }

    #[allow(dead_code)]
    pub const fn unit() -> Term {
        Term {
            span: Span::dummy(),
            kind: Kind::Lit(Literal::Unit),
        }
    }

    #[allow(dead_code)]
    #[inline]
    pub fn span(&self) -> Span {
        self.span
    }

    #[inline]
    pub fn kind(&self) -> &Kind {
        &self.kind
    }
}

impl fmt::Display for Literal {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match self {
            Literal::Nat(n) => write!(f, "{}", n),
            Literal::Bool(b) => write!(f, "{}", b),
            Literal::Unit => write!(f, "unit"),
        }
    }
}

impl fmt::Display for Term {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match &self.kind {
            Kind::Lit(lit) => write!(f, "{}", lit),
            Kind::Var(v) => write!(f, "#{}", v),
            Kind::Abs(ty, term) => write!(f, "(λ_:{:?}. {})", ty, term),
            Kind::Fix(term) => write!(f, "Fix {:?}", term),
            Kind::Primitive(p) => write!(f, "{:?}", p),
            Kind::Injection(label, tm, _) => write!(f, "{}({})", label, tm),
            Kind::Projection(term, idx) => write!(f, "{}.{}", term, idx),
            Kind::Product(terms) => write!(
                f,
                "({})",
                terms
                    .iter()
                    .map(|t| format!("{}", t))
                    .collect::<Vec<String>>()
                    .join(",")
            ),
            Kind::Case(term, arms) => {
                writeln!(f, "case {} of", term)?;
                for arm in arms {
                    writeln!(f, "\t| {:?} => {},", arm.pat, arm.term)?;
                }
                write!(f, "")
            }
            Kind::Let(pat, t1, t2) => write!(f, "let {:?} = {} in {}", pat, t1, t2),
            Kind::App(t1, t2) => write!(f, "({} {})", t1, t2),
            Kind::TyAbs(term) => write!(f, "(λTy {})", term),
            Kind::TyApp(term, ty) => write!(f, "({} [{:?}])", term, ty),
            Kind::Fold(ty, term) => write!(f, "fold [{:?}] {}", ty, term),
            Kind::Unfold(ty, term) => write!(f, "unfold [{:?}] {}", ty, term),
            Kind::Pack(witness, body, sig) => write!(f, "[|pack {{*{:?}, {}}} as {:?} |]", witness, body, sig),
            Kind::Unpack(m, n) => write!(f, "unpack {} as {}", m, n),
        }
    }
}

impl fmt::Debug for Term {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        write!(f, "{:?}", self.kind)
    }
}

#[cfg(test)]
mod test {
    use super::*;

    #[test]
    fn pattern_matches() {
        let ty = Type::Variant(vec![
            variant!("A", Type::Nat),
            variant!("B", Type::Product(vec![Type::Nat, Type::Bool])),
        ]);

        let a_pats = vec![con!("A", Pattern::Any), con!("A", num!(9)), con!("A", num!(10))];

        let b_pats = vec![
            con!("B", Pattern::Any),
            con!("B", prod!(num!(1), boolean!(true))),
            con!("B", prod!(Pattern::Any, boolean!(false))),
        ];

        let res = [true, false, true];

        let a = inj!("A", nat!(10), ty.clone());
        let b = inj!("B", tuple!(nat!(1), lit!(false)), ty.clone());

        for (pat, result) in a_pats.iter().zip(&res) {
            assert_eq!(pat.matches(&a), *result);
        }

        for (pat, result) in b_pats.iter().zip(&res) {
            assert_eq!(pat.matches(&b), *result, "{:?}", pat);
        }
    }
}


================================================
FILE: 06_system_f/src/terms/visit.rs
================================================
use crate::patterns::{Pattern, PatternCount};
use crate::terms::{Arm, Kind, Primitive, Term};
use crate::types::Type;
use crate::visit::{MutTermVisitor, MutTypeVisitor};
use util::span::Span;

pub struct Shift {
    cutoff: usize,
    shift: isize,
}

impl Shift {
    pub const fn new(shift: isize) -> Shift {
        Shift { cutoff: 0, shift }
    }
}

impl MutTermVisitor for Shift {
    fn visit_var(&mut self, sp: &mut Span, var: &mut usize) {
        if *var >= self.cutoff {
            *var = (*var as isize + self.shift) as usize;
        }
    }

    fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
        self.cutoff += 1;
        self.visit(term);
        self.cutoff -= 1;
    }

    fn visit_let(&mut self, sp: &mut Span, pat: &mut Pattern, t1: &mut Term, t2: &mut Term) {
        self.visit(t1);
        let c = PatternCount::collect(pat);
        self.cutoff += c;
        self.visit(t2);
        self.cutoff -= c;
    }

    fn visit_case(&mut self, sp: &mut Span, term: &mut Term, arms: &mut Vec<Arm>) {
        self.visit(term);
        for arm in arms {
            let c = PatternCount::collect(&mut arm.pat);
            self.cutoff += c;
            self.visit(&mut arm.term);
            self.cutoff -= c;
        }
    }

    fn visit_unpack(&mut self, _: &mut Span, package: &mut Term, term: &mut Term) {
        self.visit(package);
        self.cutoff += 1;
        self.visit(term);
        self.cutoff -= 1;
    }
}

pub struct Subst {
    cutoff: usize,
    term: Term,
}

impl Subst {
    pub fn new(term: Term) -> Subst {
        Subst { cutoff: 0, term }
    }
}

impl MutTermVisitor for Subst {
    fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
        self.cutoff += 1;
        self.visit(term);
        self.cutoff -= 1;
    }

    fn visit_let(&mut self, sp: &mut Span, pat: &mut Pattern, t1: &mut Term, t2: &mut Term) {
        self.visit(t1);
        let c = PatternCount::collect(pat);
        self.cutoff += c;
        self.visit(t2);
        self.cutoff -= c;
    }

    fn visit_case(&mut self, sp: &mut Span, term: &mut Term, arms: &mut Vec<Arm>) {
        self.visit(term);
        for arm in arms {
            let c = PatternCount::collect(&mut arm.pat);
            self.cutoff += c;
            self.visit(&mut arm.term);
            self.cutoff -= c;
        }
    }

    fn visit_unpack(&mut self, _: &mut Span, package: &mut Term, term: &mut Term) {
        self.visit(package);
        self.cutoff += 1;
        self.visit(term);
        self.cutoff -= 1;
    }

    fn visit(&mut self, term: &mut Term) {
        match &mut term.kind {
            Kind::Var(v) if *v == self.cutoff => {
                // shift a copy of the substituted term, rather than the
                // stored one, so that repeated substitution sites don't
                // see a cumulatively shifted term
                let mut t = self.term.clone();
                Shift::new(self.cutoff as isize).visit(&mut t);
                *term = t;
            }
            _ => self.walk(term),
        }
    }
}

pub struct TyTermSubst {
    cutoff: usize,
    ty: Type,
}

impl TyTermSubst {
    pub fn new(ty: Type) -> TyTermSubst {
        use crate::types::visit::*;
        let mut ty = ty;
        Shift::new(1).visit(&mut ty);
        TyTermSubst { cutoff: 0, ty }
    }

    fn visit_ty(&mut self, ty: &mut Type) {
        let mut s = crate::types::visit::Subst {
            cutoff: self.cutoff,
            ty: self.ty.clone(),
        };
        s.visit(ty);
    }
}

impl MutTermVisitor for TyTermSubst {
    fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
        // a term-level abstraction binds no type variable, so the type
        // substitution cutoff is unchanged here
        self.visit_ty(ty);
        self.visit(term);
    }

    fn visit_tyapp(&mut self, sp: &mut Span, term: &mut Term, ty: &mut Type) {
        self.visit_ty(ty);
        self.visit(term);
    }

    fn visit_tyabs(&mut self, sp: &mut Span, term: &mut Term) {
        self.cutoff += 1;
        self.visit(term);
        self.cutoff -= 1;
    }

    fn visit_fold(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
        self.visit_ty(ty);
        self.visit(term);
    }

    fn visit_unfold(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
        self.visit_ty(ty);
        self.visit(term);
    }

    fn visit_unpack(&mut self, _: &mut Span, package: &mut Term, term: &mut Term) {
        self.visit(package);
        self.cutoff += 1;
        self.visit(term);
        self.cutoff -= 1;
    }

    fn visit_pack(&mut self, _: &mut Span, wit: &mut Type, body: &mut Term, sig: &mut Type) {
        self.visit_ty(wit);
        self.visit(body);
        self.visit_ty(sig);
    }

    fn visit_injection(&mut self, sp: &mut Span, label: &mut String, term: &mut Term, ty: &mut Type) {
        self.visit_ty(ty);
        self.visit(term);
    }
}

/// Visitor for handling recursive variants automatically, by inserting a
/// fold term
///
/// Transform an [`Injection`] term of form: `Label tm of Rec(u.T)` into
/// `fold [u.T] Label tm of [X->u.T] T`
pub struct InjRewriter;

impl MutTermVisitor for InjRewriter {
    fn visit(&mut self, term: &mut Term) {
        match &mut term.kind {
            Kind::Injection(label, val, ty) => {
                match *ty.clone() {
                    Type::Rec(inner) => {
                        let ty_prime = crate::types::subst(*ty.clone(), *inner.clone());
                        let rewrite_ty = Term::new(
                            Kind::Injection(label.clone(), val.clone(), Box::new(ty_prime)),
                            term.span,
                        );

                        *term = Term::new(Kind::Fold(ty.clone(), Box::new(rewrite_ty)), term.span);
                    }
                    _ => {}
                }
                self.walk(term);
            }
            _ => self.walk(term),
        }
    }
}


================================================
FILE: 06_system_f/src/types/mod.rs
================================================
//! Typechecking of the simply typed lambda calculus with parametric
//! polymorphism
pub mod patterns;
pub mod visit;
use crate::diagnostics::*;
use crate::terms::{Kind, Literal, Primitive, Term};
use crate::visit::{MutTermVisitor, MutTypeVisitor};
use std::collections::{HashMap, VecDeque};
use std::fmt;
use util::span::Span;
use visit::{Shift, Subst};

#[derive(Clone, PartialEq, PartialOrd, Eq, Hash)]
pub enum Type {
    Unit,
    Nat,
    Bool,
    Alias(String),
    Var(usize),
    Variant(Vec<Variant>),
    Product(Vec<Type>),
    Arrow(Box<Type>, Box<Type>),
    Universal(Box<Type>),
    Existential(Box<Type>),
    Rec(Box<Type>),
}

#[derive(Clone, PartialEq, PartialOrd, Eq, Hash)]
pub struct Variant {
    pub label: String,
    pub ty: Type,
}

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub struct TypeError {
    pub span: Span,
    pub kind: TypeErrorKind,
}

#[derive(Clone, Debug, PartialEq, PartialOrd)]
pub enum TypeErrorKind {
    ParameterMismatch(Box<Type>, Box<Type>, Span),

    InvalidProjection,
    NotArrow,
    NotUniversal,
    NotVariant,
    NotProduct,
    NotRec,
    IncompatibleArms,
    InvalidPattern,
    NotExhaustive,
    UnreachablePattern,
    UnboundVariable(usize),
}

#[derive(Clone, Debug, Default, PartialEq)]
pub struct Context {
    stack: VecDeque<Type>,
    map: HashMap<String, Type>,
}

impl Context {
    /// Bind a type at the front of the de Bruijn-indexed stack
    fn push(&mut self, ty: Type) {
        self.stack.push_front(ty);
    }

    /// Remove the most recent binding, panicking if the stack is empty
    fn pop(&mut self) {
        self.stack.pop_front().expect("Context::pop() with empty type stack");
    }

    /// Look up the type bound at de Bruijn index `idx`
    fn find(&self, idx: usize) -> Option<&Type> {
        self.stack.get(idx)
    }

    pub fn alias(&mut self, alias: String, ty: Type) {
        self.map.insert(alias, ty);
    }

    fn aliaser(&self) -> Aliaser<'_> {
        Aliaser { map: &self.map }
    }

    pub fn de_alias(&mut self, term: &mut Term) {
        crate::visit::MutTermVisitor::visit(self, term)
    }
}

/// Helper function for extracting the type associated with the constructor `label` from a variant's fields
pub fn variant_field<'vs>(var: &'vs [Variant], label: &str, span: Span) -> Result<&'vs Type, Diagnostic> {
    for f in var {
        if label == f.label {
            return Ok(&f.ty);
        }
    }
    Err(Diagnostic::error(
        span,
        format!("constructor {} doesn't appear in variant fields", label),
    ))

    // Err(TypeError {
    //     span,
    //     kind: TypeErrorKind::NotVariant,
    // })
}

impl Context {
    pub fn type_check(&mut self, term: &Term) -> Result<Type, Diagnostic> {
        // dbg!(&self.stack);

        // println!("{}", term);
        match term.kind() {
            Kind::Lit(Literal::Unit) => Ok(Type::Unit),
            Kind::Lit(Literal::Bool(_)) => Ok(Type::Bool),
            Kind::Lit(Literal::Nat(_)) => Ok(Type::Nat),
            Kind::Var(idx) => self
                .find(*idx)
                .cloned()
                .ok_or_else(|| Diagnostic::error(term.span, format!("unbound variable {}", idx))),

            Kind::Abs(ty, t2) => {
                self.push(*ty.clone());
                let ty2 = self.type_check(t2)?;
                // Shift::new(-1).visit(&mut ty2);
                self.pop();
                Ok(Type::Arrow(ty.clone(), Box::new(ty2)))
            }
            Kind::App(t1, t2) => {
                let ty1 = self.type_check(t1)?;
                let ty2 = self.type_check(t2)?;
                match ty1 {
                    Type::Arrow(ty11, ty12) => {
                        if *ty11 == ty2 {
                            Ok(*ty12)
                        } else {
                            let d = Diagnostic::error(term.span, "Type mismatch in application")
                                .message(t1.span, format!("Abstraction requires type {:?}", ty11))
                                .message(t2.span, format!("Value has a type of {:?}", ty2));
                            Err(d)
                        }
                    }
                    _ => Err(Diagnostic::error(term.span, "Expected arrow type!")
                        .message(t1.span, format!("operator has type {:?}", ty1))),
                }
            }
            Kind::Fix(inner) => {
                let ty = self.type_check(inner)?;
                match ty {
                    Type::Arrow(ty1, ty2) => {
                        if ty1 == ty2 {
                            Ok(*ty1)
                        } else {
                            let d = Diagnostic::error(term.span, "Type mismatch in fix term")
                                .message(inner.span, format!("Abstraction requires type {:?}->{:?}", ty1, ty1));
                            Err(d)
                        }
                    }
                    _ => Err(Diagnostic::error(term.span, "Expected arrow type!")
                        .message(inner.span, format!("operator has type {:?}", ty))),
                }
            }
            Kind::Primitive(prim) => match prim {
                Primitive::IsZero => Ok(Type::Arrow(Box::new(Type::Nat), Box::new(Type::Bool))),
                _ => Ok(Type::Arrow(Box::new(Type::Nat), Box::new(Type::Nat))),
            },
            Kind::Injection(label, tm, ty) => match ty.as_ref() {
                Type::Variant(fields) => {
                    for f in fields {
                        if label == &f.label {
                            let ty_ = self.type_check(tm)?;
                            if ty_ == f.ty {
                                return Ok(*ty.clone());
                            } else {
                                let d = Diagnostic::error(term.span, "Invalid associated type in variant").message(
                                    tm.span,
                                    format!("variant {} requires type {:?}, but this is {:?}", label, f.ty, ty_),
                                );
                                return Err(d);
                            }
                        }
                    }
                    Err(Diagnostic::error(
                        term.span,
                        format!(
                            "constructor {} does not belong to the variant {:?}",
                            label,
                            fields
                                .iter()
                                .map(|f| f.label.clone())
                                .collect::<Vec<String>>()
                                .join(" | ")
                        ),
                    ))
                }
                _ => Err(Diagnostic::error(
                    term.span,
                    format!("Cannot inject {} into non-variant type {:?}", label, ty),
                )),
            },
            Kind::Projection(term, idx) => match self.type_check(term)? {
                Type::Product(types) => match types.get(*idx) {
                    Some(ty) => Ok(ty.clone()),
                    None => Err(Diagnostic::error(
                        term.span,
                        format!("{} is out of range for product of length {}", idx, types.len()),
                    )),
                },
                ty => Err(Diagnostic::error(
                    term.span,
                    format!("Cannot project on non-product type {:?}", ty),
                )),
            },
            Kind::Product(terms) => Ok(Type::Product(
                terms.iter().map(|t| self.type_check(t)).collect::<Result<_, _>>()?,
            )),
            Kind::Let(pat, t1, t2) => {
                let ty = self.type_check(t1)?;
                if !self.pattern_type_eq(&pat, &ty) {
                    return Err(Diagnostic::error(
                        t1.span,
                        "pattern does not match the type of its binder",
                    ));
                }

                let height = self.stack.len();

                let binds = crate::patterns::PatTyStack::collect(&ty, &pat);
                for b in binds.into_iter().rev() {
                    self.push(b.clone());
                }

                let y = self.type_check(t2);

                while self.stack.len() > height {
                    self.pop();
                }

                y
            }
            Kind::TyAbs(term) => {
                // Entering a type abstraction binds a fresh type variable, so
                // every free type variable in the context must be shifted up
                // by one (including variables nested inside arrows, products,
                // etc.) and shifted back down once the body has been checked
                self.stack.iter_mut().for_each(|ty| Shift::new(1).visit(ty));
                let ty2 = self.type_check(term)?;
                self.stack.iter_mut().for_each(|ty| Shift::new(-1).visit(ty));
                Ok(Type::Universal(Box::new(ty2)))
            }
            }
            Kind::TyApp(term, ty) => {
                let mut ty = ty.clone();
                let ty1 = self.type_check(term)?;
                match ty1 {
                    Type::Universal(mut ty12) => {
                        Shift::new(1).visit(&mut ty);
                        Subst::new(*ty).visit(&mut ty12);
                        Shift::new(-1).visit(&mut ty12);
                        Ok(*ty12)
                    }
                    _ => Err(Diagnostic::error(
                        term.span,
                        format!("Expected a universal type, not {:?}", ty1),
                    )),
                }
            }
            // See src/types/patterns.rs for exhaustiveness and typechecking
            // of case expressions
            Kind::Case(expr, arms) => self.type_check_case(expr, arms),

            Kind::Unfold(rec, tm) => match rec.as_ref() {
                Type::Rec(inner) => {
                    let ty_ = self.type_check(&tm)?;
                    if &ty_ == rec.as_ref() {
                        Ok(subst(*rec.clone(), *inner.clone()))
                    } else {
                        let d = Diagnostic::error(term.span, "Type mismatch in unfold")
                            .message(term.span, format!("unfold requires type {:?}", rec))
                            .message(tm.span, format!("term has a type of {:?}", ty_));
                        Err(d)
                    }
                }
                _ => Err(Diagnostic::error(
                    term.span,
                    format!("Expected a recursive type, not {:?}", rec),
                )),
            },

            Kind::Fold(rec, tm) => match rec.as_ref() {
                Type::Rec(inner) => {
                    let ty_ = self.type_check(&tm)?;
                    let s = subst(*rec.clone(), *inner.clone());
                    if ty_ == s {
                        Ok(*rec.clone())
                    } else {
                        let d = Diagnostic::error(term.span, "Type mismatch in fold")
                            .message(term.span, format!("fold requires type {:?}", s))
                            .message(tm.span, format!("term has a type of {:?}", ty_));
                        Err(d)
                    }
                }
                _ => Err(Diagnostic::error(
                    term.span,
                    format!("Expected a recursive type, not {:?}", rec),
                )),
            },
            Kind::Pack(witness, evidence, signature) => {
                if let Type::Existential(exists) = signature.as_ref() {
                    let sig_prime = subst(*witness.clone(), *exists.clone());
                    let evidence_ty = self.type_check(evidence)?;
                    if evidence_ty == sig_prime {
                        Ok(*signature.clone())
                    } else {
                        let d = Diagnostic::error(term.span, "Type mismatch in pack")
                            .message(term.span, format!("signature has type {:?}", sig_prime))
                            .message(evidence.span, format!("but term has a type {:?}", evidence_ty));
                        Err(d)
                    }
                } else {
                    Err(Diagnostic::error(
                        term.span,
                        format!("Expected an existential type signature, not {:?}", signature),
                    ))
                }
            }
            Kind::Unpack(package, body) => {
                let p_ty = self.type_check(package)?;
                if let Type::Existential(xst) = p_ty {
                    self.push(*xst);
                    let body_ty = self.type_check(body)?;
                    self.pop();
                    Ok(body_ty)
                } else {
                    Err(Diagnostic::error(
                        term.span,
                        format!("Expected an existential type, not {:?}", p_ty),
                    ))
                }
            }
        }
    }
}
SYMBOL INDEX (1012 symbols across 65 files)

FILE: 01_arith/src/lexer.rs
  type Token (line 8) | pub enum Token {
  type TokenSpan (line 27) | pub struct TokenSpan {
    type Target (line 33) | type Target = Token;
    method deref (line 34) | fn deref(&self) -> &Self::Target {
  type Lexer (line 40) | pub struct Lexer<'s> {
  function new (line 46) | pub fn new(input: Chars<'s>) -> Lexer<'s> {
  function peek (line 57) | fn peek(&mut self) -> Option<char> {
  function consume (line 62) | fn consume(&mut self) -> Option<char> {
  function consume_while (line 79) | fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> Spanned<Str...
  function consume_delimiter (line 96) | fn consume_delimiter(&mut self) {
  function number (line 100) | fn number(&mut self) -> Option<TokenSpan> {
  function keyword (line 106) | fn keyword(&mut self) -> Option<TokenSpan> {
  function eat (line 123) | fn eat(&mut self, ch: char, token: Token) -> Option<TokenSpan> {
  function lex (line 133) | fn lex(&mut self) -> Option<TokenSpan> {
  type Item (line 147) | type Item = TokenSpan;
  method next (line 148) | fn next(&mut self) -> Option<Self::Item> {
  function valid (line 158) | fn valid() {
  function invalid (line 169) | fn invalid() {

FILE: 01_arith/src/main.rs
  type RuntimeError (line 6) | pub enum RuntimeError {
  method is_numeric (line 11) | pub fn is_numeric(&self) -> bool {
  method is_normal (line 19) | pub fn is_normal(&self) -> bool {
  function eval1 (line 27) | pub fn eval1(t: Term) -> Result<Term, RuntimeError> {
  function eval (line 63) | pub fn eval(t: Term) -> Term {
  function main (line 74) | fn main() {

FILE: 01_arith/src/parser.rs
  type Term (line 7) | pub enum Term {
  type Parser (line 17) | pub struct Parser<'s> {
  function new (line 27) | pub fn new(input: &'s str) -> Parser<'s> {
  function consume (line 35) | fn consume(&mut self) -> Option<Token> {
  function expect (line 41) | fn expect(&mut self, token: Token) -> Option<Token> {
  function parse_paren (line 48) | fn parse_paren(&mut self) -> Option<Term> {
  function parse_if (line 54) | fn parse_if(&mut self) -> Option<Term> {
  function parse_term (line 63) | pub fn parse_term(&mut self) -> Option<Term> {
  function diagnostic (line 86) | pub fn diagnostic(self) -> Diagnostic<'s> {
  function baptize (line 92) | fn baptize(int: u32) -> Term {

FILE: 02_lambda/src/context.rs
  type Context (line 4) | pub struct Context {
    method bind (line 9) | pub fn bind(&mut self, hint: String) -> (Context, usize) {
    method lookup (line 20) | pub fn lookup(&self, key: String) -> Option<usize> {
    method size (line 29) | pub fn size(&self) -> usize {

FILE: 02_lambda/src/lexer.rs
  type Token (line 8) | pub enum Token {
  type Lexer (line 18) | pub struct Lexer<'s> {
  function new (line 24) | pub fn new(input: Chars<'s>) -> Lexer<'s> {
  function peek (line 35) | fn peek(&mut self) -> Option<char> {
  function consume (line 40) | fn consume(&mut self) -> Option<char> {
  function consume_while (line 57) | fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> Spanned<Str...
  function consume_delimiter (line 74) | fn consume_delimiter(&mut self) {
  function eat (line 78) | fn eat(&mut self, ch: char, token: Token) -> Option<Spanned<Token>> {
  function lex (line 85) | fn lex(&mut self) -> Option<Spanned<Token>> {
  type Item (line 98) | type Item = Spanned<Token>;
  method next (line 99) | fn next(&mut self) -> Option<Self::Item> {

FILE: 02_lambda/src/main.rs
  function shift1 (line 9) | fn shift1(d: isize, c: isize, tm: RcTerm) -> RcTerm {
  function shift (line 23) | fn shift(d: isize, tm: RcTerm) -> RcTerm {
  function subst_walk (line 27) | fn subst_walk(j: isize, s: RcTerm, c: isize, t: RcTerm) -> RcTerm {
  function subst (line 46) | fn subst(j: isize, s: RcTerm, tm: RcTerm) -> RcTerm {
  function term_subst_top (line 50) | fn term_subst_top(s: RcTerm, tm: RcTerm) -> RcTerm {
  function isval (line 54) | fn isval(_ctx: &Context, tm: RcTerm) -> bool {
  function eval1 (line 61) | fn eval1(ctx: &Context, tm: RcTerm) -> RcTerm {
  function main (line 82) | fn main() {

FILE: 02_lambda/src/parser.rs
  type RcTerm (line 10) | pub struct RcTerm(pub Rc<Term>);
    method from (line 13) | fn from(term: Term) -> RcTerm {
    method fmt (line 19) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
  type Target (line 35) | type Target = Term;
  method deref (line 36) | fn deref(&self) -> &Self::Target {
  type Term (line 42) | pub enum Term {
    method fmt (line 25) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
  type Parser (line 48) | pub struct Parser<'s> {
  function new (line 59) | pub fn new(input: &'s str) -> Parser<'s> {
  function consume (line 68) | fn consume(&mut self) -> Option<Spanned<Token>> {
  function expect (line 74) | fn expect(&mut self, token: Token) -> Option<Spanned<Token>> {
  function peek (line 86) | fn peek(&mut self) -> Option<Token> {
  function lambda (line 90) | fn lambda(&mut self) -> Option<RcTerm> {
  function term (line 121) | fn term(&mut self) -> Option<RcTerm> {
  function application (line 131) | fn application(&mut self) -> Option<RcTerm> {
  function atom (line 142) | fn atom(&mut self) -> Option<RcTerm> {
  function parse_term (line 165) | pub fn parse_term(&mut self) -> Option<RcTerm> {
  function ctx (line 169) | pub fn ctx(&self) -> &Context {
  function diagnostic (line 173) | pub fn diagnostic(self) -> Diagnostic<'s> {

FILE: 03_typedarith/src/ast.rs
  type Type (line 5) | pub enum Type {
  type Term (line 11) | pub enum Term {
  type TyError (line 22) | pub enum TyError {
  function typing (line 26) | pub fn typing(tm: RcTerm) -> Result<Type, TyError> {
  type RcTerm (line 59) | pub struct RcTerm(pub Rc<Term>);
    method fmt (line 62) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
    method from (line 68) | fn from(term: Term) -> RcTerm {
  type Target (line 74) | type Target = Term;
  method deref (line 75) | fn deref(&self) -> &Self::Target {

FILE: 03_typedarith/src/lexer.rs
  type Token (line 8) | pub enum Token {
  type TokenSpan (line 25) | pub struct TokenSpan {
    type Target (line 31) | type Target = Token;
    method deref (line 32) | fn deref(&self) -> &Self::Target {
  type Lexer (line 38) | pub struct Lexer<'s> {
  function new (line 44) | pub fn new(input: Chars<'s>) -> Lexer<'s> {
  function peek (line 55) | fn peek(&mut self) -> Option<char> {
  function consume (line 60) | fn consume(&mut self) -> Option<char> {
  function consume_while (line 77) | fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> Spanned<Str...
  function consume_delimiter (line 94) | fn consume_delimiter(&mut self) {
  function number (line 98) | fn number(&mut self) -> Option<TokenSpan> {
  function keyword (line 104) | fn keyword(&mut self) -> Option<TokenSpan> {
  function eat (line 121) | fn eat(&mut self, ch: char, token: Token) -> Option<TokenSpan> {
  function lex (line 131) | fn lex(&mut self) -> Option<TokenSpan> {
  type Item (line 145) | type Item = TokenSpan;
  method next (line 146) | fn next(&mut self) -> Option<Self::Item> {
  function valid (line 156) | fn valid() {
  function invalid (line 167) | fn invalid() {

FILE: 03_typedarith/src/main.rs
  function main (line 7) | fn main() {

FILE: 03_typedarith/src/parser.rs
  type Parser (line 7) | pub struct Parser<'s> {
  function new (line 17) | pub fn new(input: &'s str) -> Parser<'s> {
  function consume (line 25) | fn consume(&mut self) -> Option<Token> {
  function expect (line 31) | fn expect(&mut self, token: Token) -> Option<Token> {
  function parse_paren (line 38) | fn parse_paren(&mut self) -> Option<RcTerm> {
  function parse_if (line 44) | fn parse_if(&mut self) -> Option<RcTerm> {
  function parse_term (line 53) | pub fn parse_term(&mut self) -> Option<RcTerm> {
  function diagnostic (line 76) | pub fn diagnostic(self) -> Diagnostic<'s> {
  function baptize (line 82) | fn baptize(int: u32) -> Term {

FILE: 04_stlc/src/eval.rs
  type Error (line 6) | pub enum Error {
  function subst (line 11) | fn subst(mut val: Term, body: &mut Term) {
  function value (line 17) | fn value(ctx: &Context, term: &Term) -> bool {
  function eval1 (line 33) | fn eval1(ctx: &Context, term: Term) -> Result<Box<Term>, Error> {
  function eval (line 104) | pub fn eval(ctx: &Context, term: Term) -> Result<Term, Error> {

FILE: 04_stlc/src/lexer.rs
  type TokenKind (line 8) | pub enum TokenKind {
  type Token (line 43) | pub struct Token {
    method new (line 49) | pub const fn new(kind: TokenKind, span: Span) -> Token {
  type Lexer (line 55) | pub struct Lexer<'s> {
  function new (line 61) | pub fn new(input: Chars<'s>) -> Lexer<'s> {
  function peek (line 73) | fn peek(&mut self) -> Option<char> {
  function consume (line 78) | fn consume(&mut self) -> Option<char> {
  function consume_while (line 97) | fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> (String, Sp...
  function consume_delimiter (line 114) | fn consume_delimiter(&mut self) {
  function number (line 119) | fn number(&mut self) -> Token {
  function keyword (line 129) | fn keyword(&mut self) -> Token {
  function eat (line 156) | fn eat(&mut self, ch: char, kind: TokenKind) -> Token {
  function lex (line 166) | pub fn lex(&mut self) -> Token {
  type Item (line 197) | type Item = Token;
  method next (line 198) | fn next(&mut self) -> Option<Self::Item> {
  function valid (line 213) | fn valid() {
  function invalid (line 224) | fn invalid() {

FILE: 04_stlc/src/main.rs
  function ev (line 12) | fn ev(ctx: &mut Context, term: Term) -> Result<Term, eval::Error> {
  function parse (line 34) | fn parse(ctx: &mut Context, input: &str) {
  function main (line 47) | fn main() {

FILE: 04_stlc/src/parser.rs
  type DeBruijnIndexer (line 10) | pub struct DeBruijnIndexer {
    method push (line 15) | pub fn push(&mut self, hint: String) -> usize {
    method pop (line 25) | pub fn pop(&mut self) {
    method lookup (line 29) | pub fn lookup(&self, key: &str) -> Option<usize> {
  type Parser (line 39) | pub struct Parser<'s> {
  function new (line 50) | pub fn new(input: &'s str) -> Parser<'s> {
  function consume (line 59) | fn consume(&mut self) -> Option<Token> {
  function expect (line 65) | fn expect(&mut self, kind: TokenKind) -> Option<Token> {
  function expect_term (line 77) | fn expect_term(&mut self) -> Option<Box<Term>> {
  function peek (line 88) | fn peek(&mut self) -> Option<TokenKind> {
  function peek_span (line 92) | fn peek_span(&mut self) -> Span {
  function lambda (line 96) | fn lambda(&mut self) -> Option<Box<Term>> {
  function let_expr (line 113) | fn let_expr(&mut self) -> Option<Box<Term>> {
  function ty_record_field (line 125) | fn ty_record_field(&mut self) -> Option<RecordField> {
  function ty_atom (line 135) | fn ty_atom(&mut self) -> Option<Type> {
  function ty (line 173) | fn ty(&mut self) -> Option<Type> {
  function application (line 201) | fn application(&mut self) -> Option<Box<Term>> {
  function ident (line 216) | fn ident(&mut self) -> Option<String> {
  function record_field (line 228) | fn record_field(&mut self) -> Option<Field> {
  function record (line 241) | fn record(&mut self) -> Option<Box<Term>> {
  function if_expr (line 251) | fn if_expr(&mut self) -> Option<Box<Term>> {
  function atom (line 263) | fn atom(&mut self) -> Option<Box<Term>> {
  function term (line 322) | fn term(&mut self) -> Option<Box<Term>> {
  function parse_term (line 329) | pub fn parse_term(&mut self) -> Option<Box<Term>> {
  function diagnostic (line 333) | pub fn diagnostic(self) -> Diagnostic<'s> {

FILE: 04_stlc/src/term.rs
  type Field (line 6) | pub struct Field {
  type Term (line 18) | pub enum Term {
    method fmt (line 48) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  function record_access (line 38) | pub fn record_access(fields: &[Field], projection: &str) -> Option<Box<T...

FILE: 04_stlc/src/typing.rs
  type Type (line 5) | pub enum Type {
    method fmt (line 28) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type Record (line 14) | pub struct Record {
  type RecordField (line 21) | pub struct RecordField {
  type TypeError (line 49) | pub enum TypeError {
  type Context (line 65) | pub struct Context<'a> {
  function add (line 71) | pub fn add(&self, ty: Type) -> Context {
  function get (line 85) | pub fn get(&self, idx: usize) -> Option<&Type> {
  function type_of (line 95) | pub fn type_of(&self, term: &Term) -> Result<Type, TypeError> {

FILE: 04_stlc/src/visitor.rs
  type Visitor (line 5) | pub trait Visitor: Sized {
    method visit_var (line 6) | fn visit_var(&mut self, var: usize);
    method visit_abs (line 7) | fn visit_abs(&mut self, ty: Type, body: &Term);
    method visit_app (line 8) | fn visit_app(&mut self, t1: &Term, t2: &Term);
    method visit_if (line 9) | fn visit_if(&mut self, guard: &Term, csq: &Term, alt: &Term);
    method visit_let (line 10) | fn visit_let(&mut self, bind: &Term, body: &Term);
    method visit_succ (line 11) | fn visit_succ(&mut self, t: &Term);
    method visit_pred (line 12) | fn visit_pred(&mut self, t: &Term);
    method visit_iszero (line 13) | fn visit_iszero(&mut self, t: &Term);
    method visit_const (line 14) | fn visit_const(&mut self, c: &Term);
    method visit_record (line 15) | fn visit_record(&mut self, c: &[Field]);
    method visit_proj (line 16) | fn visit_proj(&mut self, c: &Term, proj: &str);
    method visit_typedecl (line 17) | fn visit_typedecl(&mut self, name: &str, ty: &Type);
  type MutVisitor (line 20) | pub trait MutVisitor: Sized {
    method visit_var (line 21) | fn visit_var(&mut self, var: &mut Term) {}
    method visit_abs (line 23) | fn visit_abs(&mut self, ty: &mut Type, body: &mut Term) {
    method visit_app (line 26) | fn visit_app(&mut self, t1: &mut Term, t2: &mut Term) {
    method visit_if (line 30) | fn visit_if(&mut self, guard: &mut Term, csq: &mut Term, alt: &mut Ter...
    method visit_let (line 35) | fn visit_let(&mut self, bind: &mut Term, body: &mut Term) {
    method visit_succ (line 39) | fn visit_succ(&mut self, t: &mut Term) {
    method visit_pred (line 42) | fn visit_pred(&mut self, t: &mut Term) {
    method visit_iszero (line 45) | fn visit_iszero(&mut self, t: &mut Term) {
    method visit_const (line 48) | fn visit_const(&mut self, t: &mut Term) {}
    method visit_record (line 49) | fn visit_record(&mut self, c: &mut [Field]) {
    method visit_proj (line 54) | fn visit_proj(&mut self, t: &mut Term, proj: &mut String) {
    method visit_typedecl (line 57) | fn visit_typedecl(&mut self, name: &mut String, ty: &mut Type) {}
    method visit_term (line 59) | fn visit_term(&mut self, term: &mut Term) {
    method visit_var (line 108) | fn visit_var(&mut self, var: &mut Term) {
    method visit_abs (line 124) | fn visit_abs(&mut self, ty_: &mut Type, body: &mut Term) {
    method visit_let (line 130) | fn visit_let(&mut self, bind: &mut Term, body: &mut Term) {
    method visit_var (line 151) | fn visit_var(&mut self, var: &mut Term) {
    method visit_abs (line 160) | fn visit_abs(&mut self, ty_: &mut Type, body: &mut Term) {
    method visit_let (line 166) | fn visit_let(&mut self, bind: &mut Term, body: &mut Term) {
  function walk_mut_term (line 64) | fn walk_mut_term<V: MutVisitor>(visitor: &mut V, var: &mut Term) {
  type Direction (line 81) | pub enum Direction {
  type Shifting (line 87) | pub struct Shifting {
    method new (line 102) | pub fn new(direction: Direction) -> Self {
  method default (line 93) | fn default() -> Self {
  type Substitution (line 139) | pub struct Substitution {
    method new (line 145) | pub fn new(term: Term) -> Substitution {

FILE: 05_recon/src/disjoint.rs
  type SetElement (line 7) | struct SetElement<T> {
  type DisjointSet (line 13) | pub struct DisjointSet<T> {
  method default (line 19) | fn default() -> Self {
  type Element (line 28) | pub struct Element(usize);
  type Choice (line 30) | pub enum Choice {
  function new (line 36) | pub fn new() -> DisjointSet<T> {
  function singleton (line 43) | pub fn singleton(&mut self, data: T) -> Element {
  function find_set (line 55) | fn find_set(&self, id: usize) -> usize {
  function find_repr (line 76) | pub fn find_repr(&self, element: Element) -> Element {
  function data (line 80) | pub fn data(&self, element: Element) -> Option<&T> {
  function find (line 84) | pub fn find(&self, element: Element) -> &T {
  function union (line 92) | pub fn union<F: Fn(T, T) -> T>(&mut self, f: F, a: Element, b: Element) {
  function partition (line 124) | pub fn partition(&self) -> Vec<&T> {
  type Variable (line 137) | type Variable = Element;
  type Unification (line 140) | pub enum Unification {
    method is_var (line 146) | fn is_var(&self) -> bool {
  type Unifier (line 155) | pub struct Unifier {
    method new (line 161) | pub fn new() -> Unifier {
    method occurs_check (line 168) | pub fn occurs_check(&self, v: TypeVar, u: &Unification) -> bool {
    method decode (line 175) | pub fn decode(&self, uni: &Unification) -> Type {
    method intern (line 184) | pub fn intern(&mut self, ty: Type) -> Variable {
    method var_bind (line 201) | fn var_bind(&mut self, v: TypeVar, v_: Variable, u: &Unification, u_: ...
    method subst (line 216) | pub fn subst(&self) -> HashMap<TypeVar, Type> {
    method unify (line 230) | pub fn unify(&mut self, a_: Variable, b_: Variable) -> Result<(), Stri...
  function solve (line 266) | pub fn solve<I: Iterator<Item = (Type, Type)>>(iter: I) -> Result<HashMa...
  function fmt (line 288) | fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {

FILE: 05_recon/src/main.rs
  type Term (line 11) | pub enum Term {
  type TypedTerm (line 23) | pub enum TypedTerm {
    method subst (line 201) | fn subst(self, s: &HashMap<TypeVar, Type>) -> TypedTerm {
  type SystemF (line 35) | pub struct SystemF<T = Type> {
    method new (line 56) | fn new(expr: TypedTerm, ty: Type) -> SystemF {
    method de (line 60) | fn de(self) -> (TypedTerm, Type) {
    method subst (line 214) | fn subst(self, s: &HashMap<TypeVar, Type>) -> SystemF {
  type Constraint (line 40) | pub enum Constraint {
  type Elaborator (line 47) | struct Elaborator {
    method fresh (line 66) | fn fresh(&mut self) -> TypeVar {
    method ftv (line 72) | fn ftv(&self) -> HashSet<TypeVar> {
    method get_scheme (line 80) | fn get_scheme(&self, index: usize) -> Option<&Scheme> {
    method generalize (line 89) | fn generalize(&mut self, ty: Type) -> Scheme {
    method instantiate (line 99) | fn instantiate(&mut self, scheme: Scheme) -> Type {
    method push (line 114) | fn push(&mut self, ty: (Type, Type)) {
    method elaborate (line 120) | fn elaborate(&mut self, term: &Term) -> SystemF {
  function main (line 222) | fn main() {

FILE: 05_recon/src/mutation/mod.rs
  type TypeVar (line 9) | pub struct TypeVar {
  type Type (line 15) | pub enum Type {
    method ftv (line 51) | fn ftv(&self, rank: usize) -> HashSet<usize> {
    method apply (line 78) | fn apply(self, map: &HashMap<usize, Type>) -> Type {
    method arrow (line 90) | pub fn arrow(a: Type, b: Type) -> Type {
    method bool (line 94) | pub fn bool() -> Type {
    method de_arrow (line 98) | pub fn de_arrow(&self) -> (&Type, &Type) {
  type Scheme (line 21) | pub enum Scheme {
  type TypedTerm (line 27) | pub enum TypedTerm {
  type SystemF (line 39) | pub struct SystemF {
    method new (line 45) | fn new(expr: TypedTerm, ty: Type) -> SystemF {
  function occurs_check (line 106) | pub fn occurs_check(v: &TypeVar, ty: &Type) -> bool {
  function var_bind (line 125) | fn var_bind(v: &TypeVar, ty: &Type) -> Result<(), String> {
  function unify_type (line 134) | fn unify_type(a: &Type, b: &Type) -> Result<(), String> {
  type Elaborator (line 160) | pub struct Elaborator {
    method fresh (line 167) | fn fresh(&mut self) -> TypeVar {
    method get_scheme (line 176) | fn get_scheme(&self, index: usize) -> Option<&Scheme> {
    method generalize (line 185) | fn generalize(&mut self, ty: Type) -> Scheme {
    method instantiate (line 195) | fn instantiate(&mut self, scheme: Scheme) -> Type {
    method elaborate (line 208) | pub fn elaborate(&mut self, term: &Term) -> SystemF {

FILE: 05_recon/src/mutation/write_once.rs
  type WriteOnce (line 5) | pub struct WriteOnce<T> {
  type WriteOnceCell (line 11) | pub type WriteOnceCell<T> = Rc<WriteOnce<T>>;
  method default (line 14) | fn default() -> Self {
  method eq (line 24) | fn eq(&self, other: &Self) -> bool {
  function fmt (line 30) | fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
  function with_rank (line 36) | pub fn with_rank(rank: usize) -> Self {
  function set (line 44) | pub fn set(&self, data: T) -> Result<(), T> {
  function get (line 56) | pub fn get(&self) -> Option<&T> {
  function set_rank (line 64) | pub fn set_rank(&self, rank: usize) {
  function get_rank (line 68) | pub fn get_rank(&self) -> usize {
  function smoke (line 77) | fn smoke() {
  function smoke_shared (line 86) | fn smoke_shared() {
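
`write_once.rs` exposes a cell whose `set` returns `Result<(), T>`, i.e. the first write wins and later writes hand the rejected value back. A simplified sketch of that contract using `RefCell` (the repo's internals and `Rc`-based sharing are not reproduced here):

```rust
use std::cell::RefCell;

// A write-once cell: `set` succeeds only the first time; subsequent writes
// return the rejected value as Err. Illustrative, not the repo's exact code.
pub struct WriteOnce<T> {
    data: RefCell<Option<T>>,
}

impl<T> WriteOnce<T> {
    pub fn new() -> Self {
        WriteOnce { data: RefCell::new(None) }
    }

    /// First write stores `value`; any later write is rejected.
    pub fn set(&self, value: T) -> Result<(), T> {
        let mut slot = self.data.borrow_mut();
        if slot.is_some() {
            Err(value)
        } else {
            *slot = Some(value);
            Ok(())
        }
    }

    /// Read a copy of the stored value, if any.
    pub fn get(&self) -> Option<T>
    where
        T: Clone,
    {
        self.data.borrow().clone()
    }
}
```

Wrapped in an `Rc` (as the `WriteOnceCell<T>` alias suggests), several holders can observe the single assignment, which is one way to model destructive-but-sound type-variable binding.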

FILE: 05_recon/src/naive.rs
  function var_bind (line 3) | fn var_bind(var: TypeVar, ty: Type) -> Result<HashMap<TypeVar, Type>, St...
  function unify (line 17) | pub fn unify(a: Type, b: Type) -> Result<HashMap<TypeVar, Type>, String> {
  function solve (line 36) | pub fn solve<I: Iterator<Item = (Type, Type)>>(iter: I) -> Result<HashMa...
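
`naive.rs` pairs `var_bind` (with an occurs check) and `unify` returning a substitution map. The classic substitution-based algorithm behind that interface can be sketched over a toy type language; the `Ty` enum and names below are illustrative, not the repo's definitions:

```rust
use std::collections::HashMap;

// Naive substitution-based unification over a tiny type language.
#[derive(Clone, Debug, PartialEq)]
enum Ty {
    Var(u32),
    Bool,
    Arrow(Box<Ty>, Box<Ty>),
}

fn apply(map: &HashMap<u32, Ty>, ty: &Ty) -> Ty {
    match ty {
        Ty::Var(v) => map.get(v).cloned().unwrap_or(Ty::Var(*v)),
        Ty::Bool => Ty::Bool,
        Ty::Arrow(a, b) => Ty::Arrow(Box::new(apply(map, a)), Box::new(apply(map, b))),
    }
}

/// Does variable `v` occur inside `ty`? Binding it would build an
/// infinite type, so unification must fail.
fn occurs(v: u32, ty: &Ty) -> bool {
    match ty {
        Ty::Var(w) => v == *w,
        Ty::Bool => false,
        Ty::Arrow(a, b) => occurs(v, a) || occurs(v, b),
    }
}

fn unify(a: &Ty, b: &Ty) -> Result<HashMap<u32, Ty>, String> {
    match (a, b) {
        (Ty::Bool, Ty::Bool) => Ok(HashMap::new()),
        (Ty::Var(v), Ty::Var(w)) if v == w => Ok(HashMap::new()),
        (Ty::Var(v), t) | (t, Ty::Var(v)) => {
            if occurs(*v, t) {
                return Err(format!("occurs check failed for var {}", v));
            }
            let mut m = HashMap::new();
            m.insert(*v, t.clone());
            Ok(m)
        }
        (Ty::Arrow(a1, a2), Ty::Arrow(b1, b2)) => {
            let s1 = unify(a1, b1)?;
            // Apply what we learned before unifying the codomains.
            let s2 = unify(&apply(&s1, a2), &apply(&s1, b2))?;
            // Compose: push s2 through the range of s1, then extend with s2.
            let mut out: HashMap<u32, Ty> =
                s1.into_iter().map(|(v, t)| (v, apply(&s2, &t))).collect();
            out.extend(s2);
            Ok(out)
        }
        _ => Err(format!("cannot unify {:?} with {:?}", a, b)),
    }
}
```

`solve` then just folds this over a stream of `(Type, Type)` constraints, composing the substitutions as it goes.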

FILE: 05_recon/src/parser.rs
  type TokenKind (line 9) | pub enum TokenKind {
  type Token (line 30) | pub struct Token {
    method new (line 36) | pub const fn new(kind: TokenKind, span: Span) -> Token {
  type Lexer (line 42) | pub struct Lexer<'s> {
  function new (line 48) | pub fn new(input: Chars<'s>) -> Lexer<'s> {
  function peek (line 60) | fn peek(&mut self) -> Option<char> {
  function consume (line 65) | fn consume(&mut self) -> Option<char> {
  function consume_while (line 84) | fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> (String, Sp...
  function consume_delimiter (line 101) | fn consume_delimiter(&mut self) {
  function number (line 106) | fn number(&mut self) -> Token {
  function keyword (line 116) | fn keyword(&mut self) -> Token {
  function eat (line 136) | fn eat(&mut self, ch: char, kind: TokenKind) -> Token {
  function lex (line 146) | pub fn lex(&mut self) -> Token {
  type Item (line 167) | type Item = Token;
  method next (line 168) | fn next(&mut self) -> Option<Self::Item> {
  type DeBruijnIndexer (line 179) | pub struct DeBruijnIndexer {
    method push (line 184) | pub fn push(&mut self, hint: String) -> usize {
    method pop (line 194) | pub fn pop(&mut self) {
    method lookup (line 198) | pub fn lookup(&self, key: &str) -> Option<usize> {
  type Parser (line 208) | pub struct Parser<'s> {
  function new (line 218) | pub fn new(input: &'s str) -> Parser<'s> {
  function consume (line 226) | fn consume(&mut self) -> Option<Token> {
  function expect (line 232) | fn expect(&mut self, kind: TokenKind) -> Option<Token> {
  function expect_term (line 243) | fn expect_term(&mut self) -> Option<Box<Term>> {
  function peek (line 254) | fn peek(&mut self) -> Option<TokenKind> {
  function peek_span (line 258) | fn peek_span(&mut self) -> Span {
  function lambda (line 262) | fn lambda(&mut self) -> Option<Box<Term>> {
  function let_expr (line 276) | fn let_expr(&mut self) -> Option<Box<Term>> {
  function application (line 292) | fn application(&mut self) -> Option<Box<Term>> {
  function ident (line 301) | fn ident(&mut self) -> Option<String> {
  function if_expr (line 312) | fn if_expr(&mut self) -> Option<Box<Term>> {
  function atom (line 324) | fn atom(&mut self) -> Option<Box<Term>> {
  function term (line 365) | fn term(&mut self) -> Option<Box<Term>> {
  function parse_term (line 372) | pub fn parse_term(&mut self) -> Option<Box<Term>> {
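
The parser's `DeBruijnIndexer` (`push`/`pop`/`lookup`) converts named binders to De Bruijn indices during parsing: a variable's index is its distance to the nearest enclosing binder. A standalone sketch of that scheme, with illustrative field names:

```rust
// Map names to De Bruijn indices: 0 is the innermost enclosing binder.
#[derive(Default)]
pub struct DeBruijnIndexer {
    scopes: Vec<String>,
}

impl DeBruijnIndexer {
    /// Enter a binder for `name`.
    pub fn push(&mut self, name: String) {
        self.scopes.push(name);
    }

    /// Leave the innermost binder.
    pub fn pop(&mut self) {
        self.scopes.pop();
    }

    /// Distance from the innermost binder, if the name is in scope.
    pub fn lookup(&self, name: &str) -> Option<usize> {
        self.scopes.iter().rev().position(|n| n == name)
    }
}
```

So while parsing `\x. \y. x`, the occurrence of `x` is looked up under two binders and resolves to index 1.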

FILE: 05_recon/src/types.rs
  type TypeVar (line 4) | pub struct TypeVar(pub u32, pub u32);
    method fmt (line 143) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
  type Tycon (line 7) | pub struct Tycon {
    method fmt (line 123) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
  type Type (line 13) | pub enum Type {
    method arrow (line 59) | pub fn arrow(a: Type, b: Type) -> Type {
    method bool (line 63) | pub fn bool() -> Type {
    method occurs (line 67) | pub fn occurs(&self, exist: TypeVar) -> bool {
    method de_arrow (line 74) | pub fn de_arrow(&self) -> (&Type, &Type) {
    method fmt (line 149) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
  type Scheme (line 19) | pub enum Scheme {
  type Substitution (line 24) | pub trait Substitution {
    method ftv (line 25) | fn ftv(&self) -> HashSet<TypeVar>;
    method apply (line 26) | fn apply(self, s: &HashMap<TypeVar, Type>) -> Self;
    method ftv (line 30) | fn ftv(&self) -> HashSet<TypeVar> {
    method apply (line 50) | fn apply(self, map: &HashMap<TypeVar, Type>) -> Type {
    method ftv (line 96) | fn ftv(&self) -> HashSet<TypeVar> {
    method apply (line 103) | fn apply(self, map: &HashMap<TypeVar, Type>) -> Scheme {
  function compose (line 82) | pub fn compose(s1: HashMap<TypeVar, Type>, s2: HashMap<TypeVar, Type>) -...
  constant T_ARROW (line 117) | pub const T_ARROW: Tycon = Tycon { id: 0, arity: 2 };
  constant T_INT (line 118) | pub const T_INT: Tycon = Tycon { id: 1, arity: 0 };
  constant T_UNIT (line 119) | pub const T_UNIT: Tycon = Tycon { id: 2, arity: 0 };
  constant T_BOOL (line 120) | pub const T_BOOL: Tycon = Tycon { id: 3, arity: 0 };
  function fresh_name (line 134) | fn fresh_name(x: u32) -> String {
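
`types.rs` exposes `compose` over two substitution maps; the usual contract is that applying the composed map equals applying the first substitution and then the second. A sketch of that contract over a toy type language (names and the tie-breaking choice on overlapping keys are assumptions, not necessarily the repo's):

```rust
use std::collections::HashMap;

#[derive(Clone, Debug, PartialEq)]
enum Ty {
    Var(u32),
    Int,
    Arrow(Box<Ty>, Box<Ty>),
}

fn apply(map: &HashMap<u32, Ty>, ty: &Ty) -> Ty {
    match ty {
        Ty::Var(v) => map.get(v).cloned().unwrap_or(Ty::Var(*v)),
        Ty::Int => Ty::Int,
        Ty::Arrow(a, b) => Ty::Arrow(Box::new(apply(map, a)), Box::new(apply(map, b))),
    }
}

/// Compose substitutions so that `apply(compose(s1, s2), t)` behaves like
/// `apply(s2, apply(s1, t))`: push `s2` through the range of `s1`, then add
/// the bindings of `s2` that `s1` does not already cover.
fn compose(s1: HashMap<u32, Ty>, s2: HashMap<u32, Ty>) -> HashMap<u32, Ty> {
    let mut out: HashMap<u32, Ty> = s1
        .into_iter()
        .map(|(v, t)| (v, apply(&s2, &t)))
        .collect();
    for (v, t) in s2 {
        out.entry(v).or_insert(t);
    }
    out
}
```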

FILE: 06_system_f/src/diagnostics.rs
  type Level (line 3) | pub enum Level {
  type Annotation (line 9) | pub struct Annotation {
    method new (line 23) | pub fn new<S: Into<String>>(span: Span, message: S) -> Annotation {
  type Diagnostic (line 15) | pub struct Diagnostic {
    method error (line 32) | pub fn error<S: Into<String>>(span: Span, message: S) -> Diagnostic {
    method warn (line 41) | pub fn warn<S: Into<String>>(span: Span, message: S) -> Diagnostic {
    method message (line 50) | pub fn message<S: Into<String>>(mut self, span: Span, message: S) -> D...
    method info (line 55) | pub fn info<S: Into<String>>(mut self, info: S) -> Diagnostic {
    method lines (line 60) | pub fn lines(&self) -> std::ops::Range<u32> {

FILE: 06_system_f/src/eval.rs
  type Eval (line 7) | pub struct Eval<'ctx> {
  function with_context (line 12) | pub fn with_context(_context: &Context) -> Eval<'_> {
  function normal_form (line 16) | fn normal_form(&self, term: &Term) -> bool {
  function eval_primitive (line 31) | fn eval_primitive(&self, p: Primitive, term: Term) -> Option<Term> {
  function small_step (line 52) | pub fn small_step(&self, term: Term) -> Option<Term> {
  function case_subst (line 205) | fn case_subst(&self, pat: &Pattern, expr: &Term, term: &mut Term) {
  function term_subst (line 237) | fn term_subst(mut s: Term, t: &mut Term) {
  function type_subst (line 243) | fn type_subst(s: Type, t: &mut Term) {
  function literal (line 254) | fn literal() {
  function application (line 261) | fn application() {
  function type_application (line 275) | fn type_application() {
  function projection (line 290) | fn projection() {

FILE: 06_system_f/src/main.rs
  function test_variant (line 20) | fn test_variant() -> Type {
  function code_format (line 37) | pub fn code_format(src: &str, diag: Diagnostic) {
  function eval (line 62) | fn eval(ctx: &mut types::Context, mut term: Term, verbose: bool) -> Resu...
  function parse_and_eval (line 91) | fn parse_and_eval(ctx: &mut types::Context, input: &str, verbose: bool) ...
  function nat_list (line 119) | fn nat_list() -> Type {
  function nat_list2 (line 126) | fn nat_list2() -> Type {
  function main (line 133) | fn main() {

FILE: 06_system_f/src/patterns/mod.rs
  type Pattern (line 8) | pub enum Pattern {
    method matches (line 60) | pub fn matches(&self, term: &Term) -> bool {
  type PatVarStack (line 22) | pub struct PatVarStack {
    method collect (line 27) | pub fn collect(pat: &mut Pattern) -> Vec<String> {
  method visit_variable (line 35) | fn visit_variable(&mut self, var: &String) {
  type PatternCount (line 42) | pub struct PatternCount(usize);
    method collect (line 45) | pub fn collect(pat: &mut Pattern) -> usize {
  method visit_variable (line 53) | fn visit_variable(&mut self, var: &String) {
  type PatTyStack (line 91) | pub struct PatTyStack<'ty> {
  function collect (line 97) | pub fn collect(ty: &'ty Type, pat: &Pattern) -> Vec<&'ty Type> {
  method visit_product (line 108) | fn visit_product(&mut self, pats: &Vec<Pattern>) {
  method visit_constructor (line 119) | fn visit_constructor(&mut self, label: &String, pat: &Pattern) {
  method visit_pattern (line 128) | fn visit_pattern(&mut self, pattern: &Pattern) {
  function pattern_count (line 142) | fn pattern_count() {
  function pattern_ty_stack (line 148) | fn pattern_ty_stack() {
  function pattern_var_stack (line 155) | fn pattern_var_stack() {

FILE: 06_system_f/src/syntax/lexer.rs
  type Lexer (line 8) | pub struct Lexer<'s> {
  function new (line 14) | pub fn new(input: Chars<'s>) -> Lexer<'s> {
  function peek (line 26) | fn peek(&mut self) -> Option<char> {
  function consume (line 31) | fn consume(&mut self) -> Option<char> {
  function consume_while (line 50) | fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> (String, Sp...
  function consume_delimiter (line 67) | fn consume_delimiter(&mut self) {
  function number (line 72) | fn number(&mut self) -> Token {
  function keyword (line 82) | fn keyword(&mut self) -> Token {
  function eat (line 127) | fn eat(&mut self, ch: char, kind: TokenKind) -> Token {
  function lex (line 137) | pub fn lex(&mut self) -> Token {
  type Item (line 174) | type Item = Token;
  method next (line 175) | fn next(&mut self) -> Option<Self::Item> {
  function nested (line 190) | fn nested() {
  function case (line 201) | fn case() {

FILE: 06_system_f/src/syntax/mod.rs
  type TokenKind (line 7) | pub enum TokenKind {
  type Token (line 58) | pub struct Token {
    method dummy (line 64) | pub const fn dummy() -> Token {
    method new (line 71) | pub const fn new(kind: TokenKind, span: Span) -> Token {

FILE: 06_system_f/src/syntax/parser.rs
  type DeBruijnIndexer (line 13) | pub struct DeBruijnIndexer {
    method push (line 18) | pub fn push(&mut self, hint: String) -> usize {
    method pop (line 24) | pub fn pop(&mut self) {
    method lookup (line 28) | pub fn lookup(&self, key: &str) -> Option<usize> {
    method len (line 37) | pub fn len(&self) -> usize {
  type Parser (line 42) | pub struct Parser<'s> {
  type Error (line 52) | pub struct Error {
  type ErrorKind (line 59) | pub enum ErrorKind {
  function new (line 71) | pub fn new(input: &'s str) -> Parser<'s> {
  function diagnostic (line 84) | pub fn diagnostic(self) -> Diagnostic<'s> {
  function once_or_more (line 91) | fn once_or_more<T, F>(&mut self, func: F, delimiter: TokenKind) -> Resul...
  function once (line 106) | fn once<T, F>(&mut self, func: F, message: &str) -> Result<T, Error>
  function error (line 121) | fn error<T>(&self, kind: ErrorKind) -> Result<T, Error> {
  function bump (line 129) | fn bump(&mut self) -> TokenKind {
  function bump_if (line 135) | fn bump_if(&mut self, kind: &TokenKind) -> bool {
  function expect (line 144) | fn expect(&mut self, kind: TokenKind) -> Result<(), Error> {
  function kind (line 157) | fn kind(&self) -> &TokenKind {
  function ty_variant (line 161) | fn ty_variant(&mut self) -> Result<Variant, Error> {
  function ty_app (line 171) | fn ty_app(&mut self) -> Result<Type, Error> {
  function ty_atom (line 180) | fn ty_atom(&mut self) -> Result<Type, Error> {
  function ty_tuple (line 230) | fn ty_tuple(&mut self) -> Result<Type, Error> {
  function ty (line 245) | pub fn ty(&mut self) -> Result<Type, Error> {
  function tyabs (line 270) | fn tyabs(&mut self) -> Result<Term, Error> {
  function tmabs (line 278) | fn tmabs(&mut self) -> Result<Term, Error> {
  function fold (line 291) | fn fold(&mut self) -> Result<Term, Error> {
  function unfold (line 299) | fn unfold(&mut self) -> Result<Term, Error> {
  function fix (line 307) | fn fix(&mut self) -> Result<Term, Error> {
  function letexpr (line 314) | fn letexpr(&mut self) -> Result<Term, Error> {
  function lambda (line 337) | fn lambda(&mut self) -> Result<Term, Error> {
  function paren (line 350) | fn paren(&mut self) -> Result<Term, Error> {
  function uppercase_id (line 363) | fn uppercase_id(&mut self) -> Result<String, Error> {
  function lowercase_id (line 374) | fn lowercase_id(&mut self) -> Result<String, Error> {
  function literal (line 385) | fn literal(&mut self) -> Result<Term, Error> {
  function primitive (line 396) | fn primitive(&mut self) -> Result<Term, Error> {
  function pat_atom (line 409) | fn pat_atom(&mut self) -> Result<Pattern, Error> {
  function pattern (line 451) | fn pattern(&mut self) -> Result<Pattern, Error> {
  function case_arm (line 469) | fn case_arm(&mut self) -> Result<Arm, Error> {
  function case (line 505) | fn case(&mut self) -> Result<Term, Error> {
  function injection (line 517) | fn injection(&mut self) -> Result<Term, Error> {
  function pack (line 533) | fn pack(&mut self) -> Result<Term, Error> {
  function unpack (line 548) | fn unpack(&mut self) -> Result<Term, Error> {
  function atom (line 569) | fn atom(&mut self) -> Result<Term, Error> {
  function projection (line 602) | fn projection(&mut self) -> Result<Term, Error> {
  function application (line 623) | fn application(&mut self) -> Result<Term, Error> {
  function parse (line 650) | pub fn parse(&mut self) -> Result<Term, Error> {

FILE: 06_system_f/src/terms/mod.rs
  type Term (line 9) | pub struct Term {
    method new (line 86) | pub fn new(kind: Kind, span: Span) -> Term {
    method unit (line 91) | pub const fn unit() -> Term {
    method span (line 100) | pub fn span(&self) -> Span {
    method kind (line 105) | pub fn kind(&self) -> &Kind {
    method fmt (line 121) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
    method fmt (line 159) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type Primitive (line 16) | pub enum Primitive {
  type Kind (line 24) | pub enum Kind {
  type Arm (line 71) | pub struct Arm {
  type Literal (line 79) | pub enum Literal {
    method fmt (line 111) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  function pattern_matches (line 169) | fn pattern_matches() {

FILE: 06_system_f/src/terms/visit.rs
  type Shift (line 7) | pub struct Shift {
    method new (line 13) | pub const fn new(shift: isize) -> Shift {
  method visit_var (line 19) | fn visit_var(&mut self, sp: &mut Span, var: &mut usize) {
  method visit_abs (line 25) | fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
  method visit_let (line 31) | fn visit_let(&mut self, sp: &mut Span, pat: &mut Pattern, t1: &mut Term,...
  method visit_case (line 39) | fn visit_case(&mut self, sp: &mut Span, term: &mut Term, arms: &mut Vec<...
  method visit_unpack (line 49) | fn visit_unpack(&mut self, _: &mut Span, package: &mut Term, term: &mut ...
  type Subst (line 57) | pub struct Subst {
    method new (line 63) | pub fn new(term: Term) -> Subst {
  method visit_abs (line 69) | fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
  method visit_let (line 75) | fn visit_let(&mut self, sp: &mut Span, pat: &mut Pattern, t1: &mut Term,...
  method visit_case (line 83) | fn visit_case(&mut self, sp: &mut Span, term: &mut Term, arms: &mut Vec<...
  method visit_unpack (line 93) | fn visit_unpack(&mut self, _: &mut Span, package: &mut Term, term: &mut ...
  method visit (line 100) | fn visit(&mut self, term: &mut Term) {
  type TyTermSubst (line 112) | pub struct TyTermSubst {
    method new (line 118) | pub fn new(ty: Type) -> TyTermSubst {
    method visit_ty (line 125) | fn visit_ty(&mut self, ty: &mut Type) {
  method visit_abs (line 135) | fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
  method visit_tyapp (line 142) | fn visit_tyapp(&mut self, sp: &mut Span, term: &mut Term, ty: &mut Type) {
  method visit_tyabs (line 147) | fn visit_tyabs(&mut self, sp: &mut Span, term: &mut Term) {
  method visit_fold (line 153) | fn visit_fold(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
  method visit_unfold (line 158) | fn visit_unfold(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
  method visit_unpack (line 163) | fn visit_unpack(&mut self, _: &mut Span, package: &mut Term, term: &mut ...
  method visit_pack (line 170) | fn visit_pack(&mut self, _: &mut Span, wit: &mut Type, body: &mut Term, ...
  method visit_injection (line 176) | fn visit_injection(&mut self, sp: &mut Span, label: &mut String, term: &...
  type InjRewriter (line 187) | pub struct InjRewriter;
  method visit (line 190) | fn visit(&mut self, term: &mut Term) {

FILE: 06_system_f/src/types/mod.rs
  type Type (line 14) | pub enum Type {
    method fmt (line 406) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type Variant (line 29) | pub struct Variant {
  type TypeError (line 35) | pub struct TypeError {
  type TypeErrorKind (line 41) | pub enum TypeErrorKind {
  type Context (line 58) | pub struct Context {
    method push (line 64) | fn push(&mut self, ty: Type) {
    method pop (line 68) | fn pop(&mut self) {
    method find (line 72) | fn find(&self, idx: usize) -> Option<&Type> {
    method alias (line 76) | pub fn alias(&mut self, alias: String, ty: Type) {
    method aliaser (line 80) | fn aliaser(&self) -> Aliaser<'_> {
    method de_alias (line 84) | pub fn de_alias(&mut self, term: &mut Term) {
    method type_check (line 108) | pub fn type_check(&mut self, term: &Term) -> Result<Type, Diagnostic> {
  function variant_field (line 90) | pub fn variant_field<'vs>(var: &'vs [Variant], label: &str, span: Span) ...
  function subst (line 346) | pub fn subst(mut s: Type, mut t: Type) -> Type {
  type Aliaser (line 353) | struct Aliaser<'ctx> {
  method visit (line 358) | fn visit(&mut self, ty: &mut Type) {
  method visit_abs (line 379) | fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
  method visit_tyapp (line 384) | fn visit_tyapp(&mut self, sp: &mut Span, term: &mut Term, ty: &mut Type) {
  method visit_injection (line 389) | fn visit_injection(&mut self, sp: &mut Span, label: &mut String, term: &...
  method visit_fold (line 394) | fn visit_fold(&mut self, sp: &mut Span, ty: &mut Type, tm: &mut Term) {
  method visit_unfold (line 399) | fn visit_unfold(&mut self, sp: &mut Span, ty: &mut Type, tm: &mut Term) {

FILE: 06_system_f/src/types/patterns.rs
  function overlap (line 24) | fn overlap(existing: &Pattern, new: &Pattern) -> bool {
  type Matrix (line 43) | pub struct Matrix<'pat> {
  function new (line 51) | pub fn new(expr_ty: Type) -> Matrix<'pat> {
  function exhaustive (line 77) | pub fn exhaustive(&self) -> bool {
  function can_add_row (line 123) | fn can_add_row(&self, new_row: Vec<&'pat Pattern>) -> bool {
  function try_add_row (line 133) | fn try_add_row(&mut self, new_row: Vec<&'pat Pattern>) -> bool {
  function add_pattern (line 148) | pub fn add_pattern(&mut self, pat: &'pat Pattern) -> bool {
  method type_check_case (line 192) | pub(crate) fn type_check_case(&mut self, expr: &Term, arms: &[Arm]) -> R...
  method pattern_type_eq (line 255) | pub(crate) fn pattern_type_eq(&self, pat: &Pattern, ty: &Type) -> bool {
  function product (line 296) | fn product() {
  function product_mistyped (line 305) | fn product_mistyped() {
  function constructor (line 313) | fn constructor() {
  function constructor_product (line 336) | fn constructor_product() {
  function matrix_tuple (line 363) | fn matrix_tuple() {
  function matrix_constructor (line 382) | fn matrix_constructor() {
  function matrix_bool (line 413) | fn matrix_bool() {

FILE: 06_system_f/src/types/visit.rs
  type Shift (line 5) | pub struct Shift {
    method new (line 11) | pub const fn new(shift: isize) -> Shift {
  method visit_var (line 17) | fn visit_var(&mut self, var: &mut usize) {
  method visit_universal (line 23) | fn visit_universal(&mut self, inner: &mut Type) {
  method visit_existential (line 29) | fn visit_existential(&mut self, inner: &mut Type) {
  method visit_rec (line 35) | fn visit_rec(&mut self, ty: &mut Type) {
  type Subst (line 42) | pub struct Subst {
    method new (line 48) | pub fn new(ty: Type) -> Subst {
  method visit_universal (line 54) | fn visit_universal(&mut self, inner: &mut Type) {
  method visit_existential (line 60) | fn visit_existential(&mut self, inner: &mut Type) {
  method visit_rec (line 66) | fn visit_rec(&mut self, ty: &mut Type) {
  method visit (line 72) | fn visit(&mut self, ty: &mut Type) {
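
The `Shift` visitor above implements De Bruijn shifting: free variables (those at or above a cutoff) move by the shift amount, bound variables stay put, and the cutoff grows by one under each binder (`visit_universal`, `visit_existential`, `visit_rec`). The same operation written as a plain recursive function over a cut-down type language (the enum here is illustrative):

```rust
#[derive(Clone, Debug, PartialEq)]
enum Ty {
    Var(usize),
    Arrow(Box<Ty>, Box<Ty>),
    Universal(Box<Ty>),
}

/// Shift free variables (index >= cutoff) by `d`; bound variables are left
/// alone. Each binder increments the cutoff for its body.
fn shift(ty: &Ty, d: isize, cutoff: usize) -> Ty {
    match ty {
        Ty::Var(v) => {
            if *v >= cutoff {
                Ty::Var((*v as isize + d) as usize)
            } else {
                Ty::Var(*v)
            }
        }
        Ty::Arrow(a, b) => Ty::Arrow(
            Box::new(shift(a, d, cutoff)),
            Box::new(shift(b, d, cutoff)),
        ),
        // The universal binds one variable, so its body sees cutoff + 1.
        Ty::Universal(inner) => Ty::Universal(Box::new(shift(inner, d, cutoff + 1))),
    }
}
```

The visitor formulation in `types/visit.rs` factors this same recursion into overridable methods so `Shift` and `Subst` can share the traversal.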

FILE: 06_system_f/src/visit.rs
  type MutTypeVisitor (line 7) | pub trait MutTypeVisitor: Sized {
    method visit_var (line 8) | fn visit_var(&mut self, var: &mut usize) {}
    method visit_alias (line 9) | fn visit_alias(&mut self, alias: &mut String) {}
    method visit_arrow (line 11) | fn visit_arrow(&mut self, ty1: &mut Type, ty2: &mut Type) {
    method visit_universal (line 16) | fn visit_universal(&mut self, inner: &mut Type) {
    method visit_existential (line 20) | fn visit_existential(&mut self, inner: &mut Type) {
    method visit_variant (line 24) | fn visit_variant(&mut self, variant: &mut Vec<Variant>) {
    method visit_product (line 30) | fn visit_product(&mut self, product: &mut Vec<Type>) {
    method visit_rec (line 36) | fn visit_rec(&mut self, ty: &mut Type) {
    method visit (line 40) | fn visit(&mut self, ty: &mut Type) {
  type MutTermVisitor (line 55) | pub trait MutTermVisitor: Sized {
    method visit_lit (line 56) | fn visit_lit(&mut self, sp: &mut Span, lit: &mut Literal) {}
    method visit_var (line 57) | fn visit_var(&mut self, sp: &mut Span, var: &mut usize) {}
    method visit_abs (line 59) | fn visit_abs(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
    method visit_app (line 63) | fn visit_app(&mut self, sp: &mut Span, t1: &mut Term, t2: &mut Term) {
    method visit_let (line 68) | fn visit_let(&mut self, sp: &mut Span, pat: &mut Pattern, t1: &mut Ter...
    method visit_tyabs (line 73) | fn visit_tyabs(&mut self, sp: &mut Span, term: &mut Term) {
    method visit_tyapp (line 77) | fn visit_tyapp(&mut self, sp: &mut Span, term: &mut Term, ty: &mut Typ...
    method visit_primitive (line 81) | fn visit_primitive(&mut self, sp: &mut Span, prim: &mut Primitive) {}
    method visit_injection (line 82) | fn visit_injection(&mut self, sp: &mut Span, label: &mut String, term:...
    method visit_case (line 86) | fn visit_case(&mut self, sp: &mut Span, term: &mut Term, arms: &mut Ve...
    method visit_product (line 93) | fn visit_product(&mut self, sp: &mut Span, product: &mut Vec<Term>) {
    method visit_projection (line 99) | fn visit_projection(&mut self, sp: &mut Span, term: &mut Term, index: ...
    method visit_fold (line 103) | fn visit_fold(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Term) {
    method visit_unfold (line 106) | fn visit_unfold(&mut self, sp: &mut Span, ty: &mut Type, term: &mut Te...
    method visit_pack (line 110) | fn visit_pack(&mut self, sp: &mut Span, witness: &mut Type, evidence: ...
    method visit_unpack (line 114) | fn visit_unpack(&mut self, sp: &mut Span, package: &mut Term, term: &m...
    method visit (line 119) | fn visit(&mut self, term: &mut Term) {
    method walk (line 123) | fn walk(&mut self, term: &mut Term) {
  type PatternVisitor (line 148) | pub trait PatternVisitor: Sized {
    method visit_literal (line 149) | fn visit_literal(&mut self, lit: &Literal) {}
    method visit_variable (line 150) | fn visit_variable(&mut self, var: &String) {}
    method visit_product (line 151) | fn visit_product(&mut self, pats: &Vec<Pattern>) {
    method visit_constructor (line 157) | fn visit_constructor(&mut self, label: &String, pat: &Pattern) {
    method visit_pattern (line 161) | fn visit_pattern(&mut self, pattern: &Pattern) {

FILE: 07_system_fw/src/diagnostics.rs
  type Level (line 5) | pub enum Level {
  type Annotation (line 11) | pub struct Annotation {
    method new (line 25) | pub fn new<S: Into<String>>(span: Span, message: S) -> Annotation {
  type Diagnostic (line 17) | pub struct Diagnostic {
    method error (line 34) | pub fn error<S: Into<String>>(span: Span, message: S) -> Diagnostic {
    method warn (line 43) | pub fn warn<S: Into<String>>(span: Span, message: S) -> Diagnostic {
    method message (line 52) | pub fn message<S: Into<String>>(mut self, span: Span, message: S) -> D...
    method info (line 57) | pub fn info<S: Into<String>>(mut self, info: S) -> Diagnostic {
    method lines (line 62) | pub fn lines(&self) -> std::ops::Range<u32> {
    method fmt (line 81) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {

FILE: 07_system_fw/src/elaborate.rs
  type ElaborationContext (line 13) | pub struct ElaborationContext<'s> {
  type Elaborated (line 24) | pub struct Elaborated {
  type Namespace (line 31) | pub struct Namespace {
  type ElabError (line 40) | pub enum ElabError {
  function new (line 49) | pub fn new() -> Self {
  function elaborate (line 56) | pub fn elaborate(program: &'s Program) -> Result<Elaborated, ElabError> {
  function with_tyvars (line 69) | fn with_tyvars<T, F: Fn(&mut ElaborationContext<'s>) -> T>(&mut self, f:...
  function with_tmvars (line 80) | fn with_tmvars<T, F: Fn(&mut ElaborationContext<'s>) -> T>(&mut self, f:...
  function elab_kind (line 88) | fn elab_kind(&self, k: &Kind) -> hir::Kind {
  function allocate_hir_id (line 95) | fn allocate_hir_id(&mut self) -> HirId {
  function define_value (line 101) | fn define_value(&mut self, name: String, expr: hir::Expr) -> HirId {
  function define_type (line 108) | fn define_type(&mut self, name: String, ty: hir::Type) -> HirId {
  function dump (line 115) | pub fn dump(&self) {
  function lexical_value (line 135) | fn lexical_value(&self, s: &str) -> Option<HirId> {
  function debruijn_value (line 146) | fn debruijn_value(&self, s: &str) -> Option<hir::Expr> {
  function lookup_value (line 154) | fn lookup_value(&self, s: &str) -> Option<hir::Expr> {
  function lexical_type (line 161) | fn lexical_type(&self, s: &str) -> Option<HirId> {
  function debruijn_type (line 171) | fn debruijn_type(&self, s: &str) -> Option<hir::Type> {
  function enter_namespace (line 177) | fn enter_namespace(&mut self) -> usize {
  function leave_namespace (line 189) | fn leave_namespace(&mut self) {
  function with_new_namespace (line 198) | fn with_new_namespace<T, F: Fn(&mut ElaborationContext<'s>) -> T>(&mut s...
  function elab_ty_row (line 208) | fn elab_ty_row(&mut self, row: &'s Row) -> Result<hir::Row, ElabError> {
  function elab_ty_inner (line 215) | fn elab_ty_inner(&mut self, tv: &'s str, ty: &'s Type) -> Result<hir::Ty...
  function elab_type (line 222) | fn elab_type(&mut self, ty: &'s Type) -> Result<hir::Type, ElabError> {
  function elab_let (line 274) | fn elab_let(&mut self, decls: &'s [Decl], expr: &'s Expr) -> Result<hir:...
  function elab_arm (line 283) | fn elab_arm(&mut self, arm: &'s Arm) -> Result<hir::Arm, ElabError> {
  function elab_case (line 290) | fn elab_case(&mut self, expr: &'s Expr, arms: &'s [Arm]) -> Result<hir::...
  function elab_field (line 296) | fn elab_field(&mut self, field: &'s Field) -> Result<hir::Field, ElabErr...
  function elab_abs (line 306) | fn elab_abs(&mut self, pat: &'s Pattern, body: &'s Expr) -> Result<hir::...
  function elab_expr (line 322) | fn elab_expr(&mut self, expr: &'s Expr) -> Result<hir::Expr, ElabError> {
  function naive_type_infer (line 381) | fn naive_type_infer(&self, pat: &hir::Pattern) -> Result<hir::Type, Elab...
  function elab_pattern (line 422) | fn elab_pattern(&mut self, pat: &'s Pattern, bind: bool) -> Result<hir::...
  function elab_decl_type (line 480) | fn elab_decl_type(&mut self, tyvars: &'s [Type], name: &'s str, ty: &'s ...
  function elab_constructor (line 488) | fn elab_constructor(
  function elab_decl_datatype (line 528) | fn elab_decl_datatype(&mut self, tyvars: &'s [Type], name: &'s str, ty: ...
  function deconstruct_pat_binding (line 574) | fn deconstruct_pat_binding(
  function elab_decl_value (line 638) | fn elab_decl_value(&mut self, tyvars: &'s [Type], pat: &'s Pattern, expr...
  function build_pat_matrix (line 650) | fn build_pat_matrix(&mut self, arms: &'s [FnArm]) -> Result<PatternMatri...
  function infer_type_matrix (line 682) | fn infer_type_matrix(&self, mat: &PatternMatrix) -> Result<Vec<HashSet<h...
  function try_unify_type_matrix (line 695) | fn try_unify_type_matrix(mat: Vec<HashSet<hir::Type>>) -> Vec<hir::Type> {
  function elab_decl_fun (line 717) | fn elab_decl_fun(&mut self, tyvars: &'s [Type], name: &'s str, arms: &'s...
  function elab_decl_expr (line 760) | fn elab_decl_expr(&mut self, expr: &'s Expr) -> Result<HirId, ElabError> {
  function elab_decl_and (line 765) | fn elab_decl_and(&mut self, a: &'s Decl, b: &'s Decl) -> Result<HirId, E...
  function elab_decl (line 779) | fn elab_decl(&mut self, decl: &'s Decl) -> Result<HirId, ElabError> {
  function elab_program (line 790) | pub fn elab_program(&mut self, prog: &'s Program) -> Result<Vec<HirId>, ...
  type PatternMatrix (line 801) | pub struct PatternMatrix {
    method collapse (line 809) | fn collapse(self) -> Vec<hir::Arm> {
  type DeclNames (line 824) | struct DeclNames<'s> {
  function visit_pat (line 830) | fn visit_pat(&mut self, pat: &'s Pattern) {
  function visit_decl (line 849) | fn visit_decl(&mut self, d: &'s Decl) {
  type TyNameCollector (line 866) | pub struct TyNameCollector<'s> {
  function visit_variable (line 872) | fn visit_variable(&mut self, s: &'s str) {
  function visit_defined (line 876) | fn visit_defined(&mut self, s: &'s str) {

FILE: 07_system_fw/src/functor.rs
  function parameterized_set (line 3) | pub fn parameterized_set() -> Type {
  function list_type (line 7) | fn list_type() -> Type {
  function parameterized_set_term (line 24) | pub fn parameterized_set_term() -> Term {
  function parameterized_functor (line 59) | fn parameterized_functor() {

FILE: 07_system_fw/src/hir/bidir.rs
  type Context (line 8) | pub struct Context<'hir> {
  type Element (line 17) | pub enum Element {
  type Error (line 35) | pub enum Error {
  function find_annotation (line 43) | fn find_annotation(&self, idx: usize) -> Option<&Type> {
  function infer (line 60) | pub fn infer(&mut self, e: &'hir Expr) -> Result<Type, Error> {
  function test (line 104) | pub fn test(prog: Elaborated) {

FILE: 07_system_fw/src/hir/mod.rs
  type DeBruijn (line 6) | pub struct DeBruijn {
  type HirId (line 12) | pub struct HirId(pub(crate) u32);
  type Arm (line 16) | pub struct Arm {
  type Field (line 22) | pub struct Field {
  type Program (line 27) | pub struct Program {
  type Decl (line 33) | pub enum Decl {
  type Constructor (line 39) | pub struct Constructor {
  type Pattern (line 53) | pub enum Pattern {
  type Expr (line 73) | pub enum Expr {
  type Kind (line 101) | pub enum Kind {
  type Type (line 107) | pub enum Type {
    method fmt (line 151) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type Variant (line 139) | pub struct Variant {
  type Row (line 145) | pub struct Row {

FILE: 07_system_fw/src/main.rs
  function main (line 21) | fn main() {
  function unfold (line 48) | fn unfold(ty: Type) -> Type {

FILE: 07_system_fw/src/stack.rs
  type Stack (line 5) | pub struct Stack<T> {
  function push (line 11) | pub fn push(&mut self, val: T) {
  function pop (line 16) | pub fn pop(&mut self) -> Option<T> {
  function popn (line 21) | pub fn popn(&mut self, n: usize) {
  function get (line 28) | pub fn get(&self, index: usize) -> Option<&T> {
  function with_capacity (line 33) | pub fn with_capacity(size: usize) -> Self {
  function new (line 40) | pub fn new() -> Self {
  function len (line 45) | pub fn len(&self) -> usize {
  function iter (line 49) | pub fn iter(&self) -> std::slice::Iter<T> {
  function iter_mut (line 53) | pub fn iter_mut(&mut self) -> std::slice::IterMut<T> {
  function lookup (line 59) | pub fn lookup(&self, key: &T) -> Option<usize> {
  function extend (line 70) | fn extend<I: IntoIterator<Item = T>>(&mut self, iter: I) {
  method clone (line 78) | fn clone(&self) -> Self {
  method default (line 86) | fn default() -> Self {
  function fmt (line 92) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  function order (line 101) | fn order() {
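
The `Stack<T>` listed above backs a naming context: `lookup` walks from the top of the stack and yields a distance, i.e. a de Bruijn-style index for the most recent matching binding. A minimal sketch of that shape — the exact semantics of the repository's methods are inferred from the signatures, so treat the details here as an assumption:

```rust
// Minimal sketch of a Vec-backed stack whose `lookup` returns the distance
// from the top of the stack, matching the push/pop/popn/get/lookup surface
// in the listing above. Semantics are an assumption, not the repo's code.
pub struct Stack<T> {
    inner: Vec<T>,
}

impl<T: PartialEq> Stack<T> {
    pub fn new() -> Stack<T> {
        Stack { inner: Vec::new() }
    }

    pub fn push(&mut self, val: T) {
        self.inner.push(val);
    }

    pub fn pop(&mut self) -> Option<T> {
        self.inner.pop()
    }

    /// Discard the top `n` values.
    pub fn popn(&mut self, n: usize) {
        let len = self.inner.len().saturating_sub(n);
        self.inner.truncate(len);
    }

    /// Index from the top: get(0) is the most recent push.
    pub fn get(&self, index: usize) -> Option<&T> {
        self.inner.iter().rev().nth(index)
    }

    /// Distance from the top to the first matching entry -- a
    /// de Bruijn-style index for a named binding context.
    pub fn lookup(&self, key: &T) -> Option<usize> {
        self.inner.iter().rev().position(|t| t == key)
    }
}
```

With this reading, pushing a binder and looking it up immediately yields index 0, and older binders sit at larger indices.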

FILE: 07_system_fw/src/syntax/ast.rs
  type AstId (line 4) | pub struct AstId(pub(crate) u32);
  constant AST_DUMMY (line 5) | pub const AST_DUMMY: AstId = AstId(std::u32::MAX);
  type Program (line 43) | pub struct Program {
  type Arm (line 49) | pub struct Arm {
  type Field (line 56) | pub struct Field {
  type PatKind (line 64) | pub enum PatKind {
  type ExprKind (line 84) | pub enum ExprKind {
  type FnArm (line 106) | pub struct FnArm {
  type DeclKind (line 113) | pub enum DeclKind {
  type Kind (line 123) | pub enum Kind {
  type TypeKind (line 129) | pub enum TypeKind {
    method variants (line 173) | pub fn variants(&self) -> &[Variant] {
    method as_tyvar (line 180) | pub fn as_tyvar(&self) -> &str {
    method as_tyvar_d (line 187) | pub fn as_tyvar_d(self) -> String {
  type Variant (line 159) | pub struct Variant {
    method fmt (line 196) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
  type Row (line 166) | pub struct Row {
    method fmt (line 202) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {

FILE: 07_system_fw/src/syntax/lexer.rs
  type Lexer (line 8) | pub struct Lexer<'s> {
  function new (line 14) | pub fn new(input: Chars<'s>) -> Lexer<'s> {
  function peek (line 26) | fn peek(&mut self) -> Option<char> {
  function consume (line 31) | fn consume(&mut self) -> Option<char> {
  function consume_while (line 50) | fn consume_while<F: Fn(char) -> bool>(&mut self, pred: F) -> (String, Sp...
  function consume_delimiter (line 67) | fn consume_delimiter(&mut self) {
  function id (line 71) | fn id(&mut self) -> (String, Span) {
  function valid_id_char (line 86) | fn valid_id_char(c: char) -> bool {
  function keyword (line 96) | fn keyword(&mut self) -> Spanned<Token> {
  function eat (line 130) | fn eat(&mut self, ch: char, kind: Token) -> Spanned<Token> {
  function number (line 140) | fn number(&mut self) -> Spanned<Token> {
  function lex (line 149) | pub fn lex(&mut self) -> Spanned<Token> {
  type Item (line 208) | type Item = Spanned<Token>;
  method next (line 209) | fn next(&mut self) -> Option<Self::Item> {

FILE: 07_system_fw/src/syntax/parser/decls.rs
  function decl_datatype (line 4) | fn decl_datatype(&mut self) -> Result<Decl, Error> {
  function decl_type (line 20) | fn decl_type(&mut self) -> Result<Decl, Error> {
  function decl_value (line 36) | fn decl_value(&mut self) -> Result<Decl, Error> {
  function decl_fun_arm (line 51) | fn decl_fun_arm(&mut self, ident: &str) -> Result<FnArm, Error> {
  function decl_fun (line 64) | fn decl_fun(&mut self) -> Result<Decl, Error> {
  function decl_expr (line 84) | fn decl_expr(&mut self) -> Result<Decl, Error> {
  function parse_decl_atom (line 96) | pub fn parse_decl_atom(&mut self) -> Result<Decl, Error> {
  function parse_decl (line 106) | pub(crate) fn parse_decl(&mut self) -> Result<Decl, Error> {
  function parse_program (line 119) | pub fn parse_program(&mut self) -> Result<Program, Error> {

FILE: 07_system_fw/src/syntax/parser/exprs.rs
  function record_row (line 4) | fn record_row(&mut self) -> Result<Field, Error> {
  function record_expr (line 13) | fn record_expr(&mut self) -> Result<Expr, Error> {
  function let_binding (line 22) | fn let_binding(&mut self) -> Result<Expr, Error> {
  function case_arm (line 36) | fn case_arm(&mut self) -> Result<Arm, Error> {
  function case_expr (line 46) | fn case_expr(&mut self) -> Result<Expr, Error> {
  function lambda_expr (line 58) | fn lambda_expr(&mut self) -> Result<Expr, Error> {
  function if_expr (line 68) | fn if_expr(&mut self) -> Result<Expr, Error> {
  function atomic_expr (line 90) | fn atomic_expr(&mut self) -> Result<Expr, Error> {
  function projection_expr (line 120) | fn projection_expr(&mut self) -> Result<Expr, Error> {
  function application_expr (line 133) | fn application_expr(&mut self) -> Result<Expr, Error> {
  function parse_expr (line 180) | pub fn parse_expr(&mut self) -> Result<Expr, Error> {

FILE: 07_system_fw/src/syntax/parser/infix.rs
  type Infix (line 4) | pub struct Infix {
    method insert (line 9) | pub fn insert(&mut self, s: String, prec: usize) {
    method get (line 13) | pub fn get(&self, s: &str) -> Option<usize> {

FILE: 07_system_fw/src/syntax/parser/mod.rs
  type Parser (line 13) | pub struct Parser<'s> {
  type ErrorKind (line 22) | pub enum ErrorKind {
  type InfixState (line 39) | pub struct InfixState(Infix);
  type Error (line 42) | pub struct Error {
  function new (line 49) | pub fn new(input: &'s str) -> Parser<'s> {
  function with_infix_state (line 53) | pub fn with_infix_state(input: &'s str, state: InfixState) -> Parser<'s> {
  function top_level (line 65) | pub fn top_level(&mut self) -> Result<Vec<Decl>, Error> {
  function state (line 74) | pub fn state(&self) -> InfixState {
  function allocate_ast_id (line 78) | fn allocate_ast_id(&mut self) -> AstId {
  function error (line 85) | fn error<T>(&self, k: ErrorKind) -> Result<T, Error> {
  function current (line 93) | fn current(&self) -> &Token {
  function bump (line 99) | fn bump(&mut self) -> Token {
  function bump_if (line 121) | fn bump_if(&mut self, kind: &Token) -> bool {
  function expect (line 130) | fn expect(&mut self, kind: Token) -> Result<(), Error> {
  function expect_lower_id (line 139) | fn expect_lower_id(&mut self) -> Result<String, Error> {
  function expect_upper_id (line 146) | fn expect_upper_id(&mut self) -> Result<String, Error> {
  function once (line 158) | fn once<T, E, F>(&mut self, func: F, message: &str) -> Result<T, E>
  function plus (line 177) | fn plus<T, E, F>(&mut self, func: F, delimit: Option<&Token>) -> Result<...
  function star (line 202) | fn star<T, E, F>(&mut self, func: F, delimit: Option<&Token>) -> Vec<T>
  function delimited (line 219) | fn delimited<T, E, F>(&mut self, func: F, delimit: Token) -> Result<Vec<...

FILE: 07_system_fw/src/syntax/parser/patterns.rs
  function tuple_pattern (line 6) | fn tuple_pattern(&mut self) -> Result<Pattern, Error> {
  function record_pattern (line 19) | fn record_pattern(&mut self) -> Result<Pattern, Error> {
  function atomic_pattern (line 34) | pub(crate) fn atomic_pattern(&mut self) -> Result<Pattern, Error> {
  function application_pattern (line 59) | fn application_pattern(&mut self) -> Result<Pattern, Error> {
  function parse_pattern (line 79) | pub fn parse_pattern(&mut self) -> Result<Pattern, Error> {

FILE: 07_system_fw/src/syntax/parser/types.rs
  function variant (line 7) | pub fn variant(&mut self) -> Result<Variant, Error> {
  function type_sum (line 19) | pub fn type_sum(&mut self) -> Result<Type, Error> {
  function parse_tyvar (line 27) | fn parse_tyvar(&mut self) -> Result<Type, Error> {
  function parse_tyvar_sequence (line 38) | pub(crate) fn parse_tyvar_sequence(&mut self) -> Result<Vec<Type>, Error> {
  function parse_type_sequence (line 50) | fn parse_type_sequence(&mut self) -> Result<Vec<Type>, Error> {
  function existential (line 58) | fn existential(&mut self) -> Result<Type, Error> {
  function universal (line 77) | fn universal(&mut self) -> Result<Type, Error> {
  function row (line 94) | fn row(&mut self) -> Result<Row, Error> {
  function record (line 105) | fn record(&mut self) -> Result<Type, Error> {
  function type_atom (line 124) | pub(crate) fn type_atom(&mut self) -> Result<Type, Error> {
  function abstraction_arg (line 173) | fn abstraction_arg(&mut self) -> Result<(Type, Kind), Error> {
  function abstraction (line 184) | fn abstraction(&mut self) -> Result<Type, Error> {
  function application (line 203) | fn application(&mut self) -> Result<Type, Error> {
  function product (line 215) | fn product(&mut self) -> Result<Type, Error> {
  function parse_type (line 226) | pub fn parse_type(&mut self) -> Result<Type, Error> {
  function kind_single (line 238) | fn kind_single(&mut self) -> Result<Kind, Error> {
  function kind (line 249) | pub fn kind(&mut self) -> Result<Kind, Error> {

FILE: 07_system_fw/src/syntax/tokens.rs
  type Token (line 3) | pub enum Token {
    method extract_string (line 59) | pub fn extract_string(self) -> String {

FILE: 07_system_fw/src/syntax/visit/types.rs
  type TypeVisitor (line 4) | pub trait TypeVisitor<'t>: Sized {
    method visit_defined (line 5) | fn visit_defined(&mut self, _: &'t str) {}
    method visit_variable (line 7) | fn visit_variable(&mut self, _: &'t str) {}
    method visit_function (line 9) | fn visit_function(&mut self, ty1: &'t Type, ty2: &'t Type) {
    method visit_application (line 14) | fn visit_application(&mut self, ty1: &'t Type, ty2: &'t Type) {
    method visit_sum (line 19) | fn visit_sum(&mut self, var: &'t [Variant]) {
    method visit_product (line 27) | fn visit_product(&mut self, var: &'t [Type]) {
    method visit_record (line 33) | fn visit_record(&mut self, var: &'t [Row]) {
    method visit_existential (line 39) | fn visit_existential(&mut self, _: &'t str, _: &'t Kind, ty: &'t Type) {
    method visit_universal (line 43) | fn visit_universal(&mut self, _: &'t str, _: &'t Kind, ty: &'t Type) {
    method visit_abstraction (line 47) | fn visit_abstraction(&mut self, _: &'t str, _: &'t Kind, ty: &'t Type) {
    method visit_recursive (line 51) | fn visit_recursive(&mut self, ty: &'t Type) {
    method visit_ty (line 55) | fn visit_ty(&mut self, ty: &'t Type) {
    method walk_ty (line 59) | fn walk_ty(&mut self, ty: &'t Type) {

FILE: 07_system_fw/src/terms.rs
  type Constant (line 6) | pub enum Constant {
  type Kind (line 13) | pub enum Kind {
  type Field (line 59) | pub struct Field {
  type Record (line 66) | pub struct Record {
    method get (line 83) | pub fn get(&self, label: &str) -> Option<&Field> {
  type Term (line 71) | pub struct Term {
    method new (line 77) | pub fn new(kind: Kind, span: Span) -> Term {
    method fmt (line 96) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {

FILE: 07_system_fw/src/typecheck.rs
  type Context (line 8) | pub struct Context {
    method kinding (line 124) | pub fn kinding(&mut self, ty: &Type) -> Result<TyKind, KindError> {
    method simplify_ty (line 219) | pub fn simplify_ty(&mut self, ty: &mut Type) -> Result<bool, KindError> {
    method equiv (line 236) | pub fn equiv(&mut self, lhs: &Type, rhs: &Type) -> Result<bool, KindEr...
    method is_star_kind (line 248) | fn is_star_kind(&mut self, ty: &Type, err_span: Span) -> Result<(), Di...
    method typecheck (line 263) | pub fn typecheck(&mut self, term: &Term) -> Result<Type, Diagnostic> {
  method default (line 14) | fn default() -> Context {
  type KindError (line 23) | pub enum KindError {
    method to_diag (line 99) | fn to_diag(self, span: Span) -> Diagnostic {
  type TypeSimplifier (line 30) | struct TypeSimplifier<'a> {
  method visit_universal (line 38) | fn visit_universal(&mut self, kind: &mut TyKind, ty: &mut Type) {
  method visit_existential (line 44) | fn visit_existential(&mut self, kind: &mut TyKind, ty: &mut Type) {
  method visit_abs (line 50) | fn visit_abs(&mut self, kind: &mut TyKind, ty: &mut Type) {
  method visit (line 56) | fn visit(&mut self, ty: &mut Type) {
  function ty_app (line 568) | fn ty_app() {
  function ty_exist (line 597) | fn ty_exist() {
  function ty_abs (line 628) | fn ty_abs() {
  function ty_equivalence (line 639) | fn ty_equivalence() {
  function ty_record (line 647) | fn ty_record() {

FILE: 07_system_fw/src/types.rs
  type Type (line 5) | pub enum Type {
    method subst (line 36) | pub fn subst(&mut self, mut s: Type) {
    method label (line 43) | pub fn label(&self, label: &str) -> Option<&Type> {
    method fmt (line 59) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type TyField (line 23) | pub struct TyField {
  type TyKind (line 29) | pub enum TyKind {
    method fmt (line 100) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type MutTypeVisitor (line 120) | pub trait MutTypeVisitor: Sized {
    method visit_var (line 121) | fn visit_var(&mut self, _: &mut usize) {}
    method visit_arrow (line 123) | fn visit_arrow(&mut self, ty1: &mut Type, ty2: &mut Type) {
    method visit_universal (line 128) | fn visit_universal(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_existential (line 132) | fn visit_existential(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_abs (line 136) | fn visit_abs(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_app (line 140) | fn visit_app(&mut self, s: &mut Type, t: &mut Type) {
    method visit_record (line 145) | fn visit_record(&mut self, fields: &mut [TyField]) {
    method visit_product (line 151) | fn visit_product(&mut self, tys: &mut [Type]) {
    method visit_projection (line 157) | fn visit_projection(&mut self, ty: &mut Type, _: usize) {
    method visit_recursive (line 161) | fn visit_recursive(&mut self, ty: &mut Type) {
    method visit (line 165) | fn visit(&mut self, ty: &mut Type) {
    method walk (line 169) | fn walk(&mut self, ty: &mut Type) {
    method visit_var (line 199) | fn visit_var(&mut self, var: &mut usize) {
    method visit_universal (line 206) | fn visit_universal(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_existential (line 212) | fn visit_existential(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_abs (line 218) | fn visit_abs(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_universal (line 243) | fn visit_universal(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_existential (line 249) | fn visit_existential(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit_abs (line 255) | fn visit_abs(&mut self, _: &mut TyKind, ty: &mut Type) {
    method visit (line 267) | fn visit(&mut self, ty: &mut Type) {
  type Shift (line 187) | pub struct Shift {
    method new (line 193) | pub const fn new(shift: isize) -> Shift {
  type Subst (line 231) | pub struct Subst {
    method new (line 237) | pub fn new(ty: Type) -> Subst {
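
The `Shift` and `Subst` visitors above implement the standard de Bruijn index machinery: shifting adjusts free variables at or above a cutoff that grows under binders, and top-level substitution replaces variable 0 while renumbering the survivors. A cut-down sketch on a three-constructor type — the repository's visitors also handle kinds, records, and the other constructors; this keeps only the index arithmetic:

```rust
// Minimal sketch of de Bruijn shifting and substitution. Variables below
// the cutoff are bound and untouched; the cutoff rises by one under each
// binder. This is the textbook technique, not the repository's exact code.
#[derive(Clone, Debug, PartialEq)]
enum Ty {
    Var(usize),
    Arrow(Box<Ty>, Box<Ty>),
    Universal(Box<Ty>),
}

fn shift(ty: &mut Ty, d: isize, cutoff: usize) {
    match ty {
        Ty::Var(v) if *v >= cutoff => *v = (*v as isize + d) as usize,
        Ty::Var(_) => {}
        Ty::Arrow(a, b) => {
            shift(a, d, cutoff);
            shift(b, d, cutoff);
        }
        // Binder: everything inside is one level deeper.
        Ty::Universal(body) => shift(body, d, cutoff + 1),
    }
}

/// Substitute `s` for variable 0 in `ty`, as when eliminating a binder.
fn subst_top(ty: &mut Ty, s: &Ty) {
    fn go(ty: &mut Ty, s: &Ty, depth: usize) {
        match ty {
            Ty::Var(v) if *v == depth => {
                // Shift the substituted type so its free variables still
                // point outside the binders we have descended under.
                let mut s2 = s.clone();
                shift(&mut s2, depth as isize, 0);
                *ty = s2;
            }
            // One binder was removed, so free variables shift down.
            Ty::Var(v) if *v > depth => *v -= 1,
            Ty::Var(_) => {}
            Ty::Arrow(a, b) => {
                go(a, s, depth);
                go(b, s, depth);
            }
            Ty::Universal(body) => go(body, s, depth + 1),
        }
    }
    go(ty, s, 0)
}
```

For example, instantiating the body of `∀. 0 → 1` rewrites variable 0 to the argument and renumbers the free variable 1 down to 0.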

FILE: util/src/arena.rs
  constant MIN_CAPACITY (line 40) | pub const MIN_CAPACITY: u32 = 16;
  type Arena (line 43) | pub struct Arena<T> {
  type Index (line 49) | pub struct Index(NonZeroU32);
  type Entry (line 53) | enum Entry<T> {
  function fmt (line 60) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  function default (line 69) | fn default() -> Arena<T> {
  function with_capacity (line 89) | pub fn with_capacity(n: u32) -> Arena<T> {
  function capacity (line 101) | pub fn capacity(&self) -> u32 {
  function get_free (line 110) | fn get_free(&self) -> Option<NonZeroU32> {
  function set_free (line 119) | fn set_free(&mut self, next: Option<NonZeroU32>) {
  function reserve (line 128) | fn reserve(&mut self, n: u32) {
  function try_insert (line 155) | pub fn try_insert(&mut self, item: T) -> Result<Index, T> {
  function insert (line 176) | pub fn insert(&mut self, item: T) -> Index {
  function reserve_insert (line 185) | fn reserve_insert(&mut self, item: T) -> Index {
  function remove (line 193) | pub fn remove(&mut self, index: Index) -> Option<T> {
  function get (line 212) | pub fn get(&self, index: Index) -> Option<&T> {
  function get_mut (line 221) | pub fn get_mut(&mut self, index: Index) -> Option<&mut T> {
  function iter (line 229) | pub fn iter(&self) -> Iter<'_, T> {
  type Iter (line 242) | pub struct Iter<'a, T> {
  type Item (line 248) | type Item = &'a T;
  method next (line 249) | fn next(&mut self) -> Option<Self::Item> {
  type IntoIter (line 261) | pub struct IntoIter<T> {
  type Item (line 267) | type Item = T;
  method next (line 268) | fn next(&mut self) -> Option<Self::Item> {
  type Item (line 280) | type Item = T;
  type IntoIter (line 281) | type IntoIter = IntoIter<T>;
  method into_iter (line 284) | fn into_iter(self) -> Self::IntoIter {
  type Item (line 293) | type Item = &'a T;
  type IntoIter (line 294) | type IntoIter = Iter<'a, T>;
  method into_iter (line 296) | fn into_iter(self) -> Self::IntoIter {
  function from_iter (line 310) | fn from_iter<I: IntoIterator<Item = T>>(iter: I) -> Arena<T> {
  function index_size (line 324) | fn index_size() {
  function smoke_insert (line 329) | fn smoke_insert() {
  function smoke_remove (line 335) | fn smoke_remove() {
  function smoke_iter (line 343) | fn smoke_iter() {
  function fill (line 352) | fn fill() {
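
The `Arena` above threads removed slots into a free list (`get_free`/`set_free`) and hands out `NonZeroU32` indices so `Option<Index>` stays word-sized. A minimal sketch of the free-list idea alone — plain `usize` indices, no capacity reservation; the simplifications are mine:

```rust
// Minimal sketch of a free-list arena: removed slots are threaded into a
// singly linked free list and reused by the next insert. The repository's
// Arena additionally packs indices into NonZeroU32 and pre-reserves
// capacity; this version keeps only the core reuse mechanism.
enum Entry<T> {
    Occupied(T),
    Free { next: Option<usize> },
}

pub struct Arena<T> {
    entries: Vec<Entry<T>>,
    free: Option<usize>, // head of the free list
}

impl<T> Arena<T> {
    pub fn new() -> Arena<T> {
        Arena { entries: Vec::new(), free: None }
    }

    pub fn insert(&mut self, item: T) -> usize {
        match self.free {
            // Reuse the head of the free list.
            Some(idx) => {
                if let Entry::Free { next } = &self.entries[idx] {
                    self.free = *next;
                }
                self.entries[idx] = Entry::Occupied(item);
                idx
            }
            None => {
                self.entries.push(Entry::Occupied(item));
                self.entries.len() - 1
            }
        }
    }

    pub fn remove(&mut self, idx: usize) -> Option<T> {
        let slot = self.entries.get_mut(idx)?;
        if matches!(slot, Entry::Occupied(_)) {
            // Push this slot onto the free list and take the value out.
            let old = std::mem::replace(slot, Entry::Free { next: self.free });
            self.free = Some(idx);
            if let Entry::Occupied(t) = old {
                return Some(t);
            }
        }
        None
    }

    pub fn get(&self, idx: usize) -> Option<&T> {
        match self.entries.get(idx) {
            Some(Entry::Occupied(t)) => Some(t),
            _ => None,
        }
    }
}
```

Note that unlike a generational arena, a reused slot here hands back the same bare index, so a stale index can observe the new occupant; the `NonZeroU32` packing in the real `Arena` does not by itself prevent that either.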

FILE: util/src/diagnostic.rs
  type Diagnostic (line 7) | pub struct Diagnostic<'s> {
  function new (line 13) | pub fn new(src: &str) -> Diagnostic<'_> {
  function push (line 20) | pub fn push<S: Into<String>>(&mut self, msg: S, span: Span) {
  function error_count (line 24) | pub fn error_count(&self) -> usize {
  function pop (line 29) | pub fn pop(&mut self) -> Option<String> {
  function emit (line 47) | pub fn emit(mut self) -> String {
  method drop (line 73) | fn drop(&mut self) {

FILE: util/src/span.rs
  type Location (line 7) | pub struct Location {
    method new (line 14) | pub fn new(line: u32, col: u32, abs: u32) -> Location {
    method fmt (line 20) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type Span (line 27) | pub struct Span {
    method new (line 39) | pub fn new(start: Location, end: Location) -> Span {
    method dummy (line 43) | pub const fn dummy() -> Span {
    method zero (line 52) | pub const fn zero() -> Span {
    type Output (line 147) | type Output = Self;
    method add (line 148) | fn add(self, rhs: Self) -> Self::Output {
    method add_assign (line 157) | fn add_assign(&mut self, rhs: Self) {
  type Spanned (line 33) | pub struct Spanned<T> {
  function new (line 64) | pub fn new(span: Span, data: T) -> Spanned<T> {
  function map (line 69) | pub fn map<B, F: Fn(T) -> B>(self, f: F) -> Spanned<B> {
  function replace (line 77) | pub fn replace<V>(self, src: V) -> Spanned<V> {
  function into_inner (line 85) | pub fn into_inner(self) -> T {
  function span (line 90) | pub fn span(&self) -> Span {
  function data (line 95) | pub fn data(self) -> T {
  function map_result (line 103) | pub fn map_result(self) -> Result<Spanned<T>, Spanned<E>> {
  function map_option (line 110) | pub fn map_option(self) -> Option<Spanned<T>> {
  function fmt (line 117) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  function fmt (line 123) | fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
  type Target (line 129) | type Target = T;
  function deref (line 130) | fn deref(&self) -> &Self::Target {
  method clone (line 136) | fn clone(&self) -> Self {

FILE: util/src/unsafe_arena.rs
  type Arena (line 13) | pub struct Arena<T> {
  type Chunk (line 20) | struct Chunk<T> {
  type Info (line 29) | struct Info {
  method default (line 35) | fn default() -> Arena<T> {
  function with_capacity (line 46) | pub fn with_capacity(capacity: usize) -> Arena<T> {
  function can_alloc (line 59) | fn can_alloc(&self, n: usize) -> bool {
  function ensure_capacity (line 66) | fn ensure_capacity(&self, n: usize) {
  function entries (line 73) | fn entries(&self) -> usize {
  function chunks (line 81) | fn chunks(&self) -> Vec<Info> {
  function grow (line 106) | fn grow(&self, n: usize) {
  function alloc (line 133) | pub fn alloc(&self, value: T) -> &mut T {
  function alloc_raw_slice (line 147) | unsafe fn alloc_raw_slice(&self, n: usize) -> *mut T {
  function alloc_slice (line 157) | pub fn alloc_slice(&self, slice: &[T]) -> &mut [T]
  function drop (line 171) | fn drop(&mut self) {
  function padding_needed (line 179) | fn padding_needed(l: &Layout, align: usize) -> usize {
  function round (line 186) | fn round(layout: &Layout, align: usize) -> usize {
  function extend (line 193) | fn extend(a: Layout, b: Layout) -> Layout {
  function layout (line 204) | fn layout(capacity: usize) -> Layout {
  function new (line 213) | unsafe fn new(prev: *mut Chunk<T>, capacity: usize) -> *mut Chunk<T> {
  function destroy (line 227) | unsafe fn destroy(&mut self, len: usize) {
  function start (line 245) | pub fn start(&self) -> *mut T {
  function end (line 259) | pub fn end(&self) -> *mut T {
  type DropGuard (line 268) | struct DropGuard {
    method drop (line 273) | fn drop(&mut self) {
  type Test (line 279) | struct Test {
  function new_chunk (line 288) | fn new_chunk() {
  function drop_test (line 301) | fn drop_test() {
  function references (line 317) | fn references() {
  function slice (line 334) | fn slice() {

FILE: x1_bidir/src/helpers.rs
  function ty_display (line 106) | fn ty_display(ty: &Type) -> String {
  function expr_display (line 128) | fn expr_display(ex: &Expr) -> String {
  method fmt (line 171) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
  method fmt (line 176) | fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {

FILE: x1_bidir/src/main.rs
  type Kind (line 12) | enum Kind {
  type Type (line 19) | enum Type {
    method monotype (line 41) | fn monotype(&self) -> bool {
    method freevars (line 50) | fn freevars(&self) -> Vec<usize> {
    method shift (line 81) | fn shift(&mut self, s: isize) {
    method subst (line 111) | fn subst(&mut self, s: &mut Type) {
  type LR (line 148) | enum LR {
  type Expr (line 155) | enum Expr {
  type Arm (line 181) | struct Arm {
  type Element (line 188) | enum Element {
  type Context (line 207) | pub struct Context {
    method fresh_ev (line 216) | fn fresh_ev(&mut self) -> usize {
    method well_formed (line 225) | fn well_formed(&mut self, ty: &Type) -> bool {
    method check_wf (line 239) | fn check_wf(&mut self, ty: &Type) -> Result<bool, String> {
    method with_scope (line 248) | fn with_scope<T, F: FnMut(&mut Context) -> T>(&mut self, e: Element, m...
    method apply (line 262) | fn apply(&self, ty: Type) -> Type {
    method find_annotation (line 285) | fn find_annotation(&self, idx: usize) -> Option<&Type> {
    method find_type_var (line 305) | fn find_type_var(&self, idx: usize) -> Option<&Kind> {
    method find_solved (line 323) | fn find_solved(&self, alpha: usize) -> Option<&Type> {
    method splice_hole (line 337) | fn splice_hole<F: Fn(&mut Vec<Element>)>(&mut self, exist: usize, f: F...
    method split_context (line 344) | fn split_context(&mut self, exist: usize) -> Result<(&mut Self, Vec<El...
    method kinding (line 358) | fn kinding(&mut self, ty: &Type) -> Option<Kind> {
    method beta_reduce (line 362) | fn beta_reduce(&mut self, ty: &mut Type) -> Result<(), String> {
    method subtype (line 403) | fn subtype(&mut self, mut a: Type, mut b: Type) -> Result<(), String> {
    method instantiateL (line 454) | fn instantiateL(&mut self, alpha: usize, a: &Type) -> Result<(), Strin...
    method instantiateR (line 516) | fn instantiateR(&mut self, a: &Type, alpha: usize) -> Result<(), Strin...
    method infer (line 568) | fn infer(&mut self, e: &Expr) -> Result<Type, String> {
    method infer_app (line 692) | fn infer_app(&mut self, ty: &Type, e2: &Expr) -> Result<Type, String> {
    method check (line 728) | fn check(&mut self, e: &Expr, a: &Type) -> Result<(), String> {
  function infer (line 784) | fn infer(ex: &Expr) -> Result<Type, String> {
  function main (line 792) | fn main() {
  function identity (line 830) | fn identity() {
  function application (line 841) | fn application() {
  function application2 (line 853) | fn application2() {
  function sum_type (line 872) | fn sum_type() {
  function product_type (line 897) | fn product_type() {
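
The `infer`/`check` pair above is the bidirectional discipline of Dunfield and Krishnaswami's algorithm: synthesis (`infer`) produces a type, checking (`check`) consumes one, and annotations switch modes. A deliberately tiny sketch of just that mode split — booleans, de Bruijn variables, and annotated lambdas, with plain type equality standing in for the paper's subtyping and existential-variable machinery:

```rust
// Minimal sketch of bidirectional typechecking for a toy language.
// The repository's x1_bidir implements the full algorithm with
// existential variables, instantiation, and subtyping; this keeps
// only the infer/check mode discipline.
#[derive(Clone, Debug, PartialEq)]
enum Type {
    Bool,
    Arrow(Box<Type>, Box<Type>),
}

#[derive(Clone)]
enum Expr {
    True,
    False,
    Var(usize),            // de Bruijn index into the context
    Abs(Box<Expr>),        // unannotated lambda: check-only
    App(Box<Expr>, Box<Expr>),
    Ann(Box<Expr>, Type),  // e : T
}

fn infer(ctx: &mut Vec<Type>, e: &Expr) -> Result<Type, String> {
    match e {
        Expr::True | Expr::False => Ok(Type::Bool),
        Expr::Var(i) => {
            if *i < ctx.len() {
                Ok(ctx[ctx.len() - 1 - i].clone())
            } else {
                Err("unbound variable".into())
            }
        }
        Expr::Ann(e, t) => {
            check(ctx, e, t)?; // an annotation switches to checking mode
            Ok(t.clone())
        }
        Expr::App(f, x) => match infer(ctx, f)? {
            Type::Arrow(dom, cod) => {
                check(ctx, x, &dom)?; // arguments are checked, not inferred
                Ok(*cod)
            }
            t => Err(format!("applied non-function {:?}", t)),
        },
        Expr::Abs(_) => Err("cannot infer an unannotated lambda".into()),
    }
}

fn check(ctx: &mut Vec<Type>, e: &Expr, want: &Type) -> Result<(), String> {
    match (e, want) {
        // A lambda checks against an arrow by binding the domain.
        (Expr::Abs(body), Type::Arrow(dom, cod)) => {
            ctx.push((**dom).clone());
            let r = check(ctx, body, cod);
            ctx.pop();
            r
        }
        // Subsumption: fall back to inference and compare.
        _ => {
            let got = infer(ctx, e)?;
            if &got == want {
                Ok(())
            } else {
                Err(format!("expected {:?}, got {:?}", want, got))
            }
        }
    }
}
```

The payoff of the split is visible in `Abs`: an unannotated lambda cannot be synthesized at all, yet checks cleanly against any arrow type pushed in from an annotation or an application.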

FILE: x2_dependent/src/main.rs
  type Term (line 2) | enum Term {
    method normal (line 13) | fn normal(&self) -> bool {
    method whnf (line 21) | fn whnf(&self) -> bool {
    method subst (line 29) | fn subst(&mut self, mut t2: Term) {
  type Context (line 63) | struct Context {
    method get (line 105) | fn get(&self, idx: usize) -> Option<&Term> {
    method with_bind (line 109) | fn with_bind<T, F: Fn(&mut Context) -> T>(&mut self, bind: Term, f: F)...
    method equiv (line 116) | fn equiv(&mut self, t1: &Term, t2: &Term) -> bool {
    method type_of (line 131) | fn type_of(&mut self, term: &Term) -> Result<Term, Error> {
  type Error (line 68) | enum Error {
  type Visitor (line 74) | struct Visitor {
    method new (line 79) | fn new() -> Visitor {
    method visit (line 83) | fn visit<F: Fn(&mut Term, usize)>(&mut self, term: &mut Term, f: &F) {
  function beta_reduce (line 168) | fn beta_reduce(mut term: Term) -> Term {
  function main (line 192) | fn main() {

Condensed preview — 97 files, each showing path, character count, and a content snippet.
[
  {
    "path": ".gitattributes",
    "chars": 11,
    "preview": "*\ttext=auto"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/bug_report.md",
    "chars": 536,
    "preview": "---\nname: Bug report\nabout: Create a report to help us improve\ntitle: ''\nlabels: bug\nassignees: ''\n\n---\n\n**Describe the "
  },
  {
    "path": ".github/ISSUE_TEMPLATE/feature_request.md",
    "chars": 604,
    "preview": "---\nname: Feature request\nabout: Suggest an idea for this project\ntitle: ''\nlabels: enhancement\nassignees: ''\n\n---\n\n**Is"
  },
  {
    "path": ".github/workflows/rust.yml",
    "chars": 295,
    "preview": "name: Rust\n\non: [push, pull_request]\n\njobs:\n  build:\n\n    runs-on: ubuntu-latest\n\n    steps:\n    - uses: actions/checkou"
  },
  {
    "path": ".gitignore",
    "chars": 27,
    "preview": "/target\n**/*.rs.bk\n.vscode/"
  },
  {
    "path": ".rustfmt.toml",
    "chars": 36,
    "preview": "wrap_comments = true\nmax_width = 120"
  },
  {
    "path": ".travis.yml",
    "chars": 181,
    "preview": "language: rust\nrust:\n  - stable\n  - nightly\nmatrix:\n  allow_failures:\n    - rust: nightly\nscript:\n  - cargo build --verb"
  },
  {
    "path": "01_arith/Cargo.toml",
    "chars": 153,
    "preview": "[package]\nname = \"arith\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n[dependen"
  },
  {
    "path": "01_arith/src/lexer.rs",
    "chars": 4674,
    "preview": "use util::span::{Location, Span, Spanned};\n\nuse std::char;\nuse std::iter::Peekable;\nuse std::str::Chars;\n\n#[derive(Copy,"
  },
  {
    "path": "01_arith/src/main.rs",
    "chars": 2216,
    "preview": "mod lexer;\nmod parser;\nuse parser::{Parser, Term};\n\n#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]\npub enum Runtim"
  },
  {
    "path": "01_arith/src/parser.rs",
    "chars": 2873,
    "preview": "use crate::lexer::{Lexer, Token};\nuse std::iter::Peekable;\nuse util::diagnostic::Diagnostic;\nuse util::span::Span;\n\n#[de"
  },
  {
    "path": "02_lambda/Cargo.toml",
    "chars": 154,
    "preview": "[package]\nname = \"lambda\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n[depende"
  },
  {
    "path": "02_lambda/src/context.rs",
    "chars": 737,
    "preview": "use std::collections::VecDeque;\n\n#[derive(Clone, Debug, Default)]\npub struct Context {\n    inner: VecDeque<String>,\n}\n\ni"
  },
  {
    "path": "02_lambda/src/lexer.rs",
    "chars": 2611,
    "preview": "use util::span::{Location, Span, Spanned};\n\nuse std::char;\nuse std::iter::Peekable;\nuse std::str::Chars;\n\n#[derive(Copy,"
  },
  {
    "path": "02_lambda/src/main.rs",
    "chars": 2806,
    "preview": "mod context;\nmod lexer;\nmod parser;\nuse parser::Parser;\n\nuse context::Context;\nuse parser::{RcTerm, Term};\n\nfn shift1(d:"
  },
  {
    "path": "02_lambda/src/parser.rs",
    "chars": 4761,
    "preview": "use crate::context::Context;\nuse crate::lexer::{Lexer, Token};\nuse std::iter::Peekable;\nuse std::ops::Deref;\nuse std::rc"
  },
  {
    "path": "03_typedarith/Cargo.toml",
    "chars": 256,
    "preview": "[package]\nname = \"typedarith\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n# Se"
  },
  {
    "path": "03_typedarith/src/ast.rs",
    "chars": 1926,
    "preview": "use std::ops::Deref;\nuse std::rc::Rc;\n\n#[derive(Copy, Clone, Debug, PartialEq, PartialOrd)]\npub enum Type {\n    Nat,\n   "
  },
  {
    "path": "03_typedarith/src/lexer.rs",
    "chars": 4672,
    "preview": "use util::span::{Location, Span, Spanned};\n\nuse std::char;\nuse std::iter::Peekable;\nuse std::str::Chars;\n\n#[derive(Copy,"
  },
  {
    "path": "03_typedarith/src/main.rs",
    "chars": 487,
    "preview": "mod ast;\nmod lexer;\nmod parser;\nuse ast::*;\nuse parser::Parser;\n\nfn main() {\n    let input = \"if iszero(succ(zero)) then"
  },
  {
    "path": "03_typedarith/src/parser.rs",
    "chars": 2645,
    "preview": "use crate::ast::{RcTerm, Term};\nuse crate::lexer::{Lexer, Token};\nuse std::iter::Peekable;\nuse util::diagnostic::Diagnos"
  },
  {
    "path": "04_stlc/.gitignore",
    "chars": 27,
    "preview": "/target\n**/*.rs.bk\n.vscode/"
  },
  {
    "path": "04_stlc/Cargo.toml",
    "chars": 250,
    "preview": "[package]\nname = \"stlc\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n# See more"
  },
  {
    "path": "04_stlc/src/eval.rs",
    "chars": 3511,
    "preview": "use super::term::*;\nuse super::typing::Context;\nuse super::visitor::{Direction, MutVisitor, Shifting, Substitution};\n\n#["
  },
  {
    "path": "04_stlc/src/lexer.rs",
    "chars": 6891,
    "preview": "use util::span::{Location, Span};\n\nuse std::char;\nuse std::iter::Peekable;\nuse std::str::Chars;\n\n#[derive(Clone, Debug, "
  },
  {
    "path": "04_stlc/src/main.rs",
    "chars": 2085,
    "preview": "#![allow(unused_variables)]\nmod eval;\nmod lexer;\nmod parser;\nmod term;\nmod typing;\nmod visitor;\n\nuse term::Term;\nuse typ"
  },
  {
    "path": "04_stlc/src/parser.rs",
    "chars": 9994,
    "preview": "use crate::lexer::{Lexer, Token, TokenKind};\nuse crate::term::{Field, Term};\nuse crate::typing::{Record, RecordField, Ty"
  },
  {
    "path": "04_stlc/src/term.rs",
    "chars": 2123,
    "preview": "use crate::typing::Type;\nuse std::fmt;\nuse util::span::Span;\n\n#[derive(Clone, Debug, PartialEq, PartialOrd)]\npub struct "
  },
  {
    "path": "04_stlc/src/typing.rs",
    "chars": 9306,
    "preview": "use crate::term::Term;\nuse std::fmt;\n\n#[derive(Clone, PartialEq, PartialOrd)]\npub enum Type {\n    Unit,\n    Bool,\n    Na"
  },
  {
    "path": "04_stlc/src/visitor.rs",
    "chars": 4827,
    "preview": "use super::*;\nuse crate::term::{Field, Term};\nuse std::default::Default;\n\npub trait Visitor: Sized {\n    fn visit_var(&m"
  },
  {
    "path": "05_recon/Cargo.toml",
    "chars": 251,
    "preview": "[package]\nname = \"recon\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n# See mor"
  },
  {
    "path": "05_recon/src/disjoint.rs",
    "chars": 8354,
    "preview": "//! A disjoint set using the union-find algorithm with path-compression\n\nuse std::cell::Cell;\nuse std::cmp::Ordering;\nus"
  },
  {
    "path": "05_recon/src/main.rs",
    "chars": 8582,
    "preview": "use std::collections::{HashMap, HashSet};\npub mod disjoint;\npub mod mutation;\npub mod naive;\npub mod parser;\npub mod typ"
  },
  {
    "path": "05_recon/src/mutation/mod.rs",
    "chars": 7643,
    "preview": "use super::{Term, T_ARROW, T_BOOL, T_INT, T_UNIT};\nuse std::collections::{HashMap, HashSet, VecDeque};\nuse std::rc::Rc;\n"
  },
  {
    "path": "05_recon/src/mutation/write_once.rs",
    "chars": 2283,
    "preview": "use std::cell::{Cell, UnsafeCell};\nuse std::rc::Rc;\nuse std::sync::atomic::{AtomicBool, Ordering};\n\npub struct WriteOnce"
  },
  {
    "path": "05_recon/src/naive.rs",
    "chars": 1288,
    "preview": "use super::*;\n\nfn var_bind(var: TypeVar, ty: Type) -> Result<HashMap<TypeVar, Type>, String> {\n    if ty.occurs(var) {\n "
  },
  {
    "path": "05_recon/src/parser.rs",
    "chars": 10793,
    "preview": "use super::Term;\nuse std::char;\nuse std::collections::VecDeque;\nuse std::iter::Peekable;\nuse std::str::Chars;\nuse util::"
  },
  {
    "path": "05_recon/src/types.rs",
    "chars": 4153,
    "preview": "use std::collections::{HashMap, HashSet, VecDeque};\n\n#[derive(Copy, Clone, Default, PartialEq, PartialOrd, Eq, Hash)]\npu"
  },
  {
    "path": "06_system_f/Cargo.toml",
    "chars": 254,
    "preview": "[package]\nname = \"system_f\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n# See "
  },
  {
    "path": "06_system_f/README.md",
    "chars": 89,
    "preview": "# System F\n\nAn extension of the simply typed lambda calculus with parametric polymorphism"
  },
  {
    "path": "06_system_f/src/diagnostics.rs",
    "chars": 1855,
    "preview": "use util::span::Span;\n#[derive(Debug, Copy, Clone)]\npub enum Level {\n    Warn,\n    Error,\n}\n\n#[derive(Debug, Clone)]\npub"
  },
  {
    "path": "06_system_f/src/eval.rs",
    "chars": 10969,
    "preview": "use crate::patterns::Pattern;\nuse crate::terms::visit::{Shift, Subst, TyTermSubst};\nuse crate::terms::{Kind, Literal, Pr"
  },
  {
    "path": "06_system_f/src/macros.rs",
    "chars": 2986,
    "preview": "//! Macros to make writing tests easier\n\n/// Boolean term\nmacro_rules! lit {\n    ($x:expr) => {\n        crate::terms::Te"
  },
  {
    "path": "06_system_f/src/main.rs",
    "chars": 4137,
    "preview": "#![allow(unused_variables, unused_macros)]\n#[macro_use]\npub mod macros;\npub mod diagnostics;\npub mod eval;\npub mod patte"
  },
  {
    "path": "06_system_f/src/patterns/mod.rs",
    "chars": 4427,
    "preview": "use crate::terms::{Kind, Literal, Term};\nuse crate::types::{variant_field, Type};\nuse crate::visit::PatternVisitor;\nuse "
  },
  {
    "path": "06_system_f/src/syntax/lexer.rs",
    "chars": 7566,
    "preview": "use super::{Token, TokenKind};\nuse std::char;\nuse std::iter::Peekable;\nuse std::str::Chars;\nuse util::span::{Location, S"
  },
  {
    "path": "06_system_f/src/syntax/mod.rs",
    "chars": 1093,
    "preview": "//! Lexical analysis and recursive descent parser for System F\npub mod lexer;\npub mod parser;\nuse util::span::Span;\n\n#[d"
  },
  {
    "path": "06_system_f/src/syntax/parser.rs",
    "chars": 21008,
    "preview": "use super::lexer::Lexer;\nuse super::{Token, TokenKind};\n\nuse std::collections::VecDeque;\nuse util::diagnostic::Diagnosti"
  },
  {
    "path": "06_system_f/src/terms/mod.rs",
    "chars": 5636,
    "preview": "//! Representation lambda calculus terms\nuse crate::patterns::Pattern;\nuse crate::types::Type;\nuse std::fmt;\nuse util::s"
  },
  {
    "path": "06_system_f/src/terms/visit.rs",
    "chars": 5758,
    "preview": "use crate::patterns::{Pattern, PatternCount};\nuse crate::terms::{Arm, Kind, Primitive, Term};\nuse crate::types::Type;\nus"
  },
  {
    "path": "06_system_f/src/types/mod.rs",
    "chars": 15545,
    "preview": "//! Typechecking of the simply typed lambda calculus with parametric\n//! polymorphism\npub mod patterns;\npub mod visit;\nu"
  },
  {
    "path": "06_system_f/src/types/patterns.rs",
    "chars": 15206,
    "preview": "//! Naive, inefficient exhaustiveness checking for pattern matching\n//!\n//! Inspired somewhat by the docs for the Rust c"
  },
  {
    "path": "06_system_f/src/types/visit.rs",
    "chars": 2258,
    "preview": "use super::Type;\nuse crate::visit::MutTypeVisitor;\nuse std::convert::TryFrom;\n\npub struct Shift {\n    pub cutoff: usize,"
  },
  {
    "path": "06_system_f/src/visit.rs",
    "chars": 5682,
    "preview": "//! Visitor traits for [`Pattern`], [`Term`], and [`Type`] objects\nuse crate::patterns::Pattern;\nuse crate::terms::{Arm,"
  },
  {
    "path": "06_system_f/test.sf",
    "chars": 1995,
    "preview": "let func = \\X (\\c: {None | Some X}. \\x: X->(X, X). \n\tcase c of \n\t\t| None => None of {None | Some (X, X)}\n\t\t| Some val =>"
  },
  {
    "path": "07_system_fw/Cargo.toml",
    "chars": 255,
    "preview": "[package]\nname = \"system_fw\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n# See"
  },
  {
    "path": "07_system_fw/README.md",
    "chars": 2280,
    "preview": "# System Fω\n\nThis is an implementation (mostly of just the type system) of the higher-order polymorphic lambda calculus "
  },
  {
    "path": "07_system_fw/src/diagnostics.rs",
    "chars": 2313,
    "preview": "use std::fmt;\nuse util::span::Span;\n\n#[derive(Debug, Clone, PartialEq)]\npub enum Level {\n    Warn,\n    Error,\n}\n\n#[deriv"
  },
  {
    "path": "07_system_fw/src/elaborate.rs",
    "chars": 31564,
    "preview": "use super::ast::*;\nuse super::hir::{self, Constructor, DeBruijn, HirId};\nuse super::stack::Stack;\nuse super::syntax::vis"
  },
  {
    "path": "07_system_fw/src/functor.rs",
    "chars": 2036,
    "preview": "use super::*;\n\npub fn parameterized_set() -> Type {\n    tyop!(kind!(*), exist!(kind!(* => *), op_app!(Type::Var(0), Type"
  },
  {
    "path": "07_system_fw/src/hir/bidir.rs",
    "chars": 3626,
    "preview": "use super::*;\nuse crate::elaborate::Elaborated;\nuse std::collections::HashMap;\n\nuse super::Type::*;\n\n#[derive(Debug)]\npu"
  },
  {
    "path": "07_system_fw/src/hir/mod.rs",
    "chars": 5505,
    "preview": "pub mod bidir;\n\nuse std::fmt;\n\n#[derive(Clone, Debug, PartialEq, PartialOrd, Eq, Hash)]\npub struct DeBruijn {\n    pub id"
  },
  {
    "path": "07_system_fw/src/macros.rs",
    "chars": 4164,
    "preview": "#![allow(unused_macros)]\n/// Boolean term\nmacro_rules! bool {\n    ($x:expr) => {\n        crate::terms::Term::new(\n      "
  },
  {
    "path": "07_system_fw/src/main.rs",
    "chars": 1508,
    "preview": "#![allow(dead_code)]\n#[macro_use]\npub mod macros;\npub mod diagnostics;\npub mod elaborate;\npub mod functor;\npub mod hir;\n"
  },
  {
    "path": "07_system_fw/src/stack.rs",
    "chars": 2339,
    "preview": "//! Wrapper around a Vec for use as a de Bruijn indexed stack, e.g. index 0\n//! returns the last item pushed onto the st"
  },
  {
    "path": "07_system_fw/src/syntax/ast.rs",
    "chars": 5089,
    "preview": "use util::span::Span;\n\n#[derive(Copy, Clone, Debug, Default, PartialEq, PartialOrd, Eq, Hash)]\npub struct AstId(pub(crat"
  },
  {
    "path": "07_system_fw/src/syntax/lexer.rs",
    "chars": 7455,
    "preview": "use super::tokens::*;\nuse std::char;\nuse std::iter::Peekable;\nuse std::str::Chars;\nuse util::span::{Location, Span, Span"
  },
  {
    "path": "07_system_fw/src/syntax/mod.rs",
    "chars": 75,
    "preview": "pub mod ast;\npub mod lexer;\npub mod parser;\npub mod tokens;\npub mod visit;\n"
  },
  {
    "path": "07_system_fw/src/syntax/parser/README.md",
    "chars": 571,
    "preview": "# Parser\n\nWe use a handwritten recursive descent parser. In general, there is a top-level entry function for parsing of "
  },
  {
    "path": "07_system_fw/src/syntax/parser/decls.rs",
    "chars": 4189,
    "preview": "use super::*;\n\nimpl<'s> Parser<'s> {\n    fn decl_datatype(&mut self) -> Result<Decl, Error> {\n        let mut span = sel"
  },
  {
    "path": "07_system_fw/src/syntax/parser/exprs.rs",
    "chars": 6971,
    "preview": "use super::*;\n\nimpl<'s> Parser<'s> {\n    fn record_row(&mut self) -> Result<Field, Error> {\n        let mut span = self."
  },
  {
    "path": "07_system_fw/src/syntax/parser/infix.rs",
    "chars": 341,
    "preview": "use std::collections::HashMap;\n\n#[derive(Clone, Default, Debug)]\npub struct Infix {\n    precedence: HashMap<String, usiz"
  },
  {
    "path": "07_system_fw/src/syntax/parser/mod.rs",
    "chars": 6458,
    "preview": "pub mod decls;\npub mod exprs;\npub mod infix;\npub mod patterns;\npub mod types;\n\nuse super::ast::*;\nuse super::lexer::Lexe"
  },
  {
    "path": "07_system_fw/src/syntax/parser/patterns.rs",
    "chars": 3102,
    "preview": "use super::*;\n\nuse PatKind::*;\n\nimpl<'s> Parser<'s> {\n    fn tuple_pattern(&mut self) -> Result<Pattern, Error> {\n      "
  },
  {
    "path": "07_system_fw/src/syntax/parser/types.rs",
    "chars": 9001,
    "preview": "use super::*;\n\nuse TypeKind::*;\n\nimpl<'s> Parser<'s> {\n    /// Parse a datatype Constructor [A-Z]+\n    pub fn variant(&m"
  },
  {
    "path": "07_system_fw/src/syntax/tokens.rs",
    "chars": 899,
    "preview": "#[allow(dead_code)]\n#[derive(Clone, Debug, PartialEq, PartialOrd)]\npub enum Token {\n    Dot,\n    Colon,\n    Opaque,\n    "
  },
  {
    "path": "07_system_fw/src/syntax/visit/mod.rs",
    "chars": 54,
    "preview": "use super::*;\nmod types;\n\npub use types::TypeVisitor;\n"
  },
  {
    "path": "07_system_fw/src/syntax/visit/types.rs",
    "chars": 2218,
    "preview": "use super::*;\nuse ast::{Kind, Row, Type, TypeKind, Variant};\n\npub trait TypeVisitor<'t>: Sized {\n    fn visit_defined(&m"
  },
  {
    "path": "07_system_fw/src/terms.rs",
    "chars": 3649,
    "preview": "use crate::types::{TyKind, Type};\nuse util::span::Span;\n\n/// Constant expression or pattern\n#[derive(Copy, Clone, Debug,"
  },
  {
    "path": "07_system_fw/src/typecheck.rs",
    "chars": 27165,
    "preview": "use crate::diagnostics::Diagnostic;\nuse crate::stack::Stack;\nuse crate::terms::{Constant, Field, Kind, Record, Term};\nus"
  },
  {
    "path": "07_system_fw/src/types.rs",
    "chars": 7715,
    "preview": "use std::convert::TryFrom;\nuse std::fmt;\n\n#[derive(Clone, Debug, PartialEq, PartialOrd)]\npub enum Type {\n    Unit,\n    N"
  },
  {
    "path": "07_system_fw/test.fw",
    "chars": 854,
    "preview": "datatype 'a list = Nil | Cons of 'a * 'a list \ndatatype 'a option = None | Some of 'a \ndatatype ('a, 'b) either = Left o"
  },
  {
    "path": "Cargo.toml",
    "chars": 156,
    "preview": "[workspace]\nmembers = [\"01_arith\", \"02_lambda\",  \"03_typedarith\", \"04_stlc\", \"05_recon\",  \"06_system_f\", \"07_system_fw\","
  },
  {
    "path": "LICENSE",
    "chars": 1071,
    "preview": "MIT License\n\nCopyright (c) 2019 Michael Lazear\n\nPermission is hereby granted, free of charge, to any person obtaining a "
  },
  {
    "path": "README.md",
    "chars": 2542,
    "preview": "# types-and-programming-languages\n\n![](https://github.com/lazear/types-and-programming-languages/workflows/Rust/badge.sv"
  },
  {
    "path": "util/.gitignore",
    "chars": 27,
    "preview": "/target\n**/*.rs.bk\n.vscode/"
  },
  {
    "path": "util/Cargo.toml",
    "chars": 124,
    "preview": "[package]\nname = \"util\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n[dependenc"
  },
  {
    "path": "util/src/arena.rs",
    "chars": 10333,
    "preview": "//! A safe, fast, and space-efficient typed Arena allocator\n//!\n//! # Examples:\n//!\n//! ```\n//! use util::arena::Arena;\n"
  },
  {
    "path": "util/src/diagnostic.rs",
    "chars": 2426,
    "preview": "//! Diagnostic handling for errors detected in source code.\n//!\n//! Dropping a [`Diagnostic`] without calling `emit` wil"
  },
  {
    "path": "util/src/lib.rs",
    "chars": 171,
    "preview": "//! Source code locations and diagnostic reporting that can be shared\n//! across different projects\npub mod arena;\npub m"
  },
  {
    "path": "util/src/span.rs",
    "chars": 3639,
    "preview": "//! Source code locations and spans\n\nuse std::fmt;\n\n#[derive(Copy, Clone, Debug, PartialEq, PartialOrd, Default)]\n/// St"
  },
  {
    "path": "util/src/unsafe_arena.rs",
    "chars": 9483,
    "preview": "//! A fast and efficient typed arena\n//!\n//! Translated from rustc's TypedArena into stable rust\n//!\n//! https://doc.rus"
  },
  {
    "path": "x1_bidir/Cargo.toml",
    "chars": 224,
    "preview": "[package]\nname = \"bidir\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n# See mor"
  },
  {
    "path": "x1_bidir/src/helpers.rs",
    "chars": 5101,
    "preview": "use super::*;\n\nmacro_rules! var {\n    ($x:expr) => {\n        Expr::Var($x)\n    };\n}\n\nmacro_rules! app {\n    ($x:expr, $y"
  },
  {
    "path": "x1_bidir/src/main.rs",
    "chars": 33244,
    "preview": "//! \"Complete and Easy Bidirectional Typechecking for Higher-Rank Polymorphism\"\n//! Paper by J. Dunfield and N. Krishnas"
  },
  {
    "path": "x2_dependent/Cargo.toml",
    "chars": 228,
    "preview": "[package]\nname = \"dependent\"\nversion = \"0.1.0\"\nauthors = [\"Michael Lazear <lazear@scripps.edu>\"]\nedition = \"2018\"\n\n# See"
  },
  {
    "path": "x2_dependent/src/main.rs",
    "chars": 6187,
    "preview": "#[derive(Debug, Clone, PartialEq)]\nenum Term {\n    Universe(usize),\n    Nat,\n    Var(usize),\n    Int(usize),\n    App(Box"
  }
]

About this extraction

This page contains the full source code of the lazear/types-and-programming-languages GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 97 files (427.6 KB), approximately 111.7k tokens, and a symbol index with 1012 extracted functions, classes, methods, constants, and types.

Extracted by GitExtract.
