Full Code of teikalang/teika for AI

Repository: teikalang/teika
Branch: main
Commit: 8aba5452dda1
Files: 80
Total size: 177.1 KB

Directory structure:
gitextract_2gsoytl5/

├── .envrc
├── .gitignore
├── .ocamlformat
├── LICENSE
├── README.md
├── design/
│   ├── GOALS.md
│   ├── INFERENCE.md
│   ├── LANGUAGE.md
│   ├── MODULE.md
│   └── SYNTAX.md
├── dune-project
├── flake.nix
├── jsend/
│   ├── dune
│   ├── emit.ml
│   ├── emit.mli
│   ├── jprinter.ml
│   ├── jprinter.mli
│   ├── jtree.ml
│   ├── jtree.mli
│   ├── test.ml
│   ├── untype.ml
│   ├── untype.mli
│   ├── utree.ml
│   ├── utree.mli
│   ├── var.ml
│   └── var.mli
├── nix/
│   ├── default.nix
│   └── shell.nix
├── smol/
│   ├── HACKING.md
│   ├── dune
│   ├── index.ml
│   ├── index.mli
│   ├── level.ml
│   ├── level.mli
│   ├── stree.ml
│   ├── stree.mli
│   ├── styper.ml
│   ├── test.ml
│   └── test.mli
├── syntax/
│   ├── clexer.ml
│   ├── clexer.mli
│   ├── cparser.mly
│   ├── ctree.ml
│   ├── ctree.mli
│   ├── dune
│   └── test.ml
├── teika/
│   ├── dune
│   ├── solve.ml
│   ├── solve.mli
│   ├── terror.ml
│   ├── terror.mli
│   ├── test.ml
│   ├── test.mli
│   ├── tprinter.ml
│   ├── tprinter.mli
│   ├── ttree.ml
│   ├── ttree.mli
│   ├── typer.ml
│   └── typer.mli
├── teikalsp/
│   ├── dune
│   ├── lsp_channel.ml
│   ├── lsp_channel.mli
│   ├── lsp_context.ml
│   ├── lsp_context.mli
│   ├── lsp_error.ml
│   ├── lsp_error.mli
│   ├── lsp_notification.ml
│   ├── lsp_notification.mli
│   ├── lsp_request.ml
│   ├── lsp_request.mli
│   ├── lsp_text_document.ml
│   ├── lsp_text_document.mli
│   ├── teikalsp.ml
│   └── teikalsp.mli
├── teikavsc/
│   ├── main.ts
│   ├── package.json
│   └── tsconfig.json
└── utils/
    ├── dune
    ├── utils.ml
    └── utils.mli

================================================
FILE CONTENTS
================================================

================================================
FILE: .envrc
================================================
use_flake


================================================
FILE: .gitignore
================================================
_build
node_modules
.direnv
dist


================================================
FILE: .ocamlformat
================================================
version = unknown


================================================
FILE: LICENSE
================================================
MIT License

Copyright (c) 2022 Eduardo Rafael

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


================================================
FILE: README.md
================================================
# Teika

Teika is a functional programming language. Same pronunciation as in "take a break" or, if you prefer, "teika break".

## WIP

This is very much a work in progress, so it is accepted and even expected that the main branch may be broken at any point.


================================================
FILE: design/GOALS.md
================================================
# Goals

This should document the goals of the project, the why, and the tradeoffs.

## Assumptions

1. Most code will be read more times than written
2. Most code will be read by proficient developers
3. Most code will be written by proficient developers
4. Most code will be simple even if powerful features exist
5. Beginner developers are only temporarily beginners
6. Tooling can be made to help beginner developers

## Direct

Indirection makes code more complex; a language should be direct. Functions should be known statically and data indirections should be optimized away.

## Succinct

Only local information should be syntactically provided; a language should be succinct. Noise should be avoided and contextual information should be provided on demand.

## Powerful

Users should be able to describe abstract and efficient code while still being able to reason about it locally; a language should be powerful. Effects should be tracked and mutation controlled.

## Flexible

Hacks were needed in the past, are needed today, and will be needed in the future; users will need to hack code, so a language should be flexible. Abstractions will restrict code; tooling should make it flexible again.


================================================
FILE: design/INFERENCE.md
================================================
# Inference

Teika intends to be an ML-like language, which means inference is a must.

## HM inference

The basic kind of inference present in Teika is Hindley-Milner inference, which essentially assumes that all parameters are monomorphic (no quantification); when a weak variable (a variable that is not constrained) escapes its scope, an implicit forall is added.

It is quite limited but relatively simple to understand, and good visualizations can be developed for it.
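
As a hypothetical sketch of how this could look in Teika (the `{A}` implicit-argument syntax is borrowed from SYNTAX.md, and the inferred types shown are illustrative, not output from the actual typer):

```rust
// x gets a weak variable _A; nothing constrains it,
// so it escapes its scope and an implicit forall is added
id = x => x;
// id : {A} -> (x : A) -> A

// here the call `f 1` constrains f's parameter to Int,
// so no generalization happens
apply_one = f => f 1;
// apply_one : (f : (x : Int) -> Int) -> Int
```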

### Higher Kinded Types

There are a couple of decisions to be made on inference for higher kinded types.

```rust
// When inferring, the following two types are possible
f = T => (x: T Int) => x;
// the easy one, here we follow the fact that Int is a type
f = (T: _ -> _) => (x: T Int) => x; // T is an arrow
f = (T: #(Int) -> _) => (x: T Int) => x; // T param is the type Int
f = (T: #(Int) -> *) => (x: T Int) => x; // unify with x
// the logical one, consider its kind
f = T => (x: T (Int: *)) => x; // Int has kind *
f = (T: _ -> _) => (x: T (Int: *)) => x; // T is an arrow
f = (T: * -> _) => (x: T (Int: *)) => x; // T param is the kind *
f = (T: * -> *) => (x: T (Int: *)) => x; // unify with x
```

I decided to go with the second one because, in the same way that calling a function with a value infers the value's type, calling a function with a type infers the type's kind. If you write `f => f 1`, the inferred type is not `{A} -> (f : 1 -> A) -> f 1`; instead `f` is inferred as `Int -> Int`.


================================================
FILE: design/LANGUAGE.md
================================================
# Teika Language

This document intends to describe the Teika Language, which refers to all the features supported by Teika but not exactly how they will be presented to a user.

## Goals

Those are the current goals in order of importance, and they may change.

1. Soundness, type preservation
2. Convenience, inference and effects
3. Uniformity, calculus of construction
4. Logical consistency, predicativity
5. Decidability, subtyping

## Warning

This document was written by me (@EduardoRFS) and I'm not proficient in type theory, so mistakes and changes are very likely to happen here.

## Smol

This is the canonical language and should contain all the theoretical power of Teika; everything else should be describable in terms of the following features.

```rust
// Type in Type
(Type : Type);

// variable
(A);

// forall
((A : Type) -> (x : A) -> A);
// lambda
((A : Type) => (x : A) => x);
// apply
(id Nat 1);

// exists
(A : Type, A);
// pair
(A = Nat, 1 : A);
// split
((A, x) = p; x);

// equal
Equal : (A : Type) -> (x : A) -> (y : A) -> Type;
// refl
Refl : (A : Type) -> (x : A) -> Equal A x x;
// subst
Subst :
  (A : Type) ->
  (x : A) ->
  (y : A) ->
  (x_eq_y : Equal A x y) ->
  (P : (x_or_y : A) -> Type) ->
  (p_x : P x) ->
  P y;
```

### Forall, Lambda and Apply

This is abstraction and can be seen as universal quantification. Those are functions; they're your workhorse, and the majority of your code will likely be about declaring and calling them.

#### Forall

The forall is the description of a function: it describes the type of the parameter and the type of the return, and the return may depend on the parameter, such as in polymorphic functions and dependent functions. Those are values.

```rust
/* forall syntax */
((A : T) -> A);

/* rule */
A : Type  B : Type
------------------
  (x : A) -> B
```

#### Lambda

The lambda is the way to introduce a function; the type of a lambda will always be a forall. The body may depend on the parameter. Those are values.

```rust
/* lambda syntax */
((A : Type) => A);

/* rule */
           b : B
---------------------------
(x : A) => b : (x : A) -> B
```

#### Apply

This is the source of computation: the left side is anything of an arrow type and the right side must have the same type as the parameter expected by the lambda.

```rust
/* apply syntax */
(lambda argument);

/* rule */
l : (x : A) -> B  a : A
-----------------------
      l a : B
```

### Exists, Pair and Split

This allows pairing and can be seen as existential quantification. Those are pairs; they're the fundamental way to pack data together, and all of your modules can be represented using them.

#### Exists

This is the description of pairs: it describes the type of the left value and the type of the right value, where the type of the right value may depend on the left value. Those are values.

```rust
/* exists syntax */
(A : Type, A);

/* rule */
A : Type  B : Type
------------------
   (x : A, B)
```

#### Pair

This is how you introduce a pair; the type of a pair will always be an exists. The type of the right side may depend on the left side, but the value itself cannot depend on the left side. Those are values.

```rust
/* pair syntax */
(A = Nat, 1 : A)

/* rule */
      l : A   r : B
---------------------------
(x = l, r : B) : (x : A, B)
```

#### Split

This is how you destruct a pair; it is like a let, but it can extract pairs. Both the body and the type of the body may depend on the pair's components.

```rust
/* split syntax */
((A, one) = p; A);

/* rule */
p : (x : A, B)  b : C
---------------------
((x, y) = p; b) : C
```

### Equal, Refl and Subst

This is Leibniz equality; it can be used as a way to do some form of dependent elimination. It hopefully only matters as a stepping stone for building libraries.

#### Equal

This is the type of an equality: it states that the first and second values are literally the same, allowing one to be replaced by the other.

#### Refl

This is the introduction of an equality; it is the only way to construct an equality, and any equality is equivalent to it.
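
As an illustrative sketch (using `Nat` and the literal `1` informally, as in the Smol examples above):

```rust
// the only way to prove Equal Nat 1 1 is Refl
one_eq_one = Refl Nat 1;
(one_eq_one : Equal Nat 1 1);
```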

#### Subst

This is how you can eliminate an equality; it is enough to derive all the other transformations that are desirable for equalities.

```rust
/* sym */
Sym =
  (A : Type) =>
  (B : Type) =>
  (A_eq_B : Equal Type A B) =>
    Subst
      Type
      A
      B
      A_eq_B
      ((y : Type) => Equal Type y A)
      (Refl Type A);
/* trans */
Trans =
  (A : Type) =>
  (B : Type) =>
  (C : Type) =>
  (A_eq_B : Equal Type A B) =>
  (B_eq_C : Equal Type B C) =>
    Subst
      Type
      B
      C
      B_eq_C
      ((y : Type) => Equal Type A y)
      A_eq_B;
```


================================================
FILE: design/MODULE.md
================================================
# Module

This intends to document the behavior and features of modules.

## Implicit type

All structures contain an implicit type which by default is abstract both internally and externally; it can be assigned internally.

This behaves similarly to object-oriented languages like Java, but the goal is to achieve an interface design similar to the `Module.t` convention in OCaml.

<!-- TODO: syntax for accessing internal type -->

### Implicit type alias

When assigning a structure to a value, the implicit type has an internal alias so that users can have consistent type naming internally and externally.

<!-- TODO: should this be available when doing (F { x }), no this makes changing the ident a breaking change -->

While this is not always available, it can be used in most cases.

Example:

```rust
Amount = {
  // alias
  of_nat: Nat -> Amount;
  add: Amount -> Amount -> Amount;
};
```


================================================
FILE: design/SYNTAX.md
================================================
# Syntax

This should document the syntax of the project and its tradeoffs.

## Requirements

<!-- TODO: technically modules and types are fused  -->

The syntax needs to be able to describe four classes of terms: modules, expressions, types and patterns.

The syntax should also be consistent and succinct; this requires the syntax's meaning to be context dependent.

The syntax should be simple so that it can easily be manipulated by users, allowing tools like macros and ppx to be trivially implemented.

## Unified Representation

To avoid having too much notation, a unified representation was chosen; this means that all classes of terms have an identical syntax.

This is achieved by accepting a lot more code during parsing and rejecting this code later.

Another problem is the need for parens in many situations, such as typing a simple binding. This is avoided by allowing parens to be omitted in a couple of places in an ad-hoc manner.
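
As an illustrative example of one such omission (this is one possible reading of the binding-signature form in the grammar of this section, not a committed design):

```rust
// fully parenthesized, a typed binding would need
(x : Int) = 1;
// with the ad-hoc omission, the signature can stand alone
x : Int;
x = 1;
```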

Pros:

- AST is really small, making macros easier
- Syntax error messages are much easier
- Error recovery is way easier
- Flexible, ppx can make use of the invalid syntax

Cons:

- Invalid code may be parsed
- Not clear where parens are needed
- No syntactic indication of the current context

```rust
Syntax =
  | Identifier // variable
  | Number // number
  | Syntax -> Syntax // arrow
  | Syntax => Syntax // lambda
  | Syntax Syntax // apply
  | Syntax = Syntax;
  | Syntax = Syntax; Syntax // binding
  | Syntax : Syntax;
  | Syntax : Syntax; Syntax // binding signature
  | Syntax Syntax[] = Syntax;
  | Syntax Syntax[] = Syntax; Syntax // binding + lambda
  | { Syntax } // structure
  | Syntax : Syntax // constraint
  | Syntax.Syntax // field
  | (Syntax) // parens
```

## Arrow and Lambda

The entire language can be described using lambda only; in fact, arrow can only describe a subset of the language.

But adding arrow allows easier inference and a simpler syntax for describing common types.
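
As an illustrative sketch of the relationship (the desugaring shown here is an assumption about the design, not a committed rule):

```rust
// when the return type does not depend on the parameter,
// the plain arrow form suffices
(incr : Nat -> Nat);
// which could be read as a forall whose parameter is unused
(incr : (_ : Nat) -> Nat);
// a dependent return still requires the full forall form
(id : (A : Type) -> (x : A) -> A);
```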

## Implicit Argument

Initially `A. Syntax` was the planned way to do implicit parameters, but this led to a couple of weird syntax properties, such as `A.Eq` and `A. Eq` being two different things.

That also completely prevented the binding + function syntax `id x = x`, which may be desirable in the future.

So for implicit arguments the syntax currently used is `{A: M}`, which was an already supported syntax.

But this makes it ambiguous with record destructuring; currently this means that when destructuring a record with a single field, the last semicolon cannot be omitted: `{A} -> A` means `forall a. a` but `{A;} -> A` means `{ A: _ } -> A`.

- https://agda.readthedocs.io/en/v2.6.1/language/implicit-arguments.html

Another option for implicit arguments is using syntax from optional parameters, `?A -> A`.

## Binding Lambda

A common feature in languages such as OCaml and Haskell is a fused syntax for lambdas and bindings, of the form `add a b = a + b`.

The advantage of this is that it's far more succinct for most functions, and it's a common feature in other programming languages.

The disadvantage is that it's not straightforward to explain to users that `f = x => x` is the same as `f x = x`; it also doesn't work with binding + constraint, such as `x: Int -> Int = x => x`.

This was decided to be a reasonable choice due to type constructors, and because constraint + lambda should not be common.

```rust
Packet {A} = {
  id: Nat;
  data: A;
};
```


================================================
FILE: dune-project
================================================
(lang dune 2.9)

(using menhir 2.0)


================================================
FILE: flake.nix
================================================
{
  description = "Nix Flake";

  inputs = {
    nixpkgs.url = "github:anmonteiro/nix-overlays";
    nix-filter.url = "github:numtide/nix-filter";
    flake-utils.url = "github:numtide/flake-utils";
  };

  outputs = { self, nixpkgs, nix-filter, flake-utils }:
    flake-utils.lib.eachDefaultSystem (system:
      let pkgs = (nixpkgs.makePkgs {
        inherit system;
      }).extend (self: super: {
        ocamlPackages = super.ocaml-ng.ocamlPackages_5_3;
      }); in
      let teika = pkgs.callPackage ./nix {
        inherit nix-filter;
        doCheck = true;
      }; in
      rec {
        packages = { inherit teika; };
        devShell = import ./nix/shell.nix { inherit pkgs teika; };
      });
}


================================================
FILE: jsend/dune
================================================
(library
 (name jsend)
 (libraries teika)
 (modules
  (:standard \ Test))
 (preprocess
  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))

(executable
 (name test)
 (modules Test)
 (libraries alcotest jsend)
 (preprocess
  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))

(rule
 (alias runtest)
 (deps
  (:exe ./test.exe))
 (action
  (run %{exe})))


================================================
FILE: jsend/emit.ml
================================================
open Utree
open Jtree

let emit_curry function_ =
  JE_call { lambda = JE_var { var = Var.curry }; args = [ function_ ] }

let rec emit_term : Utree.term -> expression =
 fun term ->
  match term with
  (* TODO: sourcemap *)
  | UT_loc { term; loc = _ } -> emit_term term
  | UT_var { var } -> JE_var { var }
  | UT_lambda _ ->
      (* TODO: weird to ignore UT_lambda like this *)
      emit_curry @@ emit_generator ~params:[] term
  | UT_apply _ ->
      (* TODO: weird to ignore UT_apply like this *)
      let call = emit_call ~args:[] term in
      (* TODO: test optimization, if instanceof before yield *)
      JE_yield { expression = call }
  | UT_let _ ->
      (* TODO: weird to ignore UT_let like this *)
      let block = emit_block ~consts:[] term in
      let wrapper = JE_generator { params = []; block } in
      let call = JE_call { lambda = wrapper; args = [] } in
      JE_yield { expression = call }
  | UT_string { literal } -> JE_string { literal }
  | UT_external { external_ } -> translate_external external_

and emit_generator ~params return =
  (* TODO: is this transformation desired?
      Does it change performance behaviour? *)
  (* TODO: too many params *)
  match return with
  | UT_loc { term = return; loc = _ } -> emit_generator ~params return
  | UT_lambda { param; return } ->
      let params = param :: params in
      emit_generator ~params return
  | UT_var _ | UT_apply _ | UT_let _ | UT_string _ | UT_external _ ->
      let params = List.rev params in
      let block = emit_block ~consts:[] return in
      JE_generator { params; block }

and emit_call ~args lambda =
  (* TODO: too many args? *)
  match lambda with
  | UT_loc { term = lambda; loc = _ } -> emit_call ~args lambda
  | UT_apply { lambda; arg } ->
      let arg = emit_term arg in
      let args = arg :: args in
      emit_call ~args lambda
  | UT_var _ | UT_lambda _ | UT_let _ | UT_string _ | UT_external _ ->
      let lambda = emit_term lambda in
      JE_call { lambda; args }

and emit_block ~consts return =
  match return with
  | UT_loc { term = return; loc = _ } -> emit_block ~consts return
  | UT_let { var; value; return } ->
      let value = emit_term value in
      let consts = (var, value) :: consts in
      emit_block ~consts return
  | UT_apply _ ->
      (* tco *)
      let return =
        let return = emit_call ~args:[] return in
        let constructor =
          JE_call { lambda = JE_var { var = Var.jmp }; args = [ return ] }
        in
        JE_new { constructor }
      in
      let consts = List.rev consts in
      JBlock { consts; return }
  | UT_var _ | UT_lambda _ | UT_string _ | UT_external _ ->
      let return = emit_term return in
      let consts = List.rev consts in
      JBlock { consts; return }

and translate_external : external_ -> expression =
 fun external_ ->
  let var =
    match external_ with
    | UE_type -> Var.type_
    | UE_fix -> Var.fix
    | UE_unit -> Var.unit
    | UE_debug -> Var.debug
  in
  JE_var { var }


================================================
FILE: jsend/emit.mli
================================================
val emit_term : Utree.term -> Jtree.expression


================================================
FILE: jsend/jprinter.ml
================================================
open Jtree
open Format

(* TODO: identation *)
let pp_block_syntax ~pp_wrapped_expression fmt block =
  let (JBlock { consts; return }) = block in
  List.iter
    (fun (var, value) ->
      fprintf fmt "const %a = %a;" Var.pp var pp_wrapped_expression value)
    consts;
  fprintf fmt "return %a;" pp_wrapped_expression return

let rec pp_expression_syntax ~pp_wrapped ~pp_call ~pp_atom ~pp_block fmt
    expression =
  let pp_expression_syntax fmt expression =
    pp_expression_syntax ~pp_wrapped ~pp_call ~pp_atom ~pp_block fmt expression
  in
  match expression with
  | JE_loc { expression; loc = _ } -> pp_expression_syntax fmt expression
  | JE_var { var } -> Var.pp fmt var
  | JE_generator { params; block } ->
      (* TODO: names on functions? *)
      let rec pp_params fmt params =
        match params with
        | [] -> ()
        | [ param ] -> fprintf fmt "%a" Var.pp param
        | param :: params -> fprintf fmt "%a, %a" Var.pp param pp_params params
      in
      fprintf fmt "function* (%a) { %a }" pp_params params pp_block block
  (* TODO: new precedence is the same as call? *)
  | JE_new { constructor } -> fprintf fmt "new %a" pp_call constructor
  | JE_call { lambda; args } ->
      (* TODO: almost duplicated from params *)
      let rec pp_args fmt args =
        match args with
        | [] -> ()
        | [ arg ] -> fprintf fmt "%a" pp_wrapped arg
        | arg :: args -> fprintf fmt "%a, %a" pp_wrapped arg pp_args args
      in
      fprintf fmt "%a(%a)" pp_call lambda pp_args args
  | JE_yield { expression } -> fprintf fmt "yield %a" pp_call expression
  | JE_string { literal } ->
      (* TODO: proper JS escaping *)
      fprintf fmt "%S" literal

type prec = Wrapped | Call | Atom

let rec pp_expression prec fmt expression =
  let pp_wrapped fmt term = pp_expression Wrapped fmt term in
  let pp_call fmt term = pp_expression Call fmt term in
  let pp_atom fmt term = pp_expression Atom fmt term in
  let pp_block fmt block =
    pp_block_syntax ~pp_wrapped_expression:pp_wrapped fmt block
  in
  match (expression, prec) with
  | JE_loc { expression; loc = _ }, prec -> pp_expression prec fmt expression
  | (JE_var _ | JE_string _), (Wrapped | Call | Atom)
  | (JE_new _ | JE_call _), (Wrapped | Call)
  | (JE_generator _ | JE_yield _), Wrapped ->
      pp_expression_syntax ~pp_wrapped ~pp_call ~pp_atom ~pp_block fmt
        expression
  | (JE_new _ | JE_call _), Atom | (JE_generator _ | JE_yield _), (Call | Atom)
    ->
      fprintf fmt "(%a)" pp_wrapped expression

let pp_expression fmt expression = pp_expression Wrapped fmt expression


================================================
FILE: jsend/jprinter.mli
================================================
val pp_expression : Format.formatter -> Jtree.expression -> unit



================================================
FILE: jsend/jtree.ml
================================================
type expression =
  | JE_loc of { expression : expression; loc : Location.t }
  | JE_var of { var : Var.t }
  | JE_generator of { params : Var.t list; block : block }
  | JE_new of { constructor : expression }
  | JE_call of { lambda : expression; args : expression list }
  | JE_yield of { expression : expression }
  | JE_string of { literal : string }

and block =
  | JBlock of { consts : (Var.t * expression) list; return : expression }


================================================
FILE: jsend/jtree.mli
================================================
type expression =
  | JE_loc of { expression : expression; loc : Location.t }
  | JE_var of { var : Var.t }
  | JE_generator of { params : Var.t list; block : block }
  (* TODO: not really a lambda and arg *)
  | JE_new of { constructor : expression }
  | JE_call of { lambda : expression; args : expression list }
  | JE_yield of { expression : expression }
  | JE_string of { literal : string }

and block =
  | JBlock of { consts : (Var.t * expression) list; return : expression }


================================================
FILE: jsend/test.ml
================================================
open Syntax
open Teika
open Jsend

let compile code =
  let term = Option.get @@ Clexer.from_string Cparser.term_opt code in
  (* TODO: locations *)
  let loc = Location.none in
  let term = Lparser.parse_term ~loc term in
  let term =
    match Typer.Infer.infer_term term with
    | Ok ttree -> ttree
    | Error error ->
        Format.eprintf "%a\n%!" Terror.pp error;
        failwith "infer"
  in

  let term = Untype.untype_term term in
  let term = Emit.emit_term term in
  Format.printf "%a\n\n%!" Jprinter.pp_expression term

let () = Printexc.record_backtrace true

let () =
  compile {|
      ((A : Type) => (x : A) => x) String "Hello World"
    |}

let () =
  compile
    {|
      noop = (u : (A : Type) -> (x : A) -> A) => u ((A : Type) -> (x : A) -> A) u;
      noop
    |}

(* let () =
     compile
       {|
         Unit = (A : Type) -> (x : A) -> A;
         (noop : (u : Unit) -> Unit) = u => u Unit u;
         noop
       |}

   let () =
     compile
       {|
         Bool = (A : Type) -> (t : A) -> (f : A) -> A;
         (true : Bool) = A => x => y => x;
         (false : Bool) = A => x => y => y;
         f = (bool : Bool) => @native("debug")(bool String "!!true" "!!false");
         f false
       |}

   let () =
     compile
       {|
           Nat = (A : Type) -> (z : A) ->
             (s : (x : A) -> A) -> (k : (x : A) -> A) -> A;
           (zero : Nat) = A => z => s => k => k z;
           (succ : (n : Nat) -> Nat) =
             n => A => z => s => k => n A z s (x => k (s x));
           (add : (n : Nat) -> (m : Nat) -> Nat) =
             n => m => n Nat m succ (x => x);
           (mul : (n : Nat) -> (m : Nat) -> Nat) =
             n => m => n Nat zero (add m) (x => x);
           one = succ zero;
           two = succ one;
           four = mul two two;
           eight = mul two four;
           sixteen = mul two eight;
           byte = mul sixteen sixteen;
           short = mul byte byte;
           short String "zero" (_ => @native("debug")("hello")) (x => x)
         |} *)


================================================
FILE: jsend/untype.ml
================================================
open Syntax
open Teika
open Ttree
open Utree

exception Term_subst_found
exception Term_shift_found
exception Invalid_variable

let type_term : term = UT_external { external_ = UE_type }
(* let fix_term : term = UT_external { external_ = UE_fix }
   let unit_term : term = UT_external { external_ = UE_unit }
   let debug_term : term = UT_external { external_ = UE_debug } *)

let next_level ~vars =
  let current_level =
    match Level.Map.max_binding_opt vars with
    | Some (current_level, _) -> current_level
    | None ->
        (* TODO: this is weird *)
        level_type_string
  in
  Level.next current_level

module Context : sig
  type 'a context

  val run : (unit -> 'a context) -> 'a
  val return : 'a -> 'a context
  val ( let* ) : 'a context -> ('a -> 'b context) -> 'b context
  val ( let+ ) : 'a context -> ('a -> 'b) -> 'b context
  val with_var : Name.t -> (Var.t -> 'k context) -> 'k context
  val lookup : Level.t -> Var.t context
end = struct
  type 'a context = vars:Var.t Level.Map.t -> 'a

  let run context =
    let vars =
      let open Level.Map in
      let vars = empty in
      (* TODO: string is also a $type *)
      let vars = add level_type_univ Var.type_ vars in
      let vars = add level_type_string Var.type_ vars in
      vars
    in
    context () ~vars

  let return x ~vars:_ = x
  let ( let* ) context k ~vars = k (context ~vars) ~vars
  let ( let+ ) context k ~vars = k (context ~vars)

  let with_var name k ~vars =
    let var = Var.create name in
    let level = next_level ~vars in
    let vars = Level.Map.add level var vars in
    k var ~vars

  let lookup level ~vars =
    match Level.Map.find_opt level vars with
    | Some var -> var
    | None -> raise Invalid_variable
end

open Context

let rec untype_term term =
  match term with
  | TT_with_type { term; type_ = _ } -> untype_term term
  | TT_with_sort { term } ->
      (* TODO: should also not be reachable? *)
      untype_term term
  | TT_subst _ -> raise Term_subst_found
  | TT_shift _ -> raise Term_shift_found
  | TT_var { var } ->
      let+ var = lookup var in
      UT_var { var }
  | TT_forall _ -> return type_term
  | TT_lambda { param; return } ->
      let+ param, return =
        erase_pat param @@ fun var ->
        let+ return = untype_term return in
        (var, return)
      in
      UT_lambda { param; return }
  | TT_apply { lambda; arg } ->
      let* lambda = untype_term lambda in
      let+ arg = untype_term arg in
      UT_apply { lambda; arg }
  | TT_let { bound; value; return } ->
      (* TODO: param first *)
      let* value = untype_term value in
      let+ var, return =
        erase_pat bound @@ fun var ->
        let+ return = untype_term return in
        (var, return)
      in
      UT_let { var; value; return }
  | TT_annot { term; annot = _ } -> untype_term term
  | TT_string { literal } -> return @@ UT_string { literal }

and erase_pat pat k =
  match pat with
  | TP_with_type { pat; type_ = _ } -> erase_pat pat k
  | TP_annot { pat; annot = _ } -> erase_pat pat k
  | TP_var { name } -> with_var name k

let untype_term term = Context.run @@ fun () -> untype_term term


================================================
FILE: jsend/untype.mli
================================================
open Teika

val untype_term : Ttree.term -> Utree.term


================================================
FILE: jsend/utree.ml
================================================
type term =
  | UT_loc of { term : term; loc : Location.t }
  | UT_var of { var : Var.t }
  | UT_lambda of { param : Var.t; return : term }
  | UT_apply of { lambda : term; arg : term }
  | UT_let of { var : Var.t; value : term; return : term }
  | UT_string of { literal : string }
  | UT_external of { external_ : external_ }

and external_ = UE_type | UE_fix | UE_unit | UE_debug


================================================
FILE: jsend/utree.mli
================================================
type term =
  (* TODO: why is loc a term? *)
  | UT_loc of { term : term; loc : Location.t }
  | UT_var of { var : Var.t }
  (* TODO: patterns in the Itree? *)
  | UT_lambda of { param : Var.t; return : term }
  | UT_apply of { lambda : term; arg : term }
  | UT_let of { var : Var.t; value : term; return : term }
  | UT_string of { literal : string }
  | UT_external of { external_ : external_ }

and external_ = UE_type | UE_fix | UE_unit | UE_debug


================================================
FILE: jsend/var.ml
================================================
open Utils

module Id : sig
  type t [@@deriving show]

  val next : unit -> t
  val equal : t -> t -> bool
  val compare : t -> t -> int
end = struct
  type t = int [@@deriving show]

  let acc = Atomic.make 0
  let next () = Atomic.fetch_and_add acc 1
  let equal = Int.equal
  let compare = Int.compare
end

let _ = Id.show

type var_kind = Global | Scoped
type var = { id : Id.t; name : Name.t; kind : var_kind }
type t = var

let pp fmt var =
  let { id; name; kind } = var in
  match kind with
  | Global -> Format.fprintf fmt "%s" (Name.repr name)
  | Scoped -> Format.fprintf fmt "%s$%a" (Name.repr name) Id.pp id

let show var = Format.asprintf "%a" pp var

let create_any kind name =
  let id = Id.next () in
  { id; name; kind }

let create name = create_any Scoped name

let predef name =
  let name = Name.make name in
  create_any Global name

let equal a b =
  let { id = a; name = _; kind = _ } = a in
  let { id = b; name = _; kind = _ } = b in
  Id.equal a b

let compare a b =
  let { id = a; name = _; kind = _ } = a in
  let { id = b; name = _; kind = _ } = b in
  Id.compare a b

let name var =
  let { id = _; name; kind = _ } = var in
  name

(* TODO: those should be checked somewhere *)
let type_ = predef "$type"
let fix = predef "$fix"
let unit = predef "$unit"
let debug = predef "$debug"
let curry = predef "$curry"
let jmp = predef "$jmp"

module Map = Map.Make (struct
  type t = var

  let compare = compare
end)


================================================
FILE: jsend/var.mli
================================================
open Utils

type var
type t = var [@@deriving show]

val create : Name.t -> var
val equal : var -> var -> bool
val compare : var -> var -> int
val name : var -> Name.t

(* predefined *)
val type_ : var
val fix : var
val unit : var
val debug : var
val curry : var
val jmp : var

module Map : Map.S with type key = t


================================================
FILE: nix/default.nix
================================================
{ pkgs, doCheck ? true, nix-filter }:

let inherit (pkgs) lib stdenv ocamlPackages; in

with ocamlPackages; buildDunePackage rec {
  pname = "teika";
  version = "0.0.0-dev";

  src = with nix-filter.lib;
    filter {
      root = ./..;
      include = [
        "dune-project"
        "smol"
        "teika"
      ];
      exclude = [ ];
    };

  propagatedBuildInputs = [
    menhir
    menhirLib
    sedlex
    ppx_deriving
    eio
    eio_main
    ppx_sexp_conv
    zarith
    lsp
  ]
  # checkInputs appear here because, when cross compiling, dune needs the test
  # dependencies, but they are not available during the build phase. The issue
  # can be reproduced by adding strictDeps = true;.
  ++ checkInputs;

  checkInputs = [ alcotest ];
}


================================================
FILE: nix/shell.nix
================================================
{ pkgs, teika }:

with pkgs; with ocamlPackages; mkShell {
  inputsFrom = [ teika ];
  packages = [
    # Make developer life easier
    # formatters
    nixfmt
    # ocamlformat
    ocamlformat
    # OCaml developer tooling
    ocaml
    dune_3
    ocaml-lsp
    utop
  ];
}


================================================
FILE: smol/HACKING.md
================================================
# Smol Frontend

## Optimizations

### Explicit Substitutions

This uses explicit substitutions to achieve laziness, similar to λυ.

Smol likely doesn't need metavariables, since no unification exists; additionally, substitutions are currently not used for equality checking.

The failure mode is also conservative: it may reject valid terms, but it will not accept invalid ones, which seems acceptable.

- https://drops.dagstuhl.de/opus/volltexte/2014/4858/pdf/34.pdf
- https://www.irif.fr/~kesner/papers/springer-csl07.pdf
- https://hal.inria.fr/inria-00074197/document
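
The environment-based sketch below (hypothetical names, not the actual Smol implementation, which uses open/close over de Bruijn indices and levels) illustrates the laziness that explicit substitutions buy: a beta step records the argument in a delayed environment instead of eagerly rewriting the body.

```ocaml
(* Hypothetical illustration only: a closure pairs a term with a pending
   environment, so substitutions are pushed inward on demand. *)
type term = Var of int (* de Bruijn index *) | Lam of term | App of term * term
type value = Closure of term * env
and env = value list

(* Reduce to weak head normal form; substitutions stay pending in [env]. *)
let rec whnf t env =
  match t with
  | Var n -> (
      match List.nth_opt env n with
      | Some (Closure (t', env')) -> whnf t' env'
      | None -> Closure (t, env))
  | Lam _ -> Closure (t, env)
  | App (f, a) -> (
      match whnf f env with
      | Closure (Lam body, env') ->
          (* beta: record the (still unevaluated) argument; the body is
             not traversed here, which is where the laziness comes from *)
          whnf body (Closure (a, env) :: env')
      (* approximation: this sketch drops the argument of a stuck application *)
      | stuck -> stuck)

let () =
  (* (λ.0) (λ.0) reduces to λ.0 with an empty pending environment *)
  assert (whnf (App (Lam (Var 0), Lam (Var 0))) [] = Closure (Lam (Var 0), []))
```

Calculi such as λυ go further by making the substitution a constructor of the term syntax itself; the environment above plays the same role.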


================================================
FILE: smol/dune
================================================
(library
 (name smol)
 (libraries syntax)
 (modules
  (:standard \ Test))
 (preprocess
  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord sedlex.ppx)))

(executable
 (name test)
 (modules Test)
 (libraries alcotest smol)
 (preprocess
  (pps ppx_deriving.show)))

(rule
 (alias runtest)
 (deps
  (:exe ./test.exe))
 (action
  (run %{exe})))


================================================
FILE: smol/index.ml
================================================
type index = int
and t = index [@@deriving show, eq]

let zero = 0
let one = 1
let previous x = match x > 0 with true -> Some (x - 1) | false -> None
(* TODO: overflow detection *)

let next x = x + 1

let of_int x =
  match x >= 0 with
  | true -> x
  | false -> raise (Invalid_argument "index must not be negative")

let repr x = x
let ( < ) (a : index) (b : index) = a < b
let ( > ) (a : index) (b : index) = a > b


================================================
FILE: smol/index.mli
================================================
type index
type t = index [@@deriving show, eq]

val zero : index
val one : index
val previous : index -> index option
val next : index -> index

(* repr *)
(* TODO: this API is non-ideal *)
val of_int : int -> index
val repr : index -> int

(* operations *)
val ( < ) : index -> index -> bool
val ( > ) : index -> index -> bool


================================================
FILE: smol/level.ml
================================================
type level = int
and t = level [@@deriving show, eq]

let zero = 0

(* TODO: check for overflows *)
let next n = n + 1
let offset ~from ~to_ = Index.of_int (to_ - from)
let ( < ) : level -> level -> bool = ( < )

module Map = Map.Make (Int)

================================================
FILE: smol/level.mli
================================================
type level
type t = level [@@deriving show, eq]

val zero : level
val next : level -> level
val offset : from:level -> to_:level -> Index.t
val ( < ) : level -> level -> bool

module Map : Map.S with type key = level


================================================
FILE: smol/stree.ml
================================================
type ty_term = ST_typed of { term : term; type_ : term }

and term =
  | ST_loc of { term : term; loc : Location.t [@opaque] }
  | ST_free_var of { level : Level.t }
  | ST_bound_var of { index : Index.t }
  | ST_forall of { param : ty_pat; return : term }
  | ST_lambda of { param : ty_pat; return : term }
  | ST_apply of { lambda : term; arg : term }
  (* TODO: self being only pat is weird *)
  | ST_self of { self : pat; body : term }
  | ST_fix of { self : ty_pat; body : term }
  | ST_unroll of { term : term }
  | ST_let of { bound : ty_pat; value : term; return : term }
  | ST_annot of { term : term; annot : term }

and ty_pat = SP_typed of { pat : pat; type_ : term }

and pat =
  | SP_loc of { pat : pat; loc : Location.t [@opaque] }
  (* TODO: extract Syntax.Name *)
  | SP_var of { var : Syntax.Name.t }
  | SP_erasable of { pat : pat }
  (* TODO: SP_unroll *)
  | SP_annot of { pat : pat; annot : term }
[@@deriving show]


================================================
FILE: smol/stree.mli
================================================
type ty_term = ST_typed of { term : term; type_ : term }

and term =
  | ST_loc of { term : term; loc : Location.t }
  | ST_free_var of { level : Level.t }
  | ST_bound_var of { index : Index.t }
  | ST_forall of { param : ty_pat; return : term }
  | ST_lambda of { param : ty_pat; return : term }
  | ST_apply of { lambda : term; arg : term }
  | ST_self of { self : pat; body : term }
  | ST_fix of { self : ty_pat; body : term }
  | ST_unroll of { term : term }
  | ST_let of { bound : ty_pat; value : term; return : term }
  | ST_annot of { term : term; annot : term }

and ty_pat = SP_typed of { pat : pat; type_ : term }

and pat =
  | SP_loc of { pat : pat; loc : Location.t }
  | SP_var of { var : Syntax.Name.t }
  | SP_erasable of { pat : pat }
  | SP_annot of { pat : pat; annot : term }
[@@deriving show]


================================================
FILE: smol/styper.ml
================================================
(* TODO: remove all failwith *)

module Error = struct
  open Syntax

  type error =
    | E_loc of { error : error; loc : Location.t }
    (* machinery *)
    | E_free_var_clash
    | E_bound_var_clash
    | E_type_clash
    | E_pattern_clash
    (* context *)
    | E_unknown_var of { var : Name.t }
    | E_variable_used of { var : Name.t }
    | E_variable_unused of { var : Name.t }
    | E_grades_invariant_violated
    | E_types_invariant_violated
    (* typer *)
    | E_unsupported_extensions
    | E_string_not_supported
    | E_missing_annotations
    | E_unroll_pattern_not_supported
    | E_expected_forall
    | E_expected_self

  exception Error of { error : error }

  let rec pp_error fmt error =
    let open Format in
    match error with
    | E_loc { error; loc = _ } -> pp_error fmt error
    | E_free_var_clash -> fprintf fmt "free var clash"
    | E_bound_var_clash -> fprintf fmt "bound var clash"
    | E_type_clash -> fprintf fmt "type clash"
    | E_pattern_clash -> fprintf fmt "pattern clash"
    | E_unknown_var { var } -> fprintf fmt "unknown variable: %a" Name.pp var
    (* TODO: show all other occurrences *)
    | E_variable_used { var } ->
        fprintf fmt "duplicated variable: %a" Name.pp var
    (* TODO: show error on pattern *)
    | E_variable_unused { var } ->
        fprintf fmt "variable not used: %a" Name.pp var
    | E_grades_invariant_violated ->
        fprintf fmt "compiler bug, grades invariant"
    | E_types_invariant_violated -> fprintf fmt "compiler bug, types invariant"
    | E_unsupported_extensions -> fprintf fmt "extensions are not supported"
    | E_string_not_supported -> fprintf fmt "strings are not supported"
    | E_missing_annotations -> fprintf fmt "not enough annotations here"
    | E_unroll_pattern_not_supported ->
        fprintf fmt "unroll patterns are not supported"
    | E_expected_forall -> fprintf fmt "expected a function"
    | E_expected_self -> fprintf fmt "expected a fixpoint"

  let pp_loc fmt loc =
    let open Format in
    (* TODO: loc ghost?*)
    let Location.{ loc_start; loc_end; loc_ghost = _ } = loc in
    fprintf fmt "[%d:%d .. %d:%d]" loc_start.pos_lnum
      (loc_start.pos_cnum - loc_start.pos_bol)
      loc_end.pos_lnum
      (loc_end.pos_cnum - loc_end.pos_bol)

  let rec pp_error_loc ~loc fmt error =
    let open Format in
    match error with
    | E_loc { error; loc = new_loc } -> (
        (* when the inner loc is none, keep the enclosing one *)
        match Location.is_none new_loc with
        | true -> pp_error_loc ~loc fmt error
        | false -> pp_error_loc ~loc:new_loc fmt error)
    | error -> (
        match Location.is_none loc with
        | true -> fprintf fmt "type error : %a" pp_error error
        | false -> fprintf fmt "type error at %a : %a" pp_loc loc pp_error error
        )

  let () =
    Printexc.register_printer @@ function
    | Error { error } ->
        Some (Format.asprintf "%a" (pp_error_loc ~loc:Location.none) error)
    | _ -> None

  let error error = raise (Error { error })
end

module Machinery = struct
  open Stree
  open Error

  let rec open_term ~from ~to_ term =
    let open_term ~from term = open_term ~from ~to_ term in
    let open_ty_pat ~from pat = open_ty_pat ~from ~to_ pat in
    let open_pat ~from pat = open_pat ~from ~to_ pat in
    match term with
    | ST_loc { term; loc } ->
        let term = open_term ~from term in
        ST_loc { term; loc }
    | ST_free_var { level } -> ST_free_var { level }
    | ST_bound_var { index } -> (
        match Index.equal from index with
        | true -> to_
        | false -> ST_bound_var { index })
    | ST_forall { param; return } ->
        let param = open_ty_pat ~from param in
        let return =
          (* TODO: what if pairs in patterns *)
          let from = Index.next from in
          open_term ~from return
        in
        ST_forall { param; return }
    | ST_lambda { param; return } ->
        let param = open_ty_pat ~from param in
        let return =
          let from = Index.next from in
          open_term ~from return
        in
        ST_lambda { param; return }
    | ST_apply { lambda; arg } ->
        let lambda = open_term ~from lambda in
        let arg = open_term ~from arg in
        ST_apply { lambda; arg }
    | ST_self { self; body } ->
        let self = open_pat ~from self in
        let body =
          let from = Index.next from in
          open_term ~from body
        in
        ST_self { self; body }
    | ST_fix { self; body } ->
        let self = open_ty_pat ~from self in
        let body =
          let from = Index.next from in
          open_term ~from body
        in
        ST_fix { self; body }
    | ST_unroll { term } ->
        let term = open_term ~from term in
        ST_unroll { term }
    | ST_let { bound; value; return } ->
        let bound = open_ty_pat ~from bound in
        let value = open_term ~from value in
        let return =
          let from = Index.next from in
          open_term ~from return
        in
        ST_let { bound; value; return }
    | ST_annot { term; annot } ->
        let term = open_term ~from term in
        let annot = open_term ~from annot in
        ST_annot { term; annot }

  and open_ty_pat ~from ~to_ pat =
    let (SP_typed { pat; type_ }) = pat in
    let pat = open_pat ~from ~to_ pat in
    let type_ = open_term ~from ~to_ type_ in
    SP_typed { pat; type_ }

  and open_pat ~from ~to_ pat =
    match pat with
    | SP_loc { pat; loc } ->
        let pat = open_pat ~from ~to_ pat in
        SP_loc { pat; loc }
    | SP_var { var } -> SP_var { var }
    | SP_erasable { pat } ->
        let pat = open_pat ~from ~to_ pat in
        SP_erasable { pat }
    | SP_annot { pat; annot } ->
        let pat = open_pat ~from ~to_ pat in
        let annot = open_term ~from ~to_ annot in
        SP_annot { pat; annot }

  let open_term ~to_ term = open_term ~from:Index.zero ~to_ term

  let rec close_term ~from ~to_ term =
    let close_term ~to_ term = close_term ~from ~to_ term in
    let close_ty_pat ~to_ pat = close_ty_pat ~from ~to_ pat in
    let close_pat ~to_ pat = close_pat ~from ~to_ pat in
    match term with
    | ST_loc { term; loc } ->
        let term = close_term ~to_ term in
        ST_loc { term; loc }
    | ST_free_var { level } -> (
        match Level.equal from level with
        | true -> ST_bound_var { index = to_ }
        | false -> ST_free_var { level })
    | ST_bound_var { index } -> ST_bound_var { index }
    | ST_forall { param; return } ->
        let param = close_ty_pat ~to_ param in
        let return =
          let to_ = Index.next to_ in
          close_term ~to_ return
        in
        ST_forall { param; return }
    | ST_lambda { param; return } ->
        let param = close_ty_pat ~to_ param in
        let return =
          let to_ = Index.next to_ in
          close_term ~to_ return
        in
        ST_lambda { param; return }
    | ST_apply { lambda; arg } ->
        let lambda = close_term ~to_ lambda in
        let arg = close_term ~to_ arg in
        ST_apply { lambda; arg }
    | ST_self { self; body } ->
        let self = close_pat ~to_ self in
        let body =
          let to_ = Index.next to_ in
          close_term ~to_ body
        in
        ST_self { self; body }
    | ST_fix { self; body } ->
        let self = close_ty_pat ~to_ self in
        let body =
          let to_ = Index.next to_ in
          close_term ~to_ body
        in
        ST_fix { self; body }
    | ST_unroll { term } ->
        let term = close_term ~to_ term in
        ST_unroll { term }
    | ST_let { bound; value; return } ->
        let bound = close_ty_pat ~to_ bound in
        let value = close_term ~to_ value in
        let return =
          let to_ = Index.next to_ in
          close_term ~to_ return
        in
        ST_let { bound; value; return }
    | ST_annot { term; annot } ->
        let term = close_term ~to_ term in
        let annot = close_term ~to_ annot in
        ST_annot { term; annot }

  and close_ty_pat ~from ~to_ pat =
    let (SP_typed { pat; type_ }) = pat in
    let pat = close_pat ~from ~to_ pat in
    let type_ = close_term ~from ~to_ type_ in
    SP_typed { pat; type_ }

  and close_pat ~from ~to_ pat =
    match pat with
    | SP_loc { pat; loc } ->
        let pat = close_pat ~from ~to_ pat in
        SP_loc { pat; loc }
    | SP_var { var } -> SP_var { var }
    | SP_erasable { pat } ->
        let pat = close_pat ~from ~to_ pat in
        SP_erasable { pat }
    | SP_annot { pat; annot } ->
        let pat = close_pat ~from ~to_ pat in
        let annot = close_term ~from ~to_ annot in
        SP_annot { pat; annot }

  (* TODO: expansion of unroll *)
  let rec expand_head_term term =
    match term with
    (* TODO: use this loc during equality?*)
    | ST_loc { term; loc = _ } -> expand_head_term term
    (* TODO: equality expansion *)
    | ST_free_var _ as term -> term
    | ST_bound_var _ as term -> term
    | ST_forall _ as term -> term
    | ST_lambda _ as term -> term
    | ST_apply { lambda; arg } -> (
        match expand_head_term lambda with
        (* TODO: use pattern when moving to subst *)
        | ST_lambda { param = _; return } -> open_term ~to_:arg return
        | lambda -> ST_apply { lambda; arg })
    | ST_self _ as term -> term
    | ST_fix _ as term -> term
    | ST_unroll _ as term -> term
    (* TODO: use pattern when moving to subst *)
    | ST_let { bound = _; value; return } ->
        expand_head_term @@ open_term ~to_:value return
    | ST_annot { term; annot = _ } -> expand_head_term term

  (* TODO: document multi step equality *)
  let rec equal_term ~received ~expected =
    match received == expected with
    | true -> ()
    | false -> equal_term_structural ~received ~expected

  and equal_term_structural ~received ~expected =
    let received = expand_head_term received in
    let expected = expand_head_term expected in
    match (received, expected) with
    (* TODO: locs? *)
    | ST_loc { term = received; loc = _ }, expected
    | received, ST_loc { term = expected; loc = _ } ->
        equal_term ~received ~expected
    | ST_free_var { level = received }, ST_free_var { level = expected } -> (
        match Level.equal received expected with
        | true -> ()
        | false -> error E_free_var_clash)
    | ST_bound_var { index = received }, ST_bound_var { index = expected } -> (
        match Index.equal received expected with
        | true -> ()
        | false -> error E_bound_var_clash)
    | ( ST_forall { param = received_param; return = received_return },
        ST_forall { param = expected_param; return = expected_return } ) ->
        let () =
          equal_ty_pat ~received:received_param ~expected:expected_param
        in
        equal_term ~received:received_return ~expected:expected_return
    | ( ST_lambda { param = received_param; return = received_return },
        ST_lambda { param = expected_param; return = expected_return } ) ->
        let () =
          equal_ty_pat ~received:received_param ~expected:expected_param
        in
        equal_term ~received:received_return ~expected:expected_return
    | ( ST_apply { lambda = received_lambda; arg = received_arg },
        ST_apply { lambda = expected_lambda; arg = expected_arg } ) ->
        let () =
          equal_term ~received:received_lambda ~expected:expected_lambda
        in
        equal_term ~received:received_arg ~expected:expected_arg
    | ( ST_self { self = received_self; body = received_body },
        ST_self { self = expected_self; body = expected_body } ) ->
        let () = equal_pat ~received:received_self ~expected:expected_self in
        equal_term ~received:received_body ~expected:expected_body
    | ( ST_fix { self = received_self; body = received_body },
        ST_fix { self = expected_self; body = expected_body } ) ->
        let () = equal_ty_pat ~received:received_self ~expected:expected_self in
        equal_term ~received:received_body ~expected:expected_body
    | ST_unroll { term = received }, ST_unroll { term = expected } ->
        equal_term ~received ~expected
    (* TODO: document why let here *)
    | ( ST_let
          {
            bound = received_bound;
            value = received_value;
            return = received_return;
          },
        ST_let
          {
            bound = expected_bound;
            value = expected_value;
            return = expected_return;
          } ) ->
        let () =
          equal_ty_pat ~received:received_bound ~expected:expected_bound
        in
        let () = equal_term ~received:received_value ~expected:expected_value in
        equal_term ~received:received_return ~expected:expected_return
    (* TODO: document why annot here *)
    (* TODO: should check also for annot equality? *)
    | ST_annot { term = received; annot = _ }, expected
    | received, ST_annot { term = expected; annot = _ } ->
        equal_term ~received ~expected
    | ( ( ST_free_var _ | ST_bound_var _ | ST_forall _ | ST_lambda _
        | ST_apply _ | ST_self _ | ST_fix _ | ST_unroll _ | ST_let _ ),
        ( ST_free_var _ | ST_bound_var _ | ST_forall _ | ST_lambda _
        | ST_apply _ | ST_self _ | ST_fix _ | ST_unroll _ | ST_let _ ) ) ->
        error E_type_clash

  and equal_ty_pat ~received ~expected =
    let (SP_typed { pat = received_pat; type_ = received_type }) = received in
    let (SP_typed { pat = expected_pat; type_ = expected_type }) = expected in
    let () = equal_pat ~received:received_pat ~expected:expected_pat in
    equal_term ~received:received_type ~expected:expected_type

  and equal_pat ~received ~expected =
    (* TODO: normalize pattern *)
    (* TODO: check pat? *)
    match (received, expected) with
    (* TODO: locs *)
    | SP_loc { pat = received; loc = _ }, expected
    | received, SP_loc { pat = expected; loc = _ } ->
        equal_pat ~received ~expected
    | SP_var { var = _ }, SP_var { var = _ } -> ()
    | SP_erasable { pat = received }, SP_erasable { pat = expected } ->
        equal_pat ~received ~expected
    | SP_annot { pat = received; annot = _ }, expected
    | received, SP_annot { pat = expected; annot = _ } ->
        equal_pat ~received ~expected
    | (SP_var _ | SP_erasable _), (SP_var _ | SP_erasable _) ->
        error E_pattern_clash

  let typeof_pat pat =
    let (SP_typed { pat; type_ }) = pat in
    let rec is_erasable pat =
      match pat with
      (* TODO: weird *)
      | SP_loc { pat; loc = _ } -> is_erasable pat
      | SP_var { var = _ } -> false
      | SP_erasable { pat = _ } -> true
      | SP_annot { pat; annot = _ } -> is_erasable pat
    in
    (type_, `Erasability (is_erasable pat))
end

module Assume = struct
  (* TODO: document assumption mode *)
  open Syntax
  open Ltree
  open Stree

  (* TODO: linearity on assume? *)
  module Context : sig
    type 'a context
    type 'a t = 'a context

    (* monad *)
    (* TODO: this should not be exposed *)
    val run :
      level:Level.t -> names:Level.t Name.Map.t -> (unit -> 'a context) -> 'a

    val pure : 'a -> 'a context
    val ( let* ) : 'a context -> ('a -> 'b context) -> 'b context

    (* locs *)
    val with_loc : Location.t -> (unit -> 'a context) -> 'a context

    (* vars *)
    val enter : Name.t -> (unit -> 'a context) -> 'a context
    val lookup : Name.t -> Level.t context

    (* machinery *)
    val close_term : term -> term context
  end = struct
    open Machinery
    open Error

    (* TODO: names map vs names list / stack *)
    type 'a context = level:Level.t -> names:Level.t Name.Map.t -> 'a
    type 'a t = 'a context

    let run ~level ~names f = f () ~level ~names
    let pure x ~level:_ ~names:_ = x
    let ( let* ) ctx f ~level ~names = f (ctx ~level ~names) ~level ~names

    let with_loc loc f ~level ~names =
      try f () ~level ~names
      with Error { error } ->
        let error = E_loc { error; loc } in
        raise (Error { error })

    let enter name f ~level ~names =
      let level = Level.next level in
      let names = Name.Map.add name level names in
      f () ~level ~names

    let lookup name ~level:_ ~names =
      match Name.Map.find_opt name names with
      | Some level -> level
      | None -> error (E_unknown_var { var = name })

    let close_term term ~level ~names:_ =
      close_term ~from:level ~to_:Index.zero term
  end

  open Context
  open Error

  let rec assume_term term =
    match term with
    | LT_loc { term; loc } ->
        let* term = with_loc loc @@ fun () -> assume_term term in
        pure @@ ST_loc { term; loc }
    | LT_var { var } ->
        let* level = lookup var in
        pure @@ ST_free_var { level }
    | LT_extension _ -> error E_unsupported_extensions
    | LT_forall { param; return } ->
        let* param, enter = assume_ty_pat param in
        let* return = enter @@ fun () -> assume_term return in
        pure @@ ST_forall { param; return }
    | LT_lambda { param; return } ->
        let* param, enter = assume_ty_pat param in
        let* return = enter @@ fun () -> assume_term return in
        pure @@ ST_lambda { param; return }
    | LT_apply { lambda; arg } ->
        let* lambda = assume_term lambda in
        let* arg = assume_term arg in
        pure @@ ST_apply { lambda; arg }
    | LT_self { self; body } -> assume_self ~self ~body
    | LT_fix { self; body } -> assume_fix ~self ~body
    | LT_unroll { term } ->
        let* term = assume_term term in
        pure @@ ST_unroll { term }
    | LT_let { bound; return } ->
        (* TODO: assume bind? *)
        (* TODO: use this loc *)
        let (LBind { loc = _; pat = bound; value }) = bound in
        (* TODO: should let always be typed here *)
        let* bound, enter = assume_ty_pat bound in
        let* value = assume_term value in
        let* return = enter @@ fun () -> assume_term return in
        pure @@ ST_let { bound; value; return }
    | LT_annot { term; annot } ->
        let* annot = assume_term annot in
        let* term = assume_term term in
        pure @@ ST_annot { term; annot }
    | LT_string _ -> error E_string_not_supported

  and assume_self ~self ~body =
    let* self, enter = assume_pat self in
    let* body = enter @@ fun () -> assume_term body in
    pure @@ ST_self { self; body }

  and assume_fix ~self ~body =
    let* self, enter = assume_ty_pat self in
    let* body = enter @@ fun () -> assume_term body in
    pure @@ ST_fix { self; body }

  and assume_ty_pat pat =
    let wrap ~enter ~type_ pat = pure @@ (SP_typed { pat; type_ }, enter) in
    match pat with
    | LP_loc { pat; loc } ->
        let* SP_typed { pat; type_ }, enter =
          with_loc loc @@ fun () -> assume_ty_pat pat
        in
        wrap ~enter ~type_ @@ SP_loc { pat; loc }
    | LP_var _ -> error E_missing_annotations
    | LP_unroll _ -> error E_unroll_pattern_not_supported
    | LP_erasable { pat } ->
        let* SP_typed { pat; type_ }, enter = assume_ty_pat pat in
        wrap ~enter ~type_ @@ SP_erasable { pat }
    | LP_annot { pat; annot } ->
        let* annot = assume_term annot in
        let* pat, enter = assume_pat pat in
        wrap ~enter ~type_:annot @@ SP_annot { pat; annot }

  and assume_pat pat =
    (* TODO: with should do auto close *)
    match pat with
    | LP_loc { pat; loc } ->
        with_loc loc @@ fun () ->
        let* pat, enter = assume_pat pat in
        let pat = SP_loc { pat; loc } in
        pure @@ (pat, enter)
    | LP_var { var } ->
        let enter k =
          enter var @@ fun () ->
          (* TODO: better place or name for close term*)
          let* term = k () in
          close_term term
        in
        pure @@ (SP_var { var }, enter)
    | LP_erasable { pat } ->
        let* pat, enter = assume_pat pat in
        pure @@ (SP_erasable { pat }, enter)
    | LP_unroll _ -> error E_unroll_pattern_not_supported
    | LP_annot { pat; annot } ->
        let* annot = assume_term annot in
        let* pat, enter = assume_pat pat in
        let pat = SP_annot { pat; annot } in
        pure @@ (pat, enter)
end

open Syntax
open Ltree
open Stree
open Machinery

(* TODO: this being hard coded is bad *)
let st_type = ST_free_var { level = Level.zero }

module Context : sig
  type 'a context
  type 'a t = 'a context

  (* monad *)
  val run : (unit -> 'a context) -> 'a
  val pure : 'a -> 'a context
  val ( let* ) : 'a context -> ('a -> 'b context) -> 'b context

  (* locs *)
  val with_loc : Location.t -> (unit -> 'a context) -> 'a context

  (* mode *)
  val enter_erasable_zone : (unit -> 'a context) -> 'a context

  (* vars *)
  val enter :
    Name.t -> erasable:bool -> type_:term -> (unit -> 'a context) -> 'a context

  val lookup : Name.t -> ([ `Type of term ] * Level.t) context

  (* machinery *)
  val assume_self : self:Ltree.pat -> body:Ltree.term -> term context
  val assume_fix : self:Ltree.pat -> body:Ltree.term -> term context
  val subst_term : to_:term -> term -> term context
  val open_term : term -> term context
  val close_term : term -> term context
end = struct
  open Machinery
  open Error

  type status = Var_pending | Var_used

  (* TODO: names map vs names list / stack *)
  (* TODO: vars map vs vars list / stack *)
  type 'a context =
    level:Level.t ->
    names:Level.t Name.Map.t ->
    types:term Level.Map.t ->
    grades:status Level.Map.t ->
    status Level.Map.t * 'a

  type 'a t = 'a context

  let run k =
    let level = Level.(next zero) in
    (* TODO: move this to Name? *)
    let names = Name.Map.(add (Name.make "Type") Level.zero empty) in
    let types = Level.Map.(add Level.zero st_type empty) in
    let grades = Level.Map.(add Level.zero Var_used empty) in
    let _grades, x = k () ~level ~names ~types ~grades in
    (* TODO: check grades here *)
    x

  let pure x ~level:_ ~names:_ ~types:_ ~grades = (grades, x)

  let ( let* ) ctx f ~level ~names ~types ~grades =
    let grades, x = ctx ~level ~names ~types ~grades in
    f x ~level ~names ~types ~grades

  let with_loc loc f ~level ~names ~types ~grades =
    try f () ~level ~names ~types ~grades
    with Error { error } ->
      let error = E_loc { error; loc } in
      raise (Error { error })

  let enter_erasable_zone f ~level ~names ~types ~grades =
    let _grades, x = f () ~level ~names ~types ~grades:Level.Map.empty in
    (* TODO: check grades to be empty here *)
    (grades, x)

  let enter_linear name ~type_ f ~level ~names ~types ~grades =
    let level = Level.next level in
    let names = Name.Map.add name level names in
    let types = Level.Map.add level type_ types in
    let grades = Level.Map.add level Var_pending grades in
    let grades, x = f () ~level ~names ~types ~grades in
    match Level.Map.find_opt level grades with
    | Some Var_pending -> error (E_variable_unused { var = name })
    | Some Var_used -> (Level.Map.remove level grades, x)
    | None -> error E_grades_invariant_violated

  let enter_erasable name ~type_ f ~level ~names ~types ~grades =
    let level = Level.next level in
    let names = Name.Map.add name level names in
    let types = Level.Map.add level type_ types in
    (* TODO: explain why it enters variable as used *)
    (* TODO: Var_erasable? *)
    let grades = Level.Map.add level Var_used grades in
    let grades, x = f () ~level ~names ~types ~grades in
    (Level.Map.remove level grades, x)

  let enter name ~erasable ~type_ f ~level ~names ~types ~grades =
    match erasable with
    | true -> enter_erasable name ~type_ f ~level ~names ~types ~grades
    | false -> enter_linear name ~type_ f ~level ~names ~types ~grades

  let lookup name ~level:_ ~names ~types ~grades =
    match Name.Map.find_opt name names with
    | Some level -> (
        match Level.Map.find_opt level types with
        | Some type_ -> (
            match Level.Map.find_opt level grades with
            | Some Var_pending ->
                let grades = Level.Map.add level Var_used grades in
                (grades, (`Type type_, level))
            | Some Var_used -> error (E_variable_used { var = name })
            | None ->
                (* removed by erasable zone *)
                (grades, (`Type type_, level)))
        | None -> error E_types_invariant_violated)
    | None -> error (E_unknown_var { var = name })

  let assume_self ~self ~body ~level ~names ~types:_ ~grades =
    let x =
      let open Assume in
      Context.run ~level ~names @@ fun () -> assume_self ~self ~body
    in
    (grades, x)

  let assume_fix ~self ~body ~level ~names ~types:_ ~grades =
    let x =
      let open Assume in
      Context.run ~level ~names @@ fun () -> assume_fix ~self ~body
    in
    (grades, x)

  let subst_term ~to_ term ~level:_ ~names:_ ~types:_ ~grades =
    (grades, open_term ~to_ term)

  let open_term term ~level ~names:_ ~types:_ ~grades =
    (grades, open_term ~to_:(ST_free_var { level }) term)

  let close_term term ~level ~names:_ ~types:_ ~grades =
    (grades, close_term ~from:level ~to_:Index.zero term)
end
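The `Context` module above threads a `grades` map to enforce linearity: each linearly bound variable enters as `Var_pending`, a lookup flips it to `Var_used`, and leaving the binder fails unless the variable was used exactly once. A minimal self-contained sketch of that discipline (plain `int` keys and the stdlib `Map` instead of the project's `Level.Map`; names here are illustrative):

```ocaml
(* Minimal sketch of the linear-usage discipline: a variable enters as
   Pending, a lookup marks it Used, and leaving the scope fails unless
   the variable was used exactly once. *)
module IntMap = Map.Make (Int)

type grade = Pending | Used

exception Unused of int
exception Already_used of int

(* Mark a variable as used; a second use is an error. *)
let use level grades =
  match IntMap.find_opt level grades with
  | Some Pending -> IntMap.add level Used grades
  | Some Used -> raise (Already_used level)
  | None -> grades (* erased variables carry no grade *)

(* Run [f] with [level] bound linearly, checking usage on exit. *)
let enter_linear level f grades =
  let grades = IntMap.add level Pending grades in
  let grades, x = f grades in
  match IntMap.find_opt level grades with
  | Some Used -> (IntMap.remove level grades, x)
  | _ -> raise (Unused level)

let () =
  (* using the variable exactly once succeeds *)
  let grades, v = enter_linear 0 (fun g -> (use 0 g, "ok")) IntMap.empty in
  assert (IntMap.is_empty grades && v = "ok");
  (* never using it raises Unused *)
  assert (
    try
      ignore (enter_linear 1 (fun g -> (g, ())) IntMap.empty);
      false
    with Unused 1 -> true)
```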

open Context
open Error

(* TODO: think better about enter pat *)
let rec enter_pat ~erasable pat ~type_ k =
  match pat with
  (* TODO: weird *)
  | SP_loc { pat; loc = _ } -> enter_pat pat ~erasable ~type_ k
  | SP_var { var } -> enter ~erasable var ~type_ k
  | SP_erasable { pat } -> enter_pat ~erasable:true pat ~type_ k
  | SP_annot { pat; annot = _ } -> enter_pat pat ~erasable ~type_ k

let enter_ty_pat ~erasable pat k =
  let (SP_typed { pat; type_ }) = pat in
  enter_pat pat ~erasable ~type_ k

(* TODO: this is clearly bad *)
let enter_erasable_zone_conditional ~erasable k =
  match erasable with true -> enter_erasable_zone k | false -> k ()

let rec infer_term term =
  let wrap ~type_ term = pure @@ ST_typed { term; type_ } in
  match term with
  | LT_loc { term; loc } ->
      let* (ST_typed { term; type_ }) =
        with_loc loc @@ fun () -> infer_term term
      in
      wrap ~type_ @@ ST_loc { term; loc }
  | LT_var { var } ->
      let* `Type type_, level = lookup var in
      wrap ~type_ @@ ST_free_var { level }
  | LT_extension _ -> error E_unsupported_extensions
  | LT_forall { param; return } ->
      let* param = infer_ty_pat param in
      let* return =
        enter_erasable_zone @@ fun () ->
        check_term_with_ty_pat ~erasable:true param return ~expected:st_type
      in
      wrap ~type_:st_type @@ ST_forall { param; return }
  | LT_lambda { param; return } ->
      let* param = infer_ty_pat param in
      let* (ST_typed { term = return; type_ = return_type }) =
        infer_term_with_ty_pat ~erasable:false param return
      in
      let type_ = ST_forall { param; return = return_type } in
      wrap ~type_ @@ ST_lambda { param; return }
  | LT_apply { lambda; arg } -> (
      let* (ST_typed { term = lambda; type_ = forall }) = infer_term lambda in
      (* TODO: maybe machinery to eliminate forall *)
      match expand_head_term forall with
      | ST_forall { param; return } ->
          let* arg =
            let expected, `Erasability erasable = typeof_pat param in
            enter_erasable_zone_conditional ~erasable @@ fun () ->
            check_term arg ~expected
          in
          let* type_ = subst_term ~to_:arg return in
          wrap ~type_ @@ ST_apply { lambda; arg }
      (* TODO: expand cases *)
      | _ -> error E_expected_forall)
  | LT_self { self; body } ->
      let* assumed_self = assume_self ~self ~body in
      let* self = check_pat self ~expected:assumed_self in
      let* body =
        enter_erasable_zone @@ fun () ->
        check_term_with_pat ~erasable:true self ~type_:assumed_self body
          ~expected:st_type
      in
      let self = ST_self { self; body } in
      (* this equality check is only a sanity check *)
      let () = equal_term ~received:self ~expected:assumed_self in
      wrap ~type_:st_type @@ self
  | LT_fix { self; body } ->
      let* self = infer_ty_pat self in
      let type_, `Erasability erasable = typeof_pat self in
      let* body =
        enter_erasable_zone_conditional ~erasable @@ fun () ->
        check_term_with_ty_pat ~erasable:false self body ~expected:type_
      in
      wrap ~type_ @@ ST_fix { self; body }
  | LT_unroll { term } -> (
      (* TODO: rename to fix *)
      let* (ST_typed { term; type_ = self }) = infer_term term in
      (* TODO: maybe machinery to eliminate self *)
      match expand_head_term self with
      | ST_self { self = _; body } ->
          let* type_ = subst_term ~to_:term body in
          wrap ~type_ @@ ST_unroll { term }
      (* TODO: expand cases *)
      | _ -> error E_expected_self)
  | LT_let { bound; return } ->
      (* TODO: check bind? *)
      (* TODO: use this loc *)
      let (LBind { loc = _; pat = bound; value }) = bound in
      (* TODO: remove need for typing of let *)
      let* bound = infer_ty_pat bound in
      let* value =
        let value_type, `Erasability erasable = typeof_pat bound in
        enter_erasable_zone_conditional ~erasable @@ fun () ->
        check_term value ~expected:value_type
      in
      let* (ST_typed { term = return; type_ = return_type }) =
        infer_term_with_ty_pat ~erasable:false bound return
      in
      (* TODO: could use let at type level *)
      let* type_ = subst_term ~to_:value return_type in
      wrap ~type_ @@ ST_let { bound; value; return }
  | LT_annot { term; annot } ->
      let* annot =
        enter_erasable_zone @@ fun () -> check_term annot ~expected:st_type
      in
      let* term = check_term term ~expected:annot in
      wrap ~type_:annot @@ ST_annot { term; annot }
  | LT_string _ -> error E_string_not_supported

and check_term term ~expected =
  (* TODO: check term equality for nested annot ((x : A) : B)? *)
  (* TODO: propagate *)
  match (term, expand_head_term expected) with
  | LT_loc { term; loc }, expected ->
      let* term = with_loc loc @@ fun () -> check_term term ~expected in
      pure @@ ST_loc { term; loc }
  | ( LT_lambda { param; return },
      ST_forall { param = expected_param; return = expected_return } ) ->
      let* param =
        (* TODO: use this erasable? *)
        let expected_param_type, `Erasability _erasable =
          typeof_pat expected_param
        in
        check_ty_pat param ~expected:expected_param_type
      in
      let () =
        (* TODO : loc for error message *)
        equal_ty_pat ~received:param ~expected:expected_param
      in
      let* return =
        check_term_with_ty_pat ~erasable:false param return
          ~expected:expected_return
      in
      pure @@ ST_lambda { param; return }
  | ( LT_fix { self; body },
      (ST_self { self = expected_self; body = expected_body } as expected) ) ->
      let* self = check_ty_pat self ~expected in
      let () =
        (* TODO : loc for error message *)
        let (SP_typed { pat = self; type_ = _ }) = self in
        equal_pat ~received:self ~expected:expected_self
      in
      let* body =
        check_term_with_ty_pat ~erasable:false self body ~expected:expected_body
      in
      pure @@ ST_fix { self; body }
  | term, expected ->
      let* (ST_typed { term; type_ = received }) = infer_term term in
      let () = equal_term ~received ~expected in
      pure term

and infer_ty_pat pat =
  let wrap ~type_ pat = pure @@ SP_typed { pat; type_ } in
  match pat with
  | LP_loc { pat; loc } ->
      with_loc loc @@ fun () ->
      let* (SP_typed { pat; type_ }) = infer_ty_pat pat in
      wrap ~type_ @@ SP_loc { pat; loc }
  | LP_var _ -> error E_missing_annotations
  | LP_erasable { pat } ->
      let* (SP_typed { pat; type_ }) = infer_ty_pat pat in
      wrap ~type_ @@ SP_erasable { pat }
  | LP_unroll _ -> error E_unroll_pattern_not_supported
  | LP_annot { pat; annot } ->
      let* annot =
        enter_erasable_zone @@ fun () -> check_term annot ~expected:st_type
      in
      check_ty_pat pat ~expected:annot

and check_ty_pat pat ~expected =
  let* pat = check_pat pat ~expected in
  pure @@ SP_typed { pat; type_ = expected }

and check_pat pat ~expected =
  match pat with
  | LP_loc { pat; loc } ->
      with_loc loc @@ fun () ->
      let* pat = check_pat pat ~expected in
      pure @@ SP_loc { pat; loc }
  | LP_var { var } -> pure @@ SP_var { var }
  | LP_erasable { pat } ->
      let* pat = check_pat pat ~expected in
      pure @@ SP_erasable { pat }
  | LP_unroll _ -> error E_unroll_pattern_not_supported
  | LP_annot { pat; annot } ->
      let* annot =
        enter_erasable_zone @@ fun () -> check_term annot ~expected:st_type
      in
      let* pat = check_pat pat ~expected:annot in
      let () = equal_term ~received:annot ~expected in
      pure @@ SP_annot { pat; annot }

and infer_term_with_ty_pat ~erasable pat term =
  enter_ty_pat ~erasable pat @@ fun () ->
  let* (ST_typed { term; type_ }) = infer_term term in
  let* term = close_term term in
  let* type_ = close_term type_ in
  pure @@ ST_typed { term; type_ }

and check_term_with_ty_pat ~erasable pat term ~expected =
  enter_ty_pat ~erasable pat @@ fun () ->
  (* TODO: open and close should probably not be here *)
  let* expected = open_term expected in
  let* term = check_term term ~expected in
  close_term term

and check_term_with_pat ~erasable pat ~type_ term ~expected =
  enter_pat ~erasable pat ~type_ @@ fun () ->
  let* expected = open_term expected in
  let* term = check_term term ~expected in
  close_term term
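The `open_term`/`close_term` pairs threaded through `check_term_with_ty_pat` above follow the locally nameless convention: going under a binder replaces the bound index with a fresh free level, and leaving it abstracts that level back into an index. A toy sketch of the two operations (the term type and names are illustrative, not the project's `Ttree`):

```ocaml
(* Toy locally-nameless terms: bound variables are de Bruijn indices,
   free variables are absolute levels. *)
type term =
  | Bound of int (* index, counted from the nearest binder *)
  | Free of int  (* level, globally fresh *)
  | Lam of term

(* Replace bound index [i] with free level [l]: used when going under
   a binder. The cursor [i] is bumped under each Lam. *)
let rec open_ i l t =
  match t with
  | Bound j -> if i = j then Free l else t
  | Free _ -> t
  | Lam body -> Lam (open_ (i + 1) l body)

(* Inverse: abstract free level [l] back into bound index [i]. *)
let rec close_ i l t =
  match t with
  | Bound _ -> t
  | Free m -> if l = m then Bound i else t
  | Lam body -> Lam (close_ (i + 1) l body)

let () =
  (* opening exposes a free level... *)
  let opened = open_ 0 7 (Bound 0) in
  assert (opened = Free 7);
  (* ...and closing is its inverse *)
  assert (close_ 0 7 opened = Bound 0);
  (* the index cursor shifts under a binder *)
  assert (open_ 0 7 (Lam (Bound 1)) = Lam (Free 7))
```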


================================================
FILE: smol/test.ml
================================================
open Syntax
open Smol

type test = { name : string; term : string }

let type_term name term = { name; term }

(* TODO: used variable still usable on type level *)
let id =
  type_term "id"
    {|((A : Type $ 0) => (x : A) => x
      :(A : Type $ 0) -> (x : A) -> A)|}

let id_propagate =
  type_term "id_propagate"
    {|((A $ 0) => x => x : (A : Type $ 0) -> (x : A) -> A)|}

let apply_erasable =
  type_term "apply_erasable"
    {|
      (A : Type $ 0) => ((A : Type $ 0) => (x : A) => x) A
    |}

let sequence =
  type_term "sequence"
    {|((A : Type) => (x : A) => (B : Type) => (y : B) => y
      :(A : Type) -> (x : A) -> (B : Type) -> (y : B) -> B)|}

let bool =
  type_term "bool"
    {|((A : Type) => (x : A) => (y : A) => x
               :(A : Type) -> (x : A) -> (y : A) -> A)|}

let sequence_propagate =
  type_term "sequence_propagate"
    {|(A => x => B => y => y
       :(A : Type) -> (x : A) -> (B : Type) -> (y : B) -> B)|}

let true_ =
  type_term "true"
    {|((A : Type) => (x : A) => (y : A) => x
      :(A : Type) -> (x : A) -> (y : A) -> A)|}

let true_propagate =
  type_term "true_propagate"
    {|(A => x => y => x
       :(A : Type) -> (x : A) -> (y : A) -> A)|}

let false_ =
  type_term "false"
    {|((A : Type) => (x : A) => (y : A) => y
      :(A : Type) -> (x : A) -> (y : A) -> A)|}

let ind_False =
  let b_false = {|f @-> (P : (f : @False I_False) -> Type) -> @I_False P f|} in
  let i_false_t =
    Format.sprintf
      {|I_False @-> (P : (f : @False I_False) -> Type) -> (f : %s) -> Type|}
      b_false
  in
  let code =
    Format.sprintf
      {|
        (FalseT : Type) === False @-> (I_False : %s) -> Type;
        (False : FalseT) @=> (I_False : %s) => %s
      |}
      i_false_t i_false_t b_false
  in
  type_term "ind_False" code

let ind_Unit =
  let b_Unit =
    {|
      u @-> (P : (x : @Unit I_Unit unit I_unit) -> Type) ->
        (x : @I_Unit unit I_unit P (@unit I_unit)) -> @I_Unit unit I_unit P u
    |}
  in
  let b_unit =
    {|
      (u : u @->
        (P : (x : @Unit I_Unit unit I_unit) -> Type) ->
        (a : @I_Unit unit I_unit P (@unit I_unit)) -> @I_Unit unit I_unit P u
      ) @=> (P : (x : @Unit I_Unit unit I_unit) -> Type) =>
        (b : @I_Unit unit I_unit P (@unit I_unit)) => @I_unit P b
    |}
  in
  let t_i_unit =
    Format.sprintf
      {|
        I_unit @-> (P : (c : @Unit I_Unit unit I_unit) -> Type) ->
          (d : @I_Unit unit I_unit P (@unit I_unit)) ->
          @I_Unit unit I_unit P (%s)
      |}
      b_unit
  in
  let t_unit =
    Format.sprintf {|
      unit @-> (I_unit : %s) -> %s
    |} t_i_unit b_Unit
  in
  let t_i_Unit =
    Format.sprintf
      {|
        I_Unit @-> (unit : %s) -> (I_unit : %s) ->
          (P : (e : @Unit I_Unit unit I_unit) -> Type) ->
          (f : %s) -> Type
      |}
      t_unit t_i_unit b_Unit
  in
  (*
  {|
    (UnitT : Type) ===
      Unit @-> (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;
    (Unit : UnitT) === (Unit : UnitT) @=> 
      (I_Unit : %s) => (unit : %s) => (I_unit : %s) => %s;
    (UnitR : Type) === (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;
    (UnitEq : (P : (x : UnitR) -> Type) -> (x : P @Unit) ->
      P ((I_Unit : %s) => (unit : %s) => (I_unit : %s) => %s)
    ) === (P : (x : UnitR) -> Type) => (x : P @Unit) => %%expand x;
    (I_Unit : %s) === (I_Unit : %s) @=> (unit : %s) => (I_unit : %s) =>
      (P : (f : @Unit I_Unit unit I_unit) -> Type) =>
        UnitEq ((Unit : UnitR) => (f : Unit I_Unit unit I_unit) -> Type) P;
    (unit : %s) === (unit : %s) @=> (I_unit : %s) => %s;
    (I_unitT : Type) === %s;
    (unitR : Type) === (I_unit : I_unitT) -> %s;
    (unitEq : (P : (x : unitR) -> Type) -> (x : P @unit) ->
      P ((I_unit : I_unitT) => %s)) ===
      (P : (x : unitR) -> Type) => (x : P @unit) => %%expand P;
    unitEq
  |}
  *)
  let _code =
    Format.sprintf
      {|
        (UnitT : Type) ===
          Unit @-> (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;
        (I_UnitT : (Unit : UnitT) -> Type) ===
          (Unit : UnitT) => %s;
        (unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) -> Type) ===
          (Unit : UnitT) => (I_Unit : I_UnitT Unit) => %s;
        (I_unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) ->
          (unit : unitT Unit I_Unit) -> Type) ===
          (Unit : UnitT) => (I_Unit : I_UnitT Unit) =>
          (unit : unitT Unit I_Unit) => %s;
        (Unit : UnitT) === (Unit : UnitT) @=> 
          (I_Unit : I_UnitT Unit) => (unit : unitT Unit I_Unit) =>
            (I_unit : I_unitT Unit I_Unit unit) => %s;
        (I_UnitT : Type) === I_UnitT Unit;
        (unitT : (I_Unit : I_UnitT) -> Type) === unitT Unit;
        (I_unitT : (I_Unit : I_UnitT) ->
          (unit : unitT I_Unit) -> Type) === I_unitT Unit;
        (UnitR : Type) === (I_Unit : I_UnitT) -> (unit : unitT I_Unit) ->
            (I_unit : I_unitT I_Unit unit) -> Type;
        (UnitEq : (P : (x : UnitR) -> Type) -> (x : P @Unit) ->
          P (
          (I_Unit : I_UnitT) => (unit : unitT I_Unit) =>
            (I_unit : I_unitT I_Unit unit) => %s)
        ) === (P : (x : UnitR) -> Type) => (x : P @Unit) => %%expand x;
        (I_Unit : I_UnitT) === (I_Unit : I_UnitT) @=>
          (unit : unitT I_Unit) => (I_unit : I_unitT I_Unit unit) =>
          (P : (f : @Unit I_Unit unit I_unit) -> Type) =>
            UnitEq ((Unit : UnitR) => (f : Unit I_Unit unit I_unit) -> Type) P;
        (unitT : Type) === unitT I_Unit;
        (I_unitT : (unit : unitT) -> Type) === I_unitT I_Unit;
        (unit : unitT) === (unit : unitT) @=>
          (I_unit : I_unitT unit) => %s;
        (I_unitT : Type) === I_unitT unit;
        (unitR : Type) === (I_unit : I_unitT) -> %s;
        (unitEq : (P : (x : unitR) -> Type) -> (x : P @unit) ->
          P ((I_unit : I_unitT) => %s)) ===
            (P : (x : unitR) -> Type) => (x : P @unit) => %%expand x;
        (I_unit : I_unitT) === (I_unit : I_unitT) @=>
          (P : (c : @Unit I_Unit unit I_unit) -> Type) =>
          (d : @I_Unit unit I_unit P (@unit I_unit)) =>
          unitEq ((at_unit : unitR) => @I_Unit unit I_unit P (at_unit I_unit))
            d;
        (UnitEq : (P : (_ : Type) -> Type) ->
          (x : P (@Unit I_Unit unit I_unit)) -> P (%s)) ===
          (P : (_ : Type) -> Type) =>
          (x : P (@Unit I_Unit unit I_unit)) => %%expand x;
        (RevUnitEq : (P : (_ : Type) -> Type) ->
          (x : P (%s)) -> P (@Unit I_Unit unit I_unit)) ===
          (P : (_ : Type) -> Type) =>
          UnitEq ((T : Type) => (x : P T) -> P (@Unit I_Unit unit I_unit))
            ((x : P (@Unit I_Unit unit I_unit)) => x);
        
        (unitK : @Unit I_Unit unit I_unit) === RevUnitEq ((X : Type) => X) (@unit I_unit);
        @(%%expand unitK)
      |}
      t_i_Unit t_unit t_i_unit t_i_Unit t_unit t_i_unit b_Unit b_Unit b_unit
      b_Unit b_unit b_Unit b_Unit
  in
  let code =
    Format.sprintf
      {|
        (UnitT : Type) ===
          Unit @-> (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;
        (I_UnitT : (Unit : UnitT) -> Type) ===
          (Unit : UnitT) => %s;
        (unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) -> Type) ===
          (Unit : UnitT) => (I_Unit : I_UnitT Unit) => %s;
        (I_unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) ->
          (unit : unitT Unit I_Unit) -> Type) ===
          (Unit : UnitT) => (I_Unit : I_UnitT Unit) =>
          (unit : unitT Unit I_Unit) => %s;
        (Unit : UnitT) === (Unit : UnitT) @=> 
          (I_Unit : I_UnitT Unit) => (unit : unitT Unit I_Unit) =>
            (I_unit : I_unitT Unit I_Unit unit) => %s;
        (I_UnitT : Type) === I_UnitT Unit;
        (unitT : (I_Unit : I_UnitT) -> Type) === unitT Unit;
        (I_unitT : (I_Unit : I_UnitT) ->
          (unit : unitT I_Unit) -> Type) === I_unitT Unit;
        (UnitR : Type) === (I_Unit : I_UnitT) -> (unit : unitT I_Unit) ->
            (I_unit : I_unitT I_Unit unit) -> Type;
        (UnitEq : (P : (x : UnitR) -> Type) -> (x : P @Unit) ->
          P (
          (I_Unit : I_UnitT) => (unit : unitT I_Unit) =>
            (I_unit : I_unitT I_Unit unit) => %s)
        ) === (P : (x : UnitR) -> Type) => (x : P @Unit) => %%expand x;
        (I_Unit : I_UnitT) === (I_Unit : I_UnitT) @=>
          (unit : unitT I_Unit) => (I_unit : I_unitT I_Unit unit) =>
          (P : (f : @Unit I_Unit unit I_unit) -> Type) =>
            UnitEq ((Unit : UnitR) => (f : Unit I_Unit unit I_unit) -> Type) P;
        I_Unit
      |}
      t_i_Unit t_unit t_i_unit t_i_Unit t_unit t_i_unit b_Unit b_Unit
  in
  type_term "ind_Unit" code

let _tests =
  [
    id;
    id_propagate;
    sequence;
    sequence_propagate;
    bool;
    true_;
    true_propagate;
    false_;
    ind_False;
    ind_Unit;
  ]

let tests = [ id; id_propagate; apply_erasable ]

let type_term term =
  let term = Clexer.term_opt_from_string term in
  let term = Option.get term in
  let term =
    let loc = Location.none in
    Lparser.parse_term ~loc term
  in
  Styper.(Context.run @@ fun () -> infer_term term)

let test { name; term } =
  let check () =
    let _type_ = type_term term in
    (* Format.eprintf "type_ : %a\n%!" Tprinter.pp_term type_; *)
    ()
  in
  Alcotest.test_case name `Quick check

let tests = ("tests", List.map test tests)
let () = Alcotest.run "Typer" [ tests ]


================================================
FILE: smol/test.mli
================================================


================================================
FILE: syntax/clexer.ml
================================================
open Cparser
open Sedlexing.Utf8

exception Lexer_error of { loc : Location.t }
exception Parser_error of { loc : Location.t }

let () =
  Printexc.register_printer @@ function
  | Lexer_error { loc = _ } -> Some "lexer: syntax error"
  | Parser_error { loc = _ } -> Some "parser: syntax error"
  | _ -> None

let loc buf =
  let loc_start, loc_end = Sedlexing.lexing_positions buf in
  Location.{ loc_start; loc_end; loc_ghost = false }

let whitespace = [%sedlex.regexp? Plus (' ' | '\t' | '\n')]
let alphabet = [%sedlex.regexp? 'a' .. 'z' | 'A' .. 'Z']
let digit = [%sedlex.regexp? '0' .. '9']
let variable = [%sedlex.regexp? (alphabet | '_'), Star (alphabet | digit | '_')]
let extension = [%sedlex.regexp? '%', variable]
let string = [%sedlex.regexp? '"', Star (Sub (any, '"')), '"']
let number = [%sedlex.regexp? Plus '0' .. '9']

let rec tokenizer buf =
  match%sedlex buf with
  | whitespace -> tokenizer buf
  | variable -> VAR (lexeme buf)
  | extension -> EXTENSION (lexeme buf)
  | ":" -> COLON
  | "->" -> ARROW
  | "=>" -> FAT_ARROW
  | "=" -> EQUAL
  | "," -> COMMA
  | "&" -> AMPERSAND
  | ";" -> SEMICOLON
  | string ->
      (* TODO: this should probably be somewhere else *)
      let literal = lexeme buf in
      (* TODO: this is dangerous *)
      let literal = String.sub literal 1 (String.length literal - 2) in
      STRING literal
  | number ->
      (* TODO: this should probably be somewhere else *)
      let literal = lexeme buf in
      (* TODO: this is dangerous *)
      let literal = Z.of_string literal in
      NUMBER literal
  | "(" -> LEFT_PARENS
  | ")" -> RIGHT_PARENS
  | "{" -> LEFT_BRACE
  | "}" -> RIGHT_BRACE
  | eof -> EOF
  | _ ->
      let loc = loc buf in
      raise @@ Lexer_error { loc }

let next buf =
  let token = tokenizer buf in
  let start, end_ = Sedlexing.lexing_positions buf in
  (token, start, end_)

open Cparser.MenhirInterpreter

let rec loop buf state =
  match state with
  | InputNeeded _env ->
      (* The parser needs a token. Request one from the lexer,
         and offer it to the parser, which will produce a new
         checkpoint. Then, repeat. *)
      let token, start, end_ = next buf in
      let state = offer state (token, start, end_) in
      loop buf state
  | Shifting _ | AboutToReduce _ ->
      let state = resume state in
      loop buf state
  | HandlingError _env ->
      let loc = loc buf in
      raise (Parser_error { loc })
  | Accepted value -> value
  | Rejected -> failwith "clexer.loop: rejected reached"

let buf_from_string string =
  (* TODO: from_string seems not to track newlines, likely a bug in sedlex *)
  let index = ref 0 in
  let length = String.length string in
  from_gen (fun () ->
      match !index < length with
      | true ->
          let char = String.get string !index in
          incr index;
          Some char
      | false -> None)

let term_opt_from_string string =
  let buf = buf_from_string string in
  (* TODO: allow to change this *)
  let start, _end = Sedlexing.lexing_positions buf in
  loop buf @@ Cparser.Incremental.term_opt start
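`buf_from_string` above works around the sedlex issue by feeding the input one character at a time through `from_gen`. The generator protocol it relies on is just a thunk returning `Some c` until the input is exhausted, then `None` forever. A self-contained sketch of that protocol (no sedlex dependency; `drain` is an illustrative helper, not part of the project):

```ocaml
(* A generator over a string: each call yields the next character, then
   None once exhausted -- the same shape Sedlexing.Utf8.from_gen consumes. *)
let gen_of_string s =
  let index = ref 0 in
  fun () ->
    if !index < String.length s then begin
      let c = s.[!index] in
      incr index;
      Some c
    end
    else None

(* Drain a generator back into a string, to show the round trip. *)
let drain gen =
  let buf = Buffer.create 16 in
  let rec go () =
    match gen () with
    | Some c ->
        Buffer.add_char buf c;
        go ()
    | None -> Buffer.contents buf
  in
  go ()

let () =
  assert (drain (gen_of_string "x => x") = "x => x");
  (* a drained generator stays exhausted *)
  let g = gen_of_string "a" in
  ignore (drain g);
  assert (g () = None)
```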


================================================
FILE: syntax/clexer.mli
================================================
exception Lexer_error of { loc : Location.t }
exception Parser_error of { loc : Location.t }

val loc : Sedlexing.lexbuf -> Location.t
val next : Sedlexing.lexbuf -> Cparser.token * Lexing.position * Lexing.position
val term_opt_from_string : string -> Ctree.term option


================================================
FILE: syntax/cparser.mly
================================================
%{
open Utils
open Ctree

let mk (loc_start, loc_end) =
  Location.{ loc_start; loc_end; loc_ghost = false }

%}
%token <string> VAR (* x *)
%token COLON (* : *)
%token ARROW (* -> *)
%token FAT_ARROW (* => *)
%token EQUAL (* = *)
%token COMMA (* , *)
%token AMPERSAND (* & *)
%token SEMICOLON (* ; *)
%token <string> STRING (* "abc" *)
%token <Z.t> NUMBER (* 123 *)
%token LEFT_PARENS (* ( *)
%token RIGHT_PARENS (* ) *)
%token LEFT_BRACE (* { *)
%token RIGHT_BRACE (* } *)
%token <string> EXTENSION (* %x *)

%token EOF

%start <Ctree.term option> term_opt
%%

let term_opt :=
  | EOF;
    { None }
  | term = term; EOF;
    { Some term }

let term := term_rec_pair

let term_rec_pair :=
  | term_semi_or_annot
  | term_pair(term_rec_pair, term_semi_or_annot)

let term_semi_or_annot :=
  | term_rec_annot
  | term_semi(term_rec_semi, term_rec_semi_bind)

let term_rec_semi :=
  | term_rec_funct
  | term_semi(term_rec_semi, term_rec_semi_bind)

let term_rec_semi_bind :=
  | term_rec_semi_annot
  | term_bind(term_rec_semi, term_rec_semi_annot)

let term_rec_semi_annot :=
  | term_rec_funct
  | term_annot(term_rec_semi_annot, term_rec_funct)

let term_rec_annot :=
  | term_rec_funct
  | term_annot(term_rec_annot, term_rec_funct)

let term_rec_funct :=
  | term_rec_apply
  | term_forall(term_rec_funct, term_rec_apply)
  | term_lambda(term_rec_funct, term_rec_apply)
  | term_both(term_rec_funct, term_rec_apply)

let term_rec_apply :=
  | term_atom
  | term_apply(term_rec_apply, term_atom)

let term_atom :=
  | term_var
  | term_extension
  | term_string
  | term_number
  | term_parens(term)
  | term_braces(term)

let term_forall(self, lower) ==
  | param = lower; ARROW; body = self;
    { ct_forall (mk $loc) ~param ~body }
let term_lambda(self, lower) ==
  | param = lower; FAT_ARROW; body = self;
    { ct_lambda (mk $loc) ~param ~body }
let term_apply(self, lower) ==
  | funct = self; arg = lower;
    { ct_apply (mk $loc) ~funct ~arg }
let term_pair(self, lower) ==
  | left = lower; COMMA; right = self;
    { ct_pair (mk $loc) ~left ~right }
let term_both(self, lower) ==
  | left = lower; AMPERSAND; right = self;
    { ct_both (mk $loc) ~left ~right }
let term_bind(self, lower) ==
  | bound = lower; EQUAL; value = lower;
    { ct_bind (mk $loc) ~bound ~value }
let term_semi(self, lower) ==
  | left = lower; SEMICOLON; right = self;
    { ct_semi (mk $loc) ~left ~right }
let term_annot(self, lower) ==
  | value = lower; COLON; annot = self;
    { ct_annot (mk $loc) ~value ~annot }
let term_var ==
  | var = VAR;
    { ct_var (mk $loc) ~var:(Name.make var) }
let term_extension ==
  | extension = EXTENSION;
    { ct_extension (mk $loc) ~extension:(Name.make extension) }
let term_string ==
  | literal = STRING;
    { ct_string (mk $loc) ~literal }
let term_number ==
  | literal = NUMBER;
    { ct_number (mk $loc) ~literal }
let term_parens(content) ==
  | LEFT_PARENS; content = content; RIGHT_PARENS;
    { ct_parens (mk $loc) ~content }
let term_braces(content) ==
  | LEFT_BRACE; content = content; RIGHT_BRACE;
    { ct_braces (mk $loc) ~content }


================================================
FILE: syntax/ctree.ml
================================================
open Utils

type term =
  (* TODO: printer location *)
  | CTerm of { term : term_syntax; loc : Location.t [@opaque] }

and term_syntax =
  | CT_var of { var : Name.t }
  | CT_extension of { extension : Name.t }
  | CT_forall of { param : term; body : term }
  | CT_lambda of { param : term; body : term }
  | CT_apply of { funct : term; arg : term }
  | CT_pair of { left : term; right : term }
  | CT_both of { left : term; right : term }
  | CT_bind of { bound : term; value : term }
  | CT_semi of { left : term; right : term }
  | CT_annot of { value : term; annot : term }
  | CT_string of { literal : string }
  | CT_number of { literal : Z.t [@printer Z.pp_print] }
  | CT_parens of { content : term }
  | CT_braces of { content : term }
[@@deriving show { with_path = false }]

let cterm loc term = CTerm { loc; term }
let ct_var loc ~var = cterm loc (CT_var { var })
let ct_extension loc ~extension = cterm loc (CT_extension { extension })
let ct_forall loc ~param ~body = cterm loc (CT_forall { param; body })
let ct_lambda loc ~param ~body = cterm loc (CT_lambda { param; body })
let ct_apply loc ~funct ~arg = cterm loc (CT_apply { funct; arg })
let ct_pair loc ~left ~right = cterm loc (CT_pair { left; right })
let ct_both loc ~left ~right = cterm loc (CT_both { left; right })
let ct_bind loc ~bound ~value = cterm loc (CT_bind { bound; value })
let ct_semi loc ~left ~right = cterm loc (CT_semi { left; right })
let ct_annot loc ~value ~annot = cterm loc (CT_annot { value; annot })
let ct_string loc ~literal = cterm loc (CT_string { literal })
let ct_number loc ~literal = cterm loc (CT_number { literal })
let ct_parens loc ~content = cterm loc (CT_parens { content })
let ct_braces loc ~content = cterm loc (CT_braces { content })
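The `ct_*` helpers above are smart constructors: every syntax node is built through one function that pairs it with its location, so no construction site can forget the `loc`. The pattern in isolation, with toy types (all names here are illustrative, not the project's `Ctree`):

```ocaml
(* Smart constructors: wrap each syntax node with its location at the
   single place nodes are built. *)
type loc = { line : int; col : int }

type term = Term of { syntax : syntax; loc : loc }
and syntax = Var of string | Apply of { funct : term; arg : term }

let term loc syntax = Term { syntax; loc }
let var loc ~name = term loc (Var name)
let apply loc ~funct ~arg = term loc (Apply { funct; arg })

let () =
  let l = { line = 1; col = 0 } in
  (* build [f x]; both nodes carry a location by construction *)
  let t = apply l ~funct:(var l ~name:"f") ~arg:(var l ~name:"x") in
  match t with
  | Term { syntax = Apply { funct = Term { syntax = Var "f"; _ }; _ }; loc } ->
      assert (loc.line = 1)
  | _ -> assert false
```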


================================================
FILE: syntax/ctree.mli
================================================
open Utils

type term = CTerm of { term : term_syntax; loc : Location.t }

and term_syntax =
  | CT_var of { var : Name.t }
  | CT_extension of { extension : Name.t }
  | CT_forall of { param : term; body : term }
  | CT_lambda of { param : term; body : term }
  | CT_apply of { funct : term; arg : term }
  | CT_pair of { left : term; right : term }
  | CT_both of { left : term; right : term }
  | CT_bind of { bound : term; value : term }
  | CT_semi of { left : term; right : term }
  | CT_annot of { value : term; annot : term }
  | CT_string of { literal : string }
  | CT_number of { literal : Z.t }
  | CT_parens of { content : term }
  | CT_braces of { content : term }
[@@deriving show]

val ct_var : Location.t -> var:Name.t -> term
val ct_extension : Location.t -> extension:Name.t -> term
val ct_forall : Location.t -> param:term -> body:term -> term
val ct_lambda : Location.t -> param:term -> body:term -> term
val ct_apply : Location.t -> funct:term -> arg:term -> term
val ct_pair : Location.t -> left:term -> right:term -> term
val ct_both : Location.t -> left:term -> right:term -> term
val ct_bind : Location.t -> bound:term -> value:term -> term
val ct_semi : Location.t -> left:term -> right:term -> term
val ct_annot : Location.t -> value:term -> annot:term -> term
val ct_string : Location.t -> literal:string -> term
val ct_number : Location.t -> literal:Z.t -> term
val ct_parens : Location.t -> content:term -> term
val ct_braces : Location.t -> content:term -> term


================================================
FILE: syntax/dune
================================================
(library
 (name syntax)
 (libraries menhirLib compiler-libs.common utils zarith)
 (modules
  (:standard \ Test))
 (preprocess
  (pps ppx_deriving.eq ppx_deriving.ord ppx_deriving.show sedlex.ppx)))

(menhir
 (modules cparser)
 (flags --dump --explain --table))

(executable
 (name test)
 (modules Test)
 (libraries alcotest syntax)
 (preprocess
  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))

(rule
 (alias runtest)
 (deps
  (:exe ./test.exe))
 (action
  (run %{exe})))


================================================
FILE: syntax/test.ml
================================================


================================================
FILE: teika/dune
================================================
(library
 (name teika)
 (libraries syntax compiler-libs.common)
 (modules
  (:standard \ Test))
 (preprocess
  (pps
   ppx_deriving.show
   ppx_sexp_conv
   ppx_deriving.eq
   ppx_deriving.ord
   sedlex.ppx)))

(executable
 (name test)
 (modules Test)
 (libraries alcotest teika)
 (preprocess
  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))

(rule
 (alias runtest)
 (deps
  (:exe ./test.exe))
 (action
  (run %{exe})))


================================================
FILE: teika/solve.ml
================================================
open Utils
open Syntax
open Ctree
open Ttree
open Terror

exception Solve_error of { loc : Location.t; exn : exn }

(* TODO: context vs env *)
type context =
  | Context of { names : (bool * Level.t) Name.Map.t; next : Level.t }

let with_loc ~loc f =
  try f () with
  | Solve_error { loc; exn } ->
      (* TODO: reraise *)
      raise @@ Solve_error { loc; exn }
  | exn -> raise @@ Solve_error { loc; exn }

let () =
  Printexc.register_printer @@ function
  | Solve_error { loc = _; exn } -> Some (Printexc.to_string exn)
  | _ -> None

let split_pat_annot pat =
  let (Pat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | P_annot { pat; annot } -> (pat, annot)
  (* TODO: support tuple here *)
  | P_var _ | P_tuple _ -> error_missing_annotation ()

let rec enter ctx pat =
  let (Pat { struct_ = pat; loc = _ }) = pat in
  (* TODO: use this location? *)
  match pat with
  | P_annot { pat; annot = _ } -> enter ctx pat
  | P_var { var = name } ->
      let (Context { names; next }) = ctx in
      let names = Name.Map.add name (false, next) names in
      let next = Level.next next in
      Context { names; next }
  | P_tuple { elements } ->
      List.fold_left (fun ctx el -> enter ctx el) ctx elements

let rec name_of_var_pat pat =
  let (VPat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | VP_annot { pat; annot = _ } -> name_of_var_pat pat
  | VP_var { var } -> var

let open_hoist ctx pat =
  (* TODO: ensure that somehow all the hoists are closed *)
  let name = name_of_var_pat pat in
  let (Context { names; next }) = ctx in
  let names = Name.Map.add name (true, next) names in
  let next = Level.next next in
  Context { names; next }

let close_hoist ctx pat =
  (* TODO: this is a bad API *)
  let name = name_of_var_pat pat in
  let (Context { names; next }) = ctx in
  let names =
    match Name.Map.find_opt name names with
    | Some (true, from) -> Name.Map.add name (false, from) names
    | Some (false, _from) ->
        failwith "close_hoist: compiler bug, name is not an open hoist"
    | None -> failwith "close_hoist: compiler bug, unknown name"
  in
  Context { names; next }

let lookup ctx name =
  let (Context { names; next }) = ctx in
  match Name.Map.find_opt name names with
  | Some (is_open_hoist, from) -> (
      match Level.offset ~from ~to_:next with
      | Some var -> (`hoist is_open_hoist, var)
      | None -> failwith "compiler bug invalid var")
  | None -> error_unknown_var ~name

let is_hoist ctx name =
  let (Context { names; next }) = ctx in
  match Name.Map.find_opt name names with
  | Some (is_open_hoist, from) -> (
      match Level.offset ~from ~to_:next with
      | Some var -> (
          match is_open_hoist with true -> Some var | false -> None)
      | None -> None)
  | None -> None

type meta_pat =
  | MP_simple of var_pat
  | MP_fancy of pat
  | MP_fix of Index.t * var_pat
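
(* Example classification (informal sketch, following solve_pat_simple below):
   - [x] or [(x : A)] solves to MP_simple, a plain variable pattern;
   - [(x, y)] solves to MP_fancy, a tuple pattern;
   - [x] after an earlier hoist [x : A;] solves to MP_fix, carrying the
     index of the hoisted variable so the binding can close the hoist. *)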

let rec pat_of_var_pat var_pat =
  let (VPat { struct_ = var_pat; loc }) = var_pat in
  match var_pat with
  | VP_annot { pat; annot } ->
      let pat = pat_of_var_pat pat in
      p_wrap ~loc @@ P_annot { pat; annot }
  | VP_var { var } -> p_wrap ~loc @@ P_var { var }

let pat_not_fix meta_pat =
  match meta_pat with
  | MP_simple pat -> pat_of_var_pat pat
  | MP_fancy pat -> pat
  | MP_fix (_var, _pat) ->
      (* TODO: proper error here *)
      failwith "a variable with the same name is open on a hoist"

let self_pat_simple meta_pat =
  match meta_pat with
  | MP_simple pat -> pat
  | MP_fancy _pat -> failwith "fancy patterns are not supported on self"
  | MP_fix (_var, _pat) ->
      failwith "a variable with the same name is open on a hoist"

let rec solve_term ctx term =
  let (CTerm { term; loc }) = term in
  with_loc ~loc @@ fun () ->
  match term with
  | CT_parens { content = term } -> solve_term ctx term
  | CT_annot { value = term; annot } ->
      let annot = solve_term ctx annot in
      let term = solve_term ctx term in
      t_wrap ~loc @@ T_annot { term; annot }
  | CT_var { var = name } ->
      (* TODO: this could be treated as forward *)
      let `hoist _, var = lookup ctx name in
      t_wrap ~loc @@ T_var { var }
  | CT_semi { left; right } -> solve_semi ctx ~loc ~left ~right
  | CT_extension _ -> error_extensions_not_implemented ()
  | CT_apply { funct; arg } ->
      let funct = solve_term ctx funct in
      let arg = solve_term ctx arg in
      t_wrap ~loc @@ T_apply { funct; arg }
  | CT_lambda { param; body } ->
      let bound = solve_pat ctx param in
      let bound = pat_not_fix bound in
      let body =
        let ctx = enter ctx bound in
        solve_term ctx body
      in
      t_wrap ~loc @@ T_lambda { bound; body }
  | CT_forall { param; body } ->
      let bound = solve_pat ctx param in
      let bound = pat_not_fix bound in
      let bound, param = split_pat_annot bound in
      let body =
        let ctx = enter ctx bound in
        solve_term ctx body
      in
      t_wrap ~loc @@ T_forall { bound; param; body }
  | CT_pair { left; right } ->
      let left = solve_term ctx left in
      let acc = [ left ] in
      let elements = solve_term_tuple ctx ~acc ~right in
      t_wrap ~loc @@ T_tuple { elements }
  | CT_both { left; right } ->
      let bound = solve_pat ctx left in
      let bound = self_pat_simple bound in
      let body =
        (* TODO: this is hackish *)
        let bound = pat_of_var_pat bound in
        let ctx = enter ctx bound in
        solve_term ctx right
      in
      t_wrap ~loc @@ T_self { bound; body }
  | CT_bind _ | CT_number _ | CT_braces _ | CT_string _ ->
      error_invalid_notation ()

and solve_term_tuple ctx ~acc ~right =
  match
    let (CTerm { term = right; loc = _ }) = right in
    right
  with
  | CT_pair { left; right } ->
      let left = solve_term ctx left in
      let acc = left :: acc in
      solve_term_tuple ctx ~acc ~right
  | CT_var _ | CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _
  | CT_both _ | CT_bind _ | CT_semi _ | CT_annot _ | CT_string _ | CT_number _
  | CT_parens _ | CT_braces _ ->
      let right = solve_term ctx right in
      List.rev (right :: acc)

and solve_semi ctx ~loc ~left ~right =
  let (CTerm { term = left_desc; loc = _ }) = left in
  match left_desc with
  | CT_bind { bound; value } -> (
      let bound = solve_pat ctx bound in
      (* TODO: just clean this *)
      match bound with
      | MP_simple bound ->
          let bound = pat_of_var_pat bound in
          let arg = solve_term ctx value in
          let body =
            let ctx = enter ctx bound in
            solve_term ctx right
          in
          t_wrap ~loc @@ T_let { bound; arg; body }
      | MP_fancy bound ->
          let arg = solve_term ctx value in
          let body =
            let ctx = enter ctx bound in
            solve_term ctx right
          in
          t_wrap ~loc @@ T_let { bound; arg; body }
      | MP_fix (var, bound) ->
          let arg = solve_term ctx value in
          let body =
            let ctx = close_hoist ctx bound in
            solve_term ctx right
          in
          t_wrap ~loc @@ T_fix { bound; var; arg; body })
  | CT_annot { value = _; annot = _ } ->
      let bound =
        match solve_pat ctx left with
        | MP_simple pat -> pat
        | MP_fancy _pat -> failwith "fancy patterns are not supported on hoist"
        | MP_fix (_var, _pat) ->
            failwith "a variable with the same name is already open"
      in
      let body =
        let ctx = open_hoist ctx bound in
        solve_term ctx right
      in
      t_wrap ~loc @@ T_hoist { bound; body }
  | CT_parens _ | CT_var _ | CT_extension _ | CT_forall _ | CT_lambda _
  | CT_apply _ | CT_pair _ | CT_both _ | CT_semi _ | CT_string _ | CT_number _
  | CT_braces _ ->
      error_invalid_notation ()
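
(* Example (sketch of the elaboration above):
     x : A;
     x = v;
     body
   The annotation [x : A;] is solved to T_hoist, opening a hoist for [x];
   the later binding [x = v;] of the same name is classified as MP_fix and
   solved to T_fix, closing the hoist for the rest of [body]. *)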

(* TODO: this code is kind of ugly *)
and solve_pat ctx pat = solve_pat_simple ctx pat

and solve_pat_simple ctx pat =
  let (CTerm { term = pat_desc; loc }) = pat in
  (* TODO: a bit duplicated *)
  match pat_desc with
  | CT_parens { content = pat } -> solve_pat_simple ctx pat
  | CT_var { var = name } -> (
      match is_hoist ctx name with
      | Some var -> MP_fix (var, vp_wrap ~loc @@ VP_var { var = name })
      | None -> MP_simple (vp_wrap ~loc @@ VP_var { var = name }))
  | CT_annot { value = pat; annot } -> (
      let annot = solve_term ctx annot in
      match solve_pat ctx pat with
      | MP_simple pat -> MP_simple (vp_wrap ~loc @@ VP_annot { pat; annot })
      | MP_fancy pat -> MP_fancy (p_wrap ~loc @@ P_annot { pat; annot })
      | MP_fix (var, pat) ->
          MP_fix (var, vp_wrap ~loc @@ VP_annot { pat; annot }))
  | CT_pair { left = _; right = _ } -> MP_fancy (solve_pat_fancy ctx pat)
  | CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _ | CT_both _
  | CT_bind _ | CT_semi _ | CT_string _ | CT_number _ | CT_braces _ ->
      error_invalid_notation ()

and solve_pat_fancy ctx pat =
  (* TODO: no duplicated name on pattern *)
  (* TODO: to_ here *)
  let (CTerm { term = pat; loc }) = pat in
  match pat with
  | CT_parens { content = pat } -> solve_pat_fancy ctx pat
  | CT_var { var = name } -> (
      match is_hoist ctx name with
      | Some _var -> failwith "hoist is not supported on fancy patterns"
      | None -> p_wrap ~loc @@ P_var { var = name })
  | CT_annot { value = pat; annot } ->
      let annot = solve_term ctx annot in
      let pat = solve_pat_fancy ctx pat in
      p_wrap ~loc @@ P_annot { pat; annot }
  | CT_pair { left; right } ->
      let left = solve_pat_fancy ctx left in
      let acc = [ left ] in
      let elements = solve_pat_tuple ctx ~acc ~right in
      p_wrap ~loc @@ P_tuple { elements }
  | CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _ | CT_both _
  | CT_bind _ | CT_semi _ | CT_string _ | CT_number _ | CT_braces _ ->
      error_invalid_notation ()

and solve_pat_tuple ctx ~acc ~right =
  match
    let (CTerm { term = right; loc = _ }) = right in
    right
  with
  | CT_pair { left; right } ->
      let left = solve_pat_fancy ctx left in
      let acc = left :: acc in
      solve_pat_tuple ctx ~acc ~right
  | CT_var _ | CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _
  | CT_both _ | CT_bind _ | CT_semi _ | CT_annot _ | CT_string _ | CT_number _
  | CT_parens _ | CT_braces _ ->
      let right = solve_pat_fancy ctx right in
      List.rev (right :: acc)

(* external *)
let solve_term ctx term = try Ok (solve_term ctx term) with exn -> Error exn

let initial =
  (* TODO: duplicated from Typer *)
  let next = Level.(next zero) in
  (* TODO: predef somewhere *)
  (* TODO: rename Type to data *)
  let type_ = Name.make "Type" in
  let names = Name.Map.(empty |> add type_ (false, Level.zero)) in
  Context { names; next }


================================================
FILE: teika/solve.mli
================================================
open Syntax

exception Solve_error of { loc : Location.t; exn : exn }

type context

(* TODO: couple all the initial contexts *)
val initial : context
val solve_term : context -> Ctree.term -> (Ttree.term, exn) result
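
(* Example usage (sketch, mirroring teika/test.ml):
   match Clexer.from_string Cparser.term_opt source with
   | Some ctree -> Solve.solve_term Solve.initial ctree
   | None -> failwith "failed to parse" *)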


================================================
FILE: teika/terror.ml
================================================
open Utils
open Ttree

(* TODO: too much work to add errors;
   adding them both here and in the context is bad *)
type error =
  (* TODO: why track nested locations?
         Probably because things like macros exists *)
  | TError_loc of { error : error; loc : Location.t [@opaque] }
  (* equal *)
  | TError_type_clash
  (* typer *)
  | TError_unknown_var of { name : Name.t }
  | TError_not_a_forall of { type_ : term }
  | TError_hoist_not_implemented
  | TError_extensions_not_implemented
  | TError_pairs_not_implemented
  | TError_unknown_native of { native : string }
  | TError_missing_annotation
  (* elaborate *)
  | TError_invalid_notation

and t = error [@@deriving show { with_path = false }]

exception TError of { error : error }

let () =
  Printexc.register_printer @@ function
  | TError { error } -> Some (show_error error)
  | _ -> None

let terror error = raise (TError { error })
let error_type_clash () = terror @@ TError_type_clash
let error_unknown_var ~name = terror @@ TError_unknown_var { name }
let error_not_a_forall ~type_ = terror @@ TError_not_a_forall { type_ }
let error_hoist_not_implemented () = terror @@ TError_hoist_not_implemented

let error_extensions_not_implemented () =
  terror @@ TError_extensions_not_implemented

let error_pairs_not_implemented () = terror @@ TError_pairs_not_implemented
let error_unknown_native ~native = terror @@ TError_unknown_native { native }
let error_missing_annotation () = terror @@ TError_missing_annotation
let error_invalid_notation () = terror @@ TError_invalid_notation
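
(* Example (sketch): each helper raises immediately and so has return type
   ['a], letting call sites use them in any position, e.g. in Solve.lookup:
     | None -> error_unknown_var ~name *)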


================================================
FILE: teika/terror.mli
================================================
open Utils
open Ttree

type error =
  (* metadata *)
  | TError_loc of { error : error; loc : Location.t [@opaque] }
  (* equal *)
  | TError_type_clash
  (* TODO: infer *)
  (* typer *)
  | TError_unknown_var of { name : Name.t }
  | TError_not_a_forall of { type_ : term }
  | TError_hoist_not_implemented
  | TError_extensions_not_implemented
  | TError_pairs_not_implemented
  (* TODO: native should not be a string *)
  | TError_unknown_native of { native : string }
  | TError_missing_annotation
  (* elaborate *)
  | TError_invalid_notation

type t = error [@@deriving show]

exception TError of { error : error }

(* TODO: error_loc *)
val error_type_clash : unit -> 'a
val error_unknown_var : name:Name.t -> 'a
val error_not_a_forall : type_:term -> 'a
val error_hoist_not_implemented : unit -> 'a
val error_extensions_not_implemented : unit -> 'a
val error_pairs_not_implemented : unit -> 'a
val error_unknown_native : native:string -> 'a
val error_missing_annotation : unit -> 'a
val error_invalid_notation : unit -> 'a


================================================
FILE: teika/test.ml
================================================
open Syntax

module Typer = struct
  open Teika

  type test =
    | Check of { name : string; annotated_term : string }
    | Fail of { name : string; annotated_term : string }

  let check name annotated_term = Check { name; annotated_term }
  let fail name annotated_term = Fail { name; annotated_term }

  (* TODO: write tests for locations and names / offset *)
  (* TODO: write tests for escape check *)
  let univ_type = check "Type" {|(Type : Type)|}
  let string_type = check "String" {|(String : Type)|}
  let false_type = check "False" {|(A : Type) -> A|}

  let id =
    check "id" {|((A : Type) => (x : A) => x : (A : Type) -> (x : A) -> A)|}

  let id_propagate =
    check "id_propagate" {|((A => x => x) : (A : Type) -> (x : A) -> A)|}

  let id_unify =
    check "id_unify" {|((A => (x : A) => x) : (A : Type) -> (x : A) -> A)|}

  let let_id =
    check "let_id"
      {|((
        id : (A : Type) -> (x : A) -> A = A => (x : A) => x;
        id
      ) : (A : Type) -> (x : A) -> A)|}

  let id_type = check "id_type" {|(((A : Type) => (x : A) => x) Type)|}

  let id_type_never =
    check "id_type_never"
      {|(((A : Type) => (x : A) => x) Type ((A : Type) -> A)
        : Type)|}

  let return_id_propagate =
    check "return_id_propagate"
      {|((((id : (A : Type) -> (x : A) -> A) => id) (A => x => x))
        : (A : Type) -> (x : A) -> A)|}

  let sequence =
    check "sequence"
      {|((A => (x : A) => B => (y : B) => y)
        : (A : Type) -> (x : A) -> (B : Type) -> (y : B) -> B)|}

  let bool =
    check "bool" {|(((A : Type) -> (x : A) -> (y : A) -> A)
        : Type)|}

  let true_ =
    check "true"
      {|(((A : Type) => (x : A) => (y : A) => x)
        : (A : Type) -> (x : A) -> (y : A) -> A)|}

  let true_unify =
    check "true_unify"
      {|(((A : Type) => x => (y : A) => x)
        : (A : Type) -> (x : A) -> (y : A) -> A)|}

  let false_ =
    check "false"
      {|((A => (x : A) => (y : A) => y)
        : (A : Type) -> (x : A) -> (y : A) -> A)|}

  let ind_false_T =
    check "False_T"
      {|
        (@self(False -> (f : @self(f -> @unroll False f)) -> Type) : Type)
      |}

  let ind_false =
    check "False"
      {|
        (@fix(False => f =>
          (P : (f : @self(f -> @unroll False f)) -> Type) -> P f
        ) : @self(False -> (f : @self(f -> @unroll False f)) -> Type))
      |}

  let let_alias =
    check "let_alias"
      {|
        Id : (A : Type) -> Type = (A : Type) => A;
        ((A : Type) => (x : A) => (x : Id A))
      |}

  let simple_string = check "simple_string" {|("simple string" : String)|}

  let rank_2_propagate =
    check "rank_2_propagate"
      {|
        Unit = (A : Type) -> (x : A) -> A;
        (u => u Unit u : (u : Unit) -> Unit)
      |}

  let rank_2_propagate_let =
    check "rank_2_propagate_let"
      {|
        Unit = (A : Type) -> (x : A) -> A;
        noop : (u : Unit) -> Unit = u => u Unit u;
        noop
      |}

  let invalid_annotation = fail "invalid_annotation" {|(String : "A")|}
  let simplest_escape_check = fail "simplest_escape_check" "x => A => (x : A)"

  let bound_var_escape_check =
    fail "bound_var_escape_check"
      {|
        call = f => v => f v;
        (never : (A : Type) -> A) => call never
      |}

  let hole_lowering_check =
    fail "hole_lowering_check"
      {|
        x => (A : Type) => y => (id => (_ = (id x); _ = id y; (y : A))) (x => x)
      |}

  let trivial_equality =
    check "trivial_equality"
      {|
        Eq = (A : Type) => (x : A) => (y : A) =>
          (P : (z : A) -> Type) -> (l : P x) -> P y;
        refl = (A : Type) => (x : A) =>
          (P : (z : A) -> Type) => (l : P x) => l;
        (refl Type Type : Eq Type Type Type)
      |}

  let split_at_a_distance =
    check "split_at_a_distance"
      {|
        (l : Type) =>
          (f : (X = Type; (A : X) => (x : A) -> A) Type) => f l
      |}

  let nat_256_equality =
    check "nat_256_equality"
      {|
        Eq = (A : Type) => (x : A) => (y : A) =>
          (P : (z : A) -> Type) -> (l : P x) -> P y;
        refl : (A : Type) -> (x : A) -> Eq A x x
          = (A : Type) => (x : A) =>
            (P : (z : A) -> Type) => (l : P x) => l;

        Nat = (A : Type) -> (z : A) -> (s : (acc : A) -> A) -> A;
        zero : Nat = (A : Type) => (z : A) => (s : (acc : A) -> A) => z;
        succ : (pred : Nat) -> Nat = (pred : Nat) => 
          (A : Type) => (z : A) => (s : (acc : A) -> A) => s (pred A z s);
        one = succ zero;

        add = (a : Nat) => (b : Nat) => a Nat b succ;
        mul = (a : Nat) => (b : Nat) => a Nat zero ((n : Nat) => add n b);
        
        two = succ one;
        three = succ two;
        four = add two two;
        eight = add four four;
        sixteen = add eight eight;
        n256 = mul sixteen sixteen;
        sixteen_is_eight_times_two : Eq Nat sixteen (mul eight two)
          = refl Nat sixteen;
        (refl Nat n256 : Eq Nat (mul (mul eight eight) four) n256)
      |}

  let simple_alpha_rename =
    check "simple_alpha_rename"
      {|( (f : (B : Type) -> B) => (f ((C : Type) -> C) : (D : Type) -> D)
        : (f : (E : Type) -> E) -> (F : Type) -> F)|}

  let _tests =
    [
      id_propagate;
      id_unify;
      let_id;
      return_id_propagate;
      sequence;
      true_unify;
      false_;
      ind_false_T;
      ind_false;
      rank_2_propagate;
      rank_2_propagate_let;
      simplest_escape_check;
    ]

  let _tests =
    [
      univ_type;
      string_type;
      false_type;
      id;
      (*
      id_propagate;
      id_unify;
      let_id;
      *)
      id_type;
      id_type_never;
      (* return_id_propagate; *)
      (* sequence; *)
      bool;
      true_;
      (* true_unify; *)
      (* false_; *)
      let_alias;
      simple_string;
      (*
      ind_false_T;
      ind_false;
      rank_2_propagate;
      rank_2_propagate_let;
      *)
      invalid_annotation;
      (* simplest_escape_check; *)
      bound_var_escape_check;
      hole_lowering_check;
      trivial_equality;
      split_at_a_distance;
      nat_256_equality;
      simple_alpha_rename;
    ]

  let _tests =
    [
      check "nat_256_equality"
        {|
          Eq : (A : Type) -> (x : A) -> (y : A) -> Type
            = A => x => y => (P : (z : A) -> Type) -> (l : P x) -> P y;
          refl : (A : Type) -> (x : A) -> Eq A x x
            = A => x => P => l => l;

          Nat = (A : Type) -> (z : A) -> (s : (acc : A) -> A) -> A;
          zero : Nat = A => z => s => z;
          succ : (pred : Nat) -> Nat = pred => A => z => s => s (pred A z s);
          one = succ zero;

          add : (a : Nat) -> (b : Nat) -> Nat
            = a => b => a Nat b succ;
          mul : (a : Nat) -> (b : Nat) -> Nat
            = a => b => a Nat zero (n => add n b);

          two = succ one;
          three = succ two;
          four = add two two;
          eight = add four four;
          sixteen = add eight eight;
          n256 = mul sixteen sixteen;
          n512 = mul n256 two;
          (refl Nat n512 : Eq Nat (mul (mul eight eight) eight) n512)
      |};
    ]

  let tests =
    [
      check "fix"
        {|
          Never : Type;
          Never = (A : Type) -> A;

          Unit : Type;
          unit : Unit;

          Unit = (u : Unit) & (P : (u : Unit) -> Type) -> (x : P(unit)) -> P(u);
          unit = P => x => x;
          ind_unit : (u : Unit) -> (P : (u : Unit) -> Type) ->
            (x : P(unit)) -> P(u) = u => u;

          Bool : Type;
          true : Bool;
          false : Bool;

          Bool = (b : Bool) & (P : (b : Bool) -> Type) ->
            (x : P(true)) -> (y : P(false)) -> P(b);
          true = P => x => y => x;
          false = P => x => y => y;
          ind_bool : (b : Bool) -> (P : (b : Bool) -> Type) ->
            (x : P(true)) -> (y : P(false)) -> P(b) = b => b;

          Equal : (A : Type) -> (x : A) -> (y : A) -> Type;
          refl : (A : Type) -> (x : A) -> Equal A x x;

          Equal = A => x => y => (eq : Equal A x y) &
            (P : (z : A) -> Type) -> (v : P(x)) -> P(y);
          refl = A => x => P => v => v;

          transport : (A : Type) -> (x : A) -> (y : A) -> 
            (H : Equal A x y) -> (P : (z : A) -> Type) -> (v : P(x)) -> P(y);
          transport = A => x => y => H => H;

          true_not_false : (H : Equal(Bool)(true)(false)) -> Never;
          true_not_false = H => (
            P : (b : Bool) -> Type = b => ind_bool(b)(_ => Type)(Unit)(Never);
            transport(Bool)(true)(false)(H)(P)(unit)
          );

          id : (A : Type) -> (x : A) -> A = (
            (A : Type) => (x : A) => x
          );

          true
        |};
    ]

  (* alcotest *)
  let test test =
    let check ~name ~annotated_term =
      Alcotest.test_case name `Quick @@ fun () ->
      let ctree =
        match Clexer.from_string Cparser.term_opt annotated_term with
        | Some ctree -> ctree
        | None -> failwith "failed to parse"
      in
      let ttree =
        match Solve.(solve_term initial ctree) with
        | Ok ttree -> ttree
        | Error exn ->
            failwith
            @@ Format.asprintf "failed to solve names: %s"
                 (Printexc.to_string exn)
      in
      Format.eprintf "ttree : %a\n%!" Ttree.pp_term ttree;
      match Typer.infer_term ttree with
      | Ok _type_ -> Format.eprintf "typed\n%!"
      | Error exn ->
          failwith
          @@ Format.asprintf "failed to infer types: %s"
               (Printexc.to_string exn)
    in
    let fail ~name ~annotated_term =
      Alcotest.test_case name `Quick @@ fun () ->
      let ctree =
        match Clexer.from_string Cparser.term_opt annotated_term with
        | Some ctree -> ctree
        | None -> failwith "failed to parse"
      in
      let ttree =
        match Solve.(solve_term initial ctree) with
        | Ok ttree -> ttree
        | Error exn ->
            failwith
            @@ Format.asprintf "failed to solve names: %s"
                 (Printexc.to_string exn)
      in
      match Typer.infer_term ttree with
      | Ok _type_ -> failwith "succeeded but should have failed"
      | Error _exn -> ()
    in
    match test with
    | Check { name; annotated_term } -> check ~name ~annotated_term
    | Fail { name; annotated_term } -> fail ~name ~annotated_term

  let tests = ("typer", List.map test tests)
end

let () = Alcotest.run "Teika" [ Typer.tests ]

(* TODO: (n : Nat & n >= 1, x : Nat) should be valid *)


================================================
FILE: teika/test.mli
================================================


================================================
FILE: teika/tprinter.ml
================================================
[@@@ocaml.warning "-unused-constructor"]

module Ptree = struct
  open Format
  open Utils

  type term =
    (* TODO: use PT_meta for level, subst and shift *)
    | PT_meta of { term : term }
    | PT_annot of { term : term; annot : term }
    | PT_var of { var : Name.t }
    | PT_free_var of { var : Level.t }
    | PT_bound_var of { var : Index.t }
    | PT_hoist of { bound : term; body : term }
    | PT_let of { bound : term; arg : term; body : term }
    | PT_apply of { funct : term; arg : term }
    | PT_lambda of { param : term; body : term }
    | PT_forall of { param : term; body : term }
    | PT_inter of { left : term; right : term }
    | PT_string of { literal : string }

  type term_prec = T_wrapped | T_let | T_funct | T_apply | T_atom

  let pp_term_syntax ~pp_wrapped ~pp_let ~pp_funct ~pp_apply ~pp_atom fmt term =
    match term with
    | PT_meta { term } -> fprintf fmt "#%a" pp_atom term
    | PT_annot { term; annot } ->
        fprintf fmt "%a : %a" pp_funct term pp_wrapped annot
    | PT_var { var } -> fprintf fmt "%s" (Name.repr var)
    | PT_free_var { var } -> fprintf fmt "\\+%a" Level.pp var
    | PT_bound_var { var } -> fprintf fmt "\\-%a" Index.pp var
    | PT_hoist { bound; body } ->
        (* TODO: is pp_wrapped correct here? *)
        fprintf fmt "%a; %a" pp_wrapped bound pp_let body
    | PT_let { bound; arg; body } ->
        fprintf fmt "%a = %a; %a" pp_atom bound pp_funct arg pp_let body
    | PT_lambda { param; body } ->
        fprintf fmt "%a => %a" pp_atom param pp_funct body
    | PT_apply { funct; arg } -> fprintf fmt "%a %a" pp_apply funct pp_atom arg
    | PT_string { literal } ->
        (* TODO: proper escaping *)
        fprintf fmt "%S" literal
    | PT_forall { param; body } ->
        fprintf fmt "(%a) -> %a" pp_wrapped param pp_funct body
    | PT_inter { left; right } ->
        fprintf fmt "(%a) & %a" pp_wrapped left pp_funct right

  let rec pp_term prec fmt term =
    let pp_wrapped fmt term = pp_term T_wrapped fmt term in
    let pp_let fmt term = pp_term T_let fmt term in
    let pp_funct fmt term = pp_term T_funct fmt term in
    let pp_apply fmt term = pp_term T_apply fmt term in
    let pp_atom fmt term = pp_term T_atom fmt term in
    match (term, prec) with
    | ( (PT_meta _ | PT_var _ | PT_free_var _ | PT_bound_var _ | PT_string _),
        (T_wrapped | T_let | T_funct | T_apply | T_atom) )
    | PT_apply _, (T_wrapped | T_let | T_funct | T_apply)
    | (PT_lambda _ | PT_forall _ | PT_inter _), (T_wrapped | T_let | T_funct)
    | (PT_hoist _ | PT_let _), (T_wrapped | T_let)
    | PT_annot _, T_wrapped ->
        pp_term_syntax ~pp_wrapped ~pp_let ~pp_funct ~pp_apply ~pp_atom fmt term
    | PT_apply _, T_atom
    | (PT_lambda _ | PT_forall _ | PT_inter _), (T_apply | T_atom)
    | (PT_hoist _ | PT_let _), (T_funct | T_apply | T_atom)
    | PT_annot _, (T_let | T_funct | T_apply | T_atom) ->
        fprintf fmt "(%a)" pp_wrapped term

  let pp_term fmt term = pp_term T_wrapped fmt term
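
  (* Example: precedence-driven parenthesization (informal sketch).
     An application argument is printed with pp_atom, and PT_lambda at
     T_atom falls into the parenthesized branch, so [f (x => x)] keeps
     its parentheses; a lambda body is printed with pp_funct, and
     PT_apply at T_funct is unparenthesized, so [x => f x] needs none. *)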
end

open Ttree
open Ptree

let _pt_with_type ~type_ term =
  PT_meta { term = PT_annot { term; annot = type_ } }

(* TODO: extract substitutions *)
(* TODO: rename all tt_ to term_ *)
let rec tt_print term =
  let (Term { struct_ = term; loc = _ }) = term in
  match term with
  | T_annot { term; annot } ->
      let term = tt_print term in
      let annot = tt_print annot in
      PT_annot { term; annot }
  | T_var { var } -> PT_bound_var { var }
  | T_let { bound; arg; body } ->
      let bound = tp_print bound in
      let arg = tt_print arg in
      let body = tt_print body in
      PT_let { bound; arg; body }
  | T_hoist { bound; body } ->
      let bound = vp_print bound in
      let body = tt_print body in
      PT_hoist { bound; body }
  | T_fix { bound; var = _; arg; body } ->
      (* TODO: proper var renaming *)
      let bound = vp_print bound in
      let arg = tt_print arg in
      let body = tt_print body in
      PT_let { bound; arg; body }
  | T_lambda { bound; body } ->
      let param = tp_print bound in
      let body = tt_print body in
      PT_lambda { param; body }
  | T_apply { funct; arg } ->
      let funct = tt_print funct in
      let arg = tt_print arg in
      PT_apply { funct; arg }
  | T_forall { bound; param; body } ->
      let param =
        let pat = tp_print bound in
        let annot = tt_print param in
        PT_annot { term = pat; annot }
      in
      let body = tt_print body in
      PT_forall { param; body }
  | T_self { bound; body } ->
      let left = vp_print bound in
      let right = tt_print body in
      (* TODO: self *)
      PT_inter { left; right }
  | T_tuple _ | T_exists _ -> failwith "not implemented"

and vp_print pat =
  let (VPat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | VP_annot { pat; annot } ->
      let pat = vp_print pat in
      let annot = tt_print annot in
      PT_annot { term = pat; annot }
  | VP_var { var } -> PT_var { var }

and tp_print pat =
  let (Pat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | P_annot { pat; annot } ->
      let pat = tp_print pat in
      let annot = tt_print annot in
      PT_annot { term = pat; annot }
  | P_var { var } -> PT_var { var }
  | P_tuple _ -> failwith "not implemented"

let pp_term fmt term =
  let term = tt_print term in
  Ptree.pp_term fmt term

let pp_pat fmt pat =
  let pat = tp_print pat in
  Ptree.pp_term fmt pat

module Perror = struct
  open Format
  open Utils
  open Ptree

  type error =
    | PE_loc of { loc : Location.t; error : error }
    | PE_type_clash
    | PE_unknown_var of { name : Name.t }
    | PE_not_a_forall of { type_ : term }
    | PE_hoist_not_implemented
    | PE_extensions_not_implemented
    | PE_pairs_not_implemented
    | PE_unknown_native of { native : string }
    | PE_missing_annotation
    | PE_invalid_notation

  let pp_pos fmt pos =
    let Lexing.{ pos_fname; pos_lnum; pos_bol; pos_cnum } = pos in
    (* TODO: print only file by default? *)
    (* the column is the offset within the line, not pos_bol itself *)
    fprintf fmt "%s:%d:%d" pos_fname pos_lnum (pos_cnum - pos_bol)

  let pp_loc fmt loc =
    let Location.{ loc_start; loc_end; loc_ghost = _ } = loc in
    match Location.is_none loc with
    | true -> fprintf fmt "[__NONE__]"
    | false -> fprintf fmt "[%a .. %a]" pp_pos loc_start pp_pos loc_end

  let rec pp_error fmt error =
    match error with
    | PE_loc { loc; error } -> fprintf fmt "%a\n%a" pp_loc loc pp_error error
    | PE_type_clash -> fprintf fmt "type clash"
    | PE_unknown_var { name } -> fprintf fmt "unknown variable %a" Name.pp name
    | PE_not_a_forall { type_ } ->
        fprintf fmt "expected forall\nreceived : %a" pp_term type_
    | PE_hoist_not_implemented -> fprintf fmt "hoist not implemented"
    | PE_extensions_not_implemented -> fprintf fmt "extensions not implemented"
    | PE_pairs_not_implemented -> fprintf fmt "pairs not implemented"
    | PE_unknown_native { native } -> fprintf fmt "unknown native : %S" native
    (* TODO: rename missing annotation *)
    | PE_missing_annotation -> fprintf fmt "not enough annotations"
    | PE_invalid_notation -> fprintf fmt "invalid notation"
end

let rec te_print error =
  let open Terror in
  let open Perror in
  match error with
  | TError_loc { error; loc } ->
      let rec loop loc error =
        match error with
        | TError_loc { error; loc = loc' } ->
            let loc =
              (* ignore none locations *)
              match Location.is_none loc' with
              | true -> loc
              | false -> loc'
            in
            loop loc error
        | error ->
            let error = te_print error in
            PE_loc { loc; error }
      in
      loop loc error
  | TError_type_clash -> PE_type_clash
  | TError_unknown_var { name } -> PE_unknown_var { name }
  | TError_not_a_forall { type_ } ->
      let type_ = tt_print type_ in
      PE_not_a_forall { type_ }
  | TError_extensions_not_implemented -> PE_extensions_not_implemented
  | TError_hoist_not_implemented -> PE_hoist_not_implemented
  | TError_pairs_not_implemented ->
      PE_pairs_not_implemented (* TODO: print payload *)
  | TError_unknown_native { native } -> PE_unknown_native { native }
  | TError_missing_annotation -> PE_missing_annotation
  | TError_invalid_notation -> PE_invalid_notation

let pp_error fmt error =
  let error = te_print error in
  Perror.pp_error fmt error


================================================
FILE: teika/tprinter.mli
================================================
open Format
open Ttree
open Terror

val pp_term : formatter -> term -> unit
val pp_pat : formatter -> pat -> unit
val pp_error : Format.formatter -> error -> unit


================================================
FILE: teika/ttree.ml
================================================
open Utils

(* TODO: explicit unfold for loops on terms *)
type term = Term of { struct_ : term_struct; loc : Location.t [@opaque] }

and term_struct =
  (* (M : A) *)
  | T_annot of { term : term; annot : term }
  (* \n *)
  | T_var of { var : Index.t }
  (* P = N; M *)
  | T_let of { bound : pat; arg : term; body : term }
  (* x : A; M *)
  | T_hoist of { bound : var_pat; body : term }
  (* x : A; ...; x = N; M *)
  | T_fix of { bound : var_pat; var : Index.t; arg : term; body : term }
  (* P => M *)
  | T_lambda of { bound : pat; body : term }
  (* M N *)
  | T_apply of { funct : term; arg : term }
  (* (P : A) -> B *)
  | T_forall of { bound : pat; param : term; body : term }
  (* (P : A) & B *)
  | T_self of { bound : var_pat; body : term }
  (* (M, ...) *)
  | T_tuple of { elements : term list }
  (* (x : A, ...) *)
  | T_exists of { elements : pat list }

and var_pat = VPat of { struct_ : var_pat_struct; loc : Location.t [@opaque] }

and var_pat_struct =
  (* (P : A) *)
  | VP_annot of { pat : var_pat; annot : term }
  (* x *)
  | VP_var of { var : Name.t }

and pat = Pat of { struct_ : pat_struct; loc : Location.t [@opaque] }

and pat_struct =
  (* (P : A) *)
  | P_annot of { pat : pat; annot : term }
  (* x *)
  (* TODO: drop names and uses receipts *)
  | P_var of { var : Name.t }
  (* (x, ...) *)
  | P_tuple of { elements : pat list }
[@@deriving show { with_path = false }]

let t_wrap ~loc struct_ = Term { struct_; loc }
let vp_wrap ~loc struct_ = VPat { struct_; loc }
let p_wrap ~loc struct_ = Pat { struct_; loc }


================================================
FILE: teika/ttree.mli
================================================
open Utils

(* TODO: explicit unfold for loops on terms *)
type term = Term of { struct_ : term_struct; loc : Location.t [@opaque] }

and term_struct =
  (* (M : A) *)
  | T_annot of { term : term; annot : term }
  (* \n *)
  | T_var of { var : Index.t }
  (* P = N; M *)
  | T_let of { bound : pat; arg : term; body : term }
  (* x : A; M *)
  | T_hoist of { bound : var_pat; body : term }
  (* x : A; ...; x = N; M *)
  | T_fix of { bound : var_pat; var : Index.t; arg : term; body : term }
  (* P => M *)
  | T_lambda of { bound : pat; body : term }
  (* M N *)
  | T_apply of { funct : term; arg : term }
  (* (P : A) -> B *)
  | T_forall of { bound : pat; param : term; body : term }
  (* (P : A) & B *)
  | T_self of { bound : var_pat; body : term }
  (* (M, ...) *)
  | T_tuple of { elements : term list }
  (* (x : A, ...) *)
  | T_exists of { elements : pat list }

and var_pat = VPat of { struct_ : var_pat_struct; loc : Location.t }

and var_pat_struct =
  (* (P : A) *)
  | VP_annot of { pat : var_pat; annot : term }
  (* x *)
  | VP_var of { var : Name.t }

and pat = Pat of { struct_ : pat_struct; loc : Location.t }

and pat_struct =
  (* (P : A) *)
  | P_annot of { pat : pat; annot : term }
  (* x *)
  (* TODO: drop names and uses receipts *)
  | P_var of { var : Name.t }
  (* (x, ...) *)
  | P_tuple of { elements : pat list }
[@@deriving show]

val t_wrap : loc:Location.t -> term_struct -> term
val vp_wrap : loc:Location.t -> var_pat_struct -> var_pat
val p_wrap : loc:Location.t -> pat_struct -> pat
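Note (not part of the repository): `T_var { var : Index.t }` above uses de Bruijn indices, resolved against an environment list with the newest binding first (see `Value.access` in `teika/typer.ml`). A minimal standalone sketch of that resolution, with plain `int` standing in for `Index.t`:

```ocaml
(* Toy de Bruijn evaluator: index 0 is the innermost binder, and
   application conses the argument onto the closure's environment,
   mirroring Value.append / Value.access in typer.ml. *)
type term = Var of int | Lam of term | App of term * term
type value = Closure of value list * term

let rec eval env term =
  match term with
  | Var i -> List.nth env i (* index 0 = innermost binder *)
  | Lam body -> Closure (env, body)
  | App (f, a) ->
      let funct = eval env f in
      let arg = eval env a in
      (match funct with
      | Closure (env', body) -> eval (arg :: env') body)

(* K combinator: \x. \y. x -- the body refers to x as Var 1,
   skipping past the binder for y. *)
let k = Lam (Lam (Var 1))
```

Applying `k` to two closed lambdas returns the first one, since `Var 1` skips the innermost binding.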


================================================
FILE: teika/typer.ml
================================================
open Utils
open Ttree
open Terror

module Value : sig
  type value

  and value_struct =
    | V_hole
    (* TODO: name on var? *)
    | V_var of { name : Name.t }
    | V_forward of { name : Name.t; mutable inner : value }
    | V_apply of { funct : value; arg : value }
    | V_lambda of { env : env; bound : pat; body : term }
    | V_univ
    | V_forall of { param : value; env : env; bound : pat; body : term }
    | V_self of { env : env; bound : var_pat; body : term }
    | V_thunk of { env : env; term : term }
    | V_link of { mutable value : value }

  and env [@@deriving show]

  (* environment *)
  val empty : env
  val access : env -> Index.t -> value
  val append : env -> value -> env

  (* constructors *)
  val v_null : value
  val v_var : at:Level.t -> name:Name.t -> value
  val fresh_v_hole : at:Level.t -> value
  val fresh_v_forward : name:Name.t -> value
  val v_apply : funct:value -> arg:value -> value
  val v_lambda : env:env -> bound:pat -> body:term -> value
  val v_univ : value
  val v_forall : param:value -> env:env -> bound:pat -> body:term -> value
  val v_self : env:env -> bound:var_pat -> body:term -> value
  val v_thunk : env:env -> term:term -> value

  (* utilities *)
  val repr : value -> value
  val struct_ : value -> value_struct
  val level : value -> Level.t
  val same : value -> value -> bool
  val assert_forward : value -> unit
  val init_forward : value -> to_:value -> unit
  val lock_forward : value -> (unit -> 'a) -> 'a
  val hole_lower : value -> to_:Level.t -> unit
  val hole_link : value -> to_:value -> unit
  val thunk_link : value -> to_:value -> unit
end = struct
  type value = { mutable struct_ : value_struct; mutable at : Level.t }

  and value_struct =
    | V_hole
    | V_var of { name : Name.t }
    | V_forward of { name : Name.t; mutable inner : value [@opaque] }
    | V_apply of { funct : value; arg : value }
    | V_lambda of { env : env; [@opaque] bound : pat; body : term }
      (* TODO: is univ actually needed or useful here? *)
    | V_univ
    (* TODO: non dependent version of types and function *)
    | V_forall of {
        param : value;
        env : env; [@opaque]
        bound : pat;
        body : term;
      }
    | V_self of { env : env; [@opaque] bound : var_pat; body : term }
    | V_thunk of { env : env; [@opaque] term : term }
    | V_link of { mutable value : value }

  and env = value list [@@deriving show { with_path = false }]

  let v_new ~at struct_ = { struct_; at }

  let v_null =
    let name = Name.make "**null**" in
    v_new ~at:Level.zero @@ V_var { name }

  let v_var ~at ~name = v_new ~at @@ V_var { name }
  let fresh_v_hole ~at = v_new ~at @@ V_hole

  let fresh_v_forward ~name =
    (* TODO: proper level here *)
    let at = Level.zero in
    v_new ~at @@ V_forward { name; inner = v_null }

  let v_apply ~funct ~arg =
    let at = Level.max funct.at arg.at in
    v_new ~at @@ V_apply { funct; arg }

  let v_lambda ~env ~bound ~body =
    (* TODO: proper level for lambdas *)
    let at = Level.zero in
    v_new ~at @@ V_lambda { env; bound; body }

  let v_univ = v_new ~at:Level.zero @@ V_univ

  let v_forall ~param ~env ~bound ~body =
    (* TODO: proper level for forall *)
    let at = Level.zero in
    v_new ~at @@ V_forall { param; env; bound; body }

  let v_self ~env ~bound ~body =
    (* TODO: proper level for self *)
    let at = Level.zero in
    v_new ~at @@ V_self { env; bound; body }

  let v_thunk ~env ~term =
    (* TODO: proper level here *)
    let at = Level.zero in
    v_new ~at @@ V_thunk { env; term }

  let rec repr value =
    match value.struct_ with
    | V_link { value } -> repr value
    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ
    | V_forall _ | V_self _ | V_thunk _ ->
        value

  (* TODO: inline repr? *)
  let repr value =
    match value.struct_ with
    | V_link ({ value } as link) ->
        (* path compression *)
        let value = repr value in
        link.value <- value;
        value
    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ
    | V_forall _ | V_self _ | V_thunk _ ->
        value

  let struct_ value = (repr value).struct_

  (* TODO: level vs at *)
  let level value = (repr value).at
  let same (left : value) (right : value) = left == right

  let assert_forward value =
    match value.struct_ with
    | V_forward { name = _; inner = _ } -> ()
    | V_hole | V_var _ | V_apply _ | V_lambda _ | V_univ | V_forall _ | V_self _
    | V_thunk _ | V_link _ ->
        failwith "assert_forward: not a forward"

  let init_forward value ~to_ =
    let value = repr value in
    match value.struct_ with
    | V_forward ({ name = _; inner } as forward) -> (
        match same inner v_null with
        | true -> forward.inner <- to_
        | false -> failwith "init_forward: already initialized")
    | V_hole | V_var _ | V_apply _ | V_lambda _ | V_univ | V_forall _ | V_self _
    | V_thunk _ | V_link _ ->
        failwith "init_forward: not a forward"

  let lock_forward value f =
    match struct_ value with
    | V_forward ({ name = _; inner } as forward) ->
        forward.inner <- v_null;
        let finally () = forward.inner <- inner in
        Fun.protect ~finally f
    | V_hole | V_var _ | V_apply _ | V_lambda _ | V_univ | V_forall _ | V_self _
    | V_thunk _ | V_link _ ->
        failwith "lock_forward: not a forward"

  let hole_lower hole ~to_ =
    let hole = repr hole in
    (match hole.struct_ with
    | V_hole -> ()
    | _ -> failwith "hole_lower: not a hole");
    hole.at <- Level.min hole.at to_

  let hole_link hole ~to_ =
    let hole = repr hole in
    (match hole.struct_ with
    | V_hole -> ()
    | _ -> failwith "hole_link: not a hole");
    hole.struct_ <- V_link { value = to_ }

  let thunk_link thunk ~to_ =
    let thunk = repr thunk in
    (match thunk.struct_ with
    | V_thunk _ -> ()
    | _ -> failwith "thunk_link: not a thunk");
    thunk.struct_ <- V_link { value = to_ }

  let empty = []

  let access env var =
    let var = (var : Index.t :> int) in
    match List.nth_opt env var with
    | Some value -> value
    | None -> failwith "access: unknown variable"

  let append env value = value :: env
end

module Eval = struct
  open Value

  let rec with_var_pat env bound ~arg =
    let (VPat { struct_ = bound; loc = _ }) = bound in
    match bound with
    | VP_annot { pat; annot = _ } -> with_var_pat env pat ~arg
    | VP_var { var = _ } ->
        (* TODO: name and maybe type here? *)
        append env arg

  let rec with_pat env bound ~arg =
    let (Pat { struct_ = bound; loc = _ }) = bound in
    match bound with
    | P_annot { pat; annot = _ } -> with_pat env pat ~arg
    | P_var { var = _ } ->
        (* TODO: name and maybe type here? *)
        append env arg
    | P_tuple { elements = _ } -> failwith "not implemented"

  let rec fresh_v_forward_of_var_pat pat =
    let (VPat { struct_ = pat; loc = _ }) = pat in
    match pat with
    | VP_annot { pat; annot = _ } -> fresh_v_forward_of_var_pat pat
    | VP_var { var = name } -> fresh_v_forward ~name

  let rec eval env term =
    let (Term { struct_ = term; loc = _ }) = term in
    match term with
    | T_annot { term; annot = _ } -> eval env term
    | T_var { var } -> weak_force @@ access env var
    | T_hoist { bound; body } ->
        let env =
          let arg = fresh_v_forward_of_var_pat bound in
          with_var_pat env bound ~arg
        in
        eval env body
    | T_fix { bound = _; var; arg; body } ->
        let forward = access env var in
        let () = assert_forward forward in
        let () =
          let arg = eval env arg in
          init_forward forward ~to_:arg
        in
        eval env body
    | T_let { bound; arg; body } ->
        let env =
          let arg = eval env arg in
          with_pat env bound ~arg
        in
        eval env body
    | T_apply { funct; arg } ->
        let funct = eval env funct in
        let arg = eval env arg in
        eval_apply ~funct ~arg
    | T_lambda { bound; body } -> v_lambda ~env ~bound ~body
    | T_forall { bound; param; body } ->
        let param = eval env param in
        v_forall ~param ~env ~bound ~body
    | T_self { bound; body } -> v_self ~env ~bound ~body
    | T_tuple _ | T_exists _ -> failwith "not implemented"

  and eval_apply ~funct ~arg =
    let funct = weak_force funct in
    match struct_ funct with
    | V_lambda { env; bound; body } ->
        let env = with_pat env bound ~arg in
        eval env body
    | V_var _ | V_forward _ | V_apply _ -> v_apply ~funct ~arg
    | V_hole | V_univ | V_forall _ | V_self _ ->
        failwith "eval_apply: type clash"
    | V_link _ | V_thunk _ -> failwith "eval_apply: unreachable"

  and weak_force value =
    (* TODO: forcing every time removes some short circuits *)
    let value = repr value in
    match struct_ value with
    | V_thunk { env; term } ->
        (* TODO: detect recursive force? *)
        let final = eval env term in
        thunk_link value ~to_:final;
        final
    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ
    | V_forall _ | V_self _ | V_link _ ->
        value

  let rec strong_force value =
    (* TODO: forcing every time removes some short circuits *)
    (* TODO: path compression is bad for reasons *)
    let value = weak_force value in
    match struct_ value with
    | V_forward { name = _; inner } -> (
        match same inner v_null with
        | true -> value
        | false -> strong_force inner)
    | V_apply { funct; arg } ->
        let funct = strong_force funct in
        strong_eval_apply ~funct ~arg
    | V_hole | V_var _ | V_lambda _ | V_univ | V_forall _ | V_self _ | V_link _
    | V_thunk _ ->
        value

  and strong_eval_apply ~funct ~arg =
    match struct_ funct with
    | V_lambda { env; bound; body } ->
        let env = with_pat env bound ~arg in
        strong_force @@ eval env body
    | V_var _ | V_forward _ | V_apply _ -> v_apply ~funct ~arg
    | V_hole | V_univ | V_forall _ | V_self _ ->
        failwith "strong_eval_apply: type clash"
    | V_link _ | V_thunk _ -> failwith "strong_eval_apply: unreachable"
end

module Unify = struct
  open Value
  open Eval

  let rec unify_check ~at ~hole in_ =
    (* TODO: short circuit on level *)
    (* TODO: color to avoid size explosion *)
    let in_ = weak_force in_ in
    match struct_ in_ with
    | V_hole -> (
        match same hole in_ with
        | true -> failwith "occurs check"
        | false -> hole_lower in_ ~to_:at)
    | V_var { name = _ } -> (
        (* TODO: poly comparison *)
        match level in_ >= at with
        | true -> failwith "escape check"
        | false -> ())
    | V_forward { name = _; inner } ->
        lock_forward in_ @@ fun () -> unify_check ~at ~hole inner
    | V_apply { funct; arg } ->
        unify_check ~at ~hole funct;
        unify_check ~at ~hole arg
    | V_univ -> ()
    | V_lambda { env; bound = _; body } -> unify_check_under ~at ~hole env body
    | V_forall { param; env; bound = _; body } ->
        unify_check ~at ~hole param;
        unify_check_under ~at ~hole env body
    | V_self { env; bound = _; body } -> unify_check_under ~at ~hole env body
    | V_thunk _ | V_link _ -> failwith "unify_check: unreachable"

  and unify_check_under ~at ~hole env body =
    (* TODO: fill this *)
    let name = Name.make "**unify_check_under**" in
    let skolem = v_var ~at ~name in
    let at = Level.next at in
    let body =
      let env = append env skolem in
      eval env body
    in
    unify_check ~at ~hole body

  let unify_hole ~at ~hole ~to_ =
    match same hole to_ with
    | true -> ()
    | false ->
        unify_check ~at ~hole to_;
        hole_link hole ~to_

  let rec unify ~at lhs rhs =
    (* TODO: do repr shortcircuit first *)
    let lhs = weak_force lhs in
    let rhs = weak_force rhs in
    match same lhs rhs with true -> () | false -> unify_struct ~at lhs rhs

  and unify_struct ~at lhs rhs =
    match (struct_ lhs, struct_ rhs) with
    | V_hole, _ -> unify_hole ~at ~hole:lhs ~to_:rhs
    | _, V_hole -> unify_hole ~at ~hole:rhs ~to_:lhs
    | V_var { name = _ }, V_var { name = _ } -> failwith "var clash"
    | ( V_forward { name = lhs_name; inner = lhs_inner },
        V_forward { name = rhs_name; inner = rhs_inner } ) -> (
        match same lhs_inner v_null || same rhs_inner v_null with
        | true ->
            failwith
            @@ Format.asprintf "forward clash: %s == %s" (Name.repr lhs_name)
                 (Name.repr rhs_name)
        | false ->
            (* TODO: is this a good idea? *)
            lock_forward lhs @@ fun () ->
            lock_forward rhs @@ fun () -> unify ~at lhs_inner rhs_inner)
    | ( V_apply { funct = lhs_funct; arg = lhs_arg },
        V_apply { funct = rhs_funct; arg = rhs_arg } ) ->
        unify ~at lhs_funct rhs_funct;
        unify ~at lhs_arg rhs_arg
    | ( V_lambda { env = lhs_env; bound = _; body = lhs_body },
        V_lambda { env = rhs_env; bound = _; body = rhs_body } ) ->
        unify_under ~at lhs_env lhs_body rhs_env rhs_body
    | V_univ, V_univ -> ()
    | ( V_forall { param = lhs_param; env = lhs_env; bound = _; body = lhs_body },
        V_forall
          { param = rhs_param; env = rhs_env; bound = _; body = rhs_body } ) ->
        unify ~at lhs_param rhs_param;
        unify_under ~at lhs_env lhs_body rhs_env rhs_body
    | ( V_self { env = lhs_env; bound = _; body = lhs_body },
        V_self { env = rhs_env; bound = _; body = rhs_body } ) ->
        (* TODO: check only bound? *)
        unify_under ~at lhs_env lhs_body rhs_env rhs_body
    | ( ( V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ | V_forall _
        | V_self _ | V_thunk _ | V_link _ ),
        ( V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ | V_forall _
        | V_self _ | V_thunk _ | V_link _ ) ) ->
        error_type_clash ()

  and unify_under ~at lhs_env lhs rhs_env rhs =
    (* TODO: should use pattern *)
    (* TODO: fill this *)
    let name = Name.make "**unify_under**" in
    let skolem = v_var ~at ~name in
    let at = Level.next at in
    let lhs =
      let lhs_env = append lhs_env skolem in
      eval lhs_env lhs
    in
    let rhs =
      let rhs_env = append rhs_env skolem in
      eval rhs_env rhs
    in
    unify ~at lhs rhs
end

module Machinery = struct
  open Value
  open Eval

  let rec inst_self ~self type_ =
    let type_ = strong_force type_ in
    match struct_ @@ type_ with
    | V_self { env; bound; body } ->
        let type_ =
          let env = with_var_pat env bound ~arg:self in
          eval env body
        in
        inst_self ~self type_
    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ
    | V_forall _ | V_thunk _ | V_link _ ->
        type_

  let split_forall value =
    let value = strong_force value in
    match struct_ value with
    | V_forall { param; env; bound; body } -> (param, env, bound, body)
    | V_hole -> failwith "hole is not a forall"
    | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ | V_self _
    | V_thunk _ | V_link _ ->
        failwith "not a forall"

  let coerce ~at ~self lhs rhs =
    let lhs = inst_self ~self lhs in
    (* TODO: this is really bad *)
    let rhs = inst_self ~self rhs in
    Format.eprintf "%a == %a\n%!" pp_value lhs pp_value rhs;
    Unify.unify ~at lhs rhs
end

open Value
open Eval
open Unify
open Machinery

type value = Value.value

(* infer *)
type vars = Vars of { types : (Name.t * value) list }
[@@ocaml.unboxed] [@@deriving show { with_path = false }]

let rec v_skolem ~at pat =
  let (Pat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | P_annot { pat; annot = _ } -> v_skolem ~at pat
  | P_var { var = name } -> v_var ~at ~name
  | P_tuple _ -> failwith "not implemented"

let rec v_skolem_of_var_pat ~at pat =
  let (VPat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | VP_annot { pat; annot = _ } -> v_skolem_of_var_pat ~at pat
  | VP_var { var = name } -> v_var ~at ~name

let rec enter vars pat ~type_ =
  let (Pat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | P_annot { pat; annot = _ } -> enter vars pat ~type_
  | P_var { var = name } ->
      let (Vars { types }) = vars in
      (* TODO: why this *)
      let type_ =
        (* TODO: thunk strong force *)
        strong_force type_
      in
      let types = (name, type_) :: types in
      Vars { types }
  | P_tuple _ -> failwith "not implemented"

let rec enter_var_pat vars pat ~type_ =
  let (VPat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | VP_annot { pat; annot = _ } -> enter_var_pat vars pat ~type_
  | VP_var { var = name } ->
      let (Vars { types }) = vars in
      (* TODO: why this *)
      let type_ =
        (* TODO: thunk strong force *)
        strong_force type_
      in
      let types = (name, type_) :: types in
      Vars { types }

let solve vars env var =
  let rec solve types var =
    match (types, var) with
    | (_name, type_) :: _types, 0 -> type_
    | (_name, _type) :: types, var -> solve types (var - 1)
    | [], _var ->
        (* TODO: this is a problem *)
        failwith "unexpected unbound variable"
  in
  let (Vars { types }) = vars in
  let self = access env var in
  let var = ((var : Index.t) :> int) in
  let type_ = solve types var in
  inst_self ~self type_

let rec type_of_pat env pat ~type_ =
  let (Pat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | P_annot { pat; annot } ->
      let type_ = v_thunk ~env ~term:annot in
      type_of_pat env pat ~type_
  | P_var { var = _ } -> type_
  | P_tuple _ -> failwith "not implemented"

let rec type_of_var_pat env pat ~type_ =
  let (VPat { struct_ = pat; loc = _ }) = pat in
  match pat with
  | VP_annot { pat; annot } ->
      let type_ = v_thunk ~env ~term:annot in
      type_of_var_pat env pat ~type_
  | VP_var { var = _ } -> type_

(* TODO: ideally ensure that infer_term returns head normalized type *)
let rec infer_term vars env ~at term =
  let expected_self = None in
  let expected = fresh_v_hole ~at in
  check_term vars env ~at term ~expected_self ~expected;
  (* TODO: is this correct or a good idea? *)
  let self = v_thunk ~env ~term in
  inst_self ~self expected

and check_term vars env ~at term ~expected_self ~expected =
  (* TODO: not principled, let and annot will break this *)
  let (Term { struct_ = term; loc = _ }) = term in
  match term with
  | T_annot { term; annot } ->
      let annot = check_annot vars env ~at annot ~expected_self ~expected in
      check_term vars env ~at term ~expected_self ~expected:annot
  | T_var { var } ->
      (* TODO: use expected_self? *)
      let received = solve vars env var in
      let self = access env var in
      coerce ~at ~self received expected
  | T_hoist { bound; body } ->
      (* TODO: ensure it's eventually bound *)
      let type_ = infer_var_pat vars env ~at bound in
      let vars = enter_var_pat vars bound ~type_ in
      let arg = fresh_v_forward_of_var_pat bound in
      let env = with_var_pat env bound ~arg in
      check_term vars env ~at body ~expected_self ~expected
  | T_fix { bound; var; arg; body } ->
      (* TODO: ensure it's not trivially recursive? A = M(A) *)
      let forward = access env var in
      let () = assert_forward forward in
      let () =
        let expected = solve vars env var in
        check_var_pat vars env ~at bound ~expected;
        let self = forward in
        let expected_self = Some self in
        let expected = type_of_var_pat env bound ~type_:expected in
        check_term vars env ~at arg ~expected_self ~expected
      in
      let () =
        let arg = v_thunk ~env ~term:arg in
        init_forward forward ~to_:arg
      in
      let expected =
        (* TODO: this could unlock some reductions *)
        match expected_self with
        | Some self -> inst_self ~self expected
        | None -> expected
      in
      check_term vars env ~at body ~expected_self ~expected
  | T_let { bound; arg; body } ->
      let type_ = infer_pat vars env ~at bound in
      let () =
        check_term vars env ~at arg ~expected_self:None ~expected:type_
      in
      let arg = v_thunk ~env ~term:arg in
      let vars = enter vars bound ~type_ in
      let env = with_pat env bound ~arg in
      check_term vars env ~at body ~expected_self ~expected
  | T_lambda { bound; body } ->
      let expected_param, expected_env, expected_bound, expected_body =
        split_forall expected
      in
      let () = check_pat vars env ~at bound ~expected:expected_param in
      let param = type_of_pat env bound ~type_:expected_param in
      let skolem = v_skolem ~at bound in
      let vars = enter vars bound ~type_:param in
      let env = with_pat env bound ~arg:skolem in
      let at = Level.next at in
      let expected =
        let env = with_pat expected_env expected_bound ~arg:skolem in
        eval env expected_body
      in
      let expected = strong_force expected in
      let expected, expected_self =
        match expected_self with
        | Some self ->
            let self = strong_force self in
            let self = eval_apply ~funct:self ~arg:skolem in
            let expected = inst_self ~self expected in
            (expected, Some self)
        | None -> (expected, None)
      in
      check_term vars env ~at body ~expected_self ~expected
  | T_apply { funct; arg } ->
      let funct_type = infer_term vars env ~at funct in
      let param, body_env, bound, body_type = split_forall funct_type in
      let () =
        check_term vars env ~at arg ~expected_self:None ~expected:param
      in
      let received =
        let arg = v_thunk ~env ~term:arg in
        let body_env = with_pat body_env bound ~arg in
        eval body_env body_type
      in
      (* TODO: coerce? *)
      unify ~at received expected
  | T_forall { bound; param; body } ->
      unify ~at v_univ expected;
      let () =
        check_term vars env ~at param ~expected_self:None ~expected:v_univ
      in
      let param = eval env param in
      check_pat vars env ~at bound ~expected:param;
      let skolem = v_skolem ~at bound in
      let at = Level.next at in
      let vars = enter vars bound ~type_:param in
      let env = append env skolem in
      check_term vars env ~at body ~expected_self:None ~expected:v_univ
  | T_self { bound; body } ->
      (* TODO: this is really ugly *)
      unify ~at v_univ expected;
      let expected_self =
        match expected_self with
        | Some expected_self -> expected_self
        | None -> failwith "self is only supported in a fixpoint"
      in
      check_var_pat vars env ~at bound ~expected:expected_self;
      let type_ = type_of_var_pat env bound ~type_:expected_self in
      let skolem = v_skolem_of_var_pat ~at bound in
      let at = Level.next at in
      let vars = enter_var_pat vars bound ~type_ in
      let env = with_var_pat env bound ~arg:skolem in
      check_term vars env ~at body ~expected_self:None ~expected:v_univ
  | T_tuple _ | T_exists _ -> failwith "not implemented"

and check_annot vars env ~at annot ~expected_self ~expected =
  check_term vars env ~at annot ~expected_self:None ~expected:v_univ;
  let received = eval env annot in
  match expected_self with
  | Some self ->
      let received = inst_self ~self received in
      coerce ~at ~self received expected;
      received
  | None ->
      unify ~at received expected;
      received

and infer_var_pat vars env ~at pat =
  let expected = fresh_v_hole ~at in
  check_var_pat vars env ~at pat ~expected;
  expected

and check_var_pat vars env ~at pat ~expected =
  let (VPat { struct_ = pat_struct; loc = _ }) = pat in
  match pat_struct with
  | VP_annot { pat; annot } ->
      let annot =
        check_annot vars env ~at annot ~expected_self:None ~expected
      in
      check_var_pat vars env ~at pat ~expected:annot
  | VP_var { var = _ } -> ()

and infer_pat vars env ~at pat =
  let expected = fresh_v_hole ~at in
  check_pat vars env ~at pat ~expected;
  expected

and check_pat vars env ~at pat ~expected =
  let (Pat { struct_ = pat_struct; loc = _ }) = pat in
  match pat_struct with
  | P_annot { pat; annot } ->
      let annot =
        check_annot vars env ~at annot ~expected_self:None ~expected
      in
      check_pat vars env ~at pat ~expected:annot
  | P_var { var = _ } -> ()
  | P_tuple _ -> failwith "not implemented"

(* external *)
let infer_term term =
  let at = Level.(next zero) in
  let vars =
    let types = [ (Name.make "Type", v_univ) ] in
    Vars { types }
  in
  let env = append empty v_univ in
  try Ok (infer_term vars env ~at term) with exn -> Error exn
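Note (not part of the repository): `Value.repr` above chases `V_link` chains with path compression, rewriting every node on the path to point directly at the representative. A minimal standalone sketch of that technique on a bare mutable cell:

```ocaml
(* Toy version of Value.repr's path compression: follow Link chains
   to the root, then redirect each traversed node straight at it, so
   later lookups are O(1). *)
type node = { mutable content : content }
and content = Root of string | Link of node

let rec repr node =
  match node.content with
  | Root _ -> node
  | Link next ->
      let root = repr next in
      node.content <- Link root; (* path compression *)
      root
```

After `repr` on a chain `a -> b -> c`, both `a` and `b` link directly to `c`; the real `repr` does the same but also has to skip compression for non-link constructors.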


================================================
FILE: teika/typer.mli
================================================
open Ttree

type value

val infer_term : term -> (value, exn) result
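Note (not part of the repository): before `Unify.unify_hole` links a hole, `unify_check` walks the target value and fails if the hole occurs inside it, since the link would create a cycle. A minimal standalone sketch of that occurs check on a toy first-order type language:

```ocaml
(* Toy occurs check: a hole may not be linked to a type that
   contains that same hole (compared by physical equality),
   mirroring the "occurs check" failure in Unify.unify_check. *)
type ty = { mutable repr : repr }
and repr = Hole | Arrow of ty * ty | Var of string

let rec occurs hole ty =
  hole == ty
  ||
  match ty.repr with
  | Hole | Var _ -> false
  | Arrow (a, b) -> occurs hole a || occurs hole b

let link hole ~to_ =
  if occurs hole to_ then failwith "occurs check"
  else hole.repr <- to_.repr
```

Linking a hole to `Arrow (hole, _)` fails, while linking it to an unrelated type succeeds; the real check additionally lowers hole levels and performs an escape check on skolems.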


================================================
FILE: teikalsp/dune
================================================
(executable
 (name teikalsp)
 (libraries lsp eio eio_main)
 (preprocess
  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord sedlex.ppx)))


================================================
FILE: teikalsp/lsp_channel.ml
================================================
module Io : sig
  type 'a t

  val return : 'a -> 'a t
  val raise : exn -> 'a t
  val await : 'a t -> 'a
  val async : (sw:Eio.Switch.t -> ('a, exn) result) -> 'a t

  module O : sig
    val ( let+ ) : 'a t -> ('a -> 'b) -> 'b t
    val ( let* ) : 'a t -> ('a -> 'b t) -> 'b t
  end
end = struct
  type 'a t = sw:Eio.Switch.t -> ('a, exn) result Eio.Promise.t

  let await t = Eio.Switch.run @@ fun sw -> Eio.Promise.await_exn (t ~sw)
  let return value ~sw:_ = Eio.Promise.create_resolved (Ok value)
  let error desc ~sw:_ = Eio.Promise.create_resolved (Error desc)

  let async f ~sw =
    let promise, resolver = Eio.Promise.create () in
    ( Eio.Fiber.fork ~sw @@ fun () ->
      try
        let result = f ~sw in
        Eio.Promise.resolve resolver result
      with exn -> Eio.Promise.resolve resolver @@ Error exn );
    promise

  let bind t f =
    async @@ fun ~sw ->
    match Eio.Promise.await (t ~sw) with
    | Ok value -> Eio.Promise.await @@ f value ~sw
    | Error desc -> Error desc

  let raise = error

  module O = struct
    let ( let+ ) x f = bind x @@ fun value -> return @@ f value
    let ( let* ) = bind
  end
end

module Chan : sig
  type input
  type output

  (* eio *)
  val of_source : #Eio.Flow.source -> input
  val with_sink : #Eio.Flow.sink -> (output -> 'a) -> 'a

  (* lsp *)
  val read_line : input -> string option Io.t
  val read_exactly : input -> int -> string option Io.t
  val write : output -> string -> unit Io.t
end = struct
  type input = Input of { mutex : Eio.Mutex.t; buf : Eio.Buf_read.t }
  type output = Output of { mutex : Eio.Mutex.t; buf : Eio.Buf_write.t }

  (* TODO: magic numbers *)
  let initial_size = 1024
  let max_size = 1024 * 1024

  let of_source source =
    let mutex = Eio.Mutex.create () in
    let buf = Eio.Buf_read.of_flow ~initial_size ~max_size source in
    Input { mutex; buf }

  let with_sink sink f =
    let mutex = Eio.Mutex.create () in
    Eio.Buf_write.with_flow ~initial_size sink @@ fun buf ->
    f @@ Output { mutex; buf }

  let read_line input =
    let (Input { mutex; buf }) = input in
    Io.async @@ fun ~sw:_ ->
    (* TODO: what does this protect flag do? *)
    Eio.Mutex.use_rw ~protect:true mutex @@ fun () ->
    match Eio.Buf_read.eof_seen buf with
    | true -> Ok None
    | false -> Ok (Some (Eio.Buf_read.line buf))

  let read_exactly input size =
    let (Input { mutex; buf }) = input in
    Io.async @@ fun ~sw:_ ->
    Eio.Mutex.use_rw ~protect:true mutex @@ fun () ->
    match Eio.Buf_read.eof_seen buf with
    | true -> Ok None
    | false -> Ok (Some (Eio.Buf_read.take size buf))

  let write output str =
    let (Output { mutex; buf }) = output in
    Io.async @@ fun ~sw:_ ->
    Eio.Mutex.use_rw ~protect:true mutex @@ fun () ->
    Ok (Eio.Buf_write.string buf str)
end

module Lsp_io = Lsp.Io.Make (Io) (Chan)
open Jsonrpc
open Lsp_error

(* TODO: is a mutex needed for write? *)
type channel = Chan.output
type t = channel

let notify channel notification =
  (* TODO: fork here *)
  (* TODO: buffering and async? *)
  let notification = Lsp.Server_notification.to_jsonrpc notification in
  Io.await @@ Lsp_io.write channel @@ Notification notification

let respond channel response =
  Io.await @@ Lsp_io.write channel @@ Response response

let rec input_loop ~input ~output with_ =
  (* TODO: buffering and async handling *)
  match Io.await @@ Lsp_io.read input with
  | Some packet ->
      let () = with_ packet in
      input_loop ~input ~output with_
  | exception exn -> (* TODO: handle this exception *) raise exn
  | None ->
      (* TODO: this means EOF right? *)
      ()

let request_of_jsonrpc request =
  match Lsp.Client_request.of_jsonrpc request with
  | Ok request -> request
  | Error error -> fail (Error_invalid_request { error })

let notification_of_jsonrpc notification =
  match Lsp.Client_notification.of_jsonrpc notification with
  | Ok notification -> notification
  | Error error -> fail (Error_invalid_notification { error })

type on_request = {
  f :
    'response.
    channel ->
    'response Lsp.Client_request.t ->
    ('response, Response.Error.t) result;
}

let listen ~input ~output ~on_request ~on_notification =
  let on_request channel request =
    (* TODO: error handling *)
    let result =
      let (E request) = request_of_jsonrpc request in
      match on_request.f channel request with
      | Ok result -> Ok (Lsp.Client_request.yojson_of_result request result)
      | Error _error as error -> error
    in
    let response = Jsonrpc.Response.{ id = request.id; result } in
    respond channel response
  in
  let on_notification channel notification =
    let notification = notification_of_jsonrpc notification in
    on_notification channel notification
  in

  let input = Chan.of_source input in
  Chan.with_sink output @@ fun channel ->
  input_loop ~input ~output @@ fun packet ->
  (* TODO: make this async? *)
  match packet with
  | Notification notification -> on_notification channel notification
  | Request request -> on_request channel request
  | Batch_call calls ->
      (* TODO: what if one fails? It should not prevent the others *)
      List.iter
        (fun call ->
          match call with
          | `Request request -> on_request channel request
          | `Notification notification -> on_notification channel notification)
        calls
  (* TODO: can the server receive a response?
      Yes, but it is not supported right now *)
  | Response _ -> fail Error_response_unsupported
  | Batch_response _ -> fail Error_response_unsupported
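
The `on_request` handler above is wrapped in a record with an explicitly polymorphic field because `Lsp.Client_request.t` is a GADT: passing the handler as a plain function argument would monomorphize `'response` at the call site. A self-contained sketch of the same pattern with a toy GADT (all names here are hypothetical, not from the Lsp library):

```ocaml
(* A toy GADT of requests, each constructor fixing its response type. *)
type _ req = Int_req : int req | Str_req : string req

(* The handler lives in a record with a universally quantified field, so
   a single value can answer every constructor of the GADT. *)
type handler = { f : 'r. 'r req -> 'r }

let handle { f } = (f Int_req, f Str_req)

let h =
  {
    f =
      (fun (type r) (req : r req) : r ->
        match req with Int_req -> 42 | Str_req -> "ok");
  }

let () = assert (handle h = (42, "ok"))
```

The locally abstract type `(type r)` is what lets the `match` refine the return type per constructor; without the record, `handle` could only call `f` at one response type.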


================================================
FILE: teikalsp/lsp_channel.mli
================================================
open Jsonrpc

type channel
type t = channel

val notify : channel -> Lsp.Server_notification.t -> unit

type on_request = {
  f :
    'response.
    channel ->
    'response Lsp.Client_request.t ->
    ('response, Response.Error.t) result;
}

(* TODO: request *)
val listen :
  input:#Eio.Flow.source ->
  output:#Eio.Flow.sink ->
  on_request:on_request ->
  on_notification:(channel -> Lsp.Client_notification.t -> unit) ->
  unit

(* val input_loop : input:Chan.input ->
   output:Chan.output -> (Jsonrpc.Packet.t -> Jsonrpc.Packet.t list) -> unit) *)


================================================
FILE: teikalsp/lsp_context.ml
================================================
open Lsp.Types
open Lsp_error
module Document_uri_map = Map.Make (DocumentUri)

(* TODO: capabilities *)
(* TODO: initialized *)
type status = Handshake | Running

type context = {
  mutable status : status;
  mutable text_documents : Lsp_text_document.t Document_uri_map.t;
}

type t = context

let create () = { status = Handshake; text_documents = Document_uri_map.empty }
let status context = context.status

let initialize context =
  match context.status with
  | Handshake -> context.status <- Running
  | Running -> fail Error_invalid_status_during_initialize

let update_text_documents context f =
  let text_documents = context.text_documents in
  let text_documents = f text_documents in
  context.text_documents <- text_documents

let open_text_document context uri text_document =
  update_text_documents context @@ fun text_documents ->
  (match Document_uri_map.mem uri text_documents with
  | true -> fail Error_text_document_already_in_context
  | false -> ());
  Document_uri_map.add uri text_document text_documents

let change_text_document context uri cb =
  update_text_documents context @@ fun text_documents ->
  let text_document =
    match Document_uri_map.find_opt uri text_documents with
    | Some text_document -> text_document
    | None -> fail Error_text_document_not_in_context
  in
  let text_document = cb text_document in
  (* TODO: only accept if version is newer or equal *)
  Document_uri_map.add uri text_document text_documents

let close_text_document context uri =
  update_text_documents context @@ fun text_documents ->
  (match Document_uri_map.mem uri text_documents with
  | true -> ()
  | false -> fail Error_text_document_not_in_context);
  Document_uri_map.remove uri text_documents
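
The document store above is a uri-keyed functional map: `open` fails when the uri is already tracked, while `change` and `close` fail when it is not. A standalone sketch of that lifecycle over string keys (hypothetical names, no Lsp dependency):

```ocaml
module Docs = Map.Make (String)

exception Already_open
exception Not_open

(* opening fails when the uri is already tracked *)
let open_doc uri doc docs =
  if Docs.mem uri docs then raise Already_open;
  Docs.add uri doc docs

(* changing fails when the uri is unknown *)
let change_doc uri f docs =
  match Docs.find_opt uri docs with
  | Some doc -> Docs.add uri (f doc) docs
  | None -> raise Not_open

(* closing fails when the uri is unknown *)
let close_doc uri docs =
  if not (Docs.mem uri docs) then raise Not_open;
  Docs.remove uri docs
```

Because the map is persistent, each operation returns a new map, which is what `update_text_documents` then writes back into the mutable context field.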


================================================
FILE: teikalsp/lsp_context.mli
================================================
open Lsp.Types

type status = private Handshake | Running
type context
type t = context

(* TODO: rollback? Requests and notifications should probably be atomic *)
val create : unit -> context
val status : context -> status
val initialize : context -> unit

(* documents *)
val open_text_document : context -> DocumentUri.t -> Lsp_text_document.t -> unit

val change_text_document :
  context ->
  DocumentUri.t ->
  (Lsp_text_document.t -> Lsp_text_document.t) ->
  unit

val close_text_document : context -> DocumentUri.t -> unit


================================================
FILE: teikalsp/lsp_error.ml
================================================
open Lsp.Types

type error =
  (* channel *)
  | Error_request_unsupported
  | Error_response_unsupported
  | Error_invalid_request of { error : string }
  | Error_invalid_notification of { error : string }
  (* server *)
  | Error_unsupported_request
  | Error_unsupported_notification
  (* context *)
  | Error_notification_before_initialize
  | Error_invalid_status_during_initialize
  | Error_text_document_already_in_context
  | Error_text_document_not_in_context
  (* notification *)
  | Error_multiple_content_changes of {
      content_changes : TextDocumentContentChangeEvent.t list; [@opaque]
    }
  | Error_partial_content_change of {
      content_change : TextDocumentContentChangeEvent.t; [@opaque]
    }
  | Error_invalid_content_change of {
      content_change : TextDocumentContentChangeEvent.t; [@opaque]
    }
  | Error_unknown_language_id of { language_id : string }

and t = error [@@deriving show]

(* TODO: what happens with errors? *)
exception Lsp_error of { error : error }

let fail error = raise (Lsp_error { error })


================================================
FILE: teikalsp/lsp_error.mli
================================================
open Lsp.Types

type error =
  (* channel *)
  | Error_request_unsupported
  | Error_response_unsupported
  | Error_invalid_request of { error : string }
  | Error_invalid_notification of { error : string }
  (* server *)
  | Error_unsupported_request
  | Error_unsupported_notification
  (* context *)
  | Error_notification_before_initialize
  | Error_invalid_status_during_initialize
  | Error_text_document_already_in_context
  | Error_text_document_not_in_context
  (* notification *)
  | Error_multiple_content_changes of {
      content_changes : TextDocumentContentChangeEvent.t list;
    }
  | Error_partial_content_change of {
      content_change : TextDocumentContentChangeEvent.t;
    }
  | Error_invalid_content_change of {
      content_change : TextDocumentContentChangeEvent.t;
    }
  | Error_unknown_language_id of { language_id : string }

type t = error [@@deriving show]

exception Lsp_error of { error : error }

val fail : error -> 'a


================================================
FILE: teikalsp/lsp_notification.ml
================================================
open Lsp.Types
open Lsp_context
open Lsp_error

module Server_life_cycle = struct
  (* TODO: do something here? *)
  let initialized _context = ()
end

module Text_document_sync = struct
  let did_open context ~params =
    let DidOpenTextDocumentParams.{ textDocument = text_document } = params in
    let TextDocumentItem.{ uri; languageId = language_id; version; text } =
      text_document
    in
    let document =
      match language_id with
      | "teika" -> Lsp_text_document.teika ~version ~text
      | language_id -> fail (Error_unknown_language_id { language_id })
    in
    (* TODO: async typing here *)
    open_text_document context uri document

  let did_change context ~params =
    (* TODO: currently only full content changes are supported;
        partial content changes could be supported later *)
    let DidChangeTextDocumentParams.
          { textDocument = { uri; version }; contentChanges = content_changes }
        =
      params
    in
    let content_change =
      match content_changes with
      | [ content_change ] -> content_change
      | content_changes ->
          fail (Error_multiple_content_changes { content_changes })
    in
    let TextDocumentContentChangeEvent.{ range; rangeLength; text } =
      content_change
    in
    (match (range, rangeLength) with
    | None, None -> ()
    | Some _, Some _ -> fail (Error_partial_content_change { content_change })
    | Some _, None | None, Some _ ->
        fail (Error_invalid_content_change { content_change }));
    change_text_document context uri @@ fun document ->
    (* TODO: async typing here *)
    Lsp_text_document.with_change ~version ~text document

  let did_close context ~params =
    let DidCloseTextDocumentParams.{ textDocument = { uri } } = params in
    close_text_document context uri

  (* TODO: save and rename *)
end


================================================
FILE: teikalsp/lsp_notification.mli
================================================
open Lsp.Types

module Server_life_cycle : sig
  val initialized : Lsp_context.t -> unit
end

module Text_document_sync : sig
  val did_open : Lsp_context.t -> params:DidOpenTextDocumentParams.t -> unit
  val did_change : Lsp_context.t -> params:DidChangeTextDocumentParams.t -> unit
  val did_close : Lsp_context.t -> params:DidCloseTextDocumentParams.t -> unit
end


================================================
FILE: teikalsp/lsp_request.ml
================================================
open Lsp.Types
open Lsp_context
open Lsp_error

module Server_life_cycle = struct
  let initialize context ~params =
    let () =
      (* TODO: this is duplicated *)
      match status context with
      | Handshake -> ()
      | Running -> fail Error_invalid_status_during_initialize
    in
    (* TODO: use additional data *)
    let InitializeParams.
          {
            workDoneToken = _;
            processId = _;
            clientInfo = _;
            locale = _;
            rootPath = _;
            rootUri = _;
            initializationOptions = _;
            capabilities = _;
            (* TODO: definitely ignore capabilities *)
            trace = _;
            (* TODO: enable logging using trace *)
            workspaceFolders = _;
          } =
      params
    in
    let () = Lsp_context.initialize context in
    (* TODO: better capabilities *)
    let capabilities =
      ServerCapabilities.create ~textDocumentSync:(`TextDocumentSyncKind Full)
        ~hoverProvider:(`Bool true) ()
    in
    (* TODO: server_info *)
    InitializeResult.create ~capabilities ()
end


================================================
FILE: teikalsp/lsp_request.mli
================================================
open Lsp.Types

module Server_life_cycle : sig
  val initialize :
    Lsp_context.t -> params:InitializeParams.t -> InitializeResult.t
end


================================================
FILE: teikalsp/lsp_text_document.ml
================================================
(* TODO: proper types for text and version *)
type document = Smol of { version : int; text : string }
type t = document

let teika ~version ~text = Smol { version; text }

let with_change ~version ~text document =
  (* TODO: use the version for something? *)
  let (Smol { version = _; text = _ }) = document in
  Smol { version; text }


================================================
FILE: teikalsp/lsp_text_document.mli
================================================
type document
type t = document

val teika : version:int -> text:string -> document
val with_change : version:int -> text:string -> document -> document


================================================
FILE: teikalsp/teikalsp.ml
================================================
open Lsp_error

let on_request (type response) context _channel
    (request : response Lsp.Client_request.t) : response =
  let open Lsp_request in
  let open Lsp.Client_request in
  (* TODO: use channel? *)
  match request with
  | Initialize params -> Server_life_cycle.initialize context ~params
  | _request ->
      (* TODO: print which requests are not supported *)
      fail Error_unsupported_request

let on_request_error _context _channel error =
  let open Jsonrpc.Response.Error in
  (* TODO: maybe error should show to user? *)
  (* TODO: better errors *)
  let message =
    match error with
    | Lsp_error { error } -> Lsp_error.show error
    | error -> Printexc.to_string error
  in
  Jsonrpc.Response.Error.make ~code:Code.InternalError ~message ()

let on_notification context _channel notification =
  let open Lsp_notification in
  let open Lsp.Client_notification in
  (* TODO: use channel? *)
  match notification with
  | Initialized -> Server_life_cycle.initialized context
  | TextDocumentDidOpen params -> Text_document_sync.did_open context ~params
  | TextDocumentDidChange params ->
      Text_document_sync.did_change context ~params
  | TextDocumentDidClose params -> Text_document_sync.did_close context ~params
  | _notification ->
      (* TODO: print which notifications are not supported *)
      fail Error_unsupported_notification

let on_notification context channel notification =
  (* TODO: notification error handling *)
  match Lsp_context.status context with
  | Handshake ->
      (* TODO: log *)
      (* TODO: server can send some notifications during handshake *)
      fail Error_notification_before_initialize
  | Running -> on_notification context channel notification

let on_notification_error _context channel error =
  let open Lsp.Types in
  let open Lsp.Server_notification in
  let message =
    match error with
    | Lsp_error { error } -> Lsp_error.show error
    | error -> Printexc.to_string error
  in
  (* TODO: maybe error should show to user? *)
  let message = LogMessageParams.create ~type_:Error ~message in
  Lsp_channel.notify channel @@ LogMessage message

let main () =
  Eio_main.run @@ fun env ->
  let context = Lsp_context.create () in
  let on_request channel request =
    try Ok (on_request context channel request)
    with error -> Error (on_request_error context channel error)
  in
  let on_notification channel notification =
    try on_notification context channel notification
    with error -> on_notification_error context channel error
  in
  Lsp_channel.listen ~input:env#stdin ~output:env#stdout
    ~on_request:{ f = on_request } ~on_notification

let () = main ()


================================================
FILE: teikalsp/teikalsp.mli
================================================


================================================
FILE: teikavsc/main.ts
================================================
import { workspace, ExtensionContext, commands } from "vscode";

import {
  LanguageClient,
  LanguageClientOptions,
  ServerOptions,
  TransportKind,
} from "vscode-languageclient/node";

let client: LanguageClient;

const restartClient = async () => {
  const workspaceConfiguration = workspace.getConfiguration();
  const teikaServerPath = workspaceConfiguration.get<string>(
    "teika.server.path",
    ""
  );

  // If the extension is launched in debug mode then the debug server options are used
  // Otherwise the run options are used
  const serverOptions: ServerOptions = {
    run: {
      command: teikaServerPath,
      args: [],
      transport: TransportKind.stdio,
    },
    debug: {
      command: teikaServerPath,
      args: [],
      transport: TransportKind.stdio,
    },
  };

  // Options to control the language client
  const clientOptions: LanguageClientOptions = {
    // Register the server for plain text documents
    documentSelector: [{ scheme: "file", language: "teika" }],
    synchronize: {},
  };

  if (!client) {
    // Create the language client and start the client.
    client = new LanguageClient(
      "teika",
      "Teika Server",
      serverOptions,
      clientOptions
    );
  }

  if (client && client.isRunning()) {
    await client.restart();
  } else {
    // Start the client. This will also launch the server
    await client.start();
  }
};

export function activate(context: ExtensionContext) {
  context.subscriptions.push(
    commands.registerCommand("teika.server.restart", () => {
      // TODO: handle promise below
      restartClient();
    })
  );

  restartClient();
}

export function deactivate(): Thenable<void> | undefined {
  if (!client) {
    return undefined;
  }
  return client.stop();
}


================================================
FILE: teikavsc/package.json
================================================
{
  "name": "teikavsc",
  "displayName": "Teika",
  "description": "Teika language extension for VSCode",
  "license": "MIT",
  "version": "0.0.1",
  "publisher": "teikalang",
  "repository": {
    "type": "git",
    "url": "https://github.com/teikalang/teika"
  },
  "bugs": {
    "url": "https://github.com/teikalang/teika/issues"
  },
  "homepage": "https://github.com/teikalang/teika",
  "main": "./dist/main.js",
  "engines": {
    "vscode": "^1.64.0"
  },
  "categories": [
    "Programming Languages"
  ],
  "activationEvents": [
    "workspaceContains:**/*.te",
    "workspaceContains:**/*.tei"
  ],
  "_icon": "assets/logo.png",
  "contributes": {
    "viewsWelcome": [],
    "viewsContainers": {},
    "views": {},
    "commands": [
      {
        "command": "teika.server.restart",
        "category": "Teika",
        "title": "Restart Language Server"
      }
    ],
    "keybindings": [],
    "menus": {
      "editor/context": [],
      "commandPalette": [],
      "editor/title": [],
      "view/title": [],
      "view/item/context": []
    },
    "configuration": {
      "title": "Teika",
      "properties": {
        "teika.server.path": {
          "type": "string",
          "default": "",
          "description": "teikalsp path"
        },
        "teika.trace.server": {
          "scope": "window",
          "type": "string",
          "enum": [
            "off",
            "messages",
            "verbose"
          ],
          "default": "off",
          "description": "Traces the communication between VS Code and the language server."
        }
      }
    },
    "configurationDefaults": {
      "[teika]": {
        "editor.tabSize": 2
      }
    },
    "problemMatchers": [],
    "taskDefinitions": [],
    "languages": [
      {
        "id": "teika",
        "aliases": [
          "Teika",
          "teika"
        ],
        "extensions": [
          ".te",
          ".tei"
        ],
        "configuration": "./teika.language.json"
      }
    ],
    "grammars": [
      {
        "language": "teika",
        "scopeName": "source.teika",
        "path": "./teika.syntax.json"
      },
      {
        "scopeName": "markdown.teika.codeblock",
        "path": "./teika.markdown.codeblock.json",
        "injectTo": [
          "text.html.markdown"
        ],
        "embeddedLanguages": {
          "meta.embedded.block.teika": "teika"
        }
      }
    ],
    "snippets": [
      {
        "language": "teika",
        "path": "./teika.snippets.json"
      }
    ],
    "jsonValidation": [],
    "customEditors": []
  },
  "scripts": {
    "package": "vsce package --out vscode-teika.vsix --yarn",
    "deploy:vsce": "vsce publish --packagePath vscode-teika.vsix --yarn",
    "fmt:check": "prettier . --check",
    "fmt": "prettier . --write"
  },
  "dependencies": {
    "vscode-languageclient": "*"
  },
  "devDependencies": {
    "@types/vscode": "*",
    "@types/node": "*",
    "prettier": "*",
    "vsce": "*",
    "typescript": "*"
  },
  "prettier": {
    "proseWrap": "always",
    "overrides": []
  }
}


================================================
FILE: teikavsc/tsconfig.json
================================================
{
  "compilerOptions": {
    /* Visit https://aka.ms/tsconfig to read more about this file */
    /* Projects */
    // "incremental": true,                              /* Save .tsbuildinfo files to allow for incremental compilation of projects. */
    // "composite": true,                                /* Enable constraints that allow a TypeScript project to be used with project references. */
    // "tsBuildInfoFile": "./.tsbuildinfo",              /* Specify the path to .tsbuildinfo incremental compilation file. */
    // "disableSourceOfProjectReferenceRedirect": true,  /* Disable preferring source files instead of declaration files when referencing composite projects. */
    // "disableSolutionSearching": true,                 /* Opt a project out of multi-project reference checking when editing. */
    // "disableReferencedProjectLoad": true,             /* Reduce the number of projects loaded automatically by TypeScript. */
    /* Language and Environment */
    "target": "es2020", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
    "lib": [
      "es2020"
    ], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
    // "jsx": "preserve",                                /* Specify what JSX code is generated. */
    // "experimentalDecorators": true,                   /* Enable experimental support for TC39 stage 2 draft decorators. */
    // "emitDecoratorMetadata": true,                    /* Emit design-type metadata for decorated declarations in source files. */
    // "jsxFactory": "",                                 /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h'. */
    // "jsxFragmentFactory": "",                         /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */
    // "jsxImportSource": "",                            /* Specify module specifier used to import the JSX factory functions when using 'jsx: react-jsx*'. */
    // "reactNamespace": "",                             /* Specify the object invoked for 'createElement'. This only applies when targeting 'react' JSX emit. */
    // "noLib": true,                                    /* Disable including any library files, including the default lib.d.ts. */
    // "useDefineForClassFields": true,                  /* Emit ECMAScript-standard-compliant class fields. */
    // "moduleDetection": "auto",                        /* Control what method is used to detect module-format JS files. */
    /* Modules */
    "module": "commonjs", /* Specify what module code is generated. */
    // "rootDir": "./",                                  /* Specify the root folder within your source files. */
    "moduleResolution": "node", /* Specify how TypeScript looks up a file from a given module specifier. */
    // "baseUrl": "./",                                  /* Specify the base directory to resolve non-relative module names. */
    // "paths": {},                                      /* Specify a set of entries that re-map imports to additional lookup locations. */
    // "rootDirs": [],                                   /* Allow multiple folders to be treated as one when resolving modules. */
    // "typeRoots": [],                                  /* Specify multiple folders that act like './node_modules/@types'. */
    // "types": [],                                      /* Specify type package names to be included without being referenced in a source file. */
    // "allowUmdGlobalAccess": true,                     /* Allow accessing UMD globals from modules. */
    // "moduleSuffixes": [],                             /* List of file name suffixes to search when resolving a module. */
    // "resolveJsonModule": true,                        /* Enable importing .json files. */
    // "noResolve": true,                                /* Disallow 'import's, 'require's or '<reference>'s from expanding the number of files TypeScript should add to a project. */
    /* JavaScript Support */
    // "allowJs": true,                                  /* Allow JavaScript files to be a part of your program. Use the 'checkJS' option to get errors from these files. */
    // "checkJs": true,                                  /* Enable error reporting in type-checked JavaScript files. */
    // "maxNodeModuleJsDepth": 1,                        /* Specify the maximum folder depth used for checking JavaScript files from 'node_modules'. Only applicable with 'allowJs'. */
    /* Emit */
    // "declaration": true,                              /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
    // "declarationMap": true,                           /* Create sourcemaps for d.ts files. */
    // "emitDeclarationOnly": true,                      /* Only output d.ts files and not JavaScript files. */
    "sourceMap": false, /* Create source map files for emitted JavaScript files. */
    // "outFile": "./",                                  /* Specify a file that bundles all outputs into one JavaScript file. If 'declaration' is true, also designates a file that bundles all .d.ts output. */
    "outDir": "./dist", /* Specify an output folder for all emitted files. */
    // "removeComments": true,                           /* Disable emitting comments. */
    // "noEmit": true,                                   /* Disable emitting files from a compilation. */
    // "importHelpers": true,                            /* Allow importing helper functions from tslib once per project, instead of including them per-file. */
    // "importsNotUsedAsValues": "remove",               /* Specify emit/checking behavior for imports that are only used for types. */
    // "downlevelIteration": true,                       /* Emit more compliant, but verbose and less performant JavaScript for iteration. */
    // "sourceRoot": "",                                 /* Specify the root path for debuggers to find the reference source code. */
    // "mapRoot": "",                                    /* Specify the location where debugger should locate map files instead of generated locations. */
    // "inlineSourceMap": true,                          /* Include sourcemap files inside the emitted JavaScript. */
    // "inlineSources": true,                            /* Include source code in the sourcemaps inside the emitted JavaScript. */
    // "emitBOM": true,                                  /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */
    // "newLine": "crlf",                                /* Set the newline character for emitting files. */
    // "stripInternal": true,                            /* Disable emitting declarations that have '@internal' in their JSDoc comments. */
    // "noEmitHelpers": true,                            /* Disable generating custom helper functions like '__extends' in compiled output. */
    // "noEmitOnError": true,                            /* Disable emitting files if any type checking errors are reported. */
    // "preserveConstEnums": true,                       /* Disable erasing 'const enum' declarations in generated code. */
    // "declarationDir": "./",                           /* Specify the output directory for generated declaration files. */
    // "preserveValueImports": true,                     /* Preserve unused imported values in the JavaScript output that would otherwise be removed. */
    /* Interop Constraints */
    // "isolatedModules": true,                          /* Ensure that each file can be safely transpiled without relying on other imports. */
    // "allowSyntheticDefaultImports": true,             /* Allow 'import x from y' when a module doesn't have a default export. */
    "esModuleInterop": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables 'allowSyntheticDefaultImports' for type compatibility. */
    // "preserveSymlinks": true,                         /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */
    "forceConsistentCasingInFileNames": true, /* Ensure that casing is correct in imports. */
    /* Type Checking */
    "strict": true, /* Enable all strict type-checking options. */
    // "noImplicitAny": true,                            /* Enable error reporting for expressions and declarations with an implied 'any' type. */
    // "strictNullChecks": true,                         /* When type checking, take into account 'null' and 'undefined'. */
    // "strictFunctionTypes": true,                      /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */
    // "strictBindCallApply": true,                      /* Check that the arguments for 'bind', 'call', and 'apply' methods match the original function. */
    // "strictPropertyInitialization": true,             /* Check for class properties that are declared but not set in the constructor. */
    // "noImplicitThis": true,                           /* Enable error reporting when 'this' is given the type 'any'. */
    // "useUnknownInCatchVariables": true,               /* Default catch clause variables as 'unknown' instead of 'any'. */
    // "alwaysStrict": true,                             /* Ensure 'use strict' is always emitted. */
    // "noUnusedLocals": true,                           /* Enable error reporting when local variables aren't read. */
    // "noUnusedParameters": true,                       /* Raise an error when a function parameter isn't read. */
    // "exactOptionalPropertyTypes": true,               /* Interpret optional property types as written, rather than adding 'undefined'. */
    // "noImplicitReturns": true,                        /* Enable error reporting for codepaths that do not explicitly return in a function. */
    // "noFallthroughCasesInSwitch": true,               /* Enable error reporting for fallthrough cases in switch statements. */
    // "noUncheckedIndexedAccess": true,                 /* Add 'undefined' to a type when accessed using an index. */
    // "noImplicitOverride": true,                       /* Ensure overriding members in derived classes are marked with an override modifier. */
    // "noPropertyAccessFromIndexSignature": true,       /* Enforces using indexed accessors for keys declared using an indexed type. */
    // "allowUnusedLabels": true,                        /* Disable error reporting for unused labels. */
    // "allowUnreachableCode": true,                     /* Disable error reporting for unreachable code. */
    /* Completeness */
    // "skipDefaultLibCheck": true,                      /* Skip type checking .d.ts files that are included with TypeScript. */
    "skipLibCheck": true /* Skip type checking all .d.ts files. */
  }
}


================================================
FILE: utils/dune
================================================
(library
 (name utils)
 (preprocess
  (pps ppx_deriving.eq ppx_deriving.ord ppx_deriving.show)))


================================================
FILE: utils/utils.ml
================================================
module Index = struct
  type index = int
  and t = index [@@deriving show, eq]

  let zero = 0

  let next n =
    let n = n + 1 in
    (* the increment must not overflow into a negative value *)
    assert (n >= zero);
    n
end

module Level = struct
  type level = int
  and t = level [@@deriving show, eq]

  let zero = 0

  let next n =
    let n = n + 1 in
    (* the increment must not overflow into a negative value *)
    assert (n >= zero);
    n

  let offset ~from ~to_ =
    match to_ > from with
    | true ->
        (* [to_] is the size of the current context (the next fresh level),
           so the binder at level [from] sits [to_ - from - 1] binders inside
           of the outermost one, which is exactly its de Bruijn index *)
        Some (to_ - from - 1)
    | false -> None
end

module Name = struct
  type name = string
  and t = name [@@deriving show, eq, ord]

  let make t = t
  let repr t = t

  module Map = Map.Make (String)
end


================================================
FILE: utils/utils.mli
================================================
module Index : sig
  type index = private int
  type t = index [@@deriving show, eq]

  val zero : index
  val next : index -> index
end

module Level : sig
  (* TODO: this private int is not ideal *)
  type level = private int
  type t = level [@@deriving show, eq]

  val zero : level
  val next : level -> level
  val offset : from:level -> to_:level -> Index.t option
end

module Name : sig
  type name
  type t = name [@@deriving show, eq, ord]

  val make : string -> name
  val repr : name -> string

  (* TODO: stop exposing this? *)
  module Map : Map.S with type key = name
end
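
The `Level.offset` function above converts an absolute binder level into a de Bruijn index relative to the current context. The following is a self-contained sketch (not part of the repository) that mirrors that arithmetic on plain `int`s to make the conversion concrete; the interpretation of `to_` as the context size is an assumption read off the guard `to_ > from`:

```ocaml
(* Hypothetical standalone mirror of Level.offset: [to_] is taken to be the
   size of the current context (the next fresh level), so the most recent
   binder lives at level [to_ - 1]. *)
let offset ~from ~to_ =
  if to_ > from then Some (to_ - from - 1) else None

let () =
  (* In a context with 3 binders (levels 0, 1, 2), the innermost binder
     (level 2) gets index 0 and the outermost (level 0) gets index 2. *)
  assert (offset ~from:2 ~to_:3 = Some 0);
  assert (offset ~from:0 ~to_:3 = Some 2);
  (* A level at or beyond the context size is unbound: no index. *)
  assert (offset ~from:3 ~to_:3 = None)
```

This is the standard level-to-index conversion `index = context_size - level - 1`: levels count binders from the outside in, indices from the inside out.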
SYMBOL INDEX (2 symbols across 1 files)

FILE: teikavsc/main.ts
  function activate (line 59) | function activate(context: ExtensionContext) {
  function deactivate (line 70) | function deactivate(): Thenable<void> | undefined {

About this extraction

This page contains the full source code of the teikalang/teika GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 80 files (177.1 KB), approximately 52.1k tokens, and a symbol index with 2 extracted functions, classes, methods, constants, and types.

Extracted by GitExtract, built by Nikandr Surkov.
