[
  {
    "path": ".envrc",
    "content": "use_flake\n"
  },
  {
    "path": ".gitignore",
    "content": "_build\nnode_modules\n.direnv\ndist\n"
  },
  {
    "path": ".ocamlformat",
    "content": "version = unknown\n"
  },
  {
    "path": "LICENSE",
    "content": "MIT License\n\nCopyright (c) 2022 Eduardo Rafael\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
  },
  {
    "path": "README.md",
    "content": "# Teika\n\nTeika is a functional programming language. Same pronunciation as in \"take a break\" or if you prefer: \"teika break\".\n\n## WIP\n\nThis is highly in progress, so it's accepted and even expected that at any point the main branch may be broken.\n"
  },
  {
    "path": "design/GOALS.md",
    "content": "# Goals\n\nThis should document the goals of the project, the why and the tradeoff's.\n\n## Assumptions\n\n1. Most code will be read more times than written\n2. Most code will be read by proficient developers\n3. Most code will be written by proficient developers\n4. Most code will be simple even if powerful features exists\n5. Beginners developers are only temporarily beginners\n6. Tooling can be made to help beginners developers\n\n## Direct\n\nIndirections makes code more complex, a language should be direct. Functions should be known statically and data indirections should be optimized away.\n\n## Succinct\n\nOnly local information should be syntactically provided, a language should be succint. Noise should be avoided and contextual information should be provided on demand.\n\n## Powerful\n\nUsers should be able to describe abstract and efficient code, while still being able to reason about it locally, a language should be powerful. Effects should be tracked and mutation controlled.\n\n## Flexible\n\nHacks were needed in the past, are needed today and will be needed in the future, users will need to hack code, a language should be flexible. Abstractions are gonna restrict code, tooling should flexibilize it.\n"
  },
  {
    "path": "design/INFERENCE.md",
    "content": "# Inference\n\nTeika intends to be an ML-like language, which means inference is a must.\n\n## HM inference\n\nThe basic kind of inference present at Teika is Hindley-Milner inference, which essentially assume that all parameters are monomorphic(no quantification) and when a weak variable(aka a variable not constrained) escape it's scope, then an implicit forall is added.\n\nIt is quite limited but relatively simple to understand and good visualizations are possible to be developed.\n\n### Higher Kinded Types\n\nThere is a couple decisions to be made on inference for higher kinded types.\n\n```rust\n// When infering the following two types are possible\nf = T => (x: T Int) => x;\n// the easy one, here we follow the fact that Int is a type\nf = (T: _ -> _) => (x: T Int) => x; // T is an arrow\nf = (T: #(Int) -> _) => (x: T Int) => x; // T param is the type Int\nf = (T: #(Int) -> *) => (x: T Int) => x; // unify with x\n// the logical one, consider it's kind\n f = T => (x: T (Int: *)) => x; // Int has kind *\n f = (T: _ -> _) => (x: T (Int: *)) => x; // T is an arrow\n f = (T: * -> _) => (x: T (Int: *)) => x; // T param is the kind *\n f = (T: * -> *) => (x: T (Int: *)) => x; // unify with x\n```\n\nI decided to go with the second one, because in the same way that when you call a function with a value it infers it's type, when you call a function with a type it infers it's kind. If you do `f => f 1` the type is not `{A} -> (f: 1 -> A) -> f 1`, but `Int -> Int`.\n"
  },
  {
    "path": "design/LANGUAGE.md",
    "content": "# Teika Language\n\nThis document intends to describe the Teika Language. Which refers to all the features supported by Teika but not exactly how it will be presented to an user.\n\n## Goals\n\nThose are the current goals in order of importance and they may change.\n\n1. Soundness, type preservation\n2. Convenience, inference and effects\n3. Uniformity, calculus of construction\n4. Logical consistency, predicativity\n5. Decidability, subtyping\n\n## Warning\n\nThis document was written by me(@EduardoRFS) and I'm not proficient in type theory, so mistakes and changes are very likely to happen here.\n\n## Smol\n\nThis is the canonical language and should contain all the theoritical power on Teika, everything else should be describable in terms of the following features.\n\n```rust\n// Type in Type\n(Type : Type);\n\n// variable\n(A);\n\n// forall\n((A : Type) -> (x : A) -> A);\n// lambda\n((A : Type) => (x : A) => x);\n// apply\n(id Nat 1);\n\n// exists\n(A : Type, A);\n// pair\n(A = Nat, 1 : A);\n// split\n((A, x) = p; x);\n\n// equal\nEqual : (A : Type) -> (x : A) -> (y : B) -> Type;\n// refl\nRefl : (A : Type) -> (x : A) -> Equal A x x;\n// subst\nSubst :\n  (A : Type) ->\n  (x : A) ->\n  (y : A) ->\n  (x_eq_y : Equal A x y) ->\n  (P : (x_or_y : A) -> Type) ->\n  (p_x : P x) ->\n  P y;\n```\n\n### Forall, Lambda and Apply\n\nThis is abstraction and can be seen as universal quantification. Those are functions, they're your work horse, the majority of your code will likely be about declaring and calling those.\n\n#### Forall\n\nThe forall is the description of a function, it describes the type of the paramater and the type of the return, the return may depend on the type such as in polymorphic functions and dependent functions. 
Those are values.\n\n```rust\n/* forall syntax */\n((A : Type) -> A);\n\n/* rule */\nA : Type  B : Type\n------------------\n  (x : A) -> B\n```\n\n#### Lambda\n\nThe lambda is the way to introduce a function; the type of a lambda will always be a forall. The body may depend on the parameter. Those are values.\n\n```rust\n/* lambda syntax */\n((A : Type) => A);\n\n/* rule */\n           b : B\n---------------------------\n(x : A) => b : (x : A) -> B\n```\n\n#### Apply\n\nThis is the source of computation: the left side is anything of an arrow type, and the right side must have the same type as the parameter expected by the lambda.\n\n```rust\n/* apply syntax */\n(lambda argument);\n\n/* rule */\nl : (x : A) -> B  a : A\n-----------------------\n      l a : B\n```\n\n### Exists, Pair and Split\n\nThis allows pairing and can be seen as existential quantification. These are pairs, the fundamental way to pack data together; all of your modules can be represented using them.\n\n#### Exists\n\nThis is the description of pairs: it describes the type of the left value and the type of the right value, and the type of the right value may depend on the left value. Those are values.\n\n```rust\n/* exists syntax */\n(A : Type, A);\n\n/* rule */\nA : Type  B : Type\n------------------\n   (x : A, B)\n```\n\n#### Pair\n\nThis is how you introduce a pair; the type of a pair will always be an exists. The type of the right side may depend on the left side, but the value itself cannot depend on the left side. Those are values.\n\n```rust\n/* pair syntax */\n(A = Nat, 1 : A)\n\n/* rule */\n      l : A   r : B\n---------------------------\n(x = l, r : B) : (x : A, B)\n```\n\n#### Split\n\nThis is how you destruct a pair; it is like a let, but it can extract pairs. The body may depend on the pair values. 
The type of the body may depend on the pair values.\n\n```rust\n/* split syntax */\n((A, one) = p; A);\n\n/* rule */\np : (x : A, B)  b : C\n---------------------\n((x, y) = p; b) : C\n```\n\n### Equal, Refl and Subst\n\nThis is Leibniz equality. It can be used as a way to do some form of dependent elimination; it hopefully only matters as a stepping stone for building libraries.\n\n#### Equal\n\nThis is the type of an equality. It states that the first and second values are literally the same, allowing one to be replaced by the other.\n\n#### Refl\n\nThis is the introduction of an equality. It is the only way to construct an equality, and any equality is equivalent to this.\n\n#### Subst\n\nThis is how you eliminate an equality. It is enough to derive all the other transformations that are desirable for equalities.\n\n```rust\n/* sym */\nSym =\n  (A : Type) =>\n  (B : Type) =>\n  (A_eq_B : Equal A B) =>\n    Subst\n      Type\n      A\n      B\n      A_eq_B\n      ((B : Type) => Equal B A)\n      (Refl A);\n/* trans */\nTrans =\n  (A : Type) =>\n  (B : Type) =>\n  (C : Type) =>\n  (A_eq_B : Equal A B) =>\n  (B_eq_C : Equal B C) =>\n    Subst\n      Type\n      B\n      C\n      B_eq_C\n      ((C : Type) => Equal A C)\n      A_eq_B;\n```\n"
  },
  {
    "path": "design/MODULE.md",
    "content": "# Module\n\nThis intends to document behavior and features of modules.\n\n## Implicit type\n\nAll structures contain an implicit type which by default is abstract both internally and externally, it can be assigned internally.\n\nThis behaves similarly to object-oriented languages like Java and but the goal is to achieve similar interface design to the `Module.t` convention in OCaml.\n\n<!-- TODO: syntax for accessing internal type -->\n\n### Implicit type alias\n\nWhen assigning a structure to a value, the implicit type has an internal alias so that users can have consistent type naming internally and externally.\n\n<!-- TODO: should this be available when doing (F { x }), no this makes changing the ident a breaking change -->\n\nWhile this is not always available it will be used it can be used in most cases.\n\nExample:\n\n```rust\nAmount = {\n  // alias\n  of_nat: Nat -> Amount;\n  add: Amount -> Amount -> Amount;\n};\n```\n"
  },
  {
    "path": "design/SYNTAX.md",
    "content": "# Syntax\n\nThis should document the syntax of the project, they and the tradeoff's.\n\n## Requirements\n\n<!-- TODO: technically modules and types are fused  -->\n\nThe syntax needs to be able to describe four class of terms, modules, expressions, types and patterns.\n\nThe syntax should also be consistent and succint, this requires the syntax meaning to be context dependent.\n\nThe syntax should be simple so that it can easily be user manipulated, allowing tools like macros and ppx to be trivially implemented.\n\n## Unified Representation\n\nTo avoid having too much notation, an unified representation was choosen, this means that all classes of terms have an identical syntax.\n\nThis is achieved by accepting a lot more code during parsing and rejecting this code later.\n\nAnother problem is the need for parens in many situations such as typing a simple binding. This is avoided by allowing a couple places to omit parens in an ad-hoc manner.\n\nPros:\n\n- AST is really small, making macros easier\n- Syntax error messages are much easier\n- Error recovery is way easier\n- Flexible, ppx can use of the invalid syntax\n\nCons:\n\n- invalid code may be parsed\n- not clear where parens are needed\n- no syntatical indication of current context\n\n```rust\nSyntax =\n  | Identifier // variable\n  | Number // number\n  | Syntax -> Syntax // arrow\n  | Syntax => Syntax // lambda\n  | Syntax Syntax // apply\n  | Syntax = Syntax;\n  | Syntax = Syntax; Syntax // binding\n  | Syntax : Syntax;\n  | Syntax : Syntax; Syntax // binding signature\n  | Syntax Syntax[] = Syntax;\n  | Syntax Syntax[] = Syntax; Syntax // binding + lambda\n  | { Syntax } // structure\n  | Syntax : Syntax // constraint\n  | Syntax.Syntax // field\n  | (Syntax) // parens\n```\n\n## Arrow and Lambda\n\nThe entire language can be described using lambda only, in fact arrow can only describe a subset of the language.\n\nBut adding arrow allows easier inference and a simpler syntax for describing 
common types.\n\n## Implicit Argument\n\nInitially `A. Syntax` was the planned way to do implicit parameters, but this led to a couple of weird syntax properties, such as `A.Eq` and `A. Eq` being two different things.\n\nIt also completely prevented the binding + function syntax `id x = x`, which may be desirable in the future.\n\nSo implicit arguments are currently written as `{A: M}`, which was an already supported syntax.\n\nBut this makes it ambiguous with record destructuring; currently this means that when destructuring a record with a single field, the last semicolon cannot be omitted: `{A} -> A` means `forall a. a` but `{A;} -> A` means `{ A: _ } -> A`.\n\n- https://agda.readthedocs.io/en/v2.6.1/language/implicit-arguments.html\n\nAnother option for implicit arguments is using the syntax of optional parameters, `?A -> A`.\n\n## Binding Lambda\n\nA common feature in languages such as OCaml and Haskell is a fused syntax for lambdas and bindings, in the form of `add a b = a + b`.\n\nThe advantage of this is that it is way more succinct for most functions, and it is a common feature in other programming languages.\n\nThe disadvantage is that it is not straightforward to explain to users that `f = x -> x` is the same as `f x = x`; it also doesn't work with binding + constraint, such as `x: Int -> Int = x -> x`.\n\nThis was decided to be a reasonable choice due to type constructors, and because constraint + lambda combinations should not be common.\n\n```rust\nPacket {A} = {\n  id: Nat;\n  data: A;\n};\n```\n"
  },
  {
    "path": "dune-project",
    "content": "(lang dune 2.9)\n\n(using menhir 2.0)\n"
  },
  {
    "path": "flake.nix",
    "content": "{\n  description = \"Nix Flake\";\n\n  inputs = {\n    nixpkgs.url = \"github:anmonteiro/nix-overlays\";\n    nix-filter.url = \"github:numtide/nix-filter\";\n    flake-utils.url = \"github:numtide/flake-utils\";\n  };\n\n  outputs = { self, nixpkgs, nix-filter, flake-utils }:\n    flake-utils.lib.eachDefaultSystem (system:\n      let pkgs = (nixpkgs.makePkgs {\n        inherit system;\n      }).extend (self: super: {\n        ocamlPackages = super.ocaml-ng.ocamlPackages_5_3;\n      }); in\n      let teika = pkgs.callPackage ./nix {\n        inherit nix-filter;\n        doCheck = true;\n      }; in\n      rec {\n        packages = { inherit teika; };\n        devShell = import ./nix/shell.nix { inherit pkgs teika; };\n      });\n}\n"
  },
  {
    "path": "jsend/dune",
    "content": "(library\n (name jsend)\n (libraries teika)\n (modules\n  (:standard \\ Test))\n (preprocess\n  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))\n\n(executable\n (name test)\n (modules Test)\n (libraries alcotest jsend)\n (preprocess\n  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))\n\n(rule\n (alias runtest)\n (deps\n  (:exe ./test.exe))\n (action\n  (run %{exe})))\n"
  },
  {
    "path": "jsend/emit.ml",
    "content": "open Utree\nopen Jtree\n\nlet emit_curry function_ =\n  JE_call { lambda = JE_var { var = Var.curry }; args = [ function_ ] }\n\nlet rec emit_term : Utree.term -> expression =\n fun term ->\n  match term with\n  (* TODO: sourcemap *)\n  | UT_loc { term; loc = _ } -> emit_term term\n  | UT_var { var } -> JE_var { var }\n  | UT_lambda _ ->\n      (* TODO: weird to ignore UT_lambda like this *)\n      emit_curry @@ emit_generator ~params:[] term\n  | UT_apply _ ->\n      (* TODO: weird to ignore UT_apply like this *)\n      let call = emit_call ~args:[] term in\n      (* TODO: test optimization, if instanceof before yield *)\n      JE_yield { expression = call }\n  | UT_let _ ->\n      (* TODO: weird to ignore UT_let like this *)\n      let block = emit_block ~consts:[] term in\n      let wrapper = JE_generator { params = []; block } in\n      let call = JE_call { lambda = wrapper; args = [] } in\n      JE_yield { expression = call }\n  | UT_string { literal } -> JE_string { literal }\n  | UT_external { external_ } -> translate_external external_\n\nand emit_generator ~params return =\n  (* TODO: is this transformation desired?\n      Does it changes performance behaviour *)\n  (* TODO: too many params *)\n  match return with\n  | UT_loc { term = return; loc = _ } -> emit_generator ~params return\n  | UT_lambda { param; return } ->\n      let params = param :: params in\n      emit_generator ~params return\n  | UT_var _ | UT_apply _ | UT_let _ | UT_string _ | UT_external _ ->\n      let params = List.rev params in\n      let block = emit_block ~consts:[] return in\n      JE_generator { params; block }\n\nand emit_call ~args lambda =\n  (* TODO: too many args? 
*)\n  match lambda with\n  | UT_loc { term = lambda; loc = _ } -> emit_call ~args lambda\n  | UT_apply { lambda; arg } ->\n      let arg = emit_term arg in\n      let args = arg :: args in\n      emit_call ~args lambda\n  | UT_var _ | UT_lambda _ | UT_let _ | UT_string _ | UT_external _ ->\n      let lambda = emit_term lambda in\n      JE_call { lambda; args }\n\nand emit_block ~consts return =\n  match return with\n  | UT_loc { term = return; loc = _ } -> emit_block ~consts return\n  | UT_let { var; value; return } ->\n      let value = emit_term value in\n      let consts = (var, value) :: consts in\n      emit_block ~consts return\n  | UT_apply _ ->\n      (* tco *)\n      let return =\n        let return = emit_call ~args:[] return in\n        let constructor =\n          JE_call { lambda = JE_var { var = Var.jmp }; args = [ return ] }\n        in\n        JE_new { constructor }\n      in\n      let consts = List.rev consts in\n      JBlock { consts; return }\n  | UT_var _ | UT_lambda _ | UT_string _ | UT_external _ ->\n      let return = emit_term return in\n      let consts = List.rev consts in\n      JBlock { consts; return }\n\nand translate_external : external_ -> expression =\n fun external_ ->\n  let var =\n    match external_ with\n    | UE_type -> Var.type_\n    | UE_fix -> Var.fix\n    | UE_unit -> Var.unit\n    | UE_debug -> Var.debug\n  in\n  JE_var { var }\n"
  },
  {
    "path": "jsend/emit.mli",
    "content": "val emit_term : Utree.term -> Jtree.expression\n"
  },
  {
    "path": "jsend/jprinter.ml",
    "content": "open Jtree\nopen Format\n\n(* TODO: identation *)\nlet pp_block_syntax ~pp_wrapped_expression fmt block =\n  let (JBlock { consts; return }) = block in\n  List.iter\n    (fun (var, value) ->\n      fprintf fmt \"const %a = %a;\" Var.pp var pp_wrapped_expression value)\n    consts;\n  fprintf fmt \"return %a;\" pp_wrapped_expression return\n\nlet rec pp_expression_syntax ~pp_wrapped ~pp_call ~pp_atom ~pp_block fmt\n    expression =\n  let pp_expression_syntax fmt expression =\n    pp_expression_syntax ~pp_wrapped ~pp_call ~pp_atom ~pp_block fmt expression\n  in\n  match expression with\n  | JE_loc { expression; loc = _ } -> pp_expression_syntax fmt expression\n  | JE_var { var } -> Var.pp fmt var\n  | JE_generator { params; block } ->\n      (* TODO: names on functions? *)\n      let rec pp_params fmt params =\n        match params with\n        | [] -> ()\n        | [ param ] -> fprintf fmt \"%a\" Var.pp param\n        | param :: params -> fprintf fmt \"%a, %a\" Var.pp param pp_params params\n      in\n      fprintf fmt \"function* (%a) { %a }\" pp_params params pp_block block\n  (* TODO: new precedence is the same as call? 
*)\n  | JE_new { constructor } -> fprintf fmt \"new %a\" pp_call constructor\n  | JE_call { lambda; args } ->\n      (* TODO: almost duplicated from params *)\n      let rec pp_args fmt args =\n        match args with\n        | [] -> ()\n        | [ arg ] -> fprintf fmt \"%a\" pp_wrapped arg\n        | arg :: args -> fprintf fmt \"%a, %a\" pp_wrapped arg pp_args args\n      in\n      fprintf fmt \"%a(%a)\" pp_call lambda pp_args args\n  | JE_yield { expression } -> fprintf fmt \"yield %a\" pp_call expression\n  | JE_string { literal } ->\n      (* TODO: proper JS escaping *)\n      fprintf fmt \"%S\" literal\n\ntype prec = Wrapped | Call | Atom\n\nlet rec pp_expression prec fmt expression =\n  let pp_wrapped fmt term = pp_expression Wrapped fmt term in\n  let pp_call fmt term = pp_expression Call fmt term in\n  let pp_atom fmt term = pp_expression Atom fmt term in\n  let pp_block fmt block =\n    pp_block_syntax ~pp_wrapped_expression:pp_wrapped fmt block\n  in\n  match (expression, prec) with\n  | JE_loc { expression; loc = _ }, prec -> pp_expression prec fmt expression\n  | (JE_var _ | JE_string _), (Wrapped | Call | Atom)\n  | (JE_new _ | JE_call _), (Wrapped | Call)\n  | (JE_generator _ | JE_yield _), Wrapped ->\n      pp_expression_syntax ~pp_wrapped ~pp_call ~pp_atom ~pp_block fmt\n        expression\n  | (JE_new _ | JE_call _), Atom | (JE_generator _ | JE_yield _), (Call | Atom)\n    ->\n      fprintf fmt \"(%a)\" pp_wrapped expression\n\nlet pp_expression fmt expression = pp_expression Wrapped fmt expression\n"
  },
  {
    "path": "jsend/jprinter.mli",
    "content": "val pp_expression : Format.formatter -> Jtree.expression -> unit\n\n"
  },
  {
    "path": "jsend/jtree.ml",
    "content": "type expression =\n  | JE_loc of { expression : expression; loc : Location.t }\n  | JE_var of { var : Var.t }\n  | JE_generator of { params : Var.t list; block : block }\n  | JE_new of { constructor : expression }\n  | JE_call of { lambda : expression; args : expression list }\n  | JE_yield of { expression : expression }\n  | JE_string of { literal : string }\n\nand block =\n  | JBlock of { consts : (Var.t * expression) list; return : expression }\n"
  },
  {
    "path": "jsend/jtree.mli",
    "content": "type expression =\n  | JE_loc of { expression : expression; loc : Location.t }\n  | JE_var of { var : Var.t }\n  | JE_generator of { params : Var.t list; block : block }\n  (* TODO: not really a lambda and arg *)\n  | JE_new of { constructor : expression }\n  | JE_call of { lambda : expression; args : expression list }\n  | JE_yield of { expression : expression }\n  | JE_string of { literal : string }\n\nand block =\n  | JBlock of { consts : (Var.t * expression) list; return : expression }\n"
  },
  {
    "path": "jsend/test.ml",
    "content": "open Syntax\nopen Teika\nopen Jsend\n\nlet compile code =\n  let term = Option.get @@ Clexer.from_string Cparser.term_opt code in\n  (* TODO: locations *)\n  let loc = Location.none in\n  let term = Lparser.parse_term ~loc term in\n  let term =\n    match Typer.Infer.infer_term term with\n    | Ok ttree -> ttree\n    | Error error ->\n        Format.eprintf \"%a\\n%!\" Terror.pp error;\n        failwith \"infer\"\n  in\n\n  let term = Untype.untype_term term in\n  let term = Emit.emit_term term in\n  Format.printf \"%a\\n\\n%!\" Jprinter.pp_expression term\n\nlet () = Printexc.record_backtrace true\n\nlet () =\n  compile {|\n      ((A : Type) => (x : A) => x) String \"Hello World\"\n    |}\n\nlet () =\n  compile\n    {|\n      noop = (u : (A : Type) -> (x : A) -> A) => u ((A : Type) -> (x : A) -> A) u;\n      noop\n    |}\n\n(* let () =\n     compile\n       {|\n         Unit = (A : Type) -> (x : A) -> A;\n         (noop : (u : Unit) -> Unit) = u => u Unit u;\n         noop\n       |}\n\n   let () =\n     compile\n       {|\n         Bool = (A : Type) -> (t : A) -> (f : A) -> A;\n         (true : Bool) = A => x => y => x;\n         (false : Bool) = A => x => y => y;\n         f = (bool : Bool) => @native(\"debug\")(bool String \"!!true\" \"!!false\");\n         f false\n       |}\n\n   let () =\n     compile\n       {|\n           Nat = (A : Type) -> (z : A) ->\n             (s : (x : A) -> A) -> (k : (x : A) -> A) -> A;\n           (zero : Nat) = A => z => s => k => k z;\n           (succ : (n : Nat) -> Nat) =\n             n => A => z => s => k => n A z s (x => k (s x));\n           (add : (n : Nat) -> (m : Nat) -> Nat) =\n             n => m => n Nat m succ (x => x);\n           (mul : (n : Nat) -> (m : Nat) -> Nat) =\n             n => m => n Nat zero (add m) (x => x);\n           one = succ zero;\n           two = succ one;\n           four = mul two two;\n           eight = mul two four;\n           sixteen = mul two eight;\n           byte = 
mul sixteen sixteen;\n           short = mul byte byte;\n           short String \"zero\" (_ => @native(\"debug\")(\"hello\")) (x => x)\n         |} *)\n"
  },
  {
    "path": "jsend/untype.ml",
    "content": "open Syntax\nopen Teika\nopen Ttree\nopen Utree\n\nexception Term_subst_found\nexception Term_shift_found\nexception Invalid_variable\n\nlet type_term : term = UT_external { external_ = UE_type }\n(* let fix_term : term = UT_external { external_ = UE_fix }\n   let unit_term : term = UT_external { external_ = UE_unit }\n   let debug_term : term = UT_external { external_ = UE_debug } *)\n\nlet next_level ~vars =\n  let current_level =\n    match Level.Map.max_binding_opt vars with\n    | Some (current_level, _) -> current_level\n    | None ->\n        (* TODO: this is weird *)\n        level_type_string\n  in\n  Level.next current_level\n\nmodule Context : sig\n  type 'a context\n\n  val run : (unit -> 'a context) -> 'a\n  val return : 'a -> 'a context\n  val ( let* ) : 'a context -> ('a -> 'b context) -> 'b context\n  val ( let+ ) : 'a context -> ('a -> 'b) -> 'b context\n  val with_var : Name.t -> (Var.t -> 'k context) -> 'k context\n  val lookup : Level.t -> Var.t context\nend = struct\n  type 'a context = vars:Var.t Level.Map.t -> 'a\n\n  let run context =\n    let vars =\n      let open Level.Map in\n      let vars = empty in\n      (* TODO: string is also a $type *)\n      let vars = add level_type_univ Var.type_ vars in\n      let vars = add level_type_string Var.type_ vars in\n      vars\n    in\n    context () ~vars\n\n  let return x ~vars:_ = x\n  let ( let* ) context k ~vars = k (context ~vars) ~vars\n  let ( let+ ) context k ~vars = k (context ~vars)\n\n  let with_var name k ~vars =\n    let var = Var.create name in\n    let level = next_level ~vars in\n    let vars = Level.Map.add level var vars in\n    k var ~vars\n\n  let lookup level ~vars =\n    match Level.Map.find_opt level vars with\n    | Some var -> var\n    | None -> raise Invalid_variable\nend\n\nopen Context\n\nlet rec untype_term term =\n  match term with\n  | TT_with_type { term; type_ = _ } -> untype_term term\n  | TT_with_sort { term } ->\n      (* TODO: should also not be 
reachable? *)\n      untype_term term\n  | TT_subst _ -> raise Term_subst_found\n  | TT_shift _ -> raise Term_shift_found\n  | TT_var { var } ->\n      let+ var = lookup var in\n      UT_var { var }\n  | TT_forall _ -> return type_term\n  | TT_lambda { param; return } ->\n      let+ param, return =\n        erase_pat param @@ fun var ->\n        let+ return = untype_term return in\n        (var, return)\n      in\n      UT_lambda { param; return }\n  | TT_apply { lambda; arg } ->\n      let* lambda = untype_term lambda in\n      let+ arg = untype_term arg in\n      UT_apply { lambda; arg }\n  | TT_let { bound; value; return } ->\n      (* TODO: param first *)\n      let* value = untype_term value in\n      let+ var, return =\n        erase_pat bound @@ fun var ->\n        let+ return = untype_term return in\n        (var, return)\n      in\n      UT_let { var; value; return }\n  | TT_annot { term; annot = _ } -> untype_term term\n  | TT_string { literal } -> return @@ UT_string { literal }\n\nand erase_pat pat k =\n  match pat with\n  | TP_with_type { pat; type_ = _ } -> erase_pat pat k\n  | TP_annot { pat; annot = _ } -> erase_pat pat k\n  | TP_var { name } -> with_var name k\n\nlet untype_term term = Context.run @@ fun () -> untype_term term\n"
  },
  {
    "path": "jsend/untype.mli",
    "content": "open Teika\n\nval untype_term : Ttree.term -> Utree.term\n"
  },
  {
    "path": "jsend/utree.ml",
    "content": "type term =\n  | UT_loc of { term : term; loc : Location.t }\n  | UT_var of { var : Var.t }\n  | UT_lambda of { param : Var.t; return : term }\n  | UT_apply of { lambda : term; arg : term }\n  | UT_let of { var : Var.t; value : term; return : term }\n  | UT_string of { literal : string }\n  | UT_external of { external_ : external_ }\n\nand external_ = UE_type | UE_fix | UE_unit | UE_debug\n"
  },
  {
    "path": "jsend/utree.mli",
    "content": "type term =\n  (* TODO: why is loc a term? *)\n  | UT_loc of { term : term; loc : Location.t }\n  | UT_var of { var : Var.t }\n  (* TODO: patterns in the Itree? *)\n  | UT_lambda of { param : Var.t; return : term }\n  | UT_apply of { lambda : term; arg : term }\n  | UT_let of { var : Var.t; value : term; return : term }\n  | UT_string of { literal : string }\n  | UT_external of { external_ : external_ }\n\nand external_ = UE_type | UE_fix | UE_unit | UE_debug\n"
  },
  {
    "path": "jsend/var.ml",
    "content": "open Utils\n\nmodule Id : sig\n  type t [@@deriving show]\n\n  val next : unit -> t\n  val equal : t -> t -> bool\n  val compare : t -> t -> int\nend = struct\n  type t = int [@@deriving show]\n\n  let acc = Atomic.make 0\n  let next () = Atomic.fetch_and_add acc 1\n  let equal = Int.equal\n  let compare = Int.compare\nend\n\nlet _ = Id.show\n\ntype var_kind = Global | Scoped\ntype var = { id : Id.t; name : Name.t; kind : var_kind }\ntype t = var\n\nlet pp fmt var =\n  let { id; name; kind } = var in\n  match kind with\n  | Global -> Format.fprintf fmt \"%s\" (Name.repr name)\n  | Scoped -> Format.fprintf fmt \"%s$%a\" (Name.repr name) Id.pp id\n\nlet show var = Format.asprintf \"%a\" pp var\n\nlet create_any kind name =\n  let id = Id.next () in\n  { id; name; kind }\n\nlet create name = create_any Scoped name\n\nlet predef name =\n  let name = Name.make name in\n  create_any Global name\n\nlet equal a b =\n  let { id = a; name = _; kind = _ } = a in\n  let { id = b; name = _; kind = _ } = b in\n  Id.equal a b\n\nlet compare a b =\n  let { id = a; name = _; kind = _ } = a in\n  let { id = b; name = _; kind = _ } = b in\n  Id.compare a b\n\nlet name var =\n  let { id = _; name; kind = _ } = var in\n  name\n\n(* TODO: those should be checked somewhere *)\nlet type_ = predef \"$type\"\nlet fix = predef \"$fix\"\nlet unit = predef \"$unit\"\nlet debug = predef \"$debug\"\nlet curry = predef \"$curry\"\nlet jmp = predef \"$jmp\"\n\nmodule Map = Map.Make (struct\n  type t = var\n\n  let compare = compare\nend)\n"
  },
  {
    "path": "jsend/var.mli",
    "content": "open Utils\n\ntype var\ntype t = var [@@deriving show]\n\nval create : Name.t -> var\nval equal : var -> var -> bool\nval compare : var -> var -> int\nval name : var -> Name.t\n\n(* predefined *)\nval type_ : var\nval fix : var\nval unit : var\nval debug : var\nval curry : var\nval jmp : var\n\nmodule Map : Map.S with type key = t\n"
  },
  {
    "path": "nix/default.nix",
    "content": "{ pkgs, doCheck ? true, nix-filter }:\n\nlet inherit (pkgs) lib stdenv ocamlPackages; in\n\nwith ocamlPackages; buildDunePackage rec {\n  pname = \"teika\";\n  version = \"0.0.0-dev\";\n\n  src = with nix-filter.lib;\n    filter {\n      root = ./..;\n      include = [\n        \"dune-project\"\n        \"smol\"\n        \"teika\"\n      ];\n      exclude = [ ];\n    };\n\n  propagatedBuildInputs = [\n    menhir\n    menhirLib\n    sedlex\n    ppx_deriving\n    eio\n    eio_main\n    ppx_sexp_conv\n    zarith\n    lsp\n  ]\n  # checkInputs are here because when cross compiling dune needs test dependencies\n  # but they are not available for the build phase. The issue can be seen by adding strictDeps = true;.\n  ++ checkInputs;\n\n  checkInputs = [ alcotest ];\n}\n"
  },
  {
    "path": "nix/shell.nix",
    "content": "{ pkgs, teika }:\n\nwith pkgs; with ocamlPackages; mkShell {\n  inputsFrom = [ teika ];\n  packages = [\n    # Make developer life easier\n    # formatters\n    nixfmt\n    # ocamlformat\n    ocamlformat\n    # OCaml developer tooling\n    ocaml\n    dune_3\n    ocaml-lsp\n    utop\n  ];\n}\n"
  },
  {
    "path": "smol/HACKING.md",
    "content": "# Smol Frontend\n\n## Optimizations\n\n### Explicit Substitutions\n\nThis uses explicit substitutions to achieve laziness, similar to λυ.\n\nI think Smol doesn't have metavariables as no unification exists, additionally currently substituions are not used for equality.\n\nAlso the failure mode is that it will reject terms not accept terms, which seems to be okay.\n\n- https://drops.dagstuhl.de/opus/volltexte/2014/4858/pdf/34.pdf\n- https://www.irif.fr/~kesner/papers/springer-csl07.pdf\n- https://hal.inria.fr/inria-00074197/document\n"
  },
  {
    "path": "smol/dune",
    "content": "(library\n (name smol)\n (libraries syntax)\n (modules\n  (:standard \\ Test))\n (preprocess\n  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord sedlex.ppx)))\n\n(executable\n (name test)\n (modules Test)\n (libraries alcotest smol)\n (preprocess\n  (pps ppx_deriving.show)))\n\n(rule\n (alias runtest)\n (deps\n  (:exe ./test.exe))\n (action\n  (run %{exe})))\n"
  },
  {
    "path": "smol/index.ml",
    "content": "type index = int\nand t = index [@@deriving show, eq]\n\nlet zero = 0\nlet one = 1\nlet previous x = match x > 0 with true -> Some (x - 1) | false -> None\n(* TODO: overflow detection *)\n\nlet next x = x + 1\n\nlet of_int x =\n  match x >= 0 with\n  | true -> x\n  | false -> raise (Invalid_argument \"index must be bigger than zero\")\n\nlet repr x = x\nlet ( < ) (a : index) (b : index) = a < b\nlet ( > ) (a : index) (b : index) = a > b\n"
  },
  {
    "path": "smol/index.mli",
    "content": "type index\ntype t = index [@@deriving show, eq]\n\nval zero : index\nval one : index\nval previous : index -> index option\nval next : index -> index\n\n(* repr *)\n(* TODO: this API is non ideal *)\nval of_int : int -> index\nval repr : index -> int\n\n(* operations *)\nval ( < ) : index -> index -> bool\nval ( > ) : index -> index -> bool\n"
  },
  {
    "path": "smol/level.ml",
    "content": "type level = int\nand t = level [@@deriving show, eq]\n\nlet zero = 0\n\n(* TODO: check for overflows *)\nlet next n = n + 1\nlet offset ~from ~to_ = Index.of_int (to_ - from)\nlet ( < ) : level -> level -> bool = ( < )\n\nmodule Map = Map.Make (Int)"
  },
  {
    "path": "smol/level.mli",
    "content": "type level\ntype t = level [@@deriving show, eq]\n\nval zero : level\nval next : level -> level\nval offset : from:level -> to_:level -> Index.t\nval ( < ) : level -> level -> bool\n\nmodule Map : Map.S with type key = level\n"
  },
  {
    "path": "smol/stree.ml",
    "content": "type ty_term = ST_typed of { term : term; type_ : term }\n\nand term =\n  | ST_loc of { term : term; loc : Location.t [@opaque] }\n  | ST_free_var of { level : Level.t }\n  | ST_bound_var of { index : Index.t }\n  | ST_forall of { param : ty_pat; return : term }\n  | ST_lambda of { param : ty_pat; return : term }\n  | ST_apply of { lambda : term; arg : term }\n  (* TODO: self being only pat is weird *)\n  | ST_self of { self : pat; body : term }\n  | ST_fix of { self : ty_pat; body : term }\n  | ST_unroll of { term : term }\n  | ST_let of { bound : ty_pat; value : term; return : term }\n  | ST_annot of { term : term; annot : term }\n\nand ty_pat = SP_typed of { pat : pat; type_ : term }\n\nand pat =\n  | SP_loc of { pat : pat; loc : Location.t [@opaque] }\n  (* TODO: extract Syntax.Name *)\n  | SP_var of { var : Syntax.Name.t }\n  | SP_erasable of { pat : pat }\n  (* TODO: SP_unroll *)\n  | SP_annot of { pat : pat; annot : term }\n[@@deriving show]\n"
  },
  {
    "path": "smol/stree.mli",
    "content": "type ty_term = ST_typed of { term : term; type_ : term }\n\nand term =\n  | ST_loc of { term : term; loc : Location.t }\n  | ST_free_var of { level : Level.t }\n  | ST_bound_var of { index : Index.t }\n  | ST_forall of { param : ty_pat; return : term }\n  | ST_lambda of { param : ty_pat; return : term }\n  | ST_apply of { lambda : term; arg : term }\n  | ST_self of { self : pat; body : term }\n  | ST_fix of { self : ty_pat; body : term }\n  | ST_unroll of { term : term }\n  | ST_let of { bound : ty_pat; value : term; return : term }\n  | ST_annot of { term : term; annot : term }\n\nand ty_pat = SP_typed of { pat : pat; type_ : term }\n\nand pat =\n  | SP_loc of { pat : pat; loc : Location.t }\n  | SP_var of { var : Syntax.Name.t }\n  | SP_erasable of { pat : pat }\n  | SP_annot of { pat : pat; annot : term }\n[@@deriving show]\n"
  },
  {
    "path": "smol/styper.ml",
    "content": "(* TODO: remove all failwith *)\n\nmodule Error = struct\n  open Syntax\n\n  type error =\n    | E_loc of { error : error; loc : Location.t }\n    (* machinery *)\n    | E_free_var_clash\n    | E_bound_var_clash\n    | E_type_clash\n    | E_pattern_clash\n    (* context *)\n    | E_unknown_var of { var : Name.t }\n    | E_variable_used of { var : Name.t }\n    | E_variable_unused of { var : Name.t }\n    | E_grades_invariant_violated\n    | E_types_invariant_violated\n    (* typer *)\n    | E_unsupported_extensions\n    | E_string_not_supported\n    | E_missing_annotations\n    | E_unroll_pattern_not_supported\n    | E_expected_forall\n    | E_expected_self\n\n  exception Error of { error : error }\n\n  let rec pp_error fmt error =\n    let open Format in\n    match error with\n    | E_loc { error; loc = _ } -> pp_error fmt error\n    | E_free_var_clash -> fprintf fmt \"free var clash\"\n    | E_bound_var_clash -> fprintf fmt \"bound var clash\"\n    | E_type_clash -> fprintf fmt \"type clash\"\n    | E_pattern_clash -> fprintf fmt \"pattern clash\"\n    | E_unknown_var { var } -> fprintf fmt \"unknown variable: %a\" Name.pp var\n    (* TODO: show all other occurrences *)\n    | E_variable_used { var } ->\n        fprintf fmt \"duplicated variable: %a\" Name.pp var\n    (* TODO: show error on pattern *)\n    | E_variable_unused { var } ->\n        fprintf fmt \"variable not used: %a\" Name.pp var\n    | E_grades_invariant_violated ->\n        fprintf fmt \"compiler bug, grades invariant\"\n    | E_types_invariant_violated -> fprintf fmt \"compiler bug, types invariant\"\n    | E_unsupported_extensions -> fprintf fmt \"extensions are not supported\"\n    | E_string_not_supported -> fprintf fmt \"strings are not supported\"\n    | E_missing_annotations -> fprintf fmt \"not enough annotations here\"\n    | E_unroll_pattern_not_supported ->\n        fprintf fmt \"unroll patterns are not supported\"\n    | E_expected_forall -> fprintf fmt \"expected a 
function\"\n    | E_expected_self -> fprintf fmt \"expected a fixpoint\"\n\n  let pp_loc fmt loc =\n    let open Format in\n    (* TODO: loc ghost?*)\n    let Location.{ loc_start; loc_end; loc_ghost = _ } = loc in\n    fprintf fmt \"[%d:%d .. %d:%d]\" loc_start.pos_lnum\n      (loc_start.pos_cnum - loc_start.pos_bol)\n      loc_end.pos_lnum\n      (loc_end.pos_cnum - loc_end.pos_bol)\n\n  let rec pp_error_loc ~loc fmt error =\n    let open Format in\n    match error with\n    | E_loc { error; loc = new_loc } -> (\n        match Location.is_none new_loc with\n        | true -> pp_error_loc ~loc:new_loc fmt error\n        | false -> pp_error_loc ~loc:new_loc fmt error)\n    | error -> (\n        match Location.is_none loc with\n        | true -> fprintf fmt \"type error : %a\" pp_error error\n        | false -> fprintf fmt \"type error at %a : %a\" pp_loc loc pp_error error\n        )\n\n  let () =\n    Printexc.register_printer @@ function\n    | Error { error } ->\n        Some (Format.asprintf \"%a\" (pp_error_loc ~loc:Location.none) error)\n    | _ -> None\n\n  let error error = raise (Error { error })\nend\n\nmodule Machinery = struct\n  open Stree\n  open Error\n\n  let rec open_term ~from ~to_ term =\n    let open_term ~from term = open_term ~from ~to_ term in\n    let open_ty_pat ~from pat = open_ty_pat ~from ~to_ pat in\n    let open_pat ~from pat = open_pat ~from ~to_ pat in\n    match term with\n    | ST_loc { term; loc } ->\n        let term = open_term ~from term in\n        ST_loc { term; loc }\n    | ST_free_var { level } -> ST_free_var { level }\n    | ST_bound_var { index } -> (\n        match Index.equal from index with\n        | true -> to_\n        | false -> ST_bound_var { index })\n    | ST_forall { param; return } ->\n        let param = open_ty_pat ~from param in\n        let return =\n          (* TODO: what if pairs in patterns *)\n          let from = Index.next from in\n          open_term ~from return\n        in\n        ST_forall { 
param; return }\n    | ST_lambda { param; return } ->\n        let param = open_ty_pat ~from param in\n        let return =\n          let from = Index.next from in\n          open_term ~from return\n        in\n        ST_lambda { param; return }\n    | ST_apply { lambda; arg } ->\n        let lambda = open_term ~from lambda in\n        let arg = open_term ~from arg in\n        ST_apply { lambda; arg }\n    | ST_self { self; body } ->\n        let self = open_pat ~from self in\n        let body =\n          let from = Index.next from in\n          open_term ~from body\n        in\n        ST_self { self; body }\n    | ST_fix { self; body } ->\n        let self = open_ty_pat ~from self in\n        let body =\n          let from = Index.next from in\n          open_term ~from body\n        in\n        ST_fix { self; body }\n    | ST_unroll { term } ->\n        let term = open_term ~from term in\n        ST_unroll { term }\n    | ST_let { bound; value; return } ->\n        let bound = open_ty_pat ~from bound in\n        let value = open_term ~from value in\n        let return =\n          let from = Index.next from in\n          open_term ~from return\n        in\n        ST_let { bound; value; return }\n    | ST_annot { term; annot } ->\n        let term = open_term ~from term in\n        let annot = open_term ~from annot in\n        ST_annot { term; annot }\n\n  and open_ty_pat ~from ~to_ pat =\n    let (SP_typed { pat; type_ }) = pat in\n    let pat = open_pat ~from ~to_ pat in\n    let type_ = open_term ~from ~to_ type_ in\n    SP_typed { pat; type_ }\n\n  and open_pat ~from ~to_ pat =\n    match pat with\n    | SP_loc { pat; loc } ->\n        let pat = open_pat ~from ~to_ pat in\n        SP_loc { pat; loc }\n    | SP_var { var } -> SP_var { var }\n    | SP_erasable { pat } ->\n        let pat = open_pat ~from ~to_ pat in\n        SP_erasable { pat }\n    | SP_annot { pat; annot } ->\n        let pat = open_pat ~from ~to_ pat in\n        let annot = open_term 
~from ~to_ annot in\n        SP_annot { pat; annot }\n\n  let open_term ~to_ term = open_term ~from:Index.zero ~to_ term\n\n  let rec close_term ~from ~to_ term =\n    let close_term ~to_ term = close_term ~from ~to_ term in\n    let close_ty_pat ~to_ pat = close_ty_pat ~from ~to_ pat in\n    let close_pat ~to_ pat = close_pat ~from ~to_ pat in\n    match term with\n    | ST_loc { term; loc } ->\n        let term = close_term ~to_ term in\n        ST_loc { term; loc }\n    | ST_free_var { level } -> (\n        match Level.equal from level with\n        | true -> ST_bound_var { index = to_ }\n        | false -> ST_free_var { level })\n    | ST_bound_var { index } -> ST_bound_var { index }\n    | ST_forall { param; return } ->\n        let param = close_ty_pat ~to_ param in\n        let return =\n          let to_ = Index.next to_ in\n          close_term ~to_ return\n        in\n        ST_forall { param; return }\n    | ST_lambda { param; return } ->\n        let param = close_ty_pat ~to_ param in\n        let return =\n          let to_ = Index.next to_ in\n          close_term ~to_ return\n        in\n        ST_lambda { param; return }\n    | ST_apply { lambda; arg } ->\n        let lambda = close_term ~to_ lambda in\n        let arg = close_term ~to_ arg in\n        ST_apply { lambda; arg }\n    | ST_self { self; body } ->\n        let self = close_pat ~to_ self in\n        let body =\n          let to_ = Index.next to_ in\n          close_term ~to_ body\n        in\n        ST_self { self; body }\n    | ST_fix { self; body } ->\n        let self = close_ty_pat ~to_ self in\n        let body =\n          let to_ = Index.next to_ in\n          close_term ~to_ body\n        in\n        ST_fix { self; body }\n    | ST_unroll { term } ->\n        let term = close_term ~to_ term in\n        ST_unroll { term }\n    | ST_let { bound; value; return } ->\n        let bound = close_ty_pat ~to_ bound in\n        let value = close_term ~to_ value in\n        let return =\n 
         let to_ = Index.next to_ in\n          close_term ~to_ return\n        in\n        ST_let { bound; value; return }\n    | ST_annot { term; annot } ->\n        let term = close_term ~to_ term in\n        let annot = close_term ~to_ annot in\n        ST_annot { term; annot }\n\n  and close_ty_pat ~from ~to_ pat =\n    let (SP_typed { pat; type_ }) = pat in\n    let pat = close_pat ~from ~to_ pat in\n    let type_ = close_term ~from ~to_ type_ in\n    SP_typed { pat; type_ }\n\n  and close_pat ~from ~to_ pat =\n    match pat with\n    | SP_loc { pat; loc } ->\n        let pat = close_pat ~from ~to_ pat in\n        SP_loc { pat; loc }\n    | SP_var { var } -> SP_var { var }\n    | SP_erasable { pat } ->\n        let pat = close_pat ~from ~to_ pat in\n        SP_erasable { pat }\n    | SP_annot { pat; annot } ->\n        let pat = close_pat ~from ~to_ pat in\n        let annot = close_term ~from ~to_ annot in\n        SP_annot { pat; annot }\n\n  (* TODO: expansion of unroll *)\n  let rec expand_head_term term =\n    match term with\n    (* TODO: use this loc during equality?*)\n    | ST_loc { term; loc = _ } -> expand_head_term term\n    (* TODO: equality expansion *)\n    | ST_free_var _ as term -> term\n    | ST_bound_var _ as term -> term\n    | ST_forall _ as term -> term\n    | ST_lambda _ as term -> term\n    | ST_apply { lambda; arg } -> (\n        match expand_head_term lambda with\n        (* TODO: use pattern when moving to subst *)\n        | ST_lambda { param = _; return } -> open_term ~to_:arg return\n        | lambda -> ST_apply { lambda; arg })\n    | ST_self _ as term -> term\n    | ST_fix _ as term -> term\n    | ST_unroll _ as term -> term\n    (* TODO: use pattern when moving to subst *)\n    | ST_let { bound = _; value; return } ->\n        expand_head_term @@ open_term ~to_:value return\n    | ST_annot { term; annot = _ } -> expand_head_term term\n\n  (* TODO: document multi step equality *)\n  let rec equal_term ~received ~expected =\n    
match received == expected with\n    | true -> ()\n    | false -> equal_term_structural ~received ~expected\n\n  and equal_term_structural ~received ~expected =\n    let received = expand_head_term received in\n    let expected = expand_head_term expected in\n    match (received, expected) with\n    (* TODO: locs? *)\n    | ST_loc { term = received; loc = _ }, expected\n    | received, ST_loc { term = expected; loc = _ } ->\n        equal_term ~received ~expected\n    | ST_free_var { level = received }, ST_free_var { level = expected } -> (\n        match Level.equal received expected with\n        | true -> ()\n        | false -> error E_free_var_clash)\n    | ST_bound_var { index = received }, ST_bound_var { index = expected } -> (\n        match Index.equal received expected with\n        | true -> ()\n        | false -> error E_bound_var_clash)\n    | ( ST_forall { param = received_param; return = received_return },\n        ST_forall { param = expected_param; return = expected_return } ) ->\n        let () =\n          equal_ty_pat ~received:received_param ~expected:expected_param\n        in\n        equal_term ~received:received_return ~expected:expected_return\n    | ( ST_lambda { param = received_param; return = received_return },\n        ST_lambda { param = expected_param; return = expected_return } ) ->\n        let () =\n          equal_ty_pat ~received:received_param ~expected:expected_param\n        in\n        equal_term ~received:received_return ~expected:expected_return\n    | ( ST_apply { lambda = received_lambda; arg = received_arg },\n        ST_apply { lambda = expected_lambda; arg = expected_arg } ) ->\n        let () =\n          equal_term ~received:received_lambda ~expected:expected_lambda\n        in\n        equal_term ~received:received_arg ~expected:expected_arg\n    | ( ST_self { self = received_self; body = received_body },\n        ST_self { self = expected_self; body = expected_body } ) ->\n        let () = equal_pat 
~received:received_self ~expected:expected_self in\n        equal_term ~received:received_body ~expected:expected_body\n    | ( ST_fix { self = received_self; body = received_body },\n        ST_fix { self = expected_self; body = expected_body } ) ->\n        let () = equal_ty_pat ~received:received_self ~expected:expected_self in\n        equal_term ~received:received_body ~expected:expected_body\n    | ST_unroll { term = received }, ST_unroll { term = expected } ->\n        equal_term ~received ~expected\n    (* TODO: document why let here *)\n    | ( ST_let\n          {\n            bound = received_bound;\n            value = received_value;\n            return = received_return;\n          },\n        ST_let\n          {\n            bound = expected_bound;\n            value = expected_value;\n            return = expected_return;\n          } ) ->\n        let () =\n          equal_ty_pat ~received:received_bound ~expected:expected_bound\n        in\n        let () = equal_term ~received:received_value ~expected:expected_value in\n        equal_term ~received:received_return ~expected:expected_return\n    (* TODO: document why annot here *)\n    (* TODO: should check also for annot equality? 
*)\n    | ST_annot { term = received; annot = _ }, expected\n    | received, ST_annot { term = expected; annot = _ } ->\n        equal_term ~received ~expected\n    | ( ( ST_free_var _ | ST_bound_var _ | ST_forall _ | ST_lambda _\n        | ST_apply _ | ST_self _ | ST_fix _ | ST_unroll _ | ST_let _ ),\n        ( ST_free_var _ | ST_bound_var _ | ST_forall _ | ST_lambda _\n        | ST_apply _ | ST_self _ | ST_fix _ | ST_unroll _ | ST_let _ ) ) ->\n        error E_type_clash\n\n  and equal_ty_pat ~received ~expected =\n    let (SP_typed { pat = received_pat; type_ = received_type }) = received in\n    let (SP_typed { pat = expected_pat; type_ = expected_type }) = expected in\n    let () = equal_pat ~received:received_pat ~expected:expected_pat in\n    equal_term ~received:received_type ~expected:expected_type\n\n  and equal_pat ~received ~expected =\n    (* TODO: normalize pattern *)\n    (* TODO: check pat? *)\n    match (received, expected) with\n    (* TODO: locs *)\n    | SP_loc { pat = received; loc = _ }, expected\n    | received, SP_loc { pat = expected; loc = _ } ->\n        equal_pat ~received ~expected\n    | SP_var { var = _ }, SP_var { var = _ } -> ()\n    | SP_erasable { pat = received }, SP_erasable { pat = expected } ->\n        equal_pat ~received ~expected\n    | SP_annot { pat = received; annot = _ }, expected\n    | received, SP_annot { pat = expected; annot = _ } ->\n        equal_pat ~received ~expected\n    | (SP_var _ | SP_erasable _), (SP_var _ | SP_erasable _) ->\n        error E_pattern_clash\n\n  let typeof_pat pat =\n    let (SP_typed { pat; type_ }) = pat in\n    let rec is_erasable pat =\n      match pat with\n      (* TODO: weird *)\n      | SP_loc { pat; loc = _ } -> is_erasable pat\n      | SP_var { var = _ } -> false\n      | SP_erasable { pat = _ } -> true\n      | SP_annot { pat; annot = _ } -> is_erasable pat\n    in\n    (type_, `Erasability (is_erasable pat))\nend\n\nmodule Assume = struct\n  (* TODO: document assumption mode 
*)\n  open Syntax\n  open Ltree\n  open Stree\n\n  (* TODO: linearity on assume? *)\n  module Context : sig\n    type 'a context\n    type 'a t = 'a context\n\n    (* monad *)\n    (* TODO: this should not be exposed *)\n    val run :\n      level:Level.t -> names:Level.t Name.Map.t -> (unit -> 'a context) -> 'a\n\n    val pure : 'a -> 'a context\n    val ( let* ) : 'a context -> ('a -> 'b context) -> 'b context\n\n    (* locs *)\n    val with_loc : Location.t -> (unit -> 'a context) -> 'a context\n\n    (* vars *)\n    val enter : Name.t -> (unit -> 'a context) -> 'a context\n    val lookup : Name.t -> Level.t context\n\n    (* machinery *)\n    val close_term : term -> term context\n  end = struct\n    open Machinery\n    open Error\n\n    (* TODO: names map vs names list / stack *)\n    type 'a context = level:Level.t -> names:Level.t Name.Map.t -> 'a\n    type 'a t = 'a context\n\n    let run ~level ~names f = f () ~level ~names\n    let pure x ~level:_ ~names:_ = x\n    let ( let* ) ctx f ~level ~names = f (ctx ~level ~names) ~level ~names\n\n    let with_loc loc f ~level ~names =\n      try f () ~level ~names\n      with Error { error } ->\n        let error = E_loc { error; loc } in\n        raise (Error { error })\n\n    let enter name f ~level ~names =\n      let level = Level.next level in\n      let names = Name.Map.add name level names in\n      f () ~level ~names\n\n    let lookup name ~level:_ ~names =\n      match Name.Map.find_opt name names with\n      | Some level -> level\n      | None -> error (E_unknown_var { var = name })\n\n    let close_term term ~level ~names:_ =\n      close_term ~from:level ~to_:Index.zero term\n  end\n\n  open Context\n  open Error\n\n  let rec assume_term term =\n    match term with\n    | LT_loc { term; loc } ->\n        let* term = with_loc loc @@ fun () -> assume_term term in\n        pure @@ ST_loc { term; loc }\n    | LT_var { var } ->\n        let* level = lookup var in\n        pure @@ ST_free_var { level }\n    
| LT_extension _ -> error E_unsupported_extensions\n    | LT_forall { param; return } ->\n        let* param, enter = assume_ty_pat param in\n        let* return = enter @@ fun () -> assume_term return in\n        pure @@ ST_forall { param; return }\n    | LT_lambda { param; return } ->\n        let* param, enter = assume_ty_pat param in\n        let* return = enter @@ fun () -> assume_term return in\n        pure @@ ST_lambda { param; return }\n    | LT_apply { lambda; arg } ->\n        let* lambda = assume_term lambda in\n        let* arg = assume_term arg in\n        pure @@ ST_apply { lambda; arg }\n    | LT_self { self; body } -> assume_self ~self ~body\n    | LT_fix { self; body } -> assume_fix ~self ~body\n    | LT_unroll { term } ->\n        let* term = assume_term term in\n        pure @@ ST_unroll { term }\n    | LT_let { bound; return } ->\n        (* TODO: assume bind? *)\n        (* TODO: use this loc *)\n        let (LBind { loc = _; pat = bound; value }) = bound in\n        (* TODO: should let always be typed here *)\n        let* bound, enter = assume_ty_pat bound in\n        let* value = assume_term value in\n        let* return = enter @@ fun () -> assume_term return in\n        pure @@ ST_let { bound; value; return }\n    | LT_annot { term; annot } ->\n        let* annot = assume_term annot in\n        let* term = assume_term term in\n        pure @@ ST_annot { term; annot }\n    | LT_string _ -> error E_string_not_supported\n\n  and assume_self ~self ~body =\n    let* self, enter = assume_pat self in\n    let* body = enter @@ fun () -> assume_term body in\n    pure @@ ST_self { self; body }\n\n  and assume_fix ~self ~body =\n    let* self, enter = assume_ty_pat self in\n    let* body = enter @@ fun () -> assume_term body in\n    pure @@ ST_fix { self; body }\n\n  and assume_ty_pat pat =\n    let wrap ~enter ~type_ pat = pure @@ (SP_typed { pat; type_ }, enter) in\n    match pat with\n    | LP_loc { pat; loc } ->\n        let* SP_typed { pat; 
type_ }, enter =\n          with_loc loc @@ fun () -> assume_ty_pat pat\n        in\n        wrap ~enter ~type_ @@ SP_loc { pat; loc }\n    | LP_var _ -> error E_missing_annotations\n    | LP_unroll _ -> error E_unroll_pattern_not_supported\n    | LP_erasable { pat } ->\n        let* SP_typed { pat; type_ }, enter = assume_ty_pat pat in\n        wrap ~enter ~type_ @@ SP_erasable { pat }\n    | LP_annot { pat; annot } ->\n        let* annot = assume_term annot in\n        let* pat, enter = assume_pat pat in\n        wrap ~enter ~type_:annot @@ SP_annot { pat; annot }\n\n  and assume_pat pat =\n    (* TODO: with should do auto close *)\n    match pat with\n    | LP_loc { pat; loc } ->\n        with_loc loc @@ fun () ->\n        let* pat, enter = assume_pat pat in\n        let pat = SP_loc { pat; loc } in\n        pure @@ (pat, enter)\n    | LP_var { var } ->\n        let enter k =\n          enter var @@ fun () ->\n          (* TODO: better place or name for close term *)\n          let* term = k () in\n          close_term term\n        in\n        pure @@ (SP_var { var }, enter)\n    | LP_erasable { pat } ->\n        let* pat, enter = assume_pat pat in\n        pure @@ (SP_erasable { pat }, enter)\n    | LP_unroll _ -> error E_unroll_pattern_not_supported\n    | LP_annot { pat; annot } ->\n        let* annot = assume_term annot in\n        let* pat, enter = assume_pat pat in\n        let pat = SP_annot { pat; annot } in\n        pure @@ (pat, enter)\nend\n\nopen Syntax\nopen Ltree\nopen Stree\nopen Machinery\n\n(* TODO: this being hard coded is bad *)\nlet st_type = ST_free_var { level = Level.zero }\n\nmodule Context : sig\n  type 'a context\n  type 'a t = 'a context\n\n  (* monad *)\n  val run : (unit -> 'a context) -> 'a\n  val pure : 'a -> 'a context\n  val ( let* ) : 'a context -> ('a -> 'b context) -> 'b context\n\n  (* locs *)\n  val with_loc : Location.t -> (unit -> 'a context) -> 'a context\n\n  (* mode *)\n  val enter_erasable_zone : (unit -> 'a context) -> 'a 
context\n\n  (* vars *)\n  val enter :\n    Name.t -> erasable:bool -> type_:term -> (unit -> 'a context) -> 'a context\n\n  val lookup : Name.t -> ([ `Type of term ] * Level.t) context\n\n  (* machinery *)\n  val assume_self : self:Ltree.pat -> body:Ltree.term -> term context\n  val assume_fix : self:Ltree.pat -> body:Ltree.term -> term context\n  val subst_term : to_:term -> term -> term context\n  val open_term : term -> term context\n  val close_term : term -> term context\nend = struct\n  open Machinery\n  open Error\n\n  type status = Var_pending | Var_used\n\n  (* TODO: names map vs names list / stack *)\n  (* TODO: vars map vs vars list / stack *)\n  type 'a context =\n    level:Level.t ->\n    names:Level.t Name.Map.t ->\n    types:term Level.Map.t ->\n    grades:status Level.Map.t ->\n    status Level.Map.t * 'a\n\n  type 'a t = 'a context\n\n  let run k =\n    let level = Level.(next zero) in\n    (* TODO: move this to Name? *)\n    let names = Name.Map.(add (Name.make \"Type\") Level.zero empty) in\n    let types = Level.Map.(add Level.zero st_type empty) in\n    let grades = Level.Map.(add Level.zero Var_used empty) in\n    let _grades, x = k () ~level ~names ~types ~grades in\n    (* TODO: check grades here *)\n    x\n\n  let pure x ~level:_ ~names:_ ~types:_ ~grades = (grades, x)\n\n  let ( let* ) ctx f ~level ~names ~types ~grades =\n    let grades, x = ctx ~level ~names ~types ~grades in\n    f x ~level ~names ~types ~grades\n\n  let with_loc loc f ~level ~names ~types ~grades =\n    try f () ~level ~names ~types ~grades\n    with Error { error } ->\n      let error = E_loc { error; loc } in\n      raise (Error { error })\n\n  let enter_erasable_zone f ~level ~names ~types ~grades =\n    let _grades, x = f () ~level ~names ~types ~grades:Level.Map.empty in\n    (* TODO: check grades to be empty here *)\n    (grades, x)\n\n  let enter_linear name ~type_ f ~level ~names ~types ~grades =\n    let level = Level.next level in\n    let names = 
Name.Map.add name level names in\n    let types = Level.Map.add level type_ types in\n    let grades = Level.Map.add level Var_pending grades in\n    let grades, x = f () ~level ~names ~types ~grades in\n    match Level.Map.find_opt level grades with\n    | Some Var_pending -> error (E_variable_unused { var = name })\n    | Some Var_used -> (Level.Map.remove level grades, x)\n    | None -> error E_grades_invariant_violated\n\n  let enter_erasable name ~type_ f ~level ~names ~types ~grades =\n    let level = Level.next level in\n    let names = Name.Map.add name level names in\n    let types = Level.Map.add level type_ types in\n    (* TODO: explain why it enters variable as used *)\n    (* TODO: Var_erasable? *)\n    let grades = Level.Map.add level Var_used grades in\n    let grades, x = f () ~level ~names ~types ~grades in\n    (Level.Map.remove level grades, x)\n\n  let enter name ~erasable ~type_ f ~level ~names ~types ~grades =\n    match erasable with\n    | true -> enter_erasable name ~type_ f ~level ~names ~types ~grades\n    | false -> enter_linear name ~type_ f ~level ~names ~types ~grades\n\n  let lookup name ~level:_ ~names ~types ~grades =\n    match Name.Map.find_opt name names with\n    | Some level -> (\n        match Level.Map.find_opt level types with\n        | Some type_ -> (\n            match Level.Map.find_opt level grades with\n            | Some Var_pending ->\n                let grades = Level.Map.add level Var_used grades in\n                (grades, (`Type type_, level))\n            | Some Var_used -> error (E_variable_used { var = name })\n            | None ->\n                (* removed by erasable zone *)\n                (grades, (`Type type_, level)))\n        | None -> error E_types_invariant_violated)\n    | None -> error (E_unknown_var { var = name })\n\n  let assume_self ~self ~body ~level ~names ~types:_ ~grades =\n    let x =\n      let open Assume in\n      Context.run ~level ~names @@ fun () -> assume_self ~self ~body\n   
 in\n    (grades, x)\n\n  let assume_fix ~self ~body ~level ~names ~types:_ ~grades =\n    let x =\n      let open Assume in\n      Context.run ~level ~names @@ fun () -> assume_fix ~self ~body\n    in\n    (grades, x)\n\n  let subst_term ~to_ term ~level:_ ~names:_ ~types:_ ~grades =\n    (grades, open_term ~to_ term)\n\n  let open_term term ~level ~names:_ ~types:_ ~grades =\n    (grades, open_term ~to_:(ST_free_var { level }) term)\n\n  let close_term term ~level ~names:_ ~types:_ ~grades =\n    (grades, close_term ~from:level ~to_:Index.zero term)\nend\n\nopen Context\nopen Error\n\n(* TODO: think better about enter pat *)\nlet rec enter_pat ~erasable pat ~type_ k =\n  match pat with\n  (* TODO: weird *)\n  | SP_loc { pat; loc = _ } -> enter_pat pat ~erasable ~type_ k\n  | SP_var { var } -> enter ~erasable var ~type_ k\n  | SP_erasable { pat } -> enter_pat ~erasable:true pat ~type_ k\n  | SP_annot { pat; annot = _ } -> enter_pat pat ~erasable ~type_ k\n\nlet enter_ty_pat ~erasable pat k =\n  let (SP_typed { pat; type_ }) = pat in\n  enter_pat pat ~erasable ~type_ k\n\n(* TODO: this is clearly bad *)\nlet enter_erasable_zone_conditional ~erasable k =\n  match erasable with true -> enter_erasable_zone k | false -> k ()\n\nlet rec infer_term term =\n  let wrap ~type_ term = pure @@ ST_typed { term; type_ } in\n  match term with\n  | LT_loc { term; loc } ->\n      let* (ST_typed { term; type_ }) =\n        with_loc loc @@ fun () -> infer_term term\n      in\n      wrap ~type_ @@ ST_loc { term; loc }\n  | LT_var { var } ->\n      let* `Type type_, level = lookup var in\n      wrap ~type_ @@ ST_free_var { level }\n  | LT_extension _ -> error E_unsupported_extensions\n  | LT_forall { param; return } ->\n      let* param = infer_ty_pat param in\n      let* return =\n        enter_erasable_zone @@ fun () ->\n        check_term_with_ty_pat ~erasable:true param return ~expected:st_type\n      in\n      wrap ~type_:st_type @@ ST_forall { param; return }\n  | LT_lambda { 
param; return } ->\n      let* param = infer_ty_pat param in\n      let* (ST_typed { term = return; type_ = return_type }) =\n        infer_term_with_ty_pat ~erasable:false param return\n      in\n      let type_ = ST_forall { param; return = return_type } in\n      wrap ~type_ @@ ST_lambda { param; return }\n  | LT_apply { lambda; arg } -> (\n      let* (ST_typed { term = lambda; type_ = forall }) = infer_term lambda in\n      (* TODO: maybe machinery to eliminate forall *)\n      match expand_head_term forall with\n      | ST_forall { param; return } ->\n          let* arg =\n            let expected, `Erasability erasable = typeof_pat param in\n            enter_erasable_zone_conditional ~erasable @@ fun () ->\n            check_term arg ~expected\n          in\n          let* type_ = subst_term ~to_:arg return in\n          wrap ~type_ @@ ST_apply { lambda; arg }\n      (* TODO: expand cases *)\n      | _ -> error E_expected_forall)\n  | LT_self { self; body } ->\n      let* assumed_self = assume_self ~self ~body in\n      let* self = check_pat self ~expected:assumed_self in\n      let* body =\n        enter_erasable_zone @@ fun () ->\n        check_term_with_pat ~erasable:true self ~type_:assumed_self body\n          ~expected:st_type\n      in\n      let self = ST_self { self; body } in\n      (* this equality is about peace of mind *)\n      let () = equal_term ~received:self ~expected:assumed_self in\n      wrap ~type_:st_type @@ self\n  | LT_fix { self; body } ->\n      let* self = infer_ty_pat self in\n      let type_, `Erasability erasable = typeof_pat self in\n      let* body =\n        enter_erasable_zone_conditional ~erasable @@ fun () ->\n        check_term_with_ty_pat ~erasable:false self body ~expected:type_\n      in\n      wrap ~type_ @@ ST_fix { self; body }\n  | LT_unroll { term } -> (\n      (* TODO: rename to fix *)\n      let* (ST_typed { term; type_ = self }) = infer_term term in\n      (* TODO: maybe machinery to eliminate forall *)\n      
match expand_head_term self with\n      | ST_self { self = _; body } ->\n          let* type_ = subst_term ~to_:term body in\n          wrap ~type_ @@ ST_unroll { term }\n      (* TODO: expand cases *)\n      | _ -> error E_expected_self)\n  | LT_let { bound; return } ->\n      (* TODO: check bind? *)\n      (* TODO: use this loc *)\n      let (LBind { loc = _; pat = bound; value }) = bound in\n      (* TODO: remove need for typing of let *)\n      let* bound = infer_ty_pat bound in\n      let* value =\n        let value_type, `Erasability erasable = typeof_pat bound in\n        enter_erasable_zone_conditional ~erasable @@ fun () ->\n        check_term value ~expected:value_type\n      in\n      let* (ST_typed { term = return; type_ = return_type }) =\n        infer_term_with_ty_pat ~erasable:false bound return\n      in\n      (* TODO: could use let at type level *)\n      let* type_ = subst_term ~to_:value return_type in\n      wrap ~type_ @@ ST_let { bound; value; return }\n  | LT_annot { term; annot } ->\n      let* annot =\n        enter_erasable_zone @@ fun () -> check_term annot ~expected:st_type\n      in\n      let* term = check_term term ~expected:annot in\n      wrap ~type_:annot @@ ST_annot { term; annot }\n  | LT_string _ -> error E_string_not_supported\n\nand check_term term ~expected =\n  (* TODO: check term equality for nested annot ((x : A) : B)? *)\n  (* TODO: propagate *)\n  match (term, expand_head_term expected) with\n  | LT_loc { term; loc }, expected ->\n      let* term = with_loc loc @@ fun () -> check_term term ~expected in\n      pure @@ ST_loc { term; loc }\n  | ( LT_lambda { param; return },\n      ST_forall { param = expected_param; return = expected_return } ) ->\n      let* param =\n        (* TODO: use this erasable? 
*)\n        let expected_param_type, `Erasability _erasable =\n          typeof_pat expected_param\n        in\n        check_ty_pat param ~expected:expected_param_type\n      in\n      let () =\n        (* TODO : loc for error message *)\n        equal_ty_pat ~received:param ~expected:expected_param\n      in\n      let* return =\n        check_term_with_ty_pat ~erasable:false param return\n          ~expected:expected_return\n      in\n      pure @@ ST_lambda { param; return }\n  | ( LT_fix { self; body },\n      (ST_self { self = expected_self; body = expected_body } as expected) ) ->\n      let* self = check_ty_pat self ~expected in\n      let () =\n        (* TODO : loc for error message *)\n        let (SP_typed { pat = self; type_ = _ }) = self in\n        equal_pat ~received:self ~expected:expected_self\n      in\n      let* body =\n        check_term_with_ty_pat ~erasable:false self body ~expected:expected_body\n      in\n      pure @@ ST_fix { self; body }\n  | term, expected ->\n      let* (ST_typed { term; type_ = received }) = infer_term term in\n      let () = equal_term ~received ~expected in\n      pure term\n\nand infer_ty_pat pat =\n  let wrap ~type_ pat = pure @@ SP_typed { pat; type_ } in\n  match pat with\n  | LP_loc { pat; loc } ->\n      with_loc loc @@ fun () ->\n      let* (SP_typed { pat; type_ }) = infer_ty_pat pat in\n      wrap ~type_ @@ SP_loc { pat; loc }\n  | LP_var _ -> error E_missing_annotations\n  | LP_erasable { pat } ->\n      let* (SP_typed { pat; type_ }) = infer_ty_pat pat in\n      wrap ~type_ @@ SP_erasable { pat }\n  | LP_unroll _ -> error E_unroll_pattern_not_supported\n  | LP_annot { pat; annot } ->\n      let* annot =\n        enter_erasable_zone @@ fun () -> check_term annot ~expected:st_type\n      in\n      check_ty_pat pat ~expected:annot\n\nand check_ty_pat pat ~expected =\n  let* pat = check_pat pat ~expected in\n  pure @@ SP_typed { pat; type_ = expected }\n\nand check_pat pat ~expected =\n  match pat with\n  | 
LP_loc { pat; loc } ->\n      with_loc loc @@ fun () ->\n      let* pat = check_pat pat ~expected in\n      pure @@ SP_loc { pat; loc }\n  | LP_var { var } -> pure @@ SP_var { var }\n  | LP_erasable { pat } ->\n      let* pat = check_pat pat ~expected in\n      pure @@ SP_erasable { pat }\n  | LP_unroll _ -> error E_unroll_pattern_not_supported\n  | LP_annot { pat; annot } ->\n      let* annot =\n        enter_erasable_zone @@ fun () -> check_term annot ~expected:st_type\n      in\n      let* pat = check_pat pat ~expected:annot in\n      let () = equal_term ~received:annot ~expected in\n      pure @@ SP_annot { pat; annot }\n\nand infer_term_with_ty_pat ~erasable pat term =\n  enter_ty_pat ~erasable pat @@ fun () ->\n  let* (ST_typed { term; type_ }) = infer_term term in\n  let* term = close_term term in\n  let* type_ = close_term type_ in\n  pure @@ ST_typed { term; type_ }\n\nand check_term_with_ty_pat ~erasable pat term ~expected =\n  enter_ty_pat ~erasable pat @@ fun () ->\n  (* TODO: open and close should probably not be here *)\n  let* expected = open_term expected in\n  let* term = check_term term ~expected in\n  close_term term\n\nand check_term_with_pat ~erasable pat ~type_ term ~expected =\n  enter_pat ~erasable pat ~type_ @@ fun () ->\n  let* expected = open_term expected in\n  let* term = check_term term ~expected in\n  close_term term\n"
  },
  {
    "path": "smol/test.ml",
    "content": "open Syntax\nopen Smol\n\ntype test = { name : string; term : string }\n\nlet type_term name term = { name; term }\n\n(* TODO: used variable still usable on type level *)\nlet id =\n  type_term \"id\"\n    {|((A : Type $ 0) => (x : A) => x\n      :(A : Type $ 0) -> (x : A) -> A)|}\n\nlet id_propagate =\n  type_term \"id_propagate\"\n    {|((A $ 0) => x => x : (A : Type $ 0) -> (x : A) -> A)|}\n\nlet apply_erasable =\n  type_term \"apply_erasable\"\n    {|\n      (A : Type $ 0) => ((A : Type $ 0) => (x : A) => x) A\n    |}\n\nlet sequence =\n  type_term \"sequence\"\n    {|((A : Type) => (x : A) => (B : Type) => (y : B) => y\n      :(A : Type) -> (x : A) -> (B : Type) -> (y : B) -> B)|}\n\nlet bool =\n  type_term \"bool\"\n    {|((A : Type) => (x : A) => (y : A) => x\n               :(A : Type) -> (x : A) -> (y : A) -> A)|}\n\nlet sequence_propagate =\n  type_term \"sequence_propagate\"\n    {|(A => x => B => y => y\n       :(A : Type) -> (x : A) -> (B : Type) -> (y : B) -> B)|}\n\nlet true_ =\n  type_term \"true\"\n    {|((A : Type) => (x : A) => (y : A) => x\n      :(A : Type) -> (x : A) -> (y : A) -> A)|}\n\nlet true_propagate =\n  type_term \"true_propagate\"\n    {|(A => x => y => x\n       :(A : Type) -> (x : A) -> (y : A) -> A)|}\n\nlet false_ =\n  type_term \"false\"\n    {|((A : Type) => (x : A) => (y : A) => y\n      :(A : Type) -> (x : A) -> (y : A) -> A)|}\n\nlet ind_False =\n  let b_false = {|f @-> (P : (f : @False I_False) -> Type) -> @I_False P f|} in\n  let i_false_t =\n    Format.sprintf\n      {|I_False @-> (P : (f : @False I_False) -> Type) -> (f : %s) -> Type|}\n      b_false\n  in\n  let code =\n    Format.sprintf\n      {|\n        (FalseT : Type) === False @-> (I_False : %s) -> Type;\n        (False : FalseT) @=> (I_False : %s) => %s\n      |}\n      i_false_t i_false_t b_false\n  in\n  type_term \"ind_False\" code\n\nlet ind_Unit =\n  let b_Unit =\n    {|\n      u @-> (P : (x : @Unit I_Unit unit I_unit) -> Type) ->\n        (x 
: @I_Unit unit I_unit P (@unit I_unit)) -> @I_Unit unit I_unit P u\n    |}\n  in\n  let b_unit =\n    {|\n      (u : u @->\n        (P : (x : @Unit I_Unit unit I_unit) -> Type) ->\n        (a : @I_Unit unit I_unit P (@unit I_unit)) -> @I_Unit unit I_unit P u\n      ) @=> (P : (x : @Unit I_Unit unit I_unit) -> Type) =>\n        (b : @I_Unit unit I_unit P (@unit I_unit)) => @I_unit P b\n    |}\n  in\n  let t_i_unit =\n    Format.sprintf\n      {|\n        I_unit @-> (P : (c : @Unit I_Unit unit I_unit) -> Type) ->\n          (d : @I_Unit unit I_unit P (@unit I_unit)) ->\n          @I_Unit unit I_unit P (%s)\n      |}\n      b_unit\n  in\n  let t_unit =\n    Format.sprintf {|\n      unit @-> (I_unit : %s) -> %s\n    |} t_i_unit b_Unit\n  in\n  let t_i_Unit =\n    Format.sprintf\n      {|\n        I_Unit @-> (unit : %s) -> (I_unit : %s) ->\n          (P : (e : @Unit I_Unit unit I_unit) -> Type) ->\n          (f : %s) -> Type\n      |}\n      t_unit t_i_unit b_Unit\n  in\n  (*\n  {|\n    (UnitT : Type) ===\n      Unit @-> (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;\n    (Unit : UnitT) === (Unit : UnitT) @=> \n      (I_Unit : %s) => (unit : %s) => (I_unit : %s) => %s;\n    (UnitR : Type) === (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;\n    (UnitEq : (P : (x : UnitR) -> Type) -> (x : P @Unit) ->\n      P ((I_Unit : %s) => (unit : %s) => (I_unit : %s) => %s)\n    ) === (P : (x : UnitR) -> Type) => (x : P @Unit) => %%expand x;\n    (I_Unit : %s) === (I_Unit : %s) @=> (unit : %s) => (I_unit : %s) =>\n      (P : (f : @Unit I_Unit unit I_unit) -> Type) =>\n        UnitEq ((Unit : UnitR) => (f : Unit I_Unit unit I_unit) -> Type) P;\n    (unit : %s) === (unit : %s) @=> (I_unit : %s) => %s;\n    (I_unitT : Type) === %s;\n    (unitR : Type) === (I_unit : I_unitT) -> %s;\n    (unitEq : (P : (x : unitR) -> Type) -> (x : P @unit) ->\n      P ((I_unit : I_unitT) => %s)) ===\n      (P : (x : unitR) -> Type) => (x : P @unit) => %%expand P;\n    unitEq\n  |}\n  *)\n  
let _code =\n    Format.sprintf\n      {|\n        (UnitT : Type) ===\n          Unit @-> (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;\n        (I_UnitT : (Unit : UnitT) -> Type) ===\n          (Unit : UnitT) => %s;\n        (unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) -> Type) ===\n          (Unit : UnitT) => (I_Unit : I_UnitT Unit) => %s;\n        (I_unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) ->\n          (unit : unitT Unit I_Unit) -> Type) ===\n          (Unit : UnitT) => (I_Unit : I_UnitT Unit) =>\n          (unit : unitT Unit I_Unit) => %s;\n        (Unit : UnitT) === (Unit : UnitT) @=> \n          (I_Unit : I_UnitT Unit) => (unit : unitT Unit I_Unit) =>\n            (I_unit : I_unitT Unit I_Unit unit) => %s;\n        (I_UnitT : Type) === I_UnitT Unit;\n        (unitT : (I_Unit : I_UnitT) -> Type) === unitT Unit;\n        (I_unitT : (I_Unit : I_UnitT) ->\n          (unit : unitT I_Unit) -> Type) === I_unitT Unit;\n        (UnitR : Type) === (I_Unit : I_UnitT) -> (unit : unitT I_Unit) ->\n            (I_unit : I_unitT I_Unit unit) -> Type;\n        (UnitEq : (P : (x : UnitR) -> Type) -> (x : P @Unit) ->\n          P (\n          (I_Unit : I_UnitT) => (unit : unitT I_Unit) =>\n            (I_unit : I_unitT I_Unit unit) => %s)\n        ) === (P : (x : UnitR) -> Type) => (x : P @Unit) => %%expand x;\n        (I_Unit : I_UnitT) === (I_Unit : I_UnitT) @=>\n          (unit : unitT I_Unit) => (I_unit : I_unitT I_Unit unit) =>\n          (P : (f : @Unit I_Unit unit I_unit) -> Type) =>\n            UnitEq ((Unit : UnitR) => (f : Unit I_Unit unit I_unit) -> Type) P;\n        (unitT : Type) === unitT I_Unit;\n        (I_unitT : (unit : unitT) -> Type) === I_unitT I_Unit;\n        (unit : unitT) === (unit : unitT) @=>\n          (I_unit : I_unitT unit) => %s;\n        (I_unitT : Type) === I_unitT unit;\n        (unitR : Type) === (I_unit : I_unitT) -> %s;\n        (unitEq : (P : (x : unitR) -> Type) -> (x : P @unit) ->\n          P ((I_unit : 
I_unitT) => %s)) ===\n            (P : (x : unitR) -> Type) => (x : P @unit) => %%expand x;\n        (I_unit : I_unitT) === (I_unit : I_unitT) @=>\n          (P : (c : @Unit I_Unit unit I_unit) -> Type) =>\n          (d : @I_Unit unit I_unit P (@unit I_unit)) =>\n          unitEq ((at_unit : unitR) => @I_Unit unit I_unit P (at_unit I_unit))\n            d;\n        (UnitEq : (P : (_ : Type) -> Type) ->\n          (x : P (@Unit I_Unit unit I_unit)) -> P (%s)) ===\n          (P : (_ : Type) -> Type) =>\n          (x : P (@Unit I_Unit unit I_unit)) => %%expand x;\n        (RevUnitEq : (P : (_ : Type) -> Type) ->\n          (x : P (%s)) -> P (@Unit I_Unit unit I_unit)) ===\n          (P : (_ : Type) -> Type) =>\n          UnitEq ((T : Type) => (x : P T) -> P (@Unit I_Unit unit I_unit))\n            ((x : P (@Unit I_Unit unit I_unit)) => x);\n        \n        (unitK : @Unit I_Unit unit I_unit) === RevUnitEq ((X : Type) => X) (@unit I_unit);\n        @(%%expand unitK)\n      |}\n      t_i_Unit t_unit t_i_unit t_i_Unit t_unit t_i_unit b_Unit b_Unit b_unit\n      b_Unit b_unit b_Unit b_Unit\n  in\n  let code =\n    Format.sprintf\n      {|\n        (UnitT : Type) ===\n          Unit @-> (I_Unit : %s) -> (unit : %s) -> (I_unit : %s) -> Type;\n        (I_UnitT : (Unit : UnitT) -> Type) ===\n          (Unit : UnitT) => %s;\n        (unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) -> Type) ===\n          (Unit : UnitT) => (I_Unit : I_UnitT Unit) => %s;\n        (I_unitT : (Unit : UnitT) -> (I_Unit : I_UnitT Unit) ->\n          (unit : unitT Unit I_Unit) -> Type) ===\n          (Unit : UnitT) => (I_Unit : I_UnitT Unit) =>\n          (unit : unitT Unit I_Unit) => %s;\n        (Unit : UnitT) === (Unit : UnitT) @=> \n          (I_Unit : I_UnitT Unit) => (unit : unitT Unit I_Unit) =>\n            (I_unit : I_unitT Unit I_Unit unit) => %s;\n        (I_UnitT : Type) === I_UnitT Unit;\n        (unitT : (I_Unit : I_UnitT) -> Type) === unitT Unit;\n        (I_unitT : (I_Unit : 
I_UnitT) ->\n          (unit : unitT I_Unit) -> Type) === I_unitT Unit;\n        (UnitR : Type) === (I_Unit : I_UnitT) -> (unit : unitT I_Unit) ->\n            (I_unit : I_unitT I_Unit unit) -> Type;\n        (UnitEq : (P : (x : UnitR) -> Type) -> (x : P @Unit) ->\n          P (\n          (I_Unit : I_UnitT) => (unit : unitT I_Unit) =>\n            (I_unit : I_unitT I_Unit unit) => %s)\n        ) === (P : (x : UnitR) -> Type) => (x : P @Unit) => %%expand x;\n        (I_Unit : I_UnitT) === (I_Unit : I_UnitT) @=>\n          (unit : unitT I_Unit) => (I_unit : I_unitT I_Unit unit) =>\n          (P : (f : @Unit I_Unit unit I_unit) -> Type) =>\n            UnitEq ((Unit : UnitR) => (f : Unit I_Unit unit I_unit) -> Type) P;\n        I_Unit\n      |}\n      t_i_Unit t_unit t_i_unit t_i_Unit t_unit t_i_unit b_Unit b_Unit\n  in\n  type_term \"ind_Unit\" code\n\nlet _tests =\n  [\n    id;\n    id_propagate;\n    sequence;\n    sequence_propagate;\n    bool;\n    true_;\n    true_propagate;\n    false_;\n    ind_False;\n    ind_Unit;\n  ]\n\nlet tests = [ id; id_propagate; apply_erasable ]\n\nlet type_term term =\n  (* Clexer exposes term_opt_from_string in clexer.mli *)\n  let term = Clexer.term_opt_from_string term in\n  let term = Option.get term in\n  let term =\n    let loc = Location.none in\n    Lparser.parse_term ~loc term\n  in\n  Styper.(Context.run @@ fun () -> infer_term term)\n\nlet test { name; term } =\n  let check () =\n    let _type_ = type_term term in\n    (* Format.eprintf \"type_ : %a\\n%!\" Tprinter.pp_term type_; *)\n    ()\n  in\n  Alcotest.test_case name `Quick check\n\nlet tests = (\"tests\", List.map test tests)\nlet () = Alcotest.run \"Typer\" [ tests ]\n"
  },
  {
    "path": "smol/test.mli",
    "content": ""
  },
  {
    "path": "syntax/clexer.ml",
    "content": "open Cparser\nopen Sedlexing.Utf8\n\nexception Lexer_error of { loc : Location.t }\nexception Parser_error of { loc : Location.t }\n\nlet () =\n  Printexc.register_printer @@ function\n  | Lexer_error { loc = _ } -> Some \"lexer: syntax error\"\n  | Parser_error { loc = _ } -> Some \"parser: syntax error\"\n  | _ -> None\n\nlet loc buf =\n  let loc_start, loc_end = Sedlexing.lexing_positions buf in\n  Location.{ loc_start; loc_end; loc_ghost = false }\n\nlet whitespace = [%sedlex.regexp? Plus (' ' | '\\t' | '\\n')]\nlet alphabet = [%sedlex.regexp? 'a' .. 'z' | 'A' .. 'Z']\nlet digit = [%sedlex.regexp? '0' .. '9']\nlet variable = [%sedlex.regexp? (alphabet | '_'), Star (alphabet | digit | '_')]\nlet extension = [%sedlex.regexp? '%', variable]\nlet string = [%sedlex.regexp? '\"', Star (Sub (any, '\"')), '\"']\nlet number = [%sedlex.regexp? Plus '0' .. '9']\n\nlet rec tokenizer buf =\n  match%sedlex buf with\n  | whitespace -> tokenizer buf\n  | variable -> VAR (lexeme buf)\n  | extension -> EXTENSION (lexeme buf)\n  | \":\" -> COLON\n  | \"->\" -> ARROW\n  | \"=>\" -> FAT_ARROW\n  | \"=\" -> EQUAL\n  | \",\" -> COMMA\n  | \"&\" -> AMPERSAND\n  | \";\" -> SEMICOLON\n  | string ->\n      (* TODO: this should probably be somewhere else *)\n      let literal = lexeme buf in\n      (* TODO: this is dangerous *)\n      let literal = String.sub literal 1 (String.length literal - 2) in\n      STRING literal\n  | number ->\n      (* TODO: this should probably be somewhere else *)\n      let literal = lexeme buf in\n      (* TODO: this is dangerous *)\n      let literal = Z.of_string literal in\n      NUMBER literal\n  | \"(\" -> LEFT_PARENS\n  | \")\" -> RIGHT_PARENS\n  | \"{\" -> LEFT_BRACE\n  | \"}\" -> RIGHT_BRACE\n  | eof -> EOF\n  | _ ->\n      let loc = loc buf in\n      raise @@ Lexer_error { loc }\n\nlet next buf =\n  let token = tokenizer buf in\n  let start, end_ = Sedlexing.lexing_positions buf in\n  (token, start, end_)\n\nopen 
Cparser.MenhirInterpreter\n\nlet rec loop buf state =\n  match state with\n  | InputNeeded _env ->\n      (* The parser needs a token. Request one from the lexer,\n         and offer it to the parser, which will produce a new\n         checkpoint. Then, repeat. *)\n      let token, start, end_ = next buf in\n      let state = offer state (token, start, end_) in\n      loop buf state\n  | Shifting _ | AboutToReduce _ ->\n      let state = resume state in\n      loop buf state\n  | HandlingError _env ->\n      let loc = loc buf in\n      raise (Parser_error { loc })\n  | Accepted value -> value\n  | Rejected -> failwith \"cdriver.loop: rejected reached\"\n\nlet buf_from_string string =\n  (* TODO: from string seems to not trigger new line, likely a bug in sedlex *)\n  let index = ref 0 in\n  let length = String.length string in\n  from_gen (fun () ->\n      match !index < length with\n      | true ->\n          let char = String.get string !index in\n          incr index;\n          Some char\n      | false -> None)\n\nlet term_opt_from_string string =\n  let buf = buf_from_string string in\n  (* TODO: allow to change this *)\n  let start, _end = Sedlexing.lexing_positions buf in\n  loop buf @@ Cparser.Incremental.term_opt start\n"
  },
  {
    "path": "syntax/clexer.mli",
    "content": "exception Lexer_error of { loc : Location.t }\nexception Parser_error of { loc : Location.t }\n\nval loc : Sedlexing.lexbuf -> Location.t\nval next : Sedlexing.lexbuf -> Cparser.token * Lexing.position * Lexing.position\nval term_opt_from_string : string -> Ctree.term option\n"
  },
  {
    "path": "syntax/cparser.mly",
    "content": "%{\nopen Utils\nopen Ctree\n\nlet mk (loc_start, loc_end) =\n  Location.{ loc_start; loc_end; loc_ghost = false }\n\n%}\n%token <string> VAR (* x *)\n%token COLON (* : *)\n%token ARROW (* -> *)\n%token FAT_ARROW (* => *)\n%token EQUAL (* = *)\n%token COMMA (* , *)\n%token AMPERSAND (* & *)\n%token SEMICOLON (* ; *)\n%token <string> STRING (* \"abc\" *)\n%token <Z.t> NUMBER (* 123 *)\n%token LEFT_PARENS (* ( *)\n%token RIGHT_PARENS (* ) *)\n%token LEFT_BRACE (* { *)\n%token RIGHT_BRACE (* } *)\n%token <string> EXTENSION (* %x *)\n\n%token EOF\n\n%start <Ctree.term option> term_opt\n%%\n\nlet term_opt :=\n  | EOF;\n    { None }\n  | term = term; EOF;\n    { Some term }\n\nlet term := term_rec_pair\n\nlet term_rec_pair :=\n  | term_semi_or_annot\n  | term_pair(term_rec_pair, term_semi_or_annot)\n\nlet term_semi_or_annot :=\n  | term_rec_annot\n  | term_semi(term_rec_semi, term_rec_semi_bind)\n\nlet term_rec_semi :=\n  | term_rec_funct\n  | term_semi(term_rec_semi, term_rec_semi_bind)\n\nlet term_rec_semi_bind :=\n  | term_rec_semi_annot\n  | term_bind(term_rec_semi, term_rec_semi_annot)\n\nlet term_rec_semi_annot :=\n  | term_rec_funct\n  | term_annot(term_rec_semi_annot, term_rec_funct)\n\nlet term_rec_annot :=\n  | term_rec_funct\n  | term_annot(term_rec_annot, term_rec_funct)\n\nlet term_rec_funct :=\n  | term_rec_apply\n  | term_forall(term_rec_funct, term_rec_apply)\n  | term_lambda(term_rec_funct, term_rec_apply)\n  | term_both(term_rec_funct, term_rec_apply)\n\nlet term_rec_apply :=\n  | term_atom\n  | term_apply(term_rec_apply, term_atom)\n\nlet term_atom :=\n  | term_var\n  | term_extension\n  | term_string\n  | term_number\n  | term_parens(term)\n  | term_braces(term)\n\nlet term_forall(self, lower) ==\n  | param = lower; ARROW; body = self;\n    { ct_forall (mk $loc) ~param ~body }\nlet term_lambda(self, lower) ==\n  | param = lower; FAT_ARROW; body = self;\n    { ct_lambda (mk $loc) ~param ~body }\nlet term_apply(self, lower) ==\n  | funct 
= self; arg = lower;\n    { ct_apply (mk $loc) ~funct ~arg }\nlet term_pair(self, lower) ==\n  | left = lower; COMMA; right = self;\n    { ct_pair (mk $loc) ~left ~right }\nlet term_both(self, lower) ==\n  | left = lower; AMPERSAND; right = self;\n    { ct_both (mk $loc) ~left ~right }\nlet term_bind(self, lower) ==\n  | bound = lower; EQUAL; value = lower;\n    { ct_bind (mk $loc) ~bound ~value }\nlet term_semi(self, lower) ==\n  | left = lower; SEMICOLON; right = self;\n    { ct_semi (mk $loc) ~left ~right }\nlet term_annot(self, lower) ==\n  | value = lower; COLON; annot = self;\n    { ct_annot (mk $loc) ~value ~annot }\nlet term_var ==\n  | var = VAR;\n    { ct_var (mk $loc) ~var:(Name.make var) }\nlet term_extension ==\n  | extension = EXTENSION;\n    { ct_extension (mk $loc) ~extension:(Name.make extension) }\nlet term_string ==\n  | literal = STRING;\n    { ct_string (mk $loc) ~literal }\nlet term_number ==\n  | literal = NUMBER;\n    { ct_number (mk $loc) ~literal }\nlet term_parens(content) ==\n  | LEFT_PARENS; content = content; RIGHT_PARENS;\n    { ct_parens (mk $loc) ~content }\nlet term_braces(content) ==\n  | LEFT_BRACE; content = content; RIGHT_BRACE;\n    { ct_braces (mk $loc) ~content }\n"
  },
  {
    "path": "syntax/ctree.ml",
    "content": "open Utils\n\ntype term =\n  (* TODO: printer location *)\n  | CTerm of { term : term_syntax; loc : Location.t [@opaque] }\n\nand term_syntax =\n  | CT_var of { var : Name.t }\n  | CT_extension of { extension : Name.t }\n  | CT_forall of { param : term; body : term }\n  | CT_lambda of { param : term; body : term }\n  | CT_apply of { funct : term; arg : term }\n  | CT_pair of { left : term; right : term }\n  | CT_both of { left : term; right : term }\n  | CT_bind of { bound : term; value : term }\n  | CT_semi of { left : term; right : term }\n  | CT_annot of { value : term; annot : term }\n  | CT_string of { literal : string }\n  | CT_number of { literal : Z.t [@printer Z.pp_print] }\n  | CT_parens of { content : term }\n  | CT_braces of { content : term }\n[@@deriving show { with_path = false }]\n\nlet cterm loc term = CTerm { loc; term }\nlet ct_var loc ~var = cterm loc (CT_var { var })\nlet ct_extension loc ~extension = cterm loc (CT_extension { extension })\nlet ct_forall loc ~param ~body = cterm loc (CT_forall { param; body })\nlet ct_lambda loc ~param ~body = cterm loc (CT_lambda { param; body })\nlet ct_apply loc ~funct ~arg = cterm loc (CT_apply { funct; arg })\nlet ct_pair loc ~left ~right = cterm loc (CT_pair { left; right })\nlet ct_both loc ~left ~right = cterm loc (CT_both { left; right })\nlet ct_bind loc ~bound ~value = cterm loc (CT_bind { bound; value })\nlet ct_semi loc ~left ~right = cterm loc (CT_semi { left; right })\nlet ct_annot loc ~value ~annot = cterm loc (CT_annot { value; annot })\nlet ct_string loc ~literal = cterm loc (CT_string { literal })\nlet ct_number loc ~literal = cterm loc (CT_number { literal })\nlet ct_parens loc ~content = cterm loc (CT_parens { content })\nlet ct_braces loc ~content = cterm loc (CT_braces { content })\n"
  },
  {
    "path": "syntax/ctree.mli",
    "content": "open Utils\n\ntype term = CTerm of { term : term_syntax; loc : Location.t }\n\nand term_syntax =\n  | CT_var of { var : Name.t }\n  | CT_extension of { extension : Name.t }\n  | CT_forall of { param : term; body : term }\n  | CT_lambda of { param : term; body : term }\n  | CT_apply of { funct : term; arg : term }\n  | CT_pair of { left : term; right : term }\n  | CT_both of { left : term; right : term }\n  | CT_bind of { bound : term; value : term }\n  | CT_semi of { left : term; right : term }\n  | CT_annot of { value : term; annot : term }\n  | CT_string of { literal : string }\n  | CT_number of { literal : Z.t }\n  | CT_parens of { content : term }\n  | CT_braces of { content : term }\n[@@deriving show]\n\nval ct_var : Location.t -> var:Name.t -> term\nval ct_extension : Location.t -> extension:Name.t -> term\nval ct_forall : Location.t -> param:term -> body:term -> term\nval ct_lambda : Location.t -> param:term -> body:term -> term\nval ct_apply : Location.t -> funct:term -> arg:term -> term\nval ct_pair : Location.t -> left:term -> right:term -> term\nval ct_both : Location.t -> left:term -> right:term -> term\nval ct_bind : Location.t -> bound:term -> value:term -> term\nval ct_semi : Location.t -> left:term -> right:term -> term\nval ct_annot : Location.t -> value:term -> annot:term -> term\nval ct_string : Location.t -> literal:string -> term\nval ct_number : Location.t -> literal:Z.t -> term\nval ct_parens : Location.t -> content:term -> term\nval ct_braces : Location.t -> content:term -> term\n"
  },
  {
    "path": "syntax/dune",
    "content": "(library\n (name syntax)\n (libraries menhirLib compiler-libs.common utils zarith)\n (modules\n  (:standard \\ Test))\n (preprocess\n  (pps ppx_deriving.eq ppx_deriving.ord ppx_deriving.show sedlex.ppx)))\n\n(menhir\n (modules cparser)\n (flags --dump --explain --table))\n\n(executable\n (name test)\n (modules Test)\n (libraries alcotest syntax)\n (preprocess\n  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))\n\n(rule\n (alias runtest)\n (deps\n  (:exe ./test.exe))\n (action\n  (run %{exe})))\n"
  },
  {
    "path": "syntax/test.ml",
    "content": ""
  },
  {
    "path": "teika/dune",
    "content": "(library\n (name teika)\n (libraries syntax compiler-libs.common)\n (modules\n  (:standard \\ Test))\n (preprocess\n  (pps\n   ppx_deriving.show\n   ppx_sexp_conv\n   ppx_deriving.eq\n   ppx_deriving.ord\n   sedlex.ppx)))\n\n(executable\n (name test)\n (modules Test)\n (libraries alcotest teika)\n (preprocess\n  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord)))\n\n(rule\n (alias runtest)\n (deps\n  (:exe ./test.exe))\n (action\n  (run %{exe})))\n"
  },
  {
    "path": "teika/solve.ml",
    "content": "open Utils\nopen Syntax\nopen Ctree\nopen Ttree\nopen Terror\n\nexception Solve_error of { loc : Location.t; exn : exn }\n\n(* TODO: context vs env *)\ntype context =\n  | Context of { names : (bool * Level.t) Name.Map.t; next : Level.t }\n\nlet with_loc ~loc f =\n  try f () with\n  | Solve_error { loc; exn } ->\n      (* TODO: reraise *)\n      raise @@ Solve_error { loc; exn }\n  | exn -> raise @@ Solve_error { loc; exn }\n\nlet () =\n  Printexc.register_printer @@ function\n  | Solve_error { loc = _; exn } -> Some (Printexc.to_string exn)\n  | _ -> None\n\nlet split_pat_annot pat =\n  let (Pat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | P_annot { pat; annot } -> (pat, annot)\n  (* TODO: support tuple here *)\n  | P_var _ | P_tuple _ -> error_missing_annotation ()\n\nlet rec enter ctx pat =\n  let (Pat { struct_ = pat; loc = _ }) = pat in\n  (* TODO: use this location? *)\n  match pat with\n  | P_annot { pat; annot = _ } -> enter ctx pat\n  | P_var { var = name } ->\n      let (Context { names; next }) = ctx in\n      Format.eprintf \"hi\\n%!\";\n      let names = Name.Map.add name (false, next) names in\n      let next = Level.next next in\n      Context { names; next }\n  | P_tuple { elements } ->\n      List.fold_left (fun ctx el -> enter ctx el) ctx elements\n\nlet rec name_of_var_pat pat =\n  let (VPat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | VP_annot { pat; annot = _ } -> name_of_var_pat pat\n  | VP_var { var } -> var\n\nlet open_hoist ctx pat =\n  (* TODO: ensure that somehow all the hoists are closed *)\n  let name = name_of_var_pat pat in\n  let (Context { names; next }) = ctx in\n  Format.eprintf \"hi\\n%!\";\n  let names = Name.Map.add name (true, next) names in\n  let next = Level.next next in\n  Context { names; next }\n\nlet close_hoist ctx pat =\n  (* TODO: this is a bad API *)\n  let name = name_of_var_pat pat in\n  let (Context { names; next }) = ctx in\n  let names =\n    match Name.Map.find_opt 
name names with\n    | Some (true, from) -> Name.Map.add name (false, from) names\n    | Some (false, _from) -> failwith \"compiler bug invalid name\"\n    | None -> failwith \"close_hoist: compiler bug invalid name\"\n  in\n  Context { names; next }\n\nlet lookup ctx name =\n  let (Context { names; next }) = ctx in\n  match Name.Map.find_opt name names with\n  | Some (is_open_hoist, from) -> (\n      match Level.offset ~from ~to_:next with\n      | Some var -> (`hoist is_open_hoist, var)\n      | None -> failwith \"compiler bug invalid var\")\n  | None -> error_unknown_var ~name\n\nlet is_hoist ctx name =\n  let (Context { names; next }) = ctx in\n  match Name.Map.find_opt name names with\n  | Some (is_open_hoist, from) -> (\n      match Level.offset ~from ~to_:next with\n      | Some var -> (\n          match is_open_hoist with true -> Some var | false -> None)\n      | None -> None)\n  | None -> None\n\ntype meta_pat =\n  | MP_simple of var_pat\n  | MP_fancy of pat\n  | MP_fix of Index.t * var_pat\n\nlet rec pat_of_var_pat var_pat =\n  let (VPat { struct_ = var_pat; loc }) = var_pat in\n  match var_pat with\n  | VP_annot { pat; annot } ->\n      let pat = pat_of_var_pat pat in\n      p_wrap ~loc @@ P_annot { pat; annot }\n  | VP_var { var } -> p_wrap ~loc @@ P_var { var }\n\nlet pat_not_fix meta_pat =\n  match meta_pat with\n  | MP_simple pat -> pat_of_var_pat pat\n  | MP_fancy pat -> pat\n  | MP_fix (_var, _pat) ->\n      (* TODO: proper error here *)\n      failwith \"a variable with the same name is open on a hoist\"\n\nlet self_pat_simple meta_pat =\n  match meta_pat with\n  | MP_simple pat -> pat\n  | MP_fancy _pat -> failwith \"fancy patterns are not supported on self\"\n  | MP_fix (_var, _pat) ->\n      failwith \"a variable with the same name is open on a hoist\"\n\nlet rec solve_term ctx term =\n  let (CTerm { term; loc }) = term in\n  with_loc ~loc @@ fun () ->\n  match term with\n  | CT_parens { content = term } -> solve_term ctx term\n  | CT_annot { 
value = term; annot } ->\n      let annot = solve_term ctx annot in\n      let term = solve_term ctx term in\n      t_wrap ~loc @@ T_annot { term; annot }\n  | CT_var { var = name } ->\n      (* TODO: this could be treated as forward *)\n      let `hoist _, var = lookup ctx name in\n      t_wrap ~loc @@ T_var { var }\n  | CT_semi { left; right } -> solve_semi ctx ~loc ~left ~right\n  | CT_extension _ -> error_extensions_not_implemented ()\n  | CT_apply { funct; arg } ->\n      let funct = solve_term ctx funct in\n      let arg = solve_term ctx arg in\n      t_wrap ~loc @@ T_apply { funct; arg }\n  | CT_lambda { param; body } ->\n      let bound = solve_pat ctx param in\n      let bound = pat_not_fix bound in\n      let body =\n        let ctx = enter ctx bound in\n        solve_term ctx body\n      in\n      t_wrap ~loc @@ T_lambda { bound; body }\n  | CT_forall { param; body } ->\n      let bound = solve_pat ctx param in\n      let bound = pat_not_fix bound in\n      let bound, param = split_pat_annot bound in\n      let body =\n        let ctx = enter ctx bound in\n        solve_term ctx body\n      in\n      t_wrap ~loc @@ T_forall { bound; param; body }\n  | CT_pair { left; right } ->\n      let left = solve_term ctx left in\n      let acc = [ left ] in\n      let elements = solve_term_tuple ctx ~acc ~right in\n      t_wrap ~loc @@ T_tuple { elements }\n  | CT_both { left; right } ->\n      let bound = solve_pat ctx left in\n      let bound = self_pat_simple bound in\n      let body =\n        (* TODO: this is hackish *)\n        let bound = pat_of_var_pat bound in\n        let ctx = enter ctx bound in\n        solve_term ctx right\n      in\n      t_wrap ~loc @@ T_self { bound; body }\n  | CT_bind _ | CT_number _ | CT_braces _ | CT_string _ ->\n      error_invalid_notation ()\n\nand solve_term_tuple ctx ~acc ~right =\n  match\n    let (CTerm { term = right; loc = _ }) = right in\n    right\n  with\n  | CT_pair { left; right } ->\n      let left = solve_term 
ctx left in\n      let acc = left :: acc in\n      solve_term_tuple ctx ~acc ~right\n  | CT_var _ | CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _\n  | CT_both _ | CT_bind _ | CT_semi _ | CT_annot _ | CT_string _ | CT_number _\n  | CT_parens _ | CT_braces _ ->\n      let right = solve_term ctx right in\n      List.rev (right :: acc)\n\nand solve_semi ctx ~loc ~left ~right =\n  let (CTerm { term = left_desc; loc = _ }) = left in\n  match left_desc with\n  | CT_bind { bound; value } -> (\n      let bound = solve_pat ctx bound in\n      (* TODO: just clean this *)\n      match bound with\n      | MP_simple bound ->\n          let bound = pat_of_var_pat bound in\n          let arg = solve_term ctx value in\n          let body =\n            let ctx = enter ctx bound in\n            solve_term ctx right\n          in\n          t_wrap ~loc @@ T_let { bound; arg; body }\n      | MP_fancy bound ->\n          let arg = solve_term ctx value in\n          let body =\n            let ctx = enter ctx bound in\n            solve_term ctx right\n          in\n          t_wrap ~loc @@ T_let { bound; arg; body }\n      | MP_fix (var, bound) ->\n          let arg = solve_term ctx value in\n          let body =\n            let ctx = close_hoist ctx bound in\n            solve_term ctx right\n          in\n          t_wrap ~loc @@ T_fix { bound; var; arg; body })\n  | CT_annot { value = _; annot = _ } ->\n      let bound =\n        match solve_pat ctx left with\n        | MP_simple pat -> pat\n        | MP_fancy _pat -> failwith \"fancy patterns are not supported on hoist\"\n        | MP_fix (_var, _pat) ->\n            failwith \"a variable with the same name is already open\"\n      in\n      let body =\n        let ctx = open_hoist ctx bound in\n        solve_term ctx right\n      in\n      t_wrap ~loc @@ T_hoist { bound; body }\n  | CT_parens _ | CT_var _ | CT_extension _ | CT_forall _ | CT_lambda _\n  | CT_apply _ | CT_pair _ | CT_both _ | CT_semi _ | CT_string _ | 
CT_number _\n  | CT_braces _ ->\n      error_invalid_notation ()\n\n(* TODO: this code is kind of ugly *)\nand solve_pat ctx pat = solve_pat_simple ctx pat\n\nand solve_pat_simple ctx pat =\n  let (CTerm { term = pat_desc; loc }) = pat in\n  (* TODO: a bit duplicated *)\n  match pat_desc with\n  | CT_parens { content = pat } -> solve_pat_simple ctx pat\n  | CT_var { var = name } -> (\n      match is_hoist ctx name with\n      | Some var -> MP_fix (var, vp_wrap ~loc @@ VP_var { var = name })\n      | None -> MP_simple (vp_wrap ~loc @@ VP_var { var = name }))\n  | CT_annot { value = pat; annot } -> (\n      let annot = solve_term ctx annot in\n      match solve_pat ctx pat with\n      | MP_simple pat -> MP_simple (vp_wrap ~loc @@ VP_annot { pat; annot })\n      | MP_fancy pat -> MP_fancy (p_wrap ~loc @@ P_annot { pat; annot })\n      | MP_fix (var, pat) ->\n          MP_fix (var, vp_wrap ~loc @@ VP_annot { pat; annot }))\n  | CT_pair { left = _; right = _ } -> MP_fancy (solve_pat_fancy ctx pat)\n  | CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _ | CT_both _\n  | CT_bind _ | CT_semi _ | CT_string _ | CT_number _ | CT_braces _ ->\n      error_invalid_notation ()\n\nand solve_pat_fancy ctx pat =\n  (* TODO: no duplicated name on pattern *)\n  (* TODO: to_ here *)\n  let (CTerm { term = pat; loc }) = pat in\n  match pat with\n  | CT_parens { content = pat } -> solve_pat_fancy ctx pat\n  | CT_var { var = name } -> (\n      match is_hoist ctx name with\n      | Some _var -> failwith \"hoist is not supported on fancy patterns\"\n      | None -> p_wrap ~loc @@ P_var { var = name })\n  | CT_annot { value = pat; annot } ->\n      let annot = solve_term ctx annot in\n      let pat = solve_pat_fancy ctx pat in\n      p_wrap ~loc @@ P_annot { pat; annot }\n  | CT_pair { left; right } ->\n      let left = solve_pat_fancy ctx left in\n      let acc = [ left ] in\n      let elements = solve_pat_tuple ctx ~acc ~right in\n      p_wrap ~loc @@ P_tuple { elements }\n  | 
CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _ | CT_both _\n  | CT_bind _ | CT_semi _ | CT_string _ | CT_number _ | CT_braces _ ->\n      error_invalid_notation ()\n\nand solve_pat_tuple ctx ~acc ~right =\n  match\n    let (CTerm { term = right; loc = _ }) = right in\n    right\n  with\n  | CT_pair { left; right } ->\n      let left = solve_pat_fancy ctx left in\n      let acc = left :: acc in\n      solve_pat_tuple ctx ~acc ~right\n  | CT_var _ | CT_extension _ | CT_forall _ | CT_lambda _ | CT_apply _\n  | CT_both _ | CT_bind _ | CT_semi _ | CT_annot _ | CT_string _ | CT_number _\n  | CT_parens _ | CT_braces _ ->\n      let right = solve_pat_fancy ctx right in\n      List.rev (right :: acc)\n\n(* external *)\nlet solve_term ctx term = try Ok (solve_term ctx term) with exn -> Error exn\n\nlet initial =\n  (* TODO: duplicated from Typer *)\n  let next = Level.(next zero) in\n  (* TODO: predef somewhere *)\n  (* TODO: rename Type to data *)\n  let type_ = Name.make \"Type\" in\n  let names = Name.Map.(empty |> add type_ (false, Level.zero)) in\n  Context { names; next }\n"
  },
  {
    "path": "teika/solve.mli",
    "content": "open Syntax\n\nexception Solve_error of { loc : Location.t; exn : exn }\n\ntype context\n\n(* TODO: couple all the initial contexts *)\nval initial : context\nval solve_term : context -> Ctree.term -> (Ttree.term, exn) result\n"
  },
  {
    "path": "teika/terror.ml",
    "content": "open Utils\nopen Ttree\n\n(* TODO: too much work to add errors,\n   adding here and context is bad*)\ntype error =\n  (* TODO: why track nested locations?\n         Probably because things like macros exists *)\n  | TError_loc of { error : error; loc : Location.t [@opaque] }\n  (* equal *)\n  | TError_type_clash\n  (* typer *)\n  | TError_unknown_var of { name : Name.t }\n  | TError_not_a_forall of { type_ : term }\n  | TError_hoist_not_implemented\n  | TError_extensions_not_implemented\n  | TError_pairs_not_implemented\n  | TError_unknown_native of { native : string }\n  | TError_missing_annotation\n  (* elaborate *)\n  | TError_invalid_notation\n\nand t = error [@@deriving show { with_path = false }]\n\nexception TError of { error : error }\n\nlet () =\n  Printexc.register_printer @@ function\n  | TError { error } -> Some (show_error error)\n  | _ -> None\n\nlet terror error = raise (TError { error })\nlet error_type_clash () = terror @@ TError_type_clash\nlet error_unknown_var ~name = terror @@ TError_unknown_var { name }\nlet error_not_a_forall ~type_ = terror @@ TError_not_a_forall { type_ }\nlet error_hoist_not_implemented () = terror @@ TError_hoist_not_implemented\n\nlet error_extensions_not_implemented () =\n  terror @@ TError_extensions_not_implemented\n\nlet error_pairs_not_implemented () = terror @@ TError_pairs_not_implemented\nlet error_unknown_native ~native = terror @@ TError_unknown_native { native }\nlet error_missing_annotation () = terror @@ TError_missing_annotation\nlet error_invalid_notation () = terror @@ TError_invalid_notation\n"
  },
  {
    "path": "teika/terror.mli",
    "content": "open Utils\nopen Ttree\n\ntype error =\n  (* metadata *)\n  | TError_loc of { error : error; loc : Location.t [@opaque] }\n  (* equal *)\n  | TError_type_clash\n  (* TODO: infer *)\n  (* typer *)\n  | TError_unknown_var of { name : Name.t }\n  | TError_not_a_forall of { type_ : term }\n  | TError_hoist_not_implemented\n  | TError_extensions_not_implemented\n  | TError_pairs_not_implemented\n  (* TODO: native should not be a string *)\n  | TError_unknown_native of { native : string }\n  | TError_missing_annotation\n  (* elaborate *)\n  | TError_invalid_notation\n\ntype t = error [@@deriving show]\n\nexception TError of { error : error }\n\n(* TODO: error_loc *)\nval error_type_clash : unit -> 'a\nval error_unknown_var : name:Name.t -> 'a\nval error_not_a_forall : type_:term -> 'a\nval error_hoist_not_implemented : unit -> 'a\nval error_extensions_not_implemented : unit -> 'a\nval error_pairs_not_implemented : unit -> 'a\nval error_unknown_native : native:string -> 'a\nval error_missing_annotation : unit -> 'a\nval error_invalid_notation : unit -> 'a\n"
  },
  {
    "path": "teika/test.ml",
    "content": "open Syntax\n\nmodule Typer = struct\n  open Teika\n\n  type test =\n    | Check of { name : string; annotated_term : string }\n    | Fail of { name : string; annotated_term : string }\n\n  let check name annotated_term = Check { name; annotated_term }\n  let fail name annotated_term = Fail { name; annotated_term }\n\n  (* TODO: write tests for locations and names / offset *)\n  (* TODO: write tests for escape check *)\n  let univ_type = check \"Type\" {|(Type : Type)|}\n  let string_type = check \"String\" {|(String : Type)|}\n  let false_type = check \"False\" {|(A : Type) -> A|}\n\n  let id =\n    check \"id\" {|((A : Type) => (x : A) => x : (A : Type) -> (x : A) -> A)|}\n\n  let id_propagate =\n    check \"id_propagate\" {|((A => x => x) : (A : Type) -> (x : A) -> A)|}\n\n  let id_unify =\n    check \"id_unify\" {|((A => (x : A) => x) : (A : Type) -> (x : A) -> A)|}\n\n  let let_id =\n    check \"let_id\"\n      {|((\n        id : (A : Type) -> (x : A) -> A = A => (x : A) => x;\n        id\n      ) : (A : Type) -> (x : A) -> A)|}\n\n  let id_type = check \"id_type\" {|(((A : Type) => (x : A) => x) Type)|}\n\n  let id_type_never =\n    check \"id_type_never\"\n      {|(((A : Type) => (x : A) => x) Type ((A : Type) -> A)\n        : Type)|}\n\n  let return_id_propagate =\n    check \"return_id_propagate\"\n      {|((((id : (A : Type) -> (x : A) -> A) => id) (A => x => x))\n        : (A : Type) -> (x : A) -> A)|}\n\n  let sequence =\n    check \"sequence\"\n      {|((A => (x : A) => B => (y : B) => y)\n        : (A : Type) -> (x : A) -> (B : Type) -> (y : B) -> B)|}\n\n  let bool =\n    check \"bool\" {|(((A : Type) -> (x : A) -> (y : A) -> A)\n        : Type)|}\n\n  let true_ =\n    check \"true\"\n      {|(((A : Type) => (x : A) => (y : A) => x)\n        : (A : Type) -> (x : A) -> (y : A) -> A)|}\n\n  let true_unify =\n    check \"true_unify\"\n      {|(((A : Type) => x => (y : A) => x)\n        : (A : Type) -> (x : A) -> (y : A) -> A)|}\n\n  let 
false_ =\n    check \"false\"\n      {|((A => (x : A) => (y : A) => y)\n        : (A : Type) -> (x : A) -> (y : A) -> A)|}\n\n  let ind_false_T =\n    check \"False_T\"\n      {|\n        (@self(False -> (f : @self(f -> @unroll False f)) -> Type) : Type)\n      |}\n\n  let ind_false =\n    check \"False\"\n      {|\n        (@fix(False => f =>\n          (P : (f : @self(f -> @unroll False f)) -> Type) -> P f\n        ) : @self(False -> (f : @self(f -> @unroll False f)) -> Type))\n      |}\n\n  let let_alias =\n    check \"let_alias\"\n      {|\n        Id : (A : Type) -> Type = (A : Type) => A;\n        ((A : Type) => (x : A) => (x : Id A))\n      |}\n\n  let simple_string = check \"simple_string\" {|(\"simple string\" : String)|}\n\n  let rank_2_propagate =\n    check \"rank_2_propagate\"\n      {|\n        Unit = (A : Type) -> (x : A) -> A;\n        (u => u Unit u : (u : Unit) -> Unit)\n      |}\n\n  let rank_2_propagate_let =\n    check \"rank_2_propagate\"\n      {|\n        Unit = (A : Type) -> (x : A) -> A;\n        noop : (u : Unit) -> Unit = u => u Unit u;\n        noop\n      |}\n\n  let invalid_annotation = fail \"invalid_annotation\" {|(String : \"A\")|}\n  let simplest_escape_check = fail \"simplest_escape_check\" \"x => A => (x : A)\"\n\n  let bound_var_escape_check =\n    fail \"bound_var_escape_check\"\n      {|\n        call = f => v => f v;\n        (never : (A : Type) -> A) => call never\n      |}\n\n  let hole_lowering_check =\n    fail \"hole_lowering_check\"\n      {|\n        x => (A : Type) => y => (id => (_ = (id x); _ = id y; (y : A))) (x => x)\n      |}\n\n  let trivial_equality =\n    check \"trivial_equality\"\n      {|\n        Eq = (A : Type) => (x : A) => (y : A) =>\n          (P : (z : A) -> Type) -> (l : P x) -> P y;\n        refl = (A : Type) => (x : A) =>\n          (P : (z : A) -> Type) => (l : P x) => l;\n        (refl Type Type : Eq Type Type Type)\n      |}\n\n  let split_at_a_distance =\n    check \"split_at_a_distance\"\n    
  {|\n        (l : Type) =>\n          (f : (X = Type; (A : X) => (x : A) -> A) Type) => f l\n      |}\n\n  let nat_256_equality =\n    check \"nat_256_equality\"\n      {|\n        Eq = (A : Type) => (x : A) => (y : A) =>\n          (P : (z : A) -> Type) -> (l : P x) -> P y;\n        refl : (A : Type) -> (x : A) -> Eq A x x\n          = (A : Type) => (x : A) =>\n            (P : (z : A) -> Type) => (l : P x) => l;\n\n        Nat = (A : Type) -> (z : A) -> (s : (acc : A) -> A) -> A;\n        zero : Nat = (A : Type) => (z : A) => (s : (acc : A) -> A) => z;\n        succ : (pred : Nat) -> Nat = (pred : Nat) => \n          (A : Type) => (z : A) => (s : (acc : A) -> A) => s (pred A z s);\n        one = succ zero;\n\n        add = (a : Nat) => (b : Nat) => a Nat b succ;\n        mul = (a : Nat) => (b : Nat) => a Nat zero ((n : Nat) => add n b);\n        \n        two = succ one;\n        three = succ two;\n        four = add two two;\n        eight = add four four;\n        sixteen = add eight eight;\n        n256 = mul sixteen sixteen;\n        sixteen_is_eight_times_two : Eq Nat sixteen (mul eight two)\n          = refl Nat sixteen;\n        (refl Nat n256 : Eq Nat (mul (mul eight eight) four) n256)\n      |}\n\n  let simple_alpha_rename =\n    check \"simple_alpha_rename\"\n      {|( (f : (B : Type) -> B) => (f ((C : Type) -> C) : (D : Type) -> D)\n        : (f : (E : Type) -> E) -> (F : Type) -> F)|}\n\n  let _tests =\n    [\n      id_propagate;\n      id_unify;\n      let_id;\n      return_id_propagate;\n      sequence;\n      true_unify;\n      false_;\n      ind_false_T;\n      ind_false;\n      rank_2_propagate;\n      rank_2_propagate_let;\n      simplest_escape_check;\n    ]\n\n  let _tests =\n    [\n      univ_type;\n      string_type;\n      false_type;\n      id;\n      (*\n      id_propagate;\n      id_unify;\n      let_id;\n      *)\n      id_type;\n      id_type_never;\n      (* return_id_propagate; *)\n      (* sequence; *)\n      bool;\n      true_;\n  
    (* true_unify; *)\n      (* false_; *)\n      let_alias;\n      simple_string;\n      (*\n      ind_false_T;\n      ind_false;\n      rank_2_propagate;\n      rank_2_propagate_let;\n      *)\n      invalid_annotation;\n      (* simplest_escape_check; *)\n      bound_var_escape_check;\n      hole_lowering_check;\n      trivial_equality;\n      split_at_a_distance;\n      nat_256_equality;\n      simple_alpha_rename;\n    ]\n\n  let _tests =\n    [\n      check \"nat_256_equality\"\n        {|\n          Eq : (A : Type) -> (x : A) -> (y : A) -> Type\n            = A => x => y => (P : (z : A) -> Type) -> (l : P x) -> P y;\n          refl : (A : Type) -> (x : A) -> Eq A x x\n            = A => x => P => l => l;\n\n          Nat = (A : Type) -> (z : A) -> (s : (acc : A) -> A) -> A;\n          zero : Nat = A => z => s => z;\n          succ : (pred : Nat) -> Nat = pred => A => z => s => s (pred A z s);\n          one = succ zero;\n\n          add : (a : Nat) -> (b : Nat) -> Nat\n            = a => b => a Nat b succ;\n          mul : (a : Nat) -> (b : Nat) -> Nat\n            = a => b => a Nat zero (n => add n b);\n\n          two = succ one;\n          three = succ two;\n          four = add two two;\n          eight = add four four;\n          sixteen = add eight eight;\n          n256 = mul sixteen sixteen;\n          n512 = mul n256 two;\n          (refl Nat n512 : Eq Nat (mul (mul eight eight) eight) n512)\n      |};\n    ]\n\n  let tests =\n    [\n      check \"fix\"\n        {|\n          Never : Type;\n          Never = (A : Type) -> A;\n\n          Unit : Type;\n          unit : Unit;\n\n          Unit = (u : Unit) & (P : (u : Unit) -> Type) -> (x : P(unit)) -> P(u);\n          unit = P => x => x;\n          ind_unit : (u : Unit) -> (P : (u : Unit) -> Type) ->\n            (x : P(unit)) -> P(u) = u => u;\n\n          Bool : Type;\n          true : Bool;\n          false : Bool;\n\n          Bool = (b : Bool) & (P : (b : Bool) -> Type) ->\n            (x : 
P(true)) -> (y : P(false)) -> P(b);\n          true = P => x => y => x;\n          false = P => x => y => y;\n          ind_bool : (b : Bool) -> (P : (b : Bool) -> Type) ->\n            (x : P(true)) -> (y : P(false)) -> P(b) = b => b;\n\n          Equal : (A : Type) -> (x : A) -> (y : A) -> Type;\n          refl : (A : Type) -> (x : A) -> Equal A x x;\n\n          Equal = A => x => y => (eq : Equal A x y) &\n            (P : (z : A) -> Type) -> (v : P(x)) -> P(y);\n          refl = A => x => P => v => v;\n\n          transport : (A : Type) -> (x : A) -> (y : A) -> \n            (H : Equal A x y) -> (P : (z : A) -> Type) -> (v : P(x)) -> P(y);\n          transport = A => x => y => H => H;\n\n          true_not_false : (H : Equal(Bool)(true)(false)) -> Never;\n          true_not_false = H => (\n            P : (b : Bool) -> Type = b => ind_bool(b)(_ => Type)(Unit)(Never);\n            transport(Bool)(true)(false)(H)(P)(unit)\n          );\n\n          id : (A : Type) -> (x : A) -> A = (\n            (A : Type) => (x : A) => x\n          );\n\n          true\n        |};\n    ]\n\n  (* alcotest *)\n  let test test =\n    let check ~name ~annotated_term =\n      Alcotest.test_case name `Quick @@ fun () ->\n      let ctree =\n        match Clexer.from_string Cparser.term_opt annotated_term with\n        | Some ctree -> ctree\n        | None -> failwith \"failed to parse\"\n      in\n      let ttree =\n        match Solve.(solve_term initial ctree) with\n        | Ok ttree -> ttree\n        | Error exn ->\n            failwith\n            @@ Format.asprintf \"failed to infer types: %s\"\n                 (Printexc.to_string exn)\n      in\n      Format.eprintf \"ttree : %a\\n%!\" Ttree.pp_term ttree;\n      match Typer.infer_term ttree with\n      | Ok _type_ -> Format.eprintf \"typed\\n%!\"\n      | Error exn ->\n          failwith\n          @@ Format.asprintf \"failed to infer types: %s\"\n               (Printexc.to_string exn)\n    in\n    let fail ~name 
~annotated_term =\n      Alcotest.test_case name `Quick @@ fun () ->\n      let ctree =\n        match Clexer.from_string Cparser.term_opt annotated_term with\n        | Some ctree -> ctree\n        | None -> failwith \"failed to parse\"\n      in\n      let ttree =\n        match Solve.(solve_term initial ctree) with\n        | Ok ttree -> ttree\n        | Error exn ->\n            failwith\n            @@ Format.asprintf \"failed to infer types: %s\"\n                 (Printexc.to_string exn)\n      in\n      match Typer.infer_term ttree with\n      | Ok _type_ -> failwith \"worked but should have failed\"\n      | Error _exn -> ()\n    in\n    match test with\n    | Check { name; annotated_term } -> check ~name ~annotated_term\n    | Fail { name; annotated_term } -> fail ~name ~annotated_term\n\n  let tests = (\"typer\", List.map test tests)\nend\n\nlet () = Alcotest.run \"Teika\" [ Typer.tests ]\n\n(* TODO: (n : Nat & n >= 1, x : Nat) should be valid\n   *)\n"
  },
  {
    "path": "teika/test.mli",
    "content": ""
  },
  {
    "path": "teika/tprinter.ml",
    "content": "[@@@ocaml.warning \"-unused-constructor\"]\n\nmodule Ptree = struct\n  open Format\n  open Utils\n\n  type term =\n    (* TODO: use PT_meta for level, subst and shift *)\n    | PT_meta of { term : term }\n    | PT_annot of { term : term; annot : term }\n    | PT_var of { var : Name.t }\n    | PT_free_var of { var : Level.t }\n    | PT_bound_var of { var : Index.t }\n    | PT_hoist of { bound : term; body : term }\n    | PT_let of { bound : term; arg : term; body : term }\n    | PT_apply of { funct : term; arg : term }\n    | PT_lambda of { param : term; body : term }\n    | PT_forall of { param : term; body : term }\n    | PT_inter of { left : term; right : term }\n    | PT_string of { literal : string }\n\n  type term_prec = T_wrapped | T_let | T_funct | T_apply | T_atom\n\n  let pp_term_syntax ~pp_wrapped ~pp_let ~pp_funct ~pp_apply ~pp_atom fmt term =\n    match term with\n    | PT_meta { term } -> fprintf fmt \"#%a\" pp_atom term\n    | PT_annot { term; annot } ->\n        fprintf fmt \"%a : %a\" pp_funct term pp_wrapped annot\n    | PT_var { var } -> fprintf fmt \"%s\" (Name.repr var)\n    | PT_free_var { var } -> fprintf fmt \"\\\\+%a\" Level.pp var\n    | PT_bound_var { var } -> fprintf fmt \"\\\\-%a\" Index.pp var\n    | PT_hoist { bound; body } ->\n        (* TODO: is pp_wrapped correct here? 
*)\n        fprintf fmt \"%a; %a\" pp_wrapped bound pp_let body\n    | PT_let { bound; arg; body } ->\n        fprintf fmt \"%a = %a; %a\" pp_atom bound pp_funct arg pp_let body\n    | PT_lambda { param; body } ->\n        fprintf fmt \"%a => %a\" pp_atom param pp_funct body\n    | PT_apply { funct; arg } -> fprintf fmt \"%a %a\" pp_apply funct pp_atom arg\n    | PT_string { literal } ->\n        (* TODO: proper escaping *)\n        fprintf fmt \"%S\" literal\n    | PT_forall { param; body } ->\n        fprintf fmt \"(%a) -> %a\" pp_wrapped param pp_funct body\n    | PT_inter { left; right } ->\n        fprintf fmt \"(%a) & %a\" pp_wrapped left pp_funct right\n\n  let rec pp_term prec fmt term =\n    let pp_wrapped fmt term = pp_term T_wrapped fmt term in\n    let pp_let fmt term = pp_term T_let fmt term in\n    let pp_funct fmt term = pp_term T_funct fmt term in\n    let pp_apply fmt term = pp_term T_apply fmt term in\n    let pp_atom fmt term = pp_term T_atom fmt term in\n    match (term, prec) with\n    | ( (PT_meta _ | PT_var _ | PT_free_var _ | PT_bound_var _ | PT_string _),\n        (T_wrapped | T_let | T_funct | T_apply | T_atom) )\n    | PT_apply _, (T_wrapped | T_let | T_funct | T_apply)\n    | (PT_lambda _ | PT_forall _ | PT_inter _), (T_wrapped | T_let | T_funct)\n    | (PT_hoist _ | PT_let _), (T_wrapped | T_let)\n    | PT_annot _, T_wrapped ->\n        pp_term_syntax ~pp_wrapped ~pp_let ~pp_funct ~pp_apply ~pp_atom fmt term\n    | PT_apply _, T_atom\n    | (PT_lambda _ | PT_forall _ | PT_inter _), (T_apply | T_atom)\n    | (PT_hoist _ | PT_let _), (T_funct | T_apply | T_atom)\n    | PT_annot _, (T_let | T_funct | T_apply | T_atom) ->\n        fprintf fmt \"(%a)\" pp_wrapped term\n\n  let pp_term fmt term = pp_term T_wrapped fmt term\nend\n\nopen Ttree\nopen Ptree\n\nlet _pt_with_type ~type_ term =\n  PT_meta { term = PT_annot { term; annot = type_ } }\n\n(* TODO: extract substitutions *)\n(* TODO: rename all tt_ to term_ *)\nlet rec tt_print term =\n  
let (Term { struct_ = term; loc = _ }) = term in\n  match term with\n  | T_annot { term; annot } ->\n      let term = tt_print term in\n      let annot = tt_print annot in\n      PT_annot { term; annot }\n  | T_var { var } -> PT_bound_var { var }\n  | T_let { bound; arg; body } ->\n      let bound = tp_print bound in\n      let arg = tt_print arg in\n      let body = tt_print body in\n      PT_let { bound; arg; body }\n  | T_hoist { bound; body } ->\n      let bound = vp_print bound in\n      let body = tt_print body in\n      PT_hoist { bound; body }\n  | T_fix { bound; var = _; arg; body } ->\n      (* TODO: proper var renaming *)\n      let bound = vp_print bound in\n      let arg = tt_print arg in\n      let body = tt_print body in\n      PT_let { bound; arg; body }\n  | T_lambda { bound; body } ->\n      let param = tp_print bound in\n      let body = tt_print body in\n      PT_lambda { param; body }\n  | T_apply { funct; arg } ->\n      let funct = tt_print funct in\n      let arg = tt_print arg in\n      PT_apply { funct; arg }\n  | T_forall { bound; param; body } ->\n      let param =\n        let pat = tp_print bound in\n        let annot = tt_print param in\n        PT_annot { term = pat; annot }\n      in\n      let body = tt_print body in\n      PT_forall { param; body }\n  | T_self { bound; body } ->\n      let left = vp_print bound in\n      let right = tt_print body in\n      (* TODO: self *)\n      PT_inter { left; right }\n  | T_tuple _ | T_exists _ -> failwith \"not implemented\"\n\nand vp_print pat =\n  let (VPat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | VP_annot { pat; annot } ->\n      let pat = vp_print pat in\n      let annot = tt_print annot in\n      PT_annot { term = pat; annot }\n  | VP_var { var } -> PT_var { var }\n\nand tp_print pat =\n  let (Pat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | P_annot { pat; annot } ->\n      let pat = tp_print pat in\n      let annot = tt_print annot in\n      PT_annot { 
term = pat; annot }\n  | P_var { var } -> PT_var { var }\n  | P_tuple _ -> failwith \"not implemented\"\n\nlet pp_term fmt term =\n  let term = tt_print term in\n  Ptree.pp_term fmt term\n\nlet pp_pat fmt pat =\n  let pat = tp_print pat in\n  Ptree.pp_term fmt pat\n\nmodule Perror = struct\n  open Format\n  open Utils\n  open Ptree\n\n  type error =\n    | PE_loc of { loc : Location.t; error : error }\n    | PE_type_clash\n    | PE_unknown_var of { name : Name.t }\n    | PE_not_a_forall of { type_ : term }\n    | PE_hoist_not_implemented\n    | PE_extensions_not_implemented\n    | PE_pairs_not_implemented\n    | PE_unknown_native of { native : string }\n    | PE_missing_annotation\n    | PE_invalid_notation\n\n  let pp_pos fmt pos =\n    let Lexing.{ pos_fname; pos_lnum; pos_bol; pos_cnum } = pos in\n    (* TODO: print only file by default? *)\n    (* column is the offset from the beginning of the line *)\n    fprintf fmt \"%s:%d:%d\" pos_fname pos_lnum (pos_cnum - pos_bol)\n\n  let pp_loc fmt loc =\n    let Location.{ loc_start; loc_end; loc_ghost = _ } = loc in\n    match Location.is_none loc with\n    | true -> fprintf fmt \"[__NONE__]\"\n    | false -> fprintf fmt \"[%a .. 
%a]\" pp_pos loc_start pp_pos loc_end\n\n  let rec pp_error fmt error =\n    match error with\n    | PE_loc { loc; error } -> fprintf fmt \"%a\\n%a\" pp_loc loc pp_error error\n    | PE_type_clash -> fprintf fmt \"type clash\"\n    | PE_unknown_var { name } -> fprintf fmt \"unknown variable %a\" Name.pp name\n    | PE_not_a_forall { type_ } ->\n        fprintf fmt \"expected forall\\nreceived : %a\" pp_term type_\n    | PE_hoist_not_implemented -> fprintf fmt \"hoist not implemented\"\n    | PE_extensions_not_implemented -> fprintf fmt \"extensions not implemented\"\n    | PE_pairs_not_implemented -> fprintf fmt \"pairs not implemented\"\n    | PE_unknown_native { native } -> fprintf fmt \"unknown native : %S\" native\n    (* TODO: rename missing annotation *)\n    | PE_missing_annotation -> fprintf fmt \"not enough annotations\"\n    | PE_invalid_notation -> fprintf fmt \"invalid notation\"\nend\n\nlet rec te_print error =\n  let open Terror in\n  let open Perror in\n  match error with\n  | TError_loc { error; loc } ->\n      let rec loop loc error =\n        match error with\n        | TError_loc { error; loc = loc' } ->\n            let loc =\n              (* ignore none locations *)\n              match Location.is_none loc' with\n              | true -> loc\n              | false -> loc'\n            in\n            loop loc error\n        | error ->\n            let error = te_print error in\n            PE_loc { loc; error }\n      in\n      loop loc error\n  | TError_type_clash -> PE_type_clash\n  | TError_unknown_var { name } -> PE_unknown_var { name }\n  | TError_not_a_forall { type_ } ->\n      let type_ = tt_print type_ in\n      PE_not_a_forall { type_ }\n  | TError_extensions_not_implemented -> PE_extensions_not_implemented\n  | TError_hoist_not_implemented -> PE_hoist_not_implemented\n  | TError_pairs_not_implemented ->\n      PE_pairs_not_implemented (* TODO: print payload *)\n  | TError_unknown_native { native } -> PE_unknown_native { native }\n  
| TError_missing_annotation -> PE_missing_annotation\n  | TError_invalid_notation -> PE_invalid_notation\n\nlet pp_error fmt error =\n  let error = te_print error in\n  Perror.pp_error fmt error\n"
  },
  {
    "path": "teika/tprinter.mli",
    "content": "open Format\nopen Ttree\nopen Terror\n\nval pp_term : formatter -> term -> unit\nval pp_pat : formatter -> pat -> unit\nval pp_error : Format.formatter -> error -> unit\n"
  },
  {
    "path": "teika/ttree.ml",
    "content": "open Utils\n\n(* TODO: explicit unfold for loops on terms *)\ntype term = Term of { struct_ : term_struct; loc : Location.t [@opaque] }\n\nand term_struct =\n  (* (M : A) *)\n  | T_annot of { term : term; annot : term }\n  (* \\n *)\n  | T_var of { var : Index.t }\n  (* P = N; M *)\n  | T_let of { bound : pat; arg : term; body : term }\n  (* x : A; M *)\n  | T_hoist of { bound : var_pat; body : term }\n  (* x : A; ...; x = N; M *)\n  | T_fix of { bound : var_pat; var : Index.t; arg : term; body : term }\n  (* P => M *)\n  | T_lambda of { bound : pat; body : term }\n  (* M N *)\n  | T_apply of { funct : term; arg : term }\n  (* (P : A) -> B *)\n  | T_forall of { bound : pat; param : term; body : term }\n  (* (P : A) & B *)\n  | T_self of { bound : var_pat; body : term }\n  (* (M, ...) *)\n  | T_tuple of { elements : term list }\n  (* (x : A, ...) *)\n  | T_exists of { elements : pat list }\n\nand var_pat = VPat of { struct_ : var_pat_struct; loc : Location.t [@opaque] }\n\nand var_pat_struct =\n  (* (P : A) *)\n  | VP_annot of { pat : var_pat; annot : term }\n  (* x *)\n  | VP_var of { var : Name.t }\n\nand pat = Pat of { struct_ : pat_struct; loc : Location.t [@opaque] }\n\nand pat_struct =\n  (* (P : A) *)\n  | P_annot of { pat : pat; annot : term }\n  (* x *)\n  (* TODO: drop names and uses receipts *)\n  | P_var of { var : Name.t }\n  (* (x, ...) *)\n  | P_tuple of { elements : pat list }\n[@@deriving show { with_path = false }]\n\nlet t_wrap ~loc struct_ = Term { struct_; loc }\nlet vp_wrap ~loc struct_ = VPat { struct_; loc }\nlet p_wrap ~loc struct_ = Pat { struct_; loc }\n"
  },
  {
    "path": "teika/ttree.mli",
    "content": "open Utils\n\n(* TODO: explicit unfold for loops on terms *)\ntype term = Term of { struct_ : term_struct; loc : Location.t [@opaque] }\n\nand term_struct =\n  (* (M : A) *)\n  | T_annot of { term : term; annot : term }\n  (* \\n *)\n  | T_var of { var : Index.t }\n  (* P = N; M *)\n  | T_let of { bound : pat; arg : term; body : term }\n  (* x : A; M *)\n  | T_hoist of { bound : var_pat; body : term }\n  (* x : A; ...; x = N; M *)\n  | T_fix of { bound : var_pat; var : Index.t; arg : term; body : term }\n  (* P => M *)\n  | T_lambda of { bound : pat; body : term }\n  (* M N *)\n  | T_apply of { funct : term; arg : term }\n  (* (P : A) -> B *)\n  | T_forall of { bound : pat; param : term; body : term }\n  (* (P : A) & B *)\n  | T_self of { bound : var_pat; body : term }\n  (* (M, ...) *)\n  | T_tuple of { elements : term list }\n  (* (x : A, ...) *)\n  | T_exists of { elements : pat list }\n\nand var_pat = VPat of { struct_ : var_pat_struct; loc : Location.t }\n\nand var_pat_struct =\n  (* (P : A) *)\n  | VP_annot of { pat : var_pat; annot : term }\n  (* x *)\n  | VP_var of { var : Name.t }\n\nand pat = Pat of { struct_ : pat_struct; loc : Location.t }\n\nand pat_struct =\n  (* (P : A) *)\n  | P_annot of { pat : pat; annot : term }\n  (* x *)\n  (* TODO: drop names and uses receipts *)\n  | P_var of { var : Name.t }\n  (* (x, ...) *)\n  | P_tuple of { elements : pat list }\n[@@deriving show]\n\nval t_wrap : loc:Location.t -> term_struct -> term\nval vp_wrap : loc:Location.t -> var_pat_struct -> var_pat\nval p_wrap : loc:Location.t -> pat_struct -> pat\n"
  },
  {
    "path": "teika/typer.ml",
    "content": "open Utils\nopen Ttree\nopen Terror\n\nmodule Value : sig\n  type value\n\n  and value_struct =\n    | V_hole\n    (* TODO: name on var? *)\n    | V_var of { name : Name.t }\n    | V_forward of { name : Name.t; mutable inner : value }\n    | V_apply of { funct : value; arg : value }\n    | V_lambda of { env : env; bound : pat; body : term }\n    | V_univ\n    | V_forall of { param : value; env : env; bound : pat; body : term }\n    | V_self of { env : env; bound : var_pat; body : term }\n    | V_thunk of { env : env; term : term }\n    | V_link of { mutable value : value }\n\n  and env [@@deriving show]\n\n  (* environment *)\n  val empty : env\n  val access : env -> Index.t -> value\n  val append : env -> value -> env\n\n  (* constructors *)\n  val v_null : value\n  val v_var : at:Level.t -> name:Name.t -> value\n  val fresh_v_hole : at:Level.t -> value\n  val fresh_v_forward : name:Name.t -> value\n  val v_apply : funct:value -> arg:value -> value\n  val v_lambda : env:env -> bound:pat -> body:term -> value\n  val v_univ : value\n  val v_forall : param:value -> env:env -> bound:pat -> body:term -> value\n  val v_self : env:env -> bound:var_pat -> body:term -> value\n  val v_thunk : env:env -> term:term -> value\n\n  (* utilities *)\n  val repr : value -> value\n  val struct_ : value -> value_struct\n  val level : value -> Level.t\n  val same : value -> value -> bool\n  val assert_forward : value -> unit\n  val init_forward : value -> to_:value -> unit\n  val lock_forward : value -> (unit -> 'a) -> 'a\n  val hole_lower : value -> to_:Level.t -> unit\n  val hole_link : value -> to_:value -> unit\n  val thunk_link : value -> to_:value -> unit\nend = struct\n  type value = { mutable struct_ : value_struct; mutable at : Level.t }\n\n  and value_struct =\n    | V_hole\n    | V_var of { name : Name.t }\n    | V_forward of { name : Name.t; mutable inner : value [@opaque] }\n    | V_apply of { funct : value; arg : value }\n    | V_lambda of { env : env; 
[@opaque] bound : pat; body : term }\n      (* TODO: is univ actually needed or useful here? *)\n    | V_univ\n    (* TODO: non dependent version of types and function *)\n    | V_forall of {\n        param : value;\n        env : env; [@opaque]\n        bound : pat;\n        body : term;\n      }\n    | V_self of { env : env; [@opaque] bound : var_pat; body : term }\n    | V_thunk of { env : env; [@opaque] term : term }\n    | V_link of { mutable value : value }\n\n  and env = value list [@@deriving show { with_path = false }]\n\n  let v_new ~at struct_ = { struct_; at }\n\n  let v_null =\n    let name = Name.make \"**null**\" in\n    v_new ~at:Level.zero @@ V_var { name }\n\n  let v_var ~at ~name = v_new ~at @@ V_var { name }\n  let fresh_v_hole ~at = v_new ~at @@ V_hole\n\n  let fresh_v_forward ~name =\n    (* TODO: proper level here *)\n    let at = Level.zero in\n    v_new ~at @@ V_forward { name; inner = v_null }\n\n  let v_apply ~funct ~arg =\n    let at = Level.max funct.at arg.at in\n    v_new ~at @@ V_apply { funct; arg }\n\n  let v_lambda ~env ~bound ~body =\n    (* TODO: proper level for lambdas *)\n    let at = Level.zero in\n    v_new ~at @@ V_lambda { env; bound; body }\n\n  let v_univ = v_new ~at:Level.zero @@ V_univ\n\n  let v_forall ~param ~env ~bound ~body =\n    (* TODO: proper level for forall *)\n    let at = Level.zero in\n    v_new ~at @@ V_forall { param; env; bound; body }\n\n  let v_self ~env ~bound ~body =\n    (* TODO: proper level for self *)\n    let at = Level.zero in\n    v_new ~at @@ V_self { env; bound; body }\n\n  let v_thunk ~env ~term =\n    (* TODO: proper level here *)\n    let at = Level.zero in\n    v_new ~at @@ V_thunk { env; term }\n\n  let rec repr value =\n    match value.struct_ with\n    | V_link { value } -> repr value\n    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ\n    | V_forall _ | V_self _ | V_thunk _ ->\n        value\n\n  (* TODO: inline repr? 
*)\n  let repr value =\n    match value.struct_ with\n    | V_link ({ value } as link) ->\n        (* path compression *)\n        let value = repr value in\n        link.value <- value;\n        value\n    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ\n    | V_forall _ | V_self _ | V_thunk _ ->\n        value\n\n  let struct_ value = (repr value).struct_\n\n  (* TODO: level vs at *)\n  let level value = (repr value).at\n  let same (left : value) (right : value) = left == right\n\n  let assert_forward value =\n    match value.struct_ with\n    | V_forward { name = _; inner = _ } -> ()\n    | V_hole | V_var _ | V_apply _ | V_lambda _ | V_univ | V_forall _ | V_self _\n    | V_thunk _ | V_link _ ->\n        failwith \"assert_forward: not a forward\"\n\n  let init_forward value ~to_ =\n    let value = repr value in\n    match value.struct_ with\n    | V_forward ({ name = _; inner } as forward) -> (\n        match same inner v_null with\n        | true -> forward.inner <- to_\n        | false -> failwith \"init_forward: already initialized\")\n    | V_hole | V_var _ | V_apply _ | V_lambda _ | V_univ | V_forall _ | V_self _\n    | V_thunk _ | V_link _ ->\n        failwith \"init_forward: not a forward\"\n\n  let lock_forward value f =\n    match struct_ value with\n    | V_forward ({ name = _; inner } as forward) ->\n        forward.inner <- v_null;\n        let finally () = forward.inner <- inner in\n        Fun.protect ~finally f\n    | V_hole | V_var _ | V_apply _ | V_lambda _ | V_univ | V_forall _ | V_self _\n    | V_thunk _ | V_link _ ->\n        failwith \"lock_forward: not a forward\"\n\n  let hole_lower hole ~to_ =\n    let hole = repr hole in\n    (match hole.struct_ with\n    | V_hole -> ()\n    | _ -> failwith \"hole_lower: not a hole\");\n    hole.at <- Level.min hole.at to_\n\n  let hole_link hole ~to_ =\n    let hole = repr hole in\n    (match hole.struct_ with\n    | V_hole -> ()\n    | _ -> failwith \"link_hole: not a hole\");\n    
hole.struct_ <- V_link { value = to_ }\n\n  let thunk_link thunk ~to_ =\n    let thunk = repr thunk in\n    (match thunk.struct_ with\n    | V_thunk _ -> ()\n    | _ -> failwith \"link_thunk: not a thunk\");\n    thunk.struct_ <- V_link { value = to_ }\n\n  let empty = []\n\n  let access env var =\n    let var = (var : Index.t :> int) in\n    match List.nth_opt env var with\n    | Some value -> value\n    | None -> failwith \"lookup: unknown variable\"\n\n  let append env value = value :: env\nend\n\nmodule Eval = struct\n  open Value\n\n  let rec with_var_pat env bound ~arg =\n    let (VPat { struct_ = bound; loc = _ }) = bound in\n    match bound with\n    | VP_annot { pat; annot = _ } -> with_var_pat env pat ~arg\n    | VP_var { var = _ } ->\n        (* TODO: name and maybe type here? *)\n        append env arg\n\n  let rec with_pat env bound ~arg =\n    let (Pat { struct_ = bound; loc = _ }) = bound in\n    match bound with\n    | P_annot { pat; annot = _ } -> with_pat env pat ~arg\n    | P_var { var = _ } ->\n        (* TODO: name and maybe type here? 
*)\n        append env arg\n    | P_tuple { elements = _ } -> failwith \"not implemented\"\n\n  let rec fresh_v_forward_of_var_pat pat =\n    let (VPat { struct_ = pat; loc = _ }) = pat in\n    match pat with\n    | VP_annot { pat; annot = _ } -> fresh_v_forward_of_var_pat pat\n    | VP_var { var = name } -> fresh_v_forward ~name\n\n  let rec eval env term =\n    let (Term { struct_ = term; loc = _ }) = term in\n    match term with\n    | T_annot { term; annot = _ } -> eval env term\n    | T_var { var } -> weak_force @@ access env var\n    | T_hoist { bound; body } ->\n        let env =\n          let arg = fresh_v_forward_of_var_pat bound in\n          with_var_pat env bound ~arg\n        in\n        eval env body\n    | T_fix { bound = _; var; arg; body } ->\n        let forward = access env var in\n        let () = assert_forward forward in\n        let () =\n          let arg = eval env arg in\n          init_forward forward ~to_:arg\n        in\n        eval env body\n    | T_let { bound; arg; body } ->\n        let env =\n          let arg = eval env arg in\n          with_pat env bound ~arg\n        in\n        eval env body\n    | T_apply { funct; arg } ->\n        let funct = eval env funct in\n        let arg = eval env arg in\n        eval_apply ~funct ~arg\n    | T_lambda { bound; body } -> v_lambda ~env ~bound ~body\n    | T_forall { bound; param; body } ->\n        let param = eval env param in\n        v_forall ~param ~env ~bound ~body\n    | T_self { bound; body } -> v_self ~env ~bound ~body\n    | T_tuple _ | T_exists _ -> failwith \"not implemented\"\n\n  and eval_apply ~funct ~arg =\n    let funct = weak_force funct in\n    match struct_ funct with\n    | V_lambda { env; bound; body } ->\n        let env = with_pat env bound ~arg in\n        eval env body\n    | V_var _ | V_forward _ | V_apply _ -> v_apply ~funct ~arg\n    | V_hole | V_univ | V_forall _ | V_self _ ->\n        failwith \"eval_apply: type clash\"\n    | V_link _ | V_thunk _ -> 
failwith \"eval_apply: unreachable\"\n\n  and weak_force value =\n    (* TODO: forcing every time removes some short circuits *)\n    let value = repr value in\n    match struct_ value with\n    | V_thunk { env; term } ->\n        (* TODO: detect recursive force? *)\n        let final = eval env term in\n        thunk_link value ~to_:final;\n        final\n    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ\n    | V_forall _ | V_self _ | V_link _ ->\n        value\n\n  let rec strong_force value =\n    (* TODO: forcing every time removes some short circuits *)\n    (* TODO: path compression is bad for reasons *)\n    let value = weak_force value in\n    match struct_ value with\n    | V_forward { name = _; inner } -> (\n        match same inner v_null with\n        | true -> value\n        | false -> strong_force inner)\n    | V_apply { funct; arg } ->\n        let funct = strong_force funct in\n        strong_eval_apply ~funct ~arg\n    | V_hole | V_var _ | V_lambda _ | V_univ | V_forall _ | V_self _ | V_link _\n    | V_thunk _ ->\n        value\n\n  and strong_eval_apply ~funct ~arg =\n    match struct_ funct with\n    | V_lambda { env; bound; body } ->\n        let env = with_pat env bound ~arg in\n        strong_force @@ eval env body\n    | V_var _ | V_forward _ | V_apply _ -> v_apply ~funct ~arg\n    | V_hole | V_univ | V_forall _ | V_self _ ->\n        failwith \"strong_eval_apply: type clash\"\n    | V_link _ | V_thunk _ -> failwith \"strong_eval_apply: unreachable\"\nend\n\nmodule Unify = struct\n  open Value\n  open Eval\n\n  let rec unify_check ~at ~hole in_ =\n    (* TODO: short circuit on level *)\n    (* TODO: color to avoid size explosion *)\n    let in_ = weak_force in_ in\n    match struct_ in_ with\n    | V_hole -> (\n        match same hole in_ with\n        | true -> failwith \"occurs check\"\n        | false -> hole_lower in_ ~to_:at)\n    | V_var { name = _ } -> (\n        (* TODO: poly comparison *)\n        match level 
in_ >= at with\n        | true -> failwith \"escape check\"\n        | false -> ())\n    | V_forward { name = _; inner } ->\n        lock_forward in_ @@ fun () -> unify_check ~at ~hole inner\n    | V_apply { funct; arg } ->\n        unify_check ~at ~hole funct;\n        unify_check ~at ~hole arg\n    | V_univ -> ()\n    | V_lambda { env; bound = _; body } -> unify_check_under ~at ~hole env body\n    | V_forall { param; env; bound = _; body } ->\n        unify_check ~at ~hole param;\n        unify_check_under ~at ~hole env body\n    | V_self { env; bound = _; body } -> unify_check_under ~at ~hole env body\n    | V_thunk _ | V_link _ -> failwith \"unify_check: unreachable\"\n\n  and unify_check_under ~at ~hole env body =\n    (* TODO: fill this *)\n    let name = Name.make \"**unify_check_under**\" in\n    let skolem = v_var ~at ~name in\n    let at = Level.next at in\n    let body =\n      let env = append env skolem in\n      eval env body\n    in\n    unify_check ~at ~hole body\n\n  let unify_hole ~at ~hole ~to_ =\n    match same hole to_ with\n    | true -> ()\n    | false ->\n        unify_check ~at ~hole to_;\n        hole_link hole ~to_\n\n  let rec unify ~at lhs rhs =\n    (* TODO: do repr shortcircuit first *)\n    let lhs = weak_force lhs in\n    let rhs = weak_force rhs in\n    match same lhs rhs with true -> () | false -> unify_struct ~at lhs rhs\n\n  and unify_struct ~at lhs rhs =\n    match (struct_ lhs, struct_ rhs) with\n    | V_hole, _ -> unify_hole ~at ~hole:lhs ~to_:rhs\n    | _, V_hole -> unify_hole ~at ~hole:rhs ~to_:lhs\n    | V_var { name = _ }, V_var { name = _ } -> failwith \"var clash\"\n    | ( V_forward { name = lhs_name; inner = lhs_inner },\n        V_forward { name = rhs_name; inner = rhs_inner } ) -> (\n        match same lhs_inner v_null || same rhs_inner v_null with\n        | true ->\n            failwith\n            @@ Format.asprintf \"forward clash: %s == %s\" (Name.repr lhs_name)\n                 (Name.repr rhs_name)\n        
| false ->\n            (* TODO: is this a good idea? *)\n            lock_forward lhs @@ fun () ->\n            lock_forward rhs @@ fun () -> unify ~at lhs_inner rhs_inner)\n    | ( V_apply { funct = lhs_funct; arg = lhs_arg },\n        V_apply { funct = rhs_funct; arg = rhs_arg } ) ->\n        unify ~at lhs_funct rhs_funct;\n        unify ~at lhs_arg rhs_arg\n    | ( V_lambda { env = lhs_env; bound = _; body = lhs_body },\n        V_lambda { env = rhs_env; bound = _; body = rhs_body } ) ->\n        unify_under ~at lhs_env lhs_body rhs_env rhs_body\n    | V_univ, V_univ -> ()\n    | ( V_forall { param = lhs_param; env = lhs_env; bound = _; body = lhs_body },\n        V_forall\n          { param = rhs_param; env = rhs_env; bound = _; body = rhs_body } ) ->\n        unify ~at lhs_param rhs_param;\n        unify_under ~at lhs_env lhs_body rhs_env rhs_body\n    | ( V_self { env = lhs_env; bound = _; body = lhs_body },\n        V_self { env = rhs_env; bound = _; body = rhs_body } ) ->\n        (* TODO: check only bound? 
*)\n        unify_under ~at lhs_env lhs_body rhs_env rhs_body\n    | ( ( V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ | V_forall _\n        | V_self _ | V_thunk _ | V_link _ ),\n        ( V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ | V_forall _\n        | V_self _ | V_thunk _ | V_link _ ) ) ->\n        error_type_clash ()\n\n  and unify_under ~at lhs_env lhs rhs_env rhs =\n    (* TODO: should use pattern *)\n    (* TODO: fill this *)\n    let name = Name.make \"**unify_check_under**\" in\n    let skolem = v_var ~at ~name in\n    let at = Level.next at in\n    let lhs =\n      let lhs_env = append lhs_env skolem in\n      eval lhs_env lhs\n    in\n    let rhs =\n      let rhs_env = append rhs_env skolem in\n      eval rhs_env rhs\n    in\n    unify ~at lhs rhs\nend\n\nmodule Machinery = struct\n  open Value\n  open Eval\n\n  let rec inst_self ~self type_ =\n    let type_ = strong_force type_ in\n    match struct_ @@ type_ with\n    | V_self { env; bound; body } ->\n        let type_ =\n          let env = with_var_pat env bound ~arg:self in\n          eval env body\n        in\n        inst_self ~self type_\n    | V_hole | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ\n    | V_forall _ | V_thunk _ | V_link _ ->\n        type_\n\n  let split_forall value =\n    let value = strong_force value in\n    match struct_ value with\n    | V_forall { param; env; bound; body } -> (param, env, bound, body)\n    | V_hole -> failwith \"hole is not a forall\"\n    | V_var _ | V_forward _ | V_apply _ | V_lambda _ | V_univ | V_self _\n    | V_thunk _ | V_link _ ->\n        failwith \"not a forall\"\n\n  let coerce ~at ~self lhs rhs =\n    let lhs = inst_self ~self lhs in\n    (* TODO: this is really bad *)\n    let rhs = inst_self ~self rhs in\n    Format.eprintf \"%a == %a\\n%!\" pp_value lhs pp_value rhs;\n    Unify.unify ~at lhs rhs\nend\n\nopen Value\nopen Eval\nopen Unify\nopen Machinery\n\ntype value = Value.value\n\n(* infer *)\ntype vars = 
Vars of { types : (Name.t * value) list }\n[@@ocaml.unboxed] [@@deriving show { with_path = false }]\n\nlet rec v_skolem ~at pat =\n  let (Pat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | P_annot { pat; annot = _ } -> v_skolem ~at pat\n  | P_var { var = name } -> v_var ~at ~name\n  | P_tuple _ -> failwith \"not implemented\"\n\nlet rec v_skolem_of_var_pat ~at pat =\n  let (VPat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | VP_annot { pat; annot = _ } -> v_skolem_of_var_pat ~at pat\n  | VP_var { var = name } -> v_var ~at ~name\n\nlet rec enter vars pat ~type_ =\n  let (Pat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | P_annot { pat; annot = _ } -> enter vars pat ~type_\n  | P_var { var = name } ->\n      let (Vars { types }) = vars in\n      (* TODO: why this *)\n      let type_ =\n        (* TODO: thunk strong force *)\n        strong_force type_\n      in\n      let types = (name, type_) :: types in\n      Vars { types }\n  | P_tuple _ -> failwith \"not implemented\"\n\nlet rec enter_var_pat vars pat ~type_ =\n  let (VPat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | VP_annot { pat; annot = _ } -> enter_var_pat vars pat ~type_\n  | VP_var { var = name } ->\n      let (Vars { types }) = vars in\n      (* TODO: why this *)\n      let type_ =\n        (* TODO: thunk strong force *)\n        strong_force type_\n      in\n      let types = (name, type_) :: types in\n      Vars { types }\n\nlet solve vars env var =\n  let rec solve types var =\n    match (types, var) with\n    | (_name, type_) :: _types, 0 -> type_\n    | (_name, _type) :: types, var -> solve types (var - 1)\n    | [], _var ->\n        (* TODO: this is a problem *)\n        failwith \"unexpected unbound variable\"\n  in\n  let (Vars { types }) = vars in\n  let self = access env var in\n  let var = ((var : Index.t) :> int) in\n  let type_ = solve types var in\n  inst_self ~self type_\n\nlet rec type_of_pat env pat ~type_ =\n  let (Pat { struct_ = 
pat; loc = _ }) = pat in\n  match pat with\n  | P_annot { pat; annot } ->\n      let type_ = v_thunk ~env ~term:annot in\n      type_of_pat env pat ~type_\n  | P_var { var = _ } -> type_\n  | P_tuple _ -> failwith \"not implemented\"\n\nlet rec type_of_var_pat env pat ~type_ =\n  let (VPat { struct_ = pat; loc = _ }) = pat in\n  match pat with\n  | VP_annot { pat; annot } ->\n      let type_ = v_thunk ~env ~term:annot in\n      type_of_var_pat env pat ~type_\n  | VP_var { var = _ } -> type_\n\n(* TODO: ideally ensure that infer_term returns head normalized type *)\nlet rec infer_term vars env ~at term =\n  let expected_self = None in\n  let expected = fresh_v_hole ~at in\n  check_term vars env ~at term ~expected_self ~expected;\n  (* TODO: is this correct or a good idea? *)\n  let self = v_thunk ~env ~term in\n  inst_self ~self expected\n\nand check_term vars env ~at term ~expected_self ~expected =\n  (* TODO: not principled, let and annot will break this *)\n  let (Term { struct_ = term; loc = _ }) = term in\n  match term with\n  | T_annot { term; annot } ->\n      let annot = check_annot vars env ~at annot ~expected_self ~expected in\n      check_term vars env ~at term ~expected_self ~expected:annot\n  | T_var { var } ->\n      (* TODO: use expected_self? *)\n      let received = solve vars env var in\n      let self = access env var in\n      coerce ~at ~self received expected\n  | T_hoist { bound; body } ->\n      (* TODO: ensure it's eventually bound *)\n      let type_ = infer_var_pat vars env ~at bound in\n      let vars = enter_var_pat vars bound ~type_ in\n      let arg = fresh_v_forward_of_var_pat bound in\n      let env = with_var_pat env bound ~arg in\n      check_term vars env ~at body ~expected_self ~expected\n  | T_fix { bound; var; arg; body } ->\n      (* TODO: ensure it's not trivially recursive? 
A = M(A) *)\n      let forward = access env var in\n      let () = assert_forward forward in\n      let () =\n        let expected = solve vars env var in\n        check_var_pat vars env ~at bound ~expected;\n        let self = forward in\n        let expected_self = Some self in\n        let expected = type_of_var_pat env bound ~type_:expected in\n        check_term vars env ~at arg ~expected_self ~expected\n      in\n      let () =\n        let arg = v_thunk ~env ~term:arg in\n        init_forward forward ~to_:arg\n      in\n      let expected =\n        (* TODO: this could unlock some reductions *)\n        match expected_self with\n        | Some self -> inst_self ~self expected\n        | None -> expected\n      in\n      check_term vars env ~at body ~expected_self ~expected\n  | T_let { bound; arg; body } ->\n      let type_ = infer_pat vars env ~at bound in\n      let () =\n        check_term vars env ~at arg ~expected_self:None ~expected:type_\n      in\n      let arg = v_thunk ~env ~term:arg in\n      let vars = enter vars bound ~type_ in\n      let env = with_pat env bound ~arg in\n      check_term vars env ~at body ~expected_self ~expected\n  | T_lambda { bound; body } ->\n      let expected_param, expected_env, expected_bound, expected_body =\n        split_forall expected\n      in\n      let () = check_pat vars env ~at bound ~expected:expected_param in\n      let param = type_of_pat env bound ~type_:expected_param in\n      let skolem = v_skolem ~at bound in\n      let vars = enter vars bound ~type_:param in\n      let env = with_pat env bound ~arg:skolem in\n      let at = Level.next at in\n      let expected =\n        let env = with_pat expected_env expected_bound ~arg:skolem in\n        eval env expected_body\n      in\n      let expected = strong_force expected in\n      let expected, expected_self =\n        match expected_self with\n        | Some self ->\n            let self = strong_force self in\n            let self = eval_apply 
~funct:self ~arg:skolem in\n            let expected = inst_self ~self expected in\n            (expected, Some self)\n        | None -> (expected, None)\n      in\n      check_term vars env ~at body ~expected_self ~expected\n  | T_apply { funct; arg } ->\n      let funct_type = infer_term vars env ~at funct in\n      let param, body_env, bound, body_type = split_forall funct_type in\n      let () =\n        check_term vars env ~at arg ~expected_self:None ~expected:param\n      in\n      let received =\n        let arg = v_thunk ~env ~term:arg in\n        let body_env = with_pat body_env bound ~arg in\n        eval body_env body_type\n      in\n      (* TODO: coerce? *)\n      unify ~at received expected\n  | T_forall { bound; param; body } ->\n      unify ~at v_univ expected;\n      let () =\n        check_term vars env ~at param ~expected_self:None ~expected:v_univ\n      in\n      let param = eval env param in\n      check_pat vars env ~at bound ~expected:param;\n      let skolem = v_skolem ~at bound in\n      let at = Level.next at in\n      let vars = enter vars bound ~type_:param in\n      let env = append env skolem in\n      check_term vars env ~at body ~expected_self:None ~expected:v_univ\n  | T_self { bound; body } ->\n      (* TODO: this is really ugly *)\n      unify ~at v_univ expected;\n      let expected_self =\n        match expected_self with\n        | Some expected_self -> expected_self\n        | None -> failwith \"self is only supported in a fixpoint\"\n      in\n      check_var_pat vars env ~at bound ~expected:expected_self;\n      let type_ = type_of_var_pat env bound ~type_:expected_self in\n      let skolem = v_skolem_of_var_pat ~at bound in\n      let at = Level.next at in\n      let vars = enter_var_pat vars bound ~type_ in\n      let env = with_var_pat env bound ~arg:skolem in\n      check_term vars env ~at body ~expected_self:None ~expected:v_univ\n  | T_tuple _ | T_exists _ -> failwith \"not implemented\"\n\nand check_annot vars env 
~at annot ~expected_self ~expected =\n  check_term vars env ~at annot ~expected_self:None ~expected:v_univ;\n  let received = eval env annot in\n  match expected_self with\n  | Some self ->\n      let received = inst_self ~self received in\n      coerce ~at ~self received expected;\n      received\n  | None ->\n      unify ~at received expected;\n      received\n\nand infer_var_pat vars env ~at pat =\n  let expected = fresh_v_hole ~at in\n  check_var_pat vars env ~at pat ~expected;\n  expected\n\nand check_var_pat vars env ~at pat ~expected =\n  let (VPat { struct_ = pat_struct; loc = _ }) = pat in\n  match pat_struct with\n  | VP_annot { pat; annot } ->\n      let annot =\n        check_annot vars env ~at annot ~expected_self:None ~expected\n      in\n      check_var_pat vars env ~at pat ~expected:annot\n  | VP_var { var = _ } -> ()\n\nand infer_pat vars env ~at pat =\n  let expected = fresh_v_hole ~at in\n  check_pat vars env ~at pat ~expected;\n  expected\n\nand check_pat vars env ~at pat ~expected =\n  let (Pat { struct_ = pat_struct; loc = _ }) = pat in\n  match pat_struct with\n  | P_annot { pat; annot } ->\n      let annot =\n        check_annot vars env ~at annot ~expected_self:None ~expected\n      in\n      check_pat vars env ~at pat ~expected:annot\n  | P_var { var = _ } -> ()\n  | P_tuple _ -> failwith \"not implemented\"\n\n(* external *)\nlet infer_term term =\n  let at = Level.(next zero) in\n  let vars =\n    let types = [ (Name.make \"Type\", v_univ) ] in\n    Vars { types }\n  in\n  let env = append empty v_univ in\n  try Ok (infer_term vars env ~at term) with exn -> Error exn\n"
  },
  {
    "path": "teika/typer.mli",
    "content": "open Ttree\n\ntype value\n\nval infer_term : term -> (value, exn) result\n"
  },
  {
    "path": "teikalsp/dune",
    "content": "(executable\n (name teikalsp)\n (libraries lsp eio eio_main)\n (preprocess\n  (pps ppx_deriving.show ppx_deriving.eq ppx_deriving.ord sedlex.ppx)))\n"
  },
  {
    "path": "teikalsp/lsp_channel.ml",
    "content": "module Io : sig\n  type 'a t\n\n  val return : 'a -> 'a t\n  val raise : exn -> 'a t\n  val await : 'a t -> 'a\n  val async : (sw:Eio.Switch.t -> ('a, exn) result) -> 'a t\n\n  module O : sig\n    val ( let+ ) : 'a t -> ('a -> 'b) -> 'b t\n    val ( let* ) : 'a t -> ('a -> 'b t) -> 'b t\n  end\nend = struct\n  type 'a t = sw:Eio.Switch.t -> ('a, exn) result Eio.Promise.t\n\n  let await t = Eio.Switch.run @@ fun sw -> Eio.Promise.await_exn (t ~sw)\n  let return value ~sw:_ = Eio.Promise.create_resolved (Ok value)\n  let error desc ~sw:_ = Eio.Promise.create_resolved (Error desc)\n\n  let async f ~sw =\n    let promise, resolver = Eio.Promise.create () in\n    ( Eio.Fiber.fork ~sw @@ fun () ->\n      try\n        let result = f ~sw in\n        Eio.Promise.resolve resolver result\n      with exn -> Eio.Promise.resolve resolver @@ Error exn );\n    promise\n\n  let bind t f =\n    async @@ fun ~sw ->\n    match Eio.Promise.await (t ~sw) with\n    | Ok value -> Eio.Promise.await @@ f value ~sw\n    | Error desc -> Error desc\n\n  let raise = error\n\n  module O = struct\n    let ( let+ ) x f = bind x @@ fun value -> return @@ f value\n    let ( let* ) = bind\n  end\nend\n\nmodule Chan : sig\n  type input\n  type output\n\n  (* eio *)\n  val of_source : #Eio.Flow.source -> input\n  val with_sink : #Eio.Flow.sink -> (output -> 'a) -> 'a\n\n  (* lsp *)\n  val read_line : input -> string option Io.t\n  val read_exactly : input -> int -> string option Io.t\n  val write : output -> string -> unit Io.t\nend = struct\n  type input = Input of { mutex : Eio.Mutex.t; buf : Eio.Buf_read.t }\n  type output = Output of { mutex : Eio.Mutex.t; buf : Eio.Buf_write.t }\n\n  (* TODO: magic numbers *)\n  let initial_size = 1024\n  let max_size = 1024 * 1024\n\n  let of_source source =\n    let mutex = Eio.Mutex.create () in\n    let buf = Eio.Buf_read.of_flow ~initial_size ~max_size source in\n    Input { mutex; buf }\n\n  let with_sink sink f =\n    let mutex = 
Eio.Mutex.create () in\n    Eio.Buf_write.with_flow ~initial_size sink @@ fun buf ->\n    f @@ Output { mutex; buf }\n\n  let read_line input =\n    let (Input { mutex; buf }) = input in\n    Io.async @@ fun ~sw:_ ->\n    (* TODO: what this protect does? *)\n    Eio.Mutex.use_rw ~protect:true mutex @@ fun () ->\n    match Eio.Buf_read.eof_seen buf with\n    | true -> Ok None\n    | false -> Ok (Some (Eio.Buf_read.line buf))\n\n  let read_exactly input size =\n    let (Input { mutex; buf }) = input in\n    Io.async @@ fun ~sw:_ ->\n    Eio.Mutex.use_rw ~protect:true mutex @@ fun () ->\n    match Eio.Buf_read.eof_seen buf with\n    | true -> Ok None\n    | false -> Ok (Some (Eio.Buf_read.take size buf))\n\n  let write output str =\n    let (Output { mutex; buf }) = output in\n    Io.async @@ fun ~sw:_ ->\n    Eio.Mutex.use_rw ~protect:true mutex @@ fun () ->\n    Ok (Eio.Buf_write.string buf str)\nend\n\nmodule Lsp_io = Lsp.Io.Make (Io) (Chan)\nopen Jsonrpc\nopen Lsp_error\n\n(* TODO: is a mutex needed for write? *)\ntype channel = Chan.output\ntype t = channel\n\nlet notify channel notification =\n  (* TODO: fork here *)\n  (* TODO: buffering and async? *)\n  let notification = Lsp.Server_notification.to_jsonrpc notification in\n  Io.await @@ Lsp_io.write channel @@ Notification notification\n\nlet respond channel response =\n  Io.await @@ Lsp_io.write channel @@ Response response\n\nlet rec input_loop ~input ~output with_ =\n  (* TODO: buffering and async handling *)\n  match Io.await @@ Lsp_io.read input with\n  | Some packet ->\n      let () = with_ packet in\n      input_loop ~input ~output with_\n  | exception exn -> (* TODO: handle this exception *) raise exn\n  | None ->\n      (* TODO: this means EOF right? 
*)\n      ()\n\nlet request_of_jsonrpc request =\n  match Lsp.Client_request.of_jsonrpc request with\n  | Ok request -> request\n  | Error error -> fail (Error_invalid_request { error })\n\nlet notification_of_jsonrpc notification =\n  match Lsp.Client_notification.of_jsonrpc notification with\n  | Ok notification -> notification\n  | Error error -> fail (Error_invalid_notification { error })\n\ntype on_request = {\n  f :\n    'response.\n    channel ->\n    'response Lsp.Client_request.t ->\n    ('response, Response.Error.t) result;\n}\n\nlet listen ~input ~output ~on_request ~on_notification =\n  let on_request channel request =\n    (* TODO: error handling *)\n    let result =\n      let (E request) = request_of_jsonrpc request in\n      match on_request.f channel request with\n      | Ok result -> Ok (Lsp.Client_request.yojson_of_result request result)\n      | Error _error as error -> error\n    in\n    let response = Jsonrpc.Response.{ id = request.id; result } in\n    respond channel response\n  in\n  let on_notification channel notification =\n    let notification = notification_of_jsonrpc notification in\n    on_notification channel notification\n  in\n\n  let input = Chan.of_source input in\n  Chan.with_sink output @@ fun channel ->\n  input_loop ~input ~output @@ fun packet ->\n  (* TODO: make this async? *)\n  match packet with\n  | Notification notification -> on_notification channel notification\n  | Request request -> on_request channel request\n  | Batch_call calls ->\n      (* TODO: what if one fails? 
It should not prevent the others *)\n      List.iter\n        (fun call ->\n          match call with\n          | `Request request -> on_request channel request\n          | `Notification notification -> on_notification channel notification)\n        calls\n  (* TODO: can the server receive a response?\n      Yes but right now it will not be supported *)\n  | Response _ -> fail Error_response_unsupported\n  | Batch_response _ -> fail Error_response_unsupported\n"
  },
  {
    "path": "teikalsp/lsp_channel.mli",
    "content": "open Jsonrpc\n\ntype channel\ntype t = channel\n\nval notify : channel -> Lsp.Server_notification.t -> unit\n\ntype on_request = {\n  f :\n    'response.\n    channel ->\n    'response Lsp.Client_request.t ->\n    ('response, Response.Error.t) result;\n}\n\n(* TODO: request*)\nval listen :\n  input:#Eio.Flow.source ->\n  output:#Eio.Flow.sink ->\n  on_request:on_request ->\n  on_notification:(channel -> Lsp.Client_notification.t -> unit) ->\n  unit\n\n(* val input_loop : input:Chan.input ->\n   output:Chan.output -> (Jsonrpc.Packet.t -> Jsonrpc.Packet.t list) -> unit) *)\n"
  },
  {
    "path": "teikalsp/lsp_context.ml",
    "content": "open Lsp.Types\nopen Lsp_error\nmodule Document_uri_map = Map.Make (DocumentUri)\n\n(* TODO: capabilities *)\n(* TODO: initialized *)\ntype status = Handshake | Running\n\ntype context = {\n  mutable status : status;\n  mutable text_documents : Lsp_text_document.t Document_uri_map.t;\n}\n\ntype t = context\n\nlet create () = { status = Handshake; text_documents = Document_uri_map.empty }\nlet status context = context.status\n\nlet initialize context =\n  match context.status with\n  | Handshake -> context.status <- Running\n  | Running -> fail Error_invalid_status_during_initialize\n\nlet update_text_documents context f =\n  let text_documents = context.text_documents in\n  let text_documents = f text_documents in\n  context.text_documents <- text_documents\n\nlet open_text_document context uri text_document =\n  update_text_documents context @@ fun text_documents ->\n  (match Document_uri_map.mem uri text_documents with\n  | true -> fail Error_text_document_already_in_context\n  | false -> ());\n  Document_uri_map.add uri text_document text_documents\n\nlet change_text_document context uri cb =\n  update_text_documents context @@ fun text_documents ->\n  let text_document =\n    match Document_uri_map.find_opt uri text_documents with\n    | Some text_document -> text_document\n    | None -> fail Error_text_document_not_in_context\n  in\n  let text_document = cb text_document in\n  (* TODO: only accept if version is newer or equal *)\n  Document_uri_map.add uri text_document text_documents\n\nlet close_text_document context uri =\n  update_text_documents context @@ fun text_documents ->\n  (match Document_uri_map.mem uri text_documents with\n  | true -> ()\n  | false -> fail Error_text_document_not_in_context);\n  Document_uri_map.remove uri text_documents\n"
  },
  {
    "path": "teikalsp/lsp_context.mli",
    "content": "open Lsp.Types\n\ntype status = private Handshake | Running\ntype context\ntype t = context\n\n(* TODO: rollback? Requests and notifications should probably be atomic *)\nval create : unit -> context\nval status : context -> status\nval initialize : context -> unit\n\n(* documents *)\nval open_text_document : context -> DocumentUri.t -> Lsp_text_document.t -> unit\n\nval change_text_document :\n  context ->\n  DocumentUri.t ->\n  (Lsp_text_document.t -> Lsp_text_document.t) ->\n  unit\n\nval close_text_document : context -> DocumentUri.t -> unit\n"
  },
  {
    "path": "teikalsp/lsp_error.ml",
    "content": "open Lsp.Types\n\ntype error =\n  (* channel *)\n  | Error_request_unsupported\n  | Error_response_unsupported\n  | Error_invalid_request of { error : string }\n  | Error_invalid_notification of { error : string }\n  (* server *)\n  | Error_unsupported_request\n  | Error_unsupported_notification\n  (* context *)\n  | Error_notification_before_initialize\n  | Error_invalid_status_during_initialize\n  | Error_text_document_already_in_context\n  | Error_text_document_not_in_context\n  (* notification *)\n  | Error_multiple_content_changes of {\n      content_changes : TextDocumentContentChangeEvent.t list; [@opaque]\n    }\n  | Error_partial_content_change of {\n      content_change : TextDocumentContentChangeEvent.t; [@opaque]\n    }\n  | Error_invalid_content_change of {\n      content_change : TextDocumentContentChangeEvent.t; [@opaque]\n    }\n  | Error_unknown_language_id of { language_id : string }\n\nand t = error [@@deriving show]\n\n(* TODO: what happens with errors? *)\nexception Lsp_error of { error : error }\n\nlet fail error = raise (Lsp_error { error })\n"
  },
  {
    "path": "teikalsp/lsp_error.mli",
    "content": "open Lsp.Types\n\ntype error =\n  (* channel *)\n  | Error_request_unsupported\n  | Error_response_unsupported\n  | Error_invalid_request of { error : string }\n  | Error_invalid_notification of { error : string }\n  (* server *)\n  | Error_unsupported_request\n  | Error_unsupported_notification\n  (* context *)\n  | Error_notification_before_initialize\n  | Error_invalid_status_during_initialize\n  | Error_text_document_already_in_context\n  | Error_text_document_not_in_context\n  (* notification *)\n  | Error_multiple_content_changes of {\n      content_changes : TextDocumentContentChangeEvent.t list;\n    }\n  | Error_partial_content_change of {\n      content_change : TextDocumentContentChangeEvent.t;\n    }\n  | Error_invalid_content_change of {\n      content_change : TextDocumentContentChangeEvent.t;\n    }\n  | Error_unknown_language_id of { language_id : string }\n\ntype t = error [@@deriving show]\n\nexception Lsp_error of { error : error }\n\nval fail : error -> 'a\n"
  },
  {
    "path": "teikalsp/lsp_notification.ml",
    "content": "open Lsp.Types\nopen Lsp_context\nopen Lsp_error\n\nmodule Server_life_cycle = struct\n  (* TODO: do something here? *)\n  let initialized _context = ()\nend\n\nmodule Text_document_sync = struct\n  let did_open context ~params =\n    let DidOpenTextDocumentParams.{ textDocument = text_document } = params in\n    let TextDocumentItem.{ uri; languageId = language_id; version; text } =\n      text_document\n    in\n    let document =\n      match language_id with\n      | \"teika\" -> Lsp_text_document.teika ~version ~text\n      | language_id -> fail (Error_unknown_language_id { language_id })\n    in\n    (* TODO: async typing here *)\n    open_text_document context uri document\n\n  let did_change context ~params =\n    (* TODO: currently only full content changes are supported;\n        partial content changes could be supported *)\n    let DidChangeTextDocumentParams.\n          { textDocument = { uri; version }; contentChanges = content_changes }\n        =\n      params\n    in\n    let content_change =\n      match content_changes with\n      | [ content_change ] -> content_change\n      | content_changes ->\n          fail (Error_multiple_content_changes { content_changes })\n    in\n    let TextDocumentContentChangeEvent.{ range; rangeLength; text } =\n      content_change\n    in\n    (match (range, rangeLength) with\n    | None, None -> ()\n    | Some _, Some _ -> fail (Error_partial_content_change { content_change })\n    | Some _, None | None, Some _ ->\n        fail (Error_invalid_content_change { content_change }));\n    change_text_document context uri @@ fun document ->\n    (* TODO: async typing here *)\n    Lsp_text_document.with_change ~version ~text document\n\n  let did_close context ~params =\n    let DidCloseTextDocumentParams.{ textDocument = { uri } } = params in\n    close_text_document context uri\n\n  (* TODO: save and rename *)\nend\n"
  },
  {
    "path": "teikalsp/lsp_notification.mli",
    "content": "open Lsp.Types\n\nmodule Server_life_cycle : sig\n  val initialized : Lsp_context.t -> unit\nend\n\nmodule Text_document_sync : sig\n  val did_open : Lsp_context.t -> params:DidOpenTextDocumentParams.t -> unit\n  val did_change : Lsp_context.t -> params:DidChangeTextDocumentParams.t -> unit\n  val did_close : Lsp_context.t -> params:DidCloseTextDocumentParams.t -> unit\nend\n"
  },
  {
    "path": "teikalsp/lsp_request.ml",
    "content": "open Lsp.Types\nopen Lsp_context\nopen Lsp_error\n\nmodule Server_life_cycle = struct\n  let initialize context ~params =\n    let () =\n      (* TODO: this is duplicated *)\n      match status context with\n      | Handshake -> ()\n      | Running -> fail Error_invalid_status_during_initialize\n    in\n    (* TODO: use additional data *)\n    let InitializeParams.\n          {\n            workDoneToken = _;\n            processId = _;\n            clientInfo = _;\n            locale = _;\n            rootPath = _;\n            rootUri = _;\n            initializationOptions = _;\n            capabilities = _;\n            (* TODO: definitely ignore capabilities *)\n            trace = _;\n            (* TODO: enable logging using trace *)\n            workspaceFolders = _;\n          } =\n      params\n    in\n    let () = Lsp_context.initialize context in\n    (* TODO: better capabilities *)\n    let capabilities =\n      ServerCapabilities.create ~textDocumentSync:(`TextDocumentSyncKind Full)\n        ~hoverProvider:(`Bool true) ()\n    in\n    (* TODO: server_info *)\n    InitializeResult.create ~capabilities ()\nend\n"
  },
  {
    "path": "teikalsp/lsp_request.mli",
    "content": "open Lsp.Types\n\nmodule Server_life_cycle : sig\n  val initialize :\n    Lsp_context.t -> params:InitializeParams.t -> InitializeResult.t\nend\n"
  },
  {
    "path": "teikalsp/lsp_text_document.ml",
    "content": "(* TODO: proper types for text and version *)\ntype document = Smol of { version : int; text : string }\ntype t = document\n\nlet teika ~version ~text = Smol { version; text }\n\nlet with_change ~version ~text document =\n  (* TODO: use the version for something? *)\n  let (Smol { version = _; text = _ }) = document in\n  Smol { version; text }\n"
  },
  {
    "path": "teikalsp/lsp_text_document.mli",
    "content": "type document\ntype t = document\n\nval teika : version:int -> text:string -> document\nval with_change : version:int -> text:string -> document -> document\n"
  },
  {
    "path": "teikalsp/teikalsp.ml",
    "content": "open Lsp_error\n\nlet on_request (type response) context _channel\n    (request : response Lsp.Client_request.t) : response =\n  let open Lsp_request in\n  let open Lsp.Client_request in\n  (* TODO: use channel? *)\n  match request with\n  | Initialize params -> Server_life_cycle.initialize context ~params\n  | _request ->\n      (* TODO: print which requests are not supported *)\n      fail Error_unsupported_request\n\nlet on_request_error _context _channel error =\n  let open Jsonrpc.Response.Error in\n  (* TODO: maybe the error should be shown to the user? *)\n  (* TODO: better errors *)\n  let message =\n    match error with\n    | Lsp_error { error } -> Lsp_error.show error\n    | error -> Printexc.to_string error\n  in\n  Jsonrpc.Response.Error.make ~code:Code.InternalError ~message ()\n\nlet on_notification context _channel notification =\n  let open Lsp_notification in\n  let open Lsp.Client_notification in\n  (* TODO: use channel? *)\n  match notification with\n  | Initialized -> Server_life_cycle.initialized context\n  | TextDocumentDidOpen params -> Text_document_sync.did_open context ~params\n  | TextDocumentDidChange params ->\n      Text_document_sync.did_change context ~params\n  | TextDocumentDidClose params -> Text_document_sync.did_close context ~params\n  | _notification ->\n      (* TODO: print which notifications are not supported *)\n      fail Error_unsupported_notification\n\nlet on_notification context channel notification =\n  (* TODO: notification error handling *)\n  match Lsp_context.status context with\n  | Handshake ->\n      (* TODO: log *)\n      (* TODO: server can send some notifications during handshake *)\n      fail Error_notification_before_initialize\n  | Running -> on_notification context channel notification\n\nlet on_notification_error _context channel error =\n  let open Lsp.Types in\n  let open Lsp.Server_notification in\n  let message =\n    match error with\n    | Lsp_error { error } -> Lsp_error.show error\n    | error -> Printexc.to_string error\n  in\n  (* TODO: maybe the error should be shown to the user? *)\n  let message = LogMessageParams.create ~type_:Error ~message in\n  Lsp_channel.notify channel @@ LogMessage message\n\nlet main () =\n  Eio_main.run @@ fun env ->\n  let context = Lsp_context.create () in\n  let on_request channel request =\n    try Ok (on_request context channel request)\n    with error -> Error (on_request_error context channel error)\n  in\n  let on_notification channel notification =\n    try on_notification context channel notification\n    with error -> on_notification_error context channel error\n  in\n  Lsp_channel.listen ~input:env#stdin ~output:env#stdout\n    ~on_request:{ f = on_request } ~on_notification\n\nlet () = main ()\n"
  },
  {
    "path": "teikalsp/teikalsp.mli",
    "content": ""
  },
  {
    "path": "teikavsc/main.ts",
    "content": "import { workspace, ExtensionContext, commands } from \"vscode\";\n\nimport {\n  LanguageClient,\n  LanguageClientOptions,\n  ServerOptions,\n  TransportKind,\n} from \"vscode-languageclient/node\";\n\nlet client: LanguageClient;\n\nconst restartClient = async () => {\n  const workspaceConfiguration = workspace.getConfiguration();\n  const teikaServerPath = workspaceConfiguration.get<string>(\n    \"teika.server.path\",\n    \"\"\n  );\n\n  // If the extension is launched in debug mode then the debug server options are used\n  // Otherwise the run options are used\n  const serverOptions: ServerOptions = {\n    run: {\n      command: teikaServerPath,\n      args: [],\n      transport: TransportKind.stdio,\n    },\n    debug: {\n      command: teikaServerPath,\n      args: [],\n      transport: TransportKind.stdio,\n    },\n  };\n\n  // Options to control the language client\n  const clientOptions: LanguageClientOptions = {\n    // Register the server for plain text documents\n    documentSelector: [{ scheme: \"file\", language: \"teika\" }],\n    synchronize: {},\n  };\n\n  if (!client) {\n    // Create the language client and start the client.\n    client = new LanguageClient(\n      \"teika\",\n      \"Teika Server\",\n      serverOptions,\n      clientOptions\n    );\n  }\n\n  if (client && client.isRunning()) {\n    await client.restart();\n  } else {\n    // Start the client. This will also launch the server\n    await client.start();\n  }\n};\n\nexport function activate(context: ExtensionContext) {\n  context.subscriptions.push(\n    commands.registerCommand(\"teika.server.restart\", () => {\n      // TODO: handle promise below\n      restartClient();\n    })\n  );\n\n  restartClient();\n}\n\nexport function deactivate(): Thenable<void> | undefined {\n  if (!client) {\n    return undefined;\n  }\n  return client.stop();\n}\n"
  },
  {
    "path": "teikavsc/package.json",
    "content": "{\n  \"name\": \"teikavsc\",\n  \"displayName\": \"Teika\",\n  \"description\": \"Teika language extension for VSCode\",\n  \"license\": \"MIT\",\n  \"version\": \"0.0.1\",\n  \"publisher\": \"teikalang\",\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"https://github.com/teikalang/teika\"\n  },\n  \"bugs\": {\n    \"url\": \"https://github.com/teikalang/teika/issues\"\n  },\n  \"homepage\": \"my home page\",\n  \"main\": \"./dist/main.js\",\n  \"engines\": {\n    \"vscode\": \"^1.64.0\"\n  },\n  \"categories\": [\n    \"Programming Languages\"\n  ],\n  \"activationEvents\": [\n    \"workspaceContains:**/*.te\",\n    \"workspaceContains:**/*.tei\"\n  ],\n  \"_icon\": \"assets/logo.png\",\n  \"contributes\": {\n    \"viewsWelcome\": [],\n    \"viewsContainers\": {},\n    \"views\": {},\n    \"commands\": [\n      {\n        \"command\": \"teika.server.restart\",\n        \"category\": \"Teika\",\n        \"title\": \"Restart Language Server\"\n      }\n    ],\n    \"keybindings\": [],\n    \"menus\": {\n      \"editor/context\": [],\n      \"commandPalette\": [],\n      \"editor/title\": [],\n      \"view/title\": [],\n      \"view/item/context\": []\n    },\n    \"configuration\": {\n      \"title\": \"Teika\",\n      \"properties\": {\n        \"teika.server.path\": {\n          \"type\": \"string\",\n          \"default\": null,\n          \"description\": \"teikalsp path\"\n        },\n        \"teika.trace.server\": {\n          \"scope\": \"window\",\n          \"type\": \"string\",\n          \"enum\": [\n            \"off\",\n            \"messages\",\n            \"verbose\"\n          ],\n          \"default\": \"off\",\n          \"description\": \"Traces the communication between VS Code and the language server.\"\n        }\n      }\n    },\n    \"configurationDefaults\": {\n      \"[teika]\": {\n        \"editor.tabSize\": 2\n      }\n    },\n    \"problemMatchers\": [],\n    \"taskDefinitions\": [],\n    \"languages\": [\n 
     {\n        \"id\": \"teika\",\n        \"aliases\": [\n          \"Teika\",\n          \"teika\"\n        ],\n        \"extensions\": [\n          \".te\",\n          \".tei\"\n        ],\n        \"configuration\": \"./teika.language.json\"\n      }\n    ],\n    \"grammars\": [\n      {\n        \"language\": \"teika\",\n        \"scopeName\": \"source.teika\",\n        \"path\": \"./teika.syntax.json\"\n      },\n      {\n        \"scopeName\": \"markdown.teika.codeblock\",\n        \"path\": \"./teika.markdown.codeblock.json\",\n        \"injectTo\": [\n          \"text.html.markdown\"\n        ],\n        \"embeddedLanguages\": {\n          \"meta.embedded.block.teika\": \"teika\"\n        }\n      }\n    ],\n    \"snippets\": [\n      {\n        \"language\": \"teika\",\n        \"path\": \"./teika.snippets.json\"\n      }\n    ],\n    \"jsonValidation\": [],\n    \"customEditors\": []\n  },\n  \"scripts\": {\n    \"package\": \"vsce package --out vscode-teika.vsix --yarn\",\n    \"deploy:vsce\": \"vsce publish --packagePath vscode-teika.vsix --yarn\",\n    \"fmt:check\": \"prettier . --check\",\n    \"fmt\": \"prettier . --write\"\n  },\n  \"dependencies\": {\n    \"vscode-languageclient\": \"*\"\n  },\n  \"devDependencies\": {\n    \"@types/vscode\": \"*\",\n    \"@types/node\": \"*\",\n    \"prettier\": \"*\",\n    \"vsce\": \"*\",\n    \"typescript\": \"*\"\n  },\n  \"prettier\": {\n    \"proseWrap\": \"always\",\n    \"overrides\": []\n  }\n}\n"
  },
  {
    "path": "teikavsc/tsconfig.json",
    "content": "{\n  \"compilerOptions\": {\n    /* Visit https://aka.ms/tsconfig to read more about this file */\n    /* Projects */\n    // \"incremental\": true,                              /* Save .tsbuildinfo files to allow for incremental compilation of projects. */\n    // \"composite\": true,                                /* Enable constraints that allow a TypeScript project to be used with project references. */\n    // \"tsBuildInfoFile\": \"./.tsbuildinfo\",              /* Specify the path to .tsbuildinfo incremental compilation file. */\n    // \"disableSourceOfProjectReferenceRedirect\": true,  /* Disable preferring source files instead of declaration files when referencing composite projects. */\n    // \"disableSolutionSearching\": true,                 /* Opt a project out of multi-project reference checking when editing. */\n    // \"disableReferencedProjectLoad\": true,             /* Reduce the number of projects loaded automatically by TypeScript. */\n    /* Language and Environment */\n    \"target\": \"es2020\", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */\n    \"lib\": [\n      \"es2020\"\n    ], /* Specify a set of bundled library declaration files that describe the target runtime environment. */\n    // \"jsx\": \"preserve\",                                /* Specify what JSX code is generated. */\n    // \"experimentalDecorators\": true,                   /* Enable experimental support for TC39 stage 2 draft decorators. */\n    // \"emitDecoratorMetadata\": true,                    /* Emit design-type metadata for decorated declarations in source files. */\n    // \"jsxFactory\": \"\",                                 /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h'. 
*/\n    // \"jsxFragmentFactory\": \"\",                         /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */\n    // \"jsxImportSource\": \"\",                            /* Specify module specifier used to import the JSX factory functions when using 'jsx: react-jsx*'. */\n    // \"reactNamespace\": \"\",                             /* Specify the object invoked for 'createElement'. This only applies when targeting 'react' JSX emit. */\n    // \"noLib\": true,                                    /* Disable including any library files, including the default lib.d.ts. */\n    // \"useDefineForClassFields\": true,                  /* Emit ECMAScript-standard-compliant class fields. */\n    // \"moduleDetection\": \"auto\",                        /* Control what method is used to detect module-format JS files. */\n    /* Modules */\n    \"module\": \"commonjs\", /* Specify what module code is generated. */\n    // \"rootDir\": \"./\",                                  /* Specify the root folder within your source files. */\n    \"moduleResolution\": \"node\", /* Specify how TypeScript looks up a file from a given module specifier. */\n    // \"baseUrl\": \"./\",                                  /* Specify the base directory to resolve non-relative module names. */\n    // \"paths\": {},                                      /* Specify a set of entries that re-map imports to additional lookup locations. */\n    // \"rootDirs\": [],                                   /* Allow multiple folders to be treated as one when resolving modules. */\n    // \"typeRoots\": [],                                  /* Specify multiple folders that act like './node_modules/@types'. */\n    // \"types\": [],                                      /* Specify type package names to be included without being referenced in a source file. 
*/\n    // \"allowUmdGlobalAccess\": true,                     /* Allow accessing UMD globals from modules. */\n    // \"moduleSuffixes\": [],                             /* List of file name suffixes to search when resolving a module. */\n    // \"resolveJsonModule\": true,                        /* Enable importing .json files. */\n    // \"noResolve\": true,                                /* Disallow 'import's, 'require's or '<reference>'s from expanding the number of files TypeScript should add to a project. */\n    /* JavaScript Support */\n    // \"allowJs\": true,                                  /* Allow JavaScript files to be a part of your program. Use the 'checkJS' option to get errors from these files. */\n    // \"checkJs\": true,                                  /* Enable error reporting in type-checked JavaScript files. */\n    // \"maxNodeModuleJsDepth\": 1,                        /* Specify the maximum folder depth used for checking JavaScript files from 'node_modules'. Only applicable with 'allowJs'. */\n    /* Emit */\n    // \"declaration\": true,                              /* Generate .d.ts files from TypeScript and JavaScript files in your project. */\n    // \"declarationMap\": true,                           /* Create sourcemaps for d.ts files. */\n    // \"emitDeclarationOnly\": true,                      /* Only output d.ts files and not JavaScript files. */\n    \"sourceMap\": false, /* Create source map files for emitted JavaScript files. */\n    // \"outFile\": \"./\",                                  /* Specify a file that bundles all outputs into one JavaScript file. If 'declaration' is true, also designates a file that bundles all .d.ts output. */\n    \"outDir\": \"./dist\", /* Specify an output folder for all emitted files. */\n    // \"removeComments\": true,                           /* Disable emitting comments. */\n    // \"noEmit\": true,                                   /* Disable emitting files from a compilation. 
*/\n    // \"importHelpers\": true,                            /* Allow importing helper functions from tslib once per project, instead of including them per-file. */\n    // \"importsNotUsedAsValues\": \"remove\",               /* Specify emit/checking behavior for imports that are only used for types. */\n    // \"downlevelIteration\": true,                       /* Emit more compliant, but verbose and less performant JavaScript for iteration. */\n    // \"sourceRoot\": \"\",                                 /* Specify the root path for debuggers to find the reference source code. */\n    // \"mapRoot\": \"\",                                    /* Specify the location where debugger should locate map files instead of generated locations. */\n    // \"inlineSourceMap\": true,                          /* Include sourcemap files inside the emitted JavaScript. */\n    // \"inlineSources\": true,                            /* Include source code in the sourcemaps inside the emitted JavaScript. */\n    // \"emitBOM\": true,                                  /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */\n    // \"newLine\": \"crlf\",                                /* Set the newline character for emitting files. */\n    // \"stripInternal\": true,                            /* Disable emitting declarations that have '@internal' in their JSDoc comments. */\n    // \"noEmitHelpers\": true,                            /* Disable generating custom helper functions like '__extends' in compiled output. */\n    // \"noEmitOnError\": true,                            /* Disable emitting files if any type checking errors are reported. */\n    // \"preserveConstEnums\": true,                       /* Disable erasing 'const enum' declarations in generated code. */\n    // \"declarationDir\": \"./\",                           /* Specify the output directory for generated declaration files. 
*/\n    // \"preserveValueImports\": true,                     /* Preserve unused imported values in the JavaScript output that would otherwise be removed. */\n    /* Interop Constraints */\n    // \"isolatedModules\": true,                          /* Ensure that each file can be safely transpiled without relying on other imports. */\n    // \"allowSyntheticDefaultImports\": true,             /* Allow 'import x from y' when a module doesn't have a default export. */\n    \"esModuleInterop\": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables 'allowSyntheticDefaultImports' for type compatibility. */\n    // \"preserveSymlinks\": true,                         /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */\n    \"forceConsistentCasingInFileNames\": true, /* Ensure that casing is correct in imports. */\n    /* Type Checking */\n    \"strict\": true, /* Enable all strict type-checking options. */\n    // \"noImplicitAny\": true,                            /* Enable error reporting for expressions and declarations with an implied 'any' type. */\n    // \"strictNullChecks\": true,                         /* When type checking, take into account 'null' and 'undefined'. */\n    // \"strictFunctionTypes\": true,                      /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */\n    // \"strictBindCallApply\": true,                      /* Check that the arguments for 'bind', 'call', and 'apply' methods match the original function. */\n    // \"strictPropertyInitialization\": true,             /* Check for class properties that are declared but not set in the constructor. */\n    // \"noImplicitThis\": true,                           /* Enable error reporting when 'this' is given the type 'any'. */\n    // \"useUnknownInCatchVariables\": true,               /* Default catch clause variables as 'unknown' instead of 'any'. 
*/\n    // \"alwaysStrict\": true,                             /* Ensure 'use strict' is always emitted. */\n    // \"noUnusedLocals\": true,                           /* Enable error reporting when local variables aren't read. */\n    // \"noUnusedParameters\": true,                       /* Raise an error when a function parameter isn't read. */\n    // \"exactOptionalPropertyTypes\": true,               /* Interpret optional property types as written, rather than adding 'undefined'. */\n    // \"noImplicitReturns\": true,                        /* Enable error reporting for codepaths that do not explicitly return in a function. */\n    // \"noFallthroughCasesInSwitch\": true,               /* Enable error reporting for fallthrough cases in switch statements. */\n    // \"noUncheckedIndexedAccess\": true,                 /* Add 'undefined' to a type when accessed using an index. */\n    // \"noImplicitOverride\": true,                       /* Ensure overriding members in derived classes are marked with an override modifier. */\n    // \"noPropertyAccessFromIndexSignature\": true,       /* Enforces using indexed accessors for keys declared using an indexed type. */\n    // \"allowUnusedLabels\": true,                        /* Disable error reporting for unused labels. */\n    // \"allowUnreachableCode\": true,                     /* Disable error reporting for unreachable code. */\n    /* Completeness */\n    // \"skipDefaultLibCheck\": true,                      /* Skip type checking .d.ts files that are included with TypeScript. */\n    \"skipLibCheck\": true /* Skip type checking all .d.ts files. */\n  }\n}\n"
  },
  {
    "path": "utils/dune",
    "content": "(library\n (name utils)\n (preprocess\n  (pps ppx_deriving.eq ppx_deriving.ord ppx_deriving.show)))\n"
  },
  {
    "path": "utils/utils.ml",
    "content": "module Index = struct\n  type index = int\n  and t = index [@@deriving show, eq]\n\n  let zero = 0\n\n  let next n =\n    let n = n + 1 in\n    (* the increment must not have overflowed *)\n    assert (n >= zero);\n    n\nend\n\nmodule Level = struct\n  type level = int\n  and t = level [@@deriving show, eq]\n\n  let zero = 0\n\n  let next n =\n    let n = n + 1 in\n    (* the increment must not have overflowed *)\n    assert (n >= zero);\n    n\n\n  let offset ~from ~to_ =\n    match to_ > from with\n    | true ->\n        (* TODO: explain this -1 *)\n        Some (to_ - from - 1)\n    | false -> None\nend\n\nmodule Name = struct\n  type name = string\n  and t = name [@@deriving show, eq, ord]\n\n  let make t = t\n  let repr t = t\n\n  module Map = Map.Make (String)\nend\n"
  },
  {
    "path": "utils/utils.mli",
    "content": "module Index : sig\n  type index = private int\n  type t = index [@@deriving show, eq]\n\n  val zero : index\n  val next : index -> index\nend\n\nmodule Level : sig\n  (* TODO: this private int is not ideal *)\n  type level = private int\n  type t = level [@@deriving show, eq]\n\n  val zero : level\n  val next : level -> level\n  val offset : from:level -> to_:level -> Index.t option\nend\n\nmodule Name : sig\n  type name\n  type t = name [@@deriving show, eq, ord]\n\n  val make : string -> name\n  val repr : name -> string\n\n  (* TODO: stop exposing this? *)\n  module Map : Map.S with type key = name\nend\n"
  }
]