Repository: 0xPARC/plonkathon
Branch: main
Commit: 954c48f913fe
Files: 25
Total size: 104.6 KB
Directory structure:
gitextract_u8n2xw5v/
├── .gitignore
├── README.md
├── TESTING_verifier_DO_NOT_OPEN.py
├── __init__.py
├── compiler/
│ ├── __init__.py
│ ├── assembly.py
│ ├── program.py
│ └── utils.py
├── curve.py
├── poly.py
├── prover.py
├── pyproject.toml
├── setup.py
├── test/
│ ├── __init__.py
│ ├── main.plonk.vkey-58.json
│ ├── main.plonk.vkey-59.json
│ ├── main.plonk.vkey.json
│ ├── mini_poseidon.py
│ ├── poseidon_rc.json
│ ├── powersOfTau28_hez_final_11.ptau
│ └── proof.pickle
├── test.py
├── transcript.py
├── utils.py
└── verifier.py
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
venv
*.pyc
# Remove jupyter notebook stuff
*.ipynb
settings.json
================================================
FILE: README.md
================================================
# PlonKathon
**PlonKathon** is part of the program for the [MIT IAP 2023 course *Modern Zero Knowledge Cryptography*](https://zkiap.com/). Over the course of this weekend, we will get into the weeds of the PlonK protocol through a series of exercises and extensions. This repository contains a simple Python implementation of PlonK adapted from [py_plonk](https://github.com/ethereum/research/tree/master/py_plonk), targeted to be nearly compatible with the implementation at https://zkrepl.dev.
### Exercises
Each step of the exercise is accompanied by tests in `test.py` to check your progress.
#### Step 1: Implement setup.py
Implement `Setup.commit` and `Setup.verification_key`.
#### Step 2: Implement prover.py
1. Implement Round 1 of the PlonK prover
2. Implement Round 2 of the PlonK prover
3. Implement Round 3 of the PlonK prover
4. Implement Round 4 of the PlonK prover
5. Implement Round 5 of the PlonK prover
#### Step 3: Implement verifier.py
Implement `VerificationKey.verify_proof_unoptimized` and `VerificationKey.verify_proof`. See the comments for the differences.
#### Step 4: Pass all the tests!
Pass a number of miscellaneous tests that test your implementation end-to-end.
### Extensions
1. Add support for custom gates.
[TurboPlonK](https://docs.zkproof.org/pages/standards/accepted-workshop3/proposal-turbo_plonk.pdf) introduced support for custom constraints, beyond the addition and multiplication gates supported here. Try to generalise this implementation to allow circuit writers to define custom constraints.
2. Add zero-knowledge.
The parts of PlonK that are responsible for ensuring strong privacy are left out of this implementation. See if you can identify them in the [original paper](https://eprint.iacr.org/2019/953.pdf) and add them here.
3. Add support for lookups.
A lookup argument allows us to prove that a certain element can be found in a public lookup table. [PlonKup](https://eprint.iacr.org/2022/086.pdf) introduces lookup arguments to PlonK. Try to understand the construction in the paper and implement it here.
4. Implement Merlin transcript.
Currently, this implementation uses the [merlin transcript package](https://github.com/nalinbhardwaj/curdleproofs.pie/tree/master/merlin). Learn about the [Merlin transcript construction](https://merlin.cool) and the [STROBE framework](https://www.cryptologie.net/article/416/the-strobe-protocol-framework/) which Merlin is based upon, and then implement the transcript class `MerlinTranscript` yourself!
## Getting started
To get started, you'll need Python >= 3.8 and [`poetry`](https://python-poetry.org) installed: `curl -sSL https://install.python-poetry.org | python3 -`.
Then, run `poetry install` in the root of the repository. This will install all the dependencies in a virtualenv.
Then, to see the proof system in action, run `poetry run python test.py` from the root of the repository. This will take you through the workflow of setup, proof generation, and verification for several example programs.
The `main` branch contains code stubbed out with comments to guide you through the tests. The `hardcore` branch removes the comments for the more adventurous amongst you. The `reference` branch contains a completed implementation.
For linting and type-checking, the repo also provides `poetry run black .` and `poetry run mypy .`.
### Compiler
#### Program
We specify our program logic in a high-level language involving constraints and variable assignments. Here is a program that lets you prove that you know two small numbers that multiply to a given number (in our example we'll use 91) without revealing what those numbers are:
```
n public
pb0 === pb0 * pb0
pb1 === pb1 * pb1
pb2 === pb2 * pb2
pb3 === pb3 * pb3
qb0 === qb0 * qb0
qb1 === qb1 * qb1
qb2 === qb2 * qb2
qb3 === qb3 * qb3
pb01 <== pb0 + 2 * pb1
pb012 <== pb01 + 4 * pb2
p <== pb012 + 8 * pb3
qb01 <== qb0 + 2 * qb1
qb012 <== qb01 + 4 * qb2
q <== qb012 + 8 * qb3
n <== p * q
```
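Under the hood, this program proves knowledge of the bit decompositions of `p` and `q`. For `n = 91 = 7 × 13`, a satisfying witness can be checked directly in plain Python (variable names mirror the program above):

```python
# Candidate witness for n = 91: p = 7 = 0b0111, q = 13 = 0b1101.
pb = [1, 1, 1, 0]  # bits of p, least-significant first
qb = [1, 0, 1, 1]  # bits of q, least-significant first

# Booleanity constraints: b === b * b holds iff b is 0 or 1.
for bit in pb + qb:
    assert bit == bit * bit

# Recomposition constraints: pb01 <== pb0 + 2*pb1, and so on up to p and q.
p = pb[0] + 2 * pb[1] + 4 * pb[2] + 8 * pb[3]
q = qb[0] + 2 * qb[1] + 4 * qb[2] + 8 * qb[3]
assert p == 7 and q == 13

# Final constraint: n <== p * q.
assert p * q == 91
```

The bits stay private; only `n` is declared public, so a verifier learns that 91 factors into two 4-bit numbers without learning which ones.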
Examples of valid program constraints:
- `a === 9`
- `b <== a * c`
- `d <== a * c - 45 * a + 987`
Examples of invalid program constraints:
- `7 === 7` (can't assign to non-variable)
- `a <== b * * c` (two multiplications in a row)
- `e <== a + b * c * d` (multiplicative degree > 2)
Given a `Program`, we can derive the `CommonPreprocessedInput`: the polynomials representing the fixed constraints of the program. The prover later uses these polynomials to construct the quotient polynomial and to compute their evaluations at a given challenge point.
```python
@dataclass
class CommonPreprocessedInput:
    """Common preprocessed input"""

    group_order: int
    # q_M(X) multiplication selector polynomial
    QM: list[Scalar]
    # q_L(X) left selector polynomial
    QL: list[Scalar]
    # q_R(X) right selector polynomial
    QR: list[Scalar]
    # q_O(X) output selector polynomial
    QO: list[Scalar]
    # q_C(X) constants selector polynomial
    QC: list[Scalar]
    # S_σ1(X) first permutation polynomial
    S1: list[Scalar]
    # S_σ2(X) second permutation polynomial
    S2: list[Scalar]
    # S_σ3(X) third permutation polynomial
    S3: list[Scalar]
```
#### Assembly
Our "assembly" language consists of `AssemblyEqn`s:
```python
@dataclass
class AssemblyEqn:
    """Assembly equation mapping wires to coefficients."""

    wires: GateWires
    coeffs: dict[Optional[str], int]
```
where:
```python
@dataclass
class GateWires:
    """Variable names for Left, Right, and Output wires."""

    L: Optional[str]
    R: Optional[str]
    O: Optional[str]
```
Examples of valid program constraints, and corresponding assembly:
| program constraint | assembly |
| -------------------------- | ------------------------------------------------ |
| a === 9 | ([None, None, 'a'], {'': 9}) |
| b <== a * c | (['a', 'c', 'b'], {'a*c': 1}) |
| d <== a * c - 45 * a + 987 | (['a', 'c', 'd'], {'a*c': 1, 'a': -45, '': 987}) |
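The selector values for each gate follow from the coefficient map by sign flips: the standard PlonK gate equation is $q_L a + q_R b + q_M ab + q_O c + q_C = 0$, so moving the right-hand side of `c <== ...` over to the left negates its coefficients. Here is a minimal sketch of that conversion (mirroring `AssemblyEqn.gate()` in `compiler/assembly.py`, with plain ints standing in for `Scalar`):

```python
def gate_coeffs(wires, coeffs):
    """Derive (q_L, q_R, q_M, q_O, q_C) selector values for one gate.

    `wires` is (left, right, out); `coeffs` maps terms to coefficients,
    exactly as in the assembly table above.
    """
    l, r, _ = wires
    product_key = "*".join(sorted(k for k in (l, r) if k))
    return (
        -coeffs.get(l, 0),                              # q_L
        -coeffs.get(r, 0) if r != l else 0,             # q_R
        -coeffs.get(product_key, 0) if l and r else 0,  # q_M
        coeffs.get("$output_coeff", 1),                 # q_O
        -coeffs.get("", 0),                             # q_C
    )

# b <== a * c  ->  (['a', 'c', 'b'], {'a*c': 1})
# gives q_M = -1, q_O = 1, i.e. the gate equation -a*c + b = 0.
print(gate_coeffs(("a", "c", "b"), {"a*c": 1}))
```

For `d <== a * c - 45 * a + 987` this yields `(45, 0, -1, 1, -987)`, i.e. the gate equation $45a - ac + d - 987 = 0$, which is the original constraint rearranged.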
### Setup
Let $\mathbb{G}_1$ and $\mathbb{G}_2$ be two elliptic curves with a pairing $e : \mathbb{G}_1 \times \mathbb{G}_2 \rightarrow \mathbb{G}_T$. Let $p$ be the order of $\mathbb{G}_1$ and $\mathbb{G}_2$, and $G$ and $H$ be generators of $\mathbb{G}_1$ and $\mathbb{G}_2$. We will use the shorthand notation
$$[x]_1 = xG \in \mathbb{G}_1 \text{ and } [x]_2 = xH \in \mathbb{G}_2$$
for any $x \in \mathbb{F}_p$.
The trusted setup is a preprocessing step that produces a structured reference string:
$$\mathsf{srs} = ([1]_1, [x]_1, \cdots, [x^{d-1}]_1, [x]_2),$$
where:
- $x \in \mathbb{F}$ is a randomly chosen, **secret** evaluation point; and
- $d$ is the size of the trusted setup, corresponding to the maximum degree polynomial that it can support.
```python
@dataclass
class Setup(object):
    # ([1]₁, [x]₁, ..., [x^{d-1}]₁)
    # = ( G, xG, ..., x^{d-1}G ), where G is a generator of G_1
    powers_of_x: list[G1Point]
    # [x]₂ = xH, where H is a generator of G_2
    X2: G2Point
```
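To see why committing against such an SRS works, it helps to strip away the elliptic curve entirely. In the toy sketch below (a stand-in, not the real `Setup`), each group element $[x^i]_1$ is replaced by the bare integer $x^i \bmod p$, so the secret is leaked; it only illustrates the structure. Committing to a polynomial then amounts to evaluating it at the secret point "in the exponent":

```python
# BN254 curve order (the scalar field modulus)
r = 21888242871839275222246405745257275088548364400416034343698204186575808495617

# Toy "trusted setup": in the real Setup, each power of x is hidden inside
# a G1 point x^i * G; here we keep it as a bare integer, which of course
# leaks the secret x -- this is purely illustrative.
x = 123456789  # the secret evaluation point (must be discarded!)
d = 8          # size of the setup
srs = [pow(x, i, r) for i in range(d)]

def toy_commit(coeffs):
    """Commit to a polynomial given by its coefficients (lowest degree first).

    sum(c_i * x^i) = f(x); in the real scheme this sum of scalar multiples
    of G1 points yields the single point [f(x)]_1.
    """
    return sum(c * s for c, s in zip(coeffs, srs)) % r

# Committing to f(X) = X^2 + 1 yields f(x) = x^2 + 1 (mod r)
assert toy_commit([1, 0, 1]) == (x * x + 1) % r
```

The real `Setup.commit` does the same sum, but over `G1Point`s via `ec_lincomb`, so the verifier never learns `x` or `f(x)` itself.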
In this repository, we are using the pairing-friendly [BN254 curve](https://hackmd.io/@jpw/bn254). Note that the $p$ defined above is the *curve order* (the scalar field modulus); the curves themselves are defined over a base field $\mathbb{F}_q$, where:
- `q = 21888242871839275222246405745257275088696311157297823662689037894645226208583` is the base field modulus;
- $\mathbb{G}_1$ is the curve $y^2 = x^3 + 3$ over $\mathbb{F}_q$;
- $\mathbb{G}_2$ is the twisted curve $y^2 = x^3 + 3/(9+u)$ over $\mathbb{F}_{q^2}$; and
- $\mathbb{G}_T = {\mu}_p \subset \mathbb{F}_{q^{12}}^{\times}$, the group of $p$-th roots of unity.
We are using an existing setup for $d = 2^{11}$, from this [ceremony](https://github.com/iden3/snarkjs/blob/master/README.md). You can find out more about trusted setup ceremonies [here](https://github.com/weijiekoh/perpetualpowersoftau).
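The protocol also needs a multiplicative subgroup of the scalar field of order `group_order` to interpolate polynomials over. `Scalar.root_of_unity` in `curve.py` finds a generator ω of this subgroup by raising the primitive root 5 to the power $(p-1)/n$; the same computation can be checked with plain modular arithmetic:

```python
# BN254 curve order (the modulus of the scalar field)
r = 21888242871839275222246405745257275088548364400416034343698204186575808495617

def root_of_unity(group_order: int) -> int:
    # 5 is a primitive root of the scalar field, so 5^((r-1)/n)
    # has multiplicative order exactly n
    assert (r - 1) % group_order == 0
    return pow(5, (r - 1) // group_order, r)

w = root_of_unity(8)
assert pow(w, 8, r) == 1   # w generates a subgroup of order 8 ...
assert pow(w, 4, r) != 1   # ... and not a smaller one
```

This is why `group_order` must be a power of two dividing the large power-of-two factor of $p - 1$; the `Scalar.roots_of_unity` helper then lists $1, \omega, \omega^2, \ldots, \omega^{n-1}$.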
### Prover
The prover creates a proof of knowledge of some satisfying witness to a program.
```python
@dataclass
class Prover:
    group_order: int
    setup: Setup
    program: Program
    pk: CommonPreprocessedInput
```
The prover progresses in five rounds, and produces a message at the end of each. After each round, the message is hashed into the `Transcript`.
The `Proof` consists of all the round messages (`Message1`, `Message2`, `Message3`, `Message4`, `Message5`).
#### Round 1
```python
def round_1(
    self,
    witness: dict[Optional[str], int],
) -> Message1

@dataclass
class Message1:
    # [a(x)]₁ (commitment to left wire polynomial)
    a_1: G1Point
    # [b(x)]₁ (commitment to right wire polynomial)
    b_1: G1Point
    # [c(x)]₁ (commitment to output wire polynomial)
    c_1: G1Point
```
#### Round 2
```python
def round_2(self) -> Message2

@dataclass
class Message2:
    # [z(x)]₁ (commitment to permutation polynomial)
    z_1: G1Point
```
#### Round 3
```python
def round_3(self) -> Message3

@dataclass
class Message3:
    # [t_lo(x)]₁ (commitment to t_lo(X), the low chunk of the quotient polynomial t(X))
    t_lo_1: G1Point
    # [t_mid(x)]₁ (commitment to t_mid(X), the middle chunk of the quotient polynomial t(X))
    t_mid_1: G1Point
    # [t_hi(x)]₁ (commitment to t_hi(X), the high chunk of the quotient polynomial t(X))
    t_hi_1: G1Point
```
#### Round 4
```python
def round_4(self) -> Message4

@dataclass
class Message4:
    # Evaluation of a(X) at evaluation challenge ζ
    a_eval: Scalar
    # Evaluation of b(X) at evaluation challenge ζ
    b_eval: Scalar
    # Evaluation of c(X) at evaluation challenge ζ
    c_eval: Scalar
    # Evaluation of the first permutation polynomial S_σ1(X) at ζ
    s1_eval: Scalar
    # Evaluation of the second permutation polynomial S_σ2(X) at ζ
    s2_eval: Scalar
    # Evaluation of the permutation polynomial z(X) at the shifted challenge ζω
    z_shifted_eval: Scalar
```
#### Round 5
```python
def round_5(self) -> Message5

@dataclass
class Message5:
    # [W_ζ(X)]₁ (commitment to the opening proof polynomial for the evaluations at ζ)
    W_z_1: G1Point
    # [W_ζω(X)]₁ (commitment to the opening proof polynomial for the evaluation at ζω)
    W_zw_1: G1Point
```
### Verifier
Given a `Setup` and a `Program`, we can generate a verification key for the program:
```python
def verification_key(self, pk: CommonPreprocessedInput) -> VerificationKey
```
The `VerificationKey` contains:
| verification key element | remark |
| ------------------------ | ---------------------------------------------------------------- |
| $[q_M(x)]_1$ | commitment to multiplication selector polynomial |
| $[q_L(x)]_1$ | commitment to left selector polynomial |
| $[q_R(x)]_1$ | commitment to right selector polynomial |
| $[q_O(x)]_1$ | commitment to output selector polynomial |
| $[q_C(x)]_1$ | commitment to constants selector polynomial |
| $[S_{\sigma1}(x)]_1$ | commitment to the first permutation polynomial $S_{\sigma1}(X)$ |
| $[S_{\sigma2}(x)]_1$ | commitment to the second permutation polynomial $S_{\sigma2}(X)$ |
| $[S_{\sigma3}(x)]_1$ | commitment to the third permutation polynomial $S_{\sigma3}(X)$ |
| $[x]_2 = xH$ | (from the $\mathsf{srs}$) |
| $\omega$ | an $n$-th root of unity, where $n$ is the program's group order. |
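Both verifier variants reduce to KZG opening checks against $[x]_2$. In the optimized `verify_proof`, the two openings (at $\zeta$ and at $\zeta\omega$) are batched with the challenge $u$ into the single pairing equation

$$e\big([W_\zeta]_1 + u \cdot [W_{\zeta\omega}]_1,\ [x]_2\big) = e\big(\zeta \cdot [W_\zeta]_1 + u\zeta\omega \cdot [W_{\zeta\omega}]_1 + [F]_1 - [E]_1,\ [1]_2\big),$$

where $[F]_1$ gathers the committed terms of the linearized proof and $[E]_1$ gathers the claimed evaluations (this is exactly the final `assert` in `TESTING_verifier_DO_NOT_OPEN.py`).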
================================================
FILE: TESTING_verifier_DO_NOT_OPEN.py
================================================
import py_ecc.bn128 as b
from utils import *
from dataclasses import dataclass
from curve import *
from transcript import Transcript
from poly import Polynomial, Basis
@dataclass
class TestingVerificationKey:
"""Testing Verification key: DO NOT READ THIS CODE, only for testing prover implementations"""
group_order: int
# [q_M(x)]₁ (commitment to multiplication selector polynomial)
Qm: G1Point
# [q_L(x)]₁ (commitment to left selector polynomial)
Ql: G1Point
# [q_R(x)]₁ (commitment to right selector polynomial)
Qr: G1Point
# [q_O(x)]₁ (commitment to output selector polynomial)
Qo: G1Point
# [q_C(x)]₁ (commitment to constants selector polynomial)
Qc: G1Point
# [S_σ1(x)]₁ (commitment to the first permutation polynomial S_σ1(X))
S1: G1Point
# [S_σ2(x)]₁ (commitment to the second permutation polynomial S_σ2(X))
S2: G1Point
# [S_σ3(x)]₁ (commitment to the third permutation polynomial S_σ3(X))
S3: G1Point
# [x]₂ = xH, where H is a generator of G_2
X_2: G2Point
# nth root of unity, where n is the program's group order.
w: Scalar
# More optimized version that tries hard to minimize pairings and
# elliptic curve multiplications, but at the cost of being harder
# to understand and mixing together a lot of the computations to
# efficiently batch them
def verify_proof(self, group_order: int, pf, public=[]) -> bool:
# 4. Compute challenges
beta, gamma, alpha, zeta, v, u = self.compute_challenges(pf)
proof = pf.flatten()
# 5. Compute zero polynomial evaluation Z_H(ζ) = ζ^n - 1
root_of_unity = Scalar.root_of_unity(group_order)
ZH_ev = zeta**group_order - 1
# 6. Compute Lagrange polynomial evaluation L_0(ζ)
L0_ev = ZH_ev / (group_order * (zeta - 1))
# 7. Compute public input polynomial evaluation PI(ζ).
PI = Polynomial(
[Scalar(-x) for x in public]
+ [Scalar(0) for _ in range(group_order - len(public))],
Basis.LAGRANGE,
)
PI_ev = PI.barycentric_eval(zeta)
# Compute the constant term of R. This is not literally the degree-0
# term of the R polynomial; rather, it's the portion of R that can
# be computed directly, without resorting to elliptic curve commitments
r0 = (
PI_ev
- L0_ev * alpha**2
- (
alpha
* (proof["a_eval"] + beta * proof["s1_eval"] + gamma)
* (proof["b_eval"] + beta * proof["s2_eval"] + gamma)
* (proof["c_eval"] + gamma)
* proof["z_shifted_eval"]
)
)
# D = (R - r0) + u * Z
D_pt = ec_lincomb(
[
(self.Qm, proof["a_eval"] * proof["b_eval"]),
(self.Ql, proof["a_eval"]),
(self.Qr, proof["b_eval"]),
(self.Qo, proof["c_eval"]),
(self.Qc, 1),
(
proof["z_1"],
(
(proof["a_eval"] + beta * zeta + gamma)
* (proof["b_eval"] + beta * 2 * zeta + gamma)
* (proof["c_eval"] + beta * 3 * zeta + gamma)
* alpha
+ L0_ev * alpha**2
+ u
),
),
(
self.S3,
(
-(proof["a_eval"] + beta * proof["s1_eval"] + gamma)
* (proof["b_eval"] + beta * proof["s2_eval"] + gamma)
* alpha
* beta
* proof["z_shifted_eval"]
),
),
(proof["t_lo_1"], -ZH_ev),
(proof["t_mid_1"], -ZH_ev * zeta**group_order),
(proof["t_hi_1"], -ZH_ev * zeta ** (group_order * 2)),
]
)
F_pt = ec_lincomb(
[
(D_pt, 1),
(proof["a_1"], v),
(proof["b_1"], v**2),
(proof["c_1"], v**3),
(self.S1, v**4),
(self.S2, v**5),
]
)
E_pt = ec_mul(
b.G1,
(
-r0
+ v * proof["a_eval"]
+ v**2 * proof["b_eval"]
+ v**3 * proof["c_eval"]
+ v**4 * proof["s1_eval"]
+ v**5 * proof["s2_eval"]
+ u * proof["z_shifted_eval"]
),
)
# What's going on here is a clever re-arrangement of terms to check
# the same equations that are being checked in the basic version,
# but in a way that minimizes the number of EC muls and even
# compresses the two pairings into one. The 2 pairings -> 1 pairing
# trick is basically to replace checking
#
# Y1 = A * (X - a) and Y2 = B * (X - b)
#
# with
#
# Y1 + A * a = A * X
# Y2 + B * b = B * X
#
# so at this point we can take a random linear combination of the two
# checks, and verify it with only one pairing.
assert b.pairing(
self.X_2, ec_lincomb([(proof["W_z_1"], 1), (proof["W_zw_1"], u)])
) == b.pairing(
b.G2,
ec_lincomb(
[
(proof["W_z_1"], zeta),
(proof["W_zw_1"], u * zeta * root_of_unity),
(F_pt, 1),
(E_pt, -1),
]
),
)
print("done combined check")
return True
# Basic, easier-to-understand version of what's going on
def verify_proof_unoptimized(self, group_order: int, pf, public=[]) -> bool:
# 4. Compute challenges
beta, gamma, alpha, zeta, v, _ = self.compute_challenges(pf)
proof = pf.flatten()
# 5. Compute zero polynomial evaluation Z_H(ζ) = ζ^n - 1
root_of_unity = Scalar.root_of_unity(group_order)
ZH_ev = zeta**group_order - 1
# 6. Compute Lagrange polynomial evaluation L_0(ζ)
L0_ev = ZH_ev / (group_order * (zeta - 1))
# 7. Compute public input polynomial evaluation PI(ζ).
PI = Polynomial(
[Scalar(-x) for x in public]
+ [Scalar(0) for _ in range(group_order - len(public))],
Basis.LAGRANGE,
)
PI_ev = PI.barycentric_eval(zeta)
# Recover the commitment to the linearization polynomial R,
# exactly the same as what was created by the prover
R_pt = ec_lincomb(
[
(self.Qm, proof["a_eval"] * proof["b_eval"]),
(self.Ql, proof["a_eval"]),
(self.Qr, proof["b_eval"]),
(self.Qo, proof["c_eval"]),
(b.G1, PI_ev),
(self.Qc, 1),
(
proof["z_1"],
(
(proof["a_eval"] + beta * zeta + gamma)
* (proof["b_eval"] + beta * 2 * zeta + gamma)
* (proof["c_eval"] + beta * 3 * zeta + gamma)
* alpha
),
),
(
self.S3,
(
-(proof["a_eval"] + beta * proof["s1_eval"] + gamma)
* (proof["b_eval"] + beta * proof["s2_eval"] + gamma)
* beta
* alpha
* proof["z_shifted_eval"]
),
),
(
b.G1,
(
-(proof["a_eval"] + beta * proof["s1_eval"] + gamma)
* (proof["b_eval"] + beta * proof["s2_eval"] + gamma)
* (proof["c_eval"] + gamma)
* alpha
* proof["z_shifted_eval"]
),
),
(proof["z_1"], L0_ev * alpha**2),
(b.G1, -L0_ev * alpha**2),
(proof["t_lo_1"], -ZH_ev),
(proof["t_mid_1"], -ZH_ev * zeta**group_order),
(proof["t_hi_1"], -ZH_ev * zeta ** (group_order * 2)),
]
)
print("verifier R_pt", R_pt)
# Verify that R(z) = 0 and the prover-provided evaluations
# A(z), B(z), C(z), S1(z), S2(z) are all correct
assert b.pairing(
b.G2,
ec_lincomb(
[
(R_pt, 1),
(proof["a_1"], v),
(b.G1, -v * proof["a_eval"]),
(proof["b_1"], v**2),
(b.G1, -(v**2) * proof["b_eval"]),
(proof["c_1"], v**3),
(b.G1, -(v**3) * proof["c_eval"]),
(self.S1, v**4),
(b.G1, -(v**4) * proof["s1_eval"]),
(self.S2, v**5),
(b.G1, -(v**5) * proof["s2_eval"]),
]
),
) == b.pairing(b.add(self.X_2, ec_mul(b.G2, -zeta)), proof["W_z_1"])
print("done check 1")
# Verify that the provided value of Z(zeta*w) is correct
assert b.pairing(
b.G2, ec_lincomb([(proof["z_1"], 1), (b.G1, -proof["z_shifted_eval"])])
) == b.pairing(
b.add(self.X_2, ec_mul(b.G2, -zeta * root_of_unity)), proof["W_zw_1"]
)
print("done check 2")
return True
# Compute challenges (should be same as those computed by prover)
def compute_challenges(
self, proof
) -> tuple[Scalar, Scalar, Scalar, Scalar, Scalar, Scalar]:
transcript = Transcript(b"plonk")
beta, gamma = transcript.round_1(proof.msg_1)
alpha, _fft_cofactor = transcript.round_2(proof.msg_2)
zeta = transcript.round_3(proof.msg_3)
v = transcript.round_4(proof.msg_4)
u = transcript.round_5(proof.msg_5)
return beta, gamma, alpha, zeta, v, u
================================================
FILE: __init__.py
================================================
================================================
FILE: compiler/__init__.py
================================================
================================================
FILE: compiler/assembly.py
================================================
from utils import *
from .utils import *
from typing import Optional
from dataclasses import dataclass
@dataclass
class GateWires:
"""Variable names for Left, Right, and Output wires."""
L: Optional[str]
R: Optional[str]
O: Optional[str]
def as_list(self) -> list[Optional[str]]:
return [self.L, self.R, self.O]
@dataclass
class Gate:
"""Gate polynomial"""
L: Scalar
R: Scalar
M: Scalar
O: Scalar
C: Scalar
@dataclass
class AssemblyEqn:
"""Assembly equation mapping wires to coefficients."""
wires: GateWires
coeffs: dict[Optional[str], int]
def L(self) -> Scalar:
return Scalar(-self.coeffs.get(self.wires.L, 0))
def R(self) -> Scalar:
if self.wires.R != self.wires.L:
return Scalar(-self.coeffs.get(self.wires.R, 0))
return Scalar(0)
def C(self) -> Scalar:
return Scalar(-self.coeffs.get("", 0))
def O(self) -> Scalar:
return Scalar(self.coeffs.get("$output_coeff", 1))
def M(self) -> Scalar:
if None not in self.wires.as_list():
return Scalar(
-self.coeffs.get(get_product_key(self.wires.L, self.wires.R), 0)
)
return Scalar(0)
def gate(self) -> Gate:
return Gate(self.L(), self.R(), self.M(), self.O(), self.C())
# Converts an arithmetic expression containing numbers, variables and {+, -, *}
# into a mapping of term to coefficient
#
# For example:
# ['a', '+', 'b', '*', 'c', '*', '5'] becomes {'a': 1, 'b*c': 5}
#
# Note that this is a recursive algo, so the input can be a mix of tokens and
# mapping expressions
#
def evaluate(exprs: list[str], first_is_negative=False) -> dict[Optional[str], int]:
# Splits by + and - first, then *, to follow order of operations
# The first_is_negative flag helps us correctly interpret expressions
# like 6000 - 700 - 80 + 9 (that's 5229)
if "+" in exprs:
L = evaluate(exprs[: exprs.index("+")], first_is_negative)
R = evaluate(exprs[exprs.index("+") + 1 :], False)
return {x: L.get(x, 0) + R.get(x, 0) for x in set(L.keys()).union(R.keys())}
elif "-" in exprs:
L = evaluate(exprs[: exprs.index("-")], first_is_negative)
R = evaluate(exprs[exprs.index("-") + 1 :], True)
return {x: L.get(x, 0) + R.get(x, 0) for x in set(L.keys()).union(R.keys())}
elif "*" in exprs:
L = evaluate(exprs[: exprs.index("*")], first_is_negative)
R = evaluate(exprs[exprs.index("*") + 1 :], first_is_negative)
o = {}
for k1 in L.keys():
for k2 in R.keys():
o[get_product_key(k1, k2)] = L[k1] * R[k2]
return o
elif len(exprs) > 1:
raise Exception("No ops, expected sub-expr to be a unit: {}".format(exprs))
elif exprs[0][0] == "-":
return evaluate([exprs[0][1:]], not first_is_negative)
elif exprs[0].isnumeric():
return {"": int(exprs[0]) * (-1 if first_is_negative else 1)}
elif is_valid_variable_name(exprs[0]):
return {exprs[0]: -1 if first_is_negative else 1}
else:
raise Exception("Unrecognized token: {}".format(exprs[0]))
# Converts an equation to a mapping of term to coefficient, and verifies that
# the operations in the equation are valid.
#
# Also outputs a triple containing the L and R input variables and the output
# variable
#
# Think of the list of (variable triples, coeffs) pairs as this language's
# version of "assembly"
#
# Example valid equations, and output:
# a === 9 ([None, None, 'a'], {'': 9})
# b <== a * c (['a', 'c', 'b'], {'a*c': 1})
# d <== a * c - 45 * a + 987 (['a', 'c', 'd'], {'a*c': 1, 'a': -45, '': 987})
#
# Example invalid equations:
# 7 === 7 # Can't assign to non-variable
# a <== b * * c # Two times signs in a row
# e <== a + b * c * d # Multiplicative degree > 2
#
def eq_to_assembly(eq: str) -> AssemblyEqn:
tokens = eq.rstrip("\n").split(" ")
if tokens[1] in ("<==", "==="):
# First token is the output variable
out = tokens[0]
# Convert the expression to coefficient map form
coeffs = evaluate(tokens[2:])
# Handle the "-x === a * b" case
if out[0] == "-":
out = out[1:]
coeffs["$output_coeff"] = -1
# Check out variable name validity
if not is_valid_variable_name(out):
raise Exception("Invalid out variable name: {}".format(out))
# Gather list of variables used in the expression
variables = []
for t in tokens[2:]:
var = t.lstrip("-")
if is_valid_variable_name(var) and var not in variables:
variables.append(var)
# Construct the list of allowed coefficients
allowed_coeffs = variables + ["", "$output_coeff"]
if len(variables) == 0:
pass
elif len(variables) == 1:
variables.append(variables[0])
allowed_coeffs.append(get_product_key(*variables))
elif len(variables) == 2:
allowed_coeffs.append(get_product_key(*variables))
else:
raise Exception("Max 2 variables, found {}".format(variables))
# Check that only allowed coefficients are in the coefficient map
for key in coeffs.keys():
if key not in allowed_coeffs:
raise Exception("Disallowed multiplication: {}".format(key))
# Return output
wires = variables + [None] * (2 - len(variables)) + [out]
return AssemblyEqn(GateWires(wires[0], wires[1], wires[2]), coeffs)
elif tokens[1] == "public":
return AssemblyEqn(
GateWires(tokens[0], None, None),
{tokens[0]: -1, "$output_coeff": 0, "$public": True},
)
else:
raise Exception("Unsupported op: {}".format(tokens[1]))
================================================
FILE: compiler/program.py
================================================
# A simple zk language, reverse-engineered to match https://zkrepl.dev/ output
from utils import *
from .assembly import *
from .utils import *
from typing import Optional, Set
from poly import Polynomial, Basis
@dataclass
class CommonPreprocessedInput:
"""Common preprocessed input"""
group_order: int
# q_M(X) multiplication selector polynomial
QM: Polynomial
# q_L(X) left selector polynomial
QL: Polynomial
# q_R(X) right selector polynomial
QR: Polynomial
# q_O(X) output selector polynomial
QO: Polynomial
# q_C(X) constants selector polynomial
QC: Polynomial
# S_σ1(X) first permutation polynomial S_σ1(X)
S1: Polynomial
# S_σ2(X) second permutation polynomial S_σ2(X)
S2: Polynomial
# S_σ3(X) third permutation polynomial S_σ3(X)
S3: Polynomial
class Program:
constraints: list[AssemblyEqn]
group_order: int
def __init__(self, constraints: list[str], group_order: int):
if len(constraints) > group_order:
raise Exception("Group order too small")
assembly = [eq_to_assembly(constraint) for constraint in constraints]
self.constraints = assembly
self.group_order = group_order
def common_preprocessed_input(self) -> CommonPreprocessedInput:
L, R, M, O, C = self.make_gate_polynomials()
S = self.make_s_polynomials()
return CommonPreprocessedInput(
self.group_order,
M,
L,
R,
O,
C,
S[Column.LEFT],
S[Column.RIGHT],
S[Column.OUTPUT],
)
@classmethod
def from_str(cls, constraints: str, group_order: int):
lines = [line.strip() for line in constraints.split("\n")]
return cls(lines, group_order)
def coeffs(self) -> list[dict[Optional[str], int]]:
return [constraint.coeffs for constraint in self.constraints]
def wires(self) -> list[GateWires]:
return [constraint.wires for constraint in self.constraints]
def make_s_polynomials(self) -> dict[Column, Polynomial]:
# For each variable, extract the list of (column, row) positions
# where that variable is used
variable_uses: dict[Optional[str], Set[Cell]] = {None: set()}
for row, constraint in enumerate(self.constraints):
for column, value in zip(Column.variants(), constraint.wires.as_list()):
if value not in variable_uses:
variable_uses[value] = set()
variable_uses[value].add(Cell(column, row))
# Mark unused cells
for row in range(len(self.constraints), self.group_order):
for column in Column.variants():
variable_uses[None].add(Cell(column, row))
# For each list of positions, rotate by one.
#
# For example, if some variable is used in positions
# (LEFT, 4), (LEFT, 7) and (OUTPUT, 2), then we store:
#
# at S[LEFT][7] the field element representing (LEFT, 4)
# at S[OUTPUT][2] the field element representing (LEFT, 7)
# at S[LEFT][4] the field element representing (OUTPUT, 2)
S_values = {
Column.LEFT: [Scalar(0)] * self.group_order,
Column.RIGHT: [Scalar(0)] * self.group_order,
Column.OUTPUT: [Scalar(0)] * self.group_order,
}
for _, uses in variable_uses.items():
sorted_uses = sorted(uses)
for i, cell in enumerate(sorted_uses):
next_i = (i + 1) % len(sorted_uses)
next_column = sorted_uses[next_i].column
next_row = sorted_uses[next_i].row
S_values[next_column][next_row] = cell.label(self.group_order)
S = {}
S[Column.LEFT] = Polynomial(S_values[Column.LEFT], Basis.LAGRANGE)
S[Column.RIGHT] = Polynomial(S_values[Column.RIGHT], Basis.LAGRANGE)
S[Column.OUTPUT] = Polynomial(S_values[Column.OUTPUT], Basis.LAGRANGE)
return S
# Get the list of public variable assignments, in order
def get_public_assignments(self) -> list[Optional[str]]:
coeffs = self.coeffs()
o = []
no_more_allowed = False
for coeff in coeffs:
if coeff.get("$public", False) is True:
if no_more_allowed:
raise Exception("Public var declarations must be at the top")
var_name = [x for x in list(coeff.keys()) if "$" not in str(x)][0]
if coeff != {"$public": True, "$output_coeff": 0, var_name: -1}:
raise Exception("Malformatted coeffs: {}".format(coeffs))
o.append(var_name)
else:
no_more_allowed = True
return o
# Generate the gate polynomials: L, R, M, O, C,
# each a list of length `group_order`
def make_gate_polynomials(
self,
) -> tuple[Polynomial, Polynomial, Polynomial, Polynomial, Polynomial]:
L = [Scalar(0) for _ in range(self.group_order)]
R = [Scalar(0) for _ in range(self.group_order)]
M = [Scalar(0) for _ in range(self.group_order)]
O = [Scalar(0) for _ in range(self.group_order)]
C = [Scalar(0) for _ in range(self.group_order)]
for i, constraint in enumerate(self.constraints):
gate = constraint.gate()
L[i] = gate.L
R[i] = gate.R
M[i] = gate.M
O[i] = gate.O
C[i] = gate.C
return (
Polynomial(L, Basis.LAGRANGE),
Polynomial(R, Basis.LAGRANGE),
Polynomial(M, Basis.LAGRANGE),
Polynomial(O, Basis.LAGRANGE),
Polynomial(C, Basis.LAGRANGE),
)
# Attempts to "run" the program to fill in any intermediate variable
# assignments, starting from the given assignments. Eg. if
# `starting_assignments` contains {'a': 3, 'b': 5}, and the first line
# says `c <== a * b`, then it fills in `c: 15`.
def fill_variable_assignments(
self, starting_assignments: dict[Optional[str], int]
) -> dict[Optional[str], int]:
out = {k: Scalar(v) for k, v in starting_assignments.items()}
out[None] = Scalar(0)
for constraint in self.constraints:
wires = constraint.wires
coeffs = constraint.coeffs
in_L = wires.L
in_R = wires.R
output = wires.O
out_coeff = coeffs.get("$output_coeff", 1)
product_key = get_product_key(in_L, in_R)
if output is not None and out_coeff in (-1, 1):
new_value = (
Scalar(
coeffs.get("", 0)
+ out[in_L] * coeffs.get(in_L, 0)
+ out[in_R] * coeffs.get(in_R, 0) * (1 if in_R != in_L else 0)
+ out[in_L] * out[in_R] * coeffs.get(product_key, 0)
)
* out_coeff
) # should be / but equivalent for (1, -1)
if output in out:
if out[output] != new_value:
raise Exception(
"Failed assertion: {} = {}".format(out[output], new_value)
)
else:
out[output] = new_value
# print('filled in:', output, out[output])
return {k: v.n for k, v in out.items()}
================================================
FILE: compiler/utils.py
================================================
from utils import *
from enum import Enum
from dataclasses import dataclass
class Column(Enum):
LEFT = 1
RIGHT = 2
OUTPUT = 3
def __lt__(self, other):
if self.__class__ is other.__class__:
return self.value < other.value
return NotImplemented
@staticmethod
def variants():
return [Column.LEFT, Column.RIGHT, Column.OUTPUT]
@dataclass
class Cell:
column: Column
row: int
def __key(self):
return (self.row, self.column.value)
def __hash__(self):
return hash(self.__key())
def __lt__(self, other):
if self.__class__ is other.__class__:
return self.__key() < other.__key()
return NotImplemented
def __repr__(self) -> str:
return "(" + str(self.row) + ", " + str(self.column.value) + ")"
def __str__(self) -> str:
return "(" + str(self.row) + ", " + str(self.column.value) + ")"
# Outputs the label (an inner-field element) representing a given
# (column, row) pair. The column value is 1 for LEFT, 2 for RIGHT, 3 for OUTPUT
def label(self, group_order: int) -> Scalar:
assert self.row < group_order
return Scalar.roots_of_unity(group_order)[self.row] * self.column.value
# Gets the key to use in the coeffs dictionary for the term for key1*key2,
# where key1 and key2 can be constant(''), a variable, or product keys
# Note that degrees higher than 2 are disallowed in the compiler, but we
# still allow them in the parser in case we find a way to compile them later
def get_product_key(key1, key2):
members = sorted((key1 or "").split("*") + (key2 or "").split("*"))
return "*".join([x for x in members if x])
def is_valid_variable_name(name: str) -> bool:
return len(name) > 0 and name.isalnum() and name[0] not in "0123456789"
================================================
FILE: curve.py
================================================
from py_ecc.fields.field_elements import FQ as Field
import py_ecc.bn128 as b
from typing import NewType
primitive_root = 5
G1Point = NewType("G1Point", tuple[b.FQ, b.FQ])
G2Point = NewType("G2Point", tuple[b.FQ2, b.FQ2])
class Scalar(Field):
field_modulus = b.curve_order
# Gets the first root of unity of a given group order
@classmethod
def root_of_unity(cls, group_order: int):
return Scalar(5) ** ((cls.field_modulus - 1) // group_order)
# Gets the full list of roots of unity of a given group order
@classmethod
def roots_of_unity(cls, group_order: int):
o = [Scalar(1), cls.root_of_unity(group_order)]
while len(o) < group_order:
o.append(o[-1] * o[1])
return o
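The same construction can be checked with a toy stand-in field: F_17 instead of the BN254 scalar field, with primitive root 3 instead of 5 (both choices here are hypothetical, purely for a runnable example):

```python
# Toy analogue of Scalar.root_of_unity / roots_of_unity over F_17.
P = 17                 # toy field modulus; 3 generates the multiplicative group
PRIMITIVE_ROOT = 3

def root_of_unity(group_order):
    # (P - 1) must be divisible by group_order for this to be exact
    return pow(PRIMITIVE_ROOT, (P - 1) // group_order, P)

def roots_of_unity(group_order):
    o = [1, root_of_unity(group_order)]
    while len(o) < group_order:
        o.append(o[-1] * o[1] % P)
    return o

roots = roots_of_unity(4)
print(roots)                          # [1, 13, 16, 4]
assert pow(roots[1], 4, P) == 1       # w^n = 1
assert len(set(roots)) == 4           # all n roots are distinct
```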
Base = NewType("Base", b.FQ)
def ec_mul(pt, coeff):
if hasattr(coeff, "n"):
coeff = coeff.n
return b.multiply(pt, coeff % b.curve_order)
# Elliptic curve linear combination. A truly optimized implementation
# would replace this with a fast lin-comb algo, see https://ethresear.ch/t/7238
def ec_lincomb(pairs):
return lincomb(
[pt for (pt, _) in pairs],
[int(n) % b.curve_order for (_, n) in pairs],
b.add,
b.Z1,
)
# Equivalent to:
# o = b.Z1
# for pt, coeff in pairs:
# o = b.add(o, ec_mul(pt, coeff))
# return o
################################################################
# multicombs
################################################################
import random, sys, math
def multisubset(numbers, subsets, adder=lambda x, y: x + y, zero=0):
# Split up the numbers into partitions
partition_size = 1 + int(math.log(len(subsets) + 1))
# Align number count to partition size (for simplicity)
numbers = numbers[::]
while len(numbers) % partition_size != 0:
numbers.append(zero)
# Compute power set for each partition (eg. a, b, c -> {0, a, b, a+b, c, a+c, b+c, a+b+c})
power_sets = []
for i in range(0, len(numbers), partition_size):
new_power_set = [zero]
for dimension, value in enumerate(numbers[i : i + partition_size]):
new_power_set += [adder(n, value) for n in new_power_set]
power_sets.append(new_power_set)
# Compute subset sums, using elements from power set for each range of values
# ie. with a single power set lookup you can get the sum of _all_ elements in
# the range partition_size*k...partition_size*(k+1) that are in that subset
subset_sums = []
for subset in subsets:
o = zero
for i in range(len(power_sets)):
index_in_power_set = 0
for j in range(partition_size):
if i * partition_size + j in subset:
index_in_power_set += 2**j
o = adder(o, power_sets[i][index_in_power_set])
subset_sums.append(o)
return subset_sums
# Reduces a linear combination `numbers[0] * factors[0] + numbers[1] * factors[1] + ...`
# into a multi-subset problem, and computes the result efficiently
def lincomb(numbers, factors, adder=lambda x, y: x + y, zero=0):
# Maximum bit length of a number; how many subsets we need to make
maxbitlen = max(len(bin(f)) - 2 for f in factors)
# Compute the subsets: the ith subset contains the numbers whose corresponding factor
# has a 1 at the ith bit
subsets = [
{i for i in range(len(numbers)) if factors[i] & (1 << j)}
for j in range(maxbitlen + 1)
]
subset_sums = multisubset(numbers, subsets, adder=adder, zero=zero)
# For example, suppose a value V has factor 6 (011 in increasing-order binary). Subset 0
# will not have V, subset 1 will, and subset 2 will. So if we multiply the output of adding
# subset 0 with twice the output of adding subset 1, with four times the output of adding
# subset 2, then V will be represented 0 + 2 + 4 = 6 times. This reasoning applies for every
# value. So `subset_0_sum + 2 * subset_1_sum + 4 * subset_2_sum` gives us the result we want.
# Here, we compute this as `((subset_2_sum * 2) + subset_1_sum) * 2 + subset_0_sum` for
# efficiency: an extra `maxbitlen * 2` group operations.
o = zero
for i in range(len(subsets) - 1, -1, -1):
o = adder(adder(o, o), subset_sums[i])
return o
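The bit-decomposition argument in the comment above can be rerun numerically with a minimal self-contained sketch, using plain integer arithmetic in place of the group (the numbers and factors are arbitrary):

```python
# Minimal rerun of the lincomb reduction: split each factor into bits,
# sum the numbers per bit position, then recombine Horner-style.
numbers = [7, 11, 13]
factors = [6, 3, 5]          # 6 = 0b110, 3 = 0b011, 5 = 0b101
maxbits = max(f.bit_length() for f in factors)

# subset j holds the numbers whose factor has bit j set
subset_sums = [
    sum(numbers[i] for i in range(len(numbers)) if factors[i] & (1 << j))
    for j in range(maxbits)
]

# Horner-style recombination: ((s2 * 2) + s1) * 2 + s0
o = 0
for s in reversed(subset_sums):
    o = o + o + s

assert o == sum(n * f for n, f in zip(numbers, factors))
print(o)    # 7*6 + 11*3 + 13*5 = 140
```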
# Tests go here
def make_mock_adder():
counter = [0]
def adder(x, y):
if x and y:
counter[0] += 1
return x + y
return adder, counter
def test_multisubset(numcount, setcount):
numbers = [random.randrange(10**20) for _ in range(numcount)]
subsets = [
{i for i in range(numcount) if random.randrange(2)} for i in range(setcount)
]
adder, counter = make_mock_adder()
o = multisubset(numbers, subsets, adder=adder)
for output, subset in zip(o, subsets):
assert output == sum([numbers[x] for x in subset])
def test_lincomb(numcount, bitlength=256):
numbers = [random.randrange(10**20) for _ in range(numcount)]
factors = [random.randrange(2**bitlength) for _ in range(numcount)]
adder, counter = make_mock_adder()
o = lincomb(numbers, factors, adder=adder)
assert o == sum([n * f for n, f in zip(numbers, factors)])
total_ones = sum(bin(f).count("1") for f in factors)
print("Naive operation count: %d" % (bitlength * numcount + total_ones))
print("Optimized operation count: %d" % (bitlength * 2 + counter[0]))
print(
"Optimization factor: %.2f"
% ((bitlength * numcount + total_ones) / (bitlength * 2 + counter[0]))
)
if __name__ == "__main__":
test_lincomb(int(sys.argv[1]) if len(sys.argv) >= 2 else 80)
================================================
FILE: poly.py
================================================
from curve import Scalar
from enum import Enum
class Basis(Enum):
LAGRANGE = 1
MONOMIAL = 2
class Polynomial:
values: list[Scalar]
basis: Basis
def __init__(self, values: list[Scalar], basis: Basis):
assert all(isinstance(x, Scalar) for x in values)
assert isinstance(basis, Basis)
self.values = values
self.basis = basis
def __eq__(self, other):
return (self.basis == other.basis) and (self.values == other.values)
def __add__(self, other):
if isinstance(other, Polynomial):
assert len(self.values) == len(other.values)
assert self.basis == other.basis
return Polynomial(
[x + y for x, y in zip(self.values, other.values)],
self.basis,
)
else:
assert isinstance(other, Scalar)
if self.basis == Basis.LAGRANGE:
return Polynomial(
[x + other for x in self.values],
self.basis,
)
else:
return Polynomial(
[self.values[0] + other] + self.values[1:],
self.basis
)
def __sub__(self, other):
if isinstance(other, Polynomial):
assert len(self.values) == len(other.values)
assert self.basis == other.basis
return Polynomial(
[x - y for x, y in zip(self.values, other.values)],
self.basis,
)
else:
assert isinstance(other, Scalar)
if self.basis == Basis.LAGRANGE:
return Polynomial(
[x - other for x in self.values],
self.basis,
)
else:
return Polynomial(
[self.values[0] - other] + self.values[1:],
self.basis
)
def __mul__(self, other):
if isinstance(other, Polynomial):
assert self.basis == Basis.LAGRANGE
assert self.basis == other.basis
assert len(self.values) == len(other.values)
return Polynomial(
[x * y for x, y in zip(self.values, other.values)],
self.basis,
)
else:
assert isinstance(other, Scalar)
return Polynomial(
[x * other for x in self.values],
self.basis,
)
def __truediv__(self, other):
if isinstance(other, Polynomial):
assert self.basis == Basis.LAGRANGE
assert self.basis == other.basis
assert len(self.values) == len(other.values)
return Polynomial(
[x / y for x, y in zip(self.values, other.values)],
self.basis,
)
else:
assert isinstance(other, Scalar)
return Polynomial(
[x / other for x in self.values],
self.basis,
)
def shift(self, shift: int):
assert self.basis == Basis.LAGRANGE
assert shift < len(self.values)
return Polynomial(
self.values[shift:] + self.values[:shift],
self.basis,
)
# Convenience method to do FFTs specifically over the subgroup over which
# all of the proofs are operating
def fft(self, inv=False):
# Fast Fourier transform, used to convert between polynomial coefficients
# and a list of evaluations at the roots of unity
# See https://vitalik.ca/general/2019/05/12/fft.html
def _fft(vals, modulus, roots_of_unity):
if len(vals) == 1:
return vals
L = _fft(vals[::2], modulus, roots_of_unity[::2])
R = _fft(vals[1::2], modulus, roots_of_unity[::2])
o = [0] * len(vals)
for i, (x, y) in enumerate(zip(L, R)):
y_times_root = y * roots_of_unity[i]
o[i] = (x + y_times_root) % modulus
o[i + len(L)] = (x - y_times_root) % modulus
return o
roots = [x.n for x in Scalar.roots_of_unity(len(self.values))]
o, nvals = Scalar.field_modulus, [x.n for x in self.values]
if inv:
assert self.basis == Basis.LAGRANGE
# Inverse FFT
invlen = Scalar(1) / len(self.values)
reversed_roots = [roots[0]] + roots[1:][::-1]
return Polynomial(
[Scalar(x) * invlen for x in _fft(nvals, o, reversed_roots)],
Basis.MONOMIAL,
)
else:
assert self.basis == Basis.MONOMIAL
# Regular FFT
return Polynomial(
[Scalar(x) for x in _fft(nvals, o, roots)], Basis.LAGRANGE
)
def ifft(self):
return self.fft(True)
# Converts a list of evaluations at [1, w, w**2... w**(n-1)] to
# a list of evaluations at
# [offset, offset * q, offset * q**2 ... offset * q**(4n-1)] where q = w**(1/4)
# This lets us work with higher-degree polynomials, and the offset lets us
# avoid the 0/0 problem when computing a division (as long as the offset is
# chosen randomly)
def to_coset_extended_lagrange(self, offset):
assert self.basis == Basis.LAGRANGE
group_order = len(self.values)
x_powers = self.ifft().values
x_powers = [(offset**i * x) for i, x in enumerate(x_powers)] + [Scalar(0)] * (
group_order * 3
)
return Polynomial(x_powers, Basis.MONOMIAL).fft()
# Convert from offset form into coefficients
# Note that we can't make a full inverse function of to_coset_extended_lagrange
# because the output of this might be a deg >= n polynomial, which cannot
# be expressed via evaluations at n roots of unity
def coset_extended_lagrange_to_coeffs(self, offset):
assert self.basis == Basis.LAGRANGE
shifted_coeffs = self.ifft().values
inv_offset = 1 / offset
return Polynomial(
[v * inv_offset**i for (i, v) in enumerate(shifted_coeffs)],
Basis.MONOMIAL,
)
# Given a polynomial expressed as a list of evaluations at roots of unity,
# evaluate it at x directly, without using an FFT to convert to coeffs first
def barycentric_eval(self, x: Scalar):
assert self.basis == Basis.LAGRANGE
order = len(self.values)
roots_of_unity = Scalar.roots_of_unity(order)
return (
(Scalar(x) ** order - 1)
/ order
* sum(
[
value * root / (x - root)
for value, root in zip(self.values, roots_of_unity)
]
)
)
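The barycentric formula above can be sanity-checked against naive Lagrange interpolation over a toy field. F_17 and the sample values are hypothetical stand-ins for the real BN254 scalar field:

```python
# Check: the barycentric formula agrees with naive Lagrange interpolation.
P = 17
roots = [1, 13, 16, 4]        # the 4th roots of unity in F_17
values = [5, 2, 8, 3]         # evaluations at those roots
n = len(roots)

def inv(a):
    return pow(a, P - 2, P)   # Fermat inverse

def barycentric(x):
    # (x^n - 1) / n * sum(v_i * r_i / (x - r_i))
    acc = sum(v * r * inv(x - r) for v, r in zip(values, roots)) % P
    return (pow(x, n, P) - 1) * inv(n) * acc % P

def lagrange(x):
    # Naive Lagrange interpolation, for comparison
    total = 0
    for i, (v, ri) in enumerate(zip(values, roots)):
        term = v
        for j, rj in enumerate(roots):
            if j != i:
                term = term * (x - rj) * inv(ri - rj) % P
        total = (total + term) % P
    return total

for x in [2, 3, 5, 7]:        # points that are not roots of unity
    assert barycentric(x) == lagrange(x)
print("barycentric formula agrees with naive interpolation")
```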
================================================
FILE: prover.py
================================================
from compiler.program import Program, CommonPreprocessedInput
from utils import *
from setup import *
from typing import Optional
from dataclasses import dataclass
from transcript import Transcript, Message1, Message2, Message3, Message4, Message5
from poly import Polynomial, Basis
@dataclass
class Proof:
msg_1: Message1
msg_2: Message2
msg_3: Message3
msg_4: Message4
msg_5: Message5
def flatten(self):
proof = {}
proof["a_1"] = self.msg_1.a_1
proof["b_1"] = self.msg_1.b_1
proof["c_1"] = self.msg_1.c_1
proof["z_1"] = self.msg_2.z_1
proof["t_lo_1"] = self.msg_3.t_lo_1
proof["t_mid_1"] = self.msg_3.t_mid_1
proof["t_hi_1"] = self.msg_3.t_hi_1
proof["a_eval"] = self.msg_4.a_eval
proof["b_eval"] = self.msg_4.b_eval
proof["c_eval"] = self.msg_4.c_eval
proof["s1_eval"] = self.msg_4.s1_eval
proof["s2_eval"] = self.msg_4.s2_eval
proof["z_shifted_eval"] = self.msg_4.z_shifted_eval
proof["W_z_1"] = self.msg_5.W_z_1
proof["W_zw_1"] = self.msg_5.W_zw_1
return proof
@dataclass
class Prover:
group_order: int
setup: Setup
program: Program
pk: CommonPreprocessedInput
def __init__(self, setup: Setup, program: Program):
self.group_order = program.group_order
self.setup = setup
self.program = program
self.pk = program.common_preprocessed_input()
def prove(self, witness: dict[Optional[str], int]) -> Proof:
# Initialise Fiat-Shamir transcript
transcript = Transcript(b"plonk")
# Collect fixed and public information
# FIXME: Hash pk and PI into transcript
public_vars = self.program.get_public_assignments()
PI = Polynomial(
[Scalar(-witness[v]) for v in public_vars]
+ [Scalar(0) for _ in range(self.group_order - len(public_vars))],
Basis.LAGRANGE,
)
self.PI = PI
# Round 1
msg_1 = self.round_1(witness)
self.beta, self.gamma = transcript.round_1(msg_1)
# Round 2
msg_2 = self.round_2()
self.alpha, self.fft_cofactor = transcript.round_2(msg_2)
# Round 3
msg_3 = self.round_3()
self.zeta = transcript.round_3(msg_3)
# Round 4
msg_4 = self.round_4()
self.v = transcript.round_4(msg_4)
# Round 5
msg_5 = self.round_5()
return Proof(msg_1, msg_2, msg_3, msg_4, msg_5)
def round_1(
self,
witness: dict[Optional[str], int],
) -> Message1:
program = self.program
setup = self.setup
group_order = self.group_order
if None not in witness:
witness[None] = 0
# Compute wire assignments for A, B, C, corresponding:
# - A_values: witness[program.wires()[i].L]
# - B_values: witness[program.wires()[i].R]
# - C_values: witness[program.wires()[i].O]
# Construct A, B, C Lagrange interpolation polynomials for
# A_values, B_values, C_values
# Compute a_1, b_1, c_1 commitments to A, B, C polynomials
# Sanity check that witness fulfils gate constraints
assert (
self.A * self.pk.QL
+ self.B * self.pk.QR
+ self.A * self.B * self.pk.QM
+ self.C * self.pk.QO
+ self.PI
+ self.pk.QC
== Polynomial([Scalar(0)] * group_order, Basis.LAGRANGE)
)
# Return a_1, b_1, c_1
return Message1(a_1, b_1, c_1)
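The gate equation asserted in the sanity check above can be seen on a single toy gate. A multiplication gate c = a * b uses the standard PlonK selector choice QM = 1, QO = -1 and all other selectors 0; the field F_17 and the wire values are hypothetical:

```python
# Toy check of A*QL + B*QR + A*B*QM + C*QO + PI + QC == 0 for one
# multiplication gate c = a * b (hypothetical values over F_17).
P = 17
a, b = 3, 5
c = a * b % P
QL = QR = QC = 0
QM, QO = 1, -1
PI = 0
lhs = (a * QL + b * QR + a * b * QM + c * QO + PI + QC) % P
assert lhs == 0
print("gate constraint satisfied:", lhs)
```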
def round_2(self) -> Message2:
group_order = self.group_order
setup = self.setup
# Using A, B, C values, and pk.S1, pk.S2, pk.S3, compute
# Z_values for permutation grand product polynomial Z
#
# Note the convenience function:
# self.rlc(term_1, term_2) = term_1 + self.beta * term_2 + self.gamma
# Check that the last term Z_n = 1
assert Z_values.pop() == 1
# Sanity-check that Z was computed correctly
for i in range(group_order):
assert (
self.rlc(self.A.values[i], roots_of_unity[i])
* self.rlc(self.B.values[i], 2 * roots_of_unity[i])
* self.rlc(self.C.values[i], 3 * roots_of_unity[i])
) * Z_values[i] - (
self.rlc(self.A.values[i], self.pk.S1.values[i])
* self.rlc(self.B.values[i], self.pk.S2.values[i])
* self.rlc(self.C.values[i], self.pk.S3.values[i])
) * Z_values[
(i + 1) % group_order
] == 0
# Construct Z, Lagrange interpolation polynomial for Z_values
# Compute z_1 commitment to Z polynomial
# Return z_1
return Message2(z_1)
def round_3(self) -> Message3:
group_order = self.group_order
setup = self.setup
# Compute the quotient polynomial
# List of roots of unity at 4x fineness, i.e. the powers of µ
# where µ^(4n) = 1
# Using self.fft_expand, move A, B, C into coset extended Lagrange basis
# Expand public inputs polynomial PI into coset extended Lagrange
# Expand selector polynomials pk.QL, pk.QR, pk.QM, pk.QO, pk.QC
# into the coset extended Lagrange basis
# Expand permutation grand product polynomial Z into coset extended
# Lagrange basis
# Expand shifted Z(ω) into coset extended Lagrange basis
# Expand permutation polynomials pk.S1, pk.S2, pk.S3 into coset
# extended Lagrange basis
# Compute Z_H = X^N - 1, also in evaluation form in the coset
# Compute L0, the Lagrange basis polynomial that evaluates to 1 at x = 1 = ω^0
# and 0 at other roots of unity
# Expand L0 into the coset extended Lagrange basis
L0_big = self.fft_expand(
Polynomial([Scalar(1)] + [Scalar(0)] * (group_order - 1), Basis.LAGRANGE)
)
# Compute the quotient polynomial (called T(x) in the paper)
# It is only possible to construct this polynomial if the following
# equations are true at all roots of unity {1, w ... w^(n-1)}:
# 1. All gates are correct:
# A * QL + B * QR + A * B * QM + C * QO + PI + QC = 0
#
# 2. The permutation accumulator is valid:
# Z(wx) = Z(x) * (rlc of A, X, 1) * (rlc of B, 2X, 1) *
# (rlc of C, 3X, 1) / (rlc of A, S1, 1) /
# (rlc of B, S2, 1) / (rlc of C, S3, 1)
# rlc = random linear combination: term_1 + beta * term_2 + gamma * term_3
#
# 3. The permutation accumulator equals 1 at the start point
# (Z - 1) * L0 = 0
# L0 = Lagrange polynomial, equal to 1 at x = 1 and 0 at all other roots of unity
# Sanity check: QUOT has degree < 3n
assert (
self.expanded_evals_to_coeffs(QUOT_big).values[-group_order:]
== [0] * group_order
)
print("Generated the quotient polynomial")
# Split up T into T1, T2 and T3 (needed because T has degree 3n - 4, so is
# too big for the trusted setup)
# Sanity check that we've computed T1, T2, T3 correctly
assert (
T1.barycentric_eval(fft_cofactor)
+ T2.barycentric_eval(fft_cofactor) * fft_cofactor**group_order
+ T3.barycentric_eval(fft_cofactor) * fft_cofactor ** (group_order * 2)
) == QUOT_big.values[0]
print("Generated T1, T2, T3 polynomials")
# Compute commitments t_lo_1, t_mid_1, t_hi_1 to T1, T2, T3 polynomials
# Return t_lo_1, t_mid_1, t_hi_1
return Message3(t_lo_1, t_mid_1, t_hi_1)
def round_4(self) -> Message4:
# Compute evaluations to be used in constructing the linearization polynomial.
# Compute a_eval = A(zeta)
# Compute b_eval = B(zeta)
# Compute c_eval = C(zeta)
# Compute s1_eval = pk.S1(zeta)
# Compute s2_eval = pk.S2(zeta)
# Compute z_shifted_eval = Z(zeta * ω)
# Return a_eval, b_eval, c_eval, s1_eval, s2_eval, z_shifted_eval
return Message4(a_eval, b_eval, c_eval, s1_eval, s2_eval, z_shifted_eval)
def round_5(self) -> Message5:
# Evaluate the Lagrange basis polynomial L0 at zeta
# Evaluate the vanishing polynomial Z_H(X) = X^n - 1 at zeta
# Move T1, T2, T3 into the coset extended Lagrange basis
# Move pk.QL, pk.QR, pk.QM, pk.QO, pk.QC into the coset extended Lagrange basis
# Move Z into the coset extended Lagrange basis
# Move pk.S3 into the coset extended Lagrange basis
# Compute the "linearization polynomial" R. This is a clever way to avoid
# needing to provide evaluations of _all_ the polynomials that we are
# checking an equation between: instead, we can "skip" the first
# multiplicand in each term. The idea is that we construct a
# polynomial which is constructed to equal 0 at Z only if the equations
# that we are checking are correct, and which the verifier can reconstruct
# the KZG commitment to, and we provide proofs to verify that it actually
# equals 0 at Z
#
# In order for the verifier to be able to reconstruct the commitment to R,
# it has to be "linear" in the proof items, hence why we can only use each
# proof item once; any further multiplicands in each term need to be
# replaced with their evaluations at Z, which do still need to be provided
# Commit to R
# Sanity-check R
assert R.barycentric_eval(zeta) == 0
print("Generated linearization polynomial R")
# Generate proof that W(z) = 0 and that the provided evaluations of
# A, B, C, S1, S2 are correct
# Move A, B, C into the coset extended Lagrange basis
# Move pk.S1, pk.S2 into the coset extended Lagrange basis
# In the COSET EXTENDED LAGRANGE BASIS,
# Construct W_Z = (
# R
# + v * (A - a_eval)
# + v**2 * (B - b_eval)
# + v**3 * (C - c_eval)
# + v**4 * (S1 - s1_eval)
# + v**5 * (S2 - s2_eval)
# ) / (X - zeta)
# Check that degree of W_z is not greater than n
assert W_z_coeffs[group_order:] == [0] * (group_order * 3)
# Compute W_z_1 commitment to W_z
# Generate proof that the provided evaluation of Z(z*w) is correct. This
# awkwardly different term is needed because the permutation accumulator
# polynomial Z is the one place where we have to check between adjacent
# coordinates, and not just within one coordinate.
# In other words: Compute W_zw = (Z - z_shifted_eval) / (X - zeta * ω)
# Check that degree of W_zw is not greater than n
assert W_zw_coeffs[group_order:] == [0] * (group_order * 3)
# Compute W_zw_1 commitment to W_zw
print("Generated final quotient witness polynomials")
# Return W_z_1, W_zw_1
return Message5(W_z_1, W_zw_1)
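Both quotient witnesses above rest on the fact that P(X) - P(z) is exactly divisible by (X - z). A self-contained sketch over a toy field F_17 (the coefficients and the point z are hypothetical):

```python
# Sketch: (P(X) - P(z)) / (X - z) divides exactly, which is the fact
# behind the quotient witnesses W_z and W_zw. Toy field F_17.
P_MOD = 17
coeffs = [4, 7, 2, 9]          # P(X) = 4 + 7X + 2X^2 + 9X^3
z = 5

def eval_poly(cs, x):
    return sum(c * pow(x, i, P_MOD) for i, c in enumerate(cs)) % P_MOD

pz = eval_poly(coeffs, z)
# Subtract the evaluation, then synthetic-divide by (X - z),
# processing coefficients from the highest degree down
shifted = coeffs[:]
shifted[0] = (shifted[0] - pz) % P_MOD
quotient = []
rem = 0
for c in reversed(shifted):
    rem = (c + rem * z) % P_MOD
    quotient.append(rem)
quotient = quotient[:-1][::-1]  # drop the remainder, restore low-to-high order
assert rem == 0                 # the division leaves no remainder
# Cross-check: quotient * (X - z) + P(z) reproduces P at a few points
for x in [2, 3, 11]:
    assert (eval_poly(quotient, x) * (x - z) + pz) % P_MOD == eval_poly(coeffs, x)
print("quotient witness divides exactly")
```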
def fft_expand(self, x: Polynomial):
return x.to_coset_extended_lagrange(self.fft_cofactor)
def expanded_evals_to_coeffs(self, x: Polynomial):
return x.coset_extended_lagrange_to_coeffs(self.fft_cofactor)
def rlc(self, term_1, term_2):
return term_1 + term_2 * self.beta + self.gamma
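The rlc helper above drives the round_2 grand product. A toy walkthrough with the identity permutation (S1 = X, S2 = 2X, S3 = 3X) shows why the accumulator must return to 1: every step's numerator equals its denominator. The field F_17, the wire values, and beta/gamma are hypothetical stand-ins:

```python
# Toy round_2 grand product with the identity permutation over F_17.
P = 17
beta, gamma = 3, 7             # in the real prover these come from the transcript
roots = [1, 13, 16, 4]         # 4th roots of unity in F_17
A = [2, 6, 9, 11]              # arbitrary wire values
B = [6, 1, 4, 8]
C = [3, 3, 7, 2]
S1 = roots                     # identity permutation
S2 = [2 * r % P for r in roots]
S3 = [3 * r % P for r in roots]

def rlc(t1, t2):
    return (t1 + beta * t2 + gamma) % P

def inv(a):
    return pow(a, P - 2, P)

Z = [1]
for i in range(4):
    num = rlc(A[i], roots[i]) * rlc(B[i], 2 * roots[i]) * rlc(C[i], 3 * roots[i])
    den = rlc(A[i], S1[i]) * rlc(B[i], S2[i]) * rlc(C[i], S3[i])
    Z.append(Z[-1] * num % P * inv(den) % P)
assert Z.pop() == 1            # the accumulator closes after a full cycle
print(Z)                       # stays at 1 for the identity permutation
```

A real permutation reorders the (wire, label) pairs, which reshuffles the factors between steps but leaves the product over the full cycle unchanged, so the check Z_n = 1 still holds.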
================================================
FILE: pyproject.toml
================================================
[tool.poetry]
name = "plonkathon"
version = "0.1.0"
description = "A simple Python implementation of PLONK adapted from py_plonk"
authors = ["0xPARC / Vitalik Buterin"]
license = "MIT"
readme = "README.md"
[tool.poetry.dependencies]
python = "^3.9"
py-ecc = "^6.0.0"
merlin = {git = "https://github.com/nalinbhardwaj/curdleproofs.pie", rev = "master", subdirectory = "merlin"}
[tool.poetry.group.dev.dependencies]
mypy = "^0.991"
black = "^22.12.0"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.mypy]
explicit_package_bases = true
================================================
FILE: setup.py
================================================
from utils import *
import py_ecc.bn128 as b
from curve import ec_lincomb, G1Point, G2Point
from compiler.program import CommonPreprocessedInput
from verifier import VerificationKey
from dataclasses import dataclass
from poly import Polynomial, Basis
# Recover the trusted setup from a file in the format used in
# https://github.com/iden3/snarkjs#7-prepare-phase-2
SETUP_FILE_G1_STARTPOS = 80
SETUP_FILE_POWERS_POS = 60
@dataclass
class Setup(object):
# ([1]₁, [x]₁, ..., [x^{d-1}]₁)
# = ( G, xG, ..., x^{d-1}G ), where G is a generator of G_1
powers_of_x: list[G1Point]
# [x]₂ = xH, where H is a generator of G_2
X2: G2Point
@classmethod
def from_file(cls, filename):
contents = open(filename, "rb").read()
# Byte 60 gives you the base-2 log of how many powers there are
powers = 2 ** contents[SETUP_FILE_POWERS_POS]
# Extract G1 points, which start at byte 80
values = [
int.from_bytes(contents[i : i + 32], "little")
for i in range(
SETUP_FILE_G1_STARTPOS, SETUP_FILE_G1_STARTPOS + 32 * powers * 2, 32
)
]
assert max(values) < b.field_modulus
# The points are stored in an unusual encoding, in which all x and y
# coordinates are multiplied by a constant factor (presumably for
# Montgomery-form optimization). We can extract the factor because we
# know the first point is the generator.
factor = b.FQ(values[0]) / b.G1[0]
values = [b.FQ(x) / factor for x in values]
powers_of_x = [(values[i * 2], values[i * 2 + 1]) for i in range(powers)]
print("Extracted G1 side, X^1 point: {}".format(powers_of_x[1]))
# Search for start of G2 points. We again know that the first point is
# the generator.
pos = SETUP_FILE_G1_STARTPOS + 32 * powers * 2
target = (factor * b.G2[0].coeffs[0]).n
while pos < len(contents):
v = int.from_bytes(contents[pos : pos + 32], "little")
if v == target:
break
pos += 1
print("Detected start of G2 side at byte {}".format(pos))
X2_encoding = contents[pos + 32 * 4 : pos + 32 * 8]
X2_values = [
b.FQ(int.from_bytes(X2_encoding[i : i + 32], "little")) / factor
for i in range(0, 128, 32)
]
X2 = (b.FQ2(X2_values[:2]), b.FQ2(X2_values[2:]))
assert b.is_on_curve(X2, b.b2)
print("Extracted G2 side, X^1 point: {}".format(X2))
# assert b.pairing(b.G2, powers_of_x[1]) == b.pairing(X2, b.G1)
# print("X^1 points checked consistent")
return cls(powers_of_x, X2)
# Encodes the KZG commitment to the polynomial that evaluates to the given values over the group
def commit(self, values: Polynomial) -> G1Point:
assert values.basis == Basis.LAGRANGE
# Run inverse FFT to convert values from Lagrange basis to monomial basis
# Optional: Check values size does not exceed maximum power setup can handle
# Compute linear combination of setup with values
return NotImplemented
# Generate the verification key for this program with the given setup
def verification_key(self, pk: CommonPreprocessedInput) -> VerificationKey:
# Create the appropriate VerificationKey object
return NotImplemented
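The commit() recipe above (inverse-FFT the Lagrange evaluations into monomial coefficients, then take a linear combination with the setup's powers of x) can be sketched in a degenerate toy setting: the "group" is just F_17 itself with a made-up secret x = 5, so the commitment collapses to evaluating the interpolated polynomial at x. Everything here is a hypothetical stand-in, not the real BN254 setup:

```python
# Toy commit(): IFFT to coefficients, then lincomb with powers of x.
P = 17
W = 13                         # 4th root of unity in F_17 (pow(13, 4, 17) == 1)
x_secret = 5                   # in a real trusted setup this value is never known
powers_of_x = [pow(x_secret, i, P) for i in range(4)]

def inv(a):
    return pow(a, P - 2, P)

def ifft(values):
    # Naive inverse DFT: coeffs[j] = (1/n) * sum_i values[i] * w^(-i*j)
    n, w_inv = len(values), inv(W)
    return [
        inv(n) * sum(v * pow(w_inv, i * j, P) for i, v in enumerate(values)) % P
        for j in range(n)
    ]

def commit(values):
    coeffs = ifft(values)
    # lincomb of the setup's powers with the monomial coefficients
    return sum(c * g for c, g in zip(coeffs, powers_of_x)) % P

values = [5, 2, 8, 3]          # evaluations at the roots [1, 13, 16, 4]
coeffs = ifft(values)
# The coefficients interpolate the values at the roots of unity...
for r, v in zip([1, 13, 16, 4], values):
    assert sum(c * pow(r, j, P) for j, c in enumerate(coeffs)) % P == v
# ...and the toy "commitment" is the polynomial evaluated at the secret
assert commit(values) == sum(c * pow(x_secret, j, P) for j, c in enumerate(coeffs)) % P
print("toy commitment:", commit(values))
```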
================================================
FILE: test/__init__.py
================================================
================================================
FILE: test/main.plonk.vkey-58.json
================================================
{
"protocol": "plonk",
"curve": "bn128",
"nPublic": 0,
"power": 3,
"k1": "2",
"k2": "3",
"Qm": [
"10294367845524522889674980414658158979115219665406612861401259333422895729896",
"17339696279167455564514853058684962930296864414660175742312401951183098671156",
"1"
],
"Ql": [
"14297155691368363150439281660551929853142513799648244067851273621337387750022",
"12012534117624137096359211205114297110997558611632571627258110765766724342420",
"1"
],
"Qr": [
"14297155691368363150439281660551929853142513799648244067851273621337387750022",
"9875708754215138125887194540142977977698752545665252035430927128878501866163",
"1"
],
"Qo": [
"9979011916860674833876589507703522468002507052909770370053733666594800845094",
"14729630273693641999471892596240127436204275281179182844090045631345819602206",
"1"
],
"Qc": [
"0",
"1",
"0"
],
"S1": [
"12298616283483778451735558489316417273025867766441336478133033639422100496973",
"15676047185797786755474337421099990612245072514136746262179062664360148735456",
"1"
],
"S2": [
"3285281999993628756611785365316971613389533139400689700975461452204136786218",
"1617843731485072605069101897885581356842856822348044818321641451407219765267",
"1"
],
"S3": [
"15275313728188330177132414910553475149781975424851628063518044954012175053863",
"6189857575419431040150205313691546343485409281989411607909724166107587561616",
"1"
],
"X_2": [
[
"21831381940315734285607113342023901060522397560371972897001948545212302161822",
"17231025384763736816414546592865244497437017442647097510447326538965263639101"
],
[
"2388026358213174446665280700919698872609886601280537296205114254867301080648",
"11507326595632554467052522095592665270651932854513688777769618397986436103170"
],
[
"1",
"0"
]
],
"w": "19540430494807482326159819597004422086093766032135589407132600596362845576832"
}
================================================
FILE: test/main.plonk.vkey-59.json
================================================
{
"protocol": "plonk",
"curve": "bn128",
"nPublic": 1,
"power": 3,
"k1": "2",
"k2": "3",
"Qm": [
"10294367845524522889674980414658158979115219665406612861401259333422895729896",
"17339696279167455564514853058684962930296864414660175742312401951183098671156",
"1"
],
"Ql": [
"14297155691368363150439281660551929853142513799648244067851273621337387750022",
"9875708754215138125887194540142977977698752545665252035430927128878501866163",
"1"
],
"Qr": [
"0",
"1",
"0"
],
"Qo": [
"10294367845524522889674980414658158979115219665406612861401259333422895729896",
"4548546592671819657731552686572312158399446742637647920376635943462127537427",
"1"
],
"Qc": [
"0",
"1",
"0"
],
"S1": [
"2694761611667402433549058650401049833973608710551146850129171008254242491412",
"4407815622841625592989621790140274705225113068749062773093818734897989348824",
"1"
],
"S2": [
"174950894878901504258554221888959060942005622520060188523927601854050691737",
"4218570225917094256073281485929194442981718242484973947497628954679969816940",
"1"
],
"S3": [
"5287191920074181963852791792640835974097875195213946104826736553520071559657",
"20309358409622118500558363825899879958276827259299017541094321370900713472551",
"1"
],
"X_2": [
[
"21831381940315734285607113342023901060522397560371972897001948545212302161822",
"17231025384763736816414546592865244497437017442647097510447326538965263639101"
],
[
"2388026358213174446665280700919698872609886601280537296205114254867301080648",
"11507326595632554467052522095592665270651932854513688777769618397986436103170"
],
[
"1",
"0"
]
],
"w": "19540430494807482326159819597004422086093766032135589407132600596362845576832"
}
================================================
FILE: test/main.plonk.vkey.json
================================================
{
"protocol": "plonk",
"curve": "bn128",
"nPublic": 0,
"power": 3,
"k1": "2",
"k2": "3",
"Qm": [
"14297155691368363150439281660551929853142513799648244067851273621337387750022",
"12012534117624137096359211205114297110997558611632571627258110765766724342420",
"1"
],
"Ql": [
"0",
"1",
"0"
],
"Qr": [
"0",
"1",
"0"
],
"Qo": [
"14297155691368363150439281660551929853142513799648244067851273621337387750022",
"9875708754215138125887194540142977977698752545665252035430927128878501866163",
"1"
],
"Qc": [
"0",
"1",
"0"
],
"S1": [
"3514537020795837778176804346752472700561147058607627192700022438897457917853",
"8628914924381244881833041972511347057770541232212012588202494401021191294886",
"1"
],
"S2": [
"10270239105545965856893095866462682446854660969221881327142885898762754802290",
"18254124398103062012489482857980956345119701560599595388652594443232359794243",
"1"
],
"S3": [
"7581161054812340292915025932276548889828390062297736519460377063075571422119",
"11644391033980170207157104232723275903489480225585820213777359398785819753031",
"1"
],
"X_2": [
[
"21831381940315734285607113342023901060522397560371972897001948545212302161822",
"17231025384763736816414546592865244497437017442647097510447326538965263639101"
],
[
"2388026358213174446665280700919698872609886601280537296205114254867301080648",
"11507326595632554467052522095592665270651932854513688777769618397986436103170"
],
[
"1",
"0"
]
],
"w": "19540430494807482326159819597004422086093766032135589407132600596362845576832"
}
================================================
FILE: test/mini_poseidon.py
================================================
from py_ecc.fields.field_elements import FQ as Field
from py_ecc import bn128 as b
import json
from curve import Scalar
# Mimics the Poseidon hash for params:
#
# p = b.curve_order
# security level = 128
# alpha = 5
# input size = 2
# t (inner state size) = 3
# full round count = 8 (4 on each side)
# partial round count = 56
#
# Tested compatible with the implementation at
# https://github.com/ingonyama-zk/poseidon-hash
rc = [
[Scalar(a), Scalar(b), Scalar(c)]
for (a, b, c) in json.load(open("test/poseidon_rc.json"))
]
mds = [Scalar(1) / i for i in range(3, 8)]
def poseidon_hash(in1, in2):
L, M, R = Scalar(in1), Scalar(in2), Scalar(0)
for i in range(64):
L = (L + rc[i][0]) ** 5
M += rc[i][1]
R += rc[i][2]
if i < 4 or i >= 60:
M = M**5
R = R**5
(L, M, R) = (
(L * mds[0] + M * mds[1] + R * mds[2]),
(L * mds[1] + M * mds[2] + R * mds[3]),
(L * mds[2] + M * mds[3] + R * mds[4]),
)
return M
================================================
FILE: test/poseidon_rc.json
================================================
[
[
20036611579827150559091469005844175073625940102952070649817884191797764107075,
12833584042949159986565784794014151972247796182705941809242049488642050965764,
20460265239335923814507658708649753625122635556480788121847315929099732630160
],
[
2327433556176440050072122789158937095349015830966296688052004527413893088211,
18928950063089170035907725405866336756839578021518946851617532132424258251714,
1498343560697108822498273562774300261830404487630763796790963549875888044170
],
[
10894060797649430325830102265601652830696874259935838091685396334824506870974,
10724282231429846920698162462239380161919181405938370346044877808218935812722,
12900822782410361357346122719301635014682733276151991957222022994303272655598
],
[
12281187892168564254318627327513968787860396596558174241050192282039059721015,
20650821582781615100585145227212437582142977940055542470636589623435638972974,
10629768372315475914205126397268924392851627409938344639024631693382468091238
],
[
9863412385841952362091849999302344670551536862597895561278739824943955436428,
12647165719356049717276082841324782064533712829755618494899697790036659206896,
7086910375195160419007414095698529055230556352178368086440281946317648386880
],
[
3534616497502400285225524656623573157846352519939301302938694874082052072531,
7614193770971378304733014379518957809467088529876055633743098033063991937988,
5524517084220026554140171636731805466222414790303448093874175843707842149296
],
[
6465594825680448812389988135904471523354179427789234651961801931723687100590,
13069961729933398081974430219710353490873542026760226429212509555519019152029,
20442858075434477469830277030056901344749316967720547608602986649544991223254
],
[
2545061208850080799716748664877827143438576541109233212439269028123126995283,
10590804083025333327978466704873260869373451484886744716135293454855421776182,
2971801848580094041349173303826341900320100639936132696391170461147651791599
],
[
6457488363353205666653234338157121013068090150457850441006724736538329117150,
6828448758803946230526683011338306528814963654399318058641145666219257935090,
21220571338383411884578252416062534137998565615530587243864120751033430171639
],
[
3026330605417380631686907051427895875297995068925155493166857059368062462182,
731369258216230615295136666016875570919026708063642185179381726677850241678,
3976056293038938116541801753309319302894026070844887224129384450572018141696
],
[
18191771098193274629372635247562817907522214770330118615218435763543549998849,
5725653743840821273222706240430501486301231493172435748281584336969930288424,
11315632284569557454743403424858271195359234409995993988812038704370836780935
],
[
1554548474507260015710470936845569353450016357484523407182389210520378329681,
4043703307392904605083268263455220115794158610223459176828550627728253474746,
5147341731800747507000152527876279909425699775431448328748957228886538246347
],
[
316000240737142429159443531715814797123772606639483557601339985787766020188,
9201639581870600108429525788569689133841787929111628476773761218482387877013,
76452313668236664875560655448227033496493983051536430171258861062052191226
],
[
5503629712269636063700501314848921574392241749467380315893984871270736181455,
6403315002483104841708481320831358853734637389275574672824410469941470947501,
18767770632814906108167838116228345063195454403262264488888241387554264238660
],
[
373847975877840160326133445669241099117560443926620032996032499044947786206,
21583421457099893419202355352294471270616995660291443431182584120442735921995,
9983253793391637270425117523681631923584433357117997691348261314215409417977
],
[
15636580057941931488663342701921482616732902735747145545560675417041959528168,
13987037342707768451704041319080865947736730484739494822133564732867708182129,
17012807810939334924306150791435050869173026502739324471672493058506335677602
],
[
20445006141370558272839096434547759512337597485395700957462476332341263470795,
8919512276260973838722894975323643726180891487427800112399328013257265098884,
5121935724947519798577565968119435009768355186616305612599005246528173860072
],
[
14390107556389236497954661819339690487046314759239155128415464040636056995180,
15584006578697113728990467783042934569634784959449117771168091586065677667289,
1894113083403807703900670244801865115743187370284525340414788037116278755190
],
[
10705487296676443016268478693800719585977460625398657826409495078078625785779,
19873128801331021326057263165428495080386730321288221112754217581989746046187,
9535868612309525085410638759100096355215966759273780340615254181304597180876
],
[
19343524237095964073059512688602276354221693507832098135091590955813103860375,
6822002759027581661397720296774923367218318024525870595858692787302610345838,
20926315677022871764300366782923548421637739455364254388830334528197327379743
],
[
21136323309320513006223521228388165385265667945268848018308200096388016784087,
6225919113652258029432904641421409709120407088870082365870799895213524311811,
21311480157558702908741067111471547471593313246226635294078364811203226453373
],
[
14198190262525448445981718442399104668575545443532459171696901751739593839857,
12459931960399741184291910088669153631848894479486118281597822681015713751918,
6907332911214986094624912982222847460679912779076715919016253686198271600395
],
[
3393759605179403680839415959192212038674751416897927769901727596993250674985,
16080342864802322672358399934807873723002376453489101187414456500323890285012,
16917851076945834679745654051495359311013500015444379399867812928847812812510
],
[
5864181025763406655115777223889072827293022750325008731169039770623910915277,
10954281474369541712624748628862576225398375445664590376470032986861704444851,
13047511983320021446527738191176357616721874803157974120155396052149484216634
],
[
17384764166393886603954067639141617082485023441458666471633998399316404519396,
2362259915869885612166233838782699563177506073802792748904427749547307239091,
3449611877995076637446149781868130604254776727586032907573864421212958534148
],
[
8105036071757751706553034047983212897131570345551297243248892142702129409678,
4057622885794382454162574102227879765926272192471614700281108386186208378108,
19253655284986776930766796883935217167812408888646382124396693895764793065618
],
[
14562077372515720927783340475811850803822078325552042696568002577429594161808,
7014243471382661219949882659461009076113783570411660059539677464905457050066,
11119991552991434794635736236084481059792337090668881972430146540524522150444
],
[
6920339933326447431851243390609339708453004519540083864540711586753457519075,
11162222425158351635121226742909449004096853940732577587985131267177476511460,
21584563699769356971941778663228493736429564166612353926541774478985941223655
],
[
15214147914792054366849870322421302165632582490745675437262348323314349777499,
13722597825431950116506669418346236306300603980563381228999662274560534928630,
5918152979416198971639649518325331620625448907020294484718605076371986899062
],
[
15678541263797653248465557518970463974123241861658113965486586969940493538159,
8175777680824278600606955282259166614282815213949021367965714288908365703981,
8574569171387093031118013554489261429423593158447011529314100526972055673141
],
[
10918716126595714712314376320709488869257965519405486565051522412268603317681,
2532325191174382117711281885521238232961564999737136355944431967345212530064,
9114912432384157184380243669694781216466912274227843462720070724726171771343
],
[
9691400567437687859118635218970787863193933880668112713672540475341700900162,
10028377139165856509931030519434327238309496767889858927872034975860560686709,
2849683057915704668087183484859329979263625107757384166719222732911939917981
],
[
20815778942199761651592702319981868700183832939349741057726702071066236466370,
3621409685183637105257042956474539264816126311779453543298291481462284916375,
10642444461722023103186165843304806163340395145625014310520389928965799104093
],
[
17907899976867921198783978602917240207228046151371192195898939004844562290779,
9146061947045243980414113080125198805700065329387764273046439006263867106802,
13405550010907066703864845579707432131752948528064342334643991332494785370038
],
[
5273152122127100210394298964572549609798968014900036928547501207482676600182,
20346921636268799423535709508901024522528095837141171053270676908508776437195,
12308379321943972271977727620557057340570329211508147240067063174565210570182
],
[
10770632218611704372085484530618691838222212586567712779168513852598498268856,
3259010024842168233969061619991654422834134029900348332440765888676313413487,
13595627212987049470638603008526754259254399837052142232458256128156882979850
],
[
13686702530817919707738525718876497735372528847997348246778693823968904671413,
4460395761756237799418597895559539723093635900761406744282005033814770269920,
919904521071258469189758023145643742474862093406711345641316491878850805232
],
[
19552067628327891935919890073139134992135919431541039306364452733276397042650,
10232151001993153761133312595886476849690259266379133316759758019413141271347,
19100291637068471015012417085176588278037788688620609514545142049341033683807
],
[
17297878401483595514567069610271673264004048391410881314538210656783184883353,
9840592683860937355653821679751062912936830160133502129601080738765222581867,
12297898491098218555414910863121396110527474713633732916696718763044349837748
],
[
15412894648987998885409062475029640813182264551444190226337524042414883759931,
4319236626081961726054666064727097684638760777375371898340440545864486507582,
18191973605703047070391590784739729000864209799853952244185039146413344974340
],
[
5752404706735255136529409306453204906911390186311487545512488044087647004363,
19230596513152256569445337244613095254390639822535622716862845651176210035567,
2252265175955857562316757498871300534719440896551852902250776607815209442724
],
[
9073931270846747895810603986335991449117293042147443815945435448584060282081,
6483856960596610771808987619300220938594487623302140190538043615094727502464,
637083280333510402345876123020430035110637297697948006826849236896141939084
],
[
7961522592388438366485301812068782850165520204310798157690585659140643523427,
17678862179510708909425310131356019028135505241560682522031533174595818508320,
574878632188556442314330639459430853202335502659322827394752921941768574011
],
[
3568116434701175901536016957970225427791238673348886522778060096654834078838,
209057191470494933977718281886471474348594564432534881082944258569955784687,
3800165250578186427973769620301873054994730397398528230413324355039263408752
],
[
3639100050381197573204744243844108375909703224935458547540586483692218704877,
9218466811348574054855689012570491528574600673776655812307423675082651391951,
8596059170152859617707385585788415986680436507191992176622088596772810472999
],
[
322732566805048175662260816011517006326081754353072089063398671083697385738,
7049453550072299437444402478358913479930492212486482712275051349319498748499,
20246786653614752389192328734689452300173121506317343804072039327223658303388
],
[
20278117822137273667620939538959902468837011105813146111671754380981924989927,
1654160784815709253179744420396162626998410370662765393562093472726921196108,
5926459500982186461721807733417927873652323980102172461303859048787673285969
],
[
9814654683585465702825961842499223555384560422004063024601314839443288937571,
20667359178767651545823868376438771969042007215942165947678357086227810746146,
15377207745218347185681329705726178336802704421362747654022727216790137478882
],
[
19356050391225890781843193058227673417005845306283197872873417841669492437242,
12166372468984248061354090302429129252134172582728018687944601026858163494693,
18717100384016177736416179070153621959142213446659284814208081064231797419015
],
[
5204341573325740834696285877748784802324386535160173798458534920759432537787,
19782883278004953315498469484333242486546087363994023561941134846660210215049,
16034955990523880906964021599001738382034941228857279292438387818659914390371
],
[
14818160808196029852784679856539666583166658183438355275636086056715415083081,
17716280573405858003273309028354201201039005385690741490326620374206055544700,
20325174322201219249837009573437620189627639570996454573810011699858417613068
],
[
3392148968210856354607867739690107473804056082222277341573277534219216150766,
19428278646034050357542301711807143225530095360718614464231862005201274256649,
12276431613202649950362446507866899415646220636973437014689940717149756340651
],
[
2587427326658891351835083368358925497178797694851506580681724992222840444228,
15153709835422806446107529654085221072935438340920609707727892556855417866058,
15451366934210083469013986585953772077331429640485131755078569432020850609582
],
[
7529206459496272002258451885407469611325302405318004333109879967079729143354,
11074797962501795518507846987686168533587240327062952171521584310243255490527,
12764552282499399390083087144902783986037307873849260653087645304074666033424
],
[
13042650929397207114336562675355065717548627481583423774346222039669532727946,
16043108230643325415237962657577223570300108689567622235410238284480628436709,
9512396575137793662413202544406785640755021654701449580404337449461432257619
],
[
1647285959695493539499324566741747088943357828231148957465068152734531543503,
21709648504024848772276178029483179772486495316295743876700927299213559938506,
21724258692495205310896265709064144808671841367275300438694185214351785954314
],
[
1867255786401891769870675572668606218710278873748335003682799683714442945229,
7987721022707500619696671044839066336452195788164827456425525148091870118251,
6539200631055259414716243770171674046221430025597614000179562787351853958064
],
[
8813689471242022713194444818607001333164230629904238460550879892354711745295,
20714180249817511375952366875224782387429058136430649457777093547955191502176,
18539711572616584819185239447633561813340291234100075574593487873259164787756
],
[
17667981635370223776813093657005199235669153259421463967392158997697754245871,
5376160060481176473583403267888691728170726259835912852329136073184696997147,
4827362235909576508007307863435628112155590770759625133213539477210553500723
],
[
13370319446619231361317124489467185868983690187607611069576690369092652259991,
7028188533891957864971355746639290885773084850501201875941442998993682972992,
13540296832979359410552476169105468590730420434093705252588595073215852455795
],
[
12990891770605679281297891471045699934999991085025534577757867831342762325899,
12462076986357124836994040219766123609086672012765820033289064632297189251090,
1828489075147172049835809292018292715706474000231898985692281252669405838335
],
[
3765746554097871584971351502123536140065794926761110131907071999370632640132,
3439916369559340392520868084385525387338766489009161266575789323989654664195,
20809812814789097985689246615784687916156945595313708545672843850284131223766
],
[
13027707712686586554032640557539411722798759615913193118482673776883949392685,
6182352544550527301330938836100871624958555541496136265556139550324548491339,
12013693095010211530598130838725639174031637655633969485927984978914903872850
],
[
16544062834867868837759941955456105371508093239200809853834322062372359189123,
14286779287597934647998556545702143097785912165468635922756805434029327391090,
3870485400879314278951355285041632089577731427251451910320055748195753746665
]
]
================================================
FILE: test.py
================================================
import pickle
from TESTING_verifier_DO_NOT_OPEN import TestingVerificationKey
from compiler.program import Program
from curve import G1Point
from poly import Basis, Polynomial
from setup import Setup
from prover import Prover
from verifier import VerificationKey
import json
from test.mini_poseidon import rc, mds, poseidon_hash
from utils import *
def setup_test():
print("===setup_test===")
setup = Setup.from_file("test/powersOfTau28_hez_final_11.ptau")
dummy_values = Polynomial(
list(map(Scalar, [1, 2, 3, 4, 5, 6, 7, 8])), Basis.LAGRANGE
)
program = Program(["c <== a * b"], 8)
commitment = setup.commit(dummy_values)
assert commitment == G1Point(
(
16120260411117808045030798560855586501988622612038310041007562782458075125622,
3125847109934958347271782137825877642397632921923926105820408033549219695465,
)
)
vk = setup.verification_key(program.common_preprocessed_input())
assert (
vk.w
== 19540430494807482326159819597004422086093766032135589407132600596362845576832
)
print("Successfully created dummy commitment and verification key")
def basic_test():
print("===basic_test===")
    # Extract powers of tau from the Hermez 2^28 ceremony (this file is truncated to 2^11 powers)
setup = Setup.from_file("test/powersOfTau28_hez_final_11.ptau")
print("Extracted setup")
program = Program(["c <== a * b"], 8)
vk = setup.verification_key(program.common_preprocessed_input())
print("Generated verification key")
their_output = json.load(open("test/main.plonk.vkey.json"))
for key in ("Qm", "Ql", "Qr", "Qo", "Qc", "S1", "S2", "S3", "X_2"):
if interpret_json_point(their_output[key]) != getattr(vk, key):
raise Exception(
"Mismatch {}: ours {} theirs {}".format(
key, getattr(vk, key), their_output[key]
)
)
assert getattr(vk, "w") == int(their_output["w"])
print("Basic test success")
return setup
# Equivalent to this zkrepl code:
#
# template Example () {
# signal input a;
# signal input b;
# signal c;
# c <== a * b + a;
# }
def ab_plus_a_test(setup):
print("===ab_plus_a_test===")
program = Program(["ab === a - c", "-ab === a * b"], 8)
vk = setup.verification_key(program.common_preprocessed_input())
print("Generated verification key")
their_output = json.load(open("test/main.plonk.vkey-58.json"))
for key in ("Qm", "Ql", "Qr", "Qo", "Qc", "S1", "S2", "S3", "X_2"):
if interpret_json_point(their_output[key]) != getattr(vk, key):
raise Exception(
"Mismatch {}: ours {} theirs {}".format(
key, getattr(vk, key), their_output[key]
)
)
assert getattr(vk, "w") == int(their_output["w"])
print("ab+a test success")
def one_public_input_test(setup):
print("===one_public_input_test===")
program = Program(["c public", "c === a * b"], 8)
vk = setup.verification_key(program.common_preprocessed_input())
print("Generated verification key")
their_output = json.load(open("test/main.plonk.vkey-59.json"))
for key in ("Qm", "Ql", "Qr", "Qo", "Qc", "S1", "S2", "S3", "X_2"):
if interpret_json_point(their_output[key]) != getattr(vk, key):
raise Exception(
"Mismatch {}: ours {} theirs {}".format(
key, getattr(vk, key), their_output[key]
)
)
assert getattr(vk, "w") == int(their_output["w"])
print("One public input test success")
def prover_test_dummy_verifier(setup):
print("===prover_test_dummy_verifier===")
print("Beginning prover test with test verifier")
program = Program(["e public", "c <== a * b", "e <== c * d"], 8)
assignments = {"a": 3, "b": 4, "c": 12, "d": 5, "e": 60}
prover = Prover(setup, program)
proof = prover.prove(assignments)
print("Beginning test verification")
program = Program(["e public", "c <== a * b", "e <== c * d"], 8)
public = [60]
vk = setup.verification_key(program.common_preprocessed_input())
vk_test = TestingVerificationKey(
group_order=vk.group_order,
Qm=vk.Qm,
Ql=vk.Ql,
Qr=vk.Qr,
Qo=vk.Qo,
Qc=vk.Qc,
S1=vk.S1,
S2=vk.S2,
S3=vk.S3,
X_2=vk.X_2,
w=vk.w,
)
assert vk_test.verify_proof_unoptimized(8, proof, public)
assert vk_test.verify_proof(8, proof, public)
print("Prover test with dummy verifier success")
def prover_test(setup):
print("===prover_test===")
print("Beginning prover test")
program = Program(["e public", "c <== a * b", "e <== c * d"], 8)
assignments = {"a": 3, "b": 4, "c": 12, "d": 5, "e": 60}
prover = Prover(setup, program)
proof = prover.prove(assignments)
print("Prover test success")
return proof
def verifier_test_unoptimized(setup, proof):
print("===verifier_test_unoptimized===")
print("Beginning verifier test")
program = Program(["e public", "c <== a * b", "e <== c * d"], 8)
public = [60]
vk = setup.verification_key(program.common_preprocessed_input())
assert vk.verify_proof_unoptimized(8, proof, public)
print("Verifier test success")
def verifier_test_full(setup, proof):
print("===verifier_test_full===")
print("Beginning verifier test")
program = Program(["e public", "c <== a * b", "e <== c * d"], 8)
public = [60]
vk = setup.verification_key(program.common_preprocessed_input())
assert vk.verify_proof_unoptimized(8, proof, public)
assert vk.verify_proof(8, proof, public)
print("Verifier test success")
def factorization_test(setup):
print("===factorization_test===")
print("Beginning test: prove you know small integers that multiply to 91")
program = Program.from_str(
"""n public
pb0 === pb0 * pb0
pb1 === pb1 * pb1
pb2 === pb2 * pb2
pb3 === pb3 * pb3
qb0 === qb0 * qb0
qb1 === qb1 * qb1
qb2 === qb2 * qb2
qb3 === qb3 * qb3
pb01 <== pb0 + 2 * pb1
pb012 <== pb01 + 4 * pb2
p <== pb012 + 8 * pb3
qb01 <== qb0 + 2 * qb1
qb012 <== qb01 + 4 * qb2
q <== qb012 + 8 * qb3
n <== p * q""",
16,
)
public = [91]
vk = setup.verification_key(program.common_preprocessed_input())
print("Generated verification key")
assignments = program.fill_variable_assignments(
{
"pb3": 1,
"pb2": 1,
"pb1": 0,
"pb0": 1,
"qb3": 0,
"qb2": 1,
"qb1": 1,
"qb0": 1,
}
)
prover = Prover(setup, program)
proof = prover.prove(assignments)
print("Generated proof")
assert vk.verify_proof(16, proof, public)
print("Factorization test success!")
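The constraint pattern in `factorization_test` relies on `x === x * x` forcing each bit into {0, 1}, with the chained additions rebuilding `p` and `q` from their bits. A quick stand-alone arithmetic check of the assignment used above (plain Python, no PLONK machinery involved):

```python
# Recompute p, q, and n from the bit assignment fed to factorization_test.
bits_p = {"pb0": 1, "pb1": 0, "pb2": 1, "pb3": 1}  # p = 1 + 0*2 + 1*4 + 1*8
bits_q = {"qb0": 1, "qb1": 1, "qb2": 1, "qb3": 0}  # q = 1 + 1*2 + 1*4 + 0*8

# Booleanity: x === x * x holds only for x in {0, 1}
for v in list(bits_p.values()) + list(bits_q.values()):
    assert v == v * v

p = sum(bits_p[f"pb{i}"] << i for i in range(4))
q = sum(bits_q[f"qb{i}"] << i for i in range(4))
assert p * q == 91  # matches the public input n
```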
def output_proof_lang() -> str:
o = []
o.append("L0 public")
o.append("M0 public")
o.append("M64 public")
o.append("R0 <== 0")
for i in range(64):
for j, pos in enumerate(("L", "M", "R")):
f = {"x": i, "r": rc[i][j], "p": pos}
if i < 4 or i >= 60 or pos == "L":
o.append("{p}adj{x} <== {p}{x} + {r}".format(**f))
o.append("{p}sq{x} <== {p}adj{x} * {p}adj{x}".format(**f))
o.append("{p}qd{x} <== {p}sq{x} * {p}sq{x}".format(**f))
o.append("{p}qn{x} <== {p}qd{x} * {p}adj{x}".format(**f))
else:
o.append("{p}qn{x} <== {p}{x} + {r}".format(**f))
for j, pos in enumerate(("L", "M", "R")):
f = {"x": i, "p": pos, "m": mds[j]}
o.append("{p}suma{x} <== Lqn{x} * {m}".format(**f))
f = {"x": i, "p": pos, "m": mds[j + 1]}
o.append("{p}sumb{x} <== {p}suma{x} + Mqn{x} * {m}".format(**f))
f = {"x": i, "xp1": i + 1, "p": pos, "m": mds[j + 2]}
o.append("{p}{xp1} <== {p}sumb{x} + Rqn{x} * {m}".format(**f))
return "\n".join(o)
def poseidon_test(setup):
print("===poseidon_test===")
# PLONK-prove the correctness of a Poseidon execution. Note that this is
# a very suboptimal way to do it: an optimized implementation would use
# a custom PLONK gate to do a round in a single gate
expected_value = poseidon_hash(1, 2)
# Generate code for proof
program = Program.from_str(output_proof_lang(), 1024)
print("Generated code for Poseidon test")
assignments = program.fill_variable_assignments({"L0": 1, "M0": 2})
vk = setup.verification_key(program.common_preprocessed_input())
print("Generated verification key")
prover = Prover(setup, program)
proof = prover.prove(assignments)
print("Generated proof")
assert vk.verify_proof(1024, proof, [1, 2, expected_value])
print("Verified proof!")
if __name__ == "__main__":
# Step 1: Pass setup test
setup_test()
setup = basic_test()
# Step 2: Pass prover test using verifier we provide (DO NOT READ TEST VERIFIER CODE)
prover_test_dummy_verifier(setup)
# Step 3: Pass verifier test using your own verifier
with open("test/proof.pickle", "rb") as f:
proof = pickle.load(f)
verifier_test_unoptimized(setup, proof)
verifier_test_full(setup, proof)
# Step 4: Pass end-to-end tests for prover and verifier
ab_plus_a_test(setup)
one_public_input_test(setup)
proof = prover_test(setup)
verifier_test_full(setup, proof)
factorization_test(setup)
poseidon_test(setup)
================================================
FILE: transcript.py
================================================
from utils import Scalar
from curve import G1Point
from merlin import MerlinTranscript
from py_ecc.secp256k1.secp256k1 import bytes_to_int
from dataclasses import dataclass
@dataclass
class Message1:
# [a(x)]₁ (commitment to left wire polynomial)
a_1: G1Point
# [b(x)]₁ (commitment to right wire polynomial)
b_1: G1Point
# [c(x)]₁ (commitment to output wire polynomial)
c_1: G1Point
@dataclass
class Message2:
# [z(x)]₁ (commitment to permutation polynomial)
z_1: G1Point
@dataclass
class Message3:
# [t_lo(x)]₁ (commitment to t_lo(X), the low chunk of the quotient polynomial t(X))
t_lo_1: G1Point
# [t_mid(x)]₁ (commitment to t_mid(X), the middle chunk of the quotient polynomial t(X))
t_mid_1: G1Point
# [t_hi(x)]₁ (commitment to t_hi(X), the high chunk of the quotient polynomial t(X))
t_hi_1: G1Point
@dataclass
class Message4:
# Evaluation of a(X) at evaluation challenge ζ
a_eval: Scalar
# Evaluation of b(X) at evaluation challenge ζ
b_eval: Scalar
# Evaluation of c(X) at evaluation challenge ζ
c_eval: Scalar
# Evaluation of the first permutation polynomial S_σ1(X) at evaluation challenge ζ
s1_eval: Scalar
# Evaluation of the second permutation polynomial S_σ2(X) at evaluation challenge ζ
s2_eval: Scalar
# Evaluation of the shifted permutation polynomial z(X) at the shifted evaluation challenge ζω
z_shifted_eval: Scalar
@dataclass
class Message5:
# [W_ζ(X)]₁ (commitment to the opening proof polynomial)
W_z_1: G1Point
# [W_ζω(X)]₁ (commitment to the opening proof polynomial)
W_zw_1: G1Point
class Transcript(MerlinTranscript):
def append(self, label: bytes, item: bytes) -> None:
self.append_message(label, item)
def append_scalar(self, label: bytes, item: Scalar):
self.append_message(label, item.n.to_bytes(32, "big"))
def append_point(self, label: bytes, item: G1Point):
self.append_message(label, item[0].n.to_bytes(32, "big"))
self.append_message(label, item[1].n.to_bytes(32, "big"))
def get_and_append_challenge(self, label: bytes) -> Scalar:
while True:
challenge_bytes = self.challenge_bytes(label, 255)
f = Scalar(bytes_to_int(challenge_bytes))
if f != Scalar.zero(): # Enforce challenge != 0
self.append(label, challenge_bytes)
return f
def round_1(self, message: Message1) -> tuple[Scalar, Scalar]:
self.append_point(b"a_1", message.a_1)
self.append_point(b"b_1", message.b_1)
self.append_point(b"c_1", message.c_1)
# The first two Fiat-Shamir challenges
beta = self.get_and_append_challenge(b"beta")
gamma = self.get_and_append_challenge(b"gamma")
return beta, gamma
def round_2(self, message: Message2) -> tuple[Scalar, Scalar]:
self.append_point(b"z_1", message.z_1)
alpha = self.get_and_append_challenge(b"alpha")
        # This value could be anything; it just needs to be unpredictable. It
        # lets us use evaluation forms over a coset, avoiding zero evaluations
        # so we can divide polynomials without hitting the 0/0 issue
fft_cofactor = self.get_and_append_challenge(b"fft_cofactor")
return alpha, fft_cofactor
def round_3(self, message: Message3) -> Scalar:
self.append_point(b"t_lo_1", message.t_lo_1)
self.append_point(b"t_mid_1", message.t_mid_1)
self.append_point(b"t_hi_1", message.t_hi_1)
zeta = self.get_and_append_challenge(b"zeta")
return zeta
def round_4(self, message: Message4) -> Scalar:
self.append_scalar(b"a_eval", message.a_eval)
self.append_scalar(b"b_eval", message.b_eval)
self.append_scalar(b"c_eval", message.c_eval)
self.append_scalar(b"s1_eval", message.s1_eval)
self.append_scalar(b"s2_eval", message.s2_eval)
self.append_scalar(b"z_shifted_eval", message.z_shifted_eval)
v = self.get_and_append_challenge(b"v")
return v
def round_5(self, message: Message5) -> Scalar:
self.append_point(b"W_z_1", message.W_z_1)
self.append_point(b"W_zw_1", message.W_zw_1)
u = self.get_and_append_challenge(b"u")
return u
================================================
FILE: utils.py
================================================
import py_ecc.bn128 as b
from curve import Scalar
f = b.FQ
f2 = b.FQ2
primitive_root = 5
# Extracts a point from JSON in zkrepl's format
def interpret_json_point(p):
if len(p) == 3 and isinstance(p[0], str) and p[2] == "1":
return (f(int(p[0])), f(int(p[1])))
elif len(p) == 3 and p == ["0", "1", "0"]:
return b.Z1
elif len(p) == 3 and isinstance(p[0], list) and p[2] == ["1", "0"]:
return (
f2([int(p[0][0]), int(p[0][1])]),
f2([int(p[1][0]), int(p[1][1])]),
)
elif len(p) == 3 and p == [["0", "0"], ["1", "0"], ["0", "0"]]:
return b.Z2
raise Exception("cannot interpret that point: {}".format(p))
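`interpret_json_point` handles zkrepl's projective encoding: a G1 point arrives as three decimal strings `[x, y, z]`, with `z == "1"` for affine points and `["0", "1", "0"]` for the identity (py_ecc's `b.Z1`). A simplified, dependency-free stand-in covering just the two G1 branches (the real code wraps coordinates in py_ecc's `FQ` field type):

```python
# Simplified stand-in for the G1 branches of interpret_json_point.
def parse_g1(p):
    if p == ["0", "1", "0"]:
        return None  # py_ecc represents the G1 identity (Z1) as None
    if len(p) == 3 and isinstance(p[0], str) and p[2] == "1":
        return (int(p[0]), int(p[1]))
    raise ValueError("cannot interpret that point: {}".format(p))

assert parse_g1(["1", "2", "1"]) == (1, 2)  # the bn128 G1 generator
assert parse_g1(["0", "1", "0"]) is None    # point at infinity
```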
================================================
FILE: verifier.py
================================================
import py_ecc.bn128 as b
from utils import *
from dataclasses import dataclass
from curve import *
from transcript import Transcript
from poly import Polynomial, Basis
@dataclass
class VerificationKey:
"""Verification key"""
    # we set this to a power of 2 (so that we can FFT over it) that is at
    # least the number of constraints we have (so we can Lagrange-interpolate
    # them)
group_order: int
# [q_M(x)]₁ (commitment to multiplication selector polynomial)
Qm: G1Point
# [q_L(x)]₁ (commitment to left selector polynomial)
Ql: G1Point
# [q_R(x)]₁ (commitment to right selector polynomial)
Qr: G1Point
# [q_O(x)]₁ (commitment to output selector polynomial)
Qo: G1Point
# [q_C(x)]₁ (commitment to constants selector polynomial)
Qc: G1Point
# [S_σ1(x)]₁ (commitment to the first permutation polynomial S_σ1(X))
S1: G1Point
# [S_σ2(x)]₁ (commitment to the second permutation polynomial S_σ2(X))
S2: G1Point
# [S_σ3(x)]₁ (commitment to the third permutation polynomial S_σ3(X))
S3: G1Point
# [x]₂ = xH, where H is a generator of G_2
X_2: G2Point
# nth root of unity (i.e. ω^1), where n is the program's group order.
w: Scalar
# More optimized version that tries hard to minimize pairings and
# elliptic curve multiplications, but at the cost of being harder
# to understand and mixing together a lot of the computations to
# efficiently batch them
def verify_proof(self, group_order: int, pf, public=[]) -> bool:
# 4. Compute challenges
# 5. Compute zero polynomial evaluation Z_H(ζ) = ζ^n - 1
# 6. Compute Lagrange polynomial evaluation L_0(ζ)
# 7. Compute public input polynomial evaluation PI(ζ).
        # Compute the constant term of R. This is not literally the degree-0
        # term of the R polynomial; rather, it's the portion of R that can
        # be computed directly, without resorting to elliptic curve commitments
# Compute D = (R - r0) + u * Z, and E and F
# Run one pairing check to verify the last two checks.
        # What's going on here is a clever rearrangement of terms to check
        # the same equations that are being checked in the basic version,
        # but in a way that minimizes the number of EC muls and even
        # compresses the two pairings into one. The 2 pairings -> 1 pairing
        # trick is basically to replace checking
#
# Y1 = A * (X - a) and Y2 = B * (X - b)
#
# with
#
# Y1 + A * a = A * X
# Y2 + B * b = B * X
#
# so at this point we can take a random linear combination of the two
# checks, and verify it with only one pairing.
return False
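The rearrangement described in the comment above can be sanity-checked with ordinary modular arithmetic standing in for G1 operations (`r` plays the role of the verifier's random combiner `u`; all concrete values are arbitrary):

```python
# Toy model of the 2-pairings -> 1-pairing trick over integers mod p.
p = 2**31 - 1                 # any prime works for this arithmetic check
X, a_pt, b_pt = 1234, 56, 78  # "secret" X and the two opening points
A, B = 999, 888               # stand-ins for the two commitment points

Y1 = A * (X - a_pt) % p       # first check:  Y1 = A * (X - a)
Y2 = B * (X - b_pt) % p       # second check: Y2 = B * (X - b)

r = 424242                    # random combiner (the verifier's u)
# Rearranged as Y1 + A*a = A*X and Y2 + B*b = B*X, then combined:
lhs = (Y1 + A * a_pt + r * (Y2 + B * b_pt)) % p
rhs = (A + r * B) * X % p
assert lhs == rhs             # one combined check replaces two
```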
# Basic, easier-to-understand version of what's going on
def verify_proof_unoptimized(self, group_order: int, pf, public=[]) -> bool:
# 4. Compute challenges
# 5. Compute zero polynomial evaluation Z_H(ζ) = ζ^n - 1
# 6. Compute Lagrange polynomial evaluation L_0(ζ)
# 7. Compute public input polynomial evaluation PI(ζ).
# Recover the commitment to the linearization polynomial R,
# exactly the same as what was created by the prover
# Verify that R(z) = 0 and the prover-provided evaluations
# A(z), B(z), C(z), S1(z), S2(z) are all correct
# Verify that the provided value of Z(zeta*w) is correct
return False
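Steps 5 and 6 above use the closed forms Z_H(ζ) = ζ^n − 1 and L_0(ζ) = (ζ^n − 1) / (n·(ζ − 1)); the latter can be checked against the product definition of the Lagrange basis in a tiny field (mod-17 arithmetic standing in for the BN254 scalar field):

```python
# Check L_0's closed form against its product definition, mod a small prime.
q = 17                      # 8 divides q - 1, so an order-8 subgroup exists
n = 8
w = 2                       # order 8 mod 17: 2^8 = 1, but 2^4 = 16 != 1
assert pow(w, n, q) == 1 and pow(w, n // 2, q) != 1

zeta = 5                    # any point outside the subgroup
ZH = (pow(zeta, n, q) - 1) % q                   # Z_H(zeta) = zeta^n - 1
L0 = ZH * pow(n * (zeta - 1) % q, -1, q) % q     # closed form

# Product definition: L_0(z) = prod_{k>0} (z - w^k) / (1 - w^k)
expected = 1
for k in range(1, n):
    wk = pow(w, k, q)
    expected = expected * (zeta - wk) % q * pow((1 - wk) % q, -1, q) % q
assert L0 == expected
```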
# Compute challenges (should be same as those computed by prover)
def compute_challenges(
self, proof
) -> tuple[Scalar, Scalar, Scalar, Scalar, Scalar, Scalar]:
transcript = Transcript(b"plonk")
beta, gamma = transcript.round_1(proof.msg_1)
alpha, _fft_cofactor = transcript.round_2(proof.msg_2)
zeta = transcript.round_3(proof.msg_3)
v = transcript.round_4(proof.msg_4)
u = transcript.round_5(proof.msg_5)
return beta, gamma, alpha, zeta, v, u
method to_coset_extended_lagrange (line 156) | def to_coset_extended_lagrange(self, offset):
method coset_extended_lagrange_to_coeffs (line 169) | def coset_extended_lagrange_to_coeffs(self, offset):
method barycentric_eval (line 181) | def barycentric_eval(self, x: Scalar):
FILE: prover.py
class Proof (line 11) | class Proof:
method flatten (line 18) | def flatten(self):
class Prover (line 39) | class Prover:
method __init__ (line 45) | def __init__(self, setup: Setup, program: Program):
method prove (line 51) | def prove(self, witness: dict[Optional[str], int]) -> Proof:
method round_1 (line 86) | def round_1(
method round_2 (line 121) | def round_2(self) -> Message2:
method round_3 (line 154) | def round_3(self) -> Message3:
method round_4 (line 228) | def round_4(self) -> Message4:
method round_5 (line 241) | def round_5(self) -> Message5:
method fft_expand (line 308) | def fft_expand(self, x: Polynomial):
method expanded_evals_to_coeffs (line 311) | def expanded_evals_to_coeffs(self, x: Polynomial):
method rlc (line 314) | def rlc(self, term_1, term_2):
FILE: setup.py
class Setup (line 16) | class Setup(object):
method from_file (line 24) | def from_file(cls, filename):
method commit (line 66) | def commit(self, values: Polynomial) -> G1Point:
method verification_key (line 75) | def verification_key(self, pk: CommonPreprocessedInput) -> Verificatio...
FILE: test.py
function setup_test (line 14) | def setup_test():
function basic_test (line 37) | def basic_test():
function ab_plus_a_test (line 67) | def ab_plus_a_test(setup):
function one_public_input_test (line 85) | def one_public_input_test(setup):
function prover_test_dummy_verifier (line 103) | def prover_test_dummy_verifier(setup):
function prover_test (line 136) | def prover_test(setup):
function verifier_test_unoptimized (line 148) | def verifier_test_unoptimized(setup, proof):
function verifier_test_full (line 159) | def verifier_test_full(setup, proof):
function factorization_test (line 171) | def factorization_test(setup):
function output_proof_lang (line 216) | def output_proof_lang() -> str:
function poseidon_test (line 242) | def poseidon_test(setup):
FILE: test/mini_poseidon.py
function poseidon_hash (line 27) | def poseidon_hash(in1, in2):
FILE: transcript.py
class Message1 (line 9) | class Message1:
class Message2 (line 19) | class Message2:
class Message3 (line 25) | class Message3:
class Message4 (line 35) | class Message4:
class Message5 (line 51) | class Message5:
class Transcript (line 58) | class Transcript(MerlinTranscript):
method append (line 59) | def append(self, label: bytes, item: bytes) -> None:
method append_scalar (line 62) | def append_scalar(self, label: bytes, item: Scalar):
method append_point (line 65) | def append_point(self, label: bytes, item: G1Point):
method get_and_append_challenge (line 69) | def get_and_append_challenge(self, label: bytes) -> Scalar:
method round_1 (line 77) | def round_1(self, message: Message1) -> tuple[Scalar, Scalar]:
method round_2 (line 88) | def round_2(self, message: Message2) -> tuple[Scalar, Scalar]:
method round_3 (line 99) | def round_3(self, message: Message3) -> Scalar:
method round_4 (line 107) | def round_4(self, message: Message4) -> Scalar:
method round_5 (line 118) | def round_5(self, message: Message5) -> Scalar:
FILE: utils.py
function interpret_json_point (line 10) | def interpret_json_point(p):
FILE: verifier.py
class VerificationKey (line 10) | class VerificationKey:
method verify_proof (line 40) | def verify_proof(self, group_order: int, pf, public=[]) -> bool:
method verify_proof_unoptimized (line 75) | def verify_proof_unoptimized(self, group_order: int, pf, public=[]) ->...
method compute_challenges (line 95) | def compute_challenges(
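The curve.py entries indexed above (`root_of_unity`, `roots_of_unity`) produce the FFT evaluation domain used throughout the prover and verifier. A minimal self-contained sketch of that computation, assuming the BN128 scalar-field modulus and the `primitive_root = 5` constant that appear in the repository's curve.py and utils.py — the function and variable names here are illustrative, not the repository's exact code:

```python
# Sketch of the n-th root of unity computation behind Scalar.root_of_unity
# (curve.py). R is the BN128 scalar-field modulus; PRIMITIVE_ROOT = 5 is
# the generator constant used in the repository.
R = 21888242871839275222246405745257275088548364400416034343698204186575808495617
PRIMITIVE_ROOT = 5

def root_of_unity(group_order: int) -> int:
    """Return a generator of the order-`group_order` subgroup of F_r^*."""
    assert (R - 1) % group_order == 0
    return pow(PRIMITIVE_ROOT, (R - 1) // group_order, R)

def roots_of_unity(group_order: int) -> list[int]:
    """All group_order-th roots of unity, i.e. the FFT evaluation domain."""
    w = root_of_unity(group_order)
    out = [1]
    for _ in range(group_order - 1):
        out.append(out[-1] * w % R)
    return out

w4 = root_of_unity(4)
assert pow(w4, 4, R) == 1 and pow(w4, 2, R) != 1  # order exactly 4
```

Because `PRIMITIVE_ROOT` generates the full multiplicative group, raising it to `(R - 1) // n` yields an element of order exactly `n`; the repeated-multiplication loop then enumerates the domain in order `1, w, w^2, ...`.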