Verifiable Delay Functions (VDF)
Leading Problems
- randomness is hard to generate on chain, since on-chain computation must be deterministic, yet we still want to be able to verify the randomness
- fairness of leader election is hard to ensure, as an attacker may manipulate the randomness used in the election
VDF Requirements
- Everyone must perform a prescribed number of sequential steps to compute the output; no one can gain a speed advantage (e.g., through parallelism)
- The result has to be efficiently verifiable in a short time (typically O(log t))
Techniques
- injective rational maps (First attempt in original VDF paper): “weak VDF” requires large parallel processing
- Finite group of unknown order (Pietrzak and Wesolowski): use a trapdoor or a Rivest-Shamir-Wagner time-lock puzzle
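The Rivest-Shamir-Wagner construction mentioned above can be sketched in a few lines. This is a toy illustration (our own names and toy parameters, not production code): evaluation requires t sequential modular squarings, while anyone holding the trapdoor (the group order) can verify the result with a single fast exponentiation.

```python
# Toy sketch of an RSW-style time-lock puzzle / trapdoor VDF (NOT real crypto).
# Evaluation: y = x^(2^t) mod N, which takes t sequential squarings.
# Trapdoor verification: knowing phi(N), reduce the exponent 2^t mod phi(N)
# and check the result with one fast modular exponentiation.

p, q = 1009, 1013              # toy primes; real systems use ~1024-bit primes
N = p * q
phi = (p - 1) * (q - 1)        # the trapdoor: the group order

def vdf_eval(x: int, t: int) -> int:
    """t sequential squarings mod N -- inherently sequential work."""
    y = x % N
    for _ in range(t):
        y = y * y % N
    return y

def vdf_verify_trapdoor(x: int, t: int, y: int) -> bool:
    """Fast check using the secret group order phi(N); needs gcd(x, N) = 1."""
    e = pow(2, t, phi)         # reduce the huge exponent 2^t modulo phi(N)
    return pow(x, e, N) == y % N

x, t = 5, 10_000
y = vdf_eval(x, t)
print(vdf_verify_trapdoor(x, t, y))  # True
```

Pietrzak's and Wesolowski's schemes replace the trapdoor check with interactive or succinct proofs, so that verification is fast even for parties who do not know the group order.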
Applications
- Chia blockchain: use VDF for consensus algorithms
- Protocol Labs + Ethereum Foundation: co-funding grants for research into the viability of building optimized ASICs for running a VDF
Great Resources
Formal Verification
Leading Problem
- How do we formally verify that a set of constraints used by a zero knowledge proof system has the desired characteristics, such as soundness, completeness, functional correctness, and zero knowledge?
- Zero knowledge proof systems often use a mathematical constraint system such as R1CS or AIR to encode a computation. The zero knowledge proof is a probabilistic cryptographic proof that the computation was done correctly
- Formal verification of a constraint system used in a zero knowledge proof requires
- (1) a formal specification of the computation that the constraint system is intended to encode
- (2) a formal model of the semantics of the constraint system
- (3) the specific set of constraints representing the computation
- (4) the theorem prover, and
- (5) the mechanized proof of a theorem relating (1) and (3)
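Ingredients (1)-(3) and (5) can be made concrete on a toy example. Below, an exhaustive check over a tiny field stands in for a mechanized proof (the spec, constraints, and field size are all our own illustrative choices):

```python
# Toy illustration of ingredients (1)-(3) and (5) above, with exhaustive
# checking over a small field standing in for a mechanized proof.

P = 13  # a small prime field F_P (real systems use ~255-bit fields)

# (1) Formal specification of the computation: c = a^3 in F_P.
def spec(a: int) -> int:
    return pow(a, 3, P)

# (3) The constraints encoding it, R1CS-style (one multiplication each):
#     a * a = b   and   a * b = c
def satisfies(a: int, b: int, c: int) -> bool:
    # (2) Semantics of the constraint system: every equation holds in F_P.
    return (a * a - b) % P == 0 and (a * b - c) % P == 0

# (5) "Proof" by exhaustive check (feasible only because F_P is tiny):
# soundness: any satisfying assignment (a, b, c) has c = spec(a);
# completeness: for every a, some witness b yields a satisfying assignment.
sound = all(c == spec(a)
            for a in range(P) for b in range(P) for c in range(P)
            if satisfies(a, b, c))
complete = all(satisfies(a, a * a % P, spec(a)) for a in range(P))
print(sound, complete)  # True True
```

For realistic field sizes, enumeration is impossible; that is exactly why a theorem prover (4) and a mechanized proof (5) are needed.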
Techniques
- To prove a general statement with certainty, it must be a purely mathematical statement which can be known to be true by reasoning alone, without needing to perform any tests. The reasoning must start from consistent assumptions (axioms) and proceed by truth-preserving inferences
- Example: given that a bachelor is defined as an unmarried man, and a married man by definition has a spouse, it can be proven with certainty that no bachelor has a spouse.
- A proof theory codifies a consistent set of rules for performing truth-preserving inference. Proof theories can be used to construct proofs manually, with pencil and paper
- Computer programs called proof assistants reduce labor and mitigate the risk of human error by helping the user to construct machine-checkable proofs
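The bachelor example above is small enough to mechanize directly. A minimal Lean 4 sketch (the names and the exact formalization are ours):

```lean
-- The bachelor example, mechanized: taking "having a spouse" and
-- "being married" to coincide by definition, an unmarried person
-- (a bachelor) can be proven to have no spouse.
theorem no_bachelor_has_spouse
    {Person : Type} {married spouse : Person → Prop}
    (marriage : ∀ q, spouse q ↔ married q)  -- the definitional link
    (p : Person) (bachelor : ¬ married p) : -- a bachelor is unmarried
    ¬ spouse p :=
  fun h => bachelor ((marriage p).mp h)
```

The proof assistant checks every inference step, so the theorem holds for any type of `Person` and any predicates satisfying the stated assumptions.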
Using a proof assistant
- To use a proof assistant to prove a statement about a program, there are two main approaches:
- (1) Write the program in the proof assistant language and apply the proving facilities to the program directly
- (2) Use a transpiler to turn a program written in another language into an object which the proof assistant can reason about
- Of these approaches, (1) seems preferable, for the greater confidence provided by the proof being about exactly the (source) program being executed, as opposed to the output of a transpiler which is assumed to have the same meaning as the source program
- What motivates approach (2) is when (for whatever reason) the proof assistant language is not suitable as a language for developing the application in
Without using a proof assistant
- There are also ways to prove a statement about a program without (directly) using a proof assistant:
- Use a verified compiler (e.g. CompCert), which turns a source program into an object program which provably has certain properties by virtue of (proven) facts about the verified compiler
- Note that there is a distinction between a verified compiler and a verifying compiler. CompCert itself is proved to generate binary that will always be semantically equivalent to the C input. A verifying compiler generates a binary along with a proof of correctness that the binary is semantically equivalent to the source.
- When a program is compiled by a verifying or verified compiler, we say that the compiler output is correct by construction (provided that the input is correct).
- Use an automatic proof search algorithm, which takes as input statements to be proven and outputs proofs of those statements if those statements are true and the proof search algorithm finds proofs
- Use a static analyzer, which takes as input a program and automatically checks for various kinds of issues using predetermined algorithms
- Both of these approaches have limitations:
- A verified compiler is limited in what statements it can prove about the resulting program: typically, just that the resulting program has the same meaning or behavior as the source program
- An automatic proof search algorithm is limited in what statements it can prove by the sophistication of the algorithm and the computational power applied to it. Also, due to Gödel’s incompleteness theorem, there cannot exist a proof search algorithm which would find a proof of any given true statement
- A static analyzer is generally not capable of reasoning about the meaning of a program to see if it’s correct; it is only able to recognize patterns which always or often indicate some kind of issue
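The last limitation can be seen in a toy "static analyzer" (our own illustrative example): it does not reason about what the program means, it only pattern-matches the syntax tree for a construct that often indicates a problem.

```python
# Toy static analyzer: flags divisions whose divisor is a bare variable,
# which *might* be zero. It cannot tell whether the variable actually can
# be zero at runtime -- it only recognizes a syntactic pattern.
import ast

SOURCE = """
def average(total, count):
    return total / count
"""

def flag_divisions(source: str) -> list[int]:
    """Return line numbers of divisions with a plain-variable divisor."""
    flagged = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div):
            if isinstance(node.right, ast.Name):
                flagged.append(node.lineno)
    return flagged

print(flag_divisions(SOURCE))  # [3]
```

A real analyzer adds many such patterns plus dataflow heuristics, but the contrast with a semantic correctness proof remains: the analyzer flags suspicious shapes, it does not establish that the program meets a specification.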
Formal Verification for ZK Circuits
- Formal verification for probabilistic proof systems, including ZK proof systems, encompasses two main problem spaces:
- (1) proving the intended properties of a general-purpose proving system, such as soundness, completeness, and zero knowledge (e.g., Bulletproofs, Halo 2, etc.)
- (2) proving the intended properties of a circuit, namely that it correctly encodes the intended relation
- Problem (1) is generally considered a very difficult problem and has not been done for any significant proving system
- Problem (2) can be done with a formal specification of the relation and a model of the circuit semantics. Usually it requires proving functional correctness of functions defined within the relation, as well as the existence of witness variables for every argument to the function
Denotational design
- Denotational design provides a helpful way of thinking about both problem spaces (general and application-specific). The circuit denotes a set: namely, the set of public inputs for which the circuit is satisfiable
- The goal of application specific circuit verification is to prove that the circuit denotes the intended relation
- The goal of general purpose proving system verification is to prove that it has the intended properties with respect to the denotational semantics of circuits:
- Soundness means that if the verifier accepts a proof, then with high probability, the public input used to generate the proof (i.e., the statement being proven) is in the set denoted by the circuit (i.e., the statement is true)
- Completeness means that if a public input (i.e., a statement) is in the set denoted by the circuit (i.e., the statement is true), then the proving algorithm successfully outputs a proof which the verifier accepts
- Example: consider a ZK proving system which one would use to show that one has a solution to a Sudoku puzzle, without revealing the solution
- The statements to be proven are of the form “X (a Sudoku puzzle) is solvable”
- The relation we intend is the set of solvable Sudoku puzzles
- Soundness means that if the verifier accepts a proof that X is solvable, then with high probability, X is solvable
- Completeness means that if X is solvable, then the prover creates a proof that X is solvable which the verifier accepts
- Zero knowledge means that having a proof that X is solvable does not reduce the computational difficulty of finding a solution to X
- To see this example worked out more formally, see the OSL whitepaper.
- If you know that your circuit denotes the relation you intend, and you know that your general purpose proof system is sound and complete in the above senses, then you know that your application-specific proving system (i.e., the circuit plus the general proof system) has the intended soundness and completeness properties for that application
- This suggests that, given a formally verified general-purpose proving system, and a verifying compiler from statements to circuits, one can solve the problem of proving correctness of application-specific proving systems without application-specific correctness proofs
- Suppose that
- one can write the statement to be expressed by a circuit in a machine-readable, human-readable notation, where it is self-evident that the statement being written has the intended meaning or denotation
- one has a verifying compiler which turns that statement into a circuit which provably has the same denotation as the source statement
- the circuit can be executed on a formally verified general-purpose probabilistic proving system
- Then one can generate formally verified application-specific probabilistic proving systems without any additional proof writing for an additional application. This seems like a promising way forward towards a sustainable and cost effective approach to formal verification for ZK circuits
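The denotational view above can be made concrete on a toy circuit (our own example, exhaustively checkable only because the field is tiny): the circuit denotes the set of public inputs for which some witness satisfies it, and correctness means that set equals the intended relation.

```python
# Toy illustration of the denotational view: a "circuit" takes a public
# input x and a private witness w; it denotes the set of x for which some
# w satisfies it. This circuit checks w * w = x in F_P, so its denotation
# should be the set of squares mod P.

P = 11  # small prime field

def circuit(x: int, w: int) -> bool:
    return (w * w - x) % P == 0

# The set denoted by the circuit: public inputs with at least one witness.
denotation = {x for x in range(P) if any(circuit(x, w) for w in range(P))}

# The intended relation: squares mod P.
intended = {w * w % P for w in range(P)}

print(denotation == intended)  # True
```

Soundness and completeness of a proving system are then statements relative to this set: an accepted proof implies (with high probability) membership in the denotation, and membership implies that an accepting proof can be produced.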
Synthesizing formally verified programs
- Here is a summary of some of the ways in which the ecosystem supports efficient execution of verified code:
- The proof assistants Coq and Agda do not compile programs directly to an efficiently executable form (though Coq can extract programs to OCaml or Haskell, which are then compiled conventionally)
- The language ATS provides proof facilities and purports to allow for programming with the efficiency of C and C++
- There are various means for transpiling code written in a mainstream language such as C or Haskell into a proof assistant, which allows for theorems to be proven about the extracted model of the source program
- You can synthesize an efficient binary program using Coq (e.g., using Fiat)
- The proof assistant ACL2 defines a subset of Common Lisp with a full formal logic. When a definition is executable, it can be compiled into efficient code, and because the language is a formal logic, you can define and prove theorems about the code
- There is a verifying compiler project, ATC, from ACL2 to C
- Imandra defines a subset of OCaml with a full formal logic and a theorem prover
Current limitations of formal methods on modern hardware
- In the context of modern computing, most computationally intensive tasks deal with vector math and other embarrassingly parallel problems which are done most efficiently on specialized hardware such as GPUs, FPGAs, and ASICs
- This is generally true of the problem of constructing proofs in probabilistic proof systems. Provers for these proof systems would be most efficient if implemented on specialized hardware, but in practice, they are usually implemented on CPUs, due to the greater ease of programming on CPUs and the greater availability of those skill sets in the labor market
- For creating a formally verified implementation of a probabilistic proof system which executes efficiently, it seems that the right goal is not to optimize for speed of execution on a CPU, but to target specialized hardware such as FPGAs, GPUs, or ASICs
- Unfortunately, tools for compiling formally verified programs to run on FPGAs, GPUs, or ASICs are more or less nonexistent as far as we know
The State of Current Progress
- Decades of research exists on formal verification strategies for arithmetic circuits in the context of hardware verification
- See, e.g., Drechsler et al (2022) and Yu et al (2016)
- This work has seen limited industrial application, e.g., the AAMP5 (see Kern and Greenstreet (1997), page 43)
- This line of research is not directly applicable to formal verification of arithmetic circuits for zk-SNARKs, because arithmetic circuits in hardware and arithmetic circuits in zk-SNARKs are not quite the same things
- Orbis Labs is working on:
- A verifying Halo 2 circuit compiler for Σ¹₁ formulas
- Expected to be working in Q4 2022 or Q1 2023
- Orbis Specification Language (OSL), which provides a high level spec language which we can compile to Σ¹₁ formulas
- A toolchain (Miya) for developing formally verified, hardware accelerated probabilistic proof systems
- A theory of interaction combinator arithmetization, towards compiling formally verified code into circuits
- No timeline on this; still in basic research
- Kestrel Institute is a research lab that has proved functional correctness of a number of R1CS gadgets using the ACL2 theorem prover. (Kestrel also does a lot of other FV work, including on Yul, Solidity, Ethereum, and Bitcoin)
- They formalized and proved the functional correctness of parts of the Ethereum Semaphore R1CS
- They formalized and proved the functional correctness of parts of the Zcash Sapling protocol. An overview of the process:
- Used the ACL2 proof assistant to formalize specs of parts of the Zcash Sapling protocol
- Formalized rank 1 constraint systems (R1CS) in ACL2
- Used an extraction tool to fetch the R1CS gadgets from the Bellman implementation, and then used the Axe toolkit to lift the R1CS into logic
- Proved in ACL2 that those R1CS gadgets are semantically equivalent to their specs, implying soundness and completeness
- Aleo is developing programming languages such as Leo that compile to constraint systems such as R1CS
- Aleo aims to create a verifying compiler for Leo, with theorems of correct compilation generated and checked using the ACL2 theorem prover
- Aleo has also done post-hoc verification of R1CS gadgets using Kestrel Institute’s Axe toolkit
- Nomadic Labs is a consulting firm that does a lot of work on Tezos; they built the Zcash Sapling protocol for shielded transactions into the Tezos blockchain as of the Edo upgrade. Kestrel Institute formally verified some of the R1CSes used in that protocol. (Nomadic Labs also does a lot of other FV work)
- The Anoma team is working on the Juvix language as a first step toward a more robust and reliable alternative to existing languages for formally verified smart contracts
- Andrew Miller and Bolton Bailey are working on a formal verification of a variety of SNARK proof systems, using the Lean Theorem Prover, in the Algebraic Group Model
- Alex Ozdemir from Stanford is working on adding a finite-field solver to the cvc5 SMT solver
- Lucas Clemente Vella and Leonardo Alt are working on SMT solving of polynomial equations over finite fields
- Veridise is working on:
- Ecne, a special-purpose automatic proof search tool which can prove that an R1CS constraint system defines a (total or partial) function
- Starkware is writing Lean proofs to check that circuits expressed as Cairo programs conform to their specs
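The property Ecne checks, that a constraint system defines a function from inputs to outputs, can be illustrated on a toy system (our own example; Ecne itself uses symbolic reasoning, not enumeration):

```python
# Toy illustration of "defines a function": a constraint system defines a
# function if each input assignment determines at most one output
# assignment. Checked exhaustively over a tiny field.

P = 7  # small prime field

# Constraints relating input a to output c via intermediate b:
#   a * a = b,   b + a = c   (so c is forced to equal a^2 + a)
def satisfies(a: int, b: int, c: int) -> bool:
    return (a * a - b) % P == 0 and (b + a - c) % P == 0

def defines_function() -> bool:
    for a in range(P):
        outputs = {c for b in range(P) for c in range(P) if satisfies(a, b, c)}
        if len(outputs) > 1:  # two distinct outputs for one input
            return False
    return True

print(defines_function())  # True
```

If an extra satisfying output existed for some input, the constraint system would be underconstrained, which is a common source of soundness bugs in hand-written circuits.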
How to analyze each research effort / development project
- You could analyze each research effort / development project in terms of
- Other interesting attributes of projects concerning FV for constraint systems are:
Future Research Directions
- A lot of work needs to be done. There is not enough emphasis placed on formal verification in the security industry
- Based on the observations and arguments presented in this blog post, we think the following will be some interesting directions for future research and development:
- Can we formally verify systems that are designed to automatically make ZK circuits more efficient?
- Improved specification languages and verified translators between specification languages
- Understanding how to create formally verified programs to run on vectorized hardware, e.g., FPGAs, GPUs, and/or ASICs
Terminology