# Integral evaluation in shielded pools

In the Multi-Asset Shielded Pool, the limited additional fungibility provided by the Convert circuit allows fungibility among tokens with distinct metadata, such as time value. For example, different “vintages” of the same token can be distinguished inside the pool by the epoch in which they were shielded. By re-introducing fungibility among these now-distinct vintages, a (public) conversion rate can be applied to tokens based on this metadata, so a different rate, or even a different primitive token type, applies to the conversion as a function of the metadata. The most direct application of this is “integral evaluation”: evaluating some function of the quantity of a token multiplied by the amount of time the token has existed in its present state, even though both the quantity and the time are private values.

The Convert circuit works well when this function changes frequently: conversion rates can be computed outside of the circuit (although doing so can still be computationally expensive), and only the latest conversion rate need apply. For example, if tokens can only be converted to the latest vintage, this is better for both privacy and efficiency.
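
For instance, with a hypothetical fixed per-epoch reward rate (the function name and compounding model here are illustrative, not part of any real implementation), the rates for converting each vintage into the latest one can be precomputed entirely out of circuit:

```python
# Hypothetical sketch: precompute conversion rates from each vintage into
# the latest vintage, assuming a fixed per-epoch reward rate. Only the
# rates into the latest epoch ever need to be published.

def conversion_rates(latest_epoch: int, reward_per_epoch: float) -> dict:
    """Rate for converting one token of vintage i into latest-vintage tokens."""
    return {
        i: (1 + reward_per_epoch) ** (latest_epoch - i)
        for i in range(latest_epoch + 1)
    }

rates = conversion_rates(latest_epoch=3, reward_per_epoch=0.05)
# A vintage-0 token held for 3 epochs converts at (1.05)**3; a
# latest-vintage token converts at 1.0.
```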

A more substantial challenge occurs when attempting to do this in a shielded pool that doesn’t apply such metadata to tokens, or that applies some other metadata. For example, an existing shielded pool probably wasn’t designed to allow this integral evaluation. Alternatively, it might be desirable to do the integral evaluation over an arbitrary range of time, not only one ending at the present. Unfortunately, over n epochs there are O(n^2) possible integration domains for which to precompute conversion rates, which is far too inefficient.

So let’s assume a slightly different context than the Convert circuit does:

1. Tokens don’t have metadata attached (of any kind) - only quantity and type
2. The function to compute is a simple function of quantity * time; for example, constant * quantity * time
3. For each note commitment (or nullifier) it is publicly known which epoch that note commitment (or nullifier) was revealed

Then we can define an integration circuit which takes as public input:

• A merkle root rt_cm of a tree with leaves (note commitment, i) where i is the epoch that note commitment was revealed
• A merkle root rt_nf of a tree with leaves (nullifier, j) where j is the epoch that nullifier was revealed
• A value commitment to the output of the function
• An authorization signature
• An integration nullifier nf_I

and checks the following:

• there exists a merkle path in rt_cm to a note commitment and epoch (cm, i)
• there exists a merkle path in rt_nf to a nullifier and epoch (nf, j)
• nf is the correct nullifier of cm
• the authorization signature was created using the spending key in cm
• the value commitment opens to c * v * (j - i) where c is a constant and v is the value (quantity) in cm
• nf_I is correctly derived from cm and nf

Then the authorization signature and nf_I are checked outside of the circuit for validation and uniqueness, respectively.
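
The relation above can be sketched in plain Python, with blake2s standing in for the actual in-circuit commitment and nullifier primitives (the real constructions are Pedersen/PRF based, and the authorization-signature check is omitted here). All names are illustrative, not a real API:

```python
# Plain-Python sketch of the statement the integration circuit would enforce.
from hashlib import blake2s

def h(*parts: bytes) -> bytes:
    d = blake2s()
    for p in parts:
        d.update(p)
    return d.digest()

def merkle_root(leaf: bytes, path: list, index: int) -> bytes:
    """Recompute a root from a leaf and its authentication path."""
    node = leaf
    for sibling in path:
        node = h(node, sibling) if index % 2 == 0 else h(sibling, node)
        index //= 2
    return node

def check_integration(rt_cm, rt_nf, value_commitment, nf_I,   # public inputs
                      cm, i, path_cm, idx_cm,                 # private witness
                      nf, j, path_nf, idx_nf,
                      v, c, rand, nullifier_key):
    # (cm, i) is a leaf of the note-commitment tree, (nf, j) of the nullifier tree
    assert merkle_root(h(cm, i.to_bytes(8, "little")), path_cm, idx_cm) == rt_cm
    assert merkle_root(h(nf, j.to_bytes(8, "little")), path_nf, idx_nf) == rt_nf
    # nf is the nullifier of cm (stand-in derivation)
    assert nf == h(nullifier_key, cm)
    # the value commitment opens to c * v * (j - i)
    assert value_commitment == h((c * v * (j - i)).to_bytes(16, "little"), rand)
    # nf_I is bound to both cm and nf, so each note can be integrated only once
    assert nf_I == h(b"integration", cm, nf)
    return True
```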

The value commitment can be used either as public input to another circuit, or possibly used to balance a shielded pool transaction, if the value represents a claim of some tokens.

A note on proof systems and curves: in addition to the problems described above, the proof systems and curves necessary for this present new problems, especially when combining different pools. Consider several concrete examples:

1. BLS12-381/JubJub based pools (Sapling, MASP). Since the note commitments are over the JubJub curve, an integration circuit must be implemented in the JubJub base field. If the proof system used is Groth16 over BLS12-381, then the integration circuit requires another trusted setup.
2. Pallas/Vesta based pools (Orchard, Taiga). An integration circuit wouldn’t need a new trusted setup, but also would not be immediately compatible with the value commitments output by a JubJub integration circuit (or vice versa)

We can try to mitigate this a little bit. The value commitment could be based on blake2s (or another hash), and an additional circuit on a different curve can convert the value commitment to that curve.
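
A sketch of this bridging idea (all names illustrative): if the value commitment is a blake2s commitment rather than a curve point, a circuit on either curve can prove knowledge of the same opening (value, r), assuming blake2s is available as an in-circuit hash on both sides:

```python
# Curve-agnostic value commitment sketch: the commitment lives in bytes,
# not on a particular elliptic curve, so circuits over different fields
# can each verify the same opening.
from hashlib import blake2s

def value_commitment(value: int, randomness: bytes) -> bytes:
    return blake2s(value.to_bytes(16, "little") + randomness).digest()

# A circuit over JubJub's base field proves it knows (value, r) opening vc;
# a second circuit on, say, Pallas proves the same opening and re-commits
# to value on its own curve, transporting the value without revealing it.
vc = value_commitment(42, b"\x01" * 16)
```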

Another mitigation would be to use Halo2 as the proof system for a JubJub-based integration circuit. BLS12-381 could be used with Halo2, at the expense of some inefficiency. Alternatively, a new elliptic curve could be found with prime order equal to the order of JubJub’s base field; this should be possible in principle, but so far no such curve is known.

Do we have any idea of what “inefficient” means in practice here? Generally, for integral evaluations, proofs only need to be made once (per user).

Just some rough guesses, but compared to using a curve over a 256-bit field, the proofs would be approximately 1.5x (~381/256) larger and computations probably ~2x slower.

And yeah, it’s possibly not worth optimizing this too much if users only need to compute proofs once per integral evaluation (they may need to compute one proof per note in the integration domain, which could be many notes), and similarly validators/full nodes only need to verify some bounded number of proofs in total (the number of notes that ever existed during the domain), which is known in advance.

I’ve got a couple of questions:

> a (public) conversion rate can apply to tokens based on this metadata and therefore a different rate or even a different primitive token type applies to the conversion as a function of the metadata.

1. Do I understand correctly that converting an old token to a new token might not happen at a 1:1 rate? Is it because users receive rewards for holding tokens?

2. So as I understand, the integration circuit allows one to prove that the committed value corresponds to an integral evaluation of a token that existed for `j - i` epochs. What can this information be used for, and what is the relationship between the Convert circuit and integral evaluation?

> Do I understand correctly that converting an old token to a new token might not happen at a 1:1 rate? Is it because users receive rewards for holding tokens?

Possibly, or it can be more general than that. It’s true that it doesn’t have to be at a 1:1 rate; in fact, the only real constraint on this rate is that it’s publicly known.

> So as I understand, the integration circuit allows one to prove that the committed value corresponds to an integral evaluation of a token that existed for `j - i` epochs.

Correct, the idea is that instead of doing some evaluation at precise moments in time, the evaluation happens over the integration domain, which might have (for example) better game-theoretic properties.

> What can this information be used for, and what is the relationship between the Convert circuit and integral evaluation?

The Convert circuit can be used to perform some integral evaluations, for example by tagging tokens with their vintage, but this requires support to be added ahead of time. The Convert circuit also computes the integral over an implicit domain (I might even say an implicit measure, mathematically speaking), where by implicit I mean the Convert circuit is not explicitly aware of the values `i`, `j`, or `j - i`, but only uses these values through precomputed AllowedConversion rates. This both simplifies the Convert circuit (the logic is simpler) and, since the measure is computed outside the circuit, allows integral evaluation of arbitrary functions, since both the domain and the function are just folded into the conversion ratio.

The proposal above is (in some ways) complementary. Tokens don’t need to be vintage tagged, since the circuit computes `j - i` directly. However, the function is not implicit (so the function to be integrally evaluated now needs to be computed in the circuit). For a simple linear function this is probably not a huge problem, e.g. if the function is the same across the whole domain, or at least has a compact circuit representation of some kind. But if the integrated function is highly time-dependent, e.g. `f(t, amount(t))`, then it may be more practical to compute it out of circuit and use conversion ratios.
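
As a toy contrast (hypothetical names and numbers), the Convert-style approach folds both the integration domain and the function into a single out-of-circuit ratio, so an arbitrarily time-dependent function costs nothing extra in constraints:

```python
# Toy illustration of the trade-off: with conversion ratios, an arbitrary
# time-dependent function f is integrated entirely outside the circuit,
# and only the resulting public ratio enters the (simple) Convert circuit.

def convert_ratio(f, i: int, j: int) -> float:
    """Sum f over epochs [i, j) -- computed out of circuit."""
    return sum(f(t) for t in range(i, j))

# A highly time-dependent reward schedule is as easy as a constant one here,
# whereas an integration circuit would have to express f in constraints:
ratio = convert_ratio(lambda t: 0.01 * (t % 5), i=2, j=7)
```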


> for example, by tagging tokens with their vintage

What does it mean to tag a token with its vintage?

I mean that the asset type of some note also includes a specific epoch (e.g. when the note was created), as a means of evaluating the timespan since that event.


After some discussions, it seems that the decision is that:

1. integral evaluation is interesting but presently not worth the effort to write the new circuits and infrastructure needed
2. It should be possible, theoretically, to find a non-pairing curve whose order equals the size of the base field of the curve used for value commitments and the merkle tree in the relevant evaluation domain; however, a long (~1 month) search using existing software (which claims to implement a polynomial-time algorithm for such a search) did not find any curves of interest. Since the constraints required for this sort of curve are not extensive, finding one should not require brute-force search (e.g. complex multiplication can be used instead). But since this curve was not readily generated, it might require more nontrivial analysis, or else more expensive pairing-based curves must be used instead. The total cost of using a pairing-based curve is at least bounded (if the integration domain is fixed), so perhaps that is feasible, but it would still be better to do it the right way.
3. An “instantaneous” evaluation is much easier, since it can use existing circuits; the method is roughly described below. There is a question of whether an instantaneous evaluation is representative of the state over time. While an instantaneous evaluation of the past might be less subject to distortion/game-theoretic effects, it could still be quite distorted by random chance. Taking several instantaneous evaluations might mitigate this.

Here’s the idea for how to use existing circuits to do an instantaneous evaluation. The “Convert” circuit is quite handy for this. Existing circuits can create or consume notes of their specific types. If the merkle root to be instantaneously evaluated is frozen and set as “consume only”, then the consumed notes’ value commitments can be combined with new notes of a different type, with the Convert circuit as a bridge between the two otherwise incompatible circuit domains. This presents some problems, however: for example, nullifier reveal during evaluation, and the question of how many notes from the evaluated merkle tree should be consumed at a time. Still, since this approach uses only existing circuits, it is simple to implement.