ART Report: Resource Machine

The report contains the specification of the Anoma resource machine abstraction. It should contain all the information needed to create a resource machine implementation (e.g., Taiga) with the desired properties (which should also be clearly defined), without being unnecessarily restrictive.

Looking forward to your feedback :slightly_smiling_face:

1 Like

I would prefer to limit Galois fields to prime order (which is trivial) or order 2^n, unless we have some reason for other prime powers? Arbitrary Galois field multiplication is not actually easy; even for 2^n we need to parametrize on a particular irreducible polynomial.
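
To make the parametrization point concrete, here is a minimal Haskell sketch of multiplication in GF(2^8); the reduction polynomial (here the AES polynomial 0x11B, chosen purely for illustration) is exactly the extra parameter that a prime-order field does not need:

import Data.Bits (shiftL, shiftR, testBit, xor)
import Data.Word (Word16, Word8)

-- Multiplication in GF(2^8) via carry-less ("Russian peasant") multiplication,
-- reducing modulo a fixed irreducible polynomial. The polynomial is a free
-- parameter of the representation; x^8 + x^4 + x^3 + x + 1 (0x11B, the AES
-- choice) is used here purely as an example.
gf8Mul :: Word8 -> Word8 -> Word8
gf8Mul a b = go (fromIntegral a) (fromIntegral b) 0
  where
    poly :: Word16
    poly = 0x11B
    go :: Word16 -> Word16 -> Word16 -> Word8
    go _ 0 acc = fromIntegral acc
    go x y acc =
      let acc' = if testBit y 0 then acc `xor` x else acc
          x'   = let s = x `shiftL` 1
                 in if testBit s 8 then s `xor` poly else s
      in go x' (y `shiftR` 1) acc'

main :: IO ()
main = print (gf8Mul 0x57 0x83)  -- 193 (0xC1) with this particular polynomial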

1 Like

Here are my preliminary notes. Note that I’m a non-expert when it comes to crypto stuff, so things that someone familiar with the field might take for granted, I cannot. I’m mostly coming at this from a mathematical perspective. If I wanted to formalize a resource machine as a concrete mathematical structure, what would that look like? To that end, I’m fixating mostly on ambiguities and things that I can’t make sense of, and giving some recommendations for clearer exposition along the way. I also just finished reading it, so my understanding of what a resource machine is supposed to be is undeveloped. Some of these might be obvious or pointless to others, but I hope this is helpful anyway.

Page 1:

We say a field to refer to a component of some composite structure

You never define what a composite structure is. It seems to be a record with named fields.

We say a field to refer to a component of some composite structure S

You’re not consistent about using the word β€œfield” instead of β€œcomponent”, and I don’t see a good reason to use this terminology. Why not just always say β€œcomponent”? You could then say β€œcomputable component” instead of β€œcomputed field”, etc.

A computed field of S is a value that is not a field of S but that can be derived solely from the composite structure S fields.

I think what this is trying to say is that the computed fields are trivially derivable from other fields. This sentence isn't correct, since later the computed fields are computed by applying hash functions to other fields, so not "solely": you also need the hash function. If a hash is always involved, maybe you should postpone the mention of computed fields until after hashes are mentioned. You could then define computed fields as hashes of other fields. Also make sure to mention that computed fields can be created from other computed fields (e.g. R.Δ), since the default interpretation of just "field" on its own would be the non-computed fields.

The grammar of this sentence suggests that β€œfields” is a verb, an action that S is performing, which is obviously wrong. So this sentence should be rephrased even if you don’t take my previous advice.

Let H be a class of hash function

Where does H live? Is it part of the data within the machine itself, or is this just defining a type of hash function?

such that the output of any function in H forms a finite field.

Is the range a field, or is the image a field? For example, can I have the outputs be, say, graphs, and then define a field over the finite set of graphs defined by the image of H? Are there restrictions on the input?

I’m also not sure about having 𝐻 at all. It’s never really used. It’s mentioned as containing the hash functions in the β€œResource Computable Fields” section, and that’s it. I don’t see why we need a symbol when all it refers to is the hash functions listed in that section, and we don’t ever need to talk about the set of those functions.

and the corresponding to h finite field is denoted as F_h

Rephrase to "and the finite field corresponding to h is denoted F_h".

We use π‘₯𝑠 (β€œπ‘₯-set”) to refer to a set of objects of the same type as π‘₯.

Is x a type? Or is x a component, like in the previous paragraph? Taken literally, this says that x should be an element of some type, and β€œxs” is referring to that type itself, but that’s awful. It would imply that, for example, β€œ25s” means the natural numbers since 25 is a natural number.

I also don’t think this line is that helpful, as this is just a naming convention, not a notation that needs to be explained. Where it appears, β€œxs” is just a name, it’s declared to be a set, and it doesn’t need further explanation, IMO. I spent too long trying to figure out what this line was saying when what it’s saying is already obvious from the context where it’s used.

Page 2:

π‘™π‘Žπ‘π‘’π‘™ : 𝐹_π‘™π‘Žπ‘π‘’π‘™

Many of the "F"s in the resource definition should be in blackboard bold (i.e. using \mathbb) but aren’t.

Resource Computable Fields

Are these hash functions the same across different resources? Or are they global; part of the definition of the resource machine? Or are they universal; the same across different resource machines? This is essentially the same question as before, where does β€œH” live?

Address r.addr = cm = h_cm(r)

Shouldn't it be "Address r.addr = r.cm = h_cm(r)"? Also, why have separate bullets? Can't one just say cm is also called addr since it's used as an address? Also, the "r.addr" notation is never used, so this bullet can be dropped entirely.

When a resource is created, its commitment is added to an append-only global commitment tree.

What is an β€œappend-only global commitment tree”? Is it a Merkle tree? Where does it live? Is it part of the resource machine, or is it an external thing the machine has access to?

Delta R.Δ = hΔ(R.kind, R.q)

Why is β€œR” capitalized here when it’s lowercase on the other bullets?

Proving system

Where do proving systems live? Are they part of the data within the resource machine, so every resource logic uses the same proving system? Or does every resource logic have its own proving system?

Remark 1. The hash function used to derive Δ must have additive homomorphic properties […]

The algebra should be spelled out explicitly through equations. Something like

  • hΔ(K, q1) + hΔ(K, q2) = hΔ(K, q1 + q2)

It should also be made clear what it's a homomorphism of. If "hΔ(K, 0) = 0" doesn't hold, then it's not a monoid homomorphism.

when deltas of two resources of different kinds are added, their quantities stay independent of each other

I don’t know how to interpret β€œtheir quantities stay independent of each other”, it’s not clear what that could mean.

π‘‰π‘’π‘Ÿπ‘–π‘“π‘¦(π‘£π‘˜, π‘₯, πœ‹) : 𝑃𝑆.π‘‰π‘’π‘Ÿπ‘–π‘“π‘¦π‘–π‘›π‘”πΎπ‘’π‘¦ Γ— 𝑃𝑆.πΌπ‘›π‘ π‘‘π‘Žπ‘›π‘π‘’ β†’ F_𝐡"

Should this have "PS.Proof" as one of its input types? Also, what is B? That's not mentioned before this. Did you mean F_b, for the boolean field?

A proof record carries the components required to verify a proof.

This is the same data that's taken as input for Verify, but in a different order. That's quite messy. Why not make Verify : ProofRecord → F_b? That way, you don't need to permute a proof record to verify it.
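
In (hypothetical) Haskell types, the suggestion is just the following; every name here is a placeholder of mine, not the report's:

-- Placeholder types standing in for the proving system's actual types.
newtype VerifyingKey = VerifyingKey String
newtype Instance     = Instance String
newtype Proof        = Proof String
type FBool = Bool  -- standing in for F_b

-- A proof record carries exactly the components required to verify a proof.
data ProofRecord = ProofRecord
  { prVerifyingKey :: VerifyingKey
  , prInstance     :: Instance
  , prProof        :: Proof
  }

-- Verify consumes the record directly, so no reordering of components is needed.
verify :: ProofRecord -> FBool
verify (ProofRecord vk x prf) = psVerify vk x prf
  where
    -- psVerify stands in for the proving system's PS.Verify.
    psVerify :: VerifyingKey -> Instance -> Proof -> FBool
    psVerify _ _ _ = True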

Page 3:

A partial transaction is a mutable composite structure

What does β€œmutable” mean? I know what it means in the context of programming, but I think this is trying to handwave something that really needs to be described explicitly. I’ll write about this more at the end of my note, but what actually is a resource machine, and how does its data evolve during execution? The answer to this question is given in disparate pieces implicitly throughout the paper but is never described explicitly. β€œmutability” can’t be made sense of without a more explicit description of the architecture. Where are the partial transactions stored so that they can be mutated, and what update function is being applied to perform the mutation? What does the global state transition look like; specifically where/when does it call the mutation update function? I suppose all that doesn’t need to be explained at this part of the paper, but it should be explained somewhere.

A partial transaction is a composite structure PTX = […]

Resources are defined in terms of a large product but partial transactions are not and a name for the type of partial transactions is never given. This is an inconsistency that should be easy to fix.

π‘Žπ‘›π‘  : {π‘Žπ‘› : Fπ‘Žπ‘›}

This notation is very confusing and suggests that "an" is an element of the set "F_an", so every "an" is part of a different set indexed by itself. Very strange. I think this is trying to say that "ans" is a set of "F_an"s, and that any particular element of this set is called "an". Same thing with "cms : {cm : F_cm}" and "nfs : {nf : F_nf}". In these lines, "an", "cm", and "nf" don't seem to actually refer to a specific element, but are just part of the naming schemes for example elements, so I think their mention can be dropped entirely without causing confusion. It may be helpful to explain where the names come from. "cm" standing for "commitment" is obvious, but where does "an" come from?

π‘Žπ‘›π‘  : {π‘Žπ‘› : Fπ‘Žπ‘›} is a set of roots (from different epochs) of the global resource commitment tree.

Is the "global resource commitment tree" the same as the "append-only global commitment tree" mentioned earlier? It would probably be a good idea to make symbols for all the global data structures and use those symbols when referencing them.

Δ_ptx : F_Δ is computed from Δ of input and output resources (signed addition).

Isn't the Δ computed using hΔ? Where does signed addition come in? And when and where are these requirements enforced? This is another place where the architecture is hinted at without being described explicitly. If Δ_ptx is a computed Δ, how is that done? If F_Δ is just a field, then the partial transaction can't seemingly know where it came from. Δ_ptx is just a field element, and it being some kind of difference must be enforced by something outside the partial transaction, and it's unclear what's doing that and how.

Alternatively, is this actually a computed field? If so, then it should be separated from the other fields, as it looks like data of the ptx that has to be checked for correctness rather than something computed from the ptx.
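
For what it's worth, my guess at what "signed addition" is supposed to mean (not confirmed by the report) is something like

Δ_ptx = Σ_{r ∈ created} hΔ(r.kind, r.q) − Σ_{r ∈ consumed} hΔ(r.kind, r.q)

with created resources counted positively and consumed ones negatively (or the other way around). If that's right, it should be written out, together with who computes it and who checks it.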

is a post-ordering execution function

What does β€œpost-ordering” mean?

q = 5 […] q = q_max = |F_q| − 1

F is not in blackboard bold. Also, where does this field come from? It doesn't look like it's used by any hash function. And why does F_5 have 6 elements, according to this?

Each partial transaction contains a set of resources to be consumed and a set of resources to be created.

This is not true according to the definition just given. Is this referring to the cms and nfs? Those are field elements, not resources. If I understand this properly, they are essentially names of resources that can be looked up to get the actual resources. However the resources themselves are not part of the partial transaction.

For each resource consumed or created in a partial transaction, it is required to provide a proof that the logic of the resource evaluates to 1 given the partial transaction parameters

The β€œresource logic” terminology is really confusing. It should probably be renamed to β€œresource constraint”, or something like that. It doesn’t make much sense for a logic to evaluate to 1.

Δ proof (balance proof) […]

It would be good to write this out explicitly in more mathematical terms. It's not clear what Δ composition is, how the Δs are extracted from resources, or how the check actually occurs. Is composition just a linear combination? The way it's written, it sounds like they are separate things (one is the value, while composition is something else that's not specified).

RM compliance proof […]

Being explicit in more mathematical terms about what this looks like would be good. There’s a lot of moving parts here, and a detailed list of descriptions is necessary, I think. Also, what is β€œRM”? Is this supposed to be β€œARM”, or whatever other acronym we settle on?

It also requires explicit check of the presence of all other required proofs.

Where do the other required proofs come from? Are they part of the data of the resource machine? Are they the same thing as the β€œmandatory inputs and checks” mentioned later in the Resource Logic section?

Page 4:

𝑒π‘₯𝑒𝑐𝑝𝑑π‘₯ = 𝐹 (𝑒π‘₯𝑒𝑐1, 𝑒π‘₯𝑒𝑐2) […] Φ𝑝𝑑π‘₯ = 𝐺(Ξ¦1, Ξ¦2)

It’s fine to have F and G as parameters for composition, but it would be good to have some guidance for what sort of things they would be in practice.

F is used elsewhere to represent fields. While I don’t think this will cause confusion, it may be wise to change these variable names so there’s no clash.

Remark 4. For parameters composed with the disjoint union operator, it has to be checked that those sets didn’t have any elements in common, otherwise the partial transactions cannot be composed.

Why? Disjoint unions generally label the elements of the two sets so there isn't an overlap; why can't that be done here, and why does that not apply to ans? Why do the disjoint unions need to be disjoint to begin with? There's a lot of unexplained intuition packed into this remark.
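
My best reading of the remark (an interpretation of mine, not something the report states): the "disjoint union" here is a plain set union that is only defined when the two sets share no elements, rather than a tagged/categorical disjoint union. In Haskell terms:

import Data.Set (Set)
import qualified Data.Set as Set

-- Compose two sets (e.g. the nullifier sets of two partial transactions)
-- only if they have no elements in common; otherwise composition fails.
composeDisjoint :: Ord a => Set a -> Set a -> Maybe (Set a)
composeDisjoint xs ys
  | Set.null (xs `Set.intersection` ys) = Just (xs `Set.union` ys)
  | otherwise                           = Nothing  -- the ptxs cannot be composed

main :: IO ()
main = do
  print (composeDisjoint (Set.fromList [1, 2 :: Int]) (Set.fromList [3, 4]))  -- Just (fromList [1,2,3,4])
  print (composeDisjoint (Set.fromList [1, 2 :: Int]) (Set.fromList [2, 3]))  -- Nothing

If that is the intended reading, it would help to state it in the remark itself.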

Executable. The system used to represent and interpret executables

Are these related to earlier EXECs in the PTXs? If so, the earlier declaration that EXEC = PTX → PTX is wrong, as what's being described in this new section is a first-order representation that can be interpreted as a function. Instead, there should be something like a function of type "eval : EXEC × PTX → PTX", and EXEC should actually be defined to be a kind of Nockma program. If I'm wrong, then a clarification that we're talking about different executables should be made.
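
In (made-up) types, the distinction I'm pointing at:

-- EXEC as a first-order representation (data) that the machine interprets.
-- All names here are mine; Noun is a stand-in for a Nockma noun and PTX for
-- the partial transaction structure.
data Noun = Atom Integer | Cell Noun Noun
type Exec = Noun  -- an executable is a program-as-data, not a Haskell function
data PTX  = PTX   -- fields elided

-- The machine supplies the interpreter for executables.
eval :: Exec -> PTX -> PTX
eval _code ptx = ptx  -- placeholder; a real interpreter would run the Nockma program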

READ_STORAGE(address : F_cm)

What does this actually return? An integer? An element of some other field? An actual resource? If that last one, is there a read function for nfs? If not, how does one get the resource associated with the hash generated by h_nf? That’s needed for the private input resources of a Resource Logic.

If the value is not found, the operation should return an error.

What does this mean? Again, the resource machine is being described implicitly, in parts, but the whole is not clear.

π‘…πΈπ‘†π‘‚π‘ˆ 𝑅𝐢𝐸𝑆_π΅π‘Œ _𝐼𝑁 𝐷𝐸𝑋(𝑖𝑛𝑑𝑒π‘₯_𝑓 π‘’π‘›π‘π‘‘π‘–π‘œπ‘›)

What are index functions and where do they come from? What’s an example of an index function? The only example of an index function mentioned in the paper appears on page 5, where it’s stated that β€œan index lookup uses the value stored at the address as an index function”. So, is an index function a function at all, or are you using a lookup table that contains addresses and calling that a function?

The executable might update partial transaction data fields,

What is an update? Again, this is implicitly describing the machine architecture without describing it explicitly. How are updates done? Presumably, this relates to the earlier comment on mutability. This is really a global architecture question. This section describes an update function, but where it lives (it’s part of the machine, right?) and how/when it’s used is less clear.

π‘”π‘π‘œπ‘’π‘›π‘‘ […] π‘”π‘™π‘–π‘šπ‘–π‘‘

Where are these variables stored? Are they part of the machine?

If π‘”π‘π‘œπ‘’π‘›π‘‘ > π‘”π‘™π‘–π‘šπ‘–π‘‘, the execution halts with an error indicating the gas limit was exceeded.

What are errors? What does it mean to halt? Error handling would be part of the machine definition, but, again, this is mentioned offhand without ever being described explicitly.

Recursive calls first increment the counter gcount, then recurse

Are executables even capable of recursion? It doesn’t seem like it, based on the Nockma description later on. If a pattern reduction step is executed, but it’s not recursion, just a step of evaluation, does that still increase gcount?

Page 5:

6.2. Resource Logic. A resource logic is a predicate […]

A Resource Logic represents a predicate, but it's not literally a predicate (that is, a function from a set to the booleans). We can extract a predicate from a resource logic, but it clearly has more structure.

The evaluation context of a Resource logic is a partial transaction.

This is contradicted by the later

  • input resources corresponding to the elements of nfs
  • output resource corresponding to the elements of cms

The partial transactions do not contain the resources themselves, only their names, so clearly the context must be more than just the partial transactions. Presumably, the context also includes the "global content-addressed storage" and "global resource nullifier set". These might be made accessible by functions in the programming language itself, but in section 6.1, "READ_STORAGE" is described as an IO function, and the beginning of 6.2 states that resource logics do not require IO, so I don't know.

Resource logics are customizable on both implementation of a resource machine and resource logic creation level. A concrete implementation of a resource machine can add more mandatory inputs and checks

This should really be spelled out explicitly by describing how information flows between/through the resource machine and the logic.

Here’s my best guess for how this should work: I think that a resource logic is a Nockma program with access to variables representing the cms, nfs, and actual resource data associated with them from the relevant partial transaction. A PS.Proof is a ZKP of the trace of the program attesting to it outputting β€œtrue”. And the resource machine has, as part of its data, a Nockma program that’s run prior to the resource logic program, and the ZKP should also verify its trace ends in β€œtrue” as well.

That’s my best guess for how this is supposed to work, but I’m not confident at all. Whatever the case is, it should be stated explicitly.
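
In type-sketch form, the picture I have in mind (every name here is my own invention and may not match the intended design):

-- Inputs a resource logic can see, per my guess above.
data Resource = Resource  -- fields elided
data LogicInputs = LogicInputs
  { liNfs       :: [Integer]   -- nullifiers from the ptx visible to this logic
  , liCms       :: [Integer]   -- commitments from the ptx visible to this logic
  , liResources :: [Resource]  -- the resource data behind them
  }

-- Conceptually both of these would be Nockma programs; Haskell functions are
-- used here only to show the shape.
type ResourceLogic   = LogicInputs -> Bool
type MandatoryChecks = LogicInputs -> Bool  -- added by the RM implementation

-- A PS.Proof would then attest that, for the given inputs, both the mandatory
-- checks and the resource logic evaluated to true.
statement :: MandatoryChecks -> ResourceLogic -> LogicInputs -> Bool
statement mandatory logic inputs = mandatory inputs && logic inputs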

Verifiability. It must be possible to produce and verify a proof of type PS.Proof that given a certain set of inputs the resource logic output is True

β€œTrue” is not a possible output according to the Nockma specification given later on. Should this be β€œ1”, representing true?

nfs ⊂ nfs_ptx

Why can’t the resource logic mention all the nullifiers or commitments? That is to say, why are these proper subsets instead of improper subsets?

custom inputs

How are custom inputs stored and read?

for each output resource, check that the corresponding cm value is derived according to the rules specified by the resource machine from its fields

What are the fields of a resource machine? That’s never described. Or is this the cm resource fields? If the latter, do fields need to be mentioned here?

Page 6:

Pattern

An explicit syntax for patterns would be helpful. It's clear that the syntax for patterns is different from the syntax of nouns, and some patterns don't have evaluation rules (for example, "?[a b c]"), and it never says what to do in such a case.

finite field F_n of order n

This contradicts the previous notation where non-blackboard bold F represents fields that are outputs of hash functions. I would recommend making this F, and all subsequent Fs in this section, blackboard bold.

division operation (floor division)

What happens when dividing by 0?

addition operation Zn × Zn → Zn × F2

F should be blackboard bold. Same for remaining operations mentioning F2.

the conversion function would return a flag (of type F2)

F should be blackboard bold.

Examples section

The examples provided are generally understandable, particularly in their depiction of transactions. However, they fall short in offering deeper insights beyond this basic case. It mentions green arrows indicating a state change. What state is being changed? The resource machine's? Where is this state being stored so that it can be changed? Presumably, there should be global something-or-others involved, but they go unmentioned. The compositions are also unclear; they are just handwaved away with unspecified Fs and Gs, leading to no additional insight. It's still unclear what the execs are, for instance, since none are described at all. None of the examples describe specific resource logics, meaning they fail to give examples of things like custom inputs, which are still unclear.

Discussion: What even is a resource machine?

Nowhere in the paper is a resource machine actually defined. Frequently, "global" data structures are mentioned, but it's never made clear whether they are part of the state of the machine or are some external data that the machine can read from. Because of this, what the machine contains is ambiguous. Here's a reasonable interpretation:

A resource machine contains the following data:

  • A finite set of hash functions, with at least the specific h_cm, h_nf, h_kind, and hΔ.
  • A Merkle tree for commitments
  • A table with Fcm entries containing resources
  • A table with Fnf entries containing resources
  • An integer gas counter
  • A set of partial transactions
  • A Nockma program representing mandatory constraints

such that, at each step, it takes an input from an external clock, and then executes each of its partial transactions. This step is the least clear. There should be some state transition function of some concrete form that does this. It seems like most of the paper is trying to describe the details of this, but the connective tissue that ties this all together isn’t really there. What I’d like to know is, given a collection of resources and a machine state, how do we get to the next machine state? A timeline describing the order in which every transformation to the state is done would be helpful.

I suspect that some of the things in the previous bullet list aren’t part of the machine, but external things the machine can read from (and I may have described some of these incorrectly), but this hopefully demonstrates ambiguity in the current paper. A more clearly articulated architecture described in terms of an explicit data structure would be extremely helpful.
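
For concreteness, here is the bullet list above written as a single record, a Haskell sketch of my interpretation only; every name is hypothetical, and several of these components may turn out to be external to the machine rather than part of it:

import Data.Map (Map)
import Data.Set (Set)

-- Placeholders so the sketch stands alone; none of these are defined by the
-- report in this form.
data HashFunctions = HashFunctions  -- at least h_cm, h_nf, h_kind, h_delta
data MerkleTree    = MerkleTree
data Resource      = Resource
data PTX           = PTX deriving (Eq, Ord)
data NockmaProgram = NockmaProgram
newtype Fcm = Fcm Integer deriving (Eq, Ord)
newtype Fnf = Fnf Integer deriving (Eq, Ord)

data ResourceMachine = ResourceMachine
  { rmHashes         :: HashFunctions
  , rmCommitmentTree :: MerkleTree        -- append-only commitment tree
  , rmByCommitment   :: Map Fcm Resource  -- storage keyed by commitment
  , rmByNullifier    :: Map Fnf Resource  -- storage keyed by nullifier
  , rmGasCounter     :: Integer
  , rmPtxs           :: Set PTX
  , rmMandatory      :: NockmaProgram     -- mandatory constraints
  }

What I would then want from the report is the type (and definition) of the state transition function over something like this record.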

2 Likes

The connective tissue is sort of out of scope of the paper. I'd go as far as to say it prescribes too much flow which isn't necessarily there. For example, I believe the EXEC part of the partial transaction is likely to be removed, as it's subsumed by transaction candidates. This is because the document is describing an abstract Taiga (the resource machine). The more connective flow of the system can be seen in

https://research.anoma.net/t/graphing-anoma-agents-v3/341/2

This is a rough graph of how the system flows as a diagram. Transaction candidates are created either by the user or by solvers and submitted to the mempool, which ends up spawning processes that start up the execution engine. From here the transaction candidates (Nockma code) evaluate and can read data stored in shards. At the end of execution there will be new state to be written to the shards. Ordering is decided by consensus, which can be read about here:

https://specs.anoma.net/alpha/main/operational-architecture/typhon.html

Thanks for the reviews so far (particularly @AHart - this is fantastic feedback). A few comments from my end (cc @vveiln):

Execution flow

As @mariari mentions, I think we can remove EXEC and stipulate that the Nockma code must simply return a full partial transaction structure - and all we need to define is the verification function (no need to define mutation functions) over PTX.

In that case, we might also want to clarify execution flow. Perhaps we can have a parameter which is passed to the Nockma code - something like:

data ExecTime = PreOrdering | PostOrdering

data ExecInfo = ExecInfo {
  execTime :: ExecTime,
  execWho :: ExternalIdentity
}

In both pre-ordering and post-ordering execution, the Nockma code must return a partial transaction, but the partial transaction returned in pre-ordering execution need not be balanced - this is what the solver would then examine in an attempt to solve (and compose the involved partial transactions, then forward their composition).

The question naturally emerges of whether and how to compose the Nockma code segments of two different partial transactions (or whatever we call the message that contains the Nockma code which returns a partial transaction). For now, let’s not worry about this, and say that the naive solver simply takes partial transaction codes A and B, calculates A, calculates B, and if they can be matched (and balanced) as C returns new Nockma code that simply returns C. In other words, we can worry about hybrid post-ordering matching later.
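
As a type sketch of that naive solver (all names hypothetical, reusing ExecInfo from above):

data ExecTime = PreOrdering | PostOrdering
newtype ExternalIdentity = ExternalIdentity String
data ExecInfo = ExecInfo { execTime :: ExecTime, execWho :: ExternalIdentity }
data PTX = PTX  -- stand-in for the partial transaction structure

-- "Nockma code that returns a partial transaction", abstracted as a function.
type PtxCode = ExecInfo -> PTX

-- Run both codes, try to match and balance the results, and if that yields a
-- composed partial transaction C, return new code that simply returns C.
naiveSolve :: ExecInfo -> PtxCode -> PtxCode -> Maybe PtxCode
naiveSolve info codeA codeB = do
  c <- matchAndBalance (codeA info) (codeB info)
  pure (const c)

-- Stand-in for the real matching and balancing logic.
matchAndBalance :: PTX -> PTX -> Maybe PTX
matchAndBalance _ _ = Nothing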

One problem with passing parameters to the Nockma code like this is that the parameter values aren’t verifiable ex post, so they cannot be checked by resource logics. We might also want to have β€œsentinel resources” which contain these values if we want resource logics to be able to check them.

Distributed partial evaluation

Relatedly, I think it may be helpful to visualize and articulate how this resource machine, when run in a distributed network, implements what I think can be understood as a form of distributed partial evaluation.

  • resources <> memory states
  • resource kinds <> memory cells
  • partial transactions <> evaluation states

For example (diagram could be greatly expanded), understanding #A, #B, #D, and #E as kinds of resources previously created, #C and #F as kinds of ephemeral resources, and res as some result, the following three partial transactions perform a distributed reduce operation:

Further input here from the PLT folks (cc @jonathan) would probably be helpful; I'm sure I'm butchering the language, but I think the analogy holds, and it's worth making explicit.

Resource machine

I agree with @AHart that it would be helpful to more explicitly define a resource machine. In this case, the machine is in fact deterministic and need only process one transaction at a time, executing the Nockma code, validating the returned partial transaction, and either applying the updates (should the partial transaction be valid and balanced) or terminating without any changes (should the Nockma code run out of gas or the partial transaction be either invalid or unbalanced). Anything about sharding should be out of scope, though - that is a detail (feature) of Typhon’s execution engine, which preserves serializability - meaning that we need not care here.
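
As a sketch of that per-transaction step (my paraphrase of the above; every name is hypothetical):

-- Run the Nockma code under a gas limit, validate the returned partial
-- transaction, and either apply it or leave the state untouched.
data MachineState = MachineState
data TxCandidate  = TxCandidate  -- Nockma code returning a partial transaction
data PTX          = PTX
data Failure      = OutOfGas | InvalidPtx | Unbalanced

step :: MachineState -> TxCandidate -> Either Failure MachineState
step st tx = case runNockma 1000000 st tx of  -- 1000000: placeholder gas limit
  Left err  -> Left err
  Right ptx
    | not (valid st ptx) -> Left InvalidPtx
    | not (balanced ptx) -> Left Unbalanced
    | otherwise          -> Right (apply st ptx)

-- Stand-ins for the actual machinery.
runNockma :: Integer -> MachineState -> TxCandidate -> Either Failure PTX
runNockma _ _ _ = Right PTX
valid :: MachineState -> PTX -> Bool
valid _ _ = True
balanced :: PTX -> Bool
balanced _ = True
apply :: MachineState -> PTX -> MachineState
apply st _ = st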

Might have more thoughts tomorrow.

1 Like

@AHart seems to have covered almost all my comments (and much more!), so I have just three:

Ptx Arity
Does the ARM spec put any restrictions on the numbers of input and output resources? Is it theoretically allowed for ARM to have different numbers of inputs and outputs, e.g., 2 inputs and 3 outputs? Would this break anything?

Page 3
Section 5.1:

A RM compliance proof is required to…

This is the first time the abbreviation β€œRM” occurs in the document. I think it stands for Resource Machine, but since this abbreviation is not encountered anywhere except here, it’s probably better to just write that explicitly.

Page 5
Section 6.2:

Resource logics take as input a subset of resources created and consumed in the partial transaction

A subset of what? Resource logic takes all resources comprising a partial transaction, I think?

Resource Logic Public Inputs:

  • nfs ⊂ nfs_ptx
  • cms ⊂ cms_ptx
  • Custom inputs

Resource logic also needs an indication of which resource it is the logic of. This can be assumed to live under custom inputs, but we may want to mention it explicitly (we don’t have to define the exact mechanism of such indication in the spec).

1 Like

Generally ptxs don't have size bounds anymore, and right now I can't see how having different numbers of input and output resources would break anything, but it is definitely a good idea to think it through.

Not anymore

Why do you think it would be helpful? What is the context around it?

Not anymore

How does it work then? This is not immediately clear from the spec, as far as I can tell, but maybe I missed something. What determines which resources are passed as inputs?

What determines which resources are passed as inputs?

I have the same confusion. What’s the context that resource logic can access in a ptx?
I tried to let the resource logic access only the owned and concerned resources but failed. Maybe you have a good solution now?

The one we had in mind is to use a similar mechanism to how the β€œowner” resource is currently determined in Taiga. Would there be a problem with that (forgetting about efficiency)?

The efficiency is not a problem.
If the resource logic wants to access a potential target resource in circuits, the circuit must load all the resources in the ptx and search for the target.
I haven't figured out a good solution where the circuit only loads a subset of the resources.

Ah, true. So, what if we just pass only some resources from the ptx in and apply the checks to them only? It won't support all possible checks (namely, we can't check that the partial transaction doesn't include something, or that some statement holds for all resources in a ptx), but it seems that it is a good enough subset, at least for now (discussed in more detail here: Forall / negative checks in resource logics). If logics contain statements like "there must be a resource in the ptx that satisfies x", whoever creates the proof will have to provide all the sufficient resources from the ptx to have the logic return true.

If the resource logic doesn’t treat the resources equally, I’m unsure if it would leak some relation information among resources.
Besides, for now, function privacy is under the assumption that all the resource logics/circuits have the same layout.

For example, we could pass a hash of all of the included resource commitments (in a mini Merkle tree), and then the resource logic could check inclusion of (and properties of) specific ones, without leaking any info, right?
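
A toy sketch of that idea (not a real construction; the "hash" here is a placeholder and all names are made up): build a small Merkle tree over the resource commitments in the ptx, expose only the root as a public input, and let the logic check membership of the specific commitments it cares about via an authentication path.

-- Toy Merkle tree over resource commitments.
data Tree = Leaf Integer | Node Integer Tree Tree

hashPair :: Integer -> Integer -> Integer
hashPair a b = (31 * a + b) `mod` 1000000007  -- placeholder, not a real hash

root :: Tree -> Integer
root (Leaf h)     = h
root (Node h _ _) = h

build :: [Integer] -> Tree
build []  = error "build: no commitments"
build [h] = Leaf h
build hs  = let (l, r)   = splitAt (length hs `div` 2) hs
                (tl, tr) = (build l, build r)
            in Node (hashPair (root tl) (root tr)) tl tr

-- Check an authentication path (sibling hashes from leaf to root, each tagged
-- with whether the sibling sits on the left) against the public root.
verifyPath :: Integer -> [(Bool, Integer)] -> Integer -> Bool
verifyPath leaf path rt = foldl combine leaf path == rt
  where combine acc (sibOnLeft, sib)
          | sibOnLeft = hashPair sib acc
          | otherwise = hashPair acc sib

main :: IO ()
main = do
  let t = build [1, 2, 3, 4]  -- four resource commitments (toy values)
  print (verifyPath 2 [(True, 1), (False, root (build [3, 4]))] (root t))  -- True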

The point is that the resource logic should also publicize the nullifiers or note commitments in the subset (the specific ones).

Not sure that I follow. We do want function privacy to hide which other resource commitments were checked by a specific resource logic, but the public inputs should give the resource logic the required information to check included commitments (e.g. a Merkle tree of the commitments in the transaction).

namely, we can’t check that the partial transaction doesn’t include something or some statement holds for all resources in a ptx

That makes sense if we are under this condition.
So we won't support the whitelist/blacklist examples, in which we need to check all resources? I remember it was a classic example we wanted to achieve. And it might be reasonable for a user to refuse to receive a specified token, or to refuse tokens from some senders.

The resource logic needs to publicize some information to help check that the resources passed to the logic are actually from the ptx.

I agree we can use a hash/Merkle tree to encode all the resources in the ptx, and the resource logic can decode the hash/Merkle tree and use the specific ones. In this way, only the hash/Merkle root goes into the public inputs.

However, circuits cannot decode/accept a variable number of resources. If the number of resources passed to the logic changes, the circuit and resource kind change. In general, we have the resource and its logic before the number of resources in the ptx is determined.

We won’t support this for now. In order to support this and preserve privacy, we need a slightly different implementation (see the discussions here) - curious how complicated you think this would be. If it’s easy we can scope it in to the next version of the resource machine report.