In the LF<->JSON representation used in the JSON API, we should keep tuples as-is, for fidelity with LF values, LF types, and the Java and Scala codegen representations, as well as simplicity of mapping.
I don't agree. I believe the burden is on us to present interfaces that are idiomatic for the client's host environment. Maintaining simplicity of mapping benefits us as the developers of DAML at the expense of our clients. I don't see how it's reasonable to argue to a typescript developer that the DAML template with a tuple parameter needs to be fed a record with fields _1 and so on because of "fidelity" with our internal representation of values.
Similarly to the code sample in the description, you might write:
function retuple2<A, B>(ab: { _1: A, _2: B }): [A, B] {
return [ab._1, ab._2];
}
const json = await this.submit('command/exercise', payload);
// Decode the server response into a tuple.
const responseDecoder: jtv.Decoder<[R, Event<unknown>[]]> =
jtv.object({ _1: choice.resultDecoder(), _2: jtv.array(decodeEventUnknown()) })
.map(retuple2);
const result: [R, Event<unknown>[]] = jtv.Result.withException(responseDecoder.run(json));
return result;
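Going the other way, for commands whose arguments contain a tuple, a similar helper is needed to encode a native TS tuple into the record shape. A minimal sketch (`detuple2` is an invented name, not part of the JSON API):

```typescript
// Hypothetical helper: encode a native TypeScript tuple into the {_1, _2}
// record shape that the JSON API expects in command arguments.
function detuple2<A, B>([a, b]: [A, B]): { _1: A; _2: B } {
  return { _1: a, _2: b };
}

// e.g. a choice argument of DAML type (Int, Text):
const argument = detuple2([42, "memo"] as [number, string]);
// argument is { _1: 42, _2: "memo" }
```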
I don't see how it's reasonable to argue to a typescript developer that the DAML template with a tuple parameter needs to be fed a record with fields _1 and so on because of "fidelity" with our internal representation of values.
Because it's not strictly an "internal" representation of values. The LF-ness of the ledger is a deliberately public feature of it, and the foundation of its future-proofness. The surface language is free to change incompatibly because the permanent stability of verifiable transactions is based upon LF-defined packages and transactions, not the surface language, which is thus free to evolve for new applications.
The LF-ness of the ledger is a deliberately public feature of it, and the foundation of its future-proofness.
The point of DAML is to abstract over the ledger. As a DAML full-stack developer I'm surprised to even have to know about the existence of DAML-LF, let alone have to understand this third language in order that I might design my client applications to account for it!
- Docs for the translation from DAML to JSON: https://docs.daml.com/app-dev/daml-lf-translation.html and https://docs.daml.com/json-api/lf-value-specification.html
- To understand data transfer to/from DAML and JSON you need to understand two things:
- How DAML values are translated to/from DAML-LF;
- How DAML-LF values are translated to/from JSON.
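To make the two-step translation concrete, here is an illustrative sketch (hedged: the exact JSON encoding of each primitive type is defined by the LF value specification linked above) of how a DAML pair surfaces in TypeScript:

```typescript
// Illustrative only: a DAML value (1, "abc") of type (Int, Text) is lowered
// to an LF record (a Tuple2 with fields _1 and _2), which the JSON API then
// serialises as a plain JSON object keyed by those field names.
const damlPairAsJson = { _1: 1, _2: "abc" };

// Consuming it from TypeScript therefore means field access, not indexing:
const first: number = damlPairAsJson._1; // not damlPairAsJson[0]
const second: string = damlPairAsJson._2;
```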
Going a bit beyond the topic at hand, but I think the above is a mistake. I think it unfortunate that to implement interoperability with DAML you need to reference DAML-LF documentation. I believe this is a barrier to entry and at odds with the simplicity we are trying to sell.
The point of DAML is to abstract over the ledger. As a DAML full-stack developer I'm surprised to even have to know about the existence of DAML-LF, let alone have to understand this third language in order that I might design my client applications to account for it!
It may be surprising, but the impermanence of DAML means that it cannot do the job of abstracting over the ledger by itself. It relies upon LF to meet the requirements it cannot itself satisfy.
I think it unfortunate that to implement interoperability with DAML you need to reference DAML-LF documentation. I believe this is a barrier to entry and at odds with the simplicity we are trying to sell.
I think it's worth thinking about in this context, rather than necessarily being off-topic. LF imposes requirements like "only nominal record types are allowed in contracts" so the mapping to Java and other languages can be obvious. Insofar as DAML wishes to exceed those constraints, those features must be mapped down to fit the LF constraints. Thus you see records with _1, _2 fields in Java codegen, Scala codegen, the gRPC value representation, and pretty much anything else on the non-DAML side interacting with the ledger.
So, knowing the DAML<->LF mapping (pretty simple, I would say, though perhaps you disagree, or perhaps that is a matter of my personal involvement), a developer already knows how data is represented in all these contexts, because the restricted rules of LF, which can be represented successfully in all of them, hold for all of them.
Similarly to the code sample in description, you might
Noted. That indeed does smooth things over a bit if we can't do better.
We should keep the current behavior. It maybe doesn't produce the most idiomatic JS/TS, but using TS tuples would be even more confusing. Right now, you access the first element of a tuple x using x._1 in both DAML and JS/TS. If we made the change @shayne-fletcher-da suggested, we'd have to use x[0] in JS/TS instead. Being one-indexed in DAML and zero-indexed in JS/TS could cause a lot of confusion for almost no benefit.
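A sketch of the off-by-one confusion described above (variable names invented): the same component is addressed as `_1` under the current record encoding, but as index `0` if tuples were mapped to native TS arrays:

```typescript
// Current JSON API shape: a record with one-indexed field names.
const asRecord = { _1: "a", _2: "b" };

// Proposed alternative: a native, zero-indexed TS tuple.
const asArray: [string, string] = ["a", "b"];

// The same first component under each representation:
const fromRecord = asRecord._1; // "1" in the field name
const fromArray = asArray[0];   // index 0
```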
I'm aware that the construction of tuples on the JS/TS side is a bit clumsy, but I don't think the issue is big enough to warrant special-casing the JSON API for tuples. IMO, we should teach people not to use tuples in template types and choice argument types, since it's bad practice in any case. When you use tuples in choice return types, you don't need to create them on the JS/TS side anyway.
@shayne-fletcher-da If you don't feel strongly about this, could you please close this ticket?
DAML has tuples, as does TypeScript. DAML-LF does not. Normally TypeScript values convert seamlessly to and from DAML values (ints, strings, lists, etc.). As a TypeScript/DAML developer I'm surprised that my DAML tuple interface must be encoded/decoded as a record; it's unnatural.
Example of the sort of code you end up having to write to work around this:
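(The original example is not reproduced in this excerpt; a hypothetical sketch in its spirit, with invented names, might be:)

```typescript
// Hypothetical: exercising a choice whose argument is a DAML tuple forces the
// caller to hand-build the {_1, _2} record rather than pass a tuple directly.
async function exerciseWithPair(
  submit: (payload: unknown) => Promise<unknown>, // stand-in for the real submit
  pair: [number, string],
): Promise<unknown> {
  // What reads naturally as `pair` must be re-spelt as a record:
  return submit({ argument: { _1: pair[0], _2: pair[1] } });
}
```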
What should we do? (Some notes from an earlier discussion in this ticket.)