a-sully opened 2 weeks ago
| `operator.param` | |
| --- | --- |
| `gru.weight` | Not used |
| `gru.recurrentWeight` | Not used |
| `gru.bias` | Not used |
| `gru.recurrentBias` | Not used |
FYI, these operands (constants only) are used by the NSNet2 (noise suppression) example: https://github.com/webmachinelearning/webnn-samples/blob/master/nsnet2/nsnet2.js#L51
My apologies for the error. I've updated the respective rows (they're also all "Constant only")
Turns out XNNPACK also requires some tensors to be constants - for example `prelu.slope`. The Chromium implementation currently fails if the `slope` operand was not created as a `constant()`.
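To make the failure mode concrete, here is a minimal toy model of that check (not the real WebNN API - `FakeOperand`, `fakeBuilder`, and the exposed `kind` property are all illustrative; the real `MLOperand` does not expose a kind, and this only mirrors the internal bookkeeping described above):

```javascript
// Toy model: each operand records how it was created.
class FakeOperand {
  constructor(kind) {
    this.kind = kind; // 'input' | 'constant' | 'output'
  }
}

const fakeBuilder = {
  input: (_name, _desc) => new FakeOperand('input'),
  constant: (_desc, _buffer) => new FakeOperand('constant'),
  // A backend like XNNPACK requires the slope tensor to be a constant,
  // so reject any slope operand that was not created via constant().
  prelu(x, slope) {
    if (slope.kind !== 'constant') {
      throw new TypeError('prelu: slope must be created via constant()');
    }
    return new FakeOperand('output');
  },
};

const x = fakeBuilder.input('x', { dataType: 'float32', shape: [1, 4] });
const constSlope = fakeBuilder.constant(
  { dataType: 'float32', shape: [4] },
  new Float32Array([0.1, 0.1, 0.1, 0.1])
);
console.log(fakeBuilder.prelu(x, constSlope).kind); // 'output'

let threw = false;
try {
  fakeBuilder.prelu(x, fakeBuilder.input('slope', {})); // not a constant
} catch (e) {
  threw = e instanceof TypeError;
}
console.log(threw); // true
```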
The `constant()` operator is special. Constants are effectively inputs that are known at the time of graph compilation (i.e. `build()`).

You may ask: why do we need a separate method when we could just pass this constant data as an `input()`?

Most discussions I've seen so far about `constant()` (https://github.com/webmachinelearning/webnn/issues/614#issuecomment-2019116877, this thread about the since-abandoned "fill sequence" overload, etc.) have been concerned with optimizing performance. There are a number of compile-time optimizations a backend may perform if it knows that some data is constant.

Our experience with CoreML has given us another reason:
**The backend requires that a parameter to an operator must be constant**
Take `conv2d()`, for example. It's defined for CoreML here. The `bias` parameter is defined as:

```
bias: const tensor<[C_out],T>
```
Meanwhile, WebNN allows this to be any `MLOperand`, as long as it's of the appropriate shape and dtype.

This appears to be a straightforward plumbing-through of DML's interface, which does not require the `BiasTensor` to be a constant. Neither does the corresponding operator for TFLite. From what I can tell, this seems to be because these frameworks don't have a way to express that some input tensor must be const. The options are either to pass the parameter as the framework's generic representation of a Tensor - which would in practice always(?) be created from a `constant()` - or to pass the parameter as a 1D array directly. If the parameters may be large (and perhaps unbounded), the former is the more appropriate choice.

To get a sense for whether this is a reasonable hypothesis, I've inspected all† uses of the affected operators in the WebNN Samples repo:
| `operator.param` |
| --- |
| `batchNormalization.mean` |
| `batchNormalization.variance` |
| `batchNormalization.scale` |
| `batchNormalization.bias` |
| `conv2d.bias` |
| `convTranspose2d.bias` |
| `gru.weight` |
| `gru.recurrentWeight` |
| `gru.bias` |
| `gru.recurrentBias` |
| `instanceNormalization.scale` |
| `instanceNormalization.bias` |
| `layerNormalization.scale` |
| `layerNormalization.bias` |
| `lstm.weight` |
| `lstm.recurrentWeight` |
| `lstm.bias` |
| `lstm.recurrentBias` |
| `lstm.peepholeWeight` |
| `prelu.slope` |
† This list only includes WebNN operators which trivially map to CoreML operators. WebNN operators which need to be expressed in terms of other CoreML operators will be subject to the restrictions of those respective CoreML operators. For example, CoreML doesn't have operators for `gruCell` or `lstmCell`, so these operators will need to be implemented in terms of `gru` and `lstm`, respectively. These operators will in turn need many of their parameters to be const, as well.

†† One caller passes the result of a `reshape`... but that's only because the sample was written before `constant()` took an `MLOperandDescriptor`. The `reshape` is just assigning dimensions to a `constant()`. Nowadays we'd just pass the `constant()` directly.

Remarkably, in every single instance where one of these params is used in the WebNN Samples, it was created from a `constant()`. Cool!

Of course, this is not close to a comprehensive list of all the models we hope to run with WebNN. That being said, if there are no significant known use cases for passing any of these parameters as non-constant tensors - if their non-constness is simply a limitation in the framework and there are no useful reasons to pass non-const tensors - I think there's a reasonable argument that WebNN should require these parameters to be constants. @fwdr could you perhaps provide some more color here? :)
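As an aside, the compile-time-optimization argument mentioned earlier in this post can be sketched with a toy constant-folding pass (illustrative only - none of these names are WebNN API; it just shows why a backend benefits from knowing at `build()` time that data is constant):

```javascript
// Toy graph nodes: constants carry their value; inputs are placeholders.
const constant = (value) => ({ kind: 'constant', value });
const input = (name) => ({ kind: 'input', name });

function add(a, b) {
  // If both operands are constants, the backend can fold the operation
  // at graph-build time instead of emitting a runtime op.
  if (a.kind === 'constant' && b.kind === 'constant') {
    return constant(a.value + b.value);
  }
  return { kind: 'op', op: 'add', inputs: [a, b] };
}

const folded = add(constant(2), constant(3));
console.log(folded.kind, folded.value); // constant 5 -- no runtime add

const deferred = add(input('x'), constant(3));
console.log(deferred.kind); // 'op' -- must be evaluated at inference time
```

The same data passed as an `input()` would defeat this, since its value is unknown until dispatch.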
It seems that we have the following options to support each of these operators on CoreML:
1. Specify that `operator.param` must be a constant `MLOperand` (my tentative preference)
   - e.g. add an `MLConstantOperand` interface which extends `MLOperand`, and specify that `param` takes an `MLConstantOperand`
   - or specify that an `MLOperand` has a "kind", as the Chromium implementation already does, and throw a `TypeError` if not a "constant" kind. This may be confusing to developers
2. Specify `operator.param` as a `sequence<MLNumber>`
3. Fail if `operator.param` is not a constant only on CoreML
   - This will fail at `build()` on Chromium, though we could conceivably make this a synchronous check on the respective builder method, especially if we have defined procedures for querying for backend-specific support (see #463)

Thoughts?
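For concreteness, the `MLConstantOperand` variant of option 1 might look something like the following in WebIDL. This is a sketch only - the interface name, which builder methods return the subtype, and which parameters are retyped are all undecided:

```webidl
// Sketch, not spec text: a subtype that can only be produced by constant().
interface MLConstantOperand : MLOperand {};

partial interface MLGraphBuilder {
  // constant() would return the subtype...
  MLConstantOperand constant(MLOperandDescriptor descriptor,
                             ArrayBufferView bufferView);

  // ...and const-required parameters would be typed against it, so a
  // non-constant operand is rejected by the binding layer up front.
  MLOperand prelu(MLOperand input, MLConstantOperand slope);
};
```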