webmachinelearning / webnn

🧠 Web Neural Network API
https://www.w3.org/TR/webnn/

Do we need an `MLConstantOperand`? #668

Open · a-sully opened this issue 2 weeks ago

a-sully commented 2 weeks ago

The constant() operator is special. Constants are effectively inputs that are known at the time of graph compilation (i.e. build()).

You may ask, why do we need a separate method when we could just pass this constant data as an input()?
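To make the distinction concrete, here's a minimal sketch (operand names, shapes, and data below are purely illustrative): an input()'s data is only bound when the graph is computed, while a constant()'s data is already in the implementation's hands when build() runs.

// Minimal sketch; names, shapes and values are illustrative only.
const context = await navigator.ml.createContext();
const builder = new MLGraphBuilder(context);

const desc = {dataType: 'float32', dimensions: [2, 2]};

// input(): a named placeholder whose data is supplied later, at compute time.
const x = builder.input('x', desc);

// constant(): the data is part of the graph itself, so the backend can see it
// at build() time and fold it into whatever compile-time optimizations it has.
const w = builder.constant(desc, new Float32Array([1, 2, 3, 4]));

const y = builder.matmul(x, w);
const graph = await builder.build({y});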

Most discussions I've seen so far about constant() (https://github.com/webmachinelearning/webnn/issues/614#issuecomment-2019116877, this thread about the since-abandoned "fill sequence" overload, etc) have been concerned with optimizing performance. There are a number of compile-time optimizations a backend may perform if it knows that some data is constant.

Our experience with CoreML has given us another reason:

The backend requires that a parameter to an operator must be constant

Take conv2d(), for example. It's defined for CoreML here, and the bias parameter is declared there as a const tensor.

Meanwhile, WebNN allows this to be any MLOperand, as long as it's of the appropriate shape and dtype:

dictionary MLConv2dOptions {
  // ...
  MLOperand bias;
  // ...
};

This appears to be a straightforward plumbing through of DML's interface, which does not require the BiasTensor to be a constant. Neither does the corresponding operator for TFLite. From what I can tell, this seems to be because these frameworks don't have a way to express that some input tensor must be const. The options are either to pass the parameter as the framework's generic representation of a Tensor - which would in practice always(?) be created from a constant() - or to pass the parameter as a 1D array directly. If the parameters may be large (and perhaps unbounded), the former is the more appropriate choice.
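Concretely, the pattern the samples end up following looks roughly like this (a sketch; the shapes and variable names are illustrative rather than taken from any particular sample, and builder is created as in the earlier sketch):

// Sketch only; shapes and data are illustrative.
const input = builder.input('input', {dataType: 'float32', dimensions: [1, 3, 224, 224]});

const filterData = new Float32Array(64 * 3 * 7 * 7);
const biasData = new Float32Array(64);

const filter = builder.constant({dataType: 'float32', dimensions: [64, 3, 7, 7]}, filterData);
const bias = builder.constant({dataType: 'float32', dimensions: [64]}, biasData);

// Nothing in the WebIDL above prevents `bias` from being the output of some
// other operator, but CoreML's conv op would reject that.
const output = builder.conv2d(input, filter, {bias, strides: [2, 2]});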

To get a sense for whether this is a reasonable hypothesis, I've inspected all† uses of the affected operators in the WebNN Samples repo:

operator.param Usage in WebNN Samples
batchNormalization.mean Constant only
batchNormalization.variance Constant only
batchNormalization.scale Constant only
batchNormalization.bias Constant only
conv2d.bias Constant only
convTranspose2d.bias Not used
gru.weight Constant only
gru.recurrentWeight Constant only
gru.bias Constant only
gru.recurrentBias Constant only
instanceNormalization.scale Constant only††
instanceNormalization.bias Constant only††
layerNormalization.scale Not used
layerNormalization.bias Not used
lstm.weight Not used
lstm.recurrentWeight Not used
lstm.bias Not used
lstm.recurrentBias Not used
lstm.peepholeWeight Not used
prelu.slope Not used

†This list only includes WebNN operators which trivially map to CoreML operators. WebNN operators which need to be implemented in terms of other CoreML operators will be subject to the restrictions of those respective CoreML operators. For example, CoreML doesn't have operators for gruCell or lstmCell, so these operators will need to be implemented in terms of gru and lstm, respectively. These operators will in turn need many of their parameters to be const as well.

††One caller passes the result of a reshape... but that's only because the sample was written before constant() took an MLOperandDescriptor. The reshape is just assigning dimensions to a constant(). Nowadays we'd just pass the constant() directly.

Remarkably, in every single instance where one of these params is used in the WebNN Samples, it was created from a constant(). Cool!

Of course, this is nowhere close to a comprehensive list of all the models developers hope to run with WebNN. That being said, if there are no significant known use cases for passing any of these parameters as non-constant tensors - if their non-constness is simply a limitation of the framework and there are no useful reasons to pass non-const tensors - I think there's a reasonable argument that WebNN should require these parameters to be constants. @fdwr could you perhaps provide some more color here? :)

It seems that we have the following options to support each of these operators on CoreML:

  1. Require that operator.param must be a constant MLOperand (my tentative preference)
    • Note that it's much easier to relax these restrictions than to try to impose them later
    • This can be done in two ways (see the sketch after this list):
      1. Create an MLConstantOperand interface which extends MLOperand, and specify that param takes an MLConstantOperand
      2. Specify that MLOperand has a "kind", as the Chromium implementation already does, and throw a TypeError if the operand is not of the "constant" kind. This may be confusing to developers
  2. Make operator.param a sequence<MLNumber>
    • This doesn't make much sense for anything other than 1-D tensors
  3. Decide that this is a quirk of CoreML and fail (only on CoreML) if operator.param is not a constant
    • Currently this failure happens during build() on Chromium, though we could conceivably make this a synchronous check on the respective builder method, especially if we have defined procedures for querying for backend-specific support (see #463)
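To illustrate option 1 (a hedged sketch; MLConstantOperand is only a proposal in this issue, not part of the current spec): under 1.1 the IDL above would say MLConstantOperand bias rather than MLOperand bias, and under either variant the developer-visible effect would be a synchronous error from the builder method instead of a late failure:

// Hypothetical behavior under option 1; `input`, `filter` and `biasData` are
// as in the conv2d sketch above.
const constBias = builder.constant({dataType: 'float32', dimensions: [64]}, biasData);
const computedBias = builder.relu(constBias);  // no longer a constant

builder.conv2d(input, filter, {bias: constBias});     // OK
builder.conv2d(input, filter, {bias: computedBias});  // throws a TypeError
// synchronously from conv2d(), rather than surfacing later during build()
// (or only on CoreML, as in option 3).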

Thoughts?

huningxin commented 2 weeks ago

@a-sully

> gru.weight Not used
> gru.recurrentWeight Not used
> gru.bias Not used
> gru.recurrentBias Not used

FYI, these operands (constants only) are used by the NSNet2 (noise suppression) example: https://github.com/webmachinelearning/webnn-samples/blob/master/nsnet2/nsnet2.js#L51
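For reference, the shape of that usage is roughly the following (a sketch; the dimensions are made up rather than copied from nsnet2.js, and builder is as in the sketches above): all four of the gru() weight/bias operands are built from constant().

// Illustrative sketch; dimensions are made up, not taken from NSNet2.
const steps = 1, batchSize = 1, inputSize = 64, hiddenSize = 128;

const weightData = new Float32Array(3 * hiddenSize * inputSize);
const recurrentWeightData = new Float32Array(3 * hiddenSize * hiddenSize);
const gruBiasData = new Float32Array(3 * hiddenSize);
const gruRecurrentBiasData = new Float32Array(3 * hiddenSize);

const seq = builder.input('seq', {dataType: 'float32', dimensions: [steps, batchSize, inputSize]});
const weight = builder.constant({dataType: 'float32', dimensions: [1, 3 * hiddenSize, inputSize]}, weightData);
const recurrentWeight = builder.constant({dataType: 'float32', dimensions: [1, 3 * hiddenSize, hiddenSize]}, recurrentWeightData);

const outputs = builder.gru(seq, weight, recurrentWeight, steps, hiddenSize, {
  bias: builder.constant({dataType: 'float32', dimensions: [1, 3 * hiddenSize]}, gruBiasData),
  recurrentBias: builder.constant({dataType: 'float32', dimensions: [1, 3 * hiddenSize]}, gruRecurrentBiasData),
});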

a-sully commented 2 weeks ago

My apologies for the error. I've updated the respective rows (they're also all "Constant only").

a-sully commented 2 weeks ago

It turns out XNNPACK also requires some tensors to be constants - for example, prelu.slope. The Chromium implementation currently fails if the slope operand was not created by constant().
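As a concrete example of that constraint (a sketch; shapes are illustrative, and builder is as in the sketches above):

// Sketch; shapes are illustrative.
const act = builder.input('act', {dataType: 'float32', dimensions: [4, 16]});
const slopeData = new Float32Array(16);

// Fine everywhere: the slope operand comes from constant().
builder.prelu(act, builder.constant({dataType: 'float32', dimensions: [16]}, slopeData));

// Rejected by the Chromium XNNPACK backend: the slope is a graph input rather
// than a constant().
builder.prelu(act, builder.input('slope', {dataType: 'float32', dimensions: [16]}));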