intel / ai-reference-models

Intel® AI Reference Models: contains Intel optimizations for running deep learning workloads on Intel® Xeon® Scalable processors and Intel® Data Center GPUs
Apache License 2.0

Bump onnx from 1.15.0 to 1.16.0 in /quickstart/recommendation/pytorch/memrec_dlrm #179

Closed by dependabot[bot] 4 months ago

dependabot[bot] commented 5 months ago

Bumps onnx from 1.15.0 to 1.16.0.
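
As a quick sanity check after the bump, the installed package can be inspected from Python. This is a minimal sketch, assuming onnx 1.16.0 is installed in the active environment; the expected values come from the release notes quoted below.

```python
# Minimal sketch: confirm the upgraded onnx is the one being imported.
import onnx
from onnx import defs

print(onnx.__version__)           # expected: 1.16.0 after this bump
print(defs.onnx_opset_version())  # ai.onnx opset shipped with 1.16.0 (21)
print(onnx.IR_VERSION)            # IR version shipped with 1.16.0 (10)
```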

Release notes

Sourced from onnx's releases.

v1.16.0

ONNX v1.16.0 is now available with exciting new features! We would like to thank everyone who contributed to this release! Please visit onnx.ai to learn more about ONNX and associated projects.

Key Updates

ai.onnx Opset 21

ai.onnx.ml Opset 4

IR Version 10

  • Added support for the UINT4 and INT4 element types
  • Added a metadata_props field to GraphProto, FunctionProto, NodeProto, and TensorProto
  • Added a value_info field to FunctionProto
  • Added an overload field to FunctionProto and NodeProto to support overloaded functions

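As an illustration of the IR version 10 additions above, the sketch below builds a toy model that uses the new INT4 element type and attaches metadata_props to a node. The single-Identity graph and all names in it are hypothetical, not taken from this repository or from the onnx release notes.

```python
# Sketch of the IR-10 additions listed above; assumes onnx==1.16.0 is installed.
from onnx import TensorProto, checker, helper

# NodeProto gained a metadata_props field in IR version 10.
node = helper.make_node("Identity", ["x"], ["y"], name="copy")
prop = node.metadata_props.add()
prop.key, prop.value = "stage", "passthrough"

# UINT4 / INT4 are new element types; opset 21 ops such as Identity accept them.
x = helper.make_tensor_value_info("x", TensorProto.INT4, [8])
y = helper.make_tensor_value_info("y", TensorProto.INT4, [8])

graph = helper.make_graph([node], "ir10_demo", [x], [y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 21)])

checker.check_model(model)
print(model.ir_version)  # 10 with onnx 1.16.0 defaults
```
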
Python Changes

  • Support for registering custom OpSchemas via the Python interface
  • Support for Python 3.12

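The first bullet above (registering a custom OpSchema from Python) can be exercised roughly as follows. The operator name, domain, and type constraint are hypothetical, and the constructor keyword arguments are assumed from the onnx.defs Python interface rather than taken from this PR.

```python
# Rough sketch: register a custom operator schema from Python (onnx 1.16.0 assumed).
# "CustomRelu" and the "com.example" domain are hypothetical names.
from onnx import defs
from onnx.defs import OpSchema

schema = OpSchema(
    "CustomRelu",      # operator name (hypothetical)
    "com.example",     # custom domain (hypothetical)
    1,                 # since_version
    inputs=[OpSchema.FormalParameter("X", "T")],
    outputs=[OpSchema.FormalParameter("Y", "T")],
    type_constraints=[("T", ["tensor(float)"], "float tensors only")],
)
defs.register_schema(schema)

# Once registered, the schema resolves like any built-in one.
print(defs.get_schema("CustomRelu", domain="com.example").since_version)  # 1
```
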
Security Updates

  • Fix a path sanitization bypass leading to arbitrary reads (CVE-2024-27318)
  • Fix an out-of-bounds read due to lack of string termination in assert (CVE-2024-27319)

Deprecation notice

Bug fixes and infrastructure improvements

  • Enable empty list of values as attribute (#5559)
  • Add backward conversions from 18->17 for reduce ops (#5606)
  • DFT-20 version converter (#5613)
  • Fix version-converter to generate valid identifiers (#5628)
  • Reserve removed proto fields (#5643)
  • Cleanup shape inference implementation (#5596)
  • Do not use LFS64 on non-glibc Linux (#5669)
  • Drop "one of" default attribute check in LabelEncoder (#5673)
  • TreeEnsemble base values for the reference implementation (#5665)
  • Parser/printer support external data format (#5688)
  • [cmake] Place export target file in the correct directory (#5677)

... (truncated)
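
Several of the fixes above touch the version converter. As a hedged sketch, that API is typically exercised like this, e.g. to hit the 18 -> 17 backward conversions for reduce ops (#5606); the model file paths are hypothetical:

```python
# Sketch only: round-trip a model through onnx.version_converter.
# "model.onnx" / "model_opset17.onnx" are hypothetical file paths.
import onnx
from onnx import checker, version_converter

model = onnx.load("model.onnx")
converted = version_converter.convert_version(model, target_version=17)
checker.check_model(converted)
onnx.save(converted, "model_opset17.onnx")
```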

Commits


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/intel/models/network/alerts).

dependabot[bot] commented 4 months ago

Looks like onnx is up-to-date now, so this is no longer needed.