JuliaSymbolics / Metatheory.jl

Makes Julia reason with equations. A general-purpose metaprogramming, symbolic computation, and algebraic equational reasoning library for the Julia programming language: E-Graphs & equality saturation, term rewriting, and more.
https://juliasymbolics.github.io/Metatheory.jl/dev/
MIT License

Bump peter-evans/create-or-update-comment from 3 to 4 #187

Closed: dependabot[bot] closed this pull request 7 months ago

dependabot[bot] commented 7 months ago

Bumps peter-evans/create-or-update-comment from 3 to 4.
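
In practice, this bump only changes the version tag of the action referenced in the repository's workflow. The snippet below is a minimal sketch of such a step; the workflow name, trigger, job layout, and comment body are illustrative assumptions, and only the action reference and its `v4` tag come from this PR.

```yaml
# Illustrative sketch: workflow, job, and step names and the comment body are assumptions;
# only the peter-evans/create-or-update-comment@v4 reference reflects this PR.
name: Example comment workflow
on:
  pull_request:

jobs:
  comment:
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write          # needed to create or update PR comments
    steps:
      - name: Comment on the pull request
        uses: peter-evans/create-or-update-comment@v4   # previously @v3; bumped by this PR
        with:
          issue-number: ${{ github.event.pull_request.number }}
          body: |
            Results for this PR are available in the workflow run.
```

Note that v4 runs on the Node.js 20 runtime, so self-hosted runners must be on Actions runner v2.308.0 or later (see the release notes below).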

Release notes

Sourced from peter-evans/create-or-update-comment's releases.

Create or Update Comment v4.0.0

⚙️ Updated runtime to Node.js 20

  • The action now requires a minimum version of v2.308.0 for the Actions runner. Update self-hosted runners to v2.308.0 or later to ensure compatibility.

What's Changed

Full Changelog: https://github.com/peter-evans/create-or-update-comment/compare/v3.1.0...v4.0.0

Create or Update Comment v3.1.0

What's Changed

Full Changelog: https://github.com/peter-evans/create-or-update-comment/compare/v3.0.2...v3.1.0

Create or Update Comment v3.0.2

What's Changed

... (truncated)

Commits
  • 71345be feat: update runtime to node 20 (#306)
  • d41bfe3 build(deps-dev): bump prettier from 3.2.3 to 3.2.4 (#305)
  • 73b4b9e build(deps-dev): bump @types/node from 18.19.7 to 18.19.8 (#304)
  • b865fac build(deps-dev): bump @types/node from 18.19.6 to 18.19.7 (#303)
  • 52b668a build(deps-dev): bump eslint-plugin-jest from 27.6.1 to 27.6.3 (#302)
  • 974f56a build(deps-dev): bump prettier from 3.1.1 to 3.2.3 (#301)
  • 2cbfe8b build(deps-dev): bump @types/node from 18.19.4 to 18.19.6 (#300)
  • 761872a build(deps-dev): bump eslint-plugin-prettier from 5.1.2 to 5.1.3 (#299)
  • 72c3238 build(deps-dev): bump @types/node from 18.19.3 to 18.19.4 (#298)
  • 07daf7b build(deps-dev): bump eslint-plugin-jest from 27.6.0 to 27.6.1 (#297)
  • Additional commits viewable in compare view


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
github-actions[bot] commented 7 months ago

Benchmark Results

| Benchmark | master | 19e9ebdd5050e0... | t[master]/t[19e9ebdd5050e0...] |
| --- | --- | --- | --- |
| basic_maths/simpl1 | 0.0581 ± 0.0042 s | 0.0587 ± 0.0047 s | 0.991 |
| calc_logic/demorgan | 0.602 ± 0.016 ms | 0.605 ± 0.014 ms | 0.995 |
| egraph/addexpr | 20.2 ± 3.1 ms | 20.1 ± 1.1 ms | 1.01 |
| egraph/constructor | 0.558 ± 0.02 μs | 0.577 ± 0.022 μs | 0.967 |
| prop_logic/demorgan | 0.961 ± 0.019 ms | 0.983 ± 0.022 ms | 0.978 |
| prop_logic/freges_theorem | 0.0394 ± 0.0032 s | 0.0398 ± 0.0032 s | 0.988 |
| prop_logic/prove1 | 9.39 s | 8.9 s | 1.06 |
| prop_logic/rewrite | 0.0766 ± 0.0011 ms | 0.0761 ± 0.0014 ms | 1.01 |
| while_superinterpreter/while_10 | 0.12 ± 0.002 s | 0.121 ± 0.0012 s | 0.991 |
| time_to_load | 0.26 ± 0.0019 s | 0.258 ± 0.0018 s | 1 |

Benchmark Plots

A plot of the benchmark results has been uploaded as an artifact to the workflow run for this PR. Go to "Actions" -> "Benchmark a pull request" -> [the most recent run] -> "Artifacts" (at the bottom).
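
For reference, artifacts like this are typically published with the standard actions/upload-artifact action. The step below is a sketch under that assumption; the artifact name and file path are hypothetical and not taken from this repository's workflow.

```yaml
# Hypothetical upload step; artifact name and file path are assumptions,
# not copied from this repository's workflow.
- name: Upload benchmark plot
  uses: actions/upload-artifact@v4
  with:
    name: benchmark-plots   # the name that appears under "Artifacts" in the run
    path: plot.png          # wherever the plotting step writes its output
```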