AI-Hypercomputer / maxtext

A simple, performant and scalable Jax LLM!
Apache License 2.0
1.47k stars 275 forks

Maxtext Offline serverless inference code #897

Closed vipannalla closed 14 hours ago

vipannalla commented 6 days ago

We coded an offline inference solution for making submissions to the MLPerf round. The included README.md has clear steps for running the offline inference code.

vipannalla commented 4 days ago

For some reason the linter complains about "R0917: Too many positional arguments" in files/modules which were not modified by the PR. Was this a new rule recently enabled?
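For context, pylint's R0917 (`too-many-positional-arguments`) flags functions whose positional parameter count exceeds the configured limit (five by default), which is why it can fire on untouched files as soon as the check is enabled. A minimal sketch of code that triggers it and a keyword-only refactor that silences it (the function names here are hypothetical, not from this PR):

```python
# Triggers R0917: six positional parameters (limit defaults to five).
def attention(query, key, value, mask, dropout, scale):
  return (query, key, value, mask, dropout, scale)


# One common fix: a bare `*` makes the trailing parameters keyword-only,
# so the positional count drops back under the limit and callers must
# name the extras explicitly.
def attention_fixed(query, key, value, *, mask=None, dropout=0.0, scale=1.0):
  return (query, key, value, mask, dropout, scale)
```

Alternatively, the limit can be raised project-wide via pylint's `max-positional-arguments` option rather than refactoring call signatures.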

gobbleturk commented 1 day ago

> For some reason the linter complains about "R0917: Too many positional arguments" in files/modules which were not modified by the PR. Was this a new rule recently enabled?

This should be getting fixed soon; you may need to sync to head once it's fixed.