simonw / llm-replicate

LLM plugin for models hosted on Replicate

`llm replicate fetch-predictions` command #12

Closed · simonw closed this 1 year ago

simonw commented 1 year ago

Refs:

TODO:

simonw commented 1 year ago

Notes on ability to run twice without fetching everything here:

For the first shipped version I'm going to always fetch every page of the paginated predictions list, but individual predictions will only be fetched if they're not in the DB yet or are incomplete.

Incomplete means `completed_at` is null and the status is something other than `failed`.
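A minimal sketch of that logic, assuming the Replicate HTTP API's paginated `/v1/predictions` endpoint and a sqlite-utils table. The function and table names here are illustrative, not the plugin's actual implementation:

```python
# Sketch: walk every page of the predictions list, but only fetch the
# detail record for predictions that are missing from the DB or incomplete.
import httpx
import sqlite_utils

API_BASE = "https://api.replicate.com/v1/predictions"


def is_incomplete(row):
    # "Incomplete" as described above: completed_at is null and status is not "failed"
    return row.get("completed_at") is None and row.get("status") != "failed"


def fetch_predictions(db: sqlite_utils.Database, token: str):
    headers = {"Authorization": f"Token {token}"}
    table = db["replicate_predictions"]  # illustrative table name
    url = API_BASE
    while url:
        # Always fetch every paginated page of the predictions list
        page = httpx.get(url, headers=headers).json()
        for summary in page["results"]:
            existing = None
            if table.exists():
                try:
                    existing = table.get(summary["id"])
                except sqlite_utils.db.NotFoundError:
                    existing = None
            # Only fetch the full record if it's new or was incomplete last time
            if existing is None or is_incomplete(existing):
                detail = httpx.get(
                    f"{API_BASE}/{summary['id']}", headers=headers
                ).json()
                table.insert(detail, pk="id", replace=True, alter=True)
        url = page.get("next")
```

Fetching every list page but skipping detail requests for rows that are already complete keeps repeat runs cheap while still picking up predictions that were still in progress on the previous run.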

simonw commented 1 year ago

progress

simonw commented 1 year ago

Documentation: https://github.com/simonw/llm-replicate/blob/840ea3a0b6182bf9b4ca24a20d0cc0aed072e07a/README.md#fetching-all-replicate-predictions