64bit / async-openai

Rust library for OpenAI
https://docs.rs/async-openai
MIT License

Add support for organization API #265

Open sharifhsn opened 2 months ago

sharifhsn commented 2 months ago

Fixes #260

This PR currently only implements the invites module of the organization API. I'd like to get clarification on some details before I continue.

64bit commented 2 months ago

> Fixes #260
>
> This PR currently only implements the invites module of the organization API. I'd like to get clarification on some details before I continue.

Awesome! Thank you for the PR.

> * The organization API requires a different API key, denoted as `OPENAI_ADMIN_KEY` in documentation. I'm thinking that I should write an additional method on `OpenAIConfig` to use this environment variable, perhaps `with_admin_key`. What do you think?

It looks like the admin API key is also a bearer token, so the existing `with_api_key` is sufficient: the user can decide to provide the right key.

The current behavior of reading the `OPENAI_API_KEY` env var in `Client::new()` can be extended to read `OPENAI_ADMIN_KEY` when the non-admin key is not present.
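A minimal sketch of that fallback order; `resolve_api_key` is a hypothetical helper, not part of the async-openai API. The environment lookup is injected as a closure so the logic can be exercised without touching the real process environment.

```rust
// Hypothetical helper: prefer the regular key, and fall back to the
// admin key only when the non-admin key is not present.
fn resolve_api_key(lookup: impl Fn(&str) -> Option<String>) -> Option<String> {
    lookup("OPENAI_API_KEY").or_else(|| lookup("OPENAI_ADMIN_KEY"))
}

fn main() {
    // In Client::new() this would read the real process environment.
    let key = resolve_api_key(|name| std::env::var(name).ok());
    println!("key present: {}", key.is_some());
}
```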

> * All of the `organization` APIs are organized under that endpoint. As with the `assistants` module, I'm choosing to keep them in the flat namespace.

Sounds good. I see the same top-level grouping here: https://platform.openai.com/docs/api-reference/administration

> * I'm choosing to name the types based on their names in the OpenAPI schema. Is this correct, or should I follow a different schema?

Yes, having 1-to-1 naming from the spec to Rust types helps with a single source of truth and ongoing maintenance. There are nested objects in the spec which don't have a name; for those I usually go with the `<parent-schema-name><field-name>` pattern.
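As an illustration of that pattern, suppose (hypothetically) the spec's `Invite` schema contained an anonymous object array under a `projects` field; the nested Rust type would then take the parent schema name plus the field name. The field names below are illustrative, not copied from the actual spec.

```rust
// Parent type, named 1-to-1 after the spec's `Invite` schema.
#[derive(Debug, Clone, PartialEq)]
pub struct Invite {
    pub id: String,
    pub email: String,
    // The spec nests an unnamed object here; its Rust type is named
    // <parent-schema-name><field-name>: `InviteProjects`.
    pub projects: Vec<InviteProjects>,
}

// Type generated for the anonymous nested object.
#[derive(Debug, Clone, PartialEq)]
pub struct InviteProjects {
    pub id: String,
    pub role: String,
}
```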

> * What is the minimum supported Rust version of this project? After checking with `cargo-msrv`, the current MSRV is 1.70.0. If this is intended, it should be indicated somewhere in the README.

Isn't `rust-version` in `Cargo.toml` doing that? https://github.com/64bit/async-openai/blob/main/async-openai/Cargo.toml#L9 Is `cargo-msrv` a different thing?
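For reference, the manifest declaration in question looks roughly like the fragment below (the actual value lives at the link above). `cargo-msrv` discovers or verifies the MSRV by compiling against toolchains, while `rust-version` declares it so `cargo` can refuse to build on older toolchains:

```toml
[package]
name = "async-openai"
# Declared MSRV; cargo on an older toolchain will error out.
rust-version = "1.70"
```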


Please consider adding a self-contained example for this - it helps me test (and maintain) that serialization and deserialization of the new types are working well.