openai / openai-node

The official Node.js / TypeScript library for the OpenAI API
https://www.npmjs.com/package/openai
Apache License 2.0

[Feature Request] Assistants API integration/example with Azure OpenAI #701

Open danny-avila opened 8 months ago

danny-avila commented 8 months ago

Confirm this is a feature request for the Node library and not the underlying OpenAI API.

Describe the feature or improvement you're requesting

I understand there is an Azure-specific SDK, but I've built a lot of my app using this library, and I believe this shouldn't be too hard to integrate; it would also maintain parity with the OpenAI Python SDK.

From this example, it looks like the Python SDK handles this pretty easily:

import os

from dotenv import load_dotenv
from openai import AzureOpenAI

# Load the environment variables - these are secrets.
load_dotenv()

api_endpoint = os.getenv("OPENAI_URI")
api_key = os.getenv("OPENAI_KEY")
api_version = os.getenv("OPENAI_VERSION")
deployment_name = os.getenv("OPENAI_DEPLOYMENT_NAME")

# Create an Azure OpenAI client
client = AzureOpenAI(api_key=api_key,
        api_version=api_version,
        azure_endpoint=api_endpoint)

That's all it takes and then all the "beta" assistant methods are available:

assistant = client.beta.assistants.create(model=deployment_name)
thread = client.beta.threads.create()
message = client.beta.threads.messages.create(thread_id=thread.id, role="user", content="Hello!")
# etc.

Source: https://github.com/Azure-Samples/azureai-samples/tree/main/scenarios/Assistants/assistants-api-in-a-box

If this can't be achieved through this Node library, or isn't on the roadmap, I'd rather write my own REST methods than try to use two different libraries.

Additional context

No response

rattrayalex commented 8 months ago

This is the example for how to use Azure with this library: https://github.com/openai/openai-node/blob/master/examples/azure.ts

You can instantiate a client that way and then do client.beta.assistants.create() exactly as you would in Python.
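
For reference, a minimal sketch of that pattern (the env var names here are illustrative, not the ones azure.ts uses, and note the deployment-scoped baseURL, which comes up again below):

import OpenAI from 'openai';

// Illustrative env vars; examples/azure.ts reads its own.
const resource = process.env.AZURE_OPENAI_RESOURCE!;
const deployment = process.env.AZURE_OPENAI_DEPLOYMENT!;
const apiVersion = process.env.AZURE_OPENAI_API_VERSION!;
const apiKey = process.env.AZURE_OPENAI_API_KEY!;

// Same client options as the example: deployment-scoped baseURL,
// api-version as a default query param, api-key as a default header.
const client = new OpenAI({
  apiKey,
  baseURL: `https://${resource}.openai.azure.com/openai/deployments/${deployment}`,
  defaultQuery: { 'api-version': apiVersion },
  defaultHeaders: { 'api-key': apiKey },
});

async function main() {
  // The beta surface is then available just like in Python.
  const assistant = await client.beta.assistants.create({ model: deployment });
  console.log(assistant.id);
}

main().catch(console.error);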

cc @kristapratico

danny-avila commented 8 months ago

@rattrayalex Thanks for your reply.

Unfortunately, I had already tried initializing a client with Azure on v4.28.4.

I regularly initialize Azure clients, and the same client can use regular chat completions as expected, but it's not apparent to me how the API call gets constructed when using the Azure options.

rattrayalex commented 8 months ago

What code did you use and what problems did you run into?

danny-avila commented 8 months ago

> What code did you use and what problems did you run into?

Thanks for prompting me for code; trying to reproduce with a simpler approach is what got me there.

I figured out that the baseURL needs to be modified to be compatible with the Assistants API.

from:

https://github.com/openai/openai-node/blob/6175eca426b15990be5e5cdb0e8497e547f87d8a/examples/azure.ts#L24

to

baseURL = `https://${resource}.openai.azure.com/openai`;
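
In other words (a sketch of where the requests end up, based on the code below):

// Chat completions are served under the deployment:
//   https://{resource}.openai.azure.com/openai/deployments/{model}/chat/completions?api-version=...
// The Assistants routes are served without the deployment segment:
//   https://{resource}.openai.azure.com/openai/assistants?api-version=...
// The SDK appends the route path and the default query to baseURL, so the two
// families of endpoints need different baseURL values.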

Can I make a PR to update the Azure example?

import OpenAI from 'openai';

(async () => {
  const model = process.env.DEPLOYMENT_NAME;
  const resource = process.env.API_RESOURCE;
  const apiVersion = process.env.API_VERSION;
  const apiKey = process.env.ASSISTANTS_API_KEY;

  const options = {
    defaultQuery: {
      'api-version': apiVersion,
    },
    defaultHeaders: {
      'api-key': apiKey,
    },
    apiKey: apiKey, // the constructor throws without an apiKey, even though Azure auth here uses the api-key header
    baseURL: `https://${resource}.openai.azure.com/openai/deployments/${model}`,
  };
  const openai = new OpenAI(options);

  // Chat completions work as expected against the deployments-scoped baseURL:
  console.log('Streaming:');
  const stream = await openai.chat.completions.create({
    model,
    messages: [{ role: 'user', content: 'Say hello!' }],
    stream: true,
  });

  for await (const part of stream) {
    process.stdout.write(part.choices[0]?.delta?.content ?? '');
  }
  process.stdout.write('\n');

  try {
    // The Assistants routes live under /openai (no /deployments/{model} segment),
    // so swap the baseURL before calling the beta methods:
    openai.baseURL = `https://${resource}.openai.azure.com/openai`;
    const response = await openai.beta.assistants.list({
      order: 'desc',
      limit: 20,
    });
    console.log('List:', response);
  } catch (error) {
    console.error('Error listing Assistants:', error);
  }

  // Equivalent raw REST call, for comparison:
  const url = `https://${resource}.openai.azure.com/openai/assistants?order=desc&limit=20&api-version=${apiVersion}`;
  const fetchOptions = {
    method: 'GET',
    headers: {
      'Content-Type': 'application/json',
      'OpenAI-Beta': 'assistants=v1',
      'api-key': apiKey,
    },
  };

  fetch(url, fetchOptions)
    .then(response => response.json()) // Convert the response to JSON
    .then(data => console.log(data)) // Log the data
    .catch(error => console.error('Error:', error)); // Log any errors
})();

rattrayalex commented 8 months ago

Ah, I see. A PR adding an azure-assistants.ts file would be welcome.
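
A rough sketch of what such an example could look like, based on the workaround above (env var names are illustrative; this is not the final example):

import OpenAI from 'openai';

const resource = process.env.AZURE_OPENAI_RESOURCE!;
const apiVersion = process.env.AZURE_OPENAI_API_VERSION!;
const apiKey = process.env.AZURE_OPENAI_API_KEY!;

// Note: no /deployments/{model} segment; the Assistants routes live under /openai.
const openai = new OpenAI({
  apiKey,
  baseURL: `https://${resource}.openai.azure.com/openai`,
  defaultQuery: { 'api-version': apiVersion },
  defaultHeaders: { 'api-key': apiKey },
});

async function main() {
  // On Azure, `model` is the deployment name.
  const assistant = await openai.beta.assistants.create({
    model: process.env.AZURE_OPENAI_DEPLOYMENT!,
    name: 'Math Tutor',
    instructions: 'You are a helpful math tutor.',
  });

  const thread = await openai.beta.threads.create();

  await openai.beta.threads.messages.create(thread.id, {
    role: 'user',
    content: 'What is 2 + 2?',
  });

  const run = await openai.beta.threads.runs.create(thread.id, {
    assistant_id: assistant.id,
  });

  console.log(assistant.id, thread.id, run.id);
}

main().catch(console.error);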

naichalun commented 8 months ago

Hello, friends, can we add a proxy option to the client? Something similar to Python's os.environ["http_proxy"] = "http://xxx.xxxxxx:port".

rattrayalex commented 8 months ago

@naichalun that is not on-topic for this thread. I'd ask you to open another, but the answer is here: https://github.com/openai/openai-node?tab=readme-ov-file#configuring-an-https-agent-eg-for-proxies
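
For completeness, the pattern from that README section looks roughly like this (assumes the https-proxy-agent package is installed; PROXY_URL and the fallback address are illustrative):

import OpenAI from 'openai';
import { HttpsProxyAgent } from 'https-proxy-agent';

// Route all requests from this client through an HTTP(S) proxy.
const openai = new OpenAI({
  httpAgent: new HttpsProxyAgent(process.env.PROXY_URL ?? 'http://localhost:8888'),
});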