haibbo / cf-openai-azure-proxy

A Cloudflare Worker script to proxy OpenAI's requests to Azure OpenAI Service
MIT License

[New feat request] Add PaLM API #27

Open invisprints opened 1 year ago

invisprints commented 1 year ago

I recently obtained access to the PaLM API for testing, but I found that there are hardly any client applications or products for PaLM on the market. I would therefore like to know whether it is possible to convert the PaLM API into OpenAI's format so it can be integrated with various third-party clients. I have already implemented the corresponding functionality for non-printer (non-streaming) mode, but I have been struggling with printer (streaming) mode. Can you help me resolve this issue? If you need it, I can provide API access for you to debug. Here is my code, mostly generated by GPT-4 as I'm not familiar with JavaScript; it currently works fine in non-printer mode.

// The deployment name you chose when you deployed the model.
const deployName = 'chat-bison-001';

addEventListener("fetch", (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  if (request.method === 'OPTIONS') {
    return handleOPTIONS(request)
  }

  const url = new URL(request.url);
  let path;
  if (url.pathname === '/v1/chat/completions') {
    path = "generateMessage";
  } else if (url.pathname === '/v1/completions') {
    path = "generateText";
  } else {
    return new Response('404 Not Found', { status: 404 });
  }

  let body;
  if (request.method === 'POST') {
    body = await request.json();
  }

  const authKey = request.headers.get('Authorization');
  if (!authKey) {
    return new Response("Not allowed", {
      status: 403
    });
  }

  // Remove 'Bearer ' from the start of authKey
  const apiKey = authKey.replace('Bearer ', '');

  const fetchAPI = `https://generativelanguage.googleapis.com/v1beta2/models/${deployName}:${path}?key=${apiKey}`

  // Transform request body from OpenAI to PaLM format
  const transformedBody = {
    prompt: {
      messages: body?.messages?.map(msg => ({
        author: msg.role === 'user' ? '0' : '1',
        content: msg.content,
      })),
    },
  };

  const payload = {
    method: request.method,
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify(transformedBody),
  };

  const response = await fetch(fetchAPI, payload);
  const palmData = await response.json();

  // Transform response from PaLM to OpenAI format
  const transformedResponse = transformResponse(palmData);

  if (body?.stream != true) {
    return new Response(JSON.stringify(transformedResponse), {
      headers: { 'Content-Type': 'application/json' },
    });
  } else {
    // TODO: stream ("printer") output is not implemented yet; fall back to a
    // normal JSON response so the worker still returns something valid.
    return new Response(JSON.stringify(transformedResponse), {
      headers: { 'Content-Type': 'application/json' },
    });
  }
}

// Function to transform the response
function transformResponse(palmData) {
  return {
    id: 'chatcmpl-' + Math.random().toString(36).substring(2), // Generate a random id
    object: 'chat.completion',
    created: Math.floor(Date.now() / 1000), // Current Unix timestamp
    model: 'gpt-3.5-turbo', // Static model name
    usage: {
      prompt_tokens: palmData.messages.length, // This is a placeholder. Replace with actual token count if available
      completion_tokens: palmData.candidates.length, // This is a placeholder. Replace with actual token count if available
      total_tokens: palmData.messages.length + palmData.candidates.length, // This is a placeholder. Replace with actual token count if available
    },
    choices: palmData.candidates.map((candidate, index) => ({
      message: {
        role: 'assistant',
        content: candidate.content,
      },
      finish_reason: 'stop', // Static finish reason
      index: index,
    })),
  };
}

async function handleOPTIONS(request) {
    return new Response(null, {
      headers: {
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': '*',
        'Access-Control-Allow-Headers': '*'
      }
    })
}
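
For reference, a client would call the deployed proxy like any OpenAI endpoint. A sketch (the worker URL and key below are placeholders, not real values):

// Hypothetical client call against the deployed worker.
async function example() {
  const res = await fetch('https://your-worker.example.workers.dev/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // The proxy strips "Bearer " and forwards the key to Google as ?key=...
      'Authorization': 'Bearer YOUR_PALM_API_KEY',
    },
    body: JSON.stringify({
      messages: [{ role: 'user', content: 'Hello!' }],
      stream: false, // printer mode is not implemented yet
    }),
  });
  console.log(await res.json());
}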
haibbo commented 1 year ago

Great Job!

The PaLM API probably doesn't support stream mode, right? I didn't see it in the documentation, and when I asked questions on Bard I didn't see any typewriter ("printer") effect either. So I wouldn't recommend implementing printer mode in this case, since you already get the complete response in one go.

I don't have API access and I'm not familiar with it, so I might be wrong.
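
That said, if a client insists on stream mode, one possible workaround is to wrap the already-complete answer in OpenAI's SSE chunk format before returning it. A rough sketch (the chunk fields follow OpenAI's chat.completion.chunk shape; fakeStreamResponse is just a name made up for illustration):

// Sketch: emit the full PaLM answer as a single OpenAI-style SSE chunk.
// transformedResponse is the object built by transformResponse() above.
function fakeStreamResponse(transformedResponse) {
  const chunk = {
    id: transformedResponse.id,
    object: 'chat.completion.chunk',
    created: transformedResponse.created,
    model: transformedResponse.model,
    choices: transformedResponse.choices.map((choice, index) => ({
      delta: { role: 'assistant', content: choice.message.content },
      finish_reason: 'stop',
      index: index,
    })),
  };
  const body = 'data: ' + JSON.stringify(chunk) + '\n\n' + 'data: [DONE]\n\n';
  return new Response(body, {
    headers: { 'Content-Type': 'text/event-stream' },
  });
}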

haibbo commented 1 year ago

If you'd like further help with this, could you possibly lend me your key temporarily? I would only use it for proxy development.

My Email: haibbo@gmail.com.

Caixiaopig commented 1 year ago

I deployed the current version of cf-openai-palm-proxy.js, but it keeps returning a 500 Error.

invisprints commented 1 year ago

Using English prompts plus a US IP solves most of these problems.

invisprints commented 1 year ago

@Caixiaopig You can try https://github.com/invisprints/cf-openai-azure-proxy/blob/main/cf-openai-palm-proxy.js, which should return the error message, but I haven't tested it thoroughly.
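
The gist of surfacing the error instead of failing with a 500 is roughly the following excerpt, which would slot into handleRequest right after the fetch (a sketch of the idea, not the actual contents of the linked file):

// Sketch: pass PaLM's own error payload through to the client instead of crashing.
const response = await fetch(fetchAPI, payload);
const palmData = await response.json();
if (!response.ok || palmData.error) {
  return new Response(JSON.stringify(palmData), {
    status: response.status,
    headers: { 'Content-Type': 'application/json' },
  });
}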

Caixiaopig commented 1 year ago

@invisprints It's much more convenient that your script surfaces the error message. Cloudflare Workers' egress IP is routed based on the requesting IP, so you do indeed need to switch to a US route when accessing the Workers domain.