BennyKok / novel-vscode

A quick integration of Novel.sh editor to VSCode
https://marketplace.visualstudio.com/items?itemName=bennykok.novel-vscode
MIT License
61 stars · 7 forks

Handle AI streaming #2

Open steven-tey opened 11 months ago

steven-tey commented 11 months ago

Would it be possible to hook this up to Novel's streaming endpoint (https://novel.sh/api/generate)? Would it handle streaming inside the Editor?

Alternatively, we could also enable Copilot autocompletes with this instead?
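For reference, consuming such a streaming endpoint from the webview could look roughly like this, assuming the endpoint streams plain UTF-8 text chunks (as the Vercel AI SDK's completion streams do). `readTextStream` is a made-up helper name, not part of Novel's API:

```typescript
// Read a streamed text response chunk by chunk, invoking a callback per chunk
// so partial text can be inserted into the editor as it arrives.
export async function readTextStream(
  stream: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let full = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text); // e.g. append the partial text at the cursor
  }
  return full;
}

// Hypothetical wiring to the endpoint:
// const res = await fetch("https://novel.sh/api/generate", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify({ prompt }),
// });
// if (res.body) await readTextStream(res.body, (t) => insertAtCursor(t));
```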

BennyKok commented 11 months ago

I bumped into some weird issues when trying to change the completionApi directly, so I ended up trying to change it directly in novel. It seems like it's still hitting the default endpoint. I'm going to take a few more looks into that and will report back if the issue is elsewhere!

CleanShot 2023-09-04 at 00 09 10

CleanShot 2023-09-04 at 00 08 51@2x
BennyKok commented 11 months ago

Now I see!

"Continue writing" uses a hardcoded `/api/generate`, while the `++` auto-completion uses the `completionApi` prop on `NovelEditor`.

This might have to be fixed on novel's end.

However, calling https://novel.sh/api/generate directly will also run into CORS issues.

I will take a look at whether it's possible to set up <> Copilot.

Another option would be setting up a local Vercel AI client in the VSCode extension that allows users to set their own OpenAI API key.
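A minimal sketch of that "local client" idea: the extension holds the user's key and builds the request itself instead of going through novel.sh. `buildCompletionRequest` is a hypothetical helper; the endpoint and payload shape follow the standard OpenAI chat completions API, and the model choice is just an example:

```typescript
export interface CompletionRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

// Build a request against the OpenAI chat completions API using the user's
// own key (e.g. read from the extension's settings).
export function buildCompletionRequest(
  prompt: string,
  apiKey: string
): CompletionRequest {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      stream: true, // stream tokens back into the editor
      messages: [{ role: "user", content: prompt }],
    }),
  };
}
```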

jt6677 commented 11 months ago

It should be two calls: first the OPTIONS preflight, then the POST. Using my own API, the default `useCompletion` does not seem to handle this correctly in VSCode extension mode; it only sends the OPTIONS request but never the POST. I had to build a hook with axios.

jt6677 commented 11 months ago

https://github.com/BennyKok/novel-vscode/assets/62420169/396660d3-cb79-406f-b41e-aadf1d079166

It took me a couple of hours to get this. Basically, I used Novel as a component instead of as a package, and used an axios hook instead of `useCompletion`.

jt6677 commented 11 months ago
```typescript
import { useState } from "react";
import axios from "axios";

// `api` (the completion endpoint) and `authorization` (the bearer token) come
// from the surrounding scope in my project. Note: the Access-Control-Allow-*
// values are response headers set by the server, so sending them from the
// client does nothing; they are dropped here.
export const useAxiosCompletion = () => {
  const [isLoading, setIsLoading] = useState(false);
  const [completion, setCompletion] = useState<string>("");
  const headers = {
    Authorization: `Bearer ${authorization}`,
  };

  const complete = async (prompt: string) => {
    setIsLoading(true);
    try {
      // The payload goes in the second argument and the headers in the config;
      // validateStatus keeps axios from throwing on a 429 so we can handle it.
      const response = await axios.post(
        api,
        { prompt },
        { headers, validateStatus: (status) => status < 500 }
      );
      if (response.status === 429) {
        // rate limited
        setIsLoading(false);
        return;
      }
      const respText = response.data.choices[0].message.content;
      setCompletion(respText);
    } catch (error) {
      console.log("error", error);
    }
    setIsLoading(false);
  };

  const stop = () => {
    // Implement any stop logic if necessary
  };

  return { complete, completion, isLoading, stop };
};
```
BennyKok commented 11 months ago

@jt6677 so you modified the code inside novel-vscode or in a separate project for testing? is the API directly calling https://novel.sh/api/generate?

jt6677 commented 11 months ago

I basically copied the Novel editor from node_modules to the src directory so I could modify how it handles API calls. I use my own endpoint, an edge function that is basically the same as https://novel.sh/api/generate.

jt6677 commented 11 months ago

A lot of work still needs to be done to parse the markdown correctly. Basically, you cannot take the OpenAI response directly. We need regexes to determine what format each line is (text, bullet point, checkbox, or headline), sanitize the markdown, and parse it into the respective Tiptap node types.
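A rough sketch of that line classification step. The categories and regexes here are illustrative, not Novel's actual parser; real code would map each kind onto the corresponding Tiptap node type:

```typescript
export type LineKind = "headline" | "checkbox" | "bullet" | "text";

// Classify a single line of model output into a coarse markdown kind.
export function classifyMarkdownLine(line: string): LineKind {
  // Order matters: a checkbox line ("- [ ] task") also matches the bullet regex.
  if (/^#{1,6}\s+/.test(line)) return "headline";
  if (/^\s*[-*+]\s+\[[ xX]\]\s+/.test(line)) return "checkbox";
  if (/^\s*[-*+]\s+/.test(line)) return "bullet";
  return "text";
}
```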

https://github.com/BennyKok/novel-vscode/assets/62420169/39ed31f0-f7d7-4148-b3f8-ae55fe061ecf

BennyKok commented 11 months ago

Yes, and this also came up: if we support tables or other markdown features, it has to handle a lot more edge cases. We might have to prompt carefully with the context of where the user's cursor is, etc.
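A hypothetical sketch of prompting with the cursor's surrounding context, as suggested above: take a window of text before and after the cursor and fold it into the prompt. The function name, prompt wording, and window size are all made up:

```typescript
// Build a completion prompt from the text surrounding the cursor, so the
// model knows what it is continuing and what comes after.
export function buildContextPrompt(
  doc: string,
  cursorOffset: number,
  windowSize = 500
): string {
  const before = doc.slice(Math.max(0, cursorOffset - windowSize), cursorOffset);
  const after = doc.slice(cursorOffset, cursorOffset + windowSize);
  return [
    "Continue the document at the cursor position.",
    "Text before cursor:",
    before,
    "Text after cursor:",
    after,
  ].join("\n");
}
```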

andredezzy commented 9 months ago

Completion is still not working. Any solution for this?

slavakurilyak commented 3 months ago

+1 for LLM streaming support as this would unlock the full potential of Novel

Screen Shot 2024-05-25 at 11 48 43 AM