yihong0618 / bilingual_book_maker

Make bilingual epub books Using AI translate
MIT License

Can gpt4 be supported? (能支持gpt4吗?) #253

Open JulianZhang opened 1 year ago

JulianZhang commented 1 year ago

OpenAI now officially offers a GPT-4 model, but the tool's model options don't include gpt4 yet. Is there a plan to support it?

yihong0618 commented 1 year ago

Yes. I don't have a GPT-4 API key at the moment; once I get one I'll look into adding it.

astromaddie commented 1 year ago

@yihong0618 I have gpt4 api, and can help you integrate it into the code.

yihong0618 commented 1 year ago

of course

astromaddie commented 1 year ago

@yihong0618 I added gpt4 support to a local clone of the repo under a feature branch, but I don't seem to have permission to push it to master.

yihong0618 commented 1 year ago

you can ask gpt “how to pull request” on GitHub(^-^)
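
For anyone else new to this, the usual fork-and-pull-request flow looks roughly like this (the fork URL is a placeholder and the branch name is just an example):

```
# 1. Fork yihong0618/bilingual_book_maker on GitHub, then clone your fork
git clone https://github.com/<your-username>/bilingual_book_maker.git
cd bilingual_book_maker

# 2. Create a feature branch and commit your changes on it
git checkout -b gpt4-support
git add .
git commit -m "Add gpt4 support"

# 3. Push the branch to your fork
git push -u origin gpt4-support

# 4. On GitHub, open a pull request from your fork's gpt4-support branch
#    against yihong0618/bilingual_book_maker main
```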

astromaddie commented 1 year ago

Thanks, forgot it could be that easy ^^; I've never contributed to a git repo before.

One new feature I'm testing locally is creating a running "context" paragraph in the prompt that carries important information from earlier in the book alongside the new text being translated, to improve contextual translation quality for novels and possibly SRT files. I may add it to the GPT-4 file only, because it adds about 200 tokens to each API request.
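
Roughly, the idea is to prepend the rolling summary to each request. A minimal sketch (not the actual code in the branch; the function and variable names are just for illustration):

```python
def build_prompt(batch_paragraphs, running_context, target_language="Simplified Chinese"):
    """Build one translation request, prepending a short rolling summary.

    `running_context` holds the ~200-token summary of everything translated
    so far; it is sent for reference only and is not itself translated.
    """
    context_block = ""
    if running_context:
        context_block = (
            "Context from earlier in the book "
            "(for reference only, do not translate):\n"
            f"{running_context}\n\n"
        )
    text_block = "\n\n".join(batch_paragraphs)
    return (
        f"{context_block}"
        f"Translate the following paragraphs into {target_language}, "
        "keeping tone and terminology consistent with the context above:\n\n"
        f"{text_block}"
    )
```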

gumeng76 commented 1 year ago

Can it support gpt4 now?

astromaddie commented 1 year ago

yes, this works with gpt4! I’ve translated a few books with it already.

gumeng76 commented 1 year ago

Genius! Do I just need to pull the new master branch and run with the new option --model gpt4?

astromaddie commented 1 year ago

Ah sorry, I’ve made a pull request but @yihong0618 hasn't approved it, so it’s not in master. Maybe wait a bit longer!

yihong0618 commented 1 year ago

sorry for that, I am on vacation these days

gumeng76 commented 1 year ago

Hope we can merge it into master as soon as possible...

astromaddie commented 1 year ago

no prob! just let me know if anything needs to be changed in the code, when you’re back

astromaddie commented 1 year ago

@gumeng76 I've added --model gpt4 to my branch, plus a subfeature for it, --use_context, which maintains a self-updating paragraph that adds historical context to the passage being translated. For example, if you translate 5 paragraphs at a time with gpt4, paragraphs 6-10 are sent with an extra paragraph summarising paragraphs 1-5, paragraphs 11-15 come with a paragraph summarising paragraphs 1-10, and so on. Hope that makes sense.

I find it improves tone and consistency in translations.
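
In pseudocode terms, the batching works roughly like this (a sketch only; `translate_batch` and `summarize` stand in for the actual ChatGPT calls in the branch):

```python
def translate_with_context(paragraphs, translate_batch, summarize, batch_size=5):
    """Translate in batches, carrying a rolling summary of earlier batches.

    `translate_batch(batch, context)` returns translated paragraphs and
    `summarize(old_summary, batch)` returns an updated short summary;
    both would be thin wrappers around the chat API in practice.
    """
    translated = []
    running_summary = ""  # first batch has no history yet
    for start in range(0, len(paragraphs), batch_size):
        batch = paragraphs[start:start + batch_size]
        # e.g. paragraphs 6-10 are sent with a summary of 1-5,
        # paragraphs 11-15 with a summary of 1-10, and so on
        translated.extend(translate_batch(batch, running_summary))
        # fold the batch just handled into the running summary (~200 tokens)
        running_summary = summarize(running_summary, batch)
    return translated
```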

gumeng76 commented 1 year ago

Very good idea. It helps GPT remember the history :-)

ScofieldYeh commented 1 year ago

@astromaddie Does your branch here https://github.com/astromaddie/bilingual_book_maker support --model GPT4 and --use_context?

astromaddie commented 1 year ago

@ScofieldYeh the gpt4-support branch does :)

astromaddie commented 1 year ago

I should update the doc in the branch, thanks for reminding me
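
If you want to try the branch before it's merged, something along these lines should work (the epub path is just an example; double-check the flag names against the branch's README):

```
git clone -b gpt4-support https://github.com/astromaddie/bilingual_book_maker.git
cd bilingual_book_maker
pip install -r requirements.txt
python3 make_book.py --book_name test_books/animal_farm.epub --openai_key ${OPENAI_API_KEY} --model gpt4 --use_context
```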

ScofieldYeh commented 1 year ago

@astromaddie I tried --gpt4 but got this error: [Errno 2] No such file or directory: '/Users/mstemm/code/bilingual_book_maker\debug_log.txt' will sleep 60 seconds

astromaddie commented 1 year ago

Sorry about that, please pull the branch again... I accidentally left a debug line in there, just removed it.

robyzhou commented 1 year ago

Hello dude, I still get the error: The model: gpt-4 does not exist, will sleep 60 seconds. I'm on the main branch and have fetched the latest code. Do you have any suggestions on this?

Thanks a lot!

astromaddie commented 1 year ago

Try using the flag -gpt4 ;)

robyzhou commented 1 year ago

Thanks buddy!

Of course I use the flag -gpt4, and here is the command: python3 make_book.py --book_name test_books/xxx.epub --model gpt4 --proxy 127.0.0.1:1087

I think maybe I need an openai_key with GPT-4 access rather than just GPT-3. Right now my ChatGPT account is on the free plan. Is that right?

Thanks a lot!

astromaddie commented 1 year ago

Ohh I see! Yes, you can't use the GPT-4 model unless you have GPT-4 access on your account. You can sign up for an API account and request access to the model; if you're lucky, they'll open it up to you.
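
One quick way to check whether your API key already has GPT-4 access is to list the models it can see (the sketch below uses the pre-1.0 `openai` Python package that the tool relied on at the time; adjust for newer SDK versions):

```python
import os

import openai  # openai < 1.0 style API

openai.api_key = os.environ["OPENAI_API_KEY"]

# list every model this key is allowed to call
available = {model["id"] for model in openai.Model.list()["data"]}

if any(model_id.startswith("gpt-4") for model_id in available):
    print("This key has GPT-4 access.")
else:
    print("No gpt-4 models visible; request access or stick with gpt-3.5-turbo.")
```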

robyzhou commented 1 year ago

You have my appreciation, dude!