Deployed on Cloudflare Workers, this project uses the free Flux model and optimizes prompts with an LLM. It can be called from any application compatible with the OpenAI API.
The current design is as follows: when the variable CF_IS_TRANSLATE is set to true, and EXTERNAL_API_BASE, EXTERNAL_MODEL, and EXTERNAL_API_KEY are all configured, the system calls an external API (for example, a GPT-4o model) to optimize the prompt rather than merely translating it. Admittedly, I got a bit lazy here: I should have renamed CF_IS_TRANSLATE to PROMPT_OPTIMIZED to avoid any ambiguity.
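The gating logic described above can be sketched as a small helper for the Worker. This is a hypothetical illustration, not the project's actual code; the variable names come from the text, but the function name and structure are my own assumptions.

```javascript
// Hypothetical sketch of the gating logic: prompt optimization via an
// external API happens only when CF_IS_TRANSLATE is "true" AND all three
// external-API settings are configured. Otherwise the Worker falls back
// to plain translation (or no processing at all).
function shouldOptimizePrompt(env) {
  return (
    env.CF_IS_TRANSLATE === "true" &&
    Boolean(env.EXTERNAL_API_BASE) &&
    Boolean(env.EXTERNAL_MODEL) &&
    Boolean(env.EXTERNAL_API_KEY)
  );
}
```

With all four variables set (e.g. CF_IS_TRANSLATE=true, EXTERNAL_API_BASE pointing at an OpenAI-compatible endpoint, EXTERNAL_MODEL naming a model such as gpt-4o, and a valid EXTERNAL_API_KEY), the function returns true and the Worker would route the prompt through the external LLM; if any setting is missing, it returns false.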