khoj-ai / khoj

Your AI second brain. Self-hostable. Get answers from the web or your docs. Build custom agents, schedule automations, do deep research. Turn any online or local LLM into your personal, autonomous AI (e.g. gpt, claude, gemini, llama, qwen, mistral).
https://khoj.dev
GNU Affero General Public License v3.0

Role, prompt #686

Closed · den47999 closed this issue 23 hours ago

den47999 commented 7 months ago

Good afternoon :) Can you add the ability to set your own prompt and role? Also, please add the ability to adjust the temperature; I would like to set it to 0.1-0.3. Also, the settings say the specified model has a 128,000-token context, but when I send a long context (~5,000 tokens) to the chat, an error comes up:

414 Request-URI Too Large

Screenshot_1
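
A 414 is returned by the web server when the request line, i.e. the URL including its query string, exceeds the server's length limit (typically a few kilobytes), so a long chat message passed as a URL query parameter can hit that limit long before the model's 128K-token context matters. Below is a minimal sketch of the sizes involved; the endpoint path, message size, and 8 KB limit are illustrative assumptions, not Khoj's actual values.

```python
# Rough illustration of why a long chat message sent as a URL query parameter
# can trigger "414 Request-URI Too Large". The endpoint path, message size,
# and the 8 KB limit below are illustrative assumptions.
from urllib.parse import urlencode

# ~5,000 tokens is very roughly ~20,000 characters of English text.
message = "x" * 20_000
request_uri = "/api/chat?" + urlencode({"q": message})

typical_uri_limit = 8 * 1024  # many web servers/proxies cap the request line near 4-8 KB
print(f"request URI length: {len(request_uri)} bytes")
print(f"fits under a {typical_uri_limit}-byte limit: {len(request_uri) <= typical_uri_limit}")
```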

den47999 commented 7 months ago

Screenshot_1 Also, the settings show the model's context as 128,000 tokens, but when sending a long context (~5,000 tokens) to the chat, an error appears.

sabaimran commented 6 months ago

Hi @den47999! We've capped the token limit used for this model. Thanks for pointing this out, I'll update it.

You'd like a custom prompt in the cloud instance?

den47999 commented 6 months ago

I want the token limit to be 128,000, since that helps me with writing code. I'd also like temperature selection and prompt roles.

debanjum commented 23 hours ago

Hey @den47999, are you self-hosting Khoj or using the Khoj cloud service at https://app.khoj.dev? On the Khoj cloud service, we currently limit the context window to 60K tokens for subscribed users and 30K tokens for free tier users to manage service costs. If you're self-hosting, you can set the max prompt size to whatever you like.
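
If you want to stay under those cloud limits, one rough client-side approach is to count tokens and trim the prompt before sending it. Here is a minimal sketch using the tiktoken library; the encoding name and the 60K budget are illustrative assumptions, not Khoj's exact token accounting.

```python
# Minimal sketch: trim a long prompt to a token budget before sending it to chat.
# The cl100k_base encoding and 60,000-token budget are assumptions for illustration;
# the actual tokenizer depends on the chat model configured in Khoj.
import tiktoken

def trim_to_budget(text: str, budget: int = 60_000, encoding_name: str = "cl100k_base") -> str:
    enc = tiktoken.get_encoding(encoding_name)
    tokens = enc.encode(text)
    if len(tokens) <= budget:
        return text
    # Keep the first `budget` tokens; a real client might keep the tail or summarize instead.
    return enc.decode(tokens[:budget])

# Hypothetical usage with a local file of code context:
long_prompt = open("my_code_dump.txt").read()
trimmed = trim_to_budget(long_prompt)
enc = tiktoken.get_encoding("cl100k_base")
print(len(enc.encode(long_prompt)), "tokens before trimming")
print(len(enc.encode(trimmed)), "tokens after trimming")
```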

For custom prompts, we've recently enabled the ability to create agents on Khoj cloud (or self-hosted) with their own custom prompts, tools, and knowledge base.

Temperature is set by the application based on the chat model and the subtask the model is expected to do (e.g. generating document search queries, inferring the output mode, etc.). It is not currently user- or admin-controllable. It'd be good to know what you're hoping to get by changing the temperature; that will help us find the best way to enable that use case.
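
As a sketch of the pattern described above, an application-level temperature policy can be a small mapping from subtask to temperature. The subtask names and values here are made up for illustration, not Khoj's real configuration.

```python
# Illustrative only: a per-subtask temperature policy of the kind described above.
# The subtask names and temperature values are hypothetical, not Khoj's real settings.
SUBTASK_TEMPERATURE = {
    "generate_search_queries": 0.0,  # deterministic query generation
    "infer_output_mode": 0.0,        # classification-style decision
    "chat_response": 0.7,            # more open-ended final answer
}

def temperature_for(subtask: str, default: float = 0.2) -> float:
    return SUBTASK_TEMPERATURE.get(subtask, default)

print(temperature_for("generate_search_queries"))  # -> 0.0
print(temperature_for("summarize_document"))       # falls back to the 0.2 default
```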

Closing this issue for now, but let me know if that doesn't answer your requests.