Open · vineetvedant opened this issue 1 week ago
Hi!
Are you using the docker image available in docker hub or have you cloned this repo?
Hi there, I am using the Neo4j solution you mentioned in option 2. I am facing trouble using it; can you guide me? I have also cloned the repo. Can you give me the further steps? I am actually building a similar project, so if possible, please share your first-hand experience.
Also, can you please tell me which Ollama model you used for the project: is it Mistral or Llama 2?
The default model is mistral:instruct, the latest one in Ollama. But you can use any LLM for text completion currently. I will add vision support in the future.
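For anyone wanting to swap the model: a minimal sketch, assuming the llama-index Ollama integration and a local Ollama server. The helper name, default model, and timeout here are illustrative, not from the repo.

```python
# Minimal sketch: configure llama_index's Ollama wrapper with any
# text-completion model pulled into a local Ollama server.
# The import is kept inside the function so this file still loads
# in environments where the package is absent.
def make_llm(model: str = "mistral:instruct"):
    from llama_index.llms.ollama import Ollama

    # A generous timeout, since local models can be slow to load.
    return Ollama(model=model, request_timeout=120.0)

# Example usage (needs a running Ollama server with the model pulled):
#   llm = make_llm("llama2")
#   print(llm.complete("Hello"))
```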
Are you using the .exe from the releases page?
No, actually I installed from the zip file. If possible, can you please guide me through the setup steps?
Try the updated requirements.txt
Let me know if it worked.
Why not! Actually, I am also building a similar project: a RAG-based chatbot using Hugging Face and llama_index. I would be glad to work together if you are interested.
Sure! Let me know what you need!
Traceback (most recent call last):
  File "D:\pajidacodepart2\ToK-docker-image\ToK-docker-image\tok\main.py", line 16, in <module>
    from llama_index.llms.ollama import Ollama
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\__init__.py", line 17, in <module>
    from llama_index.embeddings.langchain import LangchainEmbedding
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\embeddings\__init__.py", line 16, in <module>
    from llama_index.embeddings.openai import OpenAIEmbedding
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\embeddings\openai.py", line 18, in <module>
    from llama_index.llms.openai_utils import (
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\__init__.py", line 2, in <module>
    from llama_index.llms.anyscale import Anyscale
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\anyscale.py", line 11, in <module>
    from llama_index.llms.openai import OpenAI
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\openai\__init__.py", line 1, in <module>
    from llama_index.llms.openai.base import AsyncOpenAI, OpenAI, SyncOpenAI, Tokenizer
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\openai\base.py", line 54, in <module>
    from llama_index.llms.openai.utils import (
  File "C:\Users\VEDANT\AppData\Roaming\Python\Python311\site-packages\llama_index\llms\openai\utils.py", line 24, in <module>
    from openai.types.chat.chat_completion_token_logprob import ChatCompletionTokenLogprob
ModuleNotFoundError: No module named 'openai.types.chat.chat_completion_token_logprob'
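The last line of the traceback is the real problem: llama_index imports ChatCompletionTokenLogprob, a submodule that only exists in newer openai releases, so an older installed openai package breaks the whole import chain. A stdlib-only diagnostic sketch (the function name and messages are mine, not from the repo):

```python
# Check whether the installed openai package provides the submodule
# that llama_index tries to import in the traceback above.
import importlib.util
from importlib import metadata


def diagnose() -> str:
    try:
        version = metadata.version("openai")
    except metadata.PackageNotFoundError:
        return "openai is not installed"
    # find_spec returns None when an installed package lacks the
    # submodule; it raises ModuleNotFoundError if a parent package
    # (openai.types.chat) is itself missing in very old releases.
    try:
        spec = importlib.util.find_spec(
            "openai.types.chat.chat_completion_token_logprob"
        )
    except ModuleNotFoundError:
        spec = None
    if spec is None:
        return f"openai {version} is too old for this llama_index release"
    return f"openai {version} provides the module; the import should work"


print(diagnose())
```

If the diagnosis reports an old or missing openai, reinstalling from the repo's updated requirements.txt (as suggested above) or upgrading the openai package should clear this error.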