ashwaniag opened 1 week ago
Hi @ashwaniag, thanks for trying it out. Have you set the backend API URL in the frontend env or in docker compose?
Hi @kartikpersistent, I set VITE_BACKEND_API_URL in the .env file, and I am building with docker compose.
You should set it in docker compose as well.
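For reference, a minimal fragment of what that can look like (a sketch; the service name, build context, and URL are assumptions, so adjust them to your compose file). The key point is that Vite reads `VITE_*` variables at build time, so the value has to reach the image as a build arg, not only as a runtime environment variable:

```yaml
services:
  frontend:
    build:
      context: ./frontend   # assumed path to the frontend Dockerfile
      args:
        # Baked into the bundle when the image is built; a runtime
        # `environment:` entry alone will not be picked up by Vite.
        - VITE_BACKEND_API_URL=http://localhost:8000
```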
It is already set in the docker-compose.yml.
Also, I am not able to connect to the Aura instance. Can you share an example .env file?
Hi @ashwaniag
Frontend ENV
VITE_BACKEND_API_URL=""
VITE_BLOOM_URL="https://workspace-preview.neo4j.io/workspace/explore?connectURL={CONNECT_URL}&search=Show+me+a+graph&featureGenAISuggestions=true&featureGenAISuggestionsInternal=true"
VITE_REACT_APP_SOURCES=""
VITE_LLM_MODELS_PROD=""
VITE_ENV="DEV"
VITE_TIME_PER_PAGE=50
VITE_CHUNK_SIZE=5242880
VITE_LARGE_FILE_SIZE=5242880
VITE_GOOGLE_CLIENT_ID=""
VITE_CHAT_MODES=""
VITE_BATCH_SIZE=2
VITE_LLM_MODELS=""
Backend ENV
OPENAI_API_KEY=""
DIFFBOT_API_KEY=""
AWS_ACCESS_KEY_ID=""
AWS_SECRET_ACCESS_KEY=""
IS_EMBEDDING='true'
KNN_MIN_SCORE='0.8'
NEO4J_URI=""
NEO4J_USERNAME=""
NEO4J_PASSWORD=""
NEO4J_DATABASE=""
LANGCHAIN_ENDPOINT=""
LANGCHAIN_TRACING_V2=''
LANGCHAIN_PROJECT=""
LANGCHAIN_API_KEY=""
NUMBER_OF_CHUNKS_TO_COMBINE='4'
GCP_LOG_METRICS_ENABLED='True'
GEMINI_ENABLED='False'
UPDATE_GRAPH_CHUNKS_PROCESSED='20'
GROQ_API_KEY=""
GCS_FILE_CACHE='True'
NEO4J_USER_AGENT="LLM-Graph-Builder"
ENABLE_USER_AGENT='False'
LLM_MODEL_CONFIG_azure_ai_gpt_35=""
LLM_MODEL_CONFIG_azure_ai_gpt_4o=""
LLM_MODEL_CONFIG_groq_llama3_70b=""
LLM_MODEL_CONFIG_anthropic_claude_3_5_sonnet=""
LLM_MODEL_CONFIG_fireworks_llama_v3p2_90b=""
LLM_MODEL_CONFIG_bedrock_claude_3_5_sonnet=""
LLM_MODEL_CONFIG_ollama_llama3="llama3,http://localhost:11434"
ENTITY_EMBEDDING='True'
DUPLICATE_SCORE_VALUE='0.97'
DUPLICATE_TEXT_DISTANCE='3'
SENTENCE_TRANSFORMERS_HOME="/tmp/sentence_transformers"
ENABLE_COMMUNITIES='TRUE'
LLM_MODEL_CONFIG_openai_gpt_3.5=""
LLM_MODEL_CONFIG_openai_gpt_4=""
LLM_MODEL_CONFIG_openai_gpt_4o=""
LLM_MODEL_CONFIG_openai_gpt_4o_mini=""
LLM_MODEL_CONFIG_gemini_1.5_pro=""
LLM_MODEL_CONFIG_gemini_1.5_flash=""
LLM_MODEL_CONFIG_diffbot=""
DEFAULT_DIFFBOT_CHAT_MODEL="openai_gpt_3.5"
RAGAS_EMBEDDING_MODEL="openai"
YOUTUBE_TRANSCRIPT_PROXY="https://spnmty2eek:2l7vG~4EeDcc8eauQg@gate.smartproxy.com:10002"
I started the containers with docker compose locally and connected to the Aura instance, but when I try to upload a local file the frontend returns a 404 error. I do not see any logs in the terminal.
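One quick check for the 404 (a minimal sketch, not project code): if `VITE_BACKEND_API_URL` ends up empty at build time, the frontend's API requests go to the wrong origin. This parses simple `KEY=value` .env lines (assuming no multi-line values or `export` prefixes; the sample values below are made up) so you can confirm the variable is actually non-empty:

```python
def parse_env(text: str) -> dict:
    """Parse KEY=value lines, skipping blanks/comments and stripping quotes."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip("'\"")
    return env

# Inline sample standing in for the frontend .env (hypothetical values)
sample = 'VITE_BACKEND_API_URL="http://localhost:8000"\nVITE_ENV="DEV"\nVITE_CHAT_MODES=""'
env = parse_env(sample)
assert env["VITE_BACKEND_API_URL"], "VITE_BACKEND_API_URL is empty"
print(env["VITE_BACKEND_API_URL"])  # → http://localhost:8000
```

If the value is empty, rebuild the frontend image after setting it, since Vite bakes the value in at build time. `docker compose logs backend` is also worth checking, since upload errors may only appear in the backend container's logs rather than the compose terminal.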