Closed — cip22 closed this 4 months ago
It's used here: `add_routes(app, ChatVertexAI(), path="/vertexai")`
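If you want to keep the Vertex AI endpoint but make it optional, one approach is to defer the imports and register the route conditionally. This is only a sketch: the `ENABLE_VERTEXAI` flag and the `register_vertexai_route` helper are hypothetical names, not existing settings in this repo.

```python
import os

# Hypothetical flag (assumption): ENABLE_VERTEXAI is not an existing setting
# in this repo; it illustrates gating the route on an environment variable.
VERTEXAI_ENABLED = os.environ.get("ENABLE_VERTEXAI", "false").lower() == "true"


def register_vertexai_route(app):
    """Attach /vertexai only when Vertex AI is enabled; otherwise a no-op."""
    if not VERTEXAI_ENABLED:
        return app
    # Deferred imports: with the flag unset, no Google Cloud code runs,
    # so local startup never touches GCP credentials.
    from langserve import add_routes
    from langchain_google_vertexai import ChatVertexAI

    add_routes(app, ChatVertexAI(), path="/vertexai")
    return app
```

With the flag unset, the function simply returns the app unchanged, so the rest of the backend starts normally without Google Cloud configured.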
Hi @cip22, yes, our app supports Gemini/Vertex AI, which is why you need to set up Google Cloud if you want to run locally. If you don't want the Gemini integration, you can simply comment out the line @ais-qianhao-liu posted above (which is in score.py), and that should remove the error.
If your score.py also contains
logging_client = gclogger.Client()
logger_name = "llm_experiments_metrics"  # saved in the Google Cloud logs
logger = logging_client.logger(logger_name)
you'll need to comment that out too.
It should be enough not to configure Gemini as an LLM in the config.
@kartikpersistent can we also make the metrics logging configurable via .env, since it needs to work for local and other deployments?
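To illustrate the suggestion above, here is a minimal sketch of gating the Google Cloud logger behind an environment variable and falling back to the standard library logger locally. The `METRICS_LOGGING_ENABLED` flag is a hypothetical name, not an existing setting in this repo.

```python
import logging
import os

# Hypothetical flag (assumption): METRICS_LOGGING_ENABLED is not an existing
# setting in this repo; it illustrates making the GCP logger opt-in via .env.
USE_GCP_LOGGING = os.environ.get("METRICS_LOGGING_ENABLED", "false").lower() == "true"

logger_name = "llm_experiments_metrics"

if USE_GCP_LOGGING:
    # Only reach for Google Cloud when explicitly enabled, so local runs
    # never block on the Compute Engine metadata server.
    from google.cloud import logging as gclogger

    logging_client = gclogger.Client()
    logger = logging_client.logger(logger_name)  # saved in the Google Cloud logs
else:
    # Plain stdlib logger for local or other non-GCP deployments.
    logger = logging.getLogger(logger_name)
```

With the flag unset, nothing from `google.cloud` is imported, so local deployments start without any GCP credentials configured.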
When running the backend, uvicorn starts and the reloader process starts. Then:

2024-05-08 11:54:45,375 - Compute Engine Metadata server unavailable on attempt 1 of 3. Reason: timed out
2024-05-08 11:54:48,378 - Compute Engine Metadata server unavailable on attempt 2 of 3. Reason: timed out
2024-05-08 11:54:48,379 - Compute Engine Metadata server unavailable on attempt 3 of 3. Reason: [Errno 64] Host is down
2024-05-08 11:54:48,379 - Authentication failed using Compute Engine authentication due to unavailable metadata server.
Where is Google Compute Engine used in this project?