langchain-ai / langgraph-studio

Desktop app for prototyping and debugging LangGraph applications locally.
https://studio.langchain.com

Model name is missing in config. #191

Open snakeninny opened 1 week ago

snakeninny commented 1 week ago

I'm running Open Canvas locally, and it requires LangGraph Studio. My Open Canvas `.env` is:

```
# LangSmith tracing
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=XXX

# LLM API keys
# Anthropic used for reflection
# ANTHROPIC_API_KEY=
# OpenAI used for content generation
# OPENAI_API_KEY=
# Optional, only required if using `Gemini 1.5 Flash` as the model.
# GOOGLE_API_KEY=

# Feature flags for hiding/showing specific models
NEXT_PUBLIC_FIREWORKS_ENABLED=false
# Gemini has some tool call streaming issues atm.
NEXT_PUBLIC_GEMINI_ENABLED=false
NEXT_PUBLIC_ANTHROPIC_ENABLED=false
NEXT_PUBLIC_OPENAI_ENABLED=false
NEXT_PUBLIC_AZURE_ENABLED=true

# LangGraph Deployment, or local development server via LangGraph Studio.
# If running locally, this URL should be set in the `constants.ts` file.
# LANGGRAPH_API_URL=

# Supabase for authentication
# Public keys
NEXT_PUBLIC_SUPABASE_URL=https://xxx.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=XXX

# Azure OpenAI Configuration
# ENSURE THEY ARE PREFIXED WITH AN UNDERSCORE.
_AZURE_OPENAI_API_KEY=XXX
_AZURE_OPENAI_API_INSTANCE_NAME=gpt-4o
_AZURE_OPENAI_API_DEPLOYMENT_NAME=gpt-4o
_AZURE_OPENAI_API_VERSION=2024-03-01-preview
# Optional: Azure OpenAI Base Path (if using a different domain)
_AZURE_OPENAI_API_BASE_PATH=https://xxx.openai.azure.com/openai/deployments/
```

langgraph.json:

```json
{
  "dockerfile_lines": [],
  "graphs": {
    "agent": "./src/agent/open-canvas/index.ts:graph",
    "reflection": "./src/agent/reflection/index.ts:graph",
    "thread_title": "./src/agent/thread-title/index.ts:graph"
  },
  "env": "./.env",
  "node_version": "20"
}
```
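
From the error message, I'm guessing the graphs look for a model name somewhere in the run configuration and throw when it isn't set, something like the sketch below (my own guess, not the actual Open Canvas source; the `customModelName` key is just an assumption on my part):

```ts
import { RunnableConfig } from "@langchain/core/runnables";

// My sketch of the kind of check I assume raises the error;
// the `customModelName` key is a guess, not the real Open Canvas code.
function getModelNameFromConfig(config: RunnableConfig): string {
  const modelName = config.configurable?.customModelName;
  if (!modelName) {
    throw new Error("Model name is missing in config.");
  }
  return modelName;
}
```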

I have installed Docker Desktop and opened the Open Canvas project in LangGraph Studio, then submitted a run. An error occurred: `Model name is missing in config.`
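
If my guess above is right, is the fix to pass the model name explicitly when starting a run? For example, via the JS SDK I would expect something along these lines (the local URL, the `customModelName` key, and the `azure/gpt-4o` value format are all guesses on my part):

```ts
import { Client } from "@langchain/langgraph-sdk";

// Replace with the API URL shown by LangGraph Studio for the local server.
const client = new Client({ apiUrl: "http://localhost:2024" });

const thread = await client.threads.create();

// Guess: the model name goes into `configurable` when creating a run.
const run = await client.runs.create(thread.thread_id, "agent", {
  input: { messages: [] },
  config: {
    configurable: {
      // Key and value format are my guesses; adjust to whatever the graph expects.
      customModelName: "azure/gpt-4o",
    },
  },
});
console.log(run.run_id);
```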

Which model is missing? How can I set the model in the config? Thank you!