Open tonydoesathing opened 2 weeks ago
To address the issue of connecting the `BaseLanguageModel` output of the `AzureChatOpenAI` block to the `LLM` input on the `PromptRunner` block, follow these steps:
1. **Check Configuration Parameters**: Ensure the configuration for `AzureChatOpenAIComponent` is correct, including `model`, `azure_endpoint`, `azure_deployment`, `api_version`, `api_key`, `temperature`, and `max_tokens`. The `api_version` should match a supported version, such as "2023-12-01-preview".
2. **Error Handling**: A `ValueError` may be due to incorrect parameters or network issues when connecting to the Azure OpenAI API; it indicates a problem in the instantiation of `AzureChatOpenAI`.
3. **Compatibility Check**: Make sure the `PromptRunner` block is compatible with the latest `AzureChatOpenAI` class. The `AzureChatOpenAI` class is designed to work with `PromptRunner` blocks that accept `BaseLanguageModel` objects, but compatibility issues can arise with updates.
4. **Update Dependencies**: Ensure all dependencies, including `langchain`, `langflow`, and the Azure SDKs, are current. Outdated dependencies can lead to compatibility problems.
5. **Browser Compatibility**: Test the setup in another browser or incognito mode to rule out browser-specific issues.
6. **Review Documentation**: The documentation for `AzureChatOpenAIComponent` might offer additional troubleshooting steps or insights.
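The configuration and error-handling checks above can be sketched as a small pre-flight validator. Everything below is a hypothetical helper, not part of Langflow: the field names mirror the component settings listed above, and the supported-version set is an example only.

```python
# Hypothetical pre-flight check for the AzureChatOpenAI settings listed above.
# validate_azure_config and SUPPORTED_API_VERSIONS are illustrative, not part
# of Langflow; consult Azure's documentation for the current version list.
SUPPORTED_API_VERSIONS = {"2023-12-01-preview"}

REQUIRED_FIELDS = (
    "model", "azure_endpoint", "azure_deployment",
    "api_version", "api_key", "temperature", "max_tokens",
)

def validate_azure_config(config: dict) -> None:
    """Raise ValueError if a required field is absent or the api_version is unknown."""
    missing = [k for k in REQUIRED_FIELDS if config.get(k) is None]
    if missing:
        raise ValueError(f"Missing Azure OpenAI settings: {missing}")
    if config["api_version"] not in SUPPORTED_API_VERSIONS:
        raise ValueError(f"Unsupported api_version: {config['api_version']!r}")
```

Running a check like this before wiring the blocks together surfaces a missing key or a bad `api_version` as an explicit `ValueError` rather than a failed connection in the UI.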
If these steps don't resolve the issue, providing the exact error message or behavior observed when attempting the connection could help with further troubleshooting.
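The dependency check above can be done without leaving Python via the standard-library `importlib.metadata` module (available since Python 3.8). The package names below are the PyPI ones and may differ from your environment:

```python
from importlib.metadata import PackageNotFoundError, version

def installed_versions(packages):
    """Map each package name to its installed version, or None if not installed."""
    report = {}
    for name in packages:
        try:
            report[name] = version(name)
        except PackageNotFoundError:
            report[name] = None
    return report

# PyPI names for the dependencies mentioned above (adjust to your setup).
print(installed_versions(["langchain", "langflow", "openai"]))
```

Any `None` entry points at a package that is not installed; for the rest, compare the reported versions against each project's changelog before upgrading.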
To continue the conversation, mention @dosu.
Awesome project! Was playing around with the tool and hit a wall, since I only have access to Azure's OpenAI...
**Describe the bug**
`PromptRunner` does not support `AzureChatOpenAI` as an LLM input

**Browser and Version**

**To Reproduce**
Steps to reproduce the behavior:
1. `AzureChatOpenAI`
2. `PromptRunner`
3. Chain the `BaseLanguageModel` output of the `AzureChatOpenAI` block to the `LLM` input on the `PromptRunner` block and struggle

**Expected Behavior**
I should be able to connect `AzureChatOpenAI`'s output to the `LLM` input on any block