torvalds-dev / llama_index

LlamaIndex (formerly GPT Index) is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: openai.log = "debug" doesn't work on latest #8924. #9

Open Raghavan1988 opened 7 months ago

Raghavan1988 commented 7 months ago

Bug Description

Setting `openai.log` to `"debug"` doesn't work.

For example, here's a snippet of a use case:

```python
...
from llama_index.llms import OpenAI
import openai

openai.log = "debug"
app = FastAPI()
loader = SitemapReader()

llm = OpenAI(temperature=0.1, model="gpt-3.5-turbo")
service_context = ServiceContext.from_defaults(llm=llm)
```

Version

0.8.68

Steps to Reproduce

```python
import openai

openai.log = "debug"
```

Relevant Logs/Tracebacks

No response

torvalds-dev commented 7 months ago

torvalds.dev is analyzing the ticket

torvalds-dev commented 7 months ago

The issue ("[Bug]: `openai.log = "debug"` doesn't work on latest #8924") concerns the logging level of the OpenAI library: the user sets the level to "debug", but the setting appears to have no effect.

Based on the provided code, the following files seem to be relevant to the issue:

  1. examples/portkey_demo/loadbalancing_streaming_demo.py: This file contains the main code where the OpenAI and Anthropic APIs are being used. The issue might be related to how these APIs are being used or how the responses are being handled.

  2. llama_index/logger/__init__.py: This file initializes the LlamaLogger. If the issue is related to logging, this file might contain relevant code.

  3. llama_index/llms/openai_utils.py: This file contains utility functions for working with the OpenAI API. The issue might be related to these utility functions.

  4. llama_index/llms/llm.py: This file contains the LLM class, which seems to be a key part of the system. If the issue is related to the behavior of the LLMs, this file might contain relevant code.

  5. llama_index/llms/llm_event.py: This file contains the LLMEvent class and related functions. If the issue is related to events in the LLMs, this file might contain relevant code.

  6. llama_index/llms/llm_event_logger.py: This file contains the LLMEventLogger class, which seems to be responsible for logging events. If the issue is related to logging, this file might contain relevant code.
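One quick way to separate a LlamaIndex-side logging problem from an OpenAI-client problem is to enable DEBUG output through Python's standard `logging` module, which LlamaIndex itself logs through. A minimal, stdlib-only sketch (the `"llama_index"` logger name is an assumption based on the package name, not something confirmed in this ticket):

```python
import logging
import sys

# Send all log records to stdout at DEBUG level. This covers any library
# (LlamaIndex included) that emits records via the standard logging module.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)

# Narrow the level on the "llama_index" logger explicitly as well.
llama_logger = logging.getLogger("llama_index")
llama_logger.setLevel(logging.DEBUG)

llama_logger.debug("llama_index debug logging is active")
```

If DEBUG records from LlamaIndex appear but nothing from the OpenAI client does, the problem is on the OpenAI-library side rather than in LlamaLogger or LLMEventLogger.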

To understand the issue better, the following actions should be taken:

  1. Reproduce the issue: Try to run the code and observe the behavior. Is the logging level not being set to "debug" as expected? Are there any error messages?

  2. Check the OpenAI library: Look at the documentation or source code of the OpenAI library to understand how the logging level is supposed to be set. Is openai.log = "debug" the correct way to do it?

  3. Check the usage of the OpenAI library: Look at how the OpenAI library is being used in the provided code. Is there anything unusual about it that might cause the logging level to not be set correctly?

  4. Check the logging code: Look at the LlamaLogger and LLMEventLogger classes to understand how logging is handled in the system. Is there anything in these classes that might interfere with the logging level of the OpenAI library?

  5. Test with a minimal example: Create a minimal example that only uses the OpenAI library and tries to set the logging level to "debug". Does the issue still occur in this minimal example? If not, the issue might be related to the other parts of the system.
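On step 2 specifically: in openai-python >= 1.0 the module-level `openai.log` attribute was removed, and the client logs through the standard `logging` module instead (the `OPENAI_LOG` environment variable can also be set to "debug" or "info" before import). If the reporter's environment pulled in a v1 client, that alone would explain the symptom. A minimal, stdlib-only sketch of the v1-style configuration (the `"openai"` logger name follows the package name; treat the exact behavior as version-dependent):

```python
import logging
import os

# Set before importing openai, since the variable is read at import time.
os.environ["OPENAI_LOG"] = "debug"

# Alternatively (or additionally), raise the level on the standard
# "openai" logger via the logging module.
logging.basicConfig(level=logging.DEBUG)
openai_logger = logging.getLogger("openai")
openai_logger.setLevel(logging.DEBUG)
```

Running the original reproduction with this configuration in place would confirm whether the failure is a removed `openai.log` attribute rather than anything in LlamaIndex.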