Open · kimmeyh opened this issue 3 weeks ago
Can you please open a PR to fix this issue?
Nir,
I cannot at this time.
However, I will consider contributing as soon as I can (probably 3 to 4 weeks).
Sincerely,
Harold
(In reply to: [NirDiamant/RAG_Techniques] Update from llama_index.core service_context to Settings, Issue #39)
In choose_chink_size.ipynb, the following code:

```python
# create vector index
llm = OpenAI(model="gpt-3.5-turbo")
```

gives this error:

```
ValueError: ServiceContext is deprecated. Use llama_index.settings.Settings instead, or pass in modules to local functions/methods/interfaces. See the docs for updated usage/migration: https://docs.llamaindex.ai/en/stable/module_guides/supporting_modules/service_context_migration/
```

It can be replaced by:

```python
llm = OpenAI(model="gpt-3.5-turbo")
```
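The snippets in the issue are truncated, so for context here is a sketch of the ServiceContext-to-Settings migration that the error message points to. This is an assumed fix, not the notebook's actual code: the import paths assume llama-index 0.10+, and `documents` is a hypothetical placeholder for whatever the notebook loads.

```python
# Assumed migration sketch (llama-index >= 0.10 import paths).
from llama_index.core import Settings, VectorStoreIndex
from llama_index.llms.openai import OpenAI

# Old, deprecated pattern (raises the ValueError above):
#   service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-3.5-turbo"))
#   index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# New pattern: configure the LLM once on the global Settings object;
# downstream components (indexes, query engines) pick it up automatically.
Settings.llm = OpenAI(model="gpt-3.5-turbo")

# index = VectorStoreIndex.from_documents(documents)  # `documents` is hypothetical here
```

Per-call overrides are still possible by passing `llm=...` directly to individual functions, so a global `Settings` assignment is only needed when the whole notebook should share one model.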