jxnl / instructor

structured outputs for llms
https://python.useinstructor.com/
MIT License
6.52k stars 514 forks

create_model() missing 1 required positional argument: 'model_name' #799

Closed ThanhNguye-n closed 2 days ago

ThanhNguye-n commented 3 days ago

Code:

from llama_cpp import Llama
import multiprocessing
from llama_cpp.llama_speculative import LlamaPromptLookupDecoding

import instructor

from pydantic import BaseModel
from typing import List
from rich.console import Console

llama = Llama(
    model_path="/Users/thanhnguyen/Documents/Developer/llamacpp_function_calling/Hermes-2-Pro-Llama-3-8B-Q8_0.gguf",
    n_gpu_layers=20,
    n_batch=600,
    n_threads=multiprocessing.cpu_count() - 5,
    n_ctx=2048,
    draft_model=LlamaPromptLookupDecoding(num_pred_tokens=2),  # (1)!
    logits_all=True,
    verbose=True
)

create = instructor.patch(
    create=llama.create_chat_completion_openai_v1,
    mode=instructor.Mode.JSON_SCHEMA,  # (2)!
)

text_block = """
In our recent online meeting, participants from various backgrounds joined to discuss
the upcoming tech conference. The names and contact details of the participants were as follows:

- Name: John Doe, Email: johndoe@email.com, Twitter: @TechGuru44
- Name: Jane Smith, Email: janesmith@email.com, Twitter: @DigitalDiva88
- Name: Alex Johnson, Email: alexj@email.com, Twitter: @CodeMaster2023

During the meeting, we agreed on several key points. The conference will be held on March 15th, 2024,
at the Grand Tech Arena located at 4521 Innovation Drive. Dr. Emily Johnson, a renowned AI researcher,
will be our keynote speaker.

The budget for the event is set at $50,000, covering venue costs, speaker fees, and promotional activities.
Each participant is expected to contribute an article to the conference blog by February 20th.

A follow-up meeting is scheduled for January 25th at 3 PM GMT to finalize the agenda and confirm the list of speakers.
"""

class User(BaseModel):
    name: str
    email: str
    twitter: str

class MeetingInfo(BaseModel):
    users: List[User]
    date: str
    location: str
    budget: int
    deadline: str

extraction_stream = create(
    response_model=instructor.Partial[MeetingInfo],  # (3)!
    messages=[
        {
            "role": "user",
            "content": f"Get the information about the meeting and the users {text_block}",
        },
    ],
    stream=True,
)

console = Console()

for extraction in extraction_stream:
    obj = extraction.model_dump()
    console.clear()  # (4)!
    console.print(obj)

Error message:

Traceback (most recent call last):
  File "/Users/thanhnguyen/Documents/Developer/llamacpp_function_calling/cookbook/test.py", line 65, in <module>
    response_model=instructor.Partial[MeetingInfo],  # (3)!
  File "/opt/anaconda3/envs/agent1/lib/python3.10/site-packages/instructor/dsl/partial.py", line 300, in __class_getitem__
    **{
  File "/opt/anaconda3/envs/agent1/lib/python3.10/site-packages/instructor/dsl/partial.py", line 304, in <dictcomp>
    else _wrap_models(field_info)
  File "/opt/anaconda3/envs/agent1/lib/python3.10/site-packages/instructor/dsl/partial.py", line 273, in _wrap_models
    modified_args = tuple(
  File "/opt/anaconda3/envs/agent1/lib/python3.10/site-packages/instructor/dsl/partial.py", line 275, in <genexpr>
    Partial[arg]
  File "/opt/anaconda3/envs/agent1/lib/python3.10/site-packages/instructor/dsl/partial.py", line 292, in __class_getitem__
    return create_model(
TypeError: create_model() missing 1 required positional argument: 'model_name'
ggml_metal_free: deallocating
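The traceback bottoms out in pydantic's `create_model`, which instructor's `Partial[...]` wrapper calls internally. A minimal sketch of the likely cause, assuming pydantic 2.8 renamed the first parameter to `model_name` and expects it positionally (passing the name positionally works across pydantic v2 releases):

```python
from pydantic import create_model

# Passing the model name positionally works on pydantic v2, including 2.8.
User = create_model("User", name=(str, ...), email=(str, ...))
print(User(name="John Doe", email="johndoe@email.com"))

# pydantic 2.8 renamed create_model's first parameter to `model_name`
# (it was previously the dunder keyword `__model_name`). A caller that
# still supplies the name via the old keyword no longer binds it, which
# produces the "missing 1 required positional argument: 'model_name'"
# TypeError shown above.
```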
theonesud commented 3 days ago

Facing the same issue. Did you find a fix? Maybe it's a pydantic issue.

ThanhNguye-n commented 3 days ago

> Facing the same issue. Did you find a fix? Maybe it is a pydantic issue

No idea, but I do know this code ran successfully in another env about a week ago.

jxnl commented 2 days ago

Looks like an issue in pydantic 2.8.0! Try downgrading while we roll out the fix.
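Until the fix ships, pinning works (`pip install "pydantic<2.8"`). A small helper, sketched here as an illustration only (`pydantic_breaks_partial` is a hypothetical name, and treating 2.8.0 and later as affected is an assumption based on the comment above):

```python
def pydantic_breaks_partial(version: str) -> bool:
    """Return True if this pydantic version is in the range reported in
    this thread to break instructor's Partial[...] wrapper (>= 2.8.0,
    before the fix landed)."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (2, 8)

print(pydantic_breaks_partial("2.8.0"))  # the version from this thread
print(pydantic_breaks_partial("2.7.4"))
```

In practice one would feed it `pydantic.VERSION` at startup and warn before any `Partial[...]` usage.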

jxnl commented 2 days ago

#801