Closed Haripritamreddy closed 1 month ago
There is the same issue in the langchain repo here. The temporary workaround they found is using from pydantic.v1 import BaseModel. There is also a still-ongoing issue here, where the workaround they found is to downgrade to pydantic version 1.10.10. So I tried to downgrade, and the error that arises is:
/usr/local/lib/python3.10/dist-packages/crewai/agent.py in
ImportError: cannot import name 'InstanceOf' from 'pydantic' (/usr/local/lib/python3.10/dist-packages/pydantic/__init__.cpython-310-x86_64-linux-gnu.so)
NOTE: If your import is failing due to a missing package, you can manually install dependencies using either !pip or !apt.
It seems that crewai version 1.32 does not work with pydantic 1.10.10. So which version of crewai works with pydantic 1.10.10? Another option would be to add the ability to use a specific pydantic version via from pydantic.v1 import BaseModel, like langchain is currently doing.
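Until something like that lands, a version-agnostic import shim is one way to write code that runs under either pydantic major version. This is a minimal sketch, assuming nothing about which pydantic (if any) is installed:

```python
# Version-agnostic import shim (sketch): prefer pydantic's v1 compatibility
# namespace (shipped with pydantic >= 2), fall back to a plain 1.x import.
try:
    from pydantic.v1 import BaseModel
except ImportError:
    try:
        from pydantic import BaseModel  # pydantic 1.x install
    except ImportError:
        BaseModel = None  # pydantic not installed at all
```

This is the same shape langchain uses internally for its v1 compatibility layer; it only helps for code you control, not for crewai's own internal imports.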
For what it's worth, I've been getting around it by giving all my functions a single parameter, payload, which is a string-encoded dictionary you can pass multiple arguments through. Here's how I updated the search_internet tool they provided:
import json
import os

import requests
from langchain.tools import tool


class SearchTools():
    @tool("Search the internet")
    def search_internet(payload):
        """Useful to search the internet about a given topic and return relevant results
        :param payload: str, a string representation of a dictionary containing the following keys:
            query: str, the query to search for
            result_count: int, the number of top results to return
        example payload:
        {
            "query": "cat",
            "result_count": 4
        }
        """
        url = "https://google.serper.dev/search"
        parsed = json.loads(payload)
        query = parsed['query']
        top_result_to_return = parsed['result_count']
        search_payload = json.dumps({"q": query})
        headers = {
            'X-API-KEY': os.environ['SERPER_API_KEY'],
            'content-type': 'application/json'
        }
        response = requests.request("POST", url, headers=headers, data=search_payload)
        # check if there is an organic key in the response
        if 'organic' not in response.json():
            return "Sorry, I couldn't find anything about that, there could be an error with your serper api key."
        else:
            results = response.json()['organic']
            string = []
            for result in results[:top_result_to_return]:
                try:
                    string.append('\n'.join([
                        f"Title: {result['title']}", f"Link: {result['link']}",
                        f"Snippet: {result['snippet']}", "\n-----------------"
                    ]))
                except KeyError:
                    # skip results that are missing one of the expected keys
                    continue
            return '\n'.join(string)
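The key trick above is that the agent emits one JSON string, and the tool unpacks it itself. Here is a minimal, stdlib-only round trip of that convention (no Serper call; the query/result_count keys follow the tool's docstring):

```python
import json

# Build the payload the way the agent would emit it in Action Input...
payload = json.dumps({"query": "cat", "result_count": 4})

# ...and unpack it the way search_internet does on entry.
parsed = json.loads(payload)
query = parsed["query"]
top_result_to_return = parsed["result_count"]
print(query, top_result_to_return)  # cat 4
```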
Example of agent using it:
Do I need to use a tool? Yes
Action: Search the internet
Action Input: { "query": "Lance Armstrong aftermath of doping confession", "result_count": 3 }
Title: Lance Armstrong Speaks Out On Life After Doping Scandal | TODAY
Link: https://www.youtube.com/watch?v=DX38XZkulbw
Snippet: Lance Armstrong, the seven-time Tour de France champion and cycling legend who was ...
Title: Armstrong's Doping Downfall - Ethics Unwrapped
Link: https://ethicsunwrapped.utexas.edu/video/armstrongs-doping-downfall
Snippet: Cyclist Lance Armstrong thought his use of performance-enhancing drugs was a way to level the playing field in a sport with pervasive doping.
Title: Lance Armstrong admits doping in Oprah Winfrey interview
Link: https://www.theguardian.com/sport/2013/jan/18/lance-armstrong-admits-doping-oprah-winfrey
Snippet: Armstrong said the doping during his triumphs – which revolved around oxygen-boosting drugs to improve endurance – was brazen and not at all ...
Thank you so much, it is working for me. But will it work for built-in langchain tools? For langchain tools we cannot specify the payload, right?
hey folks, I def want to make sure multiple input tools work! So I'll do some digging into this one
Hi @joaomdmoura and @Haripritamreddy,
I was also struggling with this, so did a little bit of investigation. Here's a few changes that allowed me to get it to work locally: https://github.com/ap-inflection/crewAI/pull/1/files
It's very OpenAI-centric since that's what we're using, but maybe it's a starting point for other models?
Quite new to the repo (but really enjoying it!), happy to open a PR if there's interest
I found a solution that works for me, by wrapping a function and using langchain.output_parsers.PydanticOutputParser to modify the instructions.
import os
from typing import Any, Type

from langchain.output_parsers import PydanticOutputParser
from langchain.tools import Tool
from pydantic import BaseModel, Field


# this is a general-purpose utility that you can call to wrap any function as a tool
# with a Pydantic schema
class CrewAIStructuredTool:
    def from_function(
        self,
        func,
        name,
        description,
        args_schema: Type[BaseModel] = None,
        return_direct: bool = False,
    ):
        """Create a structured tool from a function."""
        parser = PydanticOutputParser(pydantic_object=args_schema)
        description_with_schema = f"""{description}
Input should be a string representation of a dictionary containing the following keys:
{parser.get_format_instructions()}
"""

        def parse_input_and_delegate(input: str) -> Any:
            """Parse the input and delegate to the function."""
            try:
                parsed = parser.invoke(input)
            except Exception as e:
                return f"Could not parse input: {str(e)}"
            return func(parsed)

        tool = Tool.from_function(
            parse_input_and_delegate,
            name,
            description_with_schema,
            args_schema=None,
            return_direct=return_direct,
        )
        return tool
class GetPostInput(BaseModel):
    """Input for reading a post"""

    blog_id: int = Field(description="blog_id to look up")
    post_id: int = Field(description="post_id to look up")


def get_post(input: GetPostInput) -> Any:
    """Gets content and metadata about blog posts."""
    blog_id = input.blog_id
    post_id = input.post_id
    token = os.environ["MY_ACCESS_TOKEN"]
    client = MyApiClient(access_token=token)
    try:
        return client.get_post(blog_id, post_id)
    except Exception as e:
        return f"Error fetching post: {str(e)}"


# now create the tool
get_post_tool = CrewAIStructuredTool().from_function(
    get_post,
    "get_post",
    "Tool that gets content and metadata about blog posts",
    args_schema=GetPostInput,
)
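For readers without langchain installed, the same idea can be sketched with only the standard library: embed the expected keys in the tool description and parse a single JSON string on the way in. wrap_as_single_input_tool and the echo example below are hypothetical names, not part of any library:

```python
import json
from typing import Any, Callable


def wrap_as_single_input_tool(
    func: Callable[[dict], Any],
    name: str,
    description: str,
    keys: list,
) -> Callable[[str], Any]:
    """Wrap a multi-argument function behind a single JSON-string input,
    embedding the expected keys in the tool's description/docstring."""
    doc = f"{description}\nInput must be a JSON object with keys: {', '.join(keys)}"

    def run(input_str: str) -> Any:
        # mirror parse_input_and_delegate: fail softly on bad input
        try:
            parsed = json.loads(input_str)
        except json.JSONDecodeError as e:
            return f"Could not parse input: {e}"
        return func(parsed)

    run.__name__ = name
    run.__doc__ = doc
    return run


# usage
echo = wrap_as_single_input_tool(
    lambda args: f"{args['a']} {args['b']}", "echo", "Echoes two values", ["a", "b"]
)
print(echo('{"a": "hello", "b": "world"}'))  # hello world
```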
A langchain tool such as write_tool takes both file_path and text to save a file; in such cases it returns an error.
Working Agent: Senior Research Analyst Starting Task: Create a file named example.txt and save text example in it.
Btw, I am using the gemini-pro model. I tried single-query functions such as get_stock_price, and with a single query they work. Am I missing something?
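Until multi-input tools are supported, the same single-payload trick can be applied to a two-argument file write. This is a hedged sketch: write_file_payload is a hypothetical wrapper, not an existing langchain or crewai tool:

```python
import json
import os
import tempfile


def write_file_payload(payload: str) -> str:
    """Hypothetical single-input replacement for a write-file tool.
    payload: JSON string with keys file_path and text."""
    args = json.loads(payload)
    with open(args["file_path"], "w") as f:
        f.write(args["text"])
    return f"Wrote {len(args['text'])} characters to {args['file_path']}"


# usage: the agent would emit one JSON string as Action Input
path = os.path.join(tempfile.gettempdir(), "example.txt")
print(write_file_payload(json.dumps({"file_path": path, "text": "example"})))
```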