AshwinPathi / claude-api-py

Unofficial Python API for Anthropic's Claude LLM
https://pypi.org/project/claude-api-py/
MIT License

Can I start new conversation with a file? #7

Closed · sdimantsd closed this issue 11 months ago

sdimantsd commented 11 months ago

I see an option to create a new conversation and then upload a file in the second message. Can I upload a file directly in a new conversation?

AshwinPathi commented 11 months ago

Could you give an example of the API you want? Would it be something like:

# Current API: 
# start_new_conversation(conversation_name: str, initial_message: str)
conversation_uuid = claude_obj.start_new_conversation("New Conversation", "Hi Claude!")

# Proposed change:
# start_new_conversation(conversation_name: str, initial_message: str, attachments: List[Attachment])
conversation_uuid = claude_obj.start_new_conversation("New Conversation", "Hi Claude!", attachments = [...])

Version 2 is not currently supported but it should be an easy fix to add.

sdimantsd commented 11 months ago

Yes, that would be great! Thanks

AshwinPathi commented 11 months ago

@sdimantsd I just added an additional argument to the start_new_conversation method called attachments. It accepts a list of AttachmentType objects to send.

You can create an AttachmentType object by calling get_attachment(file_path) from the client. Let me know if it works. This change isn't on the pip package yet, so you will have to use the github repo directly.
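
For reference, usage with the new argument might look roughly like this (a minimal sketch; it assumes claude_obj is an already-initialized client wrapper and the file path is only an example):

# Build an AttachmentType from a local file (hypothetical example path).
attachment = claude_obj.get_attachment("path/to/report.pdf")

# Start a new conversation with the attachment included in the first message.
conversation_uuid = claude_obj.start_new_conversation(
    "New Conversation",
    "Hi Claude! Please summarize the attached file.",
    attachments=[attachment],
)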

sdimantsd commented 11 months ago

Thanks! How can I get it? Is it on pip? How can I install it from git?

AshwinPathi commented 11 months ago

You should be able to use pip install git+https://github.com/AshwinPathi/claude-api-py.git to install the git repo as a package.

sdimantsd commented 11 months ago

Thanks

sdimantsd commented 11 months ago

How can I get the response when using this method?
The function returns the uuid, not the output.

AshwinPathi commented 11 months ago

@sdimantsd updated. The start_new_conversation() method will now return a JSON object with the following contents:

return {
    'uuid': {conversation_uuid_string},
    'title': {conversation_title_string},
    'response': {initial_response_string}
}
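
For illustration, consuming that return value might look like this (a sketch that assumes only the keys listed above):

result = claude_obj.start_new_conversation("New Conversation", "Hi Claude!")
print(result['uuid'])      # conversation uuid, useful for follow-up messages
print(result['response'])  # Claude's reply to the initial message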

sdimantsd commented 11 months ago

Great, thanks!

sdimantsd commented 11 months ago

Can you also return the ['stop_reason']?

AshwinPathi commented 11 months ago

@sdimantsd tbh I'll just return the entire JSON response from the send message call in the response field, if that sounds more reasonable.

I just changed the API on GitHub; please try it out and let me know if it works.
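
If the full send-message payload is nested under the response field as described, pulling out stop_reason might look like this (a sketch under that assumption; the exact field names come from the raw Claude response and may differ):

result = claude_obj.start_new_conversation("New Conversation", "Hi Claude!")
raw_response = result['response']        # full JSON from the underlying send-message call
print(raw_response.get('stop_reason'))   # the field asked about above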

sdimantsd commented 11 months ago

Thanks :-D I hope to try it in a few hours.

winie-hy commented 11 months ago

Hello, author, do you have any optimization suggestions for improving speed? When I send a text of a few thousand words, it takes about 80s to get an answer. Is it because I have opened too many conversations? Do I need to delete the old conversations after a session is finished? Looking forward to your reply, and thank you.

AshwinPathi commented 11 months ago

@winie-hy that's probably a better question to ask in another issue, but it's kind of expected that the API will be slow if you are sending a large prompt and receiving an equally large response.

The Claude API sends responses to your message in "chunks" which are streamed via Server-Sent Events to the client. The entire message is only complete after all the chunks are received, which for large messages can take a while.

My API isn't asynchronous, so I basically block (wait) until all the chunks are received, and then I return the entire response to the client.
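
As a rough illustration of that blocking pattern (not the library's actual code, just a generic sketch of consuming a Server-Sent Events stream with the requests library; the URL, headers, payload, and the "completion" field name are placeholders/assumptions):

import json
import requests

def read_sse_completion(url, headers, payload):
    # Stream the response line by line, collect each "data:" chunk,
    # and only return once the whole stream has been consumed.
    chunks = []
    with requests.post(url, headers=headers, json=payload, stream=True) as resp:
        for line in resp.iter_lines():
            if not line or not line.startswith(b"data:"):
                continue
            data = line[len(b"data:"):].strip()
            if data == b"[DONE]":
                break
            event = json.loads(data)
            chunks.append(event.get("completion", ""))  # assumed chunk field name
    return "".join(chunks)  # the caller blocks until every chunk has arrived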

AshwinPathi commented 11 months ago

@sdimantsd Were you able to use the new API? If so, I'll close this issue.

sdimantsd commented 11 months ago

That works great! Thanks