Closed sdimantsd closed 11 months ago
Could you give an example of the API you want? Would it be something like:
```python
# Current API:
# start_new_conversation(conversation_name: str, initial_message: str)
conversation_uuid = claude_obj.start_new_conversation("New Conversation", "Hi Claude!")

# Proposed change:
# start_new_conversation(conversation_name: str, initial_message: str, attachments: List[Attachment])
conversation_uuid = claude_obj.start_new_conversation("New Conversation", "Hi Claude!", attachments=[...])
```
Version 2 is not currently supported but it should be an easy fix to add.
Yes, that would be great! Thanks
@sdimantsd I just added an additional argument to the `start_new_conversation` method called `attachments`. It accepts a list of `AttachmentType` objects to send. You can create an `AttachmentType` object by calling `get_attachment(file_path)` from the client. Let me know if it works. This change isn't on the pip package yet, so you will have to use the GitHub repo directly.
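Putting those pieces together, usage might look like the sketch below. The `StubClient` class here stands in for the real client object; `get_attachment` and the `attachments=` keyword come from the comment above, but the exact signatures and return shapes are assumptions, not verified against the repo:

```python
# Hypothetical sketch of the new attachments flow. StubClient is a stand-in
# for the real library client, which actually talks to Claude.
class StubClient:
    def get_attachment(self, file_path):
        # The real client reads the file and wraps it in an AttachmentType.
        return {"file_name": file_path}

    def start_new_conversation(self, name, message, attachments=()):
        # The real client sends the message plus attachments and returns
        # the new conversation's metadata.
        return {"uuid": "fake-uuid", "title": name,
                "attachment_count": len(attachments)}


client = StubClient()
att = client.get_attachment("notes.txt")
result = client.start_new_conversation(
    "New Conversation", "Hi Claude!", attachments=[att]
)
print(result["uuid"])
```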
Thanks! How can I get it? Is it on pip? How can I install it from git?
You should be able to use `pip install git+git://github.com/AshwinPathi/claude-api-py.git` to install the git repo as a package.
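Note that GitHub has since disabled the unauthenticated `git://` protocol, so if that form fails, the `https` form of the same command should work:

```shell
pip install git+https://github.com/AshwinPathi/claude-api-py.git
```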
Thanks
How can I get the response using this method?
The function returns the uuid, not the output
@sdimantsd updated. The `start_new_conversation()` method will return a JSON with the following contents:

```python
return {
    'uuid': {conversation_uuid_string},
    'title': {conversation_title_string},
    'response': {initial_response_string}
}
```
Great, thanks!
Can you return also the ['stop_reason']?
@sdimantsd tbh I'll just return the entire JSON response from the send-message call in the `response` field, if that sounds more reasonable.
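With that change, the `response` field would carry the full send-message JSON, so fields like `stop_reason` become reachable. A minimal sketch of the assumed shape (the nested key names are illustrative, taken from this thread rather than verified against the repo):

```python
# Assumed return shape after the change: `response` holds the whole
# send-message JSON instead of just the reply text.
result = {
    "uuid": "some-conversation-uuid",
    "title": "New Conversation",
    "response": {
        "completion": "Hi! How can I help?",
        "stop_reason": "stop_sequence",
    },
}

# The caller can now dig out stop_reason alongside the reply text.
stop_reason = result["response"]["stop_reason"]
reply_text = result["response"]["completion"]
```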
Just changed the API on GitHub; please try it out and let me know if this works.
Thanks :-D I hope to try it in a few hours.
Hello author, do you have any suggestions for improving speed? When I input text content of thousands of words, it takes 80s to get an answer. Is it because I opened too many conversations? Do I need to delete old conversations after a session is complete? Looking forward to your reply, and thank you.
@winie-hy that's probably a better question to ask in a separate issue, but it's somewhat expected that the API will be slow if you are sending a large prompt and receiving an equally large response.
The Claude API sends responses to your message in "chunks" which are streamed to the client via Server-Sent Events. The entire message is complete only after all the chunks are received, which for large messages can take a while.
My API isn't asynchronous, so I basically block (wait) until all the chunks are received, and then return the entire response to the client.
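Conceptually, that blocking behavior looks like the toy sketch below; the real chunk format and the SSE transport are omitted, and the function name is invented for illustration:

```python
def collect_response(chunks):
    """Join streamed completion fragments into one final message.

    Stands in for the real client, which blocks while reading
    Server-Sent Events until the stream ends.
    """
    parts = []
    for chunk in chunks:  # each chunk is one streamed fragment
        parts.append(chunk)
    return "".join(parts)


# Simulated stream of chunks as they might arrive over SSE.
streamed = ["The quick ", "brown fox ", "jumps."]
full_message = collect_response(streamed)
```

Only once `collect_response` returns does the caller see anything, which is why a large response feels slow end to end.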
@sdimantsd were you able to use the new API? If so, I'll close this issue.
That works great! Thanks
I see an option to create a new conversation and then upload a file in the second message. Can I upload a file directly when creating a new conversation?