jaresty closed this 1 month ago
You can explain your last exchange with 'model explain prompt exchange'
For what it's worth, the idea for this came from page four of this draft book from Michael Feathers (the author of Working Effectively With Legacy Code) https://www.linkedin.com/posts/michaelfeathers_use-waywords-ugcPost-7227367323878187008-RNZL?utm_source=share&utm_medium=member_desktop
Sorry but I don't think I am going to merge this. I think it is a bit too specific. If the user wants to do this, I think it is better for them to just copy the info from the debug output and run a model command on it. I don't really like merging destinations that require storing / modifying additional state unless there is strong need for it.
We can reopen if there is need down the line.
Can we discuss? I think there is value here beyond simply this use case. This is helpful for a variety of reasons and it's not so easy to do manually. Missing this tempts me to run off of a fork, since I can't implement it myself otherwise.
- Replaced `thread_to_string` with `chats_to_string` to improve clarity in handling chat threads.
- Added `gptRequest` and `gptExchange` to support retrieval of previous requests and exchanges.
- Updated `send_request` to store the last request and response in `GPTState` for better context management.
- Removed the `GPTState.last_response` assignment in `gpt_query`.

These changes enhance the ability to run 'explain prompt' and understand prompt-response mechanics.
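The core of the change is small: have the request path record the last exchange in a shared state object so a later command can explain it. A minimal sketch of that idea follows; `GPTState`, `send_request`, and the `last_request`/`last_response` fields are names taken from the discussion above, while `query_model` and the exact shapes are hypothetical stand-ins, not the PR's actual code.

```python
class GPTState:
    """Global holder for the most recent prompt/response pair,
    so a follow-up 'explain prompt exchange' command can retrieve it."""
    last_request: dict = {}
    last_response: str = ""


def query_model(request: dict) -> str:
    # Hypothetical stand-in for the real GPT call.
    return f"echo: {request['content']}"


def send_request(prompt: str) -> str:
    request = {"role": "user", "content": prompt}
    # Record the request before dispatch, then the response after,
    # instead of assigning last_response inside gpt_query.
    GPTState.last_request = request
    response = query_model(request)
    GPTState.last_response = response
    return response
```

With this shape, an "explain prompt exchange" command only needs to read `GPTState.last_request` and `GPTState.last_response` and feed them back to the model, rather than asking the user to copy the debug output by hand.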