Closed — cchudant closed this 2 years ago
Thanks a lot! Will review asap
@CerineBnsd Can you use Black on the python code?
Thanks for formatting the code. Can one of you bump the client version to 0.3.0 and update the gitbook documentation to reflect all those changes?
I'll update the docstrings, the versions, the client CHANGELOG, and the interface reference in gitbook. I think it would be better if @/Charles updates the server-related parts, as he can explain them better (especially the refactor, as it isn't added yet as far as I know). Otherwise, I'll try to do it.
The refactor has already been documented! What is left to document is the new feature. You can do the client part and @cchudant the server part.
It's done for the client-side.
Thanks! However, as the server-side documentation is not ready yet and the pull request is not merged yet, can you push your changes to the docs as a pull request instead? This way we can merge them the moment the feature lands for good on the main branch.
That's right, I didn't pay attention. I'll fix it.
I've seen that the client now returns the output of the inference instead of a response object with an "output" field. Was this intended? @CerineBnsd
@JoFrost We are currently pushing the signatures into an array, and I was wondering whether this is the best idea. There is currently no good way to get the signature of an individual signed call; maybe we should return the proof in the response object returned by run_model and send_model. What do you think?
I will update the server-side documentation.
@cchudant For instance, the response object contains the hash plus the output of the inference. Since we don't return the response containing the hash in upload_model, it's more consistent not to return the signature in run_model either, and then the only information left in the response object would be the output array.
I definitely agree with your suggestion. @clauverjat What do you think?
@cchudant Let's apply your suggestion. In any case, we are only supposed to get the signature of the model we request, not something else.
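To make the agreed-upon change concrete, here is a minimal sketch of what the suggestion could look like: each call to run_model returns its own proof inside the response object, instead of pushing signatures into a shared array. All names and fields here are hypothetical illustrations, not the project's actual API.

```python
from dataclasses import dataclass


@dataclass
class RunModelResponse:
    """Hypothetical per-call response: output plus its own proof."""
    output: list    # inference result
    signature: bytes  # signature covering this individual call


def run_model(inputs):
    # Placeholder logic: a real client would send the inputs to the
    # inference server and receive a signed (input, output) pair back.
    output = [x * 2 for x in inputs]  # dummy inference
    signature = b"signed:" + repr(output).encode()
    return RunModelResponse(output=output, signature=signature)


resp = run_model([1.0, 2.0])
print(resp.output)     # the inference result
print(resp.signature)  # proof for this call, no shared array needed
```

With this shape, the signature of any individual signed call is retrievable directly from its response, which addresses the concern raised above.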
Alright, the protected branch rules work perfectly.
I will do a final review once you are done. We should expect to merge this on Friday if everything goes well. Don't forget the server-side documentation as well.
@cchudant The run-tests CI failed again. According to the logs, the protobuf Python file generation is no longer working.
I'll take care of that on Friday. It might be a CI caching issue.
Closes #13
Thanks to @CerineBnsd for the client part!