Cloud-CV / evalai-cli

:cloud: :rocket: Official EvalAI Command Line Tool
https://cli.eval.ai
BSD 3-Clause "New" or "Revised" License

Fix #305: Allow users to add submission metadata from cli #315

Closed burnerlee closed 3 years ago

burnerlee commented 3 years ago

Fixes #305 by adding a feature that allows users to provide metadata along with their submissions from the CLI. When a user makes a submission, a request is made to the server to check whether any submission_meta_attributes are available for the corresponding challenge_phase. If any attributes are found, a prompt asks the user whether they want to submit values for them. The user can agree and then fill in a value for each prompted field. The prompts also provide sufficient instructions and choices, as sketched below.
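For readers following the discussion, here is a minimal sketch of that prompt flow, assuming click-based prompts (evalai-cli is a click CLI). This is not the PR's actual code: the function name, the prompt wording, and the attribute fields (name, description, required) are assumptions about the shape of the server's submission_meta_attributes payload.

```python
import click


def prompt_for_meta_attributes(attributes):
    """Ask whether to add submission metadata and collect one value per attribute."""
    if not attributes:
        return None
    if not click.confirm("This challenge phase accepts submission metadata. Add it now?"):
        return {}
    metadata = {}
    for attribute in attributes:
        # A trailing "*" marks required attributes (the "required check"
        # mentioned later in this thread).
        label = "{}{} ({})".format(
            attribute["name"],
            "*" if attribute.get("required") else "",
            attribute.get("description", ""),
        )
        metadata[attribute["name"]] = click.prompt(label, default="", show_default=False)
    return metadata


if __name__ == "__main__":
    # Hypothetical payload shape for illustration only.
    sample = [{"name": "method_name", "description": "name of the method used", "required": True}]
    print(prompt_for_meta_attributes(sample))
```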

burnerlee commented 3 years ago

#305

burnerlee commented 3 years ago

Screenshot from 2021-01-26 23-10-38

burnerlee commented 3 years ago

@Ram81 ping for review

burnerlee commented 3 years ago

@Ram81 all requested changes have been made except https://github.com/Cloud-CV/evalai-cli/pull/315#discussion_r565924669. I'll wait for your reply on that.

burnerlee commented 3 years ago

@Ram81 added the required check as well, along with the * marker.

burnerlee commented 3 years ago

@Ram81 changes made

burnerlee commented 3 years ago

> @burnerlee minor comments. Rest of it LGTM, nice work

Thank you ❤️

burnerlee commented 3 years ago

@Ram81 @RishabhJain2018, this PR and the one on cyclic imports are almost ready to be merged. Could you please assign me some issues on these projects, or any others? Maybe I can implement a feature on one of them.

Ram81 commented 3 years ago

@burnerlee can you share a screenshot with sample submission with meta attributes input?

Ram81 commented 3 years ago

@RishabhJain2018 PR LGTM, can you take a look?

RishabhJain2018 commented 3 years ago

Can we please fix the Travis build?

burnerlee commented 3 years ago

> @burnerlee can you share a screenshot with sample submission with meta attributes input?

#305(2)

burnerlee commented 3 years ago

I already attached this screenshot at https://github.com/Cloud-CV/evalai-cli/pull/315#issuecomment-767709514, if that's what you're asking about, @Ram81.

Ram81 commented 3 years ago

@burnerlee can you fix the failing tests in the Travis build?

Ram81 commented 3 years ago

You might have to update the tests for the submission meta attributes input in the evalai submit command.

burnerlee commented 3 years ago

Oh, sure. I'll just check it out

burnerlee commented 3 years ago

@Ram81 @RishabhJain2018, I ran setup.py locally after installing all the dependencies, but could not understand the error behind the failure of these 5 tests. Could you please give me some insight into this?
Screenshot from 2021-01-31 14-38-14
Screenshot from 2021-01-31 14-38-26
Screenshot from 2021-01-31 14-38-29
Screenshot from 2021-01-31 14-38-39
Screenshot from 2021-01-31 14-38-43

Ram81 commented 3 years ago

For the tests with the evalai submit command we pass N/y for the submission meta attributes input, but after your change this input is requested twice. So these tests might need an additional N/y as input, and if the input is y you might also have to pass values for the metadata attributes. Let me know if you have any more questions.

Ram81 commented 3 years ago

So for the tests with N as input you'll have to replace it with N\nN, and for the tests with Y as input you'll have to pass Y\nvalue1\nvalue2\nvalue3\nvalue4, and then again Y/N for the newly added meta attributes.
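For anyone updating those tests, here is a self-contained sketch of how newline-separated answers feed chained prompts when a click command is invoked through CliRunner. The command, prompt text, and values below are illustrative stand-ins, not the actual evalai submit command or its test suite.

```python
import click
from click.testing import CliRunner


@click.command()
def submit():
    # Illustrative stand-in for the submit flow, not the real evalai command.
    if click.confirm("Include submission metadata?"):
        click.echo("metadata: {}".format(click.prompt("method_name")))
    click.confirm("Make the submission public?")
    click.echo("submitted")


runner = CliRunner()

# Each "\n"-separated token answers one prompt in order, so the extra
# metadata prompt added by this PR needs an extra answer in every test.
declined = runner.invoke(submit, input="N\nN\n")
accepted = runner.invoke(submit, input="Y\nSampleNet\nN\n")
assert "submitted" in declined.output
assert "metadata: SampleNet" in accepted.output
```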

RishabhJain2018 commented 3 years ago

@burnerlee Any update on this? cc: @Ram81

burnerlee commented 3 years ago

Yeah, I'll update this today. I have my last exam today and was preparing for it.

RishabhJain2018 commented 3 years ago

Okay, thanks for letting us know.

burnerlee commented 3 years ago

@Ram81 the requests in the tests are made to the hosted eval server. Could you please tell me the submission_meta_attributes for the test? The challenge id is 1 and the phase id is 2: ["1", "phase", "2", "submit", "--file", "test_file.txt"]. I would need to add the values to enter for the attributes accordingly.

Ram81 commented 3 years ago

@burnerlee those are not actual requests, I think. The responses for those requests are being mocked; see the tests in this PR for reference.
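A minimal sketch of how such a request can be stubbed, assuming the suite uses the `responses` library the way existing evalai-cli tests mock other HTTP calls. The URL path and payload here are purely illustrative, not the actual EvalAI endpoint or schema.

```python
import requests
import responses


@responses.activate
def test_meta_attributes_request_is_mocked():
    # Illustrative URL and payload for the meta attributes fetch.
    url = "https://eval.ai/api/challenges/challenge/1/challenge_phase/2/submission_meta_attributes"
    payload = [{"name": "method_name", "description": "name of the method used", "required": True}]
    responses.add(responses.GET, url, json=payload, status=200)

    # Any code that fetches this URL inside the test now hits the mock
    # instead of the hosted eval server.
    data = requests.get(url).json()
    assert data[0]["name"] == "method_name"


test_meta_attributes_request_is_mocked()
```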

burnerlee commented 3 years ago

@Ram81, after adding some console logs I actually understood the problem. The changes adding \nN have to be made. Along with that, I was facing another error: during the challenge submission I make a request to the server to fetch the attributes for the challenge phase. This causes errors while testing and needs to be handled the same way as the other requests in the tests, maybe hardcoded. I'll now look into how to handle this in testing, since it wouldn't be an actual request, I guess.

burnerlee commented 3 years ago

@Ram81, when the get_meta_attributes function would be called in testing, I think it would be

> @Ram81, after adding some console logs I actually understood the problem. The changes adding \nN have to be made. Along with that, I was facing another error: during the challenge submission I make a request to the server to fetch the attributes for the challenge phase. This causes errors while testing and needs to be handled the same way as the other requests in the tests, maybe hardcoded. I'll now look into how to handle this in testing, since it wouldn't be an actual request, I guess.

Yeah, I understood how everything was working. It works fine now.

burnerlee commented 3 years ago

@RishabhJain2018 @Ram81 the build is fixed. Please review.

burnerlee commented 3 years ago

@RishabhJain2018 I have made all the requested changes and completed the tasks.