Closed deshraj closed 7 years ago
@deshraj Can I work on this? Thanks!
@RishabhJain2018 @deshraj Is there any zip which I can use to mock a successful and failed submission?
I guess when you run the seed command, a challenge will be loaded. This challenge contains an evaluation script that compares your submission file against the annotation file (which contains integers from 1-10). You can use this to mock a successful submission. Make sure that your RabbitMQ server is running.
To mock a failed submission: presently we don't have any evaluation script that throws an error. What you can do is edit main.py of the evaluation_script in the examples folder and upload it again to form a new challenge. To manually raise an error, you can write 1/0 or some other erroneous statement that will throw an error.
We will update here if we plan to add an evaluation script that throws an error when run, so that the stderr output can be checked.
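A minimal sketch of such a deliberately failing evaluation script, assuming the `evaluate()` entry point that EvalAI evaluation scripts expose (the exact signature below is illustrative, not the project's actual one):

```python
# Hypothetical main.py for a challenge's evaluation_script, edited to
# always fail so a failed submission can be mocked.

def evaluate(annotation_file_path, user_submission_file_path, phase_codename, **kwargs):
    # Deliberately erroneous statement: raises ZeroDivisionError, so the
    # worker captures the traceback as the submission's stderr output.
    1 / 0
```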
@the-dagger
@deshraj Is this solved?
No, this isn't solved yet.
Okay, I'll come back to this after I do issue #864.
@deshraj @RishabhJain2018 can I take this one?
@Ram81: sure. Assigning it to you. :)
@deshraj Sorry for being late, I was busy with the #1139. I'll start with this right away.
@Ram81 Are you working on this? :smile:
@isht3 yeah I will be doing this 😃
@deshraj Whenever a submission is made, if at the end the status of the submission is "submitted/accepted" (i.e. serializer.is_valid() is true in jobs/views.py - challenge_submission), then I set the stderr file to None, right? So I have to make changes in the save function of the model, right?
I am not sure if I am following you correctly here. Just to explain the deliverables: if the submission state is finished, then no stderr file should be generated, since the submission has been successfully executed and there is no error.
Please let me know if I am not clear here.
@deshraj understood
@deshraj When I made a submission, it didn't generate any stderr file in the submission folder.
The screenshot that you have shared above seems fine to me, but I am talking about the finished state. When a submission gets the status finished, there should be no stderr file generated. Basically, you have to add a check here https://github.com/Cloud-CV/EvalAI/blob/master/scripts/workers/submission_worker.py#L401 which checks whether the submission status is finished or not, and based on that you create the stderr file.
Please let me know if this is not clear.
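A rough sketch of that check, with the status string and the file-writing callback as assumed stand-ins for the worker's actual status constant and file-creation helper:

```python
# Sketch: create a stderr file only when the submission did not finish
# successfully. "finished" and write_file are illustrative names, not
# the worker's real API.

def create_stderr_file(submission_status, stderr_content, write_file):
    if submission_status == "finished":
        # Successful run: skip creating a blank stderr file.
        return None
    # Failed run: persist the captured stderr output.
    return write_file(stderr_content)
```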
@deshraj Can you share a successful submission file for testing the changes made?
Sorry @Ram81, I missed this somehow. Can you please tell me the name of the challenge for which you want to do the submission?
@deshraj https://github.com/Cloud-CV/EvalAI-Examples/tree/master/vqa_challenge vqa challenge
@Ram81 Please don't use the VQA challenge for debugging purposes. Please use the default challenge that we create using the command python manage.py seed, and use any file to submit as a submission file.
You could also use https://github.com/Cloud-CV/EvalAI/blob/master/examples/example1/test_annotation.txt as a submission file.
@Ram81 Are you using the Jasmine challenge to debug?
@isht3 FYI: the name of the challenge can be anything since it is generated randomly. Anyways, the name of the challenge doesn't matter. The submission file that I shared will work with any challenge generated using the python manage.py seed script.
@deshraj I will use the test_annotation file or try using the default challenge.
@deshraj The problem was that the seed command was not creating a default challenge for me. All the migrations were done, but the challenge is not coming up.
@deshraj The same happens with me: when I seed, there is no default challenge created.
@Ram81 You're getting the migration error?
Interesting. Let me try on my local machine and then I will get back to you.
@Ram81 What version of Python is installed in your virtualenv?
@isht3 Python 2.7 & I didn't get any migration error
@Ram81 What do you get when you run the seed command?
@isht3 We can talk on Gitter; let's discuss only issue details here.
Observed behavior
Currently, when there is a successful submission, a stderr_file is still created for that submission. This leads to the creation of a blank file that is of no use and adds a small overhead in processing successful submissions.
Expected behavior
When there is a successful submission, the stderr_file field should be set to None instead of creating a blank file.
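The expected behavior can be sketched as a small helper; the submission dict, status string, and save_file callback below are illustrative stand-ins for the actual Submission model fields and storage API:

```python
# Sketch of the expected behavior: after a run, leave stderr_file as None
# for successful submissions instead of attaching an empty file.

def finalize_submission(submission, status, stderr_text, save_file):
    submission["status"] = status
    if status == "finished":
        submission["stderr_file"] = None  # no blank file on success
    else:
        submission["stderr_file"] = save_file(stderr_text)
    return submission
```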