dres-dev / DRES

Distributed Retrieval Evaluation Server
MIT License

Support of submitting massive number of video shots to the system #483

Open nikkiwoo-gh opened 4 months ago

nikkiwoo-gh commented 4 months ago

Is your feature request related to a problem? Please describe. I am trying to use the DRES system for AVS annotation. I have a pool of video shots that needs to be submitted to DRES for evaluation, e.g., 10k video shots per query. Currently, I use the API to submit the video shots; however, I cannot submit too many at a time, and submitting the whole pool is time-consuming.
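To illustrate the workaround described above, here is a minimal sketch of splitting a large pool of shot IDs into batches below the observed limit before submitting each batch through the API. The shot naming, the batch limit of 300, and the idea of submitting batch by batch are assumptions for illustration, not part of the actual DRES client:

```python
# Hypothetical sketch: chunk a pool of ~10k shots into batches of at most
# 300 (the limit reported above) so each batch can be submitted separately.
from typing import Iterable, List


def chunked(items: List[str], batch_size: int) -> Iterable[List[str]]:
    """Yield consecutive batches of at most batch_size items."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]


shots = [f"shot_{i:05d}" for i in range(10_000)]  # e.g. 10k shots for one query
batches = list(chunked(shots, 300))

print(len(batches))      # 34 batches
print(len(batches[-1]))  # 100 shots in the final, partial batch
```

Each batch would then be passed to whatever submission call the client uses, which keeps every request under the per-request limit at the cost of many sequential requests.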

Describe the solution you'd like I wonder if the system could have an upload button in the frontend (UI) for submission. That way, I could submit the pool by uploading a file that lists the video shots.

Describe alternatives you've considered Allow submissions of a larger batch size. Currently, it is less than 300?

Additional context Can I change the submission batch size before building DRES? If so, where can I change it?

Thanks a lot.

sauterl commented 4 months ago

Thanks for your issue.

I try to use the DRES system for AVS annotation.

I assume you are trying to evaluate Ad-hoc Video Search (AVS) system(s)? DRES is intended for evaluations, not for annotating a dataset (and exporting that information).

As of now, DRES is primarily built for interactive evaluation campaigns, where AVS runs tend to comprise fewer than 2.5k submissions at a time -- we have never tried 10k submissions so far.

Before considering this feature request, we'd need some more information:

  1. How do you plan to judge these 10k submissions? The typical AVS setup (and the task type preset) is configured for manual judgement, which seems infeasible with 10k submissions.
  2. How do you currently upload the submissions?

I wonder if the system could have an upload button in the frontend (UI) for submission

The frontend (UI) of DRES is designed and built for these use cases:

  1. Designing and building the evaluation (admin)
  2. Orchestrating / directing the evaluation (admin)
  3. Manual judgement of incoming submissions in certain cases (judge)
  4. Getting information about the (ongoing) evaluation, particularly the task (participant)

Intentionally, there is no UI for submissions, as these should be made through the REST API by client software - did you try this? (There is an OpenAPI UI for this: https://editor.swagger.io/?url=https://raw.githubusercontent.com/dres-dev/DRES/master/doc/oas-client.json )
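For reference, a client-side submission via the REST API might look roughly like the sketch below. The base URL, endpoint path, session handling, and payload schema here are assumptions for illustration only -- please check the OpenAPI spec linked above (oas-client.json) for the exact contract of your DRES version:

```python
# Hedged sketch of a scripted submission to DRES via its REST API, using
# only the Python standard library. All identifiers marked "assumed" are
# placeholders, not guaranteed to match the real DRES API.
import json
from urllib import request

BASE_URL = "http://localhost:8080"   # assumed DRES deployment URL
SESSION = "my-session-token"         # assumed: obtained by logging in first
EVALUATION_ID = "my-evaluation-id"   # assumed evaluation identifier


def build_submission(shots):
    """Build a JSON body listing one answer per (item, start_ms, end_ms) shot."""
    return {
        "answerSets": [{
            "answers": [
                {"mediaItemName": item, "start": start_ms, "end": end_ms}
                for item, start_ms, end_ms in shots
            ]
        }]
    }


def submit(shots):
    """POST one batch of shots to the (assumed) submission endpoint."""
    body = json.dumps(build_submission(shots)).encode()
    req = request.Request(
        f"{BASE_URL}/api/v2/submit/{EVALUATION_ID}?session={SESSION}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # requires a running DRES instance
        return json.load(resp)


# Build (but do not send) a one-shot payload to show the assumed shape:
payload = build_submission([("v0001_shot12", 1000, 2000)])
print(json.dumps(payload))
```

Looping this over the batches of a large pool would automate submission without any UI, at the cost of one HTTP request per batch.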