AISG-Technology-Team / AISG-Online-Safety-Challenge-Submission-Guide

Submission Guide + Discussion Board for AI Singapore Online Safety Prize Challenge
https://ospc.aisingapore.org

offensive memes - aliencaocao - batched inference? #2

Closed aliencaocao closed 9 months ago

aliencaocao commented 9 months ago

I am writing to check whether batched inference is allowed for the Docker image, or whether it must strictly be one sample per batch. I ask because the speed difference is very large: we could read all of stdin at once and then write all of the results to stdout at once. The competition GitHub repository does not specify this, and it appears possible based on the example scripts given.

AISG-Wesley commented 9 months ago

Hi aliencaocao,

Thanks for your question.

For the purposes of the leaderboard, we do not distinguish how the stdin is processed. You are welcome to experiment with your submission as long as each line from stdin corresponds to an output from stdout.

Referencing the following excerpts from Technical Details: Submission Specification Guidelines:

Input to Docker Container: Your solution must use stdin, where each line from stdin corresponds to a file path to a single image. This simulates a real-world situation where your solution is called on a per-image basis. An example of a line of input from stdin might look like /images/9ad157be-f32f-4770-b409-8c4650478f5b.png .

Output (stdout, stderr) from Container: Solution output (stdout): Your solution must use stdout to output the result of your analysis for each line of input from stdin.
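A minimal sketch of the batched pattern discussed above: read every image path from stdin up front, run inference over the whole batch, then emit exactly one stdout line per input line, in order. The `classify_batch` function is a hypothetical placeholder, not part of the challenge's provided scripts; substitute your own batched model call.

```python
import sys


def classify_batch(paths):
    # Hypothetical batched inference stub: replace the body with your
    # actual model call. Returns one score per input path, in order.
    return [0.5 for _ in paths]


def main():
    # Read all image paths from stdin at once (batched processing).
    paths = [line.strip() for line in sys.stdin if line.strip()]
    # Emit exactly one output line per input line, preserving order,
    # which keeps the one-line-in / one-line-out contract intact.
    for score in classify_batch(paths):
        print(score)


if __name__ == "__main__":
    main()
```

Whether you process paths one at a time or all at once, the key requirement from the excerpt above is the one-to-one correspondence between stdin lines and stdout lines.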

aliencaocao commented 9 months ago

Thank you for the answer.