Hi! First of all, thanks for the awesome repo!

I'm currently trying to deploy my own clip-retrieval API using my own data (stored on my local machine). The dataset is in webdataset format, which I used to build and call the API. I managed to get retrieval results from the API, but I'm trying to figure out how to visualize the images.
For example, the API returns responses like:
[
{
"caption": "A small white dog wearing a black hat. ",
"image_path": "00001153",
"id": 108,
"similarity": 0.4931468367576599
},
{
"caption": "A lit candle in the shape of an elephant.",
"image_path": "00001688",
"id": 1152,
"similarity": 0.4843543767929077
},
...
]
I would like to visualize these images, but since they are stored in .tar files (webdataset format), I cannot view them directly. Is there a way to visualize the retrieved results?
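For reference, here is a rough sketch of what I am hoping to do, assuming (on my part) that the image_path field is the sample key inside one of the .tar shards and that I know which shard to open; the shard filename below is just a placeholder:

import io
import tarfile
from PIL import Image

def load_image_from_shard(shard_path, key):
    # A webdataset shard is a plain tar archive whose members are named
    # <key>.<extension>, e.g. "00001153.jpg" and "00001153.txt".
    with tarfile.open(shard_path, "r") as tar:
        for member in tar.getmembers():
            if "." not in member.name:
                continue
            stem, ext = member.name.rsplit(".", 1)
            if stem == key and ext.lower() in ("jpg", "jpeg", "png", "webp"):
                data = tar.extractfile(member).read()
                return Image.open(io.BytesIO(data))
    raise KeyError(f"no image with key {key!r} in {shard_path}")

# "image_path" from the API response used as the key; the shard name is made up.
img = load_image_from_shard("dataset/00000.tar", "00001153")
img.show()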
Thank you!