-
* Name of dataset: Google Conceptual Captions
* URL of dataset: https://ai.google.com/research/ConceptualCaptions/download
* License of dataset: unknown
* Short description of dataset and use case(…
-
Hello, thank you for your efforts in writing this code and putting it up on GitHub. I really appreciate it.
I have an issue with it: when I execute it in an Anaconda environment, it gives me the…
-
Location: /pghbio/dbmi/batmanlab/Data/radiologyTextDataset2/singla/RAD-ALL.deid
**Keyword or concept tagging**
- [x] Noble Coder Named Entity Recognition (NER) engine for biomedical text. [DBMI…
-
Change the custom loader you are using to the Dataset API. Keep the same functionality, such as data augmentation.
Change the rest of the code to integrate it.
https://www.tensorflow.org/tutorials/e…
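For what it's worth, here is a minimal sketch of what that migration might look like, assuming an image/caption loader; the image size, batch size, and augmentation ops are placeholders to be replaced with whatever the current custom loader does:

```python
import tensorflow as tf

# Placeholder values; substitute the project's actual settings.
IMG_SIZE = 224
BATCH = 32

def load_example(image_path, caption):
    # Decode and resize one image; the caption passes through untouched.
    img = tf.io.read_file(image_path)
    img = tf.io.decode_jpeg(img, channels=3)
    img = tf.image.resize(img, [IMG_SIZE, IMG_SIZE]) / 255.0
    return img, caption

def augment(img, caption):
    # Re-express the old loader's augmentations as tf ops (examples only).
    img = tf.image.random_flip_left_right(img)
    img = tf.image.random_brightness(img, max_delta=0.1)
    return img, caption

def make_dataset(image_paths, captions, training=True):
    ds = tf.data.Dataset.from_tensor_slices((image_paths, captions))
    ds = ds.map(load_example, num_parallel_calls=tf.data.AUTOTUNE)
    if training:
        ds = ds.shuffle(1024).map(augment, num_parallel_calls=tf.data.AUTOTUNE)
    return ds.batch(BATCH).prefetch(tf.data.AUTOTUNE)
```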
-
Hi, thank you very much for open-sourcing this. I want to use my own image, caption, and QA data to fine-tune BLIP2. Should my process be to prepare my dataset in the same format as OK-VQA, and then run the …
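In case it helps, here is a sketch of converting your own QA data into OK-VQA-style annotations. The field names (`image`, `question`, `answer`, `question_id`) are an assumption modeled on common VQA annotation layouts, so verify them against the repo's OK-VQA dataset config before training:

```python
import json

# Hypothetical input: your own (image, question, answer) triples.
my_samples = [
    {"img": "images/0001.jpg", "q": "What is the dog doing?", "a": "running"},
]

# Assumed OK-VQA-style record layout (check field names against the
# repo's dataset builder/config before use).
annotations = [
    {
        "image": s["img"],
        "question": s["q"],
        "answer": [s["a"]],   # answers are typically stored as a list
        "question_id": i,
    }
    for i, s in enumerate(my_samples)
]

with open("my_vqa_train.json", "w") as f:
    json.dump(annotations, f, indent=2)
```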
-
I'd like to know how to evaluate scenery with impression words such as "beautiful," "dark," or "majestic."
I want to do something like image captioning,
but rather than object detection that tells what is shown, I want the output to be something like a person's impressions (impression words / emotion words).
As for the dataset, there is research that builds one by collecting images and comments from SNS such as flickr/instagram, so …
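A minimal sketch of that dataset-building idea, filtering SNS comments down to pairs that contain an impression word; the lexicon and the input records here are made-up placeholders, not tied to any real API:

```python
# Keep only (image, comment) pairs whose comment contains an
# impression/emotion word. Lexicon and records are illustrative only.
IMPRESSION_WORDS = {"beautiful", "dark", "majestic", "gloomy", "serene"}

raw_pairs = [
    ("img/001.jpg", "such a beautiful sunset over the bay"),
    ("img/002.jpg", "my new phone arrived today"),
]

dataset = [
    (image, comment)
    for image, comment in raw_pairs
    if IMPRESSION_WORDS & set(comment.lower().split())
]
# -> [("img/001.jpg", "such a beautiful sunset over the bay")]
```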
-
Professor Luo, I am a beginner and I want to reproduce these models on my own dataset. My dataset is a simple image captioning dataset, and I want to extract attention (att) features for further train…
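One common reading of "att features" in captioning codebases is spatial grid features from a CNN backbone; assuming that interpretation (and not this repo's exact extractor), a PyTorch sketch:

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Assumption: "att" features = spatial grid features from a CNN backbone,
# e.g. a 14x14x2048 ResNet-101 grid, as in many attention-based captioners.
resnet = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)
backbone = torch.nn.Sequential(*list(resnet.children())[:-2]).eval()  # drop avgpool+fc

preprocess = T.Compose([
    T.Resize((448, 448)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

img = preprocess(Image.open("example.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    fmap = backbone(img)                    # (1, 2048, 14, 14)
att = fmap.flatten(2).permute(0, 2, 1)      # (1, 196, 2048): one vector per region
```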
-
I ran the code at the paths (\LeakGAN-master\Image COCO\Main.py) and (\LeakGAN-master\Image COCO\eval_bleu.py), but I can't get the same results as in the paper. I only changed the Python version (3…
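BLEU scores are quite sensitive to tokenization, smoothing, and n-gram weights, so a mismatch with the paper does not necessarily mean the model differs. A minimal NLTK sketch for sanity-checking, with toy data standing in for the generated samples and COCO references:

```python
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Toy data; in practice, load the generated samples and the COCO references.
references = [[["a", "man", "rides", "a", "horse"]]]  # reference list per hypothesis
hypotheses = [["a", "man", "is", "riding", "a", "horse"]]

# BLEU-3 with smoothing; the choice of weights and smoothing method can
# shift scores noticeably, which alone may explain a gap from the paper.
score = corpus_bleu(
    references, hypotheses,
    weights=(1/3, 1/3, 1/3),
    smoothing_function=SmoothingFunction().method1,
)
print(f"BLEU-3: {score:.4f}")
```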
-
A closed captioning icon on the video activity should toggle subtitles for the video.
Ideally, the subtitles should reflect the language of the video. I think the Right Way is to drop in standard s…
-
CC @Iamgoofball
1. Not sure about this one, but since it's just `ai.say(";[message]")`, I imagine that if comms are down, there will be no captions.
2. Fake-words like `shonk` will show up raw, s…