kijai / ComfyUI-Florence2
Inference Microsoft Florence2 VLM
MIT License · 280 stars · 15 forks
Issues (sorted newest first)
#28 · Error occurred when executing DownloadAndLoadFlorence2Model: · liaceboy · opened 1 hour ago · 1 comment
#27 · [Feature Request] Separate Region_caption objects · toyxyz · opened 21 hours ago · 0 comments
#26 · Is there a better way to achieve this effect? · 1dadou1 · opened 1 day ago · 1 comment
#25 · [Discussion] Some Observations on Florence2 · dnl13 · opened 2 days ago · 2 comments
#24 · Custom SD checkpoint, please. · razvanab · closed 2 days ago · 1 comment
#23 · No module named 'flash_attn_2_cuda' · 1-eyx · opened 3 days ago · 2 comments
#22 · Error occurred when executing DownloadAndLoadFlorence2Model: · yonghao5 · opened 4 days ago · 1 comment
#21 · 'ImageFont' object has no attribute 'font_variant' · ffhelly · opened 5 days ago · 2 comments
#20 · DocVQA inference support · orabazes · closed 5 days ago · 1 comment
#19 · Error occurred when executing DownloadAndLoadFlorence2Model: · czlaczi · opened 6 days ago · 1 comment
#18 · Error occurred when executing DownloadAndLoadFlorence2Model: Object of type Florence2LanguageConfig is not JSON serializable · Amit30swgoh · opened 6 days ago · 2 comments
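Issue #18 reports that a `Florence2LanguageConfig` object cannot be serialized by `json.dumps`. A minimal sketch of the usual workaround for any non-serializable object: pass a `default=` hook that converts unknown types to plain dicts. `CustomConfig` here is a hypothetical stand-in for the model config class, not code from this repository:

```python
import json

class CustomConfig:
    """Hypothetical stand-in for a config object that json.dumps
    cannot serialize directly."""
    def __init__(self, vocab_size: int, hidden_size: int):
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size

def to_jsonable(obj):
    # Fall back to the object's attribute dict for unknown types;
    # json.dumps calls this only for objects it cannot handle itself.
    if hasattr(obj, "__dict__"):
        return vars(obj)
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

cfg = {"model": "florence2", "config": CustomConfig(51289, 768)}
print(json.dumps(cfg, default=to_jsonable))
```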
#17 · FlashAttention2 has been toggled on, but it cannot be used due to the following error: the package flash_attn seems to be not installed. Please refer to the documentation of https://huggingface.co/docs/transformers/perf_infer_gpu_one#flashattention-2 to install Flash Attention 2. · alenhrp · opened 6 days ago · 2 comments
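Several issues here (#23, #17, #8, #2, #1) trace back to the `flash_attn` package being missing or unusable on the user's platform. A minimal sketch of a common mitigation, assuming a Transformers-style loader: probe for the package and fall back to PyTorch's built-in SDPA attention instead of hard-requiring FlashAttention-2 (the `from_pretrained` usage in the comment is illustrative, not this node's actual code):

```python
import importlib.util

def pick_attention_implementation() -> str:
    """Return "flash_attention_2" only if the flash_attn package is
    importable; otherwise fall back to scaled-dot-product attention,
    which works on platforms where flash_attn cannot be installed."""
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attention_2"
    return "sdpa"

# The chosen value could then be passed along when loading, e.g.:
# model = AutoModelForCausalLM.from_pretrained(
#     "microsoft/Florence-2-base",
#     attn_implementation=pick_attention_implementation(),
#     trust_remote_code=True,
# )
```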
#16 · SyntaxWarning: "is not" with a literal. Did you mean "!="? if text_input is not "" · RGX650 · closed 1 week ago · 1 comment
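The warning in #16 comes from comparing a string literal with `is not`, which tests object identity rather than equality. A minimal reproduction of the fix (the function name is illustrative):

```python
def describe(text_input: str) -> str:
    # Buggy form (emits SyntaxWarning: "is not" with a literal):
    #     if text_input is not "":
    # `is` compares object identity, so the result depends on string
    # interning, not on the string's contents. Compare values instead.
    if text_input != "":
        return "non-empty"
    return "empty"
```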
#15 · What's the difference of the 4 models? · deepfree2023 · opened 1 week ago · 3 comments
#14 · update node.py · nero-dv · closed 1 week ago · 2 comments
#13 · resolves typo in nodes.py:164; Cleaned formatting in nodes.py · nero-dv · closed 1 week ago · 0 comments
#12 · Not seeing any auto-downloaded LLM in ComfyUI folder after clone this repository in ComfyUI/custom_nodes · ChenxiangLi0620 · opened 1 week ago · 3 comments
#11 · Why do strange slashes often appear in masks? · Datou · opened 1 week ago · 1 comment
#10 · Please add a function description · alex13by · closed 1 week ago · 0 comments
#9 · OCR - end and beginning of different lines stick together · Ratinod · closed 1 week ago · 4 comments
#8 · Error: FlashAttention2 has been toggled on, but it cannot be used [Windows 11] · Vigilence · closed 1 week ago · 5 comments
#7 · Could not locate the configuration_florence2.py inside · alexcc4 · closed 1 week ago · 6 comments
#6 · add OCR option · marcojoao · closed 1 week ago · 2 comments
#5 · 'list' object has no attribute 'replace' · formulake · closed 1 week ago · 2 comments
#4 · Needs some string cleanup to get rid of the <s> and </s> in the caption output · RandomGitUser321 · opened 1 week ago · 5 comments
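Issue #4 asks for the `<s>` and `</s>` sequence tokens to be stripped from decoded captions. A minimal sketch of that cleanup step, assuming the raw decoded string still contains those markers (the function name is illustrative):

```python
import re

def clean_caption(raw: str) -> str:
    """Remove the <s> and </s> sequence-boundary tokens from decoded
    caption text and trim surrounding whitespace."""
    cleaned = re.sub(r"</?s>", "", raw)
    return cleaned.strip()

# clean_caption("<s>a photo of a cat</s>") -> "a photo of a cat"
```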
#3 · torch.cat(): expected a non-empty list of Tensors · LankyPoet · closed 1 week ago · 4 comments
#2 · New Error "Task token should be the only token in the text". After fix this Error: using , flash_attention_2 for attention , FlashAttention2 has been toggled on, · Erwin11 · closed 1 week ago · 4 comments
#1 · There is a way to use flash attention in AMD Windows? RX580 · KillyTheNetTerminal · opened 1 week ago · 5 comments