huggingface/optimum-intel
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
https://huggingface.co/docs/optimum/main/en/intel/index
Apache License 2.0 · 413 stars · 112 forks

Issues, sorted by newest:
| # | Title | Author | Status | Comments |
|---|-------|--------|--------|----------|
| #990 | add saving safety_checker | eaidova | closed 2 weeks ago | 1 |
| #989 | add patching for update_causal_mask to falcon for >= 4.45 | eaidova | closed 2 weeks ago | 4 |
| #988 | Introduce support for NF4 data type for OV weight compression | l-bat | closed 2 weeks ago | 3 |
| #987 | INC/IPEX support with torch 2.2/2.4 | IlyasMoutawwakil | opened 3 weeks ago | 3 |
| #986 | fix loading safety checker in pipeline | eaidova | closed 3 weeks ago | 0 |
| #985 | Add a full OV testing GHA workflow for nightly | nikita-savelyevv | opened 3 weeks ago | 1 |
| #984 | fix order of hidden states in text encoder | eaidova | closed 3 weeks ago | 1 |
| #983 | fix getting default diffusion pipeline parameters from config | eaidova | closed 3 weeks ago | 1 |
| #982 | Add SDPA to scope overrides | nikita-savelyevv | closed 3 weeks ago | 1 |
| #981 | refine Dockerfile to support both cpu and xpu platform | kaixuanliu | opened 4 weeks ago | 1 |
| #980 | Not able to find `GLIBCXX_3.4.32' | iffishells | opened 4 weeks ago | 1 |
| #979 | enable gpt2, falcon has core dump error in PagedAttention.single_quer… | jiqing-feng | closed 3 weeks ago | 0 |
| #978 | fix diffusers version info in IR | eaidova | closed 3 weeks ago | 1 |
| #977 | phi3 vision | eaidova | closed 2 weeks ago | 1 |
| #976 | Restore SDPA in Gemma2 models for transformers > 4.45 | eaidova | closed 1 month ago | 4 |
| #975 | Clean and clear CI | IlyasMoutawwakil | closed 3 weeks ago | 1 |
| #974 | Update `run_ocr_post_training.py` reference | emmanuel-ferdman | closed 1 week ago | 2 |
| #973 | [OV]: Updated the list of notebooks in the README | AlexKoff88 | closed 1 month ago | 1 |
| #972 | add minicpmv support | eaidova | closed 3 weeks ago | 1 |
| #971 | request for mllama | yash3056 | closed 3 days ago | 1 |
| #970 | fix switching between legacy and new processing for llava | eaidova | closed 2 weeks ago | 1 |
| #969 | add support of nanollava model | eaidova | closed 3 weeks ago | 1 |
| #968 | fix conversion for text embeddings for fp16 models | eaidova | closed 2 weeks ago | 1 |
| #967 | fix device selection for compilation language model in vlm and model saving | eaidova | closed 1 week ago | 1 |
| #966 | fix config saving when check on misplaced args broken | eaidova | closed 2 weeks ago | 5 |
| #965 | Follow-up request after #916 due to issues on the OpenVINO side | andrei-kochin | closed 3 days ago | 3 |
| #964 | add token_type_ids in lm forward signature | eaidova | closed 4 weeks ago | 1 |
| #963 | text | eaidova | closed 1 month ago | 1 |
| #962 | disable warning about tokenizers version for ov tokenizers >= 2024.5 | eaidova | closed 1 month ago | 1 |
| #961 | restore original model_index.json after save_pretrained call | eaidova | closed 1 month ago | 1 |
| #960 | transformers 4.46 compatibility | echarlaix | closed 1 month ago | 2 |
| #959 | fix tmp dir saving | eaidova | closed 1 month ago | 2 |
| #958 | enable qkv concat layer | jiqing-feng | closed 1 month ago | 0 |
| #957 | updated OVPipelinePart to have separate ov_config | e-ddykim | closed 1 month ago | 1 |
| #956 | Export model error | eightreal | closed 1 month ago | 1 |
| #955 | Added notebook to showcase quantization of Sentence Transformers model | AlexKoff88 | closed 1 month ago | 5 |
| #954 | fix bug when doing beam search | kaixuanliu | closed 1 month ago | 0 |
| #953 | Install torchvision CPU in OpenVINO notebook tests | helena-intel | closed 1 month ago | 1 |
| #952 | fix compatibility with diffusers < 0.25.0 | eaidova | closed 1 month ago | 2 |
| #951 | Quantization support for CausalVisualLMs | nikita-savelyevv | closed 2 weeks ago | 2 |
| #950 | fix doc build | echarlaix | closed 1 month ago | 1 |
| #949 | add env variable for slow tests | echarlaix | closed 1 month ago | 0 |
| #948 | Symbol use in optimum: fix misprint | jane-intel | closed 1 month ago | 8 |
| #947 | Exporting tokenizers to OpenVINO is not supported for tokenizers version > 0.19 ? | JamieVC | closed 6 days ago | 6 |
| #946 | Performance and AMX utilization questions with optimum-intel 1.17 and 1.20 for LLM Inference on SPR CPU | zsym-sjtu | opened 1 month ago | 0 |
| #945 | refine class IPEXPagedCache's update method | kaixuanliu | closed 1 month ago | 1 |
| #944 | refine class IPEXPagedCache's update method | kaixuanliu | closed 1 month ago | 0 |
| #943 | refine class IPEXPagedCache's update method | kaixuanliu | closed 1 month ago | 0 |
| #942 | Why is inference_mode removed? | zsym-sjtu | closed 1 month ago | 1 |
| #941 | allow to use SDPA in clip models | eaidova | closed 1 month ago | 3 |
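Several of the entries above concern OpenVINO weight compression (for example #988, which introduces the NF4 data type, and #951 on quantization support for visual language models). For orientation only, here is a minimal sketch of the existing 4-bit weight-compression export path in optimum-intel; the model id, group size, and output directory are illustrative placeholders, and the exact way NF4 is exposed after #988 is not assumed here.

```python
# Minimal sketch: export a causal LM to OpenVINO IR with 4-bit weight
# compression via optimum-intel. The int4 path shown here predates #988;
# the NF4 data type added by that PR is not spelled out because its exact
# configuration knob is not given in this listing.
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig

model_id = "gpt2"  # placeholder model, chosen only for illustration

# Weight-only 4-bit compression, applied group-wise during export.
quantization_config = OVWeightQuantizationConfig(bits=4, group_size=128)

# export=True converts the Transformers checkpoint to OpenVINO IR on the fly
# and applies the compression described by quantization_config.
model = OVModelForCausalLM.from_pretrained(
    model_id,
    export=True,
    quantization_config=quantization_config,
)
model.save_pretrained("gpt2-ov-int4")  # illustrative output directory
```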