import yaml
yaml_config = """
base_model: SameedHussain/phi-3.8-flight
dtype: float16
gate_mode: hidden
experts:
  - source_model: SameedHussain/phi-3.8-flight
    positive_prompts:
      - "flight"
  - source_model: SameedHussain/phi-3.8-hotel
    positive_prompts:
      - "hotel"
"""
# Save config as yaml file
with open('config.yaml', 'w', encoding="utf-8") as f:
f.write(yaml_config)
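Before invoking the merge (e.g. `mergekit-moe config.yaml ./merged` with the mergekit-moe CLI; exact flags depend on your mergekit version), it can help to sanity-check that the config parses as valid YAML with the expected shape, since a silently-exiting process is hard to debug otherwise. A minimal sketch using PyYAML, with the same config contents as above:

```python
import yaml

yaml_config = """
base_model: SameedHussain/phi-3.8-flight
dtype: float16
gate_mode: hidden
experts:
  - source_model: SameedHussain/phi-3.8-flight
    positive_prompts:
      - "flight"
  - source_model: SameedHussain/phi-3.8-hotel
    positive_prompts:
      - "hotel"
"""

# Parse the config back to confirm it is valid YAML with the expected structure
cfg = yaml.safe_load(yaml_config)
assert cfg["gate_mode"] == "hidden"
assert len(cfg["experts"]) == 2
print([e["source_model"] for e in cfg["experts"]])
```

Since the process exits without an error message, ruling out a malformed or mis-indented config is a cheap first check.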
I am trying to merge two of my fine-tuned models together. Everything appears to go well, but the process just exits on its own after this:
Loading checkpoint shards: 100%|██████████████████| 2/2 [00:32<00:00, 16.37s/it]
expert prompts: 0%| | 0/2 [00:00<?, ?it/s]We detected that you are passing `past_key_values` as a tuple and this is deprecated and will be removed in v4.43. Please use an appropriate `Cache` class (https://huggingface.co/docs/transformers/v4.41.3/en/internal/generation_utils#transformers.Cache)
2024-08-08 14:29:32.424429: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:485] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-08-08 14:29:32.710288: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:8454] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-08-08 14:29:32.796198: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1452] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
expert prompts: 100%|█████████████████████████████| 2/2 [00:10<00:00, 5.41s/it]