Closed KaifAhmad1 closed 4 months ago
Hi @KaifAhmad1, it seems like you are trying to run code from distilabel<1.0.0, but you have installed a version >=1.0.0 (we currently have a distilabel.llms module). Could you check?
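One way to check which version is actually installed is via the standard library (a minimal, generic sketch; `"distilabel"` is just the package name being queried, and the helper name is ours, not part of any library):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(package: str):
    """Return the installed version string of `package`, or None if it is not installed."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# In the reporter's environment this would return something like "1.0.3";
# it returns None if the package is missing.
print(installed_version("distilabel"))
```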
Also @KaifAhmad1, if you are willing to upgrade to v1.0.0 or higher, we're happy to support the transition from distilabel v0.6.0 or lower to v1.0.0 😄
Hey, @plaguss Here is the version of distillable distilabel Version: 1.0.3
I have tried this script but getting same error
!pip install -q -U "distilabel<1.0.0" "farm-haystack[preprocessing]"
!pip install -q -U "distilabel[hf-inference-endpoints, argilla]"
Apparently the second command is overwriting the version of distilabel; try the following:
!pip install -q -U "distilabel[hf-inference-endpoints, argilla]<1.0.0"
@plaguss Now I'm getting this exception!
!pip install -q -U "distilabel<1.0.0" "farm-haystack[preprocessing]"
!pip install -q -U "distilabel[hf-inference-endpoints, argilla]<1.0.0"
import os
from typing import Dict
from distilabel.llm import InferenceEndpointsLLM
from distilabel.pipeline import Pipeline, pipeline
from distilabel.tasks import TextGenerationTask, SelfInstructTask, Prompt
from datasets import Dataset
from haystack.nodes import PDFToTextConverter, PreProcessor
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /usr/local/lib/python3.10/dist-packages/IPython/core/interactiveshell.py:3553 in run_code │
│ │
│ 3550 │ │ │ │ elif async_ : │
│ 3551 │ │ │ │ │ await eval(code_obj, self.user_global_ns, self.user_ns) │
│ 3552 │ │ │ │ else: │
│ ❱ 3553 │ │ │ │ │ exec(code_obj, self.user_global_ns, self.user_ns) │
│ 3554 │ │ │ finally: │
│ 3555 │ │ │ │ # Reset our crash handler in place │
│ 3556 │ │ │ │ sys.excepthook = old_excepthook │
│ in <cell line: 4>:4 │
│ │
│ /usr/local/lib/python3.10/dist-packages/distilabel/llm/__init__.py:18 in <module> │
│ │
│ 15 from distilabel.llm.anyscale import AnyscaleLLM │
│ 16 from distilabel.llm.base import LLM, LLMPool, ProcessLLM │
│ 17 from distilabel.llm.google.vertexai import VertexAIEndpointLLM, VertexAILLM │
│ ❱ 18 from distilabel.llm.huggingface.inference_endpoints import InferenceEndpointsLLM │
│ 19 from distilabel.llm.huggingface.transformers import TransformersLLM │
│ 20 from distilabel.llm.llama_cpp import LlamaCppLLM │
│ 21 from distilabel.llm.mistralai import MistralAILLM │
│ │
│ /usr/local/lib/python3.10/dist-packages/distilabel/llm/huggingface/inference_endpoints.py:38 in │
│ <module> │
│ │
│ 35 │ │ InferenceTimeoutError, │
│ 36 │ │ get_inference_endpoint, │
│ 37 │ ) │
│ ❱ 38 │ from huggingface_hub.inference._text_generation import TextGenerationError │
│ 39 │ │
│ 40 │ _INFERENCE_ENDPOINTS_API_RETRY_ON_EXCEPTIONS = ( │
│ 41 │ │ InferenceTimeoutError, │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
ModuleNotFoundError: No module named 'huggingface_hub.inference._text_generation'
That error is not related to distilabel directly but to the dependencies' versions; you should check the dependencies corresponding to that version. The following ones should work:
!pip install huggingface_hub==0.19.0 --upgrade
!pip install transformers==4.34.1 --upgrade
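To confirm whether that internal submodule is importable after pinning, you can use a generic availability check (a sketch using only the standard library; the module path comes from the traceback above, and the helper name is ours):

```python
import importlib.util

def module_available(name: str) -> bool:
    """True if `name` resolves to an importable module in this environment."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # The parent package itself is not installed.
        return False

# Should be True with a huggingface_hub release that still ships this
# private module (per the thread, 0.19.0), and False where the error occurs.
print(module_available("huggingface_hub.inference._text_generation"))
```

Since the module in question is private (leading underscore), it can move or disappear between huggingface_hub releases without notice, which is why pinning the exact version matters here.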
But as @alvarobartt mentioned, we are happy to help you with the transition to the newer version 🙂
Yeah @plaguss @alvarobartt, please share the documentation for the transition!