oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

[end of output] Error #4446

Closed: MilesQLi closed this issue 10 months ago

MilesQLi commented 11 months ago

Describe the bug

Just run start_windows.bat, choose NVIDIA, and answer No (do not use CUDA 11.8). Then I get:

copying build\lib\sentence_transformers\cross_encoder\evaluation\CEBinaryClassificationEvaluator.py -> build\bdist.win-amd64\wheel\.\sentence_transformers\cross_encoder\evaluation
error: could not create 'build\bdist.win-amd64\wheel\.\sentence_transformers\cross_encoder\evaluation\CEBinaryClassificationEvaluator.py': No such file or directory
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for sentence-transformers
Running setup.py clean for sentence-transformers
Failed to build sentence-transformers
ERROR: Could not build wheels for sentence-transformers, which is required to install pyproject.toml-based projects
Command '"F:\text-generation-webui-main\text-generation-webui-main\installer_files\conda\condabin\conda.bat" activate "F:\text-generation-webui-main\text-generation-webui-main\installer_files\env" >nul && python -m pip install -r extensions\openai\requirements.txt --upgrade' failed with exit status code '1'.

Is there an existing issue for this?

Reproduction

See above

Screenshot

No response

Logs

copying build\lib\sentence_transformers\cross_encoder\evaluation\CEBinaryClassificationEvaluator.py -> build\bdist.win-amd64\wheel\.\sentence_transformers\cross_encoder\evaluation
      error: could not create 'build\bdist.win-amd64\wheel\.\sentence_transformers\cross_encoder\evaluation\CEBinaryClassificationEvaluator.py': No such file or directory
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for sentence-transformers
  Running setup.py clean for sentence-transformers
Failed to build sentence-transformers
ERROR: Could not build wheels for sentence-transformers, which is required to install pyproject.toml-based projects
Command '"F:\text-generation-webui-main\text-generation-webui-main\installer_files\conda\condabin\conda.bat" activate "F:\text-generation-webui-main\text-generation-webui-main\installer_files\env" >nul && python -m pip install -r extensions\openai\requirements.txt --upgrade' failed with exit status code '1'.

System Info

GPU RTX 3090, Windows 11
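A possible manual workaround, as a sketch only (not verified on this setup): the step that fails is the pip install shown in the last line of the log, so it can be retried by hand inside the installer's conda environment after upgrading the build tooling. The paths below are copied from the log above; substitute your own install location and run from the text-generation-webui folder.

rem activate the conda environment created by the one-click installer (paths taken from the log above)
call "F:\text-generation-webui-main\text-generation-webui-main\installer_files\conda\condabin\conda.bat" activate "F:\text-generation-webui-main\text-generation-webui-main\installer_files\env"
rem upgrading pip, setuptools and wheel first is a common remedy for bdist_wheel failures like this one
python -m pip install --upgrade pip setuptools wheel
rem then retry the exact step that failed
python -m pip install -r extensions\openai\requirements.txt --upgrade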
costanzo commented 11 months ago

Same problem here. GPU: RTX 4060 Ti, Windows 10.

Building wheels for collected packages: sentence-transformers
  Building wheel for sentence-transformers (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [97 lines of output]
      D:\workSpace\ai\text-generation-webui\text-generation-webui-snapshot-2023-10-29\installer_files\env\Lib\site-packages\setuptools\dist.py:745: SetuptoolsDeprecationWarning: Invalid dash-separated options
      !!

              ********************************************************************************
              Usage of dash-separated 'description-file' will not be supported in future
              versions. Please use the underscore name 'description_file' instead.

              This deprecation is overdue, please update your project and remove deprecated
              calls to avoid build errors in the future.

              See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
              ********************************************************************************

      !!
        opt = self.warn_dash_deprecation(opt, section)
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build\lib
      creating build\lib\sentence_transformers
      copying sentence_transformers\LoggingHandler.py -> build\lib\sentence_transformers
      copying sentence_transformers\model_card_templates.py -> build\lib\sentence_transformers
      copying sentence_transformers\SentenceTransformer.py -> build\lib\sentence_transformers
      copying sentence_transformers\util.py -> build\lib\sentence_transformers
      copying sentence_transformers\__init__.py -> build\lib\sentence_transformers
      creating build\lib\sentence_transformers\cross_encoder
      copying sentence_transformers\cross_encoder\CrossEncoder.py -> build\lib\sentence_transformers\cross_encoder
      copying sentence_transformers\cross_encoder\__init__.py -> build\lib\sentence_transformers\cross_encoder
      creating build\lib\sentence_transformers\datasets
      copying sentence_transformers\datasets\DenoisingAutoEncoderDataset.py -> build\lib\sentence_transformers\datasets
      copying sentence_transformers\datasets\NoDuplicatesDataLoader.py -> build\lib\sentence_transformers\datasets
      copying sentence_transformers\datasets\ParallelSentencesDataset.py -> build\lib\sentence_transformers\datasets
      copying sentence_transformers\datasets\SentenceLabelDataset.py -> build\lib\sentence_transformers\datasets
      copying sentence_transformers\datasets\SentencesDataset.py -> build\lib\sentence_transformers\datasets
      copying sentence_transformers\datasets\__init__.py -> build\lib\sentence_transformers\datasets
      creating build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\BinaryClassificationEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\EmbeddingSimilarityEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\InformationRetrievalEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\LabelAccuracyEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\MSEEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\MSEEvaluatorFromDataFrame.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\ParaphraseMiningEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\RerankingEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\SentenceEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\SequentialEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\SimilarityFunction.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\TranslationEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\TripletEvaluator.py -> build\lib\sentence_transformers\evaluation
      copying sentence_transformers\evaluation\__init__.py -> build\lib\sentence_transformers\evaluation
      creating build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\BatchAllTripletLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\BatchHardSoftMarginTripletLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\BatchHardTripletLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\BatchSemiHardTripletLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\ContrastiveLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\ContrastiveTensionLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\CosineSimilarityLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\DenoisingAutoEncoderLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\MarginMSELoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\MegaBatchMarginLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\MSELoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\MultipleNegativesRankingLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\MultipleNegativesSymmetricRankingLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\OnlineContrastiveLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\SoftmaxLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\TripletLoss.py -> build\lib\sentence_transformers\losses
      copying sentence_transformers\losses\__init__.py -> build\lib\sentence_transformers\losses
      creating build\lib\sentence_transformers\models
      copying sentence_transformers\models\Asym.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\BoW.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\CLIPModel.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\CNN.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\Dense.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\Dropout.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\LayerNorm.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\LSTM.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\Normalize.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\Pooling.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\Transformer.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\WeightedLayerPooling.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\WordEmbeddings.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\WordWeights.py -> build\lib\sentence_transformers\models
      copying sentence_transformers\models\__init__.py -> build\lib\sentence_transformers\models
      creating build\lib\sentence_transformers\readers
      copying sentence_transformers\readers\InputExample.py -> build\lib\sentence_transformers\readers
      copying sentence_transformers\readers\LabelSentenceReader.py -> build\lib\sentence_transformers\readers
      copying sentence_transformers\readers\NLIDataReader.py -> build\lib\sentence_transformers\readers
      copying sentence_transformers\readers\PairedFilesReader.py -> build\lib\sentence_transformers\readers
      copying sentence_transformers\readers\STSDataReader.py -> build\lib\sentence_transformers\readers
      copying sentence_transformers\readers\TripletReader.py -> build\lib\sentence_transformers\readers
      copying sentence_transformers\readers\__init__.py -> build\lib\sentence_transformers\readers
      creating build\lib\sentence_transformers\cross_encoder\evaluation
      copying sentence_transformers\cross_encoder\evaluation\CEBinaryAccuracyEvaluator.py -> build\lib\sentence_transformers\cross_encoder\evaluation
      copying sentence_transformers\cross_encoder\evaluation\CEBinaryClassificationEvaluator.py -> build\lib\sentence_transformers\cross_encoder\evaluation
      error: could not create 'build\lib\sentence_transformers\cross_encoder\evaluation\CEBinaryClassificationEvaluator.py': No such file or directory
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for sentence-transformers
  Running setup.py clean for sentence-transformers
Failed to build sentence-transformers
ERROR: Could not build wheels for sentence-transformers, which is required to install pyproject.toml-based projects
Command '"D:\workSpace\ai\text-generation-webui\text-generation-webui-snapshot-2023-10-29\installer_files\conda\condabin\conda.bat" activate "D:\workSpace\ai\text-generation-webui\text-generation-webui-snapshot-2023-10-29\installer_files\env" >nul && python -m pip install -r extensions\openai\requirements.txt --upgrade' failed with exit status code '1'.

Exiting now.
Try running the start/update script again.

Then I ran the update command:

start_windows.bat --update

After it updates the dependencies, start the batch file again and it will work!
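In short, the workaround here is a second pass of the installer in update mode followed by a normal launch (a minimal sketch, assuming the one-click installer layout; run from the text-generation-webui folder):

rem re-run the installer in update mode so the dependencies get updated/reinstalled
start_windows.bat --update
rem once the update finishes, launch normally again, as in the log below
start_windows.bat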

> start_windows.bat 

bin D:\workSpace\ai\text-generation-webui\text-generation-webui-snapshot-2023-10-29\installer_files\env\Lib\site-packages\bitsandbytes\libbitsandbytes_cuda121.dll
2023-11-04 20:52:54 INFO:Loading the extension "gallery"...
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
D:\workSpace\ai\text-generation-webui\text-generation-webui-snapshot-2023-10-29\installer_files\env\Lib\site-packages\gradio\components\dropdown.py:231: UserWarning: The value passed into gr.Dropdown() is not in the list of choices. Please update the list of choices to include: llama or set allow_custom_value=True.
  warnings.warn(
2023-11-04 20:53:03 INFO:Loading CodeLlama-7b-Instruct-hf...
Loading checkpoint shards: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:05<00:00,  2.85s/it]
2023-11-04 20:53:12 INFO:Loaded the model in 9.26 seconds.
Output generated in 4.45 seconds (9.90 tokens/s, 44 tokens, context 23, seed 1136857672)
Output generated in 6.20 seconds (15.80 tokens/s, 98 tokens, context 24, seed 86105244)
Output generated in 26.31 seconds (16.07 tokens/s, 423 tokens, context 15, seed 960047630)
Output generated in 29.02 seconds (16.06 tokens/s, 466 tokens, context 14, seed 1037576951)
github-actions[bot] commented 10 months ago

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.