Closed: registro6503 closed this issue 1 year ago
error: Microsoft Visual C++ 14.0 or greater is required.
It's right in your logs
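If you want to double-check whether the C++ build tools are actually visible on your machine, here is a quick sketch (assuming the Build Tools are installed with the "Desktop development with C++" workload; the exact prompt name depends on your Visual Studio version). Open the "x64 Native Tools Command Prompt for VS" and run:

    rem Check whether the MSVC compiler (cl.exe) is on PATH
    where cl
    rem If this prints "INFO: Could not find files for the given pattern(s)",
    rem the C++ build tools are missing and the xformers source build will fail with the error above.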
I already have Microsoft Visual C++ 14. I reinstalled the newest version just now, and I still have the same issue :(
I had issues with the newest version of xformers today. Maybe it's the same problem. Installing xformers==0.0.20 resolved my issue, but I'm not sure if it's applicable to StabilityMatrix.
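For anyone who wants to try that pin manually, a minimal sketch of doing it inside the package's own venv (the venv path is taken from the log below and may differ on your install):

    rem Run from the stable-diffusion-webui package folder inside StabilityMatrix
    venv\Scripts\pip install xformers==0.0.20
    rem Confirm which version ended up installed
    venv\Scripts\pip show xformers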
Sorry folks, this seems to be an issue with the xformers package. They released an update that fails to build on Windows machines. You can follow along with the issue on their repo: https://github.com/facebookresearch/xformers/issues/886
If it isn't fixed by them soon, we will release a hotfix that pins xformers to a known-working version.
Hey dear community, can anyone help me with this error? I downloaded the .exe again and ran it, and while installing Stable Diffusion I got the error prompt.
The full log is here:
pip install failed with code 1:

Collecting xformers
  Downloading xformers-0.0.22.post3.tar.gz (3.9 MB)
     ---------------------------------------- 3.9/3.9 MB 13.7 MB/s eta 0:00:00
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Requirement already satisfied: numpy in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from xformers) (1.26.0)
Collecting torch==2.1.0 (from xformers)
  Obtaining dependency information for torch==2.1.0 from https://files.pythonhosted.org/packages/fa/47/1a7daf04f40715fc1cdc6f1cc3200228a556d06c843e6ceb58883b745e1b/torch-2.1.0-cp310-cp310-win_amd64.whl.metadata
  Downloading torch-2.1.0-cp310-cp310-win_amd64.whl.metadata (24 kB)
Requirement already satisfied: filelock in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from torch==2.1.0->xformers) (3.12.4)
Requirement already satisfied: typing-extensions in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from torch==2.1.0->xformers) (4.8.0)
Requirement already satisfied: sympy in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from torch==2.1.0->xformers) (1.12)
Requirement already satisfied: networkx in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from torch==2.1.0->xformers) (3.1)
Requirement already satisfied: jinja2 in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from torch==2.1.0->xformers) (3.1.2)
Collecting fsspec (from torch==2.1.0->xformers)
  Obtaining dependency information for fsspec from https://files.pythonhosted.org/packages/fe/d3/e1aa96437d944fbb9cc95d0316e25583886e9cd9e6adc07baad943524eda/fsspec-2023.9.2-py3-none-any.whl.metadata
  Downloading fsspec-2023.9.2-py3-none-any.whl.metadata (6.7 kB)
Requirement already satisfied: MarkupSafe>=2.0 in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from jinja2->torch==2.1.0->xformers) (2.1.3)
Requirement already satisfied: mpmath>=0.19 in c:\users\elite\desktop\stabilitymatrix\packages\stable-diffusion-webui\venv\lib\site-packages (from sympy->torch==2.1.0->xformers) (1.3.0)
Downloading torch-2.1.0-cp310-cp310-win_amd64.whl (192.3 MB)
   --------------------------------------- 192.3/192.3 MB 81.8 MB/s eta 0:00:00
Downloading fsspec-2023.9.2-py3-none-any.whl (173 kB)
   ---------------------------------------- 173.4/173.4 kB ? eta 0:00:00
Building wheels for collected packages: xformers
  Building wheel for xformers (setup.py): started
  Building wheel for xformers (setup.py): finished with status 'error'
  error: subprocess-exited-with-error

  python setup.py bdist_wheel did not run successfully.
  exit code: 1

  [223 lines of output]
  running bdist_wheel
  C:\Users\Elite\Desktop\StabilityMatrix\Packages\stable-diffusion-webui\venv\lib\site-packages\torch\utils\cpp_extension.py:476: UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
    warnings.warn(msg.format('we could not find ninja.'))
  running build
  running build_py
  creating build
  creating build\lib.win-amd64-cpython-310
  [~200 lines of "copying xformers\... -> build\lib.win-amd64-cpython-310\..." output omitted]
  running build_ext
  C:\Users\Elite\Desktop\StabilityMatrix\Packages\stable-diffusion-webui\venv\lib\site-packages\torch\utils\cpp_extension.py:359: UserWarning: Error checking compiler version for cl: [WinError 2] The system cannot find the file specified
    warnings.warn(f'Error checking compiler version for {compiler}: {error}')
  building 'xformers._C' extension
  error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
  [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for xformers
  Running setup.py clean for xformers
Failed to build xformers
ERROR: Could not build wheels for xformers, which is required to install pyproject.toml-based projects
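For anyone hitting this before a fix lands: the log shows pip downloading xformers-0.0.22.post3.tar.gz (a source tarball) and compiling it locally, which is what pulls in the MSVC requirement. One hedged workaround sketch is to tell pip it may only use prebuilt wheels; this simply fails fast instead of compiling if no Windows wheel exists for the version pip resolves:

    rem Refuse source builds for xformers; succeeds only if a prebuilt Windows wheel is available
    venv\Scripts\pip install --only-binary xformers xformers
    rem Alternatively, pin a version known to work, e.g. the xformers==0.0.20 mentioned above.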