felixslu opened this issue 1 year ago (status: Open)
**TVMError: Data types float32 and float16 must be equal for binary operators**

```
[10:32:31] ~/tvm/tvm/src/relax/ir/block_builder.cc:64: Warning: BlockBuilder destroyed with remaining blocks!
[2023-07-13 10:32:31,987] torch._dynamo.convert_frame: [ERROR] WON'T CONVERT forward /root/miniconda3/envs/tvm-build/lib/python3.8/site-packages/transformers/models/clip/modeling_clip.py line 363 due to:
Traceback (most recent call last):
  File "~/tvm/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 238, in call
    raise get_last_ffi_error()
torch._dynamo.exc.BackendCompilerFailed: backend='_capture' raised: TVMError: Traceback (most recent call last):
  12: TVMFuncCall
  1: tvm::relax::InferBinaryArithOpOutDtype(tvm::relax::Call const&, tvm::relax::BlockBuilder const&, tvm::relax::TensorStructInfo const&, tvm::relax::TensorStructInfo const&)
  0: tvm::relax::BlockBuilderImpl::ReportFatal(tvm::Diagnostic const&)
  File "/home/luting6/car/tvm/tvm/src/relax/ir/block_builder.cc", line 138
TVMError: Data types float32 and float16 must be equal for binary operators
```
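The compiler is complaining that a binary op captured from the CLIP text encoder mixes a float32 operand with a float16 one, which typically means part of the model was left in float32 when the rest was cast to fp16. As a hypothetical pre-flight check (a pure-Python sketch with invented names, not part of TVM or diffusers), one can scan the parameter dtypes before handing the model to capture:

```python
def find_distinct_dtypes(param_dtypes):
    """Return the sorted set of distinct dtypes in a parameter map.

    More than one entry means some binary op will eventually mix
    dtypes and trip the TVMError shown above.
    """
    return sorted(set(param_dtypes.values()))

# Hypothetical dtype map, mimicking a partially converted model
# where one LayerNorm weight was missed by the fp16 cast:
params = {
    "text_model.embeddings.weight": "float16",
    "text_model.final_layer_norm.weight": "float32",
}
print(find_distinct_dtypes(params))  # more than one dtype -> mismatch
```

In the real pipeline the same idea applies with `p.dtype for n, p in model.named_parameters()`; any parameter still reporting `torch.float32` is a candidate culprit.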
When I use the fp16 model revision, I get this capture error on my 3090 Ti. The pipeline is loaded with:

```python
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    revision="fp16",
    torch_dtype=torch.float16,
    local_files_only=True,
)
```

Running `build.py` then also fails with:

```
Traceback (most recent call last):
  File "build.py", line 280, in
RuntimeError: "LayerNormKernelImpl" not implemented for 'Half'
```
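The `LayerNormKernelImpl ... 'Half'` error usually means a half-precision LayerNorm is being executed on the CPU, where (at least in the PyTorch builds of that era) no fp16 kernel exists; the fp16 weights only run on CUDA kernels. A minimal sketch of the usual workaround, assuming nothing beyond stock PyTorch: pick the dtype based on the device, using fp16 only when CUDA is available and falling back to float32 on CPU:

```python
import torch

# Use fp16 only where a CUDA kernel backs it; float32 on CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

ln = torch.nn.LayerNorm(8).to(device=device, dtype=dtype)
x = torch.randn(2, 8, device=device, dtype=dtype)
out = ln(x)  # runs on both CPU and GPU with a supported dtype
```

For the pipeline above, the equivalent move is `pipe.to("cuda")` after loading (or dropping `torch_dtype=torch.float16` when staying on CPU).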