When using JDiffusion, calling the diffusers schedulers often triggers a "Compile fused operator" failure, after which no further operations can be performed on the variables. What is causing this problem?
Traceback (most recent call last):
File "/data-4t/M2023-WX/JDiffusion/JDiffusion/examples/demo.py", line 27, in <module>
result = pipeline(cond, num_inference_steps=75).images[0]
File "/home/M2023-WX/.conda/envs/M2023-WX-jdiffusion/lib/python3.9/site-packages/jittor/__init__.py", line 134, in inner
ret = func(*args, **kw)
File "/home/M2023-WX/.cache/huggingface/modules/diffusers_modules/local/pipeline.py", line 384, in __call__
latents: torch.Tensor = super().__call__(
File "/home/M2023-WX/.conda/envs/M2023-WX-jdiffusion/lib/python3.9/site-packages/jittor/__init__.py", line 134, in inner
ret = func(*args, **kw)
File "/home/M2023-WX/.conda/envs/M2023-WX-jdiffusion/lib/python3.9/site-packages/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py", line 999, in __call__
latent_model_input = self.scheduler.scale_model_input(latent_model_input, t)
File "/home/M2023-WX/.conda/envs/M2023-WX-jdiffusion/lib/python3.9/site-packages/diffusers/schedulers/scheduling_euler_ancestral_discrete.py", line 256, in scale_model_input
self._init_step_index(timestep)
File "/home/M2023-WX/.conda/envs/M2023-WX-jdiffusion/lib/python3.9/site-packages/diffusers/schedulers/scheduling_euler_ancestral_discrete.py", line 311, in _init_step_index
print(self.timesteps.numpy())
RuntimeError: Wrong inputs arguments, Please refer to examples(help(jt.numpy)).
Types of your inputs are:
self = Var,
args = (),
The function declarations are:
ArrayArgs fetch_sync()
Failed reason:[f 0816 15:58:32.215209 84 parallel_compiler.cc:331] Error happend during compilation:
[Error] source file location:/home/M2023-WX/.cache/jittor/jt1.3.9/g++7.5.0/py3.9.19/Linux-5.15.0-3x88/IntelRXeonRSilxae/ab91/default/cu11.7.64_sm_86/jit/__opkey0_broadcast_to__Tx_float16__DIM_3__BCAST_1__opkey1_broadcast_to__Tx_float32__DIM_3____hash_b1b3856b5e069789_op.cc
Compile fused operator(15/74)failed:[Op(40628:0:1:1:i1:o1:s0:g1,broadcast_to->40629),Op(40626:0:1:1:i1:o1:s0:g1,broadcast_to->40627),Op(40630:0:1:1:i2:o1:s0:g1,binary.multiply->40631),Op(40632:0:1:1:i1:o1:s0:g1,reduce.add->40633),]
Reason: [f 0816 15:58:32.183373 60:C13 cublas_matmul_op.cc:33] Check failed: a->dtype().dsize() == b->dtype().dsize() Something wrong… Could you please report this issue?
type of two inputs should be the same
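The check that fails at the bottom of the log (`cublas_matmul_op.cc:33`: `a->dtype().dsize() == b->dtype().dsize()`) suggests a float16 tensor and a float32 tensor are reaching the same fused matmul; the generated kernel name (`Tx_float16` next to `Tx_float32`) points the same way. Because Jittor builds and compiles its graph lazily, the error only surfaces when `.numpy()` forces a sync inside the scheduler, which is why it appears to come from an unrelated `print`. A minimal NumPy sketch of the mismatch and of the cast that would satisfy the check (the variable names here are illustrative assumptions, not the actual pipeline tensors):

```python
import numpy as np

# Stand-ins for the two operands that reach the fused matmul:
# e.g. a model running in fp16 while the scheduler's tensors stay fp32.
latents_fp16 = np.random.rand(2, 4).astype(np.float16)
weights_fp32 = np.random.rand(4, 3).astype(np.float32)

# This mirrors the check in cublas_matmul_op.cc: the element sizes
# differ (2 bytes vs 4 bytes), so the fused op is rejected.
assert latents_fp16.dtype.itemsize != weights_fp32.dtype.itemsize

# The usual fix is to cast both operands to one dtype before the matmul,
# e.g. keep everything in fp32 (or cast model and scheduler to fp16).
out = latents_fp16.astype(np.float32) @ weights_fp32
print(out.dtype)  # float32
```

In the JDiffusion setting this would mean making sure the dtype of the latents matches the dtype the pipeline/scheduler was loaded with, rather than mixing fp16 model weights with fp32 scheduler tensors.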