runtime error

Exit code: 1. Reason:

transformer/diffusion_pytorch_model-0000(…):  20%|██        | 792M/3.87G [00:01<00:03, 778MB/s]
transformer/diffusion_pytorch_model-0000(…):  48%|████▊     | 1.86G/3.87G [00:02<00:02, 945MB/s]
transformer/diffusion_pytorch_model-0000(…):  75%|███████▌  | 2.91G/3.87G [00:03<00:00, 989MB/s]
transformer/diffusion_pytorch_model-0000(…): 100%|██████████| 3.87G/3.87G [00:04<00:00, 870MB/s]
(…)ion_pytorch_model.safetensors.index.json:   0%|          | 0.00/121k [00:00<?, ?B/s]
(…)ion_pytorch_model.safetensors.index.json: 100%|██████████| 121k/121k [00:00<00:00, 96.1MB/s]
Loading pipeline components...:   0%|          | 0/7 [00:00<?, ?it/s]
Loading pipeline components...:  14%|█▍        | 1/7 [00:03<00:18, 3.14s/it]
`torch_dtype` is deprecated! Use `dtype` instead!
Loading pipeline components...:  29%|██▊       | 2/7 [00:03<00:07, 1.57s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 73, in <module>
    pipe = DiffusionPipeline.from_pretrained(base_model, torch_dtype=dtype, vae=taef1).to(device)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/diffusers/pipelines/pipeline_utils.py", line 859, in from_pretrained
    loaded_sub_model = load_sub_model(
  File "/usr/local/lib/python3.10/site-packages/diffusers/pipelines/pipeline_loading_utils.py", line 698, in load_sub_model
    loaded_sub_model = load_method(os.path.join(cached_folder, name), **loading_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 277, in _wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4974, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: CLIPTextModel.__init__() got an unexpected keyword argument 'offload_state_dict'
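The failure happens on line 73 of app.py, inside `DiffusionPipeline.from_pretrained`: while diffusers loads the CLIP text encoder it forwards an `offload_state_dict` kwarg that the installed transformers release no longer accepts (the `torch_dtype` deprecation warning also points to a newer transformers), so this looks like a version mismatch between diffusers and transformers rather than a bug in the app code itself. Below is a minimal, hedged reconstruction of that loading step; `base_model`, `taef1`, `dtype`, and `device` mirror the names in the traceback, the model ids are placeholders I am assuming, and the suggested remedy in the comments (pinning mutually compatible diffusers/transformers releases) is a guess from the error, not something confirmed by the logs.

```python
# Sketch of the loading step that fails above, assuming a FLUX-style base
# model with the taef1 tiny VAE. Model ids are placeholders, not taken
# from the Space's actual code.
import torch
from diffusers import AutoencoderTiny, DiffusionPipeline

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.bfloat16

base_model = "black-forest-labs/FLUX.1-dev"   # assumed placeholder id
taef1 = AutoencoderTiny.from_pretrained("madebyollin/taef1", torch_dtype=dtype)

# The TypeError is raised while diffusers loads the text encoder and passes
# `offload_state_dict` through to transformers, whose newer from_pretrained
# no longer swallows that kwarg, so it ends up in CLIPTextModel.__init__().
# The usual remedy is to pin diffusers and transformers to releases that were
# tested together in requirements.txt (or upgrade diffusers alongside
# transformers); the exact pins depend on when the Space was last working.
pipe = DiffusionPipeline.from_pretrained(base_model, torch_dtype=dtype, vae=taef1).to(device)
```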
