runtime error

Exit code: 1. Reason:

```
INFO:httpx:HTTP Request: GET https://api.gradio.app/gradio-messaging/en "HTTP/1.1 200 OK"
/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/app/app.py", line 21, in <module>
    m2m100 = model_translation.ModelM2M100()
  File "/home/user/app/model_translation.py", line 196, in __call__
    cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
  File "/home/user/app/model_translation.py", line 208, in __init__
    self._model = M2M100ForConditionalGeneration.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3960, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4091, in _load_pretrained_model
    raise ValueError(
ValueError: The current `device_map` had weights offloaded to the disk. Please provide an `offload_folder` for them. Alternatively, make sure you have `safetensors` installed if the model you are using offers the weights in this format.
```
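The error message itself points at the fix: when `device_map` offloads some weights to disk, `from_pretrained` needs an `offload_folder` to write them to. A minimal sketch, assuming the Space loads a standard M2M100 checkpoint such as `facebook/m2m100_418M` (the actual model ID used in `model_translation.py` is not shown here) and that `accelerate` is installed so `device_map` is supported:

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

# Assumed checkpoint; replace with the model ID the Space actually uses.
MODEL_ID = "facebook/m2m100_418M"

model = M2M100ForConditionalGeneration.from_pretrained(
    MODEL_ID,
    device_map="auto",          # lets accelerate split layers across GPU/CPU/disk
    offload_folder="offload",   # directory for the layers offloaded to disk
)
tokenizer = M2M100Tokenizer.from_pretrained(MODEL_ID)
```

The alternative the message mentions is to install `safetensors` (`pip install safetensors`), provided the checkpoint is published in that format.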
