What is the startup command?
```
2023-10-12 11:04:55 ERROR:Failed to load the model.
Traceback (most recent call last):
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 484, in load_state_dict
    return torch.load(checkpoint_file, map_location=map_location)
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\torch\serialization.py", line 993, in load
    with _open_zipfile_reader(opened_file) as opened_zipfile:
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\torch\serialization.py", line 447, in __init__
    super().__init__(torch._C.PyTorchFileReader(name_or_buffer))
RuntimeError: PytorchStreamReader failed reading zip archive: failed finding central directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 488, in load_state_dict
    if f.read(7) == "version":
UnicodeDecodeError: 'gbk' codec can't decode byte 0x80 in position 128: illegal multibyte sequence

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\pro\aigc\text-generation-webui\modules\ui_model_menu.py", line 201, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "D:\pro\aigc\text-generation-webui\modules\models.py", line 79, in load_model
    output = load_func_map[loader](model_name)
  File "D:\pro\aigc\text-generation-webui\modules\models.py", line 210, in huggingface_loader
    model = LoaderClass.from_pretrained(path_to_model, **params)
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 565, in from_pretrained
    return model_class.from_pretrained(
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 3307, in from_pretrained
    ) = cls._load_pretrained_model(
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 3681, in _load_pretrained_model
    state_dict = load_state_dict(shard_file)
  File "D:\pro\aigc\text-generation-webui\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 500, in load_state_dict
    raise OSError(
OSError: Unable to load weights from pytorch checkpoint file for 'models\chinese-alpaca-2-13b-16k-hf\pytorch_model-00001-of-00003.bin' at 'models\chinese-alpaca-2-13b-16k-hf\pytorch_model-00001-of-00003.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.
```
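The root cause in this traceback is the first error: `PytorchStreamReader failed reading zip archive: failed finding central directory`. PyTorch checkpoints (since PyTorch 1.6) are zip archives, so this message usually means the `.bin` shard is truncated or corrupted, typically from an interrupted download. The later `gbk` codec error is just a side effect of Transformers' fallback attempt to reopen the broken file in text mode on a Windows system with a Chinese locale. A quick way to identify which shards to re-download is to check each one with the standard-library `zipfile` module. The helper below is a sketch, not part of text-generation-webui; the function name and example path are illustrative:

```python
import zipfile
from pathlib import Path


def find_corrupt_shards(model_dir):
    """Return checkpoint shard paths that are not valid zip archives.

    PyTorch >= 1.6 saves checkpoints in zip format, so a .bin shard that
    fails zipfile.is_zipfile() is almost certainly truncated or corrupted
    and should be re-downloaded.
    """
    return [
        path
        for path in sorted(Path(model_dir).glob("pytorch_model*.bin"))
        if not zipfile.is_zipfile(path)
    ]
```

For the model directory in the error above, that would look like `find_corrupt_shards(r"models\chinese-alpaca-2-13b-16k-hf")`; any shards it returns should be deleted and downloaded again (comparing file sizes against those listed on the Hugging Face model page is another quick sanity check).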