runtime error

Exit code: 1. Reason:

special_tokens_map.json: 100%|██████████| 544/544 [00:00<00:00, 5.01MB/s]
==================================================
❌ Model loading failed!
Error message: BaichuanTokenizer has no attribute vocab_size
==================================================
Traceback (most recent call last):
  File "/home/user/app/app.py", line 25, in <module>
    tokenizer = AutoTokenizer.from_pretrained("Go4miii/DISC-FinLLM", use_fast=False, trust_remote_code=True)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 1141, in from_pretrained
    return tokenizer_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2113, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2359, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/user/.cache/huggingface/modules/transformers_modules/Go4miii/DISC_hyphen_FinLLM/8e235cb9e81f952702b0b90c5f3292a1089b0dba/tokenization_baichuan.py", line 55, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 438, in __init__
    self._add_tokens(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 546, in _add_tokens
    current_vocab = self.get_vocab().copy()
  File "/home/user/.cache/huggingface/modules/transformers_modules/Go4miii/DISC_hyphen_FinLLM/8e235cb9e81f952702b0b90c5f3292a1089b0dba/tokenization_baichuan.py", line 89, in get_vocab
    vocab = {self.convert_ids_to_tokens(i): i for i in range(self.vocab_size)}
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1128, in __getattr__
    raise AttributeError(f"{self.__class__.__name__} has no attribute {key}")
AttributeError: BaichuanTokenizer has no attribute vocab_size
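What the traceback shows: the failure happens inside super().__init__(). Since the slow-tokenizer refactor that landed in transformers 4.34, the base PreTrainedTokenizer constructor calls self._add_tokens(...), which calls self.get_vocab(), which in turn reads self.vocab_size, a property backed by self.sp_model. The repo's tokenization_baichuan.py attaches self.sp_model only after the super().__init__() call, so the attribute lookup fails. This is most likely the known incompatibility between Baichuan-style remote-code tokenizers and transformers >= 4.34; on older versions the base constructor never touched the vocabulary, so the late assignment went unnoticed.

Two likely workarounds:

1. Pin transformers below the refactor in the Space's requirements.txt, for example transformers==4.33.3 (keep sentencepiece installed as well). This is the least invasive option and usually the first thing to try.
2. Patch the tokenizer so the SentencePiece model is loaded before the base constructor runs; a minimal sketch of the reordered __init__ follows this list. The signature mirrors the LLaMA-style slow tokenizer the Baichuan file is based on, so treat it as illustrative and verify it against the actual file at the cache path shown in the traceback.

    # tokenization_baichuan.py: sketch of the reordering fix.
    # Assumption: the upstream file matches the LLaMA-style slow tokenizer
    # it is derived from; check the real file before editing anything.
    import sentencepiece as spm
    from transformers import PreTrainedTokenizer

    class BaichuanTokenizer(PreTrainedTokenizer):
        def __init__(self, vocab_file, unk_token="<unk>", bos_token="<s>",
                     eos_token="</s>", pad_token=None, sp_model_kwargs=None,
                     add_bos_token=True, add_eos_token=False, **kwargs):
            self.sp_model_kwargs = {} if sp_model_kwargs is None else sp_model_kwargs
            self.vocab_file = vocab_file
            self.add_bos_token = add_bos_token
            self.add_eos_token = add_eos_token
            # The actual fix: attach the SentencePiece model BEFORE calling
            # super().__init__(). transformers >= 4.34 calls self.get_vocab()
            # (and therefore self.vocab_size, which reads self.sp_model)
            # from inside the base-class constructor.
            self.sp_model = spm.SentencePieceProcessor(**self.sp_model_kwargs)
            self.sp_model.Load(vocab_file)
            super().__init__(
                bos_token=bos_token,
                eos_token=eos_token,
                unk_token=unk_token,
                pad_token=pad_token,
                add_bos_token=add_bos_token,
                add_eos_token=add_eos_token,
                sp_model_kwargs=self.sp_model_kwargs,
                **kwargs,
            )

        @property
        def vocab_size(self):
            # Backed by sp_model, which now exists by the time the base
            # constructor asks for it.
            return self.sp_model.get_piece_size()

        # ... remaining methods unchanged from the upstream file ...

Note that on Spaces the dynamic-module cache under /home/user/.cache/huggingface/modules is recreated when the container rebuilds, so pinning the transformers version (or hosting a patched copy of the model repo) is more durable than editing the cached file in place.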
