ValueError: Couldn't instantiate the backend tokenizer from one of:
(1) a `tokenizers` library serialization file,
(2) a slow tokenizer instance to convert or
(3) an equivalent slow tokenizer class to instantiate and convert.
You need to have sentencepiece installed to convert a slow tokenizer to a fast one.
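
This error typically appears when AutoTokenizer.from_pretrained requests a fast tokenizer for a checkpoint that ships only a SentencePiece vocabulary file (no tokenizer.json), so transformers has to convert the slow tokenizer, and that conversion requires the sentencepiece package. A minimal reproduction sketch; the model name is a hypothetical placeholder:

from transformers import AutoTokenizer

# Raises the ValueError above when `sentencepiece` is not installed,
# because the slow SentencePiece tokenizer cannot be converted to a
# fast one. "some-org/spm-model" is a placeholder for any checkpoint
# that ships only a SentencePiece vocabulary file.
tokenizer = AutoTokenizer.from_pretrained("some-org/spm-model")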

Solution: install the missing sentencepiece package:

pip install sentencepiece
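
After installing, the same call should succeed and return a fast tokenizer. A quick check, again with a hypothetical model name:

from transformers import AutoTokenizer

# With `sentencepiece` installed, transformers can convert the slow
# tokenizer and return a fast (Rust-backed) one.
tokenizer = AutoTokenizer.from_pretrained("some-org/spm-model")
print(tokenizer.is_fast)  # expected: True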
