
Fix UnboundLocalError for tokenizer in from_pretrained (#1238)

When FishTokenizer.from_pretrained() fails (e.g., unrecognized
model type in transformers), the except block logs a warning but
tokenizer remains undefined. Line 523 then crashes with:

  UnboundLocalError: cannot access local variable 'tokenizer'
  where it is not associated with a value

Fix: initialize tokenizer = None before the try block.
This allows the model to load successfully even when the
tokenizer can't be instantiated (e.g., with older checkpoints
like fish-speech-1.5 on newer code).
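The failure mode and the one-line fix can be sketched in isolation. This is a minimal illustration of the Python scoping rule involved, not the actual fish-speech code; `load_tokenizer_or_fail` is a hypothetical stand-in for `FishTokenizer.from_pretrained`:

```python
def load_tokenizer_or_fail(path):
    # Stand-in for FishTokenizer.from_pretrained raising on an
    # unrecognized model type.
    raise ValueError(f"unrecognized model type at {path}")

def load_broken(path):
    # No pre-initialization: if the try body raises before the
    # assignment completes, the name `tokenizer` is never bound.
    try:
        tokenizer = load_tokenizer_or_fail(path)
    except Exception:
        pass  # warning would be logged here; `tokenizer` stays unbound
    return tokenizer  # raises UnboundLocalError

def load_fixed(path):
    tokenizer = None  # the fix: bind the name before the try block
    try:
        tokenizer = load_tokenizer_or_fail(path)
    except Exception:
        pass
    return tokenizer  # simply None when instantiation failed
```

Callers then check `tokenizer is not None` before touching attributes such as `semantic_begin_id`, so the model itself can still load.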

Reproduction:
  python tools/api_server.py \
    --llama-checkpoint-path checkpoints/fish-speech-1.5 \
    --decoder-checkpoint-path checkpoints/fish-speech-1.5/firefly-gan-vq-fsq-8x1024-21hz-generator.pth \
    --device cuda

Crashes at line 523 without this fix.
Andrew Morgan, 2 weeks ago
Commit 29199c5b2c
1 changed file with 1 addition and 0 deletions

+1 −0
fish_speech/models/text2semantic/llama.py

@@ -496,6 +496,7 @@ class BaseTransformer(nn.Module):
             config.rope_base = rope_base
             logger.info(f"Override rope_base to {rope_base}")
 
+        tokenizer = None
         try:
             tokenizer = FishTokenizer.from_pretrained(path)
             config.semantic_begin_id = tokenizer.semantic_begin_id