Google released T5X checkpoints for MADLAD-400 a couple of months ago, but nobody could figure out how to run them. Turns out the vocabulary was wrong, but they uploaded the correct one last week.

I’ve converted the models to the safetensors format, and I created this space if you want to try the smaller model.

I also published quantized GGUF weights you can use with candle. It decodes at ~15 tokens/s on an M2 Mac.
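MADLAD-400 is a T5-style model that selects the target language with a `<2xx>` token prepended to the source text (e.g. `<2pt>` for Portuguese). A minimal sketch of building that prompt, with the helper name being my own invention:

```python
def madlad_prompt(text: str, target_lang: str) -> str:
    """Prepend the MADLAD-400 target-language token, e.g. <2pt> for Portuguese."""
    return f"<2{target_lang}> {text}"

print(madlad_prompt("I love pizza!", "pt"))  # -> <2pt> I love pizza!
```

The resulting string is what you feed to the tokenizer before calling `generate`.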

It seems that NLLB is the most popular machine translation model right now, but its license only allows non-commercial use. MADLAD-400 is CC BY 4.0.

  • Inevitable_Emu2722@alien.topB
    1 year ago

    Hi, I have the following error when trying to run it from transformers, copying the code provided on Hugging Face:

    Traceback (most recent call last):
      File "/home/XXX/project/translation/translateMADLAD.py", line 10, in <module>
        tokenizer = T5Tokenizer.from_pretrained('jbochi/madlad400-3b-mt')
      File "/home/lXXX/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1841, in from_pretrained
        return cls._from_pretrained(
      File "/home/lXXX/.local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2060, in _from_pretrained
        raise ValueError(
    ValueError: Non-consecutive added token '' found. Should have index 256100 but has index 256000 in saved vocabulary.
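This error is consistent with the stale-vocabulary issue mentioned in the post: `from_pretrained` is reading a cached copy of the old vocabulary files. A minimal sketch of one possible fix, assuming the corrected vocabulary is now on the Hub, is to bypass the local cache with `force_download`:

```python
def load_fresh_tokenizer(model_id="jbochi/madlad400-3b-mt"):
    # Import here so the sketch stays self-contained; assumes transformers is installed.
    from transformers import T5Tokenizer

    # force_download=True ignores any stale cached vocabulary and re-fetches from the Hub.
    return T5Tokenizer.from_pretrained(model_id, force_download=True)
```

Deleting the model's folder under `~/.cache/huggingface/` should have the same effect.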