Hi @vince62s,
I'm trying to add a new language to NLLB, but when I try to merge the LoRA weights into the base model I get this error:
AssertionError: An error in model's partition and checkpoint's slice was detected
When I compare param.data.size() with ckpt_t[col_slice_start:col_slice_end, row_slice_start:row_slice_end,].size(),
I get torch.Size([265108, 2048]) and torch.Size([256206, 2048]).
265108 corresponds to the size of the new dictionary (with the added tokens) and 256206 to the size of the original NLLB dictionary.
I don't understand why the vocabulary is not updated at merge time; during training the vocab size does have the correct value (265108).
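To make the mismatch concrete, here is a minimal sketch (plain Python, no framework needed; the shapes are the ones from the error above): the merge step asserts the two shapes are equal, and the difference along the first dimension is exactly the number of tokens added on top of the NLLB vocabulary.

```python
# Shapes reported by the assertion (hypothetical reconstruction of the check):
model_vocab, base_vocab, hidden = 265108, 256206, 2048

param_shape = (model_vocab, hidden)  # model's embedding partition (extended vocab)
ckpt_shape = (base_vocab, hidden)    # slice read from the base NLLB checkpoint

# The merge asserts param_shape == ckpt_shape, which fails here.
new_tokens = param_shape[0] - ckpt_shape[0]
print(new_tokens)  # 8902 newly added tokens
```

So the checkpoint's embedding matrix still has the base vocabulary size, which is why the size comparison fails even though training ran with the extended vocabulary.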
Could you enlighten me?
Thanks,
Mat