System Info
- transformers version: 4.57.1
- Platform: Linux-6.8.0-87-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.36.0
- Safetensors version: 0.7.0
- Accelerate version: 1.11.0
- Accelerate config:
- compute_environment: LOCAL_MACHINE
- distributed_type: NO
- mixed_precision: fp16
- use_cpu: False
- debug: False
- num_processes: 1
- machine_rank: 0
- num_machines: 1
- gpu_ids: all
- rdzv_backend: static
- same_network: True
- main_training_function: main
- enable_cpu_affinity: False
- downcast_bf16: no
- tpu_use_cluster: False
- tpu_use_sudo: False
- tpu_env: []
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.9.1+cu129 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: no
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
Run code like this:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained('/path/to/local_model/')
```

and see this in the output:

```
The module name (originally ) is not a valid Python identifier. Please rename the original module to avoid import issues.
```
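My guess (not verified against the 4.57.1 internals) is that the module name is derived from the last path component, which is empty when the path ends in a slash, and an empty string is not a valid Python identifier. A minimal sketch of that behavior:

```python
import os

path = '/path/to/local_model/'
# The last component of a trailing-slash path is the empty string,
# which fails the identifier check and would explain "(originally )".
print(os.path.basename(path))                 # ''
print(os.path.basename(path).isidentifier())  # False
print(os.path.basename(path.rstrip('/')))     # 'local_model'
```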
Expected behavior
The model should load without a warning.
Removing the trailing slash from the model path makes the warning go away. The warning also does not reproduce, with or without the slash, in an older venv I have with transformers 4.55.3.
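As a workaround until this is fixed, normalizing the path before loading avoids the trailing slash entirely; a minimal sketch:

```python
import os
from transformers import AutoModelForCausalLM

# Strip the trailing separator so the last path component is non-empty.
model_path = os.path.normpath('/path/to/local_model/')  # -> '/path/to/local_model'
model = AutoModelForCausalLM.from_pretrained(model_path)
```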