Description
I tried quantizing Qwen/Qwen3-VL-4B-Instruct with the following command:
optimum-cli export openvino --model Qwen/Qwen3-VL-4B-Instruct Qwen3-VL-4B-Instruct-int4-sym_group-1 --task image-text-to-text --weight-format int4 --trust-remote-code --sym --backup-precision int8_sym --group-size -1
I then tried running inference on the quantized model with the openvino.genai VLM example and got the following error:
Unsupported 'qwen3_vl' VLM model type