Commit b5f7a83
[Doc] Upgrade multi-node doc (#4365)
### What this PR does / why we need it?
When using the `Ascend scheduler`, the parameter `max_num_batched_tokens`
must not be smaller than `max_model_len`; otherwise the following error is
raised:
```shell
Value error, Ascend scheduler is enabled without chunked prefill feature. Argument max_num_batched_tokens (4096) is smaller than max_model_len (32768). This effectively limits the maximum sequence length to max_num_batched_tokens and makes vLLM reject longer sequences. Please increase max_num_batched_tokens or decrease max_model_len. [type=value_error, input_value=ArgsKwargs((), {'model_co...g': {'enabled': True}}}), input_type=ArgsKwargs]
```
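As an illustration only (the model name and values below are placeholders, not taken from the tutorial), a minimal sketch of launching the server with the two engine arguments set consistently, assuming the standard vLLM CLI flags:
```shell
# Sketch: keep max_num_batched_tokens >= max_model_len when the
# Ascend scheduler is enabled without chunked prefill.
vllm serve <your-model> \
  --max-model-len 32768 \
  --max-num-batched-tokens 32768
```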
### Does this PR introduce _any_ user-facing change?
Users and developers who run the model according to the
[tutorial](https://docs.vllm.ai/projects/ascend/en/latest/tutorials/multi_node.html)
can now specify the parameters correctly.
### How was this patch tested?
- vLLM version: v0.11.0
- vLLM main:
vllm-project/vllm@2918c1b
---------
Signed-off-by: wangli <[email protected]>
[Diff: 3 files changed, +9 −9, under `.github/workflows` and `docs/source/tutorials`]