
Conversation

@rasmith (Contributor) commented Nov 22, 2025

This fixes a test failure in tests/v1/kv_offload/test_cpu_offloading.py, which adds the FLASHINFER backend to the test even though the ROCm platform does not support the flashinfer library. Making the backend conditional on the CUDA platform allows the test to pass in AMD CI. The test runs to completion and the result is:

1 passed, 3 warnings
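A minimal sketch of the kind of guard this implies, assuming the test parametrizes over backend names with pytest and uses vLLM's current_platform helper for the platform check; the exact names and shape of the merged diff may differ:

```python
import pytest

from vllm.platforms import current_platform

# flashinfer is not supported on ROCm, so only include the FLASHINFER
# backend when running on CUDA (backend list here is illustrative).
BACKENDS = ["FLASH_ATTN"]
if current_platform.is_cuda():
    BACKENDS.append("FLASHINFER")


@pytest.mark.parametrize("backend", BACKENDS)
def test_cpu_offloading(backend):
    ...
```

With a guard like this, the FLASHINFER case is never generated on ROCm, so AMD CI collects and runs only the supported backends.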

Randall Smith added 2 commits November 22, 2025 00:48
@gemini-code-assist (bot) left a comment


Code Review

The primary change correctly makes the FLASHINFER backend conditional on the CUDA platform, which resolves the described CI failure on ROCm. This is a good fix. However, a debug print statement has been introduced in one of the test files, which should be removed before this pull request is merged.

```python
params = SamplingParams(temperature=0, bad_words=[bad_words_1, bad_words_2])
output = llm.generate(PROMPT, params)
new_text = output[0].outputs[0].text
print(f"new_text={new_text}")
```

Severity: high

This print statement appears to be a leftover from debugging. Such statements should be removed from test code to keep the test output clean and avoid confusion during test runs.
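Applied to the quoted snippet, the fix is simply dropping the print; a sketch, with the surrounding test body assumed from the excerpt above:

```python
params = SamplingParams(temperature=0, bad_words=[bad_words_1, bad_words_2])
output = llm.generate(PROMPT, params)
new_text = output[0].outputs[0].text
# debug print removed to keep test output clean
```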

@mergify (bot) added the v1 label Nov 22, 2025
Signed-off-by: Randall Smith <[email protected]>
@DarkLight1337 enabled auto-merge (squash) November 22, 2025 09:15
@github-actions (bot) added the "ready (ONLY add when PR is ready to merge/full CI is needed)" label Nov 22, 2025
@DarkLight1337 merged commit 8e22da1 into vllm-project:main Nov 22, 2025
19 checks passed
ywang96 pushed a commit to ywang96/vllm that referenced this pull request Nov 23, 2025
lpapavassiliou pushed a commit to lpapavassiliou/vllm that referenced this pull request Nov 24, 2025
RunkaiTao pushed a commit to RunkaiTao/vllm that referenced this pull request Nov 24, 2025

Labels

ready (ONLY add when PR is ready to merge/full CI is needed), v1
