
@wxsIcey wxsIcey commented Dec 4, 2025

What this PR does / why we need it?

This PR adds a qkv_rmsnorm_rope operator and introduces a graph fusion pass for qknorm_rope operations. The implementation includes a new configuration flag, a pattern-matching pass using torch._inductor.pattern_matcher, and a custom Triton kernel for the fused operation.

Co-authored-by: Angazenn [email protected]

Does this PR introduce any user-facing change?

Yes, a new additional_config option is added.
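
For illustration, enabling the fusion could look like the sketch below; the flag name under additional_config is hypothetical here, since the actual key is defined in this PR's diff:

    # Hypothetical usage sketch. "enable_qknorm_rope_fusion" is a stand-in
    # for whatever key this PR actually defines under additional_config.
    from vllm import LLM

    llm = LLM(model="Qwen/Qwen3-30B-A3B",
              additional_config={"enable_qknorm_rope_fusion": True})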

How was this patch tested?

todo

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request introduces a graph fusion pass for qknorm_rope operations on Ascend hardware, which is a great step for performance optimization. The implementation includes a new configuration flag, a pattern matching pass using torch._inductor.pattern_matcher, and a custom Triton kernel for the fused operation. The code is well-structured, but I've identified several areas for improvement regarding code quality, robustness, and maintainability. My review comments focus on removing debug artifacts, improving code clarity and consistency, enhancing robustness by avoiding hardcoded values and unsafe module-level initializations, and addressing significant code duplication.

github-actions bot commented Dec 4, 2025

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write the commit message by filling out the PR description to help reviewers and future developers understand.

If CI fails, you can run linting and testing checks locally according to Contributing and Testing.

@wxsIcey wxsIcey changed the title from [Fusion] [Graph] Add qknorm rope fusion to [Fusion] [Graph] Add qknorm rope fusion operator on Dec 5, 2025
dtype=self.dtype,
device=self.device)
# For GQA models.
elif not self.vllm_config.model_config.use_mla:
Collaborator:

We should be more careful with this condition. As far as I know, GQA models do not always have a rope_dim of 128, and this hardcoded value might cause bugs. Perhaps we can limit it to qwen3_moe only?
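
For illustration, such a guard might look like the sketch below; the attribute paths and the exact check are assumptions, not this PR's actual code:

    # Hypothetical guard: restrict the fusion to qwen3_moe, where rope_dim is
    # known to be 128. Attribute paths are assumptions for illustration.
    def qknorm_rope_fusion_supported(vllm_config) -> bool:
        model_config = vllm_config.model_config
        return (not model_config.use_mla
                and model_config.hf_config.model_type == "qwen3_moe")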

return q_output, k_output, v_output


direct_register_custom_op(op_name="qkv_rmsnorm_rope",
Collaborator:

Consider using import torch_npu._inductor instead.

wxsIcey (Collaborator Author):

Inductor's pattern_matcher does not support Triton operators directly. It does support torch.ops.aten (aten operators), torch.ops.npu (custom operators), and torch.add (the PyTorch API). Therefore, the Triton kernel is wrapped as a custom op.
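
As an illustrative sketch of that wrapping (signatures are simplified; the real operator takes more arguments such as the k weight, eps, and the cos/sin caches):

    # Minimal sketch, assuming vLLM's direct_register_custom_op helper as used
    # in the quoted diff. Shapes and argument lists are simplified stand-ins.
    import torch
    from vllm.utils import direct_register_custom_op

    def qkv_rmsnorm_rope(qkv: torch.Tensor, q_weight: torch.Tensor):
        # The real implementation launches the fused Triton kernel here and
        # returns q/k with rmsnorm + rope applied, plus v untouched.
        ...

    def qkv_rmsnorm_rope_fake(qkv: torch.Tensor, q_weight: torch.Tensor):
        # Fake (meta) implementation so torch.compile can trace output shapes.
        # An equal three-way split is a simplification; GQA sizes differ.
        q, k, v = qkv.chunk(3, dim=-1)
        return torch.empty_like(q), torch.empty_like(k), torch.empty_like(v)

    direct_register_custom_op(op_name="qkv_rmsnorm_rope",
                              op_func=qkv_rmsnorm_rope,
                              mutates_args=[],
                              fake_impl=qkv_rmsnorm_rope_fake)
    # The kernel is then addressable as torch.ops.vllm.qkv_rmsnorm_rope,
    # a form the inductor pattern matcher can match and emit.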

return driver.active.utils.get_device_properties(device)


num_vectorcore = get_npu_properties()["num_vectorcore"]
Collaborator:

This parameter has already been defined in triton/utils.py.
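
(For reference, a sketch of how such a property is typically queried and used to size a Triton launch grid; the "num_vectorcore" key is specific to the Ascend backend and is an assumption here:)

    # Sketch: query device properties through Triton's active driver. CUDA
    # drivers expose keys like "multiprocessor_count" from the same call;
    # "num_vectorcore" is assumed to be the Ascend/triton-ascend equivalent.
    import triton
    from triton.runtime import driver

    def get_npu_properties(device: int = 0) -> dict:
        return driver.active.utils.get_device_properties(device)

    num_vectorcore = get_npu_properties()["num_vectorcore"]

    def launch_grid(n_elements: int, block_size: int) -> tuple:
        # Persistent-kernel style: never launch more blocks than vector cores.
        return (min(triton.cdiv(n_elements, block_size), num_vectorcore),)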

wxsIcey (Collaborator Author):

Thanks. I have modified it.

@wxsIcey wxsIcey marked this pull request as ready for review December 9, 2025 02:04
github-actions bot commented: This pull request has conflicts; please resolve them before we can evaluate the pull request.

@wxsIcey wxsIcey added the ready (read for review) and ready-for-test (start test by label for PR) labels on Dec 11, 2025
@wxsIcey wxsIcey requested a review from whx-sjtu December 11, 2025 03:02
wxsIcey (Collaborator Author) commented Dec 11, 2025

This PR relies on #4409, because CI has no Triton.



return q_rope, k_rope, v

def replacement(qkv: torch.Tensor, q_weight: torch.Tensor,
Collaborator:

Patterns of the form 'if xxx ... else: torch.ops.vllm.qkv_rmsnorm_rope' need support in future releases.

wxsIcey (Collaborator Author):

We don't perform any special checks in the pattern. You can add a new pattern match.
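
For readers unfamiliar with the mechanism, a self-contained toy of register_replacement is sketched below. It fuses a hand-decomposed RMSNorm into torch.nn.functional.rms_norm; the PR's actual pass instead rewrites the split + qk-rmsnorm + rope subgraph into a single torch.ops.vllm.qkv_rmsnorm_rope call:

    # Toy sketch of torch._inductor.pattern_matcher.register_replacement.
    # Fuses a hand-written RMSNorm into F.rms_norm (torch >= 2.4); the PR
    # applies the same mechanism to the qk-norm + rope subgraph.
    import torch
    from torch._inductor.pattern_matcher import (PatternMatcherPass, fwd_only,
                                                 register_replacement)

    patterns = PatternMatcherPass()

    def pattern(x: torch.Tensor, weight: torch.Tensor):
        variance = x.pow(2).mean(-1, keepdim=True)
        return x * torch.rsqrt(variance + 1e-6) * weight

    def replacement(x: torch.Tensor, weight: torch.Tensor):
        return torch.nn.functional.rms_norm(x, (x.shape[-1],), weight, 1e-6)

    example_inputs = [torch.empty(4, 64), torch.empty(64)]
    register_replacement(pattern, replacement, example_inputs, fwd_only, patterns)
    # patterns.apply(graph) then rewrites every matching FX subgraph.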


Signed-off-by: wxsIcey <[email protected]>
Labels: module:core, module:ops, ready (read for review), ready-for-test (start test by label for PR)
