
Conversation

@jianzs (Collaborator) commented Dec 5, 2025

What this PR does / why we need it?

Set `torch._dynamo.config.inline_inbuilt_nn_modules = False` before torchair graph capture. This cuts down on data preparation time for torchair and speeds up graph dispatch.

Does this PR introduce any user-facing change?

No

How was this patch tested?

@jianzs added the ready (read for review) and ready-for-test (start test by label for PR) labels Dec 5, 2025
@jianzs force-pushed the gh-rm-torchair-cost branch from 503d19e to 828fb05 on December 5, 2025 at 14:22
@gemini-code-assist bot (Contributor) left a comment

Code Review

The pull request introduces a change to `vllm_ascend/torchair/torchair_model_runner.py`, specifically within the `_compile_torchair_graph` method. A new line, `torch._dynamo.config.inline_inbuilt_nn_modules = False`, is added before the loop that triggers torchair graph capture. This configuration change likely aims to prevent `torch.dynamo` from inlining built-in neural network modules during graph compilation. No review comments were provided for this change.
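For readers unfamiliar with the flag, here is a minimal, self-contained sketch of the configuration the review describes. The `nn.Linear` model, the `torch.compile` call, and the batch-size loop are illustrative stand-ins only, not the torchair capture path; the actual PR sets the flag inside `_compile_torchair_graph` before its capture loop.

```python
import torch
import torch.nn as nn

# The one-line change the review describes: disable dynamo's inlining of
# built-in nn.Module forward bodies before graph capture. With the flag
# off, dynamo specializes on module instances instead of tracing through
# their Python source, which the PR reports shortens data preparation
# and speeds up graph dispatch for torchair.
torch._dynamo.config.inline_inbuilt_nn_modules = False

# Illustrative stand-in for the capture loop: run a compiled toy module
# at several batch sizes, the way _compile_torchair_graph captures one
# torchair graph per padded batch size.
model = nn.Linear(16, 16)
compiled = torch.compile(model)
for batch_size in (1, 2, 4):
    compiled(torch.randn(batch_size, 16))  # graphs are (re)captured as new shapes are seen
```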

github-actions bot commented Dec 5, 2025

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write the commit message and fill out the PR description to help reviewers and future developers understand the change.

If CI fails, you can run the linting and testing checks locally according to the Contributing and Testing guides.
