
Conversation

@dorde-antic (Contributor) commented Nov 24, 2025

Motivation

Resolves https://github.com/ROCm/rocMLIR-internal/issues/2142

Technical Details

Check for WMMA support (rather than MFMA) in perfRunner when assigning datatypes for attention.
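The gating described above might look roughly like the following sketch. This is illustrative only, not the actual perfRunner code: the helper names, chip lists, and datatype lists are assumptions for the example.

```python
# Hypothetical sketch of gating attention datatypes on WMMA support
# instead of MFMA support. Chip sets and helper names are illustrative,
# not the real rocMLIR perfRunner implementation.

# Architectures assumed (for this sketch) to expose WMMA instructions.
WMMA_CHIPS = {"gfx1100", "gfx1101", "gfx1102", "gfx1200", "gfx1201"}
# Architectures assumed (for this sketch) to expose MFMA instructions.
MFMA_CHIPS = {"gfx908", "gfx90a", "gfx940", "gfx942"}

def has_wmma(chip: str) -> bool:
    """Whether the chip is assumed to support WMMA (sketch only)."""
    return chip in WMMA_CHIPS

def attention_datatypes(chip: str) -> list[str]:
    """Pick datatypes to tune attention with on a given chip.

    In this sketch, WMMA hardware does not get f32 attention configs,
    matching the PR's goal of filtering them out on e.g. gfx1100.
    """
    if has_wmma(chip):
        return ["f16", "i8"]           # f32 configs filtered out
    return ["f32", "f16", "i8"]        # MFMA (or fallback) keeps f32

print(attention_datatypes("gfx1100"))  # WMMA chip: no f32
print(attention_datatypes("gfx90a"))   # MFMA chip: f32 kept
```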

Test Plan

Weekly CI - Tuning phase

Test Result

CI run:
- Successfully filtered out f32 attention configs on the WMMA architecture (see the attention tuning results on gfx1100).
- The run failed for other, unrelated tuning issues.

Submission Checklist

@dorde-antic dorde-antic changed the title Try checking for wmma instead for mfma Check for WMMA instead of MFMA when assigning datatypes for attention Nov 26, 2025
@dorde-antic dorde-antic marked this pull request as ready for review November 26, 2025 10:42
@dorde-antic dorde-antic requested a review from causten as a code owner November 26, 2025 10:42
@umangyadav (Member)
@dorde-antic CI has failed on Navi3x. It is still picking f32 attentions

@dorde-antic (Contributor, Author)

> @dorde-antic CI has failed on Navi3x. It is still picking f32 attentions

Weird, since on Navi4x it didn't (and the logic we use is the same). It might be related to how we obtain the chip name on Navi3x; I'll investigate.
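One way a chip-name path could go wrong (purely illustrative; the actual rocMLIR target parsing may differ) is if the gfx name is taken from a target string that carries feature suffixes, so an exact-match WMMA check misses it:

```python
# Illustrative only: an exact-match WMMA check fails when the chip name
# carries target-feature suffixes, as AMD gfx targets often do
# (e.g. "gfx1100:xnack-"). Chip set below is an assumption.
WMMA_CHIPS = {"gfx1100", "gfx1101", "gfx1102"}

def has_wmma_exact(target: str) -> bool:
    # Misses "gfx1100:xnack-", so f32 configs would slip through.
    return target in WMMA_CHIPS

def has_wmma_normalized(target: str) -> bool:
    chip = target.split(":")[0]   # strip feature suffixes first
    return chip in WMMA_CHIPS

print(has_wmma_exact("gfx1100:xnack-"))       # False
print(has_wmma_normalized("gfx1100:xnack-"))  # True
```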

@dhernandez0 (Contributor)

> @dorde-antic CI has failed on Navi3x. It is still picking f32 attentions
>
> Weird since on Navi4x it didn't (and the logic we use is the same) It might be related to how we obtain the chip name on navi3x maybe... i'll investigate

@dorde-antic This is blocking the weekly CI for the upstream merge. Please give this PR priority.

@dorde-antic (Contributor, Author)

@umangyadav I would rather focus on merging #2123 as a temporary solution than on merging this PR. #2123 solves both the f32 attention issue and ensures we don't try every possible combination on MITuna.
