[TORCH] Added flex_attention hop function #4366
Status: Open
keshavvinayak01 wants to merge 34 commits into llvm:main from keshavvinayak01:keshavvinayak01/torch-aten-flex_attention
+363 −1
Commits (34), all authored by keshavvinayak01:
c8c711c  Modified fx_importer to support hop_while_loop
b250583  Addressed Comments | Simplified unique child_func_name creation
db1e7e9  Addressed comments
d9646c6  Formatting
cc03291  Added children module imports to import_frozen_program flow
6a70e1c  Formatting and reordered CHECKs
85e3acd  Changes done to TorchToScf:
e1ff87d  Added Control flow test
558c7db  Cannot FX trace HOP
39d5b24  Added flex_attention hop function
dfdca75  Formatting
6178d07  Fixed merge newline removals
52f1fbc  Added AtenFluxAttentionOp
a56433a  Added changes for correct functional references
b0e8585  QOL changes:
c34efab  Merge branch 'main' into keshavvinayak01/torch-aten-flex_attention
4470978  Update fx_importer.py to remove deprecated note
719fe5a  Clarify enable_gqa support in fx_importer.py
5e024f6  Fix formatting in GeneratedTorchOps.td
c78d699  return_lse is part of the kernel options
da23ec9  Moved op definition to TorchOps.td
af59413  Formatting TorchOps
0103163  Added lit-test; Docs for FlexAttention
48f12bc  Formatting
ec3e5f8  Modified arg extraction
fa5aba2  Removed enable_gqa from flex_attention; HOP does not accept that argu…
2b0637c  Typo
e7da0a7  Simplified arg extract logic
53dd19a  return_lse should be booltype not i1
de91ca2  Added basic_test for flex_attention
47803e3  Formatting and allowed unused unpacked vals
207621c  Added max_scores; changes to match pytorch naming conventions; Added …
acc3ade  Corrected lit test
16fc70c  Renamed aten.flex_attention -> hop_flex_attention; Added more lit tests
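For context on what the imported op looks like from the PyTorch side: the flex_attention higher-order op (HOP) that the fx_importer changes in this PR handle is the node `torch.export` produces for calls to `torch.nn.attention.flex_attention`. A minimal sketch of such a program follows; the module name, tensor shapes, and the `score_mod` body are illustrative assumptions, not taken from this PR:

```python
# Minimal sketch (assumed shapes and score_mod; not from this PR) of a program
# whose torch.export graph contains the flex_attention higher-order op.
import torch
from torch.nn.attention.flex_attention import flex_attention


def rel_bias(score, batch, head, q_idx, kv_idx):
    # score_mod callback: pointwise rewrite of each attention score.
    # Here an illustrative relative-position bias.
    return score + (q_idx - kv_idx)


class FlexAttn(torch.nn.Module):
    def forward(self, q, k, v):
        return flex_attention(q, k, v, score_mod=rel_bias)


q = k = v = torch.randn(2, 4, 128, 64)  # (batch, heads, seq_len, head_dim)
ep = torch.export.export(FlexAttn(), (q, k, v))
print(ep.graph)  # the graph contains a flex_attention HOP node
```

Feeding such an ExportedProgram through torch-mlir's FX importer (e.g. via `torch_mlir.fx.export_and_import`, assuming that entry point) is what would exercise the new path, emitting the op this PR renames from `aten.flex_attention` to `hop_flex_attention` in its final commit.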