
Conversation

@vasqu (Contributor) commented Dec 2, 2025

As per title, you need the `()` call for the decorator to work as intended. I'm not sure about the implications, but it certainly does not hurt to enable it and avoid any sort of unexpected memory surge.


 @traced
-@torch.no_grad
+@torch.no_grad()
@vasqu (Contributor, Author) commented on the diff:

Unsure about this one, but I don't see a reason why we would not close here as well. cc @remi-or

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@ArthurZucker (Collaborator) left a comment:

Both are equivalent, no?

@vasqu (Contributor, Author) commented Dec 3, 2025

> Both are equivalent, no?

Did a sanity check:

import torch
x = torch.tensor([1.], requires_grad=True)

def identity(x):
    return x * 1
z = identity(x)
z.requires_grad  # True

@torch.no_grad()
def doubler(x):
    return x * 2
z = doubler(x)
z.requires_grad  # False

@torch.no_grad
def tripler(x):
    return x * 3
z = tripler(x)
z.requires_grad  # False

You are correct, they are equivalent. I was confused about the conventions around `()`; I guess this is really nitpicky then. Will close this.
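For context, the reason both spellings work is that `torch.no_grad` is written to accept either a function (bare `@torch.no_grad`) or no arguments (`@torch.no_grad()`). Below is a minimal, hypothetical sketch of that dual-mode decorator pattern in plain Python; `no_grad_like`, `f`, and `g` are illustrative names, and the real PyTorch implementation (via `_NoParamDecoratorContextManager`) additionally toggles gradient tracking, which this sketch omits.

```python
import functools

class no_grad_like:
    """Hypothetical sketch of a context manager usable both as
    `@no_grad_like` and `@no_grad_like()`, mimicking the shape of
    torch.no_grad (gradient toggling itself is omitted here)."""

    def __new__(cls, orig_func=None):
        # Bare usage (`@no_grad_like`): Python passes the function
        # directly to the class, so build an instance and immediately
        # wrap the function with it instead of returning the instance.
        if orig_func is not None:
            instance = super().__new__(cls)
            return instance(orig_func)
        return super().__new__(cls)

    def __enter__(self):
        # In torch, entering would disable gradient tracking.
        return self

    def __exit__(self, *exc):
        return False

    def __call__(self, func):
        # Parenthesized usage (`@no_grad_like()`): the instance is
        # called with the function and returns a wrapper that enters
        # the context around every invocation.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            with self.__class__():
                return func(*args, **kwargs)
        return wrapper

@no_grad_like
def f(x):
    return x * 2

@no_grad_like()
def g(x):
    return x * 3
```

Both `f(2)` and `g(2)` behave the same way here, which mirrors the sanity check above: the decorator is applied correctly with or without the parentheses.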

@vasqu vasqu closed this Dec 3, 2025
