feat: Support flashinfer.rmsnorm #3424
base: main
Conversation
…add flashinfer.rmsnorm support test case
@torch.library.custom_op("flashinfer::rmsnorm", mutates_args=())  # type: ignore[misc]
def flashinfer_rmsnorm(
    input: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6
) -> torch.Tensor:
    # Forward eps to the kernel; the original snippet accepted it but never used it.
    return flashinfer.norm.rmsnorm(input, weight, eps)
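For comparison, the computation this op wraps can be sketched in plain PyTorch (a minimal reference, assuming the standard RMSNorm formula with a per-element weight over the last dimension; `rmsnorm_reference` is a hypothetical helper, useful as a baseline in the test case since it runs without flashinfer installed):

```python
import torch

def rmsnorm_reference(x: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # RMSNorm: normalize by the root mean square over the last dimension,
    # then apply a learned per-element scale.
    variance = x.pow(2).mean(dim=-1, keepdim=True)
    return x * torch.rsqrt(variance + eps) * weight

x = torch.randn(2, 8)
w = torch.randn(8)
out = rmsnorm_reference(x, w)
```

A test can then assert that the flashinfer-backed op and this reference agree to within kernel tolerance.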
We don't need this registration, since you can use torch.ops.flashinfer.rmsnorm directly.
Resolved review threads on py/torch_tensorrt/dynamo/conversion/plugins/_generate_plugin.py
@bowang007 Can you move the auto plugin gen tests to a runner other than the one used for converter tests?
This PR resolves some issues and adds support for flashinfer.rmsnorm.
Checklist: