Operator support for F.hardswish
#92
Comments
Hey @nikitaved (@carmocca, you are more than welcome to assist me as well!), based on our discussions under #64 and your helpful suggestions, I've started working on implementing F.hardswish as a good first issue. Here's the code snippet I came up with:

```python
@torchsymbol(torch.nn.functional.hardswish, is_method=False)
def hardswish(a: TensorProxy, /, inplace: bool = False) -> TensorLike:
    utils.check(not inplace, lambda: "hardswish only supports inplace=False", exception_type=NotImplementedError)
    return a * relu6(a + 3) / 6
```

Looking forward to your feedback and further guidance!
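For reference, here is a quick sanity check, using only standard PyTorch ops, that the decomposition above matches `torch.nn.functional.hardswish` (an illustrative snippet, not part of the Thunder code itself):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1000)
# hardswish(x) == x * relu6(x + 3) / 6, evaluated with plain PyTorch ops
decomposed = x * F.relu6(x + 3) / 6
torch.testing.assert_close(decomposed, F.hardswish(x))
```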
Hey, @shaharelys! Yes, this looks great! And yes, we will have to add the same pieces that are present for relu6. It is very important to implement tests as well, as otherwise we cannot guarantee that everything is connected properly. I am currently on holiday, but I will be back next Tuesday to help you out and expand on why we do things a certain way (as per your last post in the one-hot issue) :) Cheers!
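To make the point about tests concrete, a minimal standalone check could look something like the sketch below; it compares the proposed decomposition against PyTorch's reference implementation over a few shapes and dtypes. The project's actual tests would need to go through its own test suite so the new symbol is exercised end to end, but this illustrates the numerical property being verified:

```python
import pytest
import torch
import torch.nn.functional as F


@pytest.mark.parametrize("shape", [(4,), (2, 3), (2, 3, 4)])
@pytest.mark.parametrize("dtype", [torch.float32, torch.float64])
def test_hardswish_decomposition(shape, dtype):
    x = torch.randn(shape, dtype=dtype)
    # The decomposition used in the proposed hardswish symbol
    decomposed = x * F.relu6(x + 3) / 6
    torch.testing.assert_close(decomposed, F.hardswish(x))
```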
This has been addressed now that #100 is merged! Thanks @shaharelys!
🚀 Feature
Implement HardSwish activation function.
Motivation
Relatively easy activation function to implement; a good first issue, as @nikitaved suggested under #64.
Pitch
Add HardSwish (x * ReLU6(x + 3) / 6) leveraging existing ReLU6 support.
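For completeness, this identity is equivalent to the usual piecewise definition of HardSwish:

$$
\mathrm{hardswish}(x) = \frac{x \cdot \mathrm{ReLU6}(x + 3)}{6} =
\begin{cases}
0, & x \le -3,\\
x, & x \ge 3,\\
\dfrac{x (x + 3)}{6}, & \text{otherwise.}
\end{cases}
$$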
cc @apaz-cli