how to implement target attention in this framework #208
Hey @LeiShenVictoria. So I have not read the Deep Interest Network paper; I will, and maybe I can incorporate some of its ideas into the library. As of right now, the only thing "kind-of" similar you have here are the attention weights of the models: all model components that are based on attention mechanisms expose their attention weights as an attribute. I will have a look at the Deep Interest Network paper asap and see if I can come up with a quick answer that is more helpful :)
Hi, thanks for your reply.
Hey @LeiShenVictoria, I would have to read the paper :) I am busy at work now, but I'll see what I can do asap.
There is a branch called ffm now where DIN is implemented. In the examples folder there is a script showing how to use it. I know the issue was opened a while ago, but it took me a while to find the time. Let me know if you have any questions.
Covered in PR #234
In DeepInterestNetwork, there is a target attention between a candidate feature (one column) and a sequence feature. How can this target attention be implemented in this repo? It could be considered an attention between a column in the deep part (the candidate) and the text part (the sequence), I guess...
Thanks a lot
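For reference, here is a minimal sketch of the kind of target attention (local activation unit) the DIN paper describes: each item embedding in the behavior sequence is scored against the candidate embedding by a small MLP over the concatenation of the two embeddings, their difference, and their element-wise product, and the sequence is then pooled with the resulting weights. This is a standalone PyTorch illustration, not the API of this library or of the `ffm` branch; the class name `TargetAttention` and the hidden size are made up for the example.

```python
import torch
import torch.nn as nn


class TargetAttention(nn.Module):
    """DIN-style local activation unit (illustrative sketch, not library code).

    Scores each item in a behavior sequence against a candidate item and
    returns the attention-weighted sum of the sequence embeddings.
    """

    def __init__(self, embed_dim: int, hidden_dim: int = 32):
        super().__init__()
        # MLP over [candidate, item, candidate - item, candidate * item]
        self.mlp = nn.Sequential(
            nn.Linear(4 * embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, candidate: torch.Tensor, sequence: torch.Tensor) -> torch.Tensor:
        # candidate: (batch, embed_dim); sequence: (batch, seq_len, embed_dim)
        seq_len = sequence.size(1)
        cand = candidate.unsqueeze(1).expand(-1, seq_len, -1)
        feats = torch.cat(
            [cand, sequence, cand - sequence, cand * sequence], dim=-1
        )
        scores = self.mlp(feats)                # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)  # attention over the sequence
        return (weights * sequence).sum(dim=1)  # (batch, embed_dim)


# Usage: pool a length-5 behavior sequence conditioned on a candidate item
att = TargetAttention(embed_dim=16)
pooled = att(torch.randn(8, 16), torch.randn(8, 5, 16))  # shape (8, 16)
```

Note that the original paper uses the raw (un-normalized) scores as weights to preserve the intensity of user interest; the softmax here is a common simplification.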