Gated Pretrained Transformer model for robust denoising sequence-to-sequence modelling
Novel recurrent layers for Flux.jl
GSTA: Gated Spatial-Temporal Attention Approach for Travel Time Prediction
A Keras implementation of a Gated Neural Network for Option Pricing.
This repository contains three variants of a sentiment analysis model that uses a GRU (Gated Recurrent Unit) to predict the sentiment of a given text as either positive or negative. The models were built with PyTorch, and the training and testing data came from DLStudio (a minimal illustrative sketch of this kind of classifier appears after this list).
Adaptive Hierarchical Attention-Enhanced Gated Network Integrating Reviews for Item Recommendation [TKDE 2021]
A machine translation project featuring an RNN-based Seq2Seq model, a Transformer model, and pretrained models for translating English to Spanish and Urdu.
Classifier for app reviews on a scale of 1 to 5 using Gated Recurrent Unit (GRU).
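Several of the repositories above describe GRU-based sentiment or review classifiers built with PyTorch. The sketch below shows what such a model can look like; the class name, layer sizes, and toy input are assumptions made for illustration, not code from any of the listed repositories, and the DLStudio data pipeline is omitted.

    # Minimal sketch of a GRU-based binary sentiment classifier in PyTorch.
    # All names and hyperparameters here are illustrative assumptions.
    import torch
    import torch.nn as nn

    class GRUSentimentClassifier(nn.Module):
        def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.fc = nn.Linear(hidden_dim, 2)  # two classes: negative / positive

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) tensor of integer token indices
            embedded = self.embedding(token_ids)
            _, hidden = self.gru(embedded)      # hidden: (1, batch, hidden_dim)
            return self.fc(hidden.squeeze(0))   # logits: (batch, 2)

    # Toy usage with random token ids standing in for a tokenized review.
    model = GRUSentimentClassifier(vocab_size=10_000)
    batch = torch.randint(0, 10_000, (4, 50))
    logits = model(batch)
    print(logits.shape)  # torch.Size([4, 2])

Using the final GRU hidden state as the sequence representation keeps the sketch small; the actual repositories may instead pool over all time steps, stack multiple GRU layers, or (as in the app-review classifier) widen the output layer to five rating classes.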