15 results for “topic:seq2seq-pytorch”
Minimalist NMT for educational purposes
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
A PyTorch implementation of the hierarchical encoder-decoder architecture (HRED) introduced in Sordoni et al. (2015), built for modeling conversation triples in the MovieTriples dataset.
Neural Machine Translation using LSTMs and an attention mechanism. Two models were implemented: one without attention using a repeat vector, and one using an encoder-decoder architecture with attention.
Paper implementation of attention mechanisms in neural networks
No description provided.
Repository containing the code for my bachelor thesis on Neural Machine Translation (2019)
ICLR_2018_Reproducibility_Challenge : Sketch-RNN
A sequence-to-sequence model implemented in PyTorch
French-to-English neural machine translation trained on the Multi30k dataset.
No description provided.
REST API for training and prediction of seq2seq model
Interpretation of an English autoencoder (seq2seq model).
A study of NLP models for translating Korean to English.
Implementations of selected papers from AI, RL, and NLP conferences and reputable journals