24 results for “topic:bilstm-attention”
Chinese entity relation extraction, PyTorch, BiLSTM+Attention
Implementations of common NLP tasks, including new-word discovery and PyTorch-based word vectors, Chinese text classification, entity recognition, abstractive text summarization, sentence similarity, triple extraction, pretrained models, and more.
1. Use BERT, ALBERT, and GPT2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN, and GraphSAGE based on message passing.
PyTorch implementation of some text classification models (HAN, fastText, BiLSTM-Attention, TextCNN, Transformer) | text classification
Use BiLSTM-Attention, BERT, ALBERT, RoBERTa, and XLNet models to classify the SST-2 dataset, based on PyTorch
Implementation of papers for text classification task on SST-1/SST-2
Chinese sentiment classification | three-class text sentiment analysis
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
An NLP research project utilizing the "cardiffnlp/twitter-roberta-base-sentiment-latest" pre-trained transformer for tweet tokenization. The project includes an attention-based biLSTM model that predicts sentiment labels for tweets as negative (-1), neutral (0), or positive (1).
2022 COMAP Problem C: Bitcoin and Gold Quantitative Trading
The design and implementation of an advanced BiLSTM-based model integrated with an attention mechanism for network intrusion detection using the NSL-KDD dataset.
Deep Learning Library for Text Classification.
Analysis of the Martingale Strategy in the Quantitative Trading Market Based on a BiLSTM-Attention Model
Explainable Sentence-Level Sentiment Analysis – Final project for "Deep Natural Language Processing" course @ PoliTO
This project features a Next Word Prediction Model, implemented as a Bi-LSTM with an attention layer and deployed via a Flask API.
Deep Learning based end-to-end solution for detecting fraudulent and spam messages across all your devices
This repo contains all files needed to train and select NLP models for fake news detection
This folder implements our solution for the Enefit challenge.
Binary Classification
Course project of CS247.
📌 Short Description: This repository implements and evaluates a deep learning pipeline for image captioning, using various CNN encoders (VGG16, ResNet50, InceptionV3, etc.) and a Bi-LSTM decoder with attention. The system is trained on the Flickr8k dataset and benchmarked with BLEU, METEOR, ROUGE-L, and CIDEr metrics to assess caption quality.
Hybrid CNN–BiLSTM–Attention based early relay maloperation prediction system (ICCECE 2026 accepted)
🚀 A hybrid deep learning model for fake news detection using BERT and BiLSTM with attention mechanism.
The goal of this assignment is to predict the secondary structure (sst3 and sst8 values) from just the primary sequence (seq) using deep learning techniques, which can significantly reduce the need for expensive lab work. Authors: Garv Sachdev, Bay Yong Wei Nicholas, Nathanael Lo Tzin Ye.
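Many of the repositories above share the same core architecture: a bidirectional LSTM whose per-timestep outputs are pooled by an attention layer before a final classifier. A minimal PyTorch sketch of that pattern is shown below; the class name, layer sizes, and additive scoring function are illustrative assumptions, not taken from any specific repository listed here.

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Minimal BiLSTM + attention text classifier (illustrative sketch)."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional LSTM: output feature size is 2 * hidden_dim
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Additive attention: one scalar score per time step
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                           # x: (batch, seq_len) token ids
        h, _ = self.lstm(self.embedding(x))         # h: (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)          # attention-weighted pooling
        return self.fc(context)                     # (batch, num_classes) logits

# Tiny smoke test with random token ids
model = BiLSTMAttention(vocab_size=100, embed_dim=16, hidden_dim=32, num_classes=2)
logits = model(torch.randint(0, 100, (4, 12)))      # batch of 4, sequence length 12
print(tuple(logits.shape))                          # (4, 2)
```

The attention weights replace the usual "take the last hidden state" pooling, letting the classifier weight informative tokens; saving `weights` also gives a simple per-token explanation, which is what the explainability-oriented entries above exploit.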