Repos
41
Stars
982
Forks
254
Top Language
Python
Repositories
A Toolkit for Neural Review-based Recommendation models with PyTorch.
Supervised relation extraction with PCNN (Zeng 2014) in PyTorch (relation extraction).
Distantly supervised relation extraction models: PCNN+MIL (Zeng 2015) and PCNN+ATT (Lin 2016) (relation extraction).
A text classification example with BERT/ELMo/GloVe in PyTorch.
An Easy-to-use, Scalable and High-performance RLHF Framework (70B+ PPO Full Tuning & Iterative DPO & LoRA & RingAttention & RFT)
A text classification example using DDP, Horovod, and Accelerate.
A PyTorch reproduction of ENMF: Efficient Neural Matrix Factorization.
Sequence Parallel Attention for Long Context LLM Model Training and Inference
A tool to log in to QQ Zone using Python, with multithreading to scrape content such as messages, blog boards, and photos.
CMMLU: Measuring massive multitask language understanding in Chinese
*Statistical Learning Methods* (《统计学习方法》).
Experiments on including metadata such as URLs, timestamps, website descriptions and HTML tags during pretraining.
The release repo for "Vicuna: An Open Chatbot Impressing GPT-4"
blog
A command-line version of NetEase Cloud Music (网易云音乐).
Agricultural knowledge graph (KG): information retrieval, named entity recognition, relation extraction, taxonomy construction, and data mining for the agriculture domain.
An upgraded version of RoFormer.
Must-read papers on Recommender System.
Datasets, papers and books on AI & Finance.
A simple but complete full-attention transformer with a set of promising experimental features from various papers
FLASHQuad_pytorch
NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE.
Code for *Online Review Helpfulness Prediction with Dual Personalized Attention*.
Some configuration files in use.
A knowledge-graph-based visualization and question-answering system for character relationships in *Dream of the Red Chamber* (《红楼梦》).
A simple Hexo theme.
Tests for kedixa.top.
The lightweight PyTorch wrapper for high-performance AI research. Scale your models, not the boilerplate.
Code and source for the paper "How to Fine-Tune BERT for Text Classification?".
A Chrome extension: access Google via encrypted HTTPS links.