Top Repositories
Modifications to Graph Wavenet
TF Object Detection on Kitti Data
Visualize your iMessage conversations
Backtranslations of IMDB movie reviews for Data Augmentation Purposes
PyTorch implementation of MixMatch (https://arxiv.org/abs/1905.02249v1)
Predicting NBA performance using college box-score stats and combine measurements.
Repositories
verl: Volcano Engine Reinforcement Learning for LLMs
SGLang is a high-performance serving framework for large language models and multimodal models.
Modifications to Graph Wavenet
Backtranslations of IMDB movie reviews for Data Augmentation Purposes
Predicting NBA performance using college box-score stats and combine measurements.
Visualize your iMessage conversations
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
TF Object Detection on Kitti Data
news-please - an integrated web crawler and information extractor for news that just works
Development repository for the Triton language and compiler
A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper and Ada GPUs, to provide better performance with lower memory utilization in both training and inference.
Code for the paper "Efficient Training of Language Models to Fill in the Middle"
This repository contains the code for "Exploiting Cloze Questions for Few-Shot Text Classification and Natural Language Inference"
PyTorch implementation of MixMatch (https://arxiv.org/abs/1905.02249v1)
A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch
Locality Sensitive Hashing using MinHash in Python/Cython to detect near duplicate text documents
[ICML 2020] code for "PowerNorm: Rethinking Batch Normalization in Transformers" https://arxiv.org/abs/2003.07845
Prefix-Tuning: Optimizing Continuous Prompts for Generation
Beyond the Imitation Game collaborative benchmark for enormous language models
Tensors and Dynamic neural networks in Python with strong GPU acceleration
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
PyTorch extensions for high performance and large scale training.
🤗 Fast, efficient, open-access datasets and evaluation metrics for Natural Language Processing and more in PyTorch, TensorFlow, NumPy and Pandas
Understanding the Difficulty of Training Transformers
👾 A library of state-of-the-art pretrained models for Natural Language Processing (NLP)
The lightweight PyTorch wrapper for ML researchers. Scale your models. Write less boilerplate
A Python utility / library to sort imports.
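One repository in the list above detects near-duplicate text documents with Locality Sensitive Hashing over MinHash signatures. The core idea can be sketched in a few lines of plain Python. This is a minimal illustration of the MinHash part only, not code from that repository; the function names and the salted-MD5 hash family are my own choices for the sketch:

```python
import hashlib


def shingles(text, k=3):
    """Split text into the set of overlapping k-word shingles."""
    words = text.split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}


def minhash_signature(shingle_set, num_hashes=64):
    """For each of num_hashes salted hash functions, keep the minimum
    hash value over all shingles. Documents with many shared shingles
    tend to share many signature positions."""
    sig = []
    for i in range(num_hashes):
        salt = str(i).encode()
        sig.append(min(
            int.from_bytes(hashlib.md5(salt + s.encode()).digest()[:8], "big")
            for s in shingle_set
        ))
    return sig


def estimated_jaccard(sig_a, sig_b):
    """The fraction of matching positions is an unbiased estimate of the
    Jaccard similarity of the underlying shingle sets."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)


a = "the quick brown fox jumps over the lazy dog"
b = "the quick brown fox leaps over the lazy dog"
sim = estimated_jaccard(minhash_signature(shingles(a)),
                        minhash_signature(shingles(b)))
```

A production version (as in the repository's Cython implementation) would bucket signature bands into hash tables so that candidate pairs are found without comparing every pair of documents.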