Repos: 10
Stars: 5
Forks: 0
Top Language: Python
Top Repositories
- Ongoing research training transformer models at scale
- NeMo: a toolkit for conversational AI
- A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper GPUs, to provide better performance with lower memory utilization in both training and inference.
- NeMo Megatron launcher and tools
- A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch
- Training library for Megatron-based models
Repositories (10)
- Training library for Megatron-based models
- Ongoing research training transformer models at scale
- No description provided.
- NeMo: a toolkit for conversational AI
- A tool to configure, launch and manage your machine learning experiments.
- A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper GPUs, to provide better performance with lower memory utilization in both training and inference.
- NeMo Megatron launcher and tools
- A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch
- Tensors and Dynamic neural networks in Python with strong GPU acceleration
- Build and train PyTorch models and connect them to the ML lifecycle using Lightning App templates, without handling DIY infrastructure, cost management, scaling, and other headaches.