Thomas Viehmann
t-vi
Principal Research Engineer at @Lightning-AI. Mathematics and Inference at @MathInf. I do a lot of @PyTorch work.
Repositories
Economic models and things in PyTorch
Totally Versatile Miscellanea for PyTorch
Gaussian Processes in PyTorch
PyTorch bindings for Warp-CTC
8-bit CUDA functions for PyTorch
Extension for Sphinx to make the sidebar show a full table of contents instead of just the local headings
PyTorch Tutorial at the LOD2021 conference
Demonstration of using Caffe2 inside an Android application.
Estimate derivatives with finite differences
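The technique behind that repository can be sketched with a generic central-difference estimator (a minimal illustration of the method, not the repository's actual API):

```python
import math

def central_difference(f, x, h=1e-5):
    """Estimate f'(x) via (f(x+h) - f(x-h)) / (2h).

    The central difference has O(h**2) truncation error, so it is
    typically more accurate than the one-sided forward difference
    (f(x+h) - f(x)) / h, which is only O(h).
    """
    return (f(x + h) - f(x - h)) / (2 * h)

# The derivative of sin at 0 is cos(0) = 1.
print(central_difference(math.sin, 0.0))
```

The step size `h` trades truncation error against floating-point cancellation; values around the cube root of machine epsilon (~1e-5 for doubles) are a common default.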
Hackable implementation of state-of-the-art open-source LLMs based on nanoGPT. Supports flash attention, 4-bit and 8-bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
5x faster QLoRA fine-tuning with 60% less memory
CUDA related news and material links
Tensors and Dynamic neural networks in Python with strong GPU acceleration
Stretching GPU performance for GEMMs and tensor contractions.
Lightning module for TorchDrift
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, quantization, LoRA fine-tuning, pre-training. Apache 2.0-licensed.
A Fusion Code Generator for NVIDIA GPUs (commonly known as "nvFuser")
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
Acceleration package for neural networks on multi-core CPUs
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Fast, modular reference implementation of Instance Segmentation and Object Detection algorithms in PyTorch.
Toolbox of models, callbacks, and datasets for AI/ML researchers.
The lightweight PyTorch wrapper for high-performance AI research. Scale your models, not the boilerplate.
Zulip server and web app: powerful open-source team chat
Coalition agreement between the SPD, Green Party, and FDP as clean PDF file, .docx file, and .txt file
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch, TensorFlow, and JAX.
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXT, EfficientNet, EfficientNetV2, NFNet, Vision Transformer, MixNet, MobileNet-V3/V2, RegNet, DPN, CSPNet, and more
PyTorch tutorials.
The friendly PIL fork (Python Imaging Library)