278 results for “topic:distillation”
Awesome Knowledge Distillation
TurboDiffusion: 100–200× Acceleration for Video Diffusion Models
A unified inference and post-training framework for accelerated video generation.
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014–2021).
A collection of industry classics and cutting-edge papers in the field of recommendation/advertising/search.
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
Pytorch implementation of various Knowledge Distillation (KD) methods.
A PyTorch-based knowledge distillation toolkit for natural language processing
PaddleSlim is an open-source library for deep model compression and architecture search.
All-in-one training for vision models (YOLO, ViTs, RT-DETR, DINOv3): pretraining, fine-tuning, distillation.
mobilev2-yolov5s pruning and distillation, with support for ncnn and TensorRT deployment. Ultra-light but better performance!
Generate High-Quality Synthetics, Train, Measure, and Evaluate in a Single Pipeline
A collection of high-quality Chinese pretrained models: state-of-the-art large models, the fastest small models, and specialized similarity models.
Kandinsky 5.0: A family of diffusion models for Video & Image generation
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
Prompt engineering for developers
⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation (AAAI 2025 Oral)
NVIDIA FastGen: Fast Generation from Diffusion Models
Segmind Distilled diffusion
Reinforcement Learning via Self-Distillation (SDPO)
[ICLR 2026] rCM: SOTA JVP-Based Diffusion Distillation & Few-Step Video Generation & Scaling Up sCM/MeanFlow & Real-Time Autoregressive Video Diffusion
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
Irresponsible innovation. Try now at https://chat.dev/
Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.
Official codebase for "Causal Forcing: Autoregressive Diffusion Distillation Done Right for High-Quality Real-Time Interactive Video Generation"
The Official Repo for "Quick Start Guide to Large Language Models"
A curated archive of breakthroughs in Agents, Architecture, Training, RAG, and On-Device AI.