872 results for “topic:optimizer”
The largest collection of PyTorch image encoders / backbones, including training, eval, inference, and export scripts, and pretrained weights -- ResNet, ResNeXt, EfficientNet, NFNet, Vision Transformer (ViT), MobileNetV4, MobileNet-V3 & V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more
Multifunctional app to find duplicates, empty folders, similar images, etc.
Force Remove Copilot, Recall and More in Windows 11
Linux System Optimizer and Monitoring - https://oguzhaninan.github.io/Stacer-Web
Python SQL Parser and Transpiler
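A quick usage sketch for the parser/transpiler entry above (assuming it is the sqlglot package; `read` and `write` name the source and target dialects in its documented `transpile` call):

```python
# Minimal sqlglot sketch: translate a DuckDB expression to Hive SQL.
import sqlglot

print(sqlglot.transpile("SELECT EPOCH_MS(1618088028295)",
                        read="duckdb", write="hive")[0])
```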
SQL Optimizer and Rewriter
🚀 Optimizer for mobile applications
AdalFlow: The library to build & auto-optimize LLM applications.
torch-optimizer -- a collection of optimizers for PyTorch
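The torch-optimizer collection follows the standard `torch.optim` interface, so any of its optimizers drops into an existing training loop. A minimal sketch, assuming the `torch_optimizer` package name and its `DiffGrad` optimizer:

```python
import torch
import torch_optimizer as optim  # assumed PyPI package name

model = torch.nn.Linear(10, 2)
# Drop-in replacement: same constructor/step API as torch.optim.
optimizer = optim.DiffGrad(model.parameters(), lr=1e-3)

loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()
```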
Easily optimize images using PHP
On the Variance of the Adaptive Learning Rate and Beyond
SAM: Sharpness-Aware Minimization (PyTorch)
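SAM's core idea is a two-pass update: climb to an approximate worst-case neighbor within an L2 ball of radius rho, then apply the base optimizer using the gradient taken at that perturbed point. A hand-rolled sketch of the idea, not the repository's API; `model`, `loss_fn`, and `base_optimizer` are placeholders:

```python
import torch

def sam_step(model, loss_fn, base_optimizer, rho=0.05):
    """One SAM-style update: perturb weights toward higher loss,
    then descend using the gradient at the perturbed point."""
    # First pass: gradient at the current weights.
    loss_fn(model).backward()

    # Move each parameter rho along the normalized gradient direction.
    grads = [p for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([p.grad.norm(2) for p in grads]))
    eps = {}
    with torch.no_grad():
        for p in grads:
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)
            eps[p] = e
    model.zero_grad()

    # Second pass: gradient at the perturbed weights.
    loss_fn(model).backward()

    # Restore the original weights, then step with the second-pass gradient.
    with torch.no_grad():
        for p, e in eps.items():
            p.sub_(e)
    base_optimizer.step()
    base_optimizer.zero_grad()
```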
GLSL optimizer based on Mesa's GLSL compiler. Formerly used in Unity for mobile shader optimization.
An implementation of React v15.x that optimizes for small script size
Virtual-machine Translation Intermediate Language
Numerical optimization in pure Rust
Portfolio optimization and back-testing.
Merged into Gifsicle!
The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training”
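Per my reading of the paper, Sophia preconditions a gradient momentum with an EMA of a diagonal Hessian estimate (e.g. a Hutchinson probe) and clips the resulting step elementwise. A minimal sketch of one such update; the hyperparameter names (`gamma` for the clipping scale, the betas) are assumptions:

```python
import torch

def sophia_like_update(param, m, h, grad, hess_diag_est, lr=1e-4,
                       beta1=0.96, beta2=0.99, gamma=0.01, eps=1e-12):
    """One elementwise Sophia-style update (sketch):
    theta <- theta - lr * clip(m / max(gamma * h, eps), -1, 1)."""
    m.mul_(beta1).add_(grad, alpha=1 - beta1)            # gradient momentum
    h.mul_(beta2).add_(hess_diag_est, alpha=1 - beta2)   # Hessian-diagonal EMA
    # Clipping keeps the step bounded where the curvature estimate
    # is small or noisy.
    step = torch.clamp(m / torch.clamp(gamma * h, min=eps), -1.0, 1.0)
    param.data.add_(step, alpha=-lr)
```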
Heimer is a simple cross-platform mind map, diagram, and note-taking tool written in Qt.
An artifact optimizer for Genshin Impact (and other gacha games).
Scour - An SVG Optimizer / Cleaner
Lightweight lossless file minifier/optimizer
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
The official implementation of MARS: Unleashing the Power of Variance Reduction for Training Large Models
End-to-end Generative Optimization for AI Agents
Linux Optimizer
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
A Honkai Star Rail optimizer, relic scorer, damage calculator, and various other tools for building and gearing characters
Code for Adam-mini: Use Fewer Learning Rates To Gain More https://arxiv.org/abs/2406.16793
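Adam-mini's stated trick is to replace Adam's per-coordinate second moments with a shared scalar per parameter block, cutting optimizer memory while keeping adaptivity. A rough per-tensor sketch of that idea (illustrative only; the paper partitions parameters by Hessian block structure, not simply per tensor):

```python
import torch

def adam_mini_like_step(params, ms, vs, lr=1e-3,
                        beta1=0.9, beta2=0.999, eps=1e-8):
    """Per-coordinate momentum, but ONE second-moment scalar per block,
    so each block shares a single adaptive learning rate.
    `ms` match the parameter shapes; `vs` are 0-dim tensors."""
    for p, m, v in zip(params, ms, vs):
        if p.grad is None:
            continue
        m.mul_(beta1).add_(p.grad, alpha=1 - beta1)
        # Collapse the second moment to a scalar: mean squared gradient.
        v.mul_(beta2).add_(p.grad.pow(2).mean(), alpha=1 - beta2)
        p.data.addcdiv_(m, v.sqrt() + eps, value=-lr)
```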