28 results for “topic:zeroth-order-optimization”
Official implementation for the paper "Model-based Diffusion for Trajectory Optimization". Model-based diffusion (MBD) is a novel diffusion-based trajectory optimization framework that uses a dynamics model to run the reverse denoising process and generate high-quality trajectories.
[JMLR (CCF-A)] PyPop7: A Pure-PYthon LibrarY for POPulation-based Black-Box Optimization (BBO), especially *Large-Scale* algorithm variants (from evolutionary computation, swarm intelligence, statistics, operations research, machine learning, mathematical optimization, meta-heuristics, auto-control etc.). [https://jmlr.org/papers/v25/23-0386.html]
ZO2 (Zeroth-Order Offloading): Full Parameter Fine-Tuning 175B LLMs with 18GB GPU Memory [COLM2025]
Square Attack: a query-efficient black-box adversarial attack via random search [ECCV 2020]
Elo ratings for global black-box derivative-free optimizers
Official implementation for the paper "CoVO-MPC: Theoretical Analysis of Sampling-based MPC and Optimal Covariance Design" accepted by L4DC 2024. CoVO-MPC is an optimal sampling-based MPC algorithm.
[ICML'24] Official code for the paper "Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark".
Powell's Derivative-Free Optimization solvers.
[ICLR'24] "DeepZero: Scaling up Zeroth-Order Optimization for Deep Model Training" by Aochuan Chen*, Yimeng Zhang*, Jinghan Jia, James Diffenderfer, Jiancheng Liu, Konstantinos Parasyris, Yihua Zhang, Zheng Zhang, Bhavya Kailkhura, Sijia Liu
This repository contains the PyTorch implementation of Zeroth Order Optimization Based Adversarial Black Box Attack (https://arxiv.org/abs/1708.03999)
Robustify Black-Box Models (ICLR'22 - Spotlight)
Every Call is Precious: Global Optimization of Black-Box Functions with Unknown Lipschitz Constants
Modular optimization library for PyTorch (work-in-progress).
Code for IEEE MLSP 2021 paper titled "Model-Free Learning of Optimal Deterministic Resource Allocations in Wireless Systems via Action-Space Exploration"
Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
Benchmarking optimization solvers.
SCOBO: Sparsity-aware Comparison Oracle Based Optimization
Hard-Thresholding Meets Evolution Strategies in Reinforcement Learning
Implementation of the algorithms described in the papers "ZO-AdaMM: Zeroth Order Adaptive Momentum" by Chen et al., "Stochastic first- and zeroth-order methods" by Ghadimi et al., and "SignSGD via zeroth-order oracle" by Liu et al.
Blockwise Direct Search (MATLAB version)
Sparse Perturbations for Improved Convergence in Stochastic Zeroth-Order Optimization
A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).
PRIMA: Reference Implementation for Powell's methods with Modernization and Amelioration
Federated MeZO fine-tuning for LLMs with adaptive server-side optimizers (FedAdam/W), LoRA, and 4-bit quantization.
Blockwise Direct Search (Octave version)
Blockwise Direct Search (Python version)
Archive of the MATLAB version of Blockwise Direct Search
[NeurIPS 2023] "SODA: Robust Training of Test-Time Data Adaptors"
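A primitive shared by many of the projects above is the two-point zeroth-order gradient estimator: approximate the gradient using only function evaluations along random directions, with no autodiff. A minimal NumPy sketch (function names, step size, and sample counts are illustrative, not taken from any listed repository):

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, n_samples=20, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Averages directional finite differences along random Gaussian
    directions u: (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u.
    Only function evaluations are used, so f can be a black box.
    """
    rng = np.random.default_rng(rng)
    grad = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.size)
        grad += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return grad / n_samples

# Usage: plain gradient descent on a quadratic, driven only by the
# zeroth-order estimate (no analytic gradient is ever computed).
rng = np.random.default_rng(0)
f = lambda x: float(np.sum(x ** 2))
x = np.ones(5)
for _ in range(200):
    x = x - 0.05 * zo_gradient(f, x, rng=rng)
```

Variants in the repositories above differ mainly in how the perturbation directions are chosen (sparse, sign-based, adaptive) and how the estimate feeds into the outer optimizer.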