51 results for “topic:derivative-free-optimization”
Distributed GPU-Accelerated Framework for Evolutionary Computation. Comprehensive Library of Evolutionary Algorithms & Benchmark Problems.
Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
PRIMA is a package for solving general nonlinear optimization problems without using derivatives. It provides the reference implementation for Powell's derivative-free optimization methods, i.e., COBYLA, UOBYQA, NEWUOA, BOBYQA, and LINCOA. PRIMA stands for Reference Implementation for Powell's methods with Modernization and Amelioration, with the P for Powell.
[JMLR (CCF-A)] PyPop7: A Pure-PYthon LibrarY for POPulation-based Black-Box Optimization (BBO), especially *Large-Scale* algorithm variants (from evolutionary computation, swarm intelligence, statistics, operations research, machine learning, mathematical optimization, meta-heuristics, auto-control etc.). [https://jmlr.org/papers/v25/23-0386.html]
NOMAD - Blackbox optimization software
Elo ratings for global black box derivative-free optimizers
Toolbox for gradient-based and derivative-free non-convex constrained optimization with continuous and/or discrete variables.
Powell's Derivative-Free Optimization solvers.
Sampling-based Model Predictive Control package for model-based RL research
A simple implementation of SPSA with automatic learning rate tuning (see the SPSA sketch after this listing)
A Julia implementation of the CMA Evolution Strategy (CMA-ES) for derivative-free optimization of potentially non-linear, non-convex, or noisy functions over continuous domains.
COBYLA optimizer for Rust
Numerical illustration of a novel analysis framework for consensus-based optimization (CBO), with numerical experiments demonstrating the practicability of the method (see the CBO sketch after this listing)
Python library for root-finding in one dimension
Modular optimization library for PyTorch (work-in-progress).
Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
Benchmarking optimization solvers.
Surrogate model library for Derivative-Free Optimization
OMADS - A blackbox optimization Python package
Numerical analysis of Particle Swarm Optimization (PSO), with numerical experiments demonstrating the practicability of the method (see the PSO sketch after this listing)
This repository contains the official PyTorch implementation of the paper: "Learning Discrete Structured VAE using NES".
A consensus-based optimization method for saddle-point problems (CBO-SP)
Blockwise Direct Search (MATLAB version)
Optimization software by Professor M. J. D. Powell
A pure-MATLAB library for POPulation-based Large-Scale Black-Box Optimization (pop-lsbbo).
Myopic and non-myopic Global Optimization via IDW and RBF surrogate models
The Catla Project
A pure-MATLAB library of EVolutionary (population-based) OPTimization for Large-Scale black-box continuous Optimization (evopt-lso).
Python code for running the numerical experiments in the paper "Neural Network Accelerated Implicit Filtering: Integrating Neural Network Surrogates With Provably Convergent Derivative Free Optimization Methods" by Brian Irwin, Eldad Haber, Raviv Gal, and Avi Ziv.
Serial and Parallel Codes for the Global Optimization Algorithm DIRECT
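
Several entries above name specific zeroth-order algorithms. For orientation, here is a minimal SPSA sketch in Python; it is not taken from any repository listed above, and the gain sequences, constants, and test function are illustrative assumptions (Spall-style defaults).

```python
# Minimal SPSA sketch (illustrative only, not the listed repository's code):
# estimate the gradient from two function evaluations along a random
# simultaneous perturbation, then take a gradient step.
import numpy as np

def spsa_minimize(f, x0, iters=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        ak = a / k ** alpha                               # step-size gain (assumed constants)
        ck = c / k ** gamma                               # perturbation gain (assumed constants)
        delta = rng.choice([-1.0, 1.0], size=x.shape)     # Rademacher perturbation
        # two-evaluation gradient estimate: same scalar difference divided
        # component-wise by the perturbation
        g_hat = (f(x + ck * delta) - f(x - ck * delta)) / (2 * ck * delta)
        x = x - ak * g_hat
    return x

# Usage: minimize a simple quadratic without derivatives.
sol = spsa_minimize(lambda x: np.sum((x - 3.0) ** 2), x0=np.zeros(5))
```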
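
Likewise, a minimal consensus-based optimization (CBO) sketch, assuming the standard drift-toward-consensus update with component-wise (anisotropic) exploration noise; the particle count, time step, and temperature parameter are illustrative assumptions, not values from the listed repositories.

```python
# Minimal CBO sketch (illustrative only): particles drift toward a Gibbs-weighted
# consensus point and explore with noise scaled by their distance to it.
import numpy as np

def cbo_minimize(f, n_particles=100, dim=2, iters=500, dt=0.01,
                 lam=1.0, sigma=1.0, alpha=50.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-3.0, 3.0, size=(n_particles, dim))     # initial ensemble
    for _ in range(iters):
        fx = np.apply_along_axis(f, 1, X)
        w = np.exp(-alpha * (fx - fx.min()))                 # Gibbs weights (shifted for stability)
        x_cons = (w[:, None] * X).sum(axis=0) / w.sum()      # weighted consensus point
        diff = X - x_cons
        noise = rng.standard_normal(X.shape)
        # drift toward consensus plus anisotropic diffusion
        X = X - lam * dt * diff + sigma * np.sqrt(dt) * diff * noise
    return x_cons

# Usage: locate the minimizer of a shifted quadratic.
x_star = cbo_minimize(lambda x: np.sum(x ** 2) + 1.0)
```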
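
Finally, a minimal particle swarm optimization (PSO) sketch with the standard inertia, cognitive, and social terms; again, the coefficients and bounds are illustrative assumptions rather than code from the entries above.

```python
# Minimal PSO sketch (illustrative only): each particle blends its momentum with
# attraction to its personal best and the swarm's global best.
import numpy as np

def pso_minimize(f, dim=2, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_particles, dim))         # positions
    V = np.zeros_like(X)                                      # velocities
    pbest = X.copy()                                          # personal bests
    pbest_val = np.apply_along_axis(f, 1, X)
    g = pbest[pbest_val.argmin()].copy()                      # global best
    for _ in range(iters):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)                            # keep particles in the box
        vals = np.apply_along_axis(f, 1, X)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = X[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Usage: minimize the sphere function.
best_x, best_f = pso_minimize(lambda x: np.sum(x ** 2))
```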