Scripts · May 12, 2026 · 2 min read

Nevergrad — Gradient-Free Optimization by Meta

Nevergrad is a gradient-free optimization platform from Meta Research providing a unified interface to derivative-free optimizers for hyperparameter tuning, reinforcement learning, and scientific computing.

Introduction

Nevergrad is a Python library from Meta AI Research that provides gradient-free optimization algorithms under a common interface. It is designed for problems where gradients are unavailable or unreliable, such as hyperparameter tuning, reinforcement learning reward shaping, and simulation-based optimization.

What Nevergrad Does

  • Optimizes black-box functions without requiring gradients
  • Provides 30+ optimization algorithms under a unified API
  • Supports continuous, discrete, and mixed search spaces
  • Offers built-in benchmarks for comparing optimizer performance
  • Handles noisy objective functions with appropriate averaging
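As a minimal sketch of the basic workflow (the quadratic objective and one-dimensional search space here are illustrative assumptions, not part of the library):

```python
import nevergrad as ng

def loss(x: float) -> float:
    # Illustrative black-box objective with its minimum at x = 1.0
    return (x - 1.0) ** 2

# NGOpt selects an algorithm automatically; budget caps total evaluations
optimizer = ng.optimizers.NGOpt(parametrization=ng.p.Scalar(), budget=100)
recommendation = optimizer.minimize(loss)
print(recommendation.value)  # should land close to 1.0
```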

Architecture Overview

Nevergrad defines a Parametrization system that describes the search space (scalars, arrays, choices, log-scales). An Optimizer wraps a specific algorithm (CMA-ES, differential evolution, PSO, etc.) and generates candidates via ask/tell. The ask method proposes parameter values; tell reports back the loss. This separation allows asynchronous and parallel evaluation.
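A bare ask/tell loop might look like the sketch below; the choice of CMA-ES and the toy objective are assumptions for illustration.

```python
import nevergrad as ng

def loss(x) -> float:
    # Toy objective over a 3-dimensional array; minimum at all 0.5
    return float(((x - 0.5) ** 2).sum())

optimizer = ng.optimizers.CMA(parametrization=ng.p.Array(shape=(3,)), budget=200)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()                       # propose parameter values
    optimizer.tell(candidate, loss(candidate.value))  # report the observed loss
print(optimizer.provide_recommendation().value)
```

Because ask and tell are decoupled, nothing forces the evaluations to happen in order or in the same process, which is what enables the parallel patterns described later.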

Installation & Configuration

  • Install via pip with no heavy dependencies
  • Define search spaces using ng.p.Scalar, ng.p.Array, or ng.p.Choice
  • Select an optimizer or use NGOpt for automatic algorithm selection
  • Set budget to control the total number of objective evaluations
  • Use num_workers > 1 to evaluate candidates in parallel batches
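Putting those pieces together, a hypothetical hyperparameter search space might be declared as follows; the parameter names, ranges, and stand-in training function are made up for illustration.

```python
import nevergrad as ng

# Hypothetical mixed search space for a training job
param = ng.p.Instrumentation(
    lr=ng.p.Log(lower=1e-5, upper=1e-1),                         # log-scaled scalar
    layers=ng.p.Scalar(lower=1, upper=8).set_integer_casting(),  # integer scalar
    activation=ng.p.Choice(["relu", "tanh", "gelu"]),            # categorical
)

def train(lr: float, layers: int, activation: str) -> float:
    return lr * layers  # stand-in for a real run returning a validation loss

optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
recommendation = optimizer.minimize(train)
print(recommendation.kwargs)  # best keyword arguments found
```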

Key Features

  • NGOpt meta-optimizer automatically picks the best algorithm for your problem
  • Expressive parametrization supporting constraints and transformations
  • Ask/tell interface enables asynchronous and distributed evaluation
  • Built-in benchmarking suite for rigorous optimizer comparison
  • Supports multi-objective optimization via Pareto front tracking
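For the multi-objective case, a run passes a list of losses to tell and reads the result from the Pareto front; the pair of toy losses and the choice of differential evolution below are assumptions for illustration.

```python
import nevergrad as ng

optimizer = ng.optimizers.DE(parametrization=ng.p.Array(shape=(2,)), budget=100)
for _ in range(optimizer.budget):
    cand = optimizer.ask()
    x = cand.value
    # Two competing toy losses: distance to the origin vs. distance to (1, 1)
    optimizer.tell(cand, [float((x ** 2).sum()), float(((x - 1.0) ** 2).sum())])

for p in optimizer.pareto_front():  # non-dominated candidates found so far
    print(p.value)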

Comparison with Similar Tools

  • Optuna — Bayesian optimization with pruning for ML; Nevergrad covers broader optimization use cases beyond ML
  • Hyperopt — TPE-based search for hyperparameters; Nevergrad offers more diverse algorithms
  • scipy.optimize — classical numerical optimization; Nevergrad handles noisy, non-differentiable objectives
  • Ray Tune — orchestrates trials at scale; Nevergrad focuses on the optimization algorithms themselves

FAQ

Q: What is NGOpt? A: NGOpt is a meta-optimizer that selects the best algorithm based on your problem characteristics (budget, dimensionality, noise level).

Q: Can Nevergrad optimize discrete variables? A: Yes. Use ng.p.Choice for categorical variables and ng.p.TransitionChoice for ordered discrete parameters.
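A small sketch of a discrete space (the parameter names and value sets are hypothetical):

```python
import nevergrad as ng

param = ng.p.Instrumentation(
    kernel=ng.p.Choice(["linear", "rbf", "poly"]),    # unordered categorical
    depth=ng.p.TransitionChoice(list(range(1, 11))),  # ordered: mutations move to neighbors
)
optimizer = ng.optimizers.NGOpt(parametrization=param, budget=50)
```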

Q: How do I run evaluations in parallel? A: Set num_workers and call ask() multiple times before calling tell() with results as they arrive.
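One way to arrange this is a batched ask/tell loop over an executor, as in this sketch (the objective is a placeholder); minimize also accepts an executor argument directly if you prefer not to manage the loop yourself.

```python
from concurrent.futures import ThreadPoolExecutor

import nevergrad as ng

def loss(x) -> float:
    return float((x ** 2).sum())  # placeholder objective

optimizer = ng.optimizers.NGOpt(
    parametrization=ng.p.Array(shape=(4,)), budget=80, num_workers=4
)
with ThreadPoolExecutor(max_workers=4) as executor:
    while optimizer.num_ask < optimizer.budget:
        # Ask for a batch, evaluate concurrently, then tell the results back
        candidates = [optimizer.ask() for _ in range(optimizer.num_workers)]
        results = executor.map(loss, [c.value for c in candidates])
        for cand, value in zip(candidates, results):
            optimizer.tell(cand, value)
print(optimizer.provide_recommendation().value)
```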

Q: Does Nevergrad support constraints? A: Yes. Apply register_cheap_constraint to penalize infeasible regions of the search space.
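For example, a cheap constraint attached to the parametrization might look like this sketch; the constraint itself (coordinates must sum to at least zero) is an illustrative assumption.

```python
import nevergrad as ng

param = ng.p.Array(shape=(2,))
# Illustrative constraint: a candidate is feasible iff its coordinates sum to >= 0
param.register_cheap_constraint(lambda x: float(x[0] + x[1]) >= 0)

optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
recommendation = optimizer.minimize(lambda x: float(((x - 1.0) ** 2).sum()))
print(recommendation.value)
```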
