# Nevergrad — Gradient-Free Optimization by Meta

> Nevergrad is a gradient-free optimization platform from Meta Research providing a unified interface to derivative-free optimizers for hyperparameter tuning, reinforcement learning, and scientific computing.

## Install

```bash
pip install nevergrad
```

## Quick Use

```bash
python -c "
import nevergrad as ng

def sphere(x):
    return sum(xi**2 for xi in x)

optimizer = ng.optimizers.NGOpt(parametrization=ng.p.Array(shape=(3,)), budget=100)
recommendation = optimizer.minimize(sphere)
print(recommendation.value)
"
```

## Introduction

Nevergrad is a Python library from Meta AI Research that provides gradient-free optimization algorithms under a common interface. It is designed for problems where gradients are unavailable or unreliable, such as hyperparameter tuning, reinforcement learning reward shaping, and simulation-based optimization.

## What Nevergrad Does

- Optimizes black-box functions without requiring gradients
- Provides 30+ optimization algorithms under a unified API
- Supports continuous, discrete, and mixed search spaces
- Offers built-in benchmarks for comparing optimizer performance
- Handles noisy objective functions with appropriate averaging

## Architecture Overview

Nevergrad defines a parametrization system that describes the search space (scalars, arrays, choices, log scales). An optimizer wraps a specific algorithm (CMA-ES, differential evolution, PSO, etc.) and generates candidates via ask/tell: ask proposes parameter values, and tell reports the resulting loss back. This separation allows asynchronous and parallel evaluation (see the ask/tell sketch under Examples below).

## Self-Hosting & Configuration

- Install via pip with no heavy dependencies
- Define search spaces using ng.p.Scalar, ng.p.Array, or ng.p.Choice (see the parametrization sketch under Examples below)
- Select an optimizer explicitly, or use NGOpt for automatic algorithm selection
- Set budget to control the total number of objective evaluations
- Use num_workers > 1 to evaluate candidates in parallel batches

## Key Features

- NGOpt meta-optimizer automatically picks the best algorithm for your problem
- Expressive parametrization supporting constraints and transformations
- Ask/tell interface enables asynchronous and distributed evaluation
- Built-in benchmarking suite for rigorous optimizer comparison
- Supports multi-objective optimization via Pareto front tracking

## Comparison with Similar Tools

- **Optuna** — Bayesian optimization with pruning for ML; Nevergrad covers broader optimization use cases beyond ML
- **Hyperopt** — TPE-based search for hyperparameters; Nevergrad offers more diverse algorithms
- **scipy.optimize** — classical numerical optimization; Nevergrad handles noisy, non-differentiable objectives
- **Ray Tune** — orchestrates trials at scale; Nevergrad focuses on the optimization algorithms themselves

## FAQ

**Q: What is NGOpt?**
A: NGOpt is a meta-optimizer that selects an algorithm based on your problem characteristics (budget, dimensionality, noise level).

**Q: Can Nevergrad optimize discrete variables?**
A: Yes. Use ng.p.Choice for categorical variables and ng.p.TransitionChoice for ordered discrete parameters.

**Q: How do I run evaluations in parallel?**
A: Set num_workers and call ask() multiple times before calling tell() with results as they arrive; minimize() can also dispatch to an executor (see the parallel sketch under Examples below).

**Q: Does Nevergrad support constraints?**
A: Yes. Register a cheap constraint on the parametrization via register_cheap_constraint to rule out infeasible regions of the search space (see the constraint sketch under Examples below).
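## Examples

The ask/tell flow described under Architecture Overview can be sketched as follows. This is a minimal illustration, assuming a simple quadratic objective and the registered TwoPointsDE optimizer; any registered optimizer exposes the same interface.

```python
import nevergrad as ng

def loss(x):
    # Simple quadratic black-box objective; no gradients are computed.
    return sum((xi - 0.5) ** 2 for xi in x)

param = ng.p.Array(shape=(2,))
optimizer = ng.optimizers.TwoPointsDE(parametrization=param, budget=100)

for _ in range(optimizer.budget):
    candidate = optimizer.ask()       # optimizer proposes parameter values
    value = loss(candidate.value)     # evaluate the objective on the proposal
    optimizer.tell(candidate, value)  # report the loss back to the optimizer

recommendation = optimizer.provide_recommendation()
print(recommendation.value)
```

Because ask and tell are decoupled, several candidates can be requested before any losses are reported, which is what enables asynchronous and distributed evaluation.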
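The configuration bullets above translate into a parametrization object. The sketch below assumes a hypothetical train function standing in for an expensive black-box run, and shows a mixed search space combining a log-scaled learning rate, an integer batch size, and a categorical activation:

```python
import nevergrad as ng

def train(learning_rate, batch_size, activation):
    # Hypothetical stand-in for a training run that returns a loss.
    return (learning_rate - 0.01) ** 2 + batch_size / 1000.0

param = ng.p.Instrumentation(
    learning_rate=ng.p.Log(lower=1e-5, upper=1e-1),                    # log scale
    batch_size=ng.p.Scalar(lower=8, upper=256).set_integer_casting(),  # integer
    activation=ng.p.Choice(["relu", "tanh", "gelu"]),                  # categorical
)

optimizer = ng.optimizers.NGOpt(parametrization=param, budget=200)
recommendation = optimizer.minimize(train)  # called as train(*args, **kwargs)
print(recommendation.kwargs)
```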
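For parallel evaluation, minimize can dispatch candidates to a concurrent.futures executor. A minimal sketch, assuming the objective is expensive enough that four process workers are worthwhile:

```python
from concurrent import futures

import nevergrad as ng

def loss(x):
    return sum((xi - 0.5) ** 2 for xi in x)

optimizer = ng.optimizers.NGOpt(
    parametrization=ng.p.Array(shape=(4,)), budget=400, num_workers=4
)

if __name__ == "__main__":
    # num_workers candidates are in flight at once; batch_mode=False keeps the
    # pool saturated by submitting a new candidate as each result arrives.
    with futures.ProcessPoolExecutor(max_workers=optimizer.num_workers) as executor:
        recommendation = optimizer.minimize(loss, executor=executor, batch_mode=False)
    print(recommendation.value)
```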
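Constraints attach to the parametrization rather than the optimizer. A minimal sketch of register_cheap_constraint, assuming an arbitrary feasibility rule that the two components must sum to at most 1:

```python
import nevergrad as ng

param = ng.p.Array(shape=(2,))
# The constraint function receives the candidate's value and returns True
# when the candidate is feasible.
param.register_cheap_constraint(lambda x: x[0] + x[1] <= 1)

optimizer = ng.optimizers.NGOpt(parametrization=param, budget=100)
recommendation = optimizer.minimize(lambda x: sum((x - 0.25) ** 2))
print(recommendation.value)
```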
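Multi-objective optimization, mentioned under Key Features, works by returning a list of losses from the objective; the optimizer then tracks a Pareto front of non-dominated candidates. A minimal sketch, assuming two conflicting quadratic objectives:

```python
import numpy as np

import nevergrad as ng

def two_losses(x):
    # Two conflicting objectives: stay near 0 and stay near 1.
    return [float(np.sum(x ** 2)), float(np.sum((x - 1.0) ** 2))]

optimizer = ng.optimizers.NGOpt(parametrization=ng.p.Array(shape=(2,)), budget=100)
optimizer.minimize(two_losses)

# Inspect the set of non-dominated candidates seen so far.
for candidate in optimizer.pareto_front():
    print(candidate.value, candidate.losses)
```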
## Sources

- https://github.com/facebookresearch/nevergrad
- https://facebookresearch.github.io/nevergrad/