"In this tutorial we introduce Nevergrad, while running a simple Ray Tune experiment. Tune’s Search Algorithms integrate with Nevergrad and, as a result, allow you to seamlessly scale up a Nevergrad optimization process - without sacrificing performance.\n",
"\n",
"Nevergrad provides gradient/derivative-free optimization able to handle noise over the objective landscape, including evolutionary, bandit, and Bayesian optimization algorithms. Nevergrad internally supports search spaces which are continuous, discrete or a mixture of thereof. It also provides a library of functions on which to test the optimization algorithms and compare with other benchmarks.\n",
"\n",
"In this example we minimize a simple objective to briefly demonstrate the usage of Nevergrad with Ray Tune via `NevergradSearch`. It's useful to keep in mind that despite the emphasis on machine learning experiments, Ray Tune optimizes any implicit or explicit objective. Here we assume `nevergrad==0.4.3.post7` library is installed. To learn more, please refer to [Nevergrad website](https://github.com/facebookresearch/nevergrad)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5ab54f85",
"metadata": {
"tags": [
"remove-cell"
]
},
"outputs": [],
"source": [
"# !pip install ray[tune]\n",
"!pip install nevergrad==0.4.3.post7 "
]
},
{
"cell_type": "markdown",
"id": "66cb8206",
"metadata": {},
"source": [
"Click below to see all the imports we need for this example.\n",
"You can also launch directly into a Binder instance to run this notebook yourself.\n",
"Just click on the rocket symbol at the top of the navigation."
"Now we construct the hyperparameter search space using `ConfigSpace`"
]
},
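{
"cell_type": "code",
"execution_count": null,
"id": "f0e1d2c3",
"metadata": {},
"outputs": [],
"source": [
"# Imports for this example (Ray 1.x-era module layout, matching the `tune.run` API used below).\n",
"import time\n",
"\n",
"import nevergrad as ng\n",
"from ray import tune\n",
"from ray.tune.suggest import ConcurrencyLimiter\n",
"from ray.tune.suggest.nevergrad import NevergradSearch"
]
},
{
"cell_type": "markdown",
"id": "b4c5d6e7",
"metadata": {},
"source": [
"Now we define a simple `objective` to minimize and construct the hyperparameter search space as a plain dict of Tune sampling primitives. The cell below is a minimal sketch: the exact `evaluate` formula and the `steps`/`width`/`height` parameters and bounds are illustrative assumptions, chosen so that `objective` reports the \"mean_loss\" metric the experiment minimizes."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a8b9c0d1",
"metadata": {},
"outputs": [],
"source": [
"def evaluate(step, width, height):\n",
"    # Toy, assumed scoring function: improves as `width * step` grows, shifts with `height`.\n",
"    time.sleep(0.1)\n",
"    return (0.1 + width * step / 100) ** (-1) + height * 0.1\n",
"\n",
"\n",
"def objective(config):\n",
"    # Report the score at every step so Tune can track intermediate results.\n",
"    for step in range(config[\"steps\"]):\n",
"        score = evaluate(step, config[\"width\"], config[\"height\"])\n",
"        tune.report(iterations=step, mean_loss=score)\n",
"\n",
"\n",
"search_config = {\n",
"    \"steps\": 100,\n",
"    \"width\": tune.uniform(0, 20),\n",
"    \"height\": tune.uniform(-100, 100),\n",
"}"
]
},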
{
"cell_type": "markdown",
"id": "e2405373",
"metadata": {},
"source": [
"Next we define the search algorithm built from `NevergradSearch`, constrained to a maximum of `4` concurrent trials with a `ConcurrencyLimiter`. Here we use `ng.optimizers.OnePlusOne`, a simple evolutionary algorithm."
"Finally, we run the experiment to `\"min\"`imize the \"mean_loss\" of the `objective` by searching `search_space` via `algo`, `num_samples` times. This previous sentence is fully characterizes the search problem we aim to solve. With this in mind, observe how efficient it is to execute `tune.run()`."
]
},
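{
"cell_type": "code",
"execution_count": null,
"id": "e2f3a4b5",
"metadata": {},
"outputs": [],
"source": [
"# Sketch of the setup described above; the `num_samples` budget is an assumed value,\n",
"# not one prescribed by Nevergrad or Tune. Adjust it to your compute budget.\n",
"algo = NevergradSearch(optimizer=ng.optimizers.OnePlusOne)\n",
"algo = ConcurrencyLimiter(algo, max_concurrent=4)\n",
"num_samples = 1000"
]
},
{
"cell_type": "markdown",
"id": "c6d7e8f9",
"metadata": {},
"source": [
"Finally, we run the experiment to `\"min\"`imize the \"mean_loss\" of the `objective` by searching `search_config` via `algo`, `num_samples` times. The previous sentence fully characterizes the search problem we aim to solve. With this in mind, observe how simple it is to execute the search with `tune.run()`."
]
},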
{
"cell_type": "code",
"execution_count": null,
"id": "769f4368",
"metadata": {},
"outputs": [],
"source": [
"analysis = tune.run(\n",
" objective,\n",
" search_alg=algo,\n",
" metric=\"mean_loss\",\n",
" mode=\"min\",\n",
" name=\"nevergrad_exp\",\n",
" num_samples=num_samples,\n",
" config=search_config,\n",
")"
]
},
{
"cell_type": "markdown",
"id": "950003be",
"metadata": {},
"source": [
"Here are the hyperparamters found to minimize the mean loss of the defined objective."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "0f021674",
"metadata": {
"lines_to_next_cell": 0
},
"outputs": [],
"source": [
"print(\"Best hyperparameters found were: \", analysis.best_config)"
]
},
{
"cell_type": "markdown",
"id": "0d3824ae",
"metadata": {},
"source": [
"## Optional: passing the (hyper)parameter space into the search algorithm\n",
"\n",
"We can also pass the search space into `NevergradSearch` using their designed format."