{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "5a1d28f3",
   "metadata": {},
   "source": [
    "# Running Tune experiments with Nevergrad\n",
    "\n",
    "In this tutorial we introduce Nevergrad while running a simple Ray Tune experiment. Tune’s Search Algorithms integrate with Nevergrad and, as a result, allow you to seamlessly scale up a Nevergrad optimization process without sacrificing performance.\n",
    "\n",
    "Nevergrad provides gradient/derivative-free optimization algorithms able to handle noise over the objective landscape, including evolutionary, bandit, and Bayesian optimization methods. Nevergrad internally supports search spaces that are continuous, discrete, or a mixture thereof. It also provides a library of functions on which to test the optimization algorithms and compare with other benchmarks.\n",
    "\n",
    "In this example we minimize a simple objective to briefly demonstrate the usage of Nevergrad with Ray Tune via `NevergradSearch`. It's useful to keep in mind that despite the emphasis on machine learning experiments, Ray Tune optimizes any implicit or explicit objective. Here we assume the `nevergrad==0.4.3.post7` library is installed. To learn more, please refer to the [Nevergrad website](https://github.com/facebookresearch/nevergrad)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "5ab54f85",
   "metadata": {
    "tags": [
     "remove-cell"
    ]
   },
   "outputs": [],
   "source": [
    "# !pip install ray[tune]\n",
    "!pip install nevergrad==0.4.3.post7"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "66cb8206",
   "metadata": {},
   "source": [
    "Click below to see all the imports we need for this example.\n",
    "You can also launch directly into a Binder instance to run this notebook yourself.\n",
    "Just click on the rocket symbol at the top of the navigation."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "1f6d7a31",
   "metadata": {
    "tags": [
     "hide-input"
    ]
   },
   "outputs": [],
   "source": [
    "import time\n",
    "\n",
    "import ray\n",
    "import nevergrad as ng\n",
    "from ray import tune\n",
    "from ray.tune.suggest import ConcurrencyLimiter\n",
    "from ray.tune.suggest.nevergrad import NevergradSearch"
   ]
  },
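  {
   "cell_type": "markdown",
   "id": "b2c1a9e0",
   "metadata": {},
   "source": [
    "Nevergrad ships many more optimizers than the ones used in this tutorial. As a quick orientation, the sketch below lists a few registered optimizer names; it assumes the `ng.optimizers.registry` mapping exposed by this version of Nevergrad, whose exact contents vary by release."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c3d2b0f1",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Peek at a handful of Nevergrad's registered optimizers.\n",
    "# `ng.optimizers.registry` is assumed from nevergrad 0.4.x; contents vary by version.\n",
    "sorted(ng.optimizers.registry.keys())[:10]"
   ]
  },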
  {
   "cell_type": "markdown",
   "id": "41f2c881",
   "metadata": {},
   "source": [
    "Let's start by defining a simple evaluation function.\n",
    "We artificially sleep for a bit (`0.1` seconds) to simulate a long-running ML experiment.\n",
    "This setup assumes that we're running multiple `step`s of an experiment while tuning three hyperparameters,\n",
    "namely `width`, `height`, and `activation`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "271bd5c5",
   "metadata": {},
   "outputs": [],
   "source": [
    "def evaluate(step, width, height, activation):\n",
    "    time.sleep(0.1)\n",
    "    activation_boost = 10 if activation == \"relu\" else 1\n",
    "    return (0.1 + width * step / 100) ** (-1) + height * 0.1 + activation_boost"
   ]
  },
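  {
   "cell_type": "markdown",
   "id": "d4e3c1a2",
   "metadata": {},
   "source": [
    "To get a feel for this objective before tuning it, we can evaluate it by hand at a couple of points (a quick illustration, not part of the tuning workflow itself): a larger `width` shrinks the first term as `step` grows, `height` contributes linearly, and `\"relu\"` adds a constant boost of `10`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e5f4d2b3",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sanity-check the objective at two hand-picked points.\n",
    "print(evaluate(step=99, width=10.0, height=0.0, activation=\"tanh\"))  # ~1.1\n",
    "print(evaluate(step=99, width=10.0, height=0.0, activation=\"relu\"))  # ~10.1, the relu boost"
   ]
  },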
  {
   "cell_type": "markdown",
   "id": "f060ea83",
   "metadata": {},
   "source": [
    "Next, our `objective` function takes a Tune `config`, evaluates the `score` of your experiment in a training loop,\n",
    "and uses `tune.report` to report the `score` back to Tune."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c71fc423",
   "metadata": {},
   "outputs": [],
   "source": [
    "def objective(config):\n",
    "    for step in range(config[\"steps\"]):\n",
    "        score = evaluate(step, config[\"width\"], config[\"height\"], config[\"activation\"])\n",
    "        tune.report(iterations=step, mean_loss=score)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "619263ee",
   "metadata": {
    "lines_to_next_cell": 0,
    "tags": [
     "remove-cell"
    ]
   },
   "outputs": [],
   "source": [
    "ray.init(configure_logging=False)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "5b7a4b94",
   "metadata": {},
   "source": [
    "Now we construct the remaining pieces of the experiment: the search algorithm and the search space."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e2405373",
   "metadata": {},
   "source": [
    "Next we define the search algorithm built from `NevergradSearch`, constrained to a maximum of `4` concurrent trials with a `ConcurrencyLimiter`. Here we use `ng.optimizers.OnePlusOne`, a simple evolutionary algorithm."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f099b674",
   "metadata": {},
   "outputs": [],
   "source": [
    "algo = NevergradSearch(\n",
    "    optimizer=ng.optimizers.OnePlusOne,\n",
    ")\n",
    "algo = ConcurrencyLimiter(algo, max_concurrent=4)"
   ]
  },
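  {
   "cell_type": "markdown",
   "id": "f6a5e3c4",
   "metadata": {},
   "source": [
    "`OnePlusOne` is only one choice; any optimizer from Nevergrad's registry can be dropped in the same way. For instance, here is a sketch using differential evolution, assuming the `TwoPointsDE` optimizer is available in this Nevergrad version; `de_algo` is only for illustration and is not used below."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a7b6f4d5",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative alternative: a differential-evolution-based search algorithm.\n",
    "# TwoPointsDE is assumed from this nevergrad version; de_algo is unused below.\n",
    "de_algo = NevergradSearch(optimizer=ng.optimizers.TwoPointsDE)\n",
    "de_algo = ConcurrencyLimiter(de_algo, max_concurrent=4)"
   ]
  },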
  {
   "cell_type": "markdown",
   "id": "4bddc4e5",
   "metadata": {},
   "source": [
    "The number of samples is the number of hyperparameter combinations that will be tried out. This Tune run is set to `1000` samples.\n",
    "You can decrease this number if the run takes too long on your machine."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "adb807bc",
   "metadata": {},
   "outputs": [],
   "source": [
    "num_samples = 1000"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "191c7f89",
   "metadata": {
    "tags": [
     "remove-cell"
    ]
   },
   "outputs": [],
   "source": [
    "# If 1000 samples take too long, you can reduce this number.\n",
    "# We override this number here for our smoke tests.\n",
    "num_samples = 10"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a3956381",
   "metadata": {},
   "source": [
    "Next, we define the search space."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "8829ffc5",
   "metadata": {},
   "outputs": [],
   "source": [
    "search_config = {\n",
    "    \"steps\": 100,\n",
    "    \"width\": tune.uniform(0, 20),\n",
    "    \"height\": tune.uniform(-100, 100),\n",
    "    \"activation\": tune.choice([\"relu\", \"tanh\"])\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0b9d051f",
   "metadata": {},
   "source": [
    "Finally, we run the experiment to `\"min\"`imize the \"mean_loss\" of the `objective` by searching `search_config` via `algo`, `num_samples` times. The previous sentence fully characterizes the search problem we aim to solve. With this in mind, observe how efficient it is to execute `tune.run()`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "769f4368",
   "metadata": {},
   "outputs": [],
   "source": [
    "analysis = tune.run(\n",
    "    objective,\n",
    "    search_alg=algo,\n",
    "    metric=\"mean_loss\",\n",
    "    mode=\"min\",\n",
    "    name=\"nevergrad_exp\",\n",
    "    num_samples=num_samples,\n",
    "    config=search_config,\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "950003be",
   "metadata": {},
   "source": [
    "Here are the hyperparameters found to minimize the mean loss of the defined objective."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "0f021674",
   "metadata": {
    "lines_to_next_cell": 0
   },
   "outputs": [],
   "source": [
    "print(\"Best hyperparameters found were: \", analysis.best_config)"
   ]
  },
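  {
   "cell_type": "markdown",
   "id": "b8c7a5e6",
   "metadata": {},
   "source": [
    "Beyond the best config, the returned `analysis` object also exposes the best trial's final metrics. A minimal sketch, assuming the `best_result` attribute of this Ray version's `ExperimentAnalysis`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c9d8b6f7",
   "metadata": {},
   "outputs": [],
   "source": [
    "# The last reported metrics of the best trial\n",
    "# (best_result is assumed from this Ray version's ExperimentAnalysis API).\n",
    "print(\"Best mean loss: \", analysis.best_result[\"mean_loss\"])"
   ]
  },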
  {
   "cell_type": "markdown",
   "id": "0d3824ae",
   "metadata": {},
   "source": [
    "## Optional: passing the (hyper)parameter space into the search algorithm\n",
    "\n",
    "We can also pass the search space into `NevergradSearch` using Nevergrad's own parametrization format."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "89ae7455",
   "metadata": {},
   "outputs": [],
   "source": [
    "space = ng.p.Dict(\n",
    "    width=ng.p.Scalar(lower=0, upper=20),\n",
    "    height=ng.p.Scalar(lower=-100, upper=100),\n",
    "    activation=ng.p.Choice(choices=[\"relu\", \"tanh\"])\n",
    ")"
   ]
  },
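  {
   "cell_type": "markdown",
   "id": "d0e9c7a8",
   "metadata": {},
   "source": [
    "As a quick check of what this space produces, we can draw one random configuration from it. This is a sketch assuming the `sample()` method of Nevergrad's parametrization API; `.value` then yields a plain dict."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e1f0d8b9",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Draw one random configuration from the Nevergrad space\n",
    "# (Parameter.sample() is assumed from nevergrad's parametrization API).\n",
    "space.sample().value"
   ]
  },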
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "e52eeab1",
   "metadata": {},
   "outputs": [],
   "source": [
    "algo = NevergradSearch(\n",
    "    optimizer=ng.optimizers.OnePlusOne,\n",
    "    space=space,\n",
    "    metric=\"mean_loss\",\n",
    "    mode=\"min\"\n",
    ")\n",
    "algo = ConcurrencyLimiter(algo, max_concurrent=4)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "f8926177",
   "metadata": {},
   "source": [
    "Again we run the experiment, this time passing less information via the `config` and instead relying on the search space passed to the `search_alg`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "64f39800",
   "metadata": {},
   "outputs": [],
   "source": [
    "analysis = tune.run(\n",
    "    objective,\n",
    "    search_alg=algo,\n",
    "    # metric=\"mean_loss\",  # already set on the NevergradSearch above\n",
    "    # mode=\"min\",\n",
    "    name=\"nevergrad_exp\",\n",
    "    num_samples=num_samples,\n",
    "    config={\"steps\": 100},\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "0478c1ea",
   "metadata": {
    "tags": [
     "remove-cell"
    ]
   },
   "outputs": [],
   "source": [
    "ray.shutdown()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "orphan": true
 },
 "nbformat": 4,
 "nbformat_minor": 5
}