"const": "hello" # It is also ok to specify constant values.
})

.. caution:: If you use a Search Algorithm, you may not be able to specify lambdas or grid search with this
   interface, as the search algorithm may require a different search space declaration.
To sample multiple times and run multiple trials, pass ``num_samples=N`` to ``tune.run``. If ``grid_search`` is provided as an argument, the *same* grid will be repeated ``N`` times.
Note that the grid search and random search primitives are interoperable: each can be used independently or in combination with the other.

.. code-block:: python

    # 6 different configs: 2 random samples x 3 grid values.
    tune.run(trainable, num_samples=2, config={
        "x": tune.sample_from(...),
        "y": tune.grid_search([a, b, c])
    })
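
Each primitive also works on its own. A brief sketch (reusing ``trainable`` from above; the grid values and the lambda are illustrative, and ``numpy`` is assumed to be imported as ``np``):

.. code-block:: python

    # 3 different configs: the grid alone, with the default num_samples=1.
    tune.run(trainable, config={
        "y": tune.grid_search([1, 2, 3])
    })

    # 2 different configs: random sampling alone, repeated twice.
    tune.run(trainable, num_samples=2, config={
        "x": tune.sample_from(lambda spec: np.random.uniform(0, 1))
    })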
In the example below, ``num_samples=10`` repeats the 3x3 grid search 10 times, for a total of 90 trials, each with randomly sampled values of ``alpha`` and ``beta``.

.. code-block:: python
   :emphasize-lines: 5

    tune.run(
        my_trainable,
        name="my_trainable",
        # num_samples will repeat the entire config 10 times.
        num_samples=10,
        config={
            # ``sample_from`` creates a generator to call the lambda once per trial.
            "alpha": tune.sample_from(lambda spec: np.random.uniform(0, 100)),
            "beta": tune.sample_from(lambda spec: spec.config.alpha * np.random.normal()),
            "nn_layers": [  # Illustrative 3x3 grid -> 9 combinations per sample.
                tune.grid_search([16, 64, 256]),
                tune.grid_search([16, 64, 256]),
            ],
        })
You'll often run into awkward search spaces (e.g., when one hyperparameter depends on another). Use ``tune.sample_from(func)`` to provide a **custom** callable function for generating a search space.
The parameter ``func`` should take in a ``spec`` object, which has a ``config`` namespace from which you can access other hyperparameters. This is useful for conditional distributions:
Here's an example showing a grid search over two nested parameters combined with random sampling from two lambda functions, generating 9 different trials. Note that the value of ``beta`` depends on the value of ``alpha``, which is represented by referencing ``spec.config.alpha`` in the lambda function. This lets you specify conditional parameter distributions.
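
A sketch of that example (the grid values are illustrative, and ``numpy`` is assumed to be imported as ``np``):

.. code-block:: python

    tune.run(
        my_trainable,
        config={
            # ``alpha`` is drawn uniformly from [0, 100) once per trial.
            "alpha": tune.sample_from(lambda spec: np.random.uniform(0, 100)),
            # ``beta`` depends on ``alpha`` via the ``spec.config`` namespace.
            "beta": tune.sample_from(lambda spec: spec.config.alpha * np.random.normal()),
            # A 3x3 grid over two nested parameters; with the default
            # num_samples=1 this generates 9 trials.
            "nn_layers": [
                tune.grid_search([16, 64, 256]),
                tune.grid_search([16, 64, 256]),
            ],
        })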