.. _tune-grid-random:

Grid/Random Search
==================

Overview
--------

Tune has a native interface for specifying a grid search or random search.

You can specify the search space via ``tune.run(config=...)``.

To do so, you can either use the ``tune.grid_search`` primitive to specify an axis of a grid search...

.. code-block:: python

    tune.run(
        trainable,
        config={"bar": tune.grid_search([True, False])})

... or one of the random sampling primitives to specify distributions (:ref:`tune-sample-docs`):

.. code-block:: python

    tune.run(
        trainable,
        config={
            "param1": tune.choice([True, False]),
            "bar": tune.uniform(0, 10),
            "alpha": tune.sample_from(lambda _: np.random.uniform(100) ** 2),
            "const": "hello"  # It is also ok to specify constant values.
        })

.. caution:: If you use a Search Algorithm, you may not be able to specify lambdas or grid search with this
    interface, as the search algorithm may require a different search space declaration.

To sample multiple times (run multiple trials), specify ``tune.run(num_samples=N)``. If ``grid_search`` is provided as an argument, the *same* grid will be repeated ``N`` times.

.. code-block:: python

    # 13 different configs.
    tune.run(trainable, num_samples=13, config={
        "x": tune.choice([0, 1, 2]),
        }
    )

    # 13 different configs.
    tune.run(trainable, num_samples=13, config={
        "x": tune.choice([0, 1, 2]),
        "y": tune.randn(0, 1),
        }
    )

    # 4 different configs.
    tune.run(trainable, config={"x": tune.grid_search([1, 2, 3, 4])}, num_samples=1)

    # 3 different configs.
    tune.run(trainable, config={"x": tune.grid_search([1, 2, 3])}, num_samples=1)

    # 6 different configs.
    tune.run(trainable, config={"x": tune.grid_search([1, 2, 3])}, num_samples=2)

    # 9 different configs.
    tune.run(trainable, num_samples=1, config={
        "x": tune.grid_search([1, 2, 3]),
        "y": tune.grid_search([a, b, c])}
    )

    # 18 different configs.
    tune.run(trainable, num_samples=2, config={
        "x": tune.grid_search([1, 2, 3]),
        "y": tune.grid_search([a, b, c])}
    )

    # 45 different configs.
    tune.run(trainable, num_samples=5, config={
        "x": tune.grid_search([1, 2, 3]),
        "y": tune.grid_search([a, b, c])}
    )

Note that grid search and random search primitives are interoperable. Each can be used independently or in combination with the other.

.. code-block:: python

    # 6 different configs.
    tune.run(trainable, num_samples=2, config={
        "x": tune.sample_from(...),
        "y": tune.grid_search([a, b, c])
        }
    )

In the example below, ``num_samples=10`` repeats the 3x3 grid search 10 times, for a total of 90 trials, each with randomly sampled values of ``alpha`` and ``beta``.

.. code-block:: python
    :emphasize-lines: 12

    tune.run(
        my_trainable,
        name="my_trainable",
        # num_samples will repeat the entire config 10 times.
        num_samples=10,
        config={
            # ``sample_from`` creates a generator to call the lambda once per trial.
            "alpha": tune.sample_from(lambda spec: np.random.uniform(100)),
            # ``sample_from`` also supports "conditional search spaces".
            "beta": tune.sample_from(lambda spec: spec.config.alpha * np.random.normal()),
            "nn_layers": [
                # tune.grid_search will make it so that all values are evaluated.
                tune.grid_search([16, 64, 256]),
                tune.grid_search([16, 64, 256]),
            ],
        },
    )

Custom/Conditional Search Spaces
--------------------------------

You'll often run into awkward search spaces (e.g., when one hyperparameter depends on another). Use ``tune.sample_from(func)`` to provide a **custom** callable function for generating a search space.

The parameter ``func`` should take in a ``spec`` object, which has a ``config`` namespace from which you can access other hyperparameters. This is useful for conditional distributions:
.. code-block:: python

    tune.run(
        ...,
        config={
            # A random function
            "alpha": tune.sample_from(lambda _: np.random.uniform(100)),
            # Use the `spec.config` namespace to access other hyperparameters.
            "beta": tune.sample_from(lambda spec: spec.config.alpha * np.random.normal())
        }
    )

Here's an example showing a grid search over two nested parameters combined with random sampling from two lambda functions, generating 9 different trials. Note that the value of ``beta`` depends on the value of ``alpha``, which is represented by referencing ``spec.config.alpha`` in the lambda function. This lets you specify conditional parameter distributions.

.. code-block:: python
    :emphasize-lines: 4-11

    tune.run(
        my_trainable,
        name="my_trainable",
        config={
            "alpha": tune.sample_from(lambda spec: np.random.uniform(100)),
            "beta": tune.sample_from(lambda spec: spec.config.alpha * np.random.normal()),
            "nn_layers": [
                tune.grid_search([16, 64, 256]),
                tune.grid_search([16, 64, 256]),
            ],
        }
    )

.. _tune-sample-docs:

Random Distributions API
------------------------

tune.randn
~~~~~~~~~~

.. autofunction:: ray.tune.randn

tune.loguniform
~~~~~~~~~~~~~~~

.. autofunction:: ray.tune.loguniform

tune.uniform
~~~~~~~~~~~~

.. autofunction:: ray.tune.uniform

tune.choice
~~~~~~~~~~~

.. autofunction:: ray.tune.choice

tune.sample_from
~~~~~~~~~~~~~~~~

.. autoclass:: ray.tune.sample_from

Grid Search API
---------------

.. autofunction:: ray.tune.grid_search

Internals
---------

BasicVariantGenerator
~~~~~~~~~~~~~~~~~~~~~

.. autoclass:: ray.tune.suggest.BasicVariantGenerator
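``BasicVariantGenerator`` is what resolves ``grid_search`` values and random samples into concrete trials. The trial-count rule it follows is the one described above: the cross-product of all ``grid_search`` axes, repeated ``num_samples`` times. As a rough, Tune-free sketch of that arithmetic (the ``count_trials`` helper below is illustrative only, not a Tune API; it models top-level grid axes and ignores nested ones):

```python
def count_trials(config, num_samples=1):
    """Model Tune's trial count for a flat search space.

    Each ``tune.grid_search([...])`` contributes one grid axis (it
    resolves to a dict of the form ``{"grid_search": [...]}``); random
    distributions are sampled once per trial and add no axes.
    """
    grid = 1
    for value in config.values():
        if isinstance(value, dict) and "grid_search" in value:
            grid *= len(value["grid_search"])
    # The same grid is repeated ``num_samples`` times.
    return num_samples * grid

# Mirrors the commented counts in the examples above:
assert count_trials({"x": {"grid_search": [1, 2, 3, 4]}}, num_samples=1) == 4
assert count_trials({"x": {"grid_search": [1, 2, 3]},
                     "y": {"grid_search": ["a", "b", "c"]}}, num_samples=2) == 18
```

A purely random space (no grid axes) yields exactly ``num_samples`` trials, which is why ``num_samples=13`` with only ``tune.choice`` produces 13 configs.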