.. _tune-index:

Tune: Scalable Hyperparameter Tuning
====================================

.. image:: images/tune.png
    :scale: 30%
    :align: center

Tune is a Python library for experiment execution and hyperparameter tuning at any scale. Core features:

* Launch a multi-node :ref:`distributed hyperparameter sweep <tune-distributed>` in less than 10 lines of code.
* Supports any machine learning framework, :ref:`including PyTorch, XGBoost, MXNet, and Keras <tune-guides-overview>`.
* Automatically manages :ref:`checkpoints <tune-checkpoint>` and logging to :ref:`TensorBoard <tune-logging>`.
* Choose among state-of-the-art algorithms such as :ref:`Population Based Training (PBT) <tune-scheduler-pbt>`, :ref:`BayesOptSearch <bayesopt>`, and :ref:`HyperBand/ASHA <tune-scheduler-hyperband>` (see the sketch below this list).
* Move your models from training to serving on the same infrastructure with `Ray Serve`_.

.. _`Ray Serve`: rayserve/overview.html
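
To give a concrete feel for how these pieces fit together, here is a minimal, hypothetical sketch that hands an ASHA scheduler to ``tune.run``. The ``objective`` function and its made-up metric are placeholders for a real training loop, not code taken from the Tune examples:

.. code-block:: python

    from ray import tune
    from ray.tune.schedulers import ASHAScheduler


    def objective(config):
        # Placeholder objective: stands in for a real training loop.
        for step in range(100):
            score = config["lr"] * step  # pretend this is a validation metric
            tune.report(mean_accuracy=score)  # send the result back to Tune


    analysis = tune.run(
        objective,
        config={"lr": tune.loguniform(1e-4, 1e-1)},  # sample the learning rate
        num_samples=20,                              # launch 20 trials
        # ASHA stops poorly performing trials early.
        scheduler=ASHAScheduler(metric="mean_accuracy", mode="max"),
    )

Swapping in a different trial scheduler or search algorithm is a matter of changing the corresponding argument to ``tune.run``.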

**Want to get started?** Head over to the :ref:`60 second Tune tutorial <tune-60-seconds>`.

Quick Start
-----------

To run this example, install the following: ``pip install 'ray[tune]' torch torchvision``.

This example runs a small grid search to train a convolutional neural network using PyTorch and Tune.
2018-03-19 12:55:10 -07:00
2019-08-02 09:17:20 -07:00
.. literalinclude :: ../../python/ray/tune/tests/example.py
:language: python
:start-after: __quick_start_begin__
:end-before: __quick_start_end__
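
The included example follows the same shape as the minimal, hypothetical sketch below: a training function reports a metric back to Tune, and ``tune.run`` launches one trial per point in the grid. The ``train_fn`` body and its metric are placeholders, not the example's actual PyTorch code:

.. code-block:: python

    from ray import tune


    def train_fn(config):
        # Placeholder for the example's convolutional network training loop.
        for epoch in range(10):
            accuracy = config["lr"] * (epoch + 1)  # pretend validation accuracy
            tune.report(mean_accuracy=accuracy)


    # Grid search: Tune launches one trial per (lr, momentum) combination.
    analysis = tune.run(
        train_fn,
        config={
            "lr": tune.grid_search([0.001, 0.01, 0.1]),
            "momentum": tune.grid_search([0.8, 0.9]),
        },
    )

    print("Best config:", analysis.get_best_config(metric="mean_accuracy", mode="max"))

By default, trial results land in ``~/ray_results``, which is the directory the TensorBoard command below points at.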

If TensorBoard is installed, you can automatically visualize all trial results:

.. code-block:: bash

    tensorboard --logdir ~/ray_results

.. image:: images/tune-start-tb.png
    :scale: 30%
    :align: center

If you are using TF2 and TensorBoard, Tune also automatically generates TensorBoard HParams output:

.. image:: images/tune-hparams-coord.png
    :scale: 20%
    :align: center

.. tip:: Join the `Ray community Slack <https://forms.gle/9TSdDYUgxYs8SA9e8>`_ to discuss Ray Tune (and other Ray libraries)!

Why choose Tune?
----------------

There are many other hyperparameter optimization libraries out there. If you're new to Tune, you're probably wondering, "what makes Tune different?"

.. include:: tune/why_tune.rst

Reference Materials
-------------------

Here are some reference materials for Tune:

* :ref:`Tune Tutorials, Guides, and Examples <tune-guides-overview>`
* `Code <https://github.com/ray-project/ray/tree/master/python/ray/tune>`__: GitHub repository for Tune

Below are some blog posts and talks about Tune:

- [blog] `Tune: a Python library for fast hyperparameter tuning at any scale <https://towardsdatascience.com/fast-hyperparameter-tuning-at-scale-d428223b081c>`_
- [blog] `Cutting edge hyperparameter tuning with Ray Tune <https://medium.com/riselab/cutting-edge-hyperparameter-tuning-with-ray-tune-be6c0447afdf>`_
- [blog] `Simple hyperparameter and architecture search in tensorflow with Ray Tune <http://louiskirsch.com/ai/ray-tune>`_
- [slides] `Talk given at RISECamp 2019 <https://docs.google.com/presentation/d/1v3IldXWrFNMK-vuONlSdEuM82fuGTrNUDuwtfx4axsQ/edit?usp=sharing>`_
- [video] `Talk given at RISECamp 2018 <https://www.youtube.com/watch?v=38Yd_dXW51Q>`_
- [video] `A Guide to Modern Hyperparameter Optimization (PyData LA 2019) <https://www.youtube.com/watch?v=10uz5U3Gy6E>`_ (`slides <https://speakerdeck.com/richardliaw/a-modern-guide-to-hyperparameter-optimization>`_)

Citing Tune
-----------

If Tune helps you in your academic research, you are encouraged to cite `our paper <https://arxiv.org/abs/1807.05118>`__. Here is an example BibTeX entry:

.. code-block:: tex
2018-01-24 13:45:10 -08:00
2018-08-19 11:00:55 -07:00
@article{liaw2018tune,
title={Tune: A Research Platform for Distributed Model Selection and Training},
author={Liaw, Richard and Liang, Eric and Nishihara, Robert
and Moritz, Philipp and Gonzalez, Joseph E and Stoica, Ion},
journal={arXiv preprint arXiv:1807.05118},
year={2018}
}