.. _tune-index:

Tune: Scalable Hyperparameter Tuning
====================================

.. image:: images/tune.png
    :scale: 30%
    :align: center

Tune is a Python library for experiment execution and hyperparameter tuning at any scale. Core features:

* Launch a multi-node :ref:`distributed hyperparameter sweep ` in less than 10 lines of code.
* Supports any machine learning framework, :ref:`including PyTorch, XGBoost, MXNet, and Keras`.
* Automatically manages :ref:`checkpoints ` and logging to :ref:`TensorBoard `.
* Choose among state-of-the-art algorithms such as :ref:`Population Based Training (PBT) `, :ref:`BayesOptSearch `, and :ref:`HyperBand/ASHA `.
* Move your models from training to serving on the same infrastructure with `Ray Serve`_.

.. _`Ray Serve`: serve/index.html

**Want to get started?** Head over to the :ref:`60 second Tune tutorial `.

.. tip:: Join the `Ray community slack `_ to discuss Ray Tune (and other Ray libraries)!

Quick Start
-----------

To run this example, install the following: ``pip install 'ray[tune]' torch torchvision``.

This example runs a small grid search to train a convolutional neural network using PyTorch and Tune.

.. literalinclude:: ../../python/ray/tune/tests/example.py
   :language: python
   :start-after: __quick_start_begin__
   :end-before: __quick_start_end__

If TensorBoard is installed, automatically visualize all trial results:

.. code-block:: bash

    tensorboard --logdir ~/ray_results

.. image:: images/tune-start-tb.png
    :scale: 30%
    :align: center

If using TF2 and TensorBoard, Tune will also automatically generate TensorBoard HParams output:

.. image:: images/tune-hparams-coord.png
    :scale: 20%
    :align: center

Why choose Tune?
----------------

There are many other hyperparameter optimization libraries out there. If you're new to Tune, you're probably wondering, "what makes Tune different?"

.. include:: tune/why_tune.rst
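For readers who want to see what the grid search in the Quick Start does conceptually, here is a minimal, framework-free sketch in plain Python. It requires no Ray installation; the ``objective`` function is a hypothetical stand-in for training a model and returning a validation score, and the search-space values are illustrative only:

```python
import itertools

# Hypothetical objective: in a real Tune run this would train a model
# and report a validation metric. Here it is an analytic stand-in
# whose minimum is at lr=0.01, momentum=0.9.
def objective(lr, momentum):
    return (lr - 0.01) ** 2 + (momentum - 0.9) ** 2

# A grid search enumerates every combination of the search space,
# evaluates each combination as one "trial", and keeps the best config.
search_space = {
    "lr": [0.001, 0.01, 0.1],
    "momentum": [0.8, 0.9, 0.99],
}

def grid_search(space, objective):
    keys = list(space)
    best_config, best_score = None, float("inf")
    for values in itertools.product(*(space[k] for k in keys)):
        config = dict(zip(keys, values))
        score = objective(**config)
        if score < best_score:
            best_config, best_score = config, score
    return best_config, best_score

best_config, best_score = grid_search(search_space, objective)
print(best_config)  # {'lr': 0.01, 'momentum': 0.9}
```

What Tune adds on top of this core loop is everything the sketch omits: running trials in parallel across a cluster, early-stopping schedulers such as ASHA, checkpointing, and TensorBoard logging.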
Reference Materials
-------------------

Here are some reference materials for Tune:

* :ref:`Tune Tutorials, Guides, and Examples `
* `Code `__: GitHub repository for Tune

Below are some blog posts and talks about Tune:

- [blog] `Tune: a Python library for fast hyperparameter tuning at any scale `_
- [blog] `Cutting edge hyperparameter tuning with Ray Tune `_
- [blog] `Simple hyperparameter and architecture search in tensorflow with Ray Tune `_
- [slides] `Talk given at RISECamp 2019 `_
- [video] `Talk given at RISECamp 2018 `_
- [video] `A Guide to Modern Hyperparameter Optimization (PyData LA 2019) `_ (`slides `_)

Citing Tune
-----------

If Tune helps you in your academic research, you are encouraged to cite `our paper `__. Here is an example bibtex:

.. code-block:: tex

    @article{liaw2018tune,
        title={Tune: A Research Platform for Distributed Model Selection and Training},
        author={Liaw, Richard and Liang, Eric and Nishihara, Robert and
                Moritz, Philipp and Gonzalez, Joseph E and Stoica, Ion},
        journal={arXiv preprint arXiv:1807.05118},
        year={2018}
    }