Tune: Scalable Hyperparameter Tuning
====================================

.. image:: images/tune.png
    :scale: 30%
    :align: center

Tune is a Python library for experiment execution and hyperparameter tuning at any scale. Core features:

* Launch a multi-node :ref:`distributed hyperparameter sweep ` in less than 10 lines of code.
* Supports any machine learning framework, including PyTorch, XGBoost, MXNet, and Keras. See `examples here `_.
* Natively `integrates with optimization libraries `_ such as `HyperOpt `_, `Bayesian Optimization `_, and `Facebook Ax `_.
* Choose among `scalable algorithms `_ such as `Population Based Training (PBT)`_, `Vizier's Median Stopping Rule`_, and `HyperBand/ASHA`_.
* Visualize results with `TensorBoard `__.

.. _`Population Based Training (PBT)`: tune-schedulers.html#population-based-training-pbt
.. _`Vizier's Median Stopping Rule`: tune-schedulers.html#median-stopping-rule
.. _`HyperBand/ASHA`: tune-schedulers.html#asynchronous-hyperband

.. important:: Join our `community slack `_ to discuss Ray!

For more information, check out:

* `Code `__: GitHub repository for Tune.
* `User Guide `__: A comprehensive overview of how to use Tune's features.
* `Tutorial Notebooks `__: Our tutorial notebooks on using Tune with Keras or PyTorch.

**Try out a tutorial notebook on Colab**:

.. raw:: html

    Tune Tutorial

Quick Start
-----------

To run this example, install the following: ``pip install 'ray[tune]' torch torchvision``.

This example runs a small grid search to train a convolutional neural network using PyTorch and Tune.

.. literalinclude:: ../../python/ray/tune/tests/example.py
   :language: python
   :start-after: __quick_start_begin__
   :end-before: __quick_start_end__
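The complete example is pulled in from the repository above. As a bare-bones illustration of the same pattern (a trainable function that reports a metric back to Tune, plus ``tune.run`` over a grid of hyperparameter values), a sketch might look like the following; the toy objective and the exact reporting call (``tune.report`` here) are illustrative stand-ins rather than the actual example code, and API details can differ between Tune versions.

.. code-block:: python

    from ray import tune


    def train_fn(config):
        # Stand-in for a real training loop (the included example trains a
        # PyTorch ConvNet); a toy score keeps this sketch self-contained.
        for step in range(10):
            score = config["lr"] * (step + 1)
            tune.report(mean_accuracy=score)  # send one result per iteration


    # One trial per grid point; results are written under ~/ray_results by default.
    analysis = tune.run(
        train_fn,
        config={"lr": tune.grid_search([0.001, 0.01, 0.1])},
    )

    print("Best config:", analysis.get_best_config(metric="mean_accuracy", mode="max"))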
If TensorBoard is installed, you can automatically visualize all trial results:

.. code-block:: bash

    tensorboard --logdir ~/ray_results

.. image:: images/tune-start-tb.png
    :scale: 30%
    :align: center

If using TF2 and TensorBoard, Tune will also automatically generate TensorBoard HParams output:

.. image:: images/tune-hparams-coord.png
    :scale: 20%
    :align: center

Take a look at the :ref:`Distributed Experiments ` documentation for:

1. Setting up distributed experiments on your local cluster
2. Using AWS and GCP
3. Using spot/preemptible instances, and more.

Talks and Blogs
---------------

Below are some blog posts and talks about Tune:

- [blog] `Tune: a Python library for fast hyperparameter tuning at any scale `_
- [blog] `Cutting edge hyperparameter tuning with Ray Tune `_
- [blog] `Simple hyperparameter and architecture search in tensorflow with Ray Tune `_
- [slides] `Talk given at RISECamp 2019 `_
- [video] `Talk given at RISECamp 2018 `_
- [video] `A Guide to Modern Hyperparameter Optimization (PyData LA 2019) `_ (`slides `_)

Open Source Projects using Tune
-------------------------------

Here are some of the popular open source repositories and research projects that leverage Tune. Feel free to submit a pull request to add a project, or to request the removal of a listed one.

- `Softlearning `_: Softlearning is a reinforcement learning framework for training maximum entropy policies in continuous domains. Includes the official implementation of the Soft Actor-Critic algorithm.
- `Flambe `_: An ML framework to accelerate research and its path to production. See `flambe.ai `_.
- `Population Based Augmentation `_: Population Based Augmentation (PBA) is an algorithm that quickly and efficiently learns data augmentation functions for neural network training. PBA matches state-of-the-art results on CIFAR with one thousand times less compute.
- `Fast AutoAugment by Kakao `_: Fast AutoAugment (accepted at NeurIPS 2019) learns augmentation policies using a more efficient search strategy based on density matching.
- `Allentune `_: Hyperparameter Search for AllenNLP from AllenAI.
- `machinable `_: A modular configuration system for machine learning research. See `machinable.org `_.

Citing Tune
-----------

If Tune helps you in your academic research, you are encouraged to cite `our paper `__. Here is an example BibTeX entry:

.. code-block:: tex

    @article{liaw2018tune,
        title={Tune: A Research Platform for Distributed Model Selection and Training},
        author={Liaw, Richard and Liang, Eric and Nishihara, Robert and Moritz, Philipp and Gonzalez, Joseph E and Stoica, Ion},
        journal={arXiv preprint arXiv:1807.05118},
        year={2018}
    }