Ray.tune: Hyperparameter Optimization Framework
===============================================

Ray.tune is a hyperparameter tuning framework for long-running tasks such as RL and deep learning training.

User documentation can be `found here <http://ray.readthedocs.io/en/latest/tune.html>`__.

Implementation overview
-----------------------

At a high level, Ray.tune takes in JSON experiment configs (e.g., configs that define a grid or random search)
and compiles them into a number of ``Trial`` objects. It schedules trials on the Ray cluster using a given
``TrialScheduler`` implementation (e.g., the median stopping rule or HyperBand).
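As a rough illustration of this flow, an experiment config is an ordinary nested dictionary; the experiment name, trainable name, and hyperparameter values below are made up for the example, and only the overall shape (a ``config`` section whose entries may use ``grid_search``) reflects the description above:

```python
# Hypothetical experiment spec in the shape Ray.tune accepts; the names
# and values are illustrative, not taken from the source.
experiment_spec = {
    "my_experiment": {
        "run": "PPO",
        "stop": {"episode_reward_mean": 200},
        "config": {
            # A grid_search entry is expanded into one Trial per value.
            "lr": {"grid_search": [0.01, 0.001]},
            "num_workers": 2,
        },
    },
}

# Passing such a dict to run_experiments() would compile it into Trial
# objects -- here, two trials (one per learning rate).
```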

This is implemented as follows:

-  `variant_generator.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/variant_generator.py>`__
   parses the config and generates the trial variants.

-  `trial.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/trial.py>`__ manages the lifecycle
   of the Ray actor responsible for executing the trial.

-  `trial_runner.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/trial_runner.py>`__ tracks scheduling
   state for all the trials of an experiment. TrialRunners are usually
   created automatically by ``run_experiments(experiment_json)``, which parses and starts the experiments.

-  `trial_scheduler.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/trial_scheduler.py>`__
   plugs into TrialRunner to implement custom prioritization or early stopping algorithms.
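To make the scheduler plug-in point concrete, here is a standalone sketch of a custom early-stopping scheduler. It does not import Ray; the ``on_trial_result`` method name and the CONTINUE/STOP decision values mirror the ``TrialScheduler`` interface described above, but the class, its threshold parameter, and the ``mean_loss`` result key are all hypothetical:

```python
# Decision constants standing in for TrialScheduler.CONTINUE / STOP
# (defined locally so this sketch runs without Ray installed).
CONTINUE, STOP = "CONTINUE", "STOP"


class ThresholdStopper:
    """Toy scheduler: stop any trial whose reported mean loss is too high."""

    def __init__(self, max_loss=10.0):
        self.max_loss = max_loss

    def on_trial_result(self, trial_runner, trial, result):
        # Called by the TrialRunner on each intermediate result; the
        # return value tells the runner whether to keep running the trial.
        if result.get("mean_loss", 0.0) > self.max_loss:
            return STOP
        return CONTINUE
```

A real scheduler would also implement the other lifecycle hooks (trial added, completed, errored), but the result callback above is where early-stopping logic such as the median stopping rule lives.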