mirror of
https://github.com/vale981/ray
synced 2025-03-09 12:56:46 -04:00
Ray.tune: Hyperparameter Optimization Framework
===============================================

Ray.tune is a hyperparameter tuning framework for long-running tasks such as RL and deep learning training.

User documentation can be `found here <http://ray.readthedocs.io/en/latest/tune.html>`__.

Implementation overview
-----------------------

At a high level, Ray.tune takes in JSON experiment configs (e.g. one that defines a grid or random search) and compiles them into a number of ``Trial`` objects. It schedules trials on the Ray cluster using a given ``TrialScheduler`` implementation (e.g. median stopping rule or HyperBand).

This is implemented as follows:

- `variant_generator.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/variant_generator.py>`__ parses the config and generates the trial variants.
- `trial.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/trial.py>`__ manages the lifecycle of the Ray actor responsible for executing the trial.
- `trial_runner.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/trial_runner.py>`__ tracks scheduling state for all the trials of an experiment. TrialRunners are usually created automatically by ``run_experiments(experiment_json)``, which parses and starts the experiments.
- `trial_scheduler.py <https://github.com/ray-project/ray/blob/master/python/ray/tune/trial_scheduler.py>`__ plugs into the TrialRunner to implement custom prioritization or early stopping algorithms.
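The config-to-variant expansion performed by ``variant_generator.py`` can be sketched in plain Python. This is a simplified illustration rather than Tune's actual implementation; the ``expand_grid`` helper name is hypothetical, and only flat configs with ``{"grid_search": [...]}`` values are handled here:

```python
import itertools

def expand_grid(config):
    """Expand {"grid_search": [...]} values in a flat config dict into
    the cross product of concrete trial configs (simplified sketch)."""
    grid_keys = [k for k, v in config.items()
                 if isinstance(v, dict) and "grid_search" in v]
    fixed = {k: v for k, v in config.items() if k not in grid_keys}
    value_lists = [config[k]["grid_search"] for k in grid_keys]
    variants = []
    for combo in itertools.product(*value_lists):
        variant = dict(fixed)
        variant.update(zip(grid_keys, combo))
        variants.append(variant)
    return variants

# One grid-searched parameter yields one trial variant per value.
spec = {"lr": {"grid_search": [0.01, 0.1]}, "batch_size": 32}
print(expand_grid(spec))
```

Each resulting dict would correspond to one ``Trial``, which the TrialRunner then hands to the scheduler for execution on the cluster.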