Ray
===

.. raw:: html

    <embed>
    <a href="https://github.com/ray-project/ray"><img style="position: absolute; top: 0; right: 0; border: 0;" src="https://camo.githubusercontent.com/365986a132ccd6a44c23a9169022c0b5c890c387/68747470733a2f2f73332e616d617a6f6e6177732e636f6d2f6769746875622f726962626f6e732f666f726b6d655f72696768745f7265645f6161303030302e706e67" alt="Fork me on GitHub" data-canonical-src="https://s3.amazonaws.com/github/ribbons/forkme_right_red_aa0000.png"></a>
    </embed>

.. image:: https://github.com/ray-project/ray/raw/master/doc/source/images/ray_header_logo.png

**Ray is a fast and simple framework for building and running distributed applications.**
Ray is packaged with the following libraries for accelerating machine learning workloads:

- `Tune`_: Scalable Hyperparameter Tuning
- `RLlib`_: Scalable Reinforcement Learning
- `Distributed Training <distributed_training.html>`__

Install Ray with: ``pip install ray``. For nightly wheels, see the `Installation page <installation.html>`__.

View the `codebase on GitHub`_.

.. _`codebase on GitHub`: https://github.com/ray-project/ray

Quick Start
-----------

Execute Python functions in parallel.

.. code-block:: python

    import ray
    ray.init()

    @ray.remote
    def f(x):
        return x * x

    futures = [f.remote(i) for i in range(4)]
    print(ray.get(futures))  # [0, 1, 4, 9]

To use Ray's actor model:

.. code-block:: python

    import ray
    ray.init()

    @ray.remote
    class Counter:
        def __init__(self):
            self.n = 0

        def increment(self):
            self.n += 1

        def read(self):
            return self.n

    counters = [Counter.remote() for i in range(4)]
    [c.increment.remote() for c in counters]
    futures = [c.read.remote() for c in counters]
    print(ray.get(futures))  # [1, 1, 1, 1]

Visit the `Walkthrough <walkthrough.html>`_ page for a more comprehensive overview of Ray features.
Ray programs can run on a single machine, and can also seamlessly scale to large clusters. To execute the above Ray script in the cloud, just download `this configuration file <https://github.com/ray-project/ray/blob/master/python/ray/autoscaler/aws/example-full.yaml>`__, and run:

``ray submit [CLUSTER.YAML] example.py --start``

Read more about `launching clusters <autoscaling.html>`_.
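
As a rough guide, a cluster configuration file of this kind describes the provider, the head and worker node types, and autoscaling bounds. The following is only an illustrative sketch (field names mirror the linked ``example-full.yaml``; the region and instance types are assumptions you would replace with your own):

.. code-block:: yaml

    # Illustrative sketch of a Ray autoscaler config; see the full
    # example-full.yaml linked above for the authoritative set of fields.
    cluster_name: example
    min_workers: 0        # autoscaler scales down to this many workers
    max_workers: 2        # and up to this many under load
    provider:
        type: aws
        region: us-west-2  # assumed region
    auth:
        ssh_user: ubuntu
    head_node:
        InstanceType: m5.large   # assumed instance type
    worker_nodes:
        InstanceType: m5.large
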
Tune Quick Start
----------------

`Tune`_ is a library for hyperparameter tuning at any scale. With Tune, you can launch a multi-node distributed hyperparameter sweep in less than 10 lines of code. Tune supports any deep learning framework, including PyTorch, TensorFlow, and Keras.

.. note::

    To run this example, you will need to install the following:

    .. code-block:: bash

        $ pip install ray torch torchvision filelock

This example runs a small grid search to train a CNN using PyTorch and Tune.

.. literalinclude:: ../../python/ray/tune/tests/example.py
   :language: python
   :start-after: __quick_start_begin__
   :end-before: __quick_start_end__

If TensorBoard is installed, automatically visualize all trial results:

.. code-block:: bash

    tensorboard --logdir ~/ray_results

.. _`Tune`: tune.html

RLlib Quick Start
-----------------
`RLlib`_ is an open-source library for reinforcement learning built on top of Ray that offers both high scalability and a unified API for a variety of applications.

.. code-block:: bash

    pip install tensorflow  # or tensorflow-gpu
    pip install ray[rllib]  # also recommended: ray[debug]

.. code-block:: python

    import gym
    from gym.spaces import Discrete, Box
    from ray import tune

    class SimpleCorridor(gym.Env):
        def __init__(self, config):
            self.end_pos = config["corridor_length"]
            self.cur_pos = 0
            self.action_space = Discrete(2)
            self.observation_space = Box(0.0, self.end_pos, shape=(1, ))

        def reset(self):
            self.cur_pos = 0
            return [self.cur_pos]

        def step(self, action):
            if action == 0 and self.cur_pos > 0:
                self.cur_pos -= 1
            elif action == 1:
                self.cur_pos += 1
            done = self.cur_pos >= self.end_pos
            return [self.cur_pos], 1 if done else 0, done, {}

    tune.run(
        "PPO",
        config={
            "env": SimpleCorridor,
            "num_workers": 4,
            "env_config": {"corridor_length": 5},
        },
    )

.. _`RLlib`: rllib.html

More Information
----------------

- `Tutorial`_
- `Blog`_
- `Ray paper`_
- `Ray HotOS paper`_
- `RLlib paper`_
- `Tune paper`_

.. _`Tutorial`: https://github.com/ray-project/tutorial
.. _`Blog`: https://ray-project.github.io/
.. _`Ray paper`: https://arxiv.org/abs/1712.05889
.. _`Ray HotOS paper`: https://arxiv.org/abs/1703.03924
.. _`RLlib paper`: https://arxiv.org/abs/1712.09381
.. _`Tune paper`: https://arxiv.org/abs/1807.05118

Getting Involved
----------------

- `ray-dev@googlegroups.com`_: For discussions about development or any general
  questions.
- `StackOverflow`_: For questions about how to use Ray.
- `GitHub Issues`_: For reporting bugs and feature requests.
- `Pull Requests`_: For submitting code contributions.

.. _`ray-dev@googlegroups.com`: https://groups.google.com/forum/#!forum/ray-dev
.. _`GitHub Issues`: https://github.com/ray-project/ray/issues
.. _`StackOverflow`: https://stackoverflow.com/questions/tagged/ray
.. _`Pull Requests`: https://github.com/ray-project/ray/pulls

.. toctree::
   :maxdepth: -1
   :caption: Installation

   installation.rst

.. toctree::
   :maxdepth: -1
   :caption: Using Ray

   walkthrough.rst
   actors.rst
   using-ray-with-gpus.rst
   serialization.rst
   memory-management.rst
   configure.rst
   troubleshooting.rst
   advanced.rst
   package-ref.rst

.. toctree::
   :maxdepth: -1
   :caption: Cluster Setup

   autoscaling.rst
   using-ray-on-a-cluster.rst
   deploy-on-kubernetes.rst
   deploying-on-slurm.rst

.. toctree::
   :maxdepth: -1
   :caption: Tune

   tune.rst
   tune-tutorial.rst
   tune-usage.rst
   tune-distributed.rst
   tune-schedulers.rst
   tune-searchalg.rst
   tune-package-ref.rst
   tune-design.rst
   tune-examples.rst
   tune-contrib.rst

.. toctree::
   :maxdepth: -1
   :caption: RLlib

   rllib.rst
   rllib-toc.rst
   rllib-training.rst
   rllib-env.rst
   rllib-models.rst
   rllib-algorithms.rst
   rllib-offline.rst
   rllib-concepts.rst
   rllib-examples.rst
   rllib-dev.rst
   rllib-package-ref.rst

.. toctree::
   :maxdepth: -1
   :caption: Experimental

   distributed_training.rst
   tf_distributed_training.rst
   pandas_on_ray.rst
   projects.rst
   signals.rst
   async_api.rst

.. toctree::
   :maxdepth: -1
   :caption: Examples

   example-rl-pong.rst
   example-parameter-server.rst
   example-newsreader.rst
   example-resnet.rst
   example-a3c.rst
   example-lbfgs.rst
   example-streaming.rst
   using-ray-with-tensorflow.rst
   using-ray-with-pytorch.rst

.. toctree::
   :maxdepth: -1
   :caption: Development and Internals

   development.rst
   profiling.rst
   fault-tolerance.rst
   contrib.rst