Ray
===

.. image:: https://github.com/ray-project/ray/raw/master/doc/source/images/ray_header_logo.png

**Ray is a fast and simple framework for building and running distributed applications.**

Ray is packaged with the following libraries for accelerating machine learning workloads:

- `Tune`_: Scalable Hyperparameter Tuning
- `RLlib`_: Scalable Reinforcement Learning
- `RaySGD`_: Distributed Training Wrappers

Star us `on GitHub`_. You can also get started by visiting our `Tutorials `_. For the latest wheels (nightlies), see the `installation page `__.

.. _`on GitHub`: https://github.com/ray-project/ray
.. _`RaySGD`: raysgd/raysgd.html

.. important:: Join our `community slack `_ to discuss Ray!

Quick Start
-----------

First, install Ray with: ``pip install ray``

.. code-block:: python

    # Execute Python functions in parallel.
    import ray
    ray.init()

    @ray.remote
    def f(x):
        return x * x

    futures = [f.remote(i) for i in range(4)]
    print(ray.get(futures))

To use Ray's actor model:

.. code-block:: python

    import ray
    ray.init()

    @ray.remote
    class Counter(object):
        def __init__(self):
            self.n = 0

        def increment(self):
            self.n += 1

        def read(self):
            return self.n

    counters = [Counter.remote() for i in range(4)]
    [c.increment.remote() for c in counters]
    futures = [c.read.remote() for c in counters]
    print(ray.get(futures))

Visit the `Walkthrough `_ page for a more comprehensive overview of Ray features.

Ray programs can run on a single machine, and can also seamlessly scale to large clusters. To execute the above Ray script in the cloud, just download `this configuration file `__, and run:

``ray submit [CLUSTER.YAML] example.py --start``

Read more about `launching clusters `_.

Tune Quick Start
----------------

`Tune`_ is a library for hyperparameter tuning at any scale. With Tune, you can launch a multi-node distributed hyperparameter sweep in less than 10 lines of code. Tune supports any deep learning framework, including PyTorch, TensorFlow, and Keras.

.. note::

    To run this example, you will need to install the following:

    .. code-block:: bash

        $ pip install ray torch torchvision filelock

This example runs a small grid search to train a CNN using PyTorch and Tune.

.. literalinclude:: ../../python/ray/tune/tests/example.py
    :language: python
    :start-after: __quick_start_begin__
    :end-before: __quick_start_end__

If TensorBoard is installed, automatically visualize all trial results:

.. code-block:: bash

    tensorboard --logdir ~/ray_results

.. _`Tune`: tune.html

RLlib Quick Start
-----------------

`RLlib`_ is an open-source library for reinforcement learning built on top of Ray that offers both high scalability and a unified API for a variety of applications.

.. code-block:: bash

    pip install tensorflow  # or tensorflow-gpu
    pip install ray[rllib]  # also recommended: ray[debug]

.. code-block:: python

    import gym
    from gym.spaces import Discrete, Box
    from ray import tune

    class SimpleCorridor(gym.Env):
        def __init__(self, config):
            self.end_pos = config["corridor_length"]
            self.cur_pos = 0
            self.action_space = Discrete(2)
            self.observation_space = Box(0.0, self.end_pos, shape=(1, ))

        def reset(self):
            self.cur_pos = 0
            return [self.cur_pos]

        def step(self, action):
            if action == 0 and self.cur_pos > 0:
                self.cur_pos -= 1
            elif action == 1:
                self.cur_pos += 1
            done = self.cur_pos >= self.end_pos
            return [self.cur_pos], 1 if done else 0, done, {}

    tune.run(
        "PPO",
        config={
            "env": SimpleCorridor,
            "num_workers": 4,
            "env_config": {"corridor_length": 5},
        },
    )
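``tune.run`` drives the PPO training above and writes trial results under ``~/ray_results``. As a minimal sketch of how you might inspect the outcome programmatically (continuing the example above, and assuming ``tune.run`` returns an ``ExperimentAnalysis`` object and that RLlib reports an ``episode_reward_mean`` metric), you could capture its return value:

.. code-block:: python

    # A minimal sketch, not part of the official example: stop after a few
    # training iterations and look up the best-performing trial configuration.
    # Assumes tune.run returns an ExperimentAnalysis object and that RLlib
    # reports an "episode_reward_mean" metric.
    analysis = tune.run(
        "PPO",
        stop={"training_iteration": 10},
        config={
            "env": SimpleCorridor,  # defined in the example above
            "num_workers": 4,
            "env_config": {"corridor_length": 5},
        },
    )

    # Print the config of the trial with the highest mean episode reward.
    print(analysis.get_best_config(metric="episode_reward_mean"))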
.. _`RLlib`: rllib.html


More Information
----------------

Here are some talks, papers, and press coverage involving Ray and its libraries. Please raise an issue if any of the below links are broken!

Blog and Press
~~~~~~~~~~~~~~

- `Modern Parallel and Distributed Python: A Quick Tutorial on Ray `_
- `Why Every Python Developer Will Love Ray `_
- `Ray: A Distributed System for AI (BAIR) `_
- `10x Faster Parallel Python Without Python Multiprocessing `_
- `Implementing A Parameter Server in 15 Lines of Python with Ray `_
- `Ray Distributed AI Framework Curriculum `_
- `RayOnSpark: Running Emerging AI Applications on Big Data Clusters with Ray and Analytics Zoo `_
- `First user tips for Ray `_
- [Tune] `Tune: a Python library for fast hyperparameter tuning at any scale `_
- [Tune] `Cutting edge hyperparameter tuning with Ray Tune `_
- [RLlib] `New Library Targets High Speed Reinforcement Learning `_
- [RLlib] `Scaling Multi Agent Reinforcement Learning `_
- [RLlib] `Functional RL with Keras and Tensorflow Eager `_
- [Modin] `How to Speed up Pandas by 4x with one line of code `_
- [Modin] `Quick Tip – Speed up Pandas using Modin `_
- `Ray Blog`_

.. _`Ray Blog`: https://ray-project.github.io/

Talks (Videos)
~~~~~~~~~~~~~~

- `Programming at any Scale with Ray | SF Python Meetup Sept 2019 `_
- `Ray for Reinforcement Learning | Data Council 2019 `_
- `Scaling Interactive Pandas Workflows with Modin `_
- `Ray: A Distributed Execution Framework for AI | SciPy 2018 `_
- `Ray: A Cluster Computing Engine for Reinforcement Learning Applications | Spark Summit `_
- `RLlib: Ray Reinforcement Learning Library | RISECamp 2018 `_
- `Enabling Composition in Distributed Reinforcement Learning | Spark Summit 2018 `_
- `Tune: Distributed Hyperparameter Search | RISECamp 2018 `_

Slides
~~~~~~

- `Talk given at UC Berkeley DS100 `_
- `Talk given in October 2019 `_
- [Tune] `Talk given at RISECamp 2019 `_

Academic Papers
~~~~~~~~~~~~~~~

- `Ray paper`_
- `Ray HotOS paper`_
- `RLlib paper`_
- `Tune paper`_

.. _`Ray paper`: https://arxiv.org/abs/1712.05889
.. _`Ray HotOS paper`: https://arxiv.org/abs/1703.03924
.. _`RLlib paper`: https://arxiv.org/abs/1712.09381
.. _`Tune paper`: https://arxiv.org/abs/1807.05118

Getting Involved
----------------

- `ray-dev@googlegroups.com`_: For discussions about development or any general questions.
- `StackOverflow`_: For questions about how to use Ray.
- `GitHub Issues`_: For reporting bugs and feature requests.
- `Pull Requests`_: For submitting code contributions.

.. _`ray-dev@googlegroups.com`: https://groups.google.com/forum/#!forum/ray-dev
.. _`GitHub Issues`: https://github.com/ray-project/ray/issues
.. _`StackOverflow`: https://stackoverflow.com/questions/tagged/ray
.. _`Pull Requests`: https://github.com/ray-project/ray/pulls

.. toctree::
   :maxdepth: -1
   :caption: Installation

   installation.rst

.. toctree::
   :maxdepth: -1
   :caption: Ray Core

   walkthrough.rst
   using-ray.rst
   configure.rst
   ray-dashboard.rst
   cluster-index.rst
   Tutorial and Examples
   package-ref.rst

.. toctree::
   :maxdepth: -1
   :caption: Tune

   tune.rst
   Tune Guides and Tutorials
   tune-usage.rst
   tune-schedulers.rst
   tune-searchalg.rst
   tune-examples.rst
   tune/api_docs/overview.rst
   tune-contrib.rst

.. toctree::
   :maxdepth: -1
   :caption: RLlib

   rllib.rst
   rllib-toc.rst
   rllib-training.rst
   rllib-env.rst
   rllib-models.rst
   rllib-algorithms.rst
   rllib-offline.rst
   rllib-concepts.rst
   rllib-examples.rst
   rllib-package-ref.rst
   rllib-dev.rst
.. toctree::
   :maxdepth: -1
   :caption: Ray SGD

   raysgd/raysgd.rst
   raysgd/raysgd_pytorch.rst
   raysgd/raysgd_tensorflow.rst
   raysgd/raysgd_ref.rst

.. toctree::
   :maxdepth: -1
   :caption: Other Libraries

   multiprocessing.rst
   joblib.rst
   iter.rst
   pandas_on_ray.rst
   serve.rst

.. toctree::
   :maxdepth: -1
   :caption: Development and Internals

   development.rst
   profiling.rst
   fault-tolerance.rst
   getting-involved.rst