[docs] new structure (#21776)

This PR consolidates both #21667 and #21759 (look there for features), but improves on them in the following ways:

- [x] We reverted the renaming of the existing projects `tune`, `rllib`, `train`, `cluster`, `serve`, `raysgd` and `data`, so that existing links won't break. My consolidation efforts with the `ray-` prefix were a little overeager in that regard, and it's better this way. Only the creation of `ray-core` was a necessity, plus a few files moved into the `rllib` folder, so the change should be relatively benign.
- [x] Additionally, we added Algolia `docsearch` (screenshot below). This is _much_ better than our current search. Caveat: there is a Sphinx dependency, `sphinx-tabs`, that needs to be replaced by the newer `sphinx-panels`, as the former prevents the `algolia.js` library from loading. We will follow up on this in the next PR (hoping this one doesn't get re-re-re-re-reverted); a sketch of the intended wiring follows below.
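
For reference, once `sphinx-tabs` is replaced, enabling the Algolia search should mostly be a matter of loading the docsearch assets in `conf.py`. A minimal sketch of the intended wiring, mirroring the block that is currently commented out in `conf.py` further down in this diff (not active in this PR yet; the `sphinx_panels` swap is an assumption about the follow-up):

```python
# doc/source/conf.py -- sketch only; this stays commented out until sphinx-tabs is replaced.
extensions = [
    # ...
    "sphinx_panels",  # would replace "sphinx_tabs.tabs", which blocks algolia.js
]

# Load the Algolia docsearch assets plus the config file added in this PR ("docsearch.sbt.js").
html_css_files = [
    "https://cdn.jsdelivr.net/npm/docsearch.js@2/dist/cdn/docsearch.min.css",
]
html_js_files = [
    (
        "https://cdn.jsdelivr.net/npm/docsearch.js@2/dist/cdn/docsearch.min.js",
        {"defer": "defer"},
    ),
    ("docsearch.sbt.js", {"defer": "defer"}),
]
```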
Max Pumperla 2022-01-22 00:42:05 +01:00 committed by GitHub
parent 45eebdd6e3
commit f9b71a8bf6
212 changed files with 717 additions and 586 deletions

doc/.gitignore
View file

@ -1 +1,8 @@
auto_examples/
# Generated documentation files
_build
source/_static/thumbs
source/ray-core/examples/
source/tune/tutorials/
source/tune/generated_guides/
source/data/examples/

View file

@ -1,5 +1,5 @@
# --------------------------------------------------------------------
# Tests from the doc/examples directory.
# Tests from the doc directory.
# Please keep these sorted alphabetically, but start with the
# root directory.
# --------------------------------------------------------------------
@ -8,8 +8,8 @@
py_test(
name = "dask_xgboost",
size = "medium",
main = "examples/dask_xgboost/dask_xgboost.py",
srcs = ["examples/dask_xgboost/dask_xgboost.py"],
main = "source/ray-core/_examples/dask_xgboost/dask_xgboost.py",
srcs = ["source/ray-core/_examples/dask_xgboost/dask_xgboost.py"],
tags = ["exclusive", "team:ml", "py37"],
args = ["--smoke-test", "--address ''", "--num-actors 4",
"--cpus-per-actor 1", "--num-actors-inference 4",
@ -20,8 +20,8 @@ py_test(
py_test(
name = "modin_xgboost",
size = "medium",
main = "examples/modin_xgboost/modin_xgboost.py",
srcs = ["examples/modin_xgboost/modin_xgboost.py"],
main = "source/ray-core/_examples/modin_xgboost/modin_xgboost.py",
srcs = ["source/ray-core/_examples/modin_xgboost/modin_xgboost.py"],
tags = ["exclusive", "team:ml", "py37"],
args = ["--smoke-test", "--address ''", "--num-actors 4",
"--cpus-per-actor 1", "--num-actors-inference 4",
@ -38,7 +38,7 @@ py_test(
py_test(
name = "datasets_train",
size = "medium",
srcs = ["examples/datasets_train/datasets_train.py"],
srcs = ["source/ray-core/_examples/datasets_train/datasets_train.py"],
tags = ["exclusive", "team:ml", "py37", "datasets_train"],
args = ["--smoke-test", "--num-workers=2", "--use-gpu"]
)
@ -46,62 +46,62 @@ py_test(
py_test(
name = "plot_hyperparameter",
size = "small",
srcs = ["examples/plot_hyperparameter.py"],
srcs = ["source/ray-core/_examples/plot_hyperparameter.py"],
tags = ["exclusive", "team:ml"]
)
py_test(
name = "plot_parameter_server",
size = "medium",
srcs = ["examples/plot_parameter_server.py"],
srcs = ["source/ray-core/_examples/plot_parameter_server.py"],
tags = ["exclusive", "team:ml"]
)
py_test(
name = "plot_pong_example",
size = "large",
srcs = ["examples/plot_pong_example.py"],
srcs = ["source/ray-core/_examples/plot_pong_example.py"],
tags = ["exclusive", "team:ml"]
)
py_test(
name = "progress_bar",
size = "small",
srcs = ["examples/progress_bar.py"],
srcs = ["source/ray-core/_examples/progress_bar.py"],
tags = ["exclusive", "team:ml"]
)
# Directory: examples/doc_code
# Directory: source/ray-core/_examples/doc_code
py_test(
name = "doc_code_tf_example",
size = "small",
main = "examples/doc_code/tf_example.py",
srcs = ["examples/doc_code/tf_example.py"],
main = "source/ray-core/_examples/doc_code/tf_example.py",
srcs = ["source/ray-core/_examples/doc_code/tf_example.py"],
tags = ["exclusive", "tf", "team:ml"]
)
py_test(
name = "doc_code_torch_example",
size = "small",
main = "examples/doc_code/torch_example.py",
srcs = ["examples/doc_code/torch_example.py"],
main = "source/ray-core/_examples/doc_code/torch_example.py",
srcs = ["source/ray-core/_examples/doc_code/torch_example.py"],
tags = ["exclusive", "pytorch", "team:ml"]
)
py_test(
name = "doc_code_metrics_example",
size = "small",
main = "examples/doc_code/metrics_example.py",
srcs = ["examples/doc_code/metrics_example.py"],
main = "source/ray-core/_examples/doc_code/metrics_example.py",
srcs = ["source/ray-core/_examples/doc_code/metrics_example.py"],
tags = ["exclusive", "team:serve"]
)
py_test(
name = "doc_code_runtime_env_example",
size = "small",
main = "examples/doc_code/runtime_env_example.py",
srcs = ["examples/doc_code/runtime_env_example.py"],
main = "source/ray-core/_examples/doc_code/runtime_env_example.py",
srcs = ["source/ray-core/_examples/doc_code/runtime_env_example.py"],
tags = ["exclusive", "post_wheel_build", "team:serve"]
)

View file

@ -6,7 +6,10 @@ SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
AUTOGALLERYDIR= source/auto_examples source/tune/tutorials source/tune/generated_guides source/data/examples
AUTOGALLERYDIR= source/ray-core/examples\
source/tune/tutorials\
source/tune/generated_guides\
source/data/examples
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)

View file

@ -1,24 +1,62 @@
# Ray Documentation
To compile the documentation, run the following commands from this directory.
Note that Ray must be installed first.
Repository for documentation of the Ray project, hosted at [docs.ray.io](https://docs.ray.io).
## Installation
To build the documentation, make sure you have `ray` installed first.
To build it locally, install the following dependencies:
```bash
pip install -r requirements-doc.txt
pip install -U -r requirements-rtd.txt # important for reproducing the deployment environment
make html
open _build/html/index.html
```
To test if there are any build errors with the documentation, do the following.
## Building the documentation
To compile the documentation and open it locally, run the following command from this directory.
```bash
sphinx-build -W -b html -d _build/doctrees source _build/html
make html && open _build/html/index.html
```
## Building just one sub-project
Often, documentation changes concern just one sub-project, such as Tune or Train.
To build only that sub-project and ignore the rest (which leads to build warnings from broken cross-references etc.), run the following command:
```shell
DOC_LIB=<project> sphinx-build -b html -d _build/doctrees source _build/html
```
where `<project>` is the name of a sub-project in the `source/` directory: either one of `tune`, `rllib`,
`train`, `cluster`, `serve`, `raysgd` and `data`, or one of the folders starting
with `ray-`, e.g. `ray-observability`.
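For example, `DOC_LIB=tune sphinx-build -b html -d _build/doctrees source _build/html` builds only the Ray Tune docs.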
## Announcements and includes
To add new announcements or other messaging to the top or bottom of a documentation page,
first check the `_includes` folder to see if the message you want is already there (like "get help"
or "we're hiring").
If not, add the template you want and include it in your page with
```rst
.. include:: /_includes/<my-announcement>
```
This ensures consistent messaging across documentation pages.
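For example, several cluster pages in this PR now pull in a shared help snippet with `.. include:: /_includes/_help.rst`.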
## Checking for broken links
To check for broken links, run the following (we currently don't run this in CI,
since it produces false positives).
```bash
make linkcheck
```
## Running doctests
To run tests for the examples shipped inside docstrings of Python files, run the following command:
```shell
make doctest
```
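
For reference, `make doctest` runs Sphinx's doctest builder (`sphinx.ext.doctest` is enabled in `conf.py` in this PR), which executes interactive examples embedded in docstrings. A hypothetical docstring that would be picked up looks like this (the `add` function is just an illustration, not part of Ray):

```python
def add(a: int, b: int) -> int:
    """Add two integers.

    Example:
        >>> add(1, 2)
        3
    """
    return a + b
```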

View file

@ -1,3 +1,6 @@
# Production requirements. This is what readthedocs.org picks up
# Python / ML libraries
click
colorama
colorful
@ -14,28 +17,38 @@ pickle5
pillow
pyarrow
pydantic
pygments
pyyaml
recommonmark
scikit-optimize
redis
sphinx==3.0.4
sphinx-click
sphinx-copybutton
sphinxemoji
sphinx-gallery
sphinx-jsonschema
sphinx-tabs
sphinx-version-warning
sphinx-book-theme==0.0.42
sphinxcontrib.yt
starlette
tabulate
uvicorn==0.16.0
werkzeug
# Ray libraries
git+https://github.com/ray-project/tune-sklearn@master#tune-sklearn
git+https://github.com/ray-project/xgboost_ray@master#egg=xgboost_ray
git+https://github.com/ray-project/lightgbm_ray@main#lightgbm_ray
git+https://github.com/ray-project/ray_lightning@cacd374370e858adc0c00c76fe2e657e38790e0a#ray_lightning
scikit-optimize
# Syntax highlighting
Pygments==2.11.2
# Sphinx
sphinx==4.3.2
sphinx-click==3.0.2
sphinx-copybutton==0.4.0
sphinxemoji==0.2.0
sphinx-gallery==0.10.0
sphinx-jsonschema==1.17.2
# sphinx-panels==0.6.0
sphinx-tabs==3.2.0
sphinx-version-warning==1.1.2
sphinx-book-theme==0.1.7
sphinx-external-toc==0.2.3
sphinxcontrib.yt==0.2.2
sphinx-sitemap==2.2.0
myst-parser
# MyST
myst-parser==0.15.2
myst-nb==0.13.1

View file

@ -1,13 +1,2 @@
Pygments==2.3.1
setuptools==41.0.1
docutils==0.16
mock==1.0.1
pillow==8.3.2
alabaster>=0.7,<0.8,!=0.7.5
commonmark==0.8.1
recommonmark==0.5.0
sphinx==3.0.4
readthedocs-sphinx-ext<1.1
sphinx-book-theme==0.0.42
sphinx-sitemap==2.2.0
myst-parser
# CI requirements: this is the file buildkite needs.
-r requirements-doc.txt

View file

@ -1,3 +1,6 @@
/* For Algolia */
/*#site-navigation { overflow: visible; }*/
/*Extends the docstring signature box.*/
.rst-content dl:not(.docutils) dt {
display: block;

View file

@ -0,0 +1,7 @@
docsearch({
apiKey: '6c42f30d9669d8e42f6fc92f44028596',
indexName: 'docs-ray',
appId: 'LBHF0PABBL',
inputSelector: '#search-input',
debug: false,
});

doc/source/_toc.yml (new file)
View file

@ -0,0 +1,172 @@
format: jb-book
root: index
parts:
- caption: Overview
chapters:
- file: ray-overview/index
- file: ray-overview/installation
- file: ray-overview/ray-libraries
- caption: Ray ML
chapters:
- file: data/dataset
title: Ray Data
sections:
- file: data/dataset-pipeline
- file: data/dataset-ml-preprocessing
- file: data/dataset-execution-model
- file: data/dataset-tensor-support
- file: data/examples/big_data_ingestion
- file: data/dask-on-ray
- file: data/mars-on-ray
- file: data/modin/index
- file: data/raydp
- file: train/train
title: Ray Train
sections:
- file: train/user_guide
- file: train/examples
- file: train/architecture
- file: train/migration-guide
- file: raysgd/raysgd
title: "RaySGD v1: Distributed Training Wrappers"
- file: tune/index
title: Ray Tune
sections:
- file: tune/key-concepts
- file: tune/user-guide
- file: tune/tutorials/overview
sections:
- file: tune/tutorials/tune-tutorial.rst
- file: tune/tutorials/tune-advanced-tutorial.rst
- file: tune/tutorials/tune-distributed.rst
- file: tune/tutorials/tune-lifecycle.rst
- file: tune/tutorials/tune-mlflow.rst
- file: tune/tutorials/tune-pytorch-cifar.rst
- file: tune/tutorials/tune-pytorch-lightning.rst
- file: tune/tutorials/tune-serve-integration-mnist.rst
- file: tune/tutorials/tune-sklearn.rst
- file: tune/tutorials/tune-xgboost.rst
- file: tune/tutorials/tune-wandb.rst
- file: tune/examples/index
- file: tune/contrib
- file: serve/index
title: Ray Serve
sections:
- file: serve/tutorial
- file: serve/core-apis
- file: serve/http-servehandle
- file: serve/deployment
- file: serve/ml-models
- file: serve/pipeline
- file: serve/performance
- file: serve/architecture
- file: serve/tutorials/index
- file: serve/faq
- file: rllib/index
title: Ray RLlib
sections:
- file: rllib/rllib-toc
- file: rllib/core-concepts
- file: rllib/rllib-training
- file: rllib/rllib-env
- file: rllib/rllib-models
- file: rllib/rllib-algorithms
- file: rllib/rllib-sample-collection
- file: rllib/rllib-offline
- file: rllib/rllib-concepts
- file: rllib/rllib-examples
- file: rllib/rllib-dev
- file: workflows/concepts
title: Ray Workflows
sections:
- file: workflows/basics
- file: workflows/management
- file: workflows/actors
- file: workflows/metadata
- file: workflows/events
- file: workflows/comparison
- file: workflows/advanced
- file: ray-more-libs/index
title: More Ray ML Libraries
- caption: Ray Core
chapters:
- file: ray-core/walkthrough
title: Getting Started
- file: ray-core/using-ray
title: "User Guide"
sections:
- file: ray-core/starting-ray
- file: ray-core/actors
- file: ray-core/namespaces
- file: ray-core/handling-dependencies
- file: ray-core/async_api
- file: ray-core/concurrency_group_api
- file: ray-core/using-ray-with-gpus
- file: ray-core/serialization
- file: ray-core/memory-management
- file: ray-core/placement-group
- file: ray-core/troubleshooting
- file: ray-core/fault-tolerance
- file: ray-core/advanced
- file: ray-core/cross-language
- file: ray-core/using-ray-with-tensorflow
- file: ray-core/using-ray-with-pytorch
- file: ray-core/using-ray-with-jupyter
- file: ray-core/examples/overview
title: "Tutorials and Examples"
sections:
- file: ray-core/examples/tips-for-first-time
- file: ray-core/examples/testing-tips
- file: ray-core/examples/progress_bar
- file: ray-core/examples/plot_streaming
- file: ray-core/examples/placement-group
- file: ray-core/examples/plot_parameter_server
- file: ray-core/examples/plot_hyperparameter
- file: ray-core/examples/plot_lbfgs
- file: ray-core/examples/plot_example-lm
- file: ray-core/examples/plot_newsreader
- file: ray-core/examples/dask_xgboost/dask_xgboost
- file: ray-core/examples/modin_xgboost/modin_xgboost
- file: ray-core/examples/plot_pong_example
- file: ray-core/examples/plot_example-a3c
- file: ray-core/examples/using-ray-with-pytorch-lightning
- caption: Deploying Ray Clusters
chapters:
- file: cluster/quickstart
- file: cluster/user-guide
sections:
- file: cluster/index
- file: cluster/guide
- file: cluster/job-submission
title: "Submitting Ray Jobs"
- file: cluster/ray-client
- file: cluster/cloud
sections:
- file: cluster/aws-tips
- file: cluster/deploy
sections:
- file: cluster/kubernetes
- file: cluster/yarn
- file: cluster/slurm
- file: cluster/lsf
- caption: References
chapters:
- file: ray-references/api
- file: ray-references/faq
- caption: Developer Guide
chapters:
- file: ray-contribute/getting-involved
sections:
- file: ray-contribute/development
- file: ray-contribute/fake-autoscaler
- file: ray-core/configure
- file: ray-observability/index
- file: ray-design-patterns/index
- file: ray-contribute/whitepaper
# TODO: Add examples section

View file

@ -464,4 +464,4 @@ Now that you have a working understanding of the cluster launcher, check out:
Questions or Issues?
--------------------
.. include:: /_help.rst
.. include:: /_includes/_help.rst

View file

@ -226,7 +226,7 @@ The ``RESTARTS`` column reports the RayCluster's ``status.autoscalerRetries`` fi
Questions or Issues?
--------------------
.. include:: /_help.rst
.. include:: /_includes/_help.rst
.. _`RayCluster CRD`: https://github.com/ray-project/ray/tree/master/deploy/charts/ray/crds/cluster_crd.yaml
.. _`finalizer` : https://kubernetes.io/docs/tasks/extend-kubernetes/custom-resources/custom-resource-definitions/#finalizers

View file

@ -80,7 +80,7 @@ If you run into problems setting up GPUs for your Ray cluster on Kubernetes, ple
Questions or Issues?
--------------------
.. include:: /_help.rst
.. include:: /_includes/_help.rst
.. _`GKE`: https://cloud.google.com/kubernetes-engine/docs/how-to/gpus
.. _`EKS`: https://docs.aws.amazon.com/eks/latest/userguide/eks-optimized-ami.html

View file

@ -148,7 +148,7 @@ To delete a running Ray cluster, you can run the following command:
Questions or Issues?
--------------------
.. include:: /_help.rst
.. include:: /_includes/_help.rst
.. _`Kubernetes Namespace`: https://kubernetes.io/docs/concepts/overview/working-with-objects/namespaces/

View file

@ -282,7 +282,7 @@ Next steps
Questions or Issues?
--------------------
.. include:: /_help.rst
.. include:: /_includes/_help.rst
.. _`Kubernetes`: https://kubernetes.io/
.. _`Kubernetes Job`: https://kubernetes.io/docs/concepts/workloads/controllers/jobs-run-to-completion/

View file

@ -2,8 +2,8 @@
.. _cluster-reference:
Config YAML and CLI Reference
=============================
Cluster Config YAML and CLI Reference
=====================================
.. toctree::
:maxdepth: 2

View file

@ -0,0 +1,12 @@
Deployment Guide
================
.. toctree::
:maxdepth: 1
:caption: Deployment Guide
:hidden:
index.rst
guide.rst
job-submission.rst
ray-client.rst

View file

@ -191,6 +191,6 @@ To clean up a running job, use the following (using the application ID):
Questions or Issues?
--------------------
.. include:: /_help.rst
.. include:: /_includes/_help.rst
.. _`Skein`: https://jcrist.github.io/skein/

View file

@ -16,10 +16,9 @@ import glob
import shutil
import sys
import os
import urllib
sys.path.insert(0, os.path.abspath("."))
from custom_directives import CustomGalleryItemDirective, fix_xgb_lgbm_docs
from custom_directives import *
from datetime import datetime
# These lines added to enable Sphinx to work without installing Ray.
@ -32,87 +31,9 @@ class ChildClassMock(mock.Mock):
return mock.Mock
MOCK_MODULES = [
"ax",
"ax.service.ax_client",
"blist",
"ConfigSpace",
"dask.distributed",
"gym",
"gym.spaces",
"horovod",
"horovod.runner",
"horovod.runner.common",
"horovod.runner.common.util",
"horovod.ray",
"horovod.ray.runner",
"horovod.ray.utils",
"hyperopt",
"hyperopt.hp"
"kubernetes",
"mlflow",
"modin",
"mxnet",
"mxnet.model",
"optuna",
"optuna.distributions",
"optuna.samplers",
"optuna.trial",
"psutil",
"ray._raylet",
"ray.core.generated",
"ray.core.generated.common_pb2",
"ray.core.generated.runtime_env_common_pb2",
"ray.core.generated.gcs_pb2",
"ray.core.generated.logging_pb2",
"ray.core.generated.ray.protocol.Task",
"ray.serve.generated",
"ray.serve.generated.serve_pb2",
"scipy.signal",
"scipy.stats",
"setproctitle",
"tensorflow_probability",
"tensorflow",
"tensorflow.contrib",
"tensorflow.contrib.all_reduce",
"tree",
"tensorflow.contrib.all_reduce.python",
"tensorflow.contrib.layers",
"tensorflow.contrib.rnn",
"tensorflow.contrib.slim",
"tensorflow.core",
"tensorflow.core.util",
"tensorflow.keras",
"tensorflow.python",
"tensorflow.python.client",
"tensorflow.python.util",
"torch",
"torch.distributed",
"torch.nn",
"torch.nn.parallel",
"torch.utils.data",
"torch.utils.data.distributed",
"wandb",
"zoopt",
]
CHILD_MOCK_MODULES = [
"pytorch_lightning",
"pytorch_lightning.accelerators",
"pytorch_lightning.plugins",
"pytorch_lightning.plugins.environments",
"pytorch_lightning.utilities",
"tensorflow.keras.callbacks",
]
import scipy.stats
import scipy.linalg
for mod_name in MOCK_MODULES:
sys.modules[mod_name] = mock.Mock()
# ray.rllib.models.action_dist.py and
# ray.rllib.models.lstm.py will use tf.VERSION
sys.modules["tensorflow"].VERSION = "9.9.9"
for mod_name in CHILD_MOCK_MODULES:
@ -142,6 +63,7 @@ extensions = [
"sphinx.ext.viewcode",
"sphinx.ext.napoleon",
"sphinx_click.ext",
# "sphinx_panels",
"sphinx_tabs.tabs",
"sphinx-jsonschema",
"sphinx_gallery.gen_gallery",
@ -151,13 +73,26 @@ extensions = [
"versionwarning.extension",
"sphinx_sitemap",
"myst_parser",
# "myst_nb",
"sphinx.ext.doctest",
"sphinx.ext.coverage",
"sphinx_external_toc",
]
external_toc_exclude_missing = False
external_toc_path = '_toc.yml'
# There's a flaky autodoc import for "TensorFlowVariables" that fails depending on the doc structure / order
# of imports.
# autodoc_mock_imports = ["ray.experimental.tf_utils"]
# This is used to suppress warnings about explicit "toctree" directives.
suppress_warnings = ["etoc.toctree"]
versionwarning_admonition_type = "note"
versionwarning_banner_title = "Join the Ray Discuss Forums!"
FORUM_LINK = "https://discuss.ray.io"
versionwarning_messages = {
# Re-enable this after Ray Summit.
# "latest": (
@ -172,15 +107,17 @@ versionwarning_messages = {
}
versionwarning_body_selector = "#main-content"
sphinx_gallery_conf = {
# Example sources are taken from these folders:
"examples_dirs": [
"../examples",
"ray-core/_examples",
"tune/_tutorials",
"data/_examples",
], # path to example scripts
# path where to save generated examples
"gallery_dirs": ["auto_examples", "tune/tutorials", "data/examples"],
"ignore_pattern": "../examples/doc_code/",
],
# and then generated into these respective target folders:
"gallery_dirs": ["ray-core/examples", "tune/tutorials", "data/examples"],
"ignore_pattern": "ray-core/examples/doc_code/",
"plot_gallery": "False",
"min_reported_time": sys.maxsize,
# "filename_pattern": "tutorial.py",
@ -235,23 +172,23 @@ release = version
# Usually you set "language" from the command line for these cases.
language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
# today = ''
# Else, today_fmt is used as the format for a strftime call.
# today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ["_build"]
exclude_patterns += sphinx_gallery_conf["examples_dirs"]
# The reST default role (used for this markup: `text`) to use for all
# documents.
# default_role = None
# If "DOC_LIB" is found, only build that top-level navigation item.
build_one_lib = os.getenv("DOC_LIB")
# If true, '()' will be appended to :func: etc. cross-reference text.
# add_function_parentheses = True
all_toc_libs = [
f.path for f in os.scandir(".") if f.is_dir() and "ray-" in f.path
]
all_toc_libs += [
"cluster", "tune", "data", "raysgd", "train", "rllib", "serve", "workflows"
]
if build_one_lib and build_one_lib in all_toc_libs:
all_toc_libs.remove(build_one_lib)
exclude_patterns += all_toc_libs
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
@ -262,7 +199,7 @@ exclude_patterns += sphinx_gallery_conf["examples_dirs"]
# show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = "pastie"
pygments_style = "lovelace"
# A list of ignored prefixes for module index sorting.
# modindex_common_prefix = []
@ -292,7 +229,7 @@ html_theme_options = {
"use_issues_button": True,
"use_edit_page_button": True,
"path_to_docs": "doc/source",
"home_page_in_toc": True,
"home_page_in_toc": False,
"show_navbar_depth": 0,
}
@ -301,7 +238,7 @@ html_theme_options = {
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
html_title = f"Ray v{release}"
html_title = f"Ray {release}"
# A shorter title for the navigation bar. Default is the same as html_title.
# html_short_title = None
@ -318,6 +255,19 @@ html_favicon = "_static/favicon.ico"
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# TODO: this adds Algolia search. Can be activated once sphinx-tabs has been
# replaced by sphinx-panels.
# html_css_files = [
# "https://cdn.jsdelivr.net/npm/docsearch.js@2/dist/cdn/docsearch.min.css",
# ]
#
# html_js_files = [
# (
# "https://cdn.jsdelivr.net/npm/docsearch.js@2/dist/cdn/docsearch.min.js",
# {"defer": "defer"},
# ),
# ("docsearch.sbt.js", {"defer": "defer"}),
# ]
html_static_path = ["_static"]
# Add any extra paths that contain custom files (such as robots.txt or
@ -355,9 +305,6 @@ html_static_path = ["_static"]
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
# html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
# html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
@ -400,7 +347,7 @@ latex_elements = {
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, "Ray.tex", "Ray Documentation", "The Ray Team", "manual"),
(master_doc, "Ray.tex", "Ray Documentation", author, "manual"),
]
# The name of an image file (relative to this directory) to place at the top of
@ -433,7 +380,6 @@ man_pages = [(master_doc, "ray", "Ray Documentation", [author], 1)]
# man_show_urls = False
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
@ -444,60 +390,20 @@ texinfo_documents = [
"Ray Documentation",
author,
"Ray",
"One line description of project.",
"Ray provides a simple, universal API for building distributed applications.",
"Miscellaneous",
),
]
# Documents to append as an appendix to all manuals.
# texinfo_appendices = []
# If false, no module index is generated.
# texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
# texinfo_show_urls = 'footnote'
# If true, do not generate a @detailmenu in the "Top" node's menu.
# texinfo_no_detailmenu = False
# pcmoritz: To make the following work, you have to run
# sudo pip install recommonmark
# Python methods should be presented in source code order
autodoc_member_order = "bysource"
# Taken from https://github.com/edx/edx-documentation
FEEDBACK_FORM_FMT = "https://github.com/ray-project/ray/issues/new?title={title}&labels=docs&body={body}"
def feedback_form_url(project, page):
"""Create a URL for feedback on a particular page in a project."""
return FEEDBACK_FORM_FMT.format(
title=urllib.parse.quote(
"[docs] Issue on `{page}.rst`".format(page=page)),
body=urllib.parse.quote(
"# Documentation Problem/Question/Comment\n"
"<!-- Describe your issue/question/comment below. -->\n"
"<!-- If there are typos or errors in the docs, feel free to create a pull-request. -->\n"
"\n\n\n\n"
"(Created directly from the docs)\n"),
)
def update_context(app, pagename, templatename, context, doctree):
"""Update the page rendering context to include ``feedback_form_url``."""
context["feedback_form_url"] = feedback_form_url(app.config.project,
pagename)
# see also http://searchvoidstar.tumblr.com/post/125486358368/making-pdfs-from-markdown-on-readthedocsorg-using
def setup(app):
app.connect("html-page-context", update_context)
# Custom CSS
app.add_css_file("css/custom.css")
# Custom directives
# Custom Sphinx directives
app.add_directive("customgalleryitem", CustomGalleryItemDirective)
# Custom connects
# Custom docstring processor
app.connect("autodoc-process-docstring", fix_xgb_lgbm_docs)

View file

@ -5,6 +5,15 @@ from docutils.statemachine import StringList
from docutils import nodes
import os
import sphinx_gallery
import urllib
# Note: the scipy import has to stay here, it's used implicitly down the line
import scipy.stats # noqa: F401
import scipy.linalg # noqa: F401
__all__ = [
"CustomGalleryItemDirective", "fix_xgb_lgbm_docs", "MOCK_MODULES",
"CHILD_MOCK_MODULES", "update_context"
]
try:
FileNotFoundError
@ -134,3 +143,103 @@ def fix_xgb_lgbm_docs(app, what, name, obj, options, lines):
for i, _ in enumerate(lines):
for replacement in replacements:
lines[i] = lines[i].replace(*replacement)
# Taken from https://github.com/edx/edx-documentation
FEEDBACK_FORM_FMT = "https://github.com/ray-project/ray/issues/new?" \
"title={title}&labels=docs&body={body}"
def feedback_form_url(project, page):
"""Create a URL for feedback on a particular page in a project."""
return FEEDBACK_FORM_FMT.format(
title=urllib.parse.quote(
"[docs] Issue on `{page}.rst`".format(page=page)),
body=urllib.parse.quote(
"# Documentation Problem/Question/Comment\n"
"<!-- Describe your issue/question/comment below. -->\n"
"<!-- If there are typos or errors in the docs, feel free "
"to create a pull-request. -->\n"
"\n\n\n\n"
"(Created directly from the docs)\n"),
)
def update_context(app, pagename, templatename, context, doctree):
"""Update the page rendering context to include ``feedback_form_url``."""
context["feedback_form_url"] = feedback_form_url(app.config.project,
pagename)
MOCK_MODULES = [
"ax",
"ax.service.ax_client",
"blist",
"ConfigSpace",
"dask.distributed",
"gym",
"gym.spaces",
"horovod",
"horovod.runner",
"horovod.runner.common",
"horovod.runner.common.util",
"horovod.ray",
"horovod.ray.runner",
"horovod.ray.utils",
"hyperopt",
"hyperopt.hp"
"kubernetes",
"mlflow",
"modin",
"mxnet",
"mxnet.model",
"optuna",
"optuna.distributions",
"optuna.samplers",
"optuna.trial",
"psutil",
"ray._raylet",
"ray.core.generated",
"ray.core.generated.common_pb2",
"ray.core.generated.runtime_env_common_pb2",
"ray.core.generated.gcs_pb2",
"ray.core.generated.logging_pb2",
"ray.core.generated.ray.protocol.Task",
"ray.serve.generated",
"ray.serve.generated.serve_pb2",
"scipy.signal",
"scipy.stats",
"setproctitle",
"tensorflow_probability",
"tensorflow",
"tensorflow.contrib",
"tensorflow.contrib.all_reduce",
"tree",
"tensorflow.contrib.all_reduce.python",
"tensorflow.contrib.layers",
"tensorflow.contrib.rnn",
"tensorflow.contrib.slim",
"tensorflow.core",
"tensorflow.core.util",
"tensorflow.keras",
"tensorflow.python",
"tensorflow.python.client",
"tensorflow.python.util",
"torch",
"torch.distributed",
"torch.nn",
"torch.nn.parallel",
"torch.utils.data",
"torch.utils.data.distributed",
"wandb",
"zoopt",
]
CHILD_MOCK_MODULES = [
"pytorch_lightning",
"pytorch_lightning.accelerators",
"pytorch_lightning.plugins",
"pytorch_lightning.plugins.environments",
"pytorch_lightning.utilities",
"tensorflow.keras.callbacks",
]

View file

@ -1,4 +1,4 @@
Dataset API Reference
Data API Reference
=====================
Creating a Dataset

View file

Binary image file (77 KiB before and after).

View file

@ -146,7 +146,7 @@ You can also get started by visiting our `Tutorials <https://github.com/ray-proj
Getting Involved
================
.. include:: ray-overview/involvement.rst
.. include:: ray-contribute/involvement.rst
If you're interested in contributing to Ray, visit our page on :ref:`Getting Involved <getting-involved>` to read about the contribution process and see what you can work on!
@ -219,168 +219,3 @@ Papers
.. _`RLlib paper`: https://arxiv.org/abs/1712.09381
.. _`RLlib flow paper`: https://arxiv.org/abs/2011.12719
.. _`Tune paper`: https://arxiv.org/abs/1807.05118
.. toctree::
:hidden:
:maxdepth: -1
:caption: Overview of Ray
ray-overview/index.rst
ray-libraries.rst
installation.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Ray Core
walkthrough.rst
using-ray.rst
configure.rst
ray-dashboard.rst
Tutorial and Examples <auto_examples/overview.rst>
Design patterns and anti-patterns <ray-design-patterns/index.rst>
package-ref.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Multi-node Ray
cluster/index.rst
cluster/quickstart.rst
cluster/guide.rst
Ray Job Submission <ray-job-submission/overview.rst>
cluster/ray-client.rst
cluster/reference.rst
cluster/cloud.rst
cluster/deploy.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Ray Serve
serve/index.rst
serve/tutorial.rst
serve/core-apis.rst
serve/http-servehandle.rst
serve/deployment.rst
serve/ml-models.rst
serve/pipeline.rst
serve/performance.rst
serve/architecture.rst
serve/tutorials/index.rst
serve/faq.rst
serve/package-ref.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Ray Data
data/dataset.rst
data/dataset-pipeline.rst
data/dataset-ml-preprocessing.rst
data/dataset-execution-model.rst
data/dataset-tensor-support.rst
data/package-ref.rst
data/examples/big_data_ingestion
data/dask-on-ray.rst
data/mars-on-ray.rst
data/modin/index.rst
data/raydp.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Ray Workflows
workflows/concepts.rst
workflows/basics.rst
workflows/management.rst
workflows/actors.rst
workflows/metadata.rst
workflows/events.rst
workflows/comparison.rst
workflows/advanced.rst
workflows/package-ref.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Ray Tune
tune/index.rst
tune/key-concepts.rst
tune/user-guide.rst
tune/tutorials/overview.rst
tune/examples/index.rst
tune/api_docs/overview.rst
tune/contrib.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Ray RLlib
rllib/index.rst
rllib-toc.rst
rllib/core-concepts.rst
rllib-training.rst
rllib-env.rst
rllib-models.rst
rllib-algorithms.rst
rllib-sample-collection.rst
rllib-offline.rst
rllib-concepts.rst
rllib-examples.rst
rllib/package_ref/index.rst
rllib-dev.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Ray Train
train/train.rst
train/user_guide.rst
train/examples.rst
train/architecture.rst
train/api.rst
train/migration-guide.rst
RaySGD v1: Distributed Training Wrappers <raysgd/raysgd.rst>
.. toctree::
:hidden:
:maxdepth: -1
:caption: More Libraries
multiprocessing.rst
joblib.rst
xgboost-ray.md
lightgbm-ray.rst
ray-lightning.rst
ray-collective.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Observability
ray-metrics.rst
ray-debugging.rst
ray-logging.rst
ray-tracing.rst
.. toctree::
:hidden:
:maxdepth: -1
:caption: Contributor Guide
getting-involved.rst
development.rst
fake-autoscaler.rst
whitepaper.rst
debugging.rst
profiling.rst

View file

@ -284,7 +284,7 @@ solicited by current reviewers.
More Resources for Getting Involved
-----------------------------------
.. include:: ray-overview/involvement.rst
.. include:: ../ray-contribute/involvement.rst
.. note::

View file

@ -0,0 +1,4 @@
Architecture Whitepaper
=======================
For an in-depth overview of Ray internals, check out the `Ray 1.0 Architecture whitepaper <https://docs.google.com/document/d/1lAy0Owi-vPz2jEqBSaHNQcy2IBSDEHyXNOQZlGuj93c/preview>`__.

View file

@ -4,7 +4,7 @@ name: ray-example-lbfgs
description: "Parallelizing the L-BFGS algorithm in ray"
tags: ["ray-example", "optimization", "lbfgs"]
documentation: https://docs.ray.io/en/master/auto_examples/plot_lbfgs.html
documentation: https://docs.ray.io/en/master/ray-core/examples/plot_lbfgs.html
cluster:
config: ray-project/cluster.yaml

View file

@ -4,7 +4,7 @@ name: ray-example-newsreader
description: "A simple news reader example that uses ray actors to serve requests"
tags: ["ray-example", "flask", "rss", "newsreader"]
documentation: https://docs.ray.io/en/master/auto_examples/plot_newsreader.html
documentation: https://docs.ray.io/en/master/ray-core/examples/plot_newsreader.html
cluster:
config: ray-project/cluster.yaml

View file

@ -23,23 +23,23 @@ Ray Examples
.. customgalleryitem::
:tooltip: Tips for first time users.
:figure: /images/pipeline.png
:description: :doc:`/auto_examples/tips-for-first-time`
:description: :doc:`tips-for-first-time`
.. customgalleryitem::
:tooltip: Tips for testing Ray applications
:description: :doc:`/auto_examples/testing-tips`
:description: :doc:`testing-tips`
.. customgalleryitem::
:tooltip: Progress Bar for Ray Tasks
:description: :doc:`/auto_examples/progress_bar`
:description: :doc:`progress_bar`
.. customgalleryitem::
:tooltip: Implement a simple streaming application using Ray's actors.
:description: :doc:`/auto_examples/plot_streaming`
:description: :doc:`plot_streaming`
.. customgalleryitem::
:tooltip: Learn placement group use cases with examples.
:description: :doc:`/auto_examples/placement-group`
:description: :doc:`placement-group`
.. raw:: html
@ -67,34 +67,34 @@ Machine Learning Examples
.. customgalleryitem::
:tooltip: Build a simple parameter server using Ray.
:figure: /images/param_actor.png
:description: :doc:`/auto_examples/plot_parameter_server`
:figure: /ray-core/images/param_actor.png
:description: :doc:`plot_parameter_server`
.. customgalleryitem::
:tooltip: Simple parallel asynchronous hyperparameter evaluation.
:figure: /images/hyperparameter.png
:description: :doc:`/auto_examples/plot_hyperparameter`
:figure: /ray-core/images/hyperparameter.png
:description: :doc:`plot_hyperparameter`
.. customgalleryitem::
:tooltip: Walkthrough of parallelizing the L-BFGS algorithm.
:description: :doc:`/auto_examples/plot_lbfgs`
:description: :doc:`plot_lbfgs`
.. customgalleryitem::
:tooltip: Distributed Fault-Tolerant BERT training for FAIRSeq using Ray.
:description: :doc:`/auto_examples/plot_example-lm`
:description: :doc:`plot_example-lm`
.. customgalleryitem::
:tooltip: Implementing a simple news reader using Ray.
:description: :doc:`/auto_examples/plot_newsreader`
:description: :doc:`plot_newsreader`
.. customgalleryitem::
:tooltip: Train an XGBoost-Ray model using Dask for data processing.
:description: :doc:`/auto_examples/dask_xgboost/dask_xgboost`
:description: :doc:`dask_xgboost/dask_xgboost`
.. customgalleryitem::
:tooltip: Train an XGBoost-Ray model using Modin for data processing.
:description: :doc:`/auto_examples/modin_xgboost/modin_xgboost`
:description: :doc:`modin_xgboost/modin_xgboost`
.. raw:: html
@ -119,13 +119,13 @@ These are simple examples that show you how to leverage Ray Core. For Ray's prod
.. customgalleryitem::
:tooltip: Asynchronous Advantage Actor Critic agent using Ray.
:figure: /images/a3c.png
:description: :doc:`/auto_examples/plot_example-a3c`
:figure: /ray-core/images/a3c.png
:description: :doc:`plot_example-a3c`
.. customgalleryitem::
:tooltip: Parallelizing a policy gradient calculation on OpenAI Gym Pong.
:figure: /images/pong.png
:description: :doc:`/auto_examples/plot_pong_example`
:description: :doc:`plot_pong_example`
.. raw:: html
@ -148,4 +148,4 @@ These are full guides on how you can use Ray with various Machine Learning libra
.. customgalleryitem::
:tooltip: Using Ray with PyTorch Lightning.
:figure: /images/pytorch_lightning_small.png
:description: :doc:`/auto_examples/using-ray-with-pytorch-lightning`
:description: :doc:`using-ray-with-pytorch-lightning`

View file

@ -26,7 +26,7 @@ To run the application, first install **ray** and then some dependencies:
pip install scipy
.. image:: ../images/a3c.png
.. image:: /ray-core/images/a3c.png
:align: center
You can run the code with

View file

@ -8,11 +8,11 @@ To run this example, you will need to install Ray on your local machine to use t
You can view the `code for this example`_.
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/examples/lm
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/lm
To use Ray cluster launcher on AWS, install boto (``pip install boto3``) and configure your AWS credentials in ``~/.aws/credentials`` as described on the :ref:`Automatic Cluster Setup page <cluster-cloud>`.
We provide an `example config file <https://github.com/ray-project/ray/tree/master/doc/examples/lm/lm-cluster.yaml>`__ (``lm-cluster.yaml``).
We provide an `example config file <https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/lm/lm-cluster.yaml>`__ (``lm-cluster.yaml``).
In the example config file, we use an ``m5.xlarge`` on-demand instance as the head node, and use ``p3.2xlarge`` GPU spot instances as the worker nodes. We set the minimal number of workers to 1 and maximum workers to 2 in the config, which can be modified according to your own demand.
@ -50,12 +50,12 @@ files from a local path:
Preprocessing Data
------------------
Once the cluster is started, you can then SSH into the head node using ``ray attach lm-cluster.yaml`` and download or preprocess the data on EFS for training. We can run ``preprocess.sh`` (`code <https://github.com/ray-project/ray/tree/master/doc/examples/lm/preprocess.sh>`_) to do this, which adapts instructions from `the RoBERTa tutorial <https://github.com/pytorch/fairseq/blob/master/examples/roberta/README.pretraining.md>`__.
Once the cluster is started, you can then SSH into the head node using ``ray attach lm-cluster.yaml`` and download or preprocess the data on EFS for training. We can run ``preprocess.sh`` (`code <https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/lm/preprocess.sh>`_) to do this, which adapts instructions from `the RoBERTa tutorial <https://github.com/pytorch/fairseq/blob/master/examples/roberta/README.pretraining.md>`__.
Training
--------
We provide ``ray_train.py`` (`code <https://github.com/ray-project/ray/tree/master/doc/examples/lm/ray_train.py>`__) as an entrypoint to the Fairseq library. Since we are training the model on spot instances, we provide fault-tolerance in ``ray_train.py`` by checkpointing and restarting when a node fails. The code will also check whether there are new resources available after checkpointing. If so, the program will make use of them by restarting and resizing.
We provide ``ray_train.py`` (`code <https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/lm/ray_train.py>`__) as an entrypoint to the Fairseq library. Since we are training the model on spot instances, we provide fault-tolerance in ``ray_train.py`` by checkpointing and restarting when a node fails. The code will also check whether there are new resources available after checkpointing. If so, the program will make use of them by restarting and resizing.
Two main components of ``ray_train.py`` are a ``RayDistributedActor`` class and a function ``run_fault_tolerant_loop()``. The ``RayDistributedActor`` sets proper arguments for different ray actor processes, adds a checkpoint hook to enable the process to make use of new available GPUs, and calls the ``main`` of Fairseq:
@ -256,7 +256,7 @@ In ``ray_train.py``, we also define a set of helper functions. ``add_ray_args()`
To start training, run `following commands <https://github.com/ray-project/ray/tree/master/doc/examples/lm/ray_train.sh>`__ (``ray_train.sh``) on the head machine:
To start training, run `following commands <https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/lm/ray_train.sh>`__ (``ray_train.sh``) on the head machine:
.. code-block:: bash

View file

@ -9,7 +9,7 @@ This script will demonstrate how to use two important parts of the Ray API:
using ``ray.remote`` to define remote functions and ``ray.wait`` to wait for
their results to be ready.
.. image:: ../images/hyperparameter.png
.. image:: /ray-core/images/hyperparameter.png
:align: center
.. tip:: For a production-grade implementation of distributed

View file

@ -11,13 +11,13 @@ application, first install these dependencies.
You can view the `code for this example`_.
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/examples/lbfgs
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/lbfgs
Then you can run the example as follows.
.. code-block:: bash
python ray/doc/examples/lbfgs/driver.py
python ray/doc/source/ray-core/_examples/lbfgs/driver.py
Optimization is at the heart of many machine learning algorithms. Much of

View file

@ -15,7 +15,7 @@ To run this example, you will need to install `NPM`_ and a few python dependenci
pip install flask-cors
Navigate to the ``ray/doc/examples/newsreader`` directory and start the Flask server.
Navigate to the ``ray/doc/source/ray-core/_examples/newsreader`` directory and start the Flask server.
.. code-block:: bash
@ -44,12 +44,12 @@ Star some of the articles in the channel. Each time you star an article, you
can see the Flask server responding in its terminal.
Now we will view our database. Navigate back to the
``ray/doc/examples/newsreader`` directory. Access the database by running
``ray/doc/source/ray-core/_examples/newsreader`` directory. Access the database by running
``sqlite3 newsreader.db`` in the terminal. This will start a sqlite session.
View all the articles in the ``news`` table by running ``SELECT * FROM news;``.
For more details on commands in sqlite, you can run `.help` in
the database.
.. _`frontend`: https://github.com/saqueib/qreader
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/examples/newsreader
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/newsreader
.. _`NPM`: https://docs.npmjs.com/downloading-and-installing-node-js-and-npm

View file

@ -9,7 +9,7 @@ nodes) maintains global shared parameters of a machine-learning model
(e.g., a neural network) while the data and computation of calculating
updates (i.e., gradient descent updates) are distributed over worker nodes.
.. image:: ../images/param_actor.png
.. image:: /ray-core/images/param_actor.png
:align: center
Parameter servers are a core part of many machine learning applications. This

View file

@ -14,7 +14,7 @@ then be passed back to each Ray actor for more gradient calculation.
This application is adapted, with minimal modifications, from
Andrej Karpathy's `source code`_ (see the accompanying `blog post`_).
.. image:: ../images/pong-arch.svg
.. image:: /ray-core/images/pong-arch.svg
:align: center

View file

@ -7,7 +7,7 @@ computes word counts on wikipedia articles.
You can view the `code for this example`_.
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/examples/streaming
.. _`code for this example`: https://github.com/ray-project/ray/tree/master/doc/source/ray-core/_examples/streaming
To run the example, you need to install the dependencies
@ -20,7 +20,7 @@ and then execute the script as follows:
.. code-block:: bash
python ray/doc/examples/streaming/streaming.py
python ray/doc/source/ray-core/_examples/streaming/streaming.py
For each round of articles read, the script will output
the top 10 words in these articles together with their word count:

View file

Binary image file (481 KiB before and after).

View file

@ -79,7 +79,7 @@ Runtime environments let you transition your Ray application from running on you
open("my_datafile.txt").read()
return requests.get("https://www.ray.io")
.. literalinclude:: ../examples/doc_code/runtime_env_example.py
.. literalinclude:: /ray-core/_examples/doc_code/runtime_env_example.py
:language: python
:start-after: __runtime_env_conda_def_start__
:end-before: __runtime_env_conda_def_end__
@ -99,7 +99,7 @@ Specifying a Runtime Environment Per-Job
You can specify a runtime environment for your whole job, whether running a script directly on the cluster, using :ref:`Ray Job submission <jobs-overview>`, or using :ref:`Ray Client<ray-client>`:
.. literalinclude:: ../examples/doc_code/runtime_env_example.py
.. literalinclude:: /ray-core/_examples/doc_code/runtime_env_example.py
:language: python
:start-after: __ray_init_start__
:end-before: __ray_init_end__
@ -130,7 +130,7 @@ Specifying a Runtime Environment Per-Task or Per-Actor
You can specify different runtime environments per-actor or per-task using ``.options()`` or the ``@ray.remote()`` decorator:
.. literalinclude:: ../examples/doc_code/runtime_env_example.py
.. literalinclude:: /ray-core/_examples/doc_code/runtime_env_example.py
:language: python
:start-after: __per_task_per_actor_start__
:end-before: __per_task_per_actor_end__

View file

Binary image file (24 KiB before and after).

View file

Binary image file (24 KiB before and after).

View file

Binary image file (128 KiB before and after).

View file

Binary image file (19 KiB before and after).

View file

Binary image file (32 KiB before and after).

View file

Binary image file (37 KiB before and after).

Some files were not shown because too many files have changed in this diff.