A Guide To Logging & Outputs in Tune
====================================

By default, Tune logs results in TensorBoard, CSV, and JSON formats.
If you need to log something lower-level, such as model weights or gradients, see :ref:`Trainable Logging <trainable-logging>`.
You can learn more about logging and customizations here: :ref:`loggers-docstring`.

.. _tune-logging:

How to configure logging in Tune?
---------------------------------

Tune logs the results of each trial to a sub-folder under a specified local directory, which defaults to ``~/ray_results``.

.. code-block:: python

    # This logs to two different trial folders:
    # ~/ray_results/trainable_name/trial_name_1 and ~/ray_results/trainable_name/trial_name_2
    # trainable_name and trial_name are autogenerated.
    tune.run(trainable, num_samples=2)

You can specify the ``local_dir`` and the experiment ``name``:

.. code-block:: python

    # This logs to two different trial folders:
    # ./results/test_experiment/trial_name_1 and ./results/test_experiment/trial_name_2
    # Only trial_name is autogenerated.
    tune.run(trainable, num_samples=2, local_dir="./results", name="test_experiment")

To specify custom trial folder names, you can pass the ``trial_name_creator`` argument to ``tune.run``.
This takes a function with the following signature:

.. code-block:: python

    def trial_name_string(trial):
        """
        Args:
            trial (Trial): A generated trial object.

        Returns:
            trial_name (str): String representation of Trial.
        """
        return str(trial)

    tune.run(
        MyTrainableClass,
        name="example-experiment",
        num_samples=1,
        trial_name_creator=trial_name_string
    )
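
For example, a name creator can embed hyperparameters from the trial's ``config`` so that
folder names are easier to scan. The snippet below is a minimal sketch: it assumes your search
space contains an ``lr`` key and uses the trial's ``trainable_name``, ``config``, and
``trial_id`` attributes.

.. code-block:: python

    def trial_name_with_lr(trial):
        # Hypothetical example: assumes "lr" is part of the search space.
        # A resulting folder name could look like "MyTrainableClass_lr=0.001_a826033a".
        return "{}_lr={}_{}".format(
            trial.trainable_name, trial.config["lr"], trial.trial_id)

    tune.run(
        MyTrainableClass,
        name="example-experiment",
        num_samples=1,
        config={"lr": tune.grid_search([1e-3, 1e-2])},
        trial_name_creator=trial_name_with_lr
    )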

To learn more about Trials, see its detailed API documentation: :ref:`trial-docstring`.

.. _tensorboard:

How to log to TensorBoard?
--------------------------

Tune automatically outputs TensorBoard files during ``tune.run``.
To visualize learning in TensorBoard, install tensorboardX:

.. code-block:: bash

    $ pip install tensorboardX

Then, after you run an experiment, you can visualize your experiment with TensorBoard by specifying
the output directory of your results.

.. code-block:: bash

    $ tensorboard --logdir=~/ray_results/my_experiment

If you are running Ray on a remote multi-user cluster where you do not have sudo access,
you can run the following commands to make sure TensorBoard is able to write to the tmp directory:

.. code-block:: bash

    $ export TMPDIR=/tmp/$USER; mkdir -p $TMPDIR; tensorboard --logdir=~/ray_results

.. image:: ../images/ray-tune-tensorboard.png

If using TensorFlow ``2.x``, Tune also automatically generates TensorBoard HParams output, as shown below:

.. code-block:: python

    tune.run(
        ...,
        config={
            "lr": tune.grid_search([1e-5, 1e-4]),
            "momentum": tune.grid_search([0, 0.9])
        }
    )

.. image:: ../../images/tune-hparams.png

.. _tune-console-output:

How to control console output?
------------------------------

User-provided fields will be output automatically on a best-effort basis.
You can use a :ref:`Reporter <tune-reporter-doc>` object to customize the console output.

.. code-block:: bash

    == Status ==
    Memory usage on this node: 11.4/16.0 GiB
    Using FIFO scheduling algorithm.
    Resources requested: 4/12 CPUs, 0/0 GPUs, 0.0/3.17 GiB heap, 0.0/1.07 GiB objects
    Result logdir: /Users/foo/ray_results/myexp
    Number of trials: 4 (4 RUNNING)
    +----------------------+----------+---------------------+-----------+--------+--------+----------------+-------+
    | Trial name           | status   | loc                 |    param1 | param2 |    acc | total time (s) |  iter |
    |----------------------+----------+---------------------+-----------+--------+--------+----------------+-------|
    | MyTrainable_a826033a | RUNNING  | 10.234.98.164:31115 |  0.303706 | 0.0761 | 0.1289 |        7.54952 |    15 |
    | MyTrainable_a8263fc6 | RUNNING  | 10.234.98.164:31117 |  0.929276 |  0.158 | 0.4865 |         7.0501 |    14 |
    | MyTrainable_a8267914 | RUNNING  | 10.234.98.164:31111 |  0.068426 | 0.0319 | 0.9585 |         7.0477 |    14 |
    | MyTrainable_a826b7bc | RUNNING  | 10.234.98.164:31112 |  0.729127 | 0.0748 | 0.1797 |        7.05715 |    14 |
    +----------------------+----------+---------------------+-----------+--------+--------+----------------+-------+

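If you only want selected metrics to show up in this table, one option is to pass a ``CLIReporter``
through the ``progress_reporter`` argument of ``tune.run``. The snippet below is a minimal sketch;
``accuracy`` and ``loss`` stand in for whatever metrics your trainable actually reports.

.. code-block:: python

    from ray.tune import CLIReporter

    # Restrict the console table to these metric columns (hypothetical metric names).
    reporter = CLIReporter(metric_columns=["accuracy", "loss", "training_iteration"])

    tune.run(
        trainable,
        num_samples=2,
        progress_reporter=reporter)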

.. _tune-log_to_file:

How to redirect stdout and stderr to files?
-------------------------------------------

The stdout and stderr streams are usually printed to the console.
For remote actors, Ray collects these logs and prints them to the head process.

However, if you would like to collect the stream outputs in files for later
analysis or troubleshooting, Tune offers a utility parameter, ``log_to_file``,
for this.

By passing ``log_to_file=True`` to ``tune.run()``, stdout and stderr will be logged
to ``trial_logdir/stdout`` and ``trial_logdir/stderr``, respectively:

.. code-block:: python

    tune.run(
        trainable,
        log_to_file=True)

If you would like to specify the output files, you can either pass one filename,
where the combined output will be stored, or two filenames, for stdout and stderr,
respectively:

.. code-block:: python

    tune.run(
        trainable,
        log_to_file="std_combined.log")

    tune.run(
        trainable,
        log_to_file=("my_stdout.log", "my_stderr.log"))

The file names are relative to the trial's logdir. You can pass absolute paths,
too.

If ``log_to_file`` is set, Tune will automatically register a new logging handler
for Ray's base logger and log the output to the specified stderr output file.
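
In practice, this means that output written via ``print`` ends up in the stdout file, while
messages sent through Ray's base logger end up in the stderr file. The snippet below is a
minimal sketch of this behavior; the trainable and its log messages are hypothetical.

.. code-block:: python

    import logging

    def trainable(config):
        # Redirected to my_stdout.log inside the trial logdir.
        print("This line goes to the stdout file.")
        # Routed through Ray's base logger, so it lands in my_stderr.log.
        logging.getLogger("ray").info("This line goes to the stderr file.")
        tune.report(metric=1)

    tune.run(
        trainable,
        log_to_file=("my_stdout.log", "my_stderr.log"))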