ray/rllib/utils
Latest commit: 2020-12-23 11:30:50 -05:00
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| exploration | [RLlib] Attention Net prep PR #3. (#12450) | 2020-12-07 13:08:17 +01:00 |
| schedules | [RLlib] Allow for more than 2^31 policy timesteps. (#11301) | 2020-10-12 13:49:11 -07:00 |
| spaces | [RLlib] Trajectory view API: Simple List Collector (on by default for PPO); LSTM-agnostic (#11056) | 2020-10-01 16:57:10 +02:00 |
| tests | Change Python's ObjectID to ObjectRef (#9353) | 2020-07-10 17:49:04 +08:00 |
| __init__.py | [RLlib] Fix all example scripts to run on GPUs. (#11105) | 2020-10-02 23:07:44 +02:00 |
| actors.py | Change Python's ObjectID to ObjectRef (#9353) | 2020-07-10 17:49:04 +08:00 |
| annotations.py | Fix overriden typo (#11227) | 2020-10-07 19:11:07 -07:00 |
| compression.py | Stop vendoring pyarrow (#7233) | 2020-02-19 19:01:26 -08:00 |
| debug.py | [rllib] Flexible multi-agent replay modes and replay_sequence_length (#8893) | 2020-06-12 20:17:27 -07:00 |
| deprecation.py | [RLlib] Exploration API: merge deterministic flag with exploration classes (SoftQ and StochasticSampling). (#7155) | 2020-02-19 12:18:45 -08:00 |
| error.py | Remove future imports (#6724) | 2020-01-09 00:15:48 -08:00 |
| filter.py | [rllib] Rrk/12079 custom filters (#12095) | 2020-11-19 13:20:20 -08:00 |
| filter_manager.py | [rllib] Deprecate policy optimizers (#8345) | 2020-05-21 10:16:18 -07:00 |
| framework.py | [RLlib] Issue 12244: Unable to restore multi-agent PPOTFPolicy's Model (from exported). (#12786) | 2020-12-11 16:13:38 +01:00 |
| from_config.py | [RLlib] Make envs specifiable in configs by their class path. (#8750) | 2020-06-03 08:14:29 +02:00 |
| memory.py | [RLlib] Trajectory View API (preparatory cleanup and enhancements). (#9678) | 2020-07-29 21:15:09 +02:00 |
| numpy.py | [RLlib] Issue 12118: LSTM prev-a/r should be separately configurable. Fix missing prev-a one-hot encoding. (#12397) | 2020-11-25 11:27:46 -08:00 |
| sgd.py | [RLlib] Attention Nets: tf (#12753) | 2020-12-20 20:22:32 -05:00 |
| test_utils.py | [RLlib] Fix JAX import bug. (#12621) | 2020-12-07 11:05:08 -08:00 |
| tf_ops.py | [RLlib] Curiosity exploration module: tf/tf2.x/tf-eager support. (#11945) | 2020-11-29 12:31:24 +01:00 |
| tf_run_builder.py | [RLlib] Tf2x preparation; part 2 (upgrading try_import_tf()). (#9136) | 2020-06-30 10:13:20 +02:00 |
| timer.py | [rllib] Enable performance metrics reporting for RLlib pipelines, add A3C (#7299) | 2020-02-28 16:44:17 -08:00 |
| torch_ops.py | [RLlib] TorchPolicies: Accessing "infos" dict in train_batch causes TypeError. (#13039) | 2020-12-23 11:30:50 -05:00 |
| tracking_dict.py | [RLlib] Trajectory view API: Enable by default for PPO, IMPALA, PG, A3C (tf and torch). (#11747) | 2020-11-12 16:27:34 +01:00 |
| typing.py | [RLlib] Attention Nets: tf (#12753) | 2020-12-20 20:22:32 -05:00 |
| window_stat.py | [RLLib] WindowStat bug fix (#9213) | 2020-07-12 23:01:32 +02:00 |