ray/rllib/agents/ppo/ddppo.py


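# Backward-compatibility shim: the DDPPO implementation now lives in
# `ray.rllib.algorithms.ddppo`. The legacy `DDPPOTrainer` name is kept here
# as an alias for the new `DDPPO` class so old import paths keep working.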
from ray.rllib.algorithms.ddppo import (  # noqa
    DDPPO as DDPPOTrainer,
    DEFAULT_CONFIG,
)