XGBoost on Ray tests
====================

This directory contains various XGBoost on Ray release tests.

You should run these tests with the `releaser <https://github.com/ray-project/releaser>`_ tool.

Overview
--------

There are four kinds of tests:

1. ``distributed_api_test`` - checks general API functionality and should
   finish very quickly (< 1 minute).
2. ``train_*`` - checks single-trial training on different setups.
3. ``tune_*`` - checks multi-trial training via Ray Tune.
4. ``ft_*`` - checks fault tolerance.

Generally the releaser tool will run all tests in parallel, but if you run
them sequentially, be sure to run them in the order above. If ``train_*``
fails, ``tune_*`` will fail, too.

Acceptance criteria
-------------------

These tests are considered passing when they throw no error at the end of
the output log.
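
As a minimal sketch of that acceptance check, the pass/fail decision can be
expressed as a scan of the output log for error markers. The log path and
its contents below are hypothetical placeholders, not produced by the
releaser tool itself; substitute the actual output log of your test run.

.. code-block:: shell

    # Hypothetical log contents for illustration; a real run's output log
    # would be written by the release test itself.
    printf 'Training complete\nShutting down\n' > /tmp/xgb_test.log

    # A test is considered passing when no error appears in the log.
    if grep -qiE 'error|traceback' /tmp/xgb_test.log; then
        echo "test FAILED"
    else
        echo "test PASSED"
    fi

Note that a plain pattern scan like this can flag benign lines (e.g. log
messages that merely mention the word "error"), so it is a rough filter, not
a substitute for reading the end of the log.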