* working with dataframes with too many rows and columns
* repr works for jupyter notebooks now
* added comments and test file
* added repr test file to .travis.yml
* added back ray.dataframe as pd to test file
* fixed pandas importing issues in test file
* getting the front and back of df more efficiently
* only keeping dataframe tests in travis
* fixing numpy array for row and col lengths issue
* doesn't add dimensions if df is small enough
* implemented memory_usage() (see the sketch below)
* completed memory_usage - still failing 2 tests
* only failing one test for memory_usage
* all repr and dataframes tests passing now
* fixing error related to python2 in info()
* fixing python2 errors
* fixed linting errors
* using _arithmetic_helper in memory_usage()
* fixed last lint error
* removed testing-specific code
* adding back travis test
* removing extra tests from travis
* re-added concat test
* fixes with new indexing scheme
* code cleanup
* fully working with new indexing scheme
* added tests for info and memory_usage
* removed test file
* baseline impl for index_df.py
* added skeleton for index_df.py
* initial impl index_df
* separate out partition and non-partition impls
* add len function
* drop returns index_df slice of dropped indices
* housecleaning
* Integrate index overhaul
* Rename index df to index metadata
* Fix flake8 issues
* Addressing issues
* fix import issue
* Added metadata passing to constructor
* adding tests
* fixing flake8
* adding init
* flake8 on test
* fixing tests, imports, and flake8
* handling for index
* adding tests for row, index
* added more robust error handling for axis
* fixing test failures
* cleaning up errors for 2.7
* updating travis
* resolving import
* fixing flake8
* moved import order
* Fixing refactor and delaying ray-pd inner concat implementation
* resolving ray-pd concat and from_pandas mutation
* Revert "resolving ray-pd concat and from_pandas mutation"
This reverts commit 5db43e4e89e328286532f3ef98a4526575c5d08d.
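The memory_usage() and truncated-repr items above target standard pandas behaviour: report per-column memory, and render only the front and back of a frame that has too many rows or columns. The snippet below is a minimal plain-pandas sketch of that behaviour for reference, not the project's actual implementation; the frame shape and the `truncated_repr` helper are made up for illustration.

```python
# Minimal plain-pandas sketch of the behaviour targeted above: per-column
# memory reporting, and a repr built only from the head and tail of a large
# frame. Not the project's actual implementation.
import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.randn(100000, 4), columns=list("abcd"))

# memory_usage() returns bytes per column (plus the index by default).
print(df.memory_usage(index=True))


def truncated_repr(frame, n=5):
    # Small frames are shown in full.
    if len(frame) <= 2 * n:
        return repr(frame)
    # Otherwise only materialize the front and back of the frame.
    return "\n".join([repr(frame.head(n)), "...", repr(frame.tail(n))])


print(truncated_repr(df))
```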
* Add raylet monitor script to timeout Raylet heartbeats (see the sketch below)
* Unit test for removing a different client from the client table
* Set node manager heartbeat according to global config
* Doc and fixes
* Add regression test for client table disconnect, refactor client table
* Fix linting.
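The monitor work above follows a heartbeat-timeout pattern: node managers report in periodically, and a client that misses heartbeats for longer than the configured timeout is removed from the client table. The loop below is an illustrative stand-alone sketch of that pattern, not the actual raylet monitor; the timeout constants and the `mark_dead` callback are placeholders.

```python
# Illustrative heartbeat-timeout loop, not the actual raylet monitor.
import time

HEARTBEAT_TIMEOUT_S = 10.0  # placeholder; the real value comes from global config
CHECK_INTERVAL_S = 1.0

last_heartbeat = {}  # client_id -> timestamp of the most recent heartbeat


def record_heartbeat(client_id):
    """Called whenever a node manager heartbeat arrives."""
    last_heartbeat[client_id] = time.time()


def mark_dead(client_id):
    """Placeholder for removing the client from the client table."""
    print("client %s timed out, marking dead" % client_id)
    last_heartbeat.pop(client_id, None)


def monitor_loop():
    """Periodically sweep for clients whose heartbeats have stopped."""
    while True:
        now = time.time()
        for client_id, seen in list(last_heartbeat.items()):
            if now - seen > HEARTBEAT_TIMEOUT_S:
                mark_dead(client_id)
        time.sleep(CHECK_INTERVAL_S)
```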
* Integrate worker with raylet.
* Begin allowing worker to attach to cluster (see the sketch below).
* Fix linting and documentation.
* Fix linting.
* Comment tests back in.
* Fix type of worker command.
* Remove xray python files and tests.
* Fix from rebase.
* Add test.
* Copy over raylet executable.
* Small cleanup.
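The worker-attachment items above correspond to the workflow of starting a cluster separately and then connecting a driver to it. The sketch below uses the `ray.init(address=...)` form from current Ray releases, which postdates these commits, so treat the exact call as illustrative rather than the API being added here.

```python
# Attaching a driver to an already-running cluster (started elsewhere with
# `ray start --head`); the address form shown is from current Ray releases.
import ray

ray.init(address="auto")  # connect to the existing cluster instead of starting one


@ray.remote
def ping():
    return "pong"


print(ray.get(ping.remote()))
```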
* [tune] Added pbt with keras on cifar10 dataset example (see the sketch below)
* ENH: add gpu resources
* CLN: require 4 GPUs as a resource
* CLN: use single quotes
* CLN: don't save model by default
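The PBT example above is driven through Ray Tune's PopulationBasedTraining scheduler. The sketch below shows the general wiring only: the trainable name `train_cifar`, the mutated hyperparameters, and the per-trial GPU count are placeholders, and keyword names such as `metric`/`mode` have shifted across Ray versions.

```python
# General shape of a PBT run in Ray Tune; `train_cifar` is a hypothetical
# registered trainable and the hyperparameter ranges are placeholders.
import random

from ray import tune
from ray.tune.schedulers import PopulationBasedTraining

pbt = PopulationBasedTraining(
    time_attr="training_iteration",
    metric="mean_accuracy",
    mode="max",
    perturbation_interval=10,
    hyperparam_mutations={
        "lr": lambda: random.uniform(1e-4, 1e-1),
        "batch_size": [32, 64, 128],
    },
)

tune.run(
    "train_cifar",                             # hypothetical trainable name
    scheduler=pbt,
    num_samples=4,
    resources_per_trial={"cpu": 1, "gpu": 1},  # reserve a GPU per trial
)
```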
* Print error when actor takes too long to start, and refactor error message pushing.
* Print warning every ten seconds.
* Fix linting and tests.
* Fix tests.
* Speed up actor creation task submission by generating IDs deterministically.
* Revert "Speed up actor creation task submission by generating IDs deterministically."
This reverts commit 175d9587302664916ce9db4071185485da8da041.
* Don't generate actor IDs deterministically yet.
* Factor out ID generation method.
* Provide experimental API for changing number of return values and resource requirements at task submission time (see the sketch below).
* Remove code duplication and add tests.
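The experimental API above lets a single submission override a task's declared number of return values and resource requirements. In recent Ray releases the same capability is exposed through `.options()`; the sketch below uses that form with a made-up custom resource, so the exact surface shown is an assumption rather than the API added in these commits.

```python
# Per-call override of return count and resource requirements, shown via the
# .options() form from recent Ray releases; "CustomResource" is made up here.
import ray

ray.init(resources={"CustomResource": 2})  # define the custom resource locally


@ray.remote
def split(seq):
    mid = len(seq) // 2
    return seq[:mid], seq[mid:]


# Ask for two object refs back and require the custom resource for this call only.
left, right = split.options(
    num_returns=2,
    resources={"CustomResource": 1},
).remote(list(range(10)))

print(ray.get([left, right]))
```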