v0.1.2
Full Changelog: v0.1.1...v0.1.2
Milestone: https://github.com/pytorch/torchx/milestones/3
- PyTorch 1.11 Support
- Python 3.10 Support
torchx.workspace
- TorchX now supports a concept of workspaces. This enables seamless launching
  of jobs using changes present in your local workspace. For Docker-based
  schedulers, we automatically build a new Docker container on job launch,
  making it easier than ever to run experiments. #333
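A rough sketch of that flow (the scheduler name, component, and flags below are assumptions for illustration, not an exact reference for this release): launching from the project root lets a Docker-based scheduler patch your local changes into the image before the job is submitted.

```python
# Sketch: submit a job from the local project directory (the "workspace").
# With a Docker-based scheduler, TorchX builds a patched image layering the
# local changes onto the base image before the job is submitted.
# The scheduler name and component flags are illustrative and may differ
# between TorchX versions.
import subprocess

subprocess.run(
    [
        "torchx", "run",
        "--scheduler", "local_docker",  # any Docker-based scheduler
        "dist.ddp",
        "--script", "train.py",         # script taken from the local workspace
    ],
    check=True,
)
```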
torchx.schedulers
- Ray #329
  - Newly added Ray scheduler makes it easy to launch jobs on Ray.
  - https://pytorch.medium.com/large-scale-distributed-training-with-torchx-and-ray-1d09a329aacb
- AWS Batch #381
  - Newly added AWS Batch scheduler makes it easy to launch jobs in AWS with minimal infrastructure setup.
- Slurm
  - Slurm jobs will by default launch in the current working directory to match local_cwd and workspace behavior. #372
  - Replicas now have their own log files and can be accessed programmatically. #373
  - Support for comment, mail-user and constraint fields. #391 (see the sketch after this list)
  - Workspace support (prototype) - Slurm jobs can now be launched in isolated experiment directories. #416
- Kubernetes
- All Docker-based Schedulers (Kubernetes, Batch, Docker)
- Local Scheduler
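A minimal sketch of the launch flow from Python, assuming torchx.specs and torchx.runner interfaces of roughly this shape (the image, script, and cfg values are placeholders, and the exact cfg type and option names may differ between releases):

```python
# Sketch: define a single-role app, submit it to the Slurm scheduler with the
# newly supported run options, and read one replica's log programmatically.
# Exact signatures (especially the cfg type) may vary between TorchX versions.
from torchx.runner import get_runner
from torchx import specs

app = specs.AppDef(
    name="trainer",
    roles=[
        specs.Role(
            name="trainer",
            image="my-registry/trainer:latest",  # placeholder image
            entrypoint="python",
            args=["train.py"],
            num_replicas=2,
        )
    ],
)

runner = get_runner()
app_handle = runner.run(
    app,
    scheduler="slurm",  # or "ray", "aws_batch", "kubernetes", ...
    cfg={                # Slurm fields from #391; values are placeholders
        "comment": "torchx example job",
        "mail-user": "me@example.com",
        "constraint": "volta",
    },
)

# Replicas have their own log files (#373); read the logs of replica 0
# of the trainer role.
for line in runner.log_lines(app_handle, role_name="trainer", k=0):
    print(line, end="")
```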
torchx.components
- dist.ddp (usage sketched after this list)
- HPO
  - Ax runner now lives in the Ax repo facebook/Ax@8e2e68f
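For context, a component is just a Python function that returns a specs.AppDef. A minimal sketch of calling the dist.ddp builtin directly (the script and image are placeholders; parameters beyond these two have version-dependent names and defaults):

```python
# Sketch: build an AppDef for a DDP training script with the dist.ddp builtin.
# Only `script` and `image` are shown; node/process counts and rendezvous
# settings are omitted because their parameter names vary across versions.
from torchx.components import dist

app = dist.ddp(
    script="train.py",                   # placeholder training script
    image="my-registry/trainer:latest",  # placeholder image
)
print(app.name, [role.num_replicas for role in app.roles])
```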
torchx.cli
torchx.runner
torchx.notebook (prototype)
- Added new workspace interface for developing models and launching jobs via a Jupyter Notebook. #356
Docs