TorchOpt
TorchOpt is an efficient library for differentiable optimization built upon PyTorch. TorchOpt is:
- Comprehensive: TorchOpt provides three differentiation modes (explicit differentiation, implicit differentiation, and zero-order differentiation) to handle different differentiable optimization problems.
- Flexible: TorchOpt provides both a functional and an object-oriented API to suit users' different preferences, so differentiable optimization can be written in a JAX-like or a PyTorch-like style (see the sketch after this list).
- Efficient: TorchOpt provides (1) CPU/GPU-accelerated differentiable optimizers, (2) an RPC-based distributed training framework, and (3) fast tree operations, which together greatly improve training efficiency for bi-level optimization problems.
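As a taste of the two API styles mentioned above, here is a minimal sketch: it minimizes a toy quadratic with the functional (JAX-like) API and trains a small module with the object-oriented (PyTorch-like) API. The toy objective, tensor shapes, and hyperparameters are illustrative choices, not anything prescribed by TorchOpt.

import torch
import torchopt

# Functional (JAX-like) style: the optimizer state is an explicit value.
x = torch.zeros(3, requires_grad=True)
params = (x,)
optimizer = torchopt.adam(lr=0.1)      # functional optimizer (gradient transformation)
opt_state = optimizer.init(params)     # initialize the optimizer state

for _ in range(100):
    loss = ((x - 1.0) ** 2).sum()      # toy objective: drive x toward 1
    grads = torch.autograd.grad(loss, params)
    updates, opt_state = optimizer.update(grads, opt_state)
    torchopt.apply_updates(params, updates)   # apply the updates to the parameters

# Object-oriented (PyTorch-like) style: used like a torch.optim optimizer.
net = torch.nn.Linear(4, 1)
optim = torchopt.Adam(net.parameters(), lr=0.1)
inputs, targets = torch.randn(8, 4), torch.randn(8, 1)

for _ in range(100):
    loss = torch.nn.functional.mse_loss(net(inputs), targets)
    optim.zero_grad()
    loss.backward()
    optim.step()

In the functional style the optimizer state is threaded through your code explicitly, which is what makes it straightforward to differentiate through; the object-oriented wrappers behave like drop-in replacements for torch.optim optimizers.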
Installation
Requirements: PyTorch
Please follow the instructions at https://pytorch.org to install PyTorch in your Python environment first. Then run the following command to install TorchOpt from PyPI:
pip install torchopt
You can also build the shared libraries from source:
git clone https://github.com/metaopt/torchopt.git
cd torchopt
pip3 install .
We provide a conda environment recipe to install the build toolchain, such as cmake, g++, and nvcc.
You can use the following commands with conda / mamba to create a new isolated environment.
git clone https://github.com/metaopt/torchopt.git
cd torchopt
# You may need `CONDA_OVERRIDE_CUDA` if conda fails to detect the NVIDIA driver (e.g. in docker or WSL2)
CONDA_OVERRIDE_CUDA=12.1 conda env create --file conda-recipe-minimal.yaml
conda activate torchopt
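After installation, a quick sanity check is to import the package from Python and print its version (this assumes the conventional __version__ attribute; the snippet is only a convenience, not an official verification step):

# Quick sanity check that TorchOpt imports correctly.
import torchopt
print(torchopt.__version__)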
Documentation
- TorchOpt Optimizer
- Functional Optimizers
- Classic Optimizers
- Differentiable Meta-Optimizers
- Implicit Differentiation
- Linear System Solvers
- Zero-Order Differentiation
- Optimizer Hooks
- Gradient Transformation
- Optimizer Schedules
- Apply Parameter Updates
- Combining Optimizers
- Distributed Utilities
- General Utilities
- Visualizing Gradient Flow
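As a quick taste of the differentiable meta-optimizers listed above (the explicit-differentiation mode), the following minimal sketch runs two differentiable inner updates with torchopt.MetaAdam and then back-propagates an outer loss to a meta-parameter that scales the inner objective. The toy data, the scalar meta_param, and the choice of losses are illustrative assumptions, not part of the library. See the Implicit Differentiation and Zero-Order Differentiation sections for the other two modes.

import torch
import torchopt

# Hypothetical meta-parameter that scales the inner-loop objective.
meta_param = torch.tensor(1.0, requires_grad=True)

net = torch.nn.Linear(4, 1)
inner_optim = torchopt.MetaAdam(net, lr=0.1)   # differentiable meta-optimizer

inputs, targets = torch.randn(8, 4), torch.randn(8, 1)

# Inner loop: each step updates net's parameters while keeping the
# computation graph, so the updates themselves remain differentiable.
for _ in range(2):
    inner_loss = meta_param * torch.nn.functional.mse_loss(net(inputs), targets)
    inner_optim.step(inner_loss)

# Outer loss: gradients flow back through the inner updates to meta_param.
outer_loss = torch.nn.functional.mse_loss(net(inputs), targets)
outer_loss.backward()
print(meta_param.grad)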
The Team
TorchOpt is a work by Jie Ren (JieRen98), Xidong Feng (waterhorse1), Bo Liu (Benjamin-eecs), Xuehai Pan (XuehaiPan), Luo Mai (luomai), and Yaodong Yang (PKU-YYang).
Support
If you are having issues, please let us know by filing an issue on our issue tracker at https://github.com/metaopt/torchopt/issues.
Changelog
See CHANGELOG.md.
License
TorchOpt is licensed under the Apache 2.0 License.
Citing
If you find TorchOpt useful, please cite it in your publications:
@article{JMLR:TorchOpt,
author = {Jie Ren* and Xidong Feng* and Bo Liu* and Xuehai Pan* and Yao Fu and Luo Mai and Yaodong Yang},
title = {TorchOpt: An Efficient Library for Differentiable Optimization},
journal = {Journal of Machine Learning Research},
year = {2023},
volume = {24},
number = {367},
pages = {1--14},
url = {http://jmlr.org/papers/v24/23-0191.html}
}