Deep learning has had remarkable success in robotic perception, but its data-centric nature suffers when it comes to generalizing to ever-changing environments. By contrast, physics-based optimization generalizes better, but it does not perform as well in complicated tasks due to the lack of high-level semantic information and the reliance on manual parametric tuning. To take advantage of these two complementary worlds, we present PyPose: a robotics-oriented, PyTorch-based library that combines deep perceptual models with physics-based optimization techniques. Our design goal for PyPose is to make it user-friendly, efficient, and interpretable with a tidy and well-organized architecture. Using an imperative style interface, it can be easily integrated into real-world robotic applications.
Current features include:
- System: LTI, LTV, NLS
- Filter: EKF, UKF, PF
- PnP Solver: EPnP
- Linear Quadratic Regulator: LQR
- IMU Preintegration: IMUPreintegrator
- Optimizer: GaussNewton, LevenbergMarquardt
- ...
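As a minimal, hedged namespace sketch: the listed building blocks are assumed (based on the PyPose documentation) to live under pypose.module and pypose.optim; constructor arguments are omitted here, so refer to the docs of each class before use.

```python
import pypose as pp
import pypose.module, pypose.optim  # explicit imports in case submodules are lazily loaded

# Assumed locations of the building blocks listed above (not exhaustive):
#   pp.module.EKF, pp.module.UKF, pp.module.PF            # filters
#   pp.module.EPnP, pp.module.LQR, pp.module.IMUPreintegrator
#   pp.optim.GaussNewton, pp.optim.LevenbergMarquardt     # 2nd-order optimizers
# List what the installed version actually exposes:
print([n for n in dir(pp.module) if not n.startswith("_")])
print([n for n in dir(pp.optim) if not n.startswith("_")])
```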
Want more features? Open an issue in the PyPose GitHub repository to request them.
PyPose is highly efficient and supports parallel computation of Jacobians on Lie groups and Lie algebras, as shown in the comparison below.
Figure: Efficiency and memory comparison of batched Lie group operations (Theseus performance taken as 1×).
For more details on the efficiency comparison, please refer to the PyPose paper.
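As a rough illustration of the batched, loop-free interface, here is a minimal sketch using only the operations demonstrated later in this README; the batch size is arbitrary and this is not the benchmark behind the figure above.

```python
import torch, pypose as pp

device = "cuda" if torch.cuda.is_available() else "cpu"

# 100,000 group elements handled in a single batched call, no Python loop.
x = pp.randn_so3(100000, device=device, requires_grad=True)
R = x.Exp()                                    # batched exponential map so(3) -> SO(3)
p = R @ torch.randn(100000, 3, device=device)  # rotate one point per group element
p.sum().backward()                             # one backward pass for the whole batch
print(x.grad.shape)                            # torch.Size([100000, 3])
```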
- Install via pip:
  pip install pypose
- Requirement (for installing from source): On Ubuntu, macOS, or Windows, install PyTorch, then run:
  pip install -r requirements/runtime.txt
- Install locally:
  git clone https://github.com/pypose/pypose.git
  cd pypose && python setup.py develop
- Run tests:
  pytest
For contributors:
- Make sure the above installation is correct (a quick sanity check is sketched below).
- Go to CONTRIBUTING.md.
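A quick post-installation sanity check (a minimal sketch; it only assumes the package imports and exposes the constructors used in the examples below, and the __version__ attribute is an assumption):

```python
import torch, pypose as pp

print(pp.__version__)    # assumed attribute; most packages expose it
print(pp.randn_SO3(2))   # should print a 2-element SO3Type LieTensor
```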
- The following code sample shows how to rotate random points and compute the gradient of the batched rotation.
>>> import torch, pypose as pp
>>> # A random so(3) LieTensor
>>> r = pp.randn_so3(2, requires_grad=True)
so3Type LieTensor:
tensor([[ 0.1606,  0.0232, -1.5516],
        [-0.0807, -0.7184, -0.1102]], requires_grad=True)
>>> R = r.Exp() # Equivalent to: R = pp.Exp(r)
SO3Type LieTensor:
tensor([[ 0.0724,  0.0104, -0.6995,  0.7109],
        [-0.0395, -0.3513, -0.0539,  0.9339]], grad_fn=<AliasBackward0>)
>>> p = R @ torch.randn(3) # Rotate random point
tensor([[ 0.8045, -0.8555,  0.5260],
        [ 0.3502,  0.8337,  0.9154]], grad_fn=<ViewBackward0>)
>>> p.sum().backward() # Compute gradient
>>> r.grad # Print gradient
tensor([[-0.7920, -0.9510,  1.7110],
        [-0.2659,  0.5709, -0.3855]])
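Building on the snippet above, here is a short hedged sketch of composing rotations and recovering a relative rotation; it assumes Inv() and Log() follow the same batched pattern as Exp() above.

```python
import torch, pypose as pp

R1, R2 = pp.randn_SO3(2), pp.randn_SO3(2)
dR = R1.Inv() @ R2     # relative rotation taking R1 to R2
dr = dR.Log()          # its so(3) representation, shape (2, 3)
# Composing back should recover R2 (up to floating-point error).
print(torch.allclose((R1 @ dR).tensor(), R2.tensor(), atol=1e-6))
```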
- This example shows how to estimate the inverse of a batched transform with a second-order optimizer. Two usage options for the scheduler are provided; each can work independently.
>>> from torch import nn
>>> import torch, pypose as pp
>>> from pypose.optim import LM
>>> from pypose.optim.strategy import Constant
>>> from pypose.optim.scheduler import StopOnPlateau
>>> class InvNet(nn.Module):
        def __init__(self, *dim):
            super().__init__()
            init = pp.randn_SE3(*dim)
            self.pose = pp.Parameter(init)

        def forward(self, input):
            error = (self.pose @ input).Log()
            return error.tensor()
>>> device = torch.device("cuda")
>>> input = pp.randn_SE3(2, 2, device=device)
>>> invnet = InvNet(2, 2).to(device)
>>> strategy = Constant(damping=1e-4)
>>> optimizer = LM(invnet, strategy=strategy)
>>> scheduler = StopOnPlateau(optimizer, steps=10, patience=3, decreasing=1e-3, verbose=True)
>>> # 1st option, full optimization
>>> scheduler.optimize(input=input)
>>> # 2nd option, step optimization
>>> while scheduler.continual():
        loss = optimizer.step(input)
        scheduler.step(loss)
>>> # Note: use only one of the two options above.
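As a hedged variant, the Gauss-Newton optimizer listed earlier can presumably be dropped in for LM here; the sketch below assumes pypose.optim.GaussNewton shares the step/optimize interface shown above and needs no damping strategy.

```python
>>> from pypose.optim import GaussNewton
>>> invnet = InvNet(2, 2).to(device)   # same network shape and input as above
>>> optimizer = GaussNewton(invnet)    # assumed default arguments
>>> scheduler = StopOnPlateau(optimizer, steps=10, patience=3, decreasing=1e-3, verbose=True)
>>> scheduler.optimize(input=input)
```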
For more usage, see Documentation. For more applications, see Examples.
If you use PyPose, please cite the paper below; a PDF is also available for download.
@inproceedings{wang2023pypose,
title = {{PyPose}: A Library for Robot Learning with Physics-based Optimization},
author = {Wang, Chen and Gao, Dasong and Xu, Kuan and Geng, Junyi and Hu, Yaoyu and Qiu, Yuheng and Li, Bowen and Yang, Fan and Moon, Brady and Pandey, Abhinav and Aryan and Xu, Jiahe and Wu, Tianhao and He, Haonan and Huang, Daning and Ren, Zhongqiang and Zhao, Shibo and Fu, Taimeng and Reddy, Pranay and Lin, Xiao and Wang, Wenshan and Shi, Jingnan and Talak, Rajat and Cao, Kun and Du, Yi and Wang, Han and Yu, Huai and Wang, Shanzhao and Chen, Siyu and Kashyap, Ananth and Bandaru, Rohan and Dantu, Karthik and Wu, Jiajun and Xie, Lihua and Carlone, Luca and Hutter, Marco and Scherer, Sebastian},
booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2023}
}
More papers describing PyPose:
@inproceedings{zhan2023pypose,
title = {{PyPose} v0.6: The Imperative Programming Interface for Robotics},
author = {Zitong Zhan and Xiangfu Li and Qihang Li and Haonan He and Abhinav Pandey and Haitao Xiao and Yangmengfei Xu and Xiangyu Chen and Kuan Xu and Kun Cao and Zhipeng Zhao and Zihan Wang and Huan Xu and Zihang Fang and Yutian Chen and Wentao Wang and Xu Fang and Yi Du and Tianhao Wu and Xiao Lin and Yuheng Qiu and Fan Yang and Jingnan Shi and Shaoshu Su and Yiren Lu and Taimeng Fu and Karthik Dantu and Jiajun Wu and Lihua Xie and Marco Hutter and Luca Carlone and Sebastian Scherer and Daning Huang and Yaoyu Hu and Junyi Geng and Chen Wang},
year = {2023},
booktitle = {IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Workshop},
}