6. Tutorials & FAQs
How to install RoboHive using PyPI?
We recommend installation within a conda environment. If you don't have one yet, create one using
conda create -n robohive python=3
conda activate robohive
It is possible to install RoboHive with:
pip install -U robohive
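A quick sanity check after installation (a minimal sketch, assuming that importing robohive registers its environments with gym):
import robohive  # importing registers RoboHive's environments with gym (assumption)
import gym
env = gym.make("FrankaReachRandom-v0")
print(env.observation_space, env.action_space)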
How to install RoboHive in editable mode?
RoboHive uses git submodules to resolve dependencies. Please follow the steps exactly as below to install it correctly.
- Clone this repo with pre-populated submodule dependencies:
git clone --recursive https://github.com/vikashplus/robohive.git
Note: RoboHive aggressively uses git-submodules. Therefore, it is important to use the exact command above for installation. A basic understanding of how to work with submodules is expected.
- Install the package:
cd robohive
pip install -e .[a0]   # with a0 bindings for real-world robots
pip install -e .       # simulation only
- Export the PYTHONPATH:
export PYTHONPATH="<path/to/robohive>:$PYTHONPATH"
or add the repo to the PYTHONPATH by updating ~/.bashrc or ~/.bash_profile
- You can visualize the environments with random controls using the below command
$ python robohive/utils/examine_env.py -e FrankaReachRandom-v0
How to install RoboHive on Colab?
RoboHive has a detailed colab tutorial to walk you through various steps of installation to env-interactions. Give it a try.
Which MuJoCo bindings are supported in RoboHive? How can I pick MuJoCo bindings for RoboHive?
RoboHive supports both MuJoCo bindings: the MuJoCo bindings from DeepMind as well as the mujoco_py bindings from OpenAI. Set the environment variable sim_backend=MUJOCO or sim_backend=MUJOCO_PY to choose which binding to use.
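The variable can also be set from within a script before RoboHive is imported; a minimal sketch, assuming RoboHive reads sim_backend at import time:
import os
os.environ["sim_backend"] = "MUJOCO"   # or "MUJOCO_PY"
import robohive  # environments created after this point use the chosen binding (assumption)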
GLFW errors during visualization?
- If the visualization results in a GLFW error, this is because mujoco-py does not see some graphics drivers correctly. This can usually be fixed by explicitly loading the correct drivers before running any scripts. See this page for more details.
- If FFmpeg isn't found, run apt-get install ffmpeg on Linux or brew install ffmpeg on OSX (conda install ffmpeg causes some issues).
How to pick OpenGL rendering backends for onscreen/offscreen?
Rendering in RoboHive is directly handled by the simulation backend:
- mujoco_py: you can check the backend using python -c "import mujoco_py; print(mujoco_py.cymj)"
- mujoco: follow the instructions here
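For the DeepMind mujoco bindings, a common way to pick the OpenGL backend is the MUJOCO_GL environment variable, set before the first import (glfw for onscreen, egl or osmesa for headless/offscreen); a sketch:
import os
os.environ["MUJOCO_GL"] = "egl"   # or "osmesa" (headless) / "glfw" (onscreen)
import mujoco
print(mujoco.__version__)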
How to install hardware drivers to work with robots?
Please follow the instructions in RoboHive's robot class for hardware installation.
How to explore an environment onscreen using random actions?
RoboHive exposes a powerful utility script to examine environments. The following command can be used to examine an environment with random actions:
$ python robohive/utils/examine_env.py --env_name FrankaReachRandom-v0
Also try python robohive/utils/examine_env.py --help to view all available options.
How to explore an environment onscreen using actions from specified policy?
RoboHive exposes a powerful utility script to examine environments using actions from a policy. The following command can be used:
$ python robohive/utils/examine_env.py --env_name FrankaReachRandom-v0 --policy_path <path_to_policy>
The policy is expected to be a pickle file. The policy class is expected to expose an act = policy.get_actions(observations) method. See robohive/utils/examine_env.py for more details.
Also try python robohive/utils/examine_env.py --help to view all available options.
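If you need a starting point for the policy pickle, the sketch below saves a random policy that exposes the expected get_actions(observations) method; the class and action dimension are illustrative placeholders:
import pickle
import numpy as np

class RandomPolicy:
    """Illustrative policy in the format examine_env.py expects (placeholder class)."""
    def __init__(self, action_dim):
        self.action_dim = action_dim
    def get_actions(self, observations):
        # A learned policy would map observations to actions; here we return random actions.
        return np.random.uniform(-1.0, 1.0, size=self.action_dim)

with open("random_policy.pickle", "wb") as f:
    pickle.dump(RandomPolicy(action_dim=9), f)
Note that unpickling requires the policy class to be importable by the process that loads it, so keep the class definition in a module the script can find. The saved file can then be passed as --policy_path random_policy.pickle.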
How to explore an environment offscreen?
RoboHive exposes a powerful utility script to examine environments. The following command can be used to examine an environment offscreen:
$ python robohive/utils/examine_env.py --env_name FrankaReachRandom-v0 --render offscreen
Also try python robohive/utils/examine_env.py --help to view all available options for onscreen/offscreen rendering.
How to register a RoboHive environment?
RoboHive environments follow OpenAI's gym API, so an environment can be registered like any gym environment. We provide a detailed notebook explaining the process of registering and instantiating a RoboHive environment at robohive/tutorials/1_env_registration_and_customization.ipynb
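For reference, registration itself goes through the standard gym API; the entry point and kwargs below are hypothetical placeholders, and the notebook lists the actual class paths and arguments RoboHive uses:
from gym.envs.registration import register

register(
    id="MyFrankaReach-v0",                                       # your new env id
    entry_point="robohive.envs.arms.reach_base_v0:ReachBaseV0",  # hypothetical module:class path
    max_episode_steps=100,
    kwargs={},                                                   # env-specific configuration
)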
How to customize an existing RoboHive environment?
RoboHive environments follow OpenAI's gym API. We provide a detailed notebook explaining the process of customizing RoboHive's environments at robohive/tutorials/1_env_registration_and_customization.ipynb
How to interact with a RoboHive environment?
RoboHive environments follow OpenAI's gym API, and environment interactions are similar to any gym environment. We provide a detailed notebook at robohive/tutorials/2_env_interactions.ipynb explaining with examples how to interact with RoboHive's environments.
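In short, interaction looks like any gym rollout; a minimal sketch, assuming importing robohive registers the environments and the classic gym 4-tuple step API is in use:
import robohive  # importing registers RoboHive's environments (assumption)
import gym

env = gym.make("FrankaReachRandom-v0")
obs = env.reset()
for _ in range(100):
    action = env.action_space.sample()           # random action
    obs, reward, done, info = env.step(action)   # classic gym 4-tuple API
    if done:
        obs = env.reset()
env.close()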
How to access details (observation/ proprioception/ exteroception) from the RoboHive environment?
RoboHive supports three ways to query and access the environment based on the nature of the information required: observation / proprioception / exteroception. We provide a detailed notebook at robohive/tutorials/3_get_obs_proprio_extero.ipynb explaining with examples how to access these features.
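As a rough sketch (the accessor names below are assumptions based on the feature names; the notebook documents the exact API):
import robohive  # importing registers RoboHive's environments (assumption)
import gym

env = gym.make("FrankaReachRandom-v0")
obs = env.reset()                              # flat observation vector
proprio = env.unwrapped.get_proprioception()   # assumed accessor for proprioceptive signals
extero = env.unwrapped.get_exteroception()     # assumed accessor for exteroceptive signals (e.g. cameras)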
How to render images offscreen from a RoboHive simulation?
RoboHive simulations are pure MuJoCo models. We provide a tutorial (robohive/tutorials/render_cams.py) showing ways to render images offscreen and save them using a MuJoCo model.
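Outside the tutorial, a minimal offscreen render with the DeepMind mujoco bindings looks roughly like this (the model path is a placeholder):
import mujoco
import imageio

model = mujoco.MjModel.from_xml_path("path/to/model.xml")   # placeholder path
data = mujoco.MjData(model)
renderer = mujoco.Renderer(model, height=480, width=640)

mujoco.mj_forward(model, data)        # propagate state before rendering
renderer.update_scene(data)           # optionally pass camera=<name or id>
imageio.imwrite("frame.png", renderer.render())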
How to interact with RoboHive's Robot without the environment?
RoboHive provides an explicit Robot class that hides the differences between the sim and the real robot, allowing users to seamlessly switch between the two. We provide a tutorial (robohive/tutorials/examine_robot.py) to demonstrate how to use RoboHive's Robot class in isolation. This use-case is common in scenarios where the entire env definitions aren't required for experiments. In this tutorial, we demonstrate how to use RoboHive's Franka robot in the real world using specifications available in a hardware config file.
# python tutorials/examine_robot.py --sim_path PATH_robot_sim.xml --config_path PATH_robot_configurations.config
$ python tutorials/examine_robot.py --sim_path envs/arms/franka/assets/franka_reach_v0.xml --config_path envs/arms/franka/assets/franka_reach_v0.config
How to use RoboHive's inverse kinematics (with min jerk)?
RoboHive provides support for inverse kinematics. We provide a tutorial (robohive/tutorials/ik_minjerk_trajectory.py) to demonstrate how to use RoboHive's inverse kinematics with a min-jerk trajectory.
python tutorials/ik_minjerk_trajectory.py --sim_path envs/arms/franka/assets/franka_busbin_v0.xml
How to use RoboHive's teleoperation capabilities with keyboard/ spacenav?
RoboHive provides support for teleoperation. We provide a tutorial (robohive/tutorials/ee_teleop.py) to demonstrate how to use RoboHive's teleoperation capabilities over an environment to collect data.
python robohive/tutorials/ee_teleop.py -e rpFrankaRobotiqData-v0
How to use RoboHive's teleoperation capabilities with Oculus controller?
RoboHive provides support for teleoperation. We provide a tutorial (robohive/tutorials/ee_teleop_oculus.py) to demonstrate how to use RoboHive's teleoperation capabilities over an environment to collect data.
python tutorials/ee_teleop_oculus.py -e rpFrankaRobotiqData-v0
How do I set up RoboHive's teleoperation in sim/real?
RoboHive provides support for teleoperation. We provide a video tutorial to walk you through RoboHive's:
- installation
- teleoperation capabilities over an environment to collect data
- ease of switching between simulation and hardware teleoperation