A navigation simulator (navsim) API built on top of Python and PyTorch.

In the future, navsim may be compatible with a variety of simulators, but for now it uses the A Realistic Open environment for Rapid Agent training (ARORA) simulator, a highly attributed Berlin city environment based on the Unity3D game engine.

You can either run navsim inside a container or directly on the host without a container.
Clone the navsim repo:

```
git clone --recurse-submodules [email protected]:ucf-sttc/navsim.git
```

All further commands should be run inside the navsim repo:

```
cd navsim
```
Please make sure to install the following as per the latest instructions that work for your system. For guidance, links to the instructions that worked for us are provided.

- Install the nvidia driver
- Install Docker Engine. Do not install Docker Desktop; navsim has been tested and does not work with Docker Desktop.
- Install the nvidia container toolkit
To check that the dependencies are working properly for your user:

```
docker run --rm --runtime=nvidia --gpus all debian:11.6-slim nvidia-smi
```

You should see the nvidia-smi output.
The user inside the container, `ezdev`, comes with an id of 1000:1000. If you want this user to be able to read and write files as per your uid:gid, then run the following command to fix the id of the user inside the container:

```
DUID=`id -u` DGID=`id -g` docker compose build navsim-fixid
```

You can also specify the image to fix as an env variable:

```
IMAGE=ghcr.io/ucf-sttc/navsim/navsim:<version> DUID=`id -u` DGID=`id -g` docker compose build navsim-fixid
```
You will see output similar to the following:

```
armando@thunderbird:~/workspaces/navsim$ docker compose build navsim-fixid
[+] Building 5.5s (8/8) FINISHED
 => [internal] load .dockerignore 0.1s
 => => transferring context: 2B 0.0s
 => [internal] load build definition from Dockerfile-navsim-fixid 0.0s
 => => transferring dockerfile: 440B 0.0s
 => [internal] load metadata for ghcr.io/ucf-sttc/navsim/navsim 0.0s
 => [1/4] FROM ghcr.io/ucf-sttc/navsim/navsim 1.1s
 => [2/4] RUN id ezdev 0.5s
 => [3/4] RUN usermod -u 1003 ezdev && groupmod -g 1003 ezdev 2.5s
 => [4/4] RUN id ezdev 0.7s
 => exporting to image 0.4s
 => => exporting layers 0.3s
 => => writing image sha256:e69a490b875892bdbb5498797dcef3aa4551223 0.0s
 => => naming to ghcr.io/ucf-sttc/navsim/navsim 0.0s
```
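To verify that the fix took effect, you can check the id of `ezdev` in the rebuilt image (a quick check; this assumes the `navsim` service uses the rebuilt image and accepts an arbitrary command):

```
docker compose run --rm navsim id
```

You should see your own uid:gid in the output.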
Modify lines 6-8 of `navsim/docker-compose.yml` for paths specific to your system:

- `$HOME/exp/`: experiments are run in this folder
- `$HOME/unity-envs/`: Unity-based standalone binaries are kept here
- `/data`: this is where all the above symlinked folders are present. Please note line 7 of `docker-compose.yml`: on our systems, all our folders such as `workspaces/navsim`, `exp` and `unity-envs` reside in `/data` and are symlinked in the home folder, so the `/data` folder also has to be mounted, else the symlinks won't work in the container. If you are not using symlinks, then you can remove this line.
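If you do use symlinks like we do, the layout can be created with something like the following sketch (the paths are illustrative; adapt them to your system):

```
# Real folders live under /data ...
mkdir -p /data/exp /data/unity-envs
# ... and are symlinked into the home folder. This is why /data must also
# be mounted in the container: otherwise the symlink targets don't exist there.
ln -s /data/exp "$HOME/exp"
ln -s /data/unity-envs "$HOME/unity-envs"
```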
Run the following commands to test that everything works fine:

```
docker compose run --rm navsim-test
docker compose run --rm navsim navsim --help
```

In the following test command, replace `ARORA/ARORA.x86_64` with the path to your Unity binary that you mapped in the `x-data: &data` section of docker-compose in the above instructions. In our case it is the folder name and binary after `$HOME/unity-envs`: `ARORA/ARORA.x86_64`.

```
docker compose run --rm navsim navsim --plan --env arora-v0 --show_visual --env_path /unity-envs/ARORA/ARORA.x86_64
```
Run the following command (remove `-d` after `run` to run it in the foreground):

```
docker compose run -d --rm navsim <navsim command>
```

`<navsim command>` is described in the section below.
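For example, to start a training run in the background (the run id is illustrative; the binary path is the one mapped earlier):

```
docker compose run -d --rm navsim \
  navsim --run_id demo_run --env_path /unity-envs/ARORA/ARORA.x86_64
```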
- Install `mamba 4.12.0-3`: https://github.com/conda-forge/miniforge/releases/download/4.12.0-3/Mambaforge-4.12.0-3-Linux-x86_64.sh
- Run:

  ```
  cd /path/to/navsim/repo
  mamba create -n navsim python==3.8.16
  mamba env update -n navsim -f pyenv/navsim.yml
  conda activate navsim
  ./install-repo.sh
  ```
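  After the install completes, a quick sanity check (this assumes `install-repo.sh` puts the `navsim` CLI on your PATH in the activated env):

  ```
  navsim --help
  ```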
- Read the `navsim_envs` tutorial to use and test the `navsim_envs`.
- Run `jupyter notebook`. The notebooks are in the `examples` folder.
- Run the `<navsim command>` described in the section below.
- `navsim --plan --env arora-v0 --show_visual --env_path /unity-envs/<path-to-arora-binary>`. For example, `<path-to-arora-binary>` in our case is the folder name and binary after `$HOME/unity-envs` that we mapped in line 5 of docker-compose earlier: `ARORA/ARORA.x86_64`.
- `navsim_env_test min_env_config.yml`
- `navsim --help` shows the options.
- `navsim --run_id $run_id --env_path $envdir/$envbin`: executes and/or trains the model (see the worked example after this list).
- `navsim-benchmark $envdir/$envbin`: benchmarks the model.
- `navsim-saturate-gpu $envdir/$envbin`: saturates the GPU.
- Replace the navsim command with your own command if you are just importing the NavSim env and have your own code in the experiment directory.
Issue: you may see Vulkan loader errors similar to the following:

```
ERROR: [Loader Message] Code 0 : /usr/lib/x86_64-linux-gnu/libvulkan_radeon.so: cannot open shared object file: No such file or directory
No protocol specified
No protocol specified
ERROR: [Loader Message] Code 0 : loader_scanned_icd_add: Could not get 'vkCreateInstance' via 'vk_icdGetInstanceProcAddr' for ICD libGLX_nvidia.so.0
ERROR: [Loader Message] Code 0 : /usr/lib/i386-linux-gnu/libvulkan_intel.so: cannot open shared object file: No such file or directory
ERROR: [Loader Message] Code 0 : /usr/lib/i386-linux-gnu/libvulkan_radeon.so: cannot open shared object file: No such file or directory
ERROR: [Loader Message] Code 0 : /usr/lib/x86_64-linux-gnu/libvulkan_intel.so: cannot open shared object file: No such file or directory
ERROR: [Loader Message] Code 0 : /usr/lib/i386-linux-gnu/libvulkan_lvp.so: cannot open shared object file: No such file or directory
ERROR: [Loader Message] Code 0 : /usr/lib/x86_64-linux-gnu/libvulkan_lvp.so: cannot open shared object file: No such file or directory
Cannot create Vulkan instance.
This problem is often caused by a faulty installation of the Vulkan driver or attempting to use a GPU that does not support Vulkan.
ERROR at /build/vulkan-tools-oFB8Ns/vulkan-tools-1.2.162.0+dfsg1/vulkaninfo/vulkaninfo.h:666:vkCreateInstance failed with ERROR_INCOMPATIBLE_DRIVER
```
Solution: to fix this error, update your nvidia driver and fix the user id inside the container, as follows:

- Check your nvidia driver with the following commands: `sudo apt list --installed | grep nvidia-driver` and `nvidia-smi`.
For example, on our laptop:

```
armando@thunderbird:~/workspace/navsim$ sudo apt list --installed | grep nvidia-driver
WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
nvidia-driver-470/focal-updates,focal-security,now 470.182.03-0ubuntu0.20.04.1 amd64 [installed]

armando@thunderbird:~/workspace/navsim$ nvidia-smi
Sun May 14 10:53:30 2023
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.182.03 Driver Version: 470.182.03 CUDA Version: 11.4 |
|-------------------------------+----------------------+----------------------+
```
Reinstall the nvidia driver or update it to the latest one:

```
sudo apt update
sudo apt install nvidia-driver-530
sudo reboot
```
If you don't reboot after installing the driver, you will get the following error:

```
Failed to initialize NVML: Driver/library version mismatch
```
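After rebooting, confirm the new driver is active:

```
nvidia-smi
```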
- Update the id inside the container as per the section on fixing the user id inside the container.
- Switch to the feature branch
- Modify the code
- Test it in the container
- Switch to the feature branch
- Run:

  ```
  docker compose run navsim-dev bash
  cd navsim/docs
  make html latexpdf
  ```

- The `pdf` and `html` versions are inside the `docs/build/...` folders.
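To view the generated HTML docs from the host (a sketch; this assumes Sphinx's default output location under `docs/build/html`):

```
xdg-open docs/build/html/index.html
```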
- Switch to the feature branch
- Update the version:
  - Modify `version.txt`
  - Modify the image version in:
    - `docker-compose.yml`
    - `.github/workflows/deploy-docs.yml`
- Build the container:

  ```
  docker compose build navsim-build
  DUID=`id -u` DGID=`id -g` docker compose build navsim-fixid
  ```

- Test the container
- Commit and push the changes to the feature branch
- Create a pull request to the main branch
- Switch to the main branch: `git checkout main`
- Merge the pull request
- Build and push the container:

  ```
  docker compose build navsim-build
  docker compose push navsim-build
  ```

- Run the docs workflow in GitHub manually
- `git tag vx.x.x` and `git push --tags`
- Create a release in GitHub with the tag
- Use only Google style to document your code: https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html#example-google
- `/opt/navsim`: navsim code folder
To give a zip of the repo to someone, run the following command:

```
zip -FSr repo.zip \
  navsim-lab navsim-envs \
  navsim-mlagents/ml-agents-envs \
  navsim-mlagents/gym-unity \
  version.txt \
  install-repo.sh \
  examples \
  [email protected]
```