Autoware.Auto with SVL Simulator

Overview

This guide describes setting up and using Autoware.Auto with the SVL Simulator. As Autoware.Auto is still under development, full self-driving is not yet possible, so this guide focuses on running the individual modules that have been implemented.

Setup

Requirements

  • Linux operating system
  • NVIDIA graphics card

Install Docker CE

To install Docker CE, please refer to the official documentation. We also suggest following the post-installation steps to run Docker as a non-root user.

Install NVIDIA Container Toolkit

Before installing the NVIDIA Container Toolkit, make sure that you have the appropriate NVIDIA drivers installed. To test if the NVIDIA drivers are properly installed, enter nvidia-smi in a terminal. If the drivers are installed properly, output similar to the following should appear:

| NVIDIA-SMI 440.59       Driver Version: 440.59       CUDA Version: 10.2     |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|   0  GeForce GTX 108...  Off  | 00000000:65:00.0  On |                  N/A |
|  0%   59C    P5    22W / 250W |   1490MiB / 11175MiB |      4%      Default |

| Processes:                                                       GPU Memory |
|  GPU       PID   Type   Process name                             Usage      |
|    0      1187      G   /usr/lib/xorg/Xorg                           863MiB |
|    0      3816      G   /usr/bin/gnome-shell                         305MiB |
|    0      4161      G   ...-token=7171B24E50C2F2C595566F55F1E4D257    68MiB |
|    0      4480      G   147MiB |
|    0     17936      G   ...-token=5299D28BAAD9F3087B25687A764851BB   103MiB |
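If you want to verify the driver from a setup script rather than by eye, the driver version can be parsed out of the nvidia-smi output. A minimal Python sketch (the helper name is our own):

```python
import re

def parse_driver_version(smi_output: str):
    """Extract the NVIDIA driver version from nvidia-smi output, or None."""
    match = re.search(r"Driver Version:\s*([\d.]+)", smi_output)
    return match.group(1) if match else None

# Sample line taken from the output shown above.
sample = "| NVIDIA-SMI 440.59       Driver Version: 440.59       CUDA Version: 10.2     |"
print(parse_driver_version(sample))  # -> 440.59
```

In practice you would feed it the real output, e.g. `parse_driver_version(subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout)`.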

The installation steps for the NVIDIA Container Toolkit are available from the official documentation.

Simulator installation

  • Download and extract the latest simulator release under the ~/adehome folder.
  • (Optional) Download the latest PythonAPI release (make sure the release version matches the simulator) and install it using pip:
cd PythonAPI
pip3 install --user .
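As a quick sanity check of where the PythonAPI will connect, you can resolve the simulator address from environment variables. A minimal sketch, assuming the LGSVL__SIMULATOR_HOST / LGSVL__SIMULATOR_PORT variable names used by the PythonAPI examples (verify them against your release):

```python
import os

def simulator_address():
    """Resolve the simulator host/port, falling back to local defaults."""
    # Env var names follow the SVL PythonAPI convention; double-check
    # them against the PythonAPI release you installed.
    host = os.environ.get("LGSVL__SIMULATOR_HOST", "127.0.0.1")
    port = int(os.environ.get("LGSVL__SIMULATOR_PORT", "8181"))
    return host, port

# With the PythonAPI installed you would then connect along the lines of:
#   import lgsvl
#   sim = lgsvl.Simulator(*simulator_address())
print(simulator_address())
```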

Install

Install ADE

cd ~/.local/bin
mv ade+x86_64 ade
chmod +x ade
./ade --version
./ade update-cli
./ade --version
mkdir -p ~/adehome
cd ~/adehome
touch .adehome

Download Autoware.Auto

Download Autoware.Auto under the ~/adehome folder.

cd ~/adehome
git clone

See the Installation and Development: Setup guide for more details.

Install ROS 2 LGSVL Bridge

There are two ways of installing ros2-lgsvl-bridge:

  • Using the Package Manager
  • Building source code

1. Using the Package Manager

# In the ade container
sudo apt update
sudo apt install ros-foxy-lgsvl-bridge

2. Building source code

cd ~/adehome/AutowareAuto/src/external
git clone
cd ros2-lgsvl-bridge
git checkout foxy-devel
git pull origin foxy-devel

Refer to the build instructions in the repo.

# In the ade container
cd ~/AutowareAuto
colcon build --packages-select lgsvl_bridge --cmake-args '-DCMAKE_BUILD_TYPE=Release'


source ~/AutowareAuto/src/external/ros2-lgsvl-bridge/install/setup.bash

Install ROS 2 LGSVL Messages

mkdir -p ~/adehome/AutowareAuto/src/external/
cd ~/adehome/AutowareAuto/src/external/
git clone
# In the ade container
cd ~/AutowareAuto
colcon build --cmake-args '-DCMAKE_BUILD_TYPE=Release'
# You may want to build only lgsvl_msgs package with the following command.
colcon build --packages-select lgsvl_msgs --cmake-args '-DCMAKE_BUILD_TYPE=Release'
cd ~/AutowareAuto
source install/setup.bash
ros2 interface list | grep lgsvl_msgs
# If you can see the list of lgsvl_msgs, they're ready to be used.

Run Simulator alongside Autoware.Auto

The ROS 2 LGSVL bridge allows the simulator and Autoware.Auto to communicate. To test this connection, we can visualize sensor data from the simulator in rviz2 (running in the ADE container).

Start the Autoware.Auto containers without NVIDIA setup:

cd ~/adehome/AutowareAuto
source .aderc-amd64-foxy-lgsvl
ade start

Start the Autoware.Auto containers with NVIDIA setup:

Create an .aderc file which has the NVIDIA setup:

vim ~/adehome/AutowareAuto/.aderc-amd64-foxy-lgsvl-nvidia


export ADE_DOCKER_RUN_ARGS="--cap-add=SYS_PTRACE --net=host --privileged --add-host ade: -e RMW_IMPLEMENTATION=rmw_cyclonedds_cpp --runtime=nvidia -ti --rm -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix \
-e NVIDIA_DRIVER_CAPABILITIES=compute,utility,display"
export ADE_IMAGES="

Start ADE container:

cd ~/adehome/AutowareAuto
source .aderc-amd64-foxy-lgsvl-nvidia
ade start
  • Enter the container and start rviz2:
ade enter
cd ~/AutowareAuto
colcon build
source ~/AutowareAuto/install/setup.bash
rviz2 -d /home/"${USER}"/AutowareAuto/install/autoware_auto_examples/share/autoware_auto_examples/rviz2/autoware.rviz
  • Start the SVL Simulator outside the ADE container by launching the executable, then click the OPEN BROWSER button to open the web UI.
(path/to/downloaded/simulator)/svlsimulator-linux64-2021.1/simulator
  • In the Vehicles tab under Library look for Lexus2016RXHybrid. If not available, see the Library page to add it.

    • Make sure that the Autoware.Auto sensor configuration has the ROS2 bridge and that all sensors are added.
    • Click Vehicles under Library on the left side, click Lexus2016RXHybrid, and then click Autoware.Auto in Sensor Configurations.
    • If you see an i mark next to a sensor name, click the Add to Library button to add the sensor plugins to your library.
  • Switch to the Simulations tab and click the Add new button:

    • Enter a Simulation Name and click Next.
    • Select Random Traffic in Runtime Template.
    • Select a map from the drop down menu. If none are available follow this guide to get a map.
    • Select the Lexus2016RXHybrid from the Vehicle drop-down menu.
    • Select the Autoware.Auto in Sensor Configuration and click Next.
  • Select Autoware.Auto (Apex.AI) in Autopilot, enter the bridge address (default: localhost:9090) in the Bridge IP box, and click Next.

    • Click Publish.
    • Press Run Simulation button.
  • Launch ROS 2 LGSVL bridge in a new terminal:

# In the ADE container
lgsvl_bridge

NOTE: the ROS 2 LGSVL bridge needs to stay running while the simulation is active.

You should now be able to see the LiDAR point cloud in rviz2.

If the point cloud is not visible, make sure the Fixed Frame (under Global Options) is set to lidar_front and that a PointCloud2 display is added which listens on the /lidar_front/points_raw topic.
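If rviz2 shows nothing at all, it is also worth confirming that the bridge port (9090 by default, as entered in the Bridge IP box) is actually accepting connections. A minimal sketch using a plain TCP probe (the helper name is our own):

```python
import socket

def bridge_reachable(host: str = "localhost", port: int = 9090,
                     timeout: float = 2.0) -> bool:
    """Return True if a TCP listener (e.g. lgsvl_bridge) accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(bridge_reachable())  # False unless lgsvl_bridge is running locally
```

This only checks that the port is open; it does not validate the bridge protocol itself.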

Run Simulator alongside Autoware.Auto in ADE docker container

1. Build ADE docker container

Copy the following Dockerfile into ~/adehome/AutowareAuto/tools/ade_image folder.

FROM ubuntu:20.04
CMD ["bash"]


RUN apt update && apt install locales
RUN locale-gen en_US en_US.UTF-8
RUN update-locale LC_ALL=en_US.UTF-8 LANG=en_US.UTF-8
ENV LANG=en_US.UTF-8

# tz America/Los_Angeles
ENV TZ=America/Los_Angeles
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN apt-get install -y tzdata

RUN apt update && apt install -y curl gnupg2 lsb-release python3-pip gettext
RUN curl -sSL  -o /usr/share/keyrings/ros-archive-keyring.gpg

RUN echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] $(lsb_release -cs) main" | tee /etc/apt/sources.list.d/ros2.list > /dev/null
# /

# $ROS_DISTRO is referenced below; set it explicitly for the image build.
ENV ROS_DISTRO=foxy

RUN apt-get update && \
    apt-get install -y \
      python3-vcstool \
      ros-$ROS_DISTRO-cyclonedds \
      ros-$ROS_DISTRO-rmw-cyclonedds-cpp && \
    rm -rf /var/lib/apt/lists/* #/tmp/ros-deps

COPY apt-packages /tmp/
RUN apt-get update && \
    apt-get install -y \
        $(cat /tmp/apt-packages | cut -d# -f1 | envsubst) \
    && rm -rf /var/lib/apt/lists/* /tmp/apt-packages

RUN echo 'ALL ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers
RUN echo 'Defaults env_keep += "DEBUG ROS_DISTRO"' >> /etc/sudoers

COPY pip3-packages /tmp/
RUN pip3 install -U \
        $(cut -d# -f1 </tmp/pip3-packages) \
    && rm -rf /root/.cache /tmp/pip-* /tmp/pip3-packages

RUN git clone && mkdir -p udpreplay/build \
      && cd udpreplay/build && cmake .. && make && make install \
      && cd - && rm -rf udpreplay/

COPY bashrc-git-prompt /
RUN cat /bashrc-git-prompt >> /etc/skel/.bashrc && \
    rm /bashrc-git-prompt
COPY gdbinit /etc/gdb/

# ===================== CLEAN UP ZONE ===================== #
# Commands in the cleanup zone will be cleaned up before every release
# and put into the correct place.
RUN apt-get update \
  && apt-get install -y \
    unzip \
    ca-certificates \
    libx11-6 \
    libxau6 \
    libxcb1 \
    libxdmcp6 \
    libxext6 \
    libvulkan1 \
    libgl1 \
    libgtk2.0-0 \
    vulkan-utils \
    xdg-utils \
  && apt-get clean

ADD "" /etc/vulkan/icd.d/nvidia_icd.json
RUN chmod 644 /etc/vulkan/icd.d/nvidia_icd.json

# ===================== END OF CLEAN UP ZONE ===================== #

# Add print url into tty part.
COPY print-tty.desktop /usr/share/applications/print-tty.desktop
COPY /usr/bin/
RUN chmod 755 /usr/bin/
RUN apt-get update \
  && apt-get install -y \
    ros-$ROS_DISTRO-lgsvl-bridge \
  && apt-get clean
# /Add print url into tty part.

# Do full package upgrade as last step
# to avoid disrupting layer caching
RUN apt-get update && \
    apt-get -y dist-upgrade && \
    rm -rf /var/lib/apt/lists/*

COPY /etc/profile.d/
COPY gitconfig /etc/gitconfig
COPY entrypoint /ade_entrypoint
COPY colcon-defaults.yaml /usr/local/etc/colcon-defaults.yaml
RUN echo "export COLCON_DEFAULTS_FILE=/usr/local/etc/colcon-defaults.yaml" >> /etc/skel/.bashrc
ENTRYPOINT ["/ade_entrypoint"]
CMD ["/bin/sh", "-c", "trap 'exit 147' TERM; tail -f /dev/null & wait ${!}"]

Copy the following print-tty.desktop file into ~/adehome/AutowareAuto/tools/ade_image folder.

[Desktop Entry]
GenericName=Print On Tty
Comment=Prints text on first tty
TryExec=echo %F

Copy the following file into ~/adehome/AutowareAuto/tools/ade_image folder.


#!/bin/sh
echo "$@" >/dev/tty

Build the docker image and tag it:

cd ~/adehome/AutowareAuto/tools/ade_image
docker build -t .

2. Build and start ADE docker container

Copy the following .aderc-amd64-foxy-lgsvl-nvidia into ~/adehome/AutowareAuto folder:

export ADE_DOCKER_RUN_ARGS="--cap-add=SYS_PTRACE --net=host --privileged --add-host ade: -e RMW_IMPLEMENTATION=rmw_cyclonedds_cpp --runtime=nvidia -ti --rm -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix \
-e NVIDIA_DRIVER_CAPABILITIES=compute,utility,display"
export ADE_IMAGES="

If you have a simulator docker container, you may include it in the .aderc-amd64-foxy-lgsvl-nvidia file; if you don't, you can mount a local simulator into /opt/lgsvl in the ADE container when you start it.

Source the .aderc-amd64-foxy-lgsvl-nvidia:

source .aderc-amd64-foxy-lgsvl-nvidia


Extract the simulator release into your home folder:

unzip -d ~

Start ade:

ade start -- -e DISPLAY -e XAUTHORITY=/tmp/.Xauthority -v ${XAUTHORITY}:/tmp/.Xauthority -v /tmp/.X11-unix:/tmp/.X11-unix -v /var/run/docker.sock:/var/run/docker.sock -v /tmp/LGElectronics/:/tmp/LGElectronics/ -v ~/svlsimulator-linux64-2021.2.1:/opt/lgsvl

3. Run simulator in docker container

ade enter
lgsvl_bridge &

Click the link to cluster button in the simulator; the URL will be printed on the terminal.

Copy and paste the URL into your web browser.