
# AMOD18 Project Unicorn

Modified 2018-12-27 by MartaTintore

Project Unicorn is a project from the AMOD (Autonomous Mobility on Demand) course at ETH Zürich that focuses on intersection navigation for duckiebots in Duckietown.

This document provides instructions for running the intersection navigation demo on a duckiebot, as well as a basic overview of the project (in the section below).

The interested reader can find the source code in the Project Unicorn Intersection Navigation repository.

Requires: Duckiebot in DB18 configuration

Requires: Completed camera calibration

Requires: Completed wheel calibration

Requires: Local computer with Docker installation

### Demo workflow


The Intersection Navigation demo performs one of the following three intersection maneuvers:

• Left

• Right

• Straight

For each intersection type (3-way or 4-way), the algorithm randomly chooses a feasible direction.
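The random choice of a feasible direction can be sketched as follows. This is a minimal illustration only; the mapping from intersection type to feasible maneuvers shown here is an assumption for the sketch, not the project's actual data structure:

```python
import random

# Hypothetical mapping from intersection type to feasible maneuvers.
# At a 3-way (T-shaped) intersection approached from the stem, going
# straight is not feasible; at a 4-way, all three maneuvers are.
FEASIBLE = {
    "3-way": ["left", "right"],
    "4-way": ["left", "right", "straight"],
}

def choose_maneuver(intersection_type):
    """Pick a feasible maneuver uniformly at random."""
    return random.choice(FEASIBLE[intersection_type])
```

The same function accepts a user-specified command instead of a random choice simply by bypassing it when a command is given.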

Intersection Navigation and Lane following demo:

In addition to the official demo (“pure” intersection navigation), a demo combining lane following and intersection navigation can be run.

In this case, the duckiebot will perform one intersection maneuver and then switch to lane following automatically.

To avoid unnecessary computation while performing intersection navigation in our lane following image, the camera publishes on two separate topics: one for lane following and one for intersection navigation.

As a red line detection module was not operational during our project, the lane follower will not stop at the next intersection but will continue lane following.

## Video of expected results


The expected behavior should look like the following videos:

## Duckietown setup notes


The following is assumed:

## Laptop setup notes


Clone the duckietown-intnav repository to your PC:

laptop $ git clone --branch demo git@github.com:duckietown/duckietown-intnav.git

## Duckiebot setup notes

Requires: Completed intersection navigation calibration.

Accurate localization in the intersection area is crucial for successful navigation. The algorithm used is based, among other things, on reprojecting points from the camera frame to the world frame. Therefore, accurate localization requires an accurate camera calibration, especially with respect to scale.

Calibration test instructions:

## Pre-flight checklist

Check: The duckiebot has sufficient battery.

Check: The intersection is free of obstacles (including other duckiebots).

## Intersection Navigation demo instructions

## Additional demo: Intersection Navigation and Lane following demo instructions

## Troubleshooting

roslaunch not working properly.

Stop the process with Ctrl + C. Exit the container:

duckiebot $ exit


Restart the container from the terminal (go back to Step 2).

Duckiebot not moving.

Make sure the duckiebot can see AprilTags.

Duckiebot is not navigating the intersection properly (not moving smoothly or cutting corners).

Make sure the wheel calibration is done correctly.

Make sure the AprilTags are placed according to Figure 24.6 and Figure 24.8.

‘roslaunch xml error’ displayed.

Try to restart the container again (try 2-3 times). If the error is not fixed, re-flash your SD card.

Docker failed to register layer: ‘no space left on device’

Remove unused images on Portainer.

## Demo failure demonstration


The following video shows how the Intersection Navigation demo can fail when the assumptions are not respected.

## Project description


### Our Approach


As indicated in grey in Figure 24.16, intersection navigation proceeds through four main steps:

1- Estimation of the initial pose: Starting at a red line at any intersection, the duckiebot estimates its initial pose. The duckiebot's position relative to the AprilTags, calculated from the camera image (AprilTag 2: Efficient and Robust Fiducial Detection), is composed with the AprilTags' fixed, known poses in the global frame to obtain the duckiebot's initial pose in the global frame.

2- Trajectory generation: Paths for going left, right, and straight are pre-computed and chosen depending on the desired intersection exit. Given the intersection type and the intersection command (provided as user input or chosen at random), and taking into account the duckiebot's dynamic constraints and the Duckietown intersection boundaries, the appropriate trajectory is selected.

3- Pure pursuit controller: Constantly updating its pose estimate, the duckiebot follows the path in closed loop using the pure pursuit path tracking algorithm. The implementation is based on the paper Automatic Steering Methods for Autonomous Automobile Path Tracking.

4- Interface: Detects when the duckiebot has reached the end of the intersection (exit lane) and switches back to the Duckietown lane follower.
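The geometry behind steps 1 and 3 can be sketched in a few lines of plain Python. This is a simplified 2D illustration under stated assumptions, not the project's actual implementation: poses are taken as (x, y, heading) tuples, the tag pose and frame names are hypothetical, and the controller returns only a curvature command (the standard pure pursuit relation kappa = 2 * y_r / d^2, where y_r is the lateral offset of the lookahead point in the robot frame and d its distance):

```python
import math

def compose_se2(a, b):
    """Compose two 2D poses a ∘ b, each given as (x, y, theta)."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + math.cos(ath) * bx - math.sin(ath) * by,
            ay + math.sin(ath) * bx + math.cos(ath) * by,
            ath + bth)

def initial_pose(tag_pose_world, bot_pose_in_tag):
    """Step 1: compose the AprilTag's fixed global pose with the
    camera-derived pose of the duckiebot relative to that tag."""
    return compose_se2(tag_pose_world, bot_pose_in_tag)

def pure_pursuit_curvature(pose, lookahead_point):
    """Step 3: curvature command that steers the duckiebot toward a
    lookahead point on the pre-computed path."""
    x, y, th = pose
    lx, ly = lookahead_point
    dx, dy = lx - x, ly - y
    # Lateral offset of the lookahead point in the robot frame.
    y_r = -math.sin(th) * dx + math.cos(th) * dy
    d2 = dx * dx + dy * dy  # squared lookahead distance
    return 2.0 * y_r / d2
```

A lookahead point straight ahead yields zero curvature, while a point to the robot's left yields a positive (left-turning) curvature; the interface (step 4) would stop issuing these commands once the exit lane is reached.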
