Autonomous Mobile Bot Part 1: Autonomous SLAM

This is part of a series of posts where I document my progress in building an autonomous mobile bot from scratch. The ultimate goal of this project is to build a robot that can accomplish the following tasks:

  • Autonomously map an indoor space using SLAM and save the generated map
  • Load the generated map file and localize the robot inside the map
  • Move the robot around by sending move commands from a GUI that shows the map

Background:
At this point in the project, I had set up the simulation environment for the robot in Gazebo. I had the robot model in both .urdf (for RViz) and .sdf (for Gazebo) formats. The robot was a two-wheeled robot with wheel encoders, an IMU, and a LiDAR, all added through Gazebo plugins (diff_drive, imu, lidar).

Robot model in RViz

The simulation environment in which the robot drove around was the turtlebot3_world, which comes with the TurtleBot3 simulation packages for ROS2.

turtlebot3_world in Gazebo

SLAM was accomplished using the Nav2 and slam_toolbox packages. I decided to use prebuilt packages since implementing an efficient and robust SLAM algorithm from scratch requires deep expertise in several fields, such as nonlinear state estimation.

Now that the simulation environment and SLAM had been set up, I had to implement an algorithm to accomplish autonomous SLAM. That means the algorithm had to take in a partially-completed map of the environment (constantly updated by SLAM), identify unexplored areas, and send move commands toward those areas.
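Before getting into the design, here is a minimal sketch of how such a node could be wired up in ROS2. The class and member names are my own illustration, not the actual implementation; the only assumptions are the standard map topic published by slam_toolbox and the navigate_to_pose action served by Nav2:

#include <chrono>
#include <rclcpp/rclcpp.hpp>
#include <rclcpp_action/rclcpp_action.hpp>
#include <nav_msgs/msg/occupancy_grid.hpp>
#include <nav2_msgs/action/navigate_to_pose.hpp>

class ExploreNode : public rclcpp::Node
{
public:
    using NavigateToPose = nav2_msgs::action::NavigateToPose;

    ExploreNode() : Node("explore_node")
    {
        // latest map from slam_toolbox; replaced whenever SLAM republishes
        mapSub = create_subscription<nav_msgs::msg::OccupancyGrid>(
            "map", 10,
            [this](nav_msgs::msg::OccupancyGrid::SharedPtr msg) { map = msg; });

        // Nav2 action client used to send move commands
        navClient = rclcpp_action::create_client<NavigateToPose>(this, "navigate_to_pose");

        // periodically re-plan towards the best frontier
        timer = create_wall_timer(std::chrono::seconds(1), [this]() { explore(); });
    }

private:
    void explore()
    {
        if (!map) return;
        // 1. find frontier islands in the current map
        // 2. pick the lowest-cost island
        // 3. send its centroid as a NavigateToPose goal
    }

    nav_msgs::msg::OccupancyGrid::SharedPtr map;
    rclcpp::Subscription<nav_msgs::msg::OccupancyGrid>::SharedPtr mapSub;
    rclcpp_action::Client<NavigateToPose>::SharedPtr navClient;
    rclcpp::TimerBase::SharedPtr timer;
};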

Overall Design:

I got inspiration from this GitHub repository: https://github.com/robo-friends/m-explore-ros2, which contains code that accomplishes autonomous SLAM.

The general design ideas I took from this implementation were:
  1. Use Breadth-First Search (BFS) to find all cells that are adjacent to at least one cell with unknown occupancy (I will call these cells frontier cells from now on).
  2. Upon encountering a frontier cell, use BFS again to find all connected frontier cells.
  3. Store each "island" of frontier cells as an individual struct.
  4. Navigate the robot to the centroid of the island of frontier cells.

This method made sense because my robot had a 360-degree LiDAR. Thus, navigating to the center of connected frontier cells would allow the robot to map the environment with only a few move commands.
Struct to store the island of frontier cells:
struct Frontiers
{
    double min_distance;   // distance from the robot to the closest cell in the island
    double cost;           // computed by frontierCost() (see below)
    geometry_msgs::msg::Point centroid;              // mean position of the island's cells
    std::vector<geometry_msgs::msg::Point> points;   // world coordinates of every frontier cell
};
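To show where the snippets below fit, here is an outline of the search function that drives it all. The name searchFrom and the helper worldToIndex are hypothetical, for illustration only; the two BFS snippets that follow form its body:

std::vector<Frontiers> FrontierSearch::searchFrom(const geometry_msgs::msg::Pose& startPose)
{
    std::vector<Frontiers> frontiersList;
    std::vector<bool> visited(mapData.size(), false);

    // seed the BFS with the map cell under the robot
    std::queue<unsigned int> bfs;
    unsigned int start = worldToIndex(startPose);   // hypothetical helper
    bfs.push(start);
    visited[start] = true;

    // ... outer BFS over free cells (shown below) fills frontiersList ...

    return frontiersList;
}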

BFS that searches for a frontier cell:
// use BFS to traverse through cells whose value is at most MAX_NON_OBSTACLE to find a frontier cell
// if a frontier cell is found, use BFS (again) to find all connected frontier cells via buildNewFrontiers
// buildNewFrontiers returns a Frontiers struct, which contains a vector of frontier points
// essentially, the Frontiers object represents an island of frontier points
// the robot is made to traverse towards the centroid of the island
while (!bfs.empty())
{
    unsigned int idx = bfs.front();
    bfs.pop();

    for (unsigned int& nbr : nhood4(idx))
    {
        if (!visited[nbr] && mapData[nbr] <= MAX_NON_OBSTACLE)
        {
            if (isNewFrontier(nbr))
            {
                frontiersList.push_back(buildNewFrontiers(nbr, visited, startPose));
            }
            bfs.push(nbr);
        }
        visited[nbr] = true;
    }
}
bool FrontierSearch::isNewFrontier(unsigned int idx)
{
    // if any of its neighboring cells is unknown, it is a frontier cell
    for (unsigned int& nbr : nhood8(idx))
    {
        if (mapData[nbr] == NO_INFORMATION)
        {
            return true;
        }
    }
    return false;
}
  • A frontier cell is a cell with at least one neighboring cell whose occupancy is unknown
  • nhood4 and nhood8 are helper functions that return the 4 or 8 neighboring cells (a possible implementation is sketched after this list)
    • up, down, left, right for nhood4
    • up, upper-right, right, lower-right, down, lower-left, left, upper-left for nhood8
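A possible implementation of these helpers, assuming the usual row-major grid layout (mapWidth and mapSize are hypothetical members holding the grid width and total cell count; the bounds checks keep neighbors inside the map):

std::vector<unsigned int> FrontierSearch::nhood4(unsigned int idx)
{
    std::vector<unsigned int> out;
    if (idx % mapWidth > 0)            out.push_back(idx - 1);         // left
    if (idx % mapWidth < mapWidth - 1) out.push_back(idx + 1);         // right
    if (idx >= mapWidth)               out.push_back(idx - mapWidth);  // up
    if (idx + mapWidth < mapSize)      out.push_back(idx + mapWidth);  // down
    return out;
}

std::vector<unsigned int> FrontierSearch::nhood8(unsigned int idx)
{
    // the 4-neighborhood plus the four diagonals
    std::vector<unsigned int> out = nhood4(idx);
    if (idx % mapWidth > 0 && idx >= mapWidth)
        out.push_back(idx - mapWidth - 1);   // upper-left
    if (idx % mapWidth < mapWidth - 1 && idx >= mapWidth)
        out.push_back(idx - mapWidth + 1);   // upper-right
    if (idx % mapWidth > 0 && idx + mapWidth < mapSize)
        out.push_back(idx + mapWidth - 1);   // lower-left
    if (idx % mapWidth < mapWidth - 1 && idx + mapWidth < mapSize)
        out.push_back(idx + mapWidth + 1);   // lower-right
    return out;
}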
BFS that finds all the connected frontier cells:
Frontiers FrontierSearch::buildNewFrontiers(unsigned int startIdx, std::vector<bool>& visited,
                                            const geometry_msgs::msg::Pose& robotPose)
{
    // get all connected frontiers and return the Frontiers struct
    // by traversing through cells less than MAX_NON_OBSTACLE
    Frontiers frontiers;
    frontiers.min_distance = std::numeric_limits<double>::infinity();

    std::queue<unsigned int> bfs;
    bfs.push(startIdx);
    visited[startIdx] = true;

    while (!bfs.empty())
    {
        unsigned int idx = bfs.front();
        bfs.pop();

        // change map index into map coordinates and store it in frontiers
        unsigned int mx, my;
        double wx, wy;
        indexToCells(idx, mx, my);
        mapToWorld(mx, my, wx, wy);
        geometry_msgs::msg::Point p;
        p.x = wx;
        p.y = wy;
        frontiers.points.push_back(p);

        // calculating centroid of frontiers
        frontiers.centroid.x += wx;
        frontiers.centroid.y += wy;

        // getting minimum distance from robot pose
        double distance = distanceFormula(robotPose.position.x, robotPose.position.y, wx, wy);
        if (frontiers.min_distance > distance)
        {
            frontiers.min_distance = distance;
        }

        for (unsigned int& nbr : nhood8(idx))
        {
            if (!visited[nbr] && isNewFrontier(nbr) && mapData[nbr] <= MAX_NON_OBSTACLE)
            {
                bfs.push(nbr);
            }
            visited[nbr] = true;
        }
    }
    frontiers.centroid.x /= frontiers.points.size();
    frontiers.centroid.y /= frontiers.points.size();

    return frontiers;
}
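For reference, indexToCells and mapToWorld convert a flat array index into grid coordinates and then into world coordinates. A minimal sketch, assuming the standard row-major OccupancyGrid layout (mapWidth, originX, originY, and resolution are hypothetical members mirroring the map metadata):

void FrontierSearch::indexToCells(unsigned int idx, unsigned int& mx, unsigned int& my)
{
    my = idx / mapWidth;   // row
    mx = idx % mapWidth;   // column
}

void FrontierSearch::mapToWorld(unsigned int mx, unsigned int my, double& wx, double& wy)
{
    // offset by half a cell so the point lands at the cell's center
    wx = originX + (mx + 0.5) * resolution;
    wy = originY + (my + 0.5) * resolution;
}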

After finding all the frontier islands, the algorithm greedily chose the frontier island with the least cost. The cost of each frontier island was calculated like so:
double FrontierSearch::frontierCost(const Frontiers& frontier)
{
    return (distanceWeight * frontier.min_distance * resolution) -
           (sizeWeight * frontier.points.size() * resolution);
}
Here, min_distance is the distance between the robot's current position and the closest frontier cell in the frontier island, and points.size() is the number of frontier cells in the island. The cost combines these two metrics, each multiplied by a weight that reflects its relative importance. In plain English, the robot is set to explore frontier islands that are both close to the robot and contain many frontier cells.
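Putting it together, the selection step could look like the sketch below (navClient and NavigateToPose come from the hypothetical node skeleton shown earlier):

// score every island, then navigate to the centroid of the cheapest one
for (Frontiers& f : frontiersList)
{
    f.cost = frontierCost(f);
}
std::sort(frontiersList.begin(), frontiersList.end(),
          [](const Frontiers& a, const Frontiers& b) { return a.cost < b.cost; });

if (!frontiersList.empty())
{
    NavigateToPose::Goal goal;
    goal.pose.header.frame_id = "map";
    goal.pose.pose.position = frontiersList.front().centroid;
    navClient->async_send_goal(goal);
}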

Each time the SLAM algorithm sends an updated map over a ROS topic, all the frontier islands are found and sorted according to their cost. The algorithm then picks the frontier island with the least cost and travels towards its centroid. The process is shown in the video below:

Results:

Analysis:

I learned that the values of distanceWeight and sizeWeight matter a lot in determining the time it takes to complete the autonomous mapping process.

Top video: sizeWeight > distanceWeight
Bottom video: sizeWeight < distanceWeight

When sizeWeight was higher than distanceWeight, the robot changed course in the middle of mapping a room and started moving to a different room. As a room is mapped, the number of frontier cells in it decreases, so its cost is no longer the lowest among the frontier islands. With a higher sizeWeight, the robot prioritizes the size of frontier islands and changes course to a different island with more frontier cells without finishing the current room first. With this configuration, the robot left small frontier islands scattered throughout the map and had to revisit them after all the large islands were mapped, which was inefficient. The problem was fixed by making distanceWeight higher: the robot then prioritized closer frontier islands and finished mapping an entire room before moving on to the next one.
