Random Search in a Small Environment

The goal of this challenge is to demonstrate a robot's capability to move in a cluttered environment, with the possibility to visualize what the robot sees.

Expected

The robot is positioned somewhere in a closed area (i.e. an area bounded with obstacles).

The robot moves continuously in this area while avoiding obstacles (i.e. area limits and some obstacles randomly set in the area).

The sensor data (scan, vision) and potentially other information (considered obstacles, frames, ...) are visible in rviz.

The robot's trajectory should allow it to explore the entire area. In other words, the robot will eventually go everywhere (i.e. the probability that the robot reaches any specific reachable position in the area tends to 1 as time goes to infinity).

A first approach can be to develop a ricochet robot that changes its direction randomly each time an obstacle prevents it from moving forward.
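The ricochet behavior can be sketched as a small decision function. This is a minimal, hypothetical sketch of the logic only: the threshold and speed values are assumptions, and in a real rclpy node this function would be called from a sensor_msgs/LaserScan callback, with the resulting pair published as a geometry_msgs/Twist velocity command.

```python
import random

# Assumed tuning values (not imposed by the challenge).
SAFETY_DISTANCE = 0.4   # metres before the path is considered blocked
FORWARD_SPEED = 0.2     # m/s when the way is clear
TURN_SPEED = 1.0        # rad/s while ricocheting

def ricochet_command(front_ranges):
    """Return a (linear, angular) velocity pair from the frontal scan ranges.

    front_ranges: distances (in metres) measured in the sector ahead of
    the robot; non-positive values are treated as invalid readings.
    """
    blocked = any(r < SAFETY_DISTANCE for r in front_ranges if r > 0.0)
    if blocked:
        # Obstacle ahead: stop moving forward and rotate in a random direction.
        return 0.0, random.choice([-TURN_SPEED, TURN_SPEED])
    # Way is clear: go straight.
    return FORWARD_SPEED, 0.0
```

Calling this at every scan keeps the robot moving and, thanks to the random turn direction, lets it spread over the whole area in the long run.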

Instructions

Each group commits the minimal required files in a specific grp_pibotXX ROS 2 package inside their Git repository.

Release: Thursday afternoon of week-3 (Monday 14-01-2025)

The required files:

  • At the root of the repository, a README.md file in Markdown syntax introducing the project.
  • A directory grp_pibotXX matching the ROS2 package where the elements will be found (XX matches the number of the pibot).
  • Inside the grp_pibotXX package, a launch file simulation_v1_launch.yaml starting the appropriate nodes for demonstrating in the simulation.
  • Then, a launch file tbot_v1_launch.yaml starting the appropriate nodes for demonstrating with a tbot.
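
ROS 2 accepts launch files written in its YAML frontend, so simulation_v1_launch.yaml could look like the sketch below. This is only an illustration under assumptions: the executable name reactive_move and the rviz config path are hypothetical placeholders to be replaced by your own nodes and files.

```yaml
# Hypothetical sketch of simulation_v1_launch.yaml
# (node and package names are assumptions, to adapt to your project).
launch:
- node:
    pkg: "grp_pibotXX"
    exec: "reactive_move"        # the ricochet / obstacle-avoidance node
- node:
    pkg: "rviz2"
    exec: "rviz2"
    args: "-d config/simulation.rviz"
```

The tbot_v1_launch.yaml file would follow the same structure, starting the robot drivers instead of the simulator-side nodes.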

Criteria

  1. The group follows the instructions (i.e. the repository is organized as expected)
  2. The robot behavior is safe (no collision with any obstacles)
  3. rviz2 is started and well configured.
  4. The robot moves everywhere in its environment.
  5. A String message is sent on a detection topic each time a green ghost is detected in front of the robot (at this stage, we do not care about false positives).
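
For criterion 5, the detection-to-message logic can be sketched as below. This is a hypothetical, simplified sketch: a real node would threshold the camera image in HSV with OpenCV and publish a std_msgs/String from rclpy, whereas here the image is a plain list of (r, g, b) pixels and the ratio threshold is an assumption, so the idea stays self-contained.

```python
# Assumed fraction of green pixels that counts as a ghost sighting.
GREEN_RATIO = 0.05

def is_green(pixel):
    """Crude green test on an (r, g, b) tuple (assumed thresholds)."""
    r, g, b = pixel
    return g > 100 and g > 1.5 * r and g > 1.5 * b

def ghost_message(pixels):
    """Return the String payload to publish, or None when nothing is seen."""
    if not pixels:
        return None
    ratio = sum(1 for p in pixels if is_green(p)) / len(pixels)
    if ratio >= GREEN_RATIO:
        return "green ghost detected"
    return None
```

In the node, a non-None return value would be wrapped in a std_msgs/String and published on the detection topic.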

Evaluation protocol (for evaluators...)

Here is the evaluation protocol that will be applied. It is highly recommended to run through it yourself before submitting...

  1. Clone the group’s repository
  2. Take a look at what is inside the repository and read the README.md file.
  3. Build it (colcon build and source, from the workspace directory) according to the README.md instructions.
  4. Launch the simulation demonstration: ros2 launch grp_pibotXX simulation_v1_launch.yaml and assess the solution (at this point vision detection should not be activated)
  5. Stop everything.
  6. Start the pibotXX robot, and connect the camera.
  7. Launch the Turtlebot demonstration (ros2 launch grp_pibotXX tbot_v1_launch.yaml), and appreciate the solution.
  8. Take a look at the code, starting from the launch files.