Random Search in a Small Environment
The goal of the challenge is to demonstrate the capability of a robot to move in a cluttered environment, with the possibility to visualize what the robot sees.
Expected
The robot is positioned somewhere in a closed area (i.e. an area bounded by obstacles).
The robot moves continuously in this area while avoiding obstacles (i.e. the area limits and some obstacles randomly placed in the area).
The sensor data (scan, vision) and potentially other information (considered obstacles, frames, ...) are visible in rviz.
The robot trajectory should allow the robot to explore the whole area. In other words, the robot will eventually go everywhere (i.e. the probability that the robot reaches any given reachable position in the area tends to 1 as time goes to infinity).
A first approach can be to develop a ricochet robot that changes its direction randomly each time an obstacle prevents it from moving forward, as in the sketch below.
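A minimal sketch of such a ricochet behavior is given below. It assumes a `LaserScan` on the `scan` topic and velocity commands on `cmd_vel`; the node name, topic names, and the distance/speed thresholds are assumptions to be adapted to the actual simulation and tbot setup.

```python
#!/usr/bin/env python3
# Ricochet behavior sketch (assumed topics: 'scan' in, 'cmd_vel' out).
import math, random
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

class RicochetNode(Node):
    def __init__(self):
        super().__init__('ricochet')  # node name is an assumption
        self.obstacle_ahead = False
        self.turn_direction = 1.0
        self.create_subscription(LaserScan, 'scan', self.scan_callback, 10)
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.create_timer(0.1, self.control_callback)  # 10 Hz control loop

    def scan_callback(self, scan):
        # Keep only valid beams roughly in front of the robot (+/- 30 degrees);
        # assumes 0 rad points forward and angle_min is negative.
        front = []
        angle = scan.angle_min
        for r in scan.ranges:
            if abs(angle) < math.radians(30) and scan.range_min < r < scan.range_max:
                front.append(r)
            angle += scan.angle_increment
        was_blocked = self.obstacle_ahead
        self.obstacle_ahead = bool(front) and min(front) < 0.5  # 0.5 m threshold (assumption)
        if self.obstacle_ahead and not was_blocked:
            # Pick a new random rotation direction each time an obstacle appears.
            self.turn_direction = random.choice([-1.0, 1.0])

    def control_callback(self):
        cmd = Twist()
        if self.obstacle_ahead:
            cmd.angular.z = self.turn_direction * 1.0  # turn in place
        else:
            cmd.linear.x = 0.2  # move forward
        self.cmd_pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(RicochetNode())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

Because the turn direction is drawn at random at every bounce, the resulting trajectory tends to cover the whole reachable area over time, which is exactly the exploration property expected above.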
Instructions
Each group commits the minimal required files in a specific `grp_pibotXX` ROS2 package inside their git repository.
Release: Thursday afternoon of week-3 (Monday 14-01-2025)
The required files:
- At the root of the repository, a `README.md` file in markdown syntax introducing the project.
- A directory `grp_pibotXX` matching the ROS2 package where the elements will be found (XX matches the number of the pibot).
- Inside the `grp_pibotXX` package, a launch file `simulation_v1_launch.yaml` starting the appropriate nodes for demonstrating in the simulation (a sketch is given after this list).
- Then, a launch file `tbot_v1_launch.yaml` starting the appropriate nodes for demonstrating with a tbot.
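As an illustration, a ROS2 YAML launch file of this kind could look like the sketch below. The included simulation launch file, the `ricochet` executable, and the rviz configuration path are assumptions; replace them with the group's actual nodes and files.

```yaml
# simulation_v1_launch.yaml -- sketch only; names and paths are assumptions.
launch:

# Start the simulator (the included launch file is an assumption,
# replace it with the simulation launch provided for the challenge).
- include:
    file: "$(find-pkg-share tbot_sim)/launch/challenge-1.launch.py"

# Start the random/ricochet move node from the group's package.
- node:
    pkg: "grp_pibotXX"
    exec: "ricochet"

# Start rviz2 with a prepared configuration (path is an assumption).
- node:
    pkg: "rviz2"
    exec: "rviz2"
    args: "-d $(find-pkg-share grp_pibotXX)/config/challenge1.rviz"
```

The `tbot_v1_launch.yaml` file would follow the same structure, replacing the simulation include with the nodes driving the real tbot and its camera.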
Criteria
- The group follows the instructions (i.e. the repository is organized as expected).
- The robot behavior is safe (no collision with any obstacle).
- rviz2 is started and well configured.
- The robot moves everywhere in its environment.
- A `String` message is sent on a `detection` topic each time a green ghost is found in front of the robot (at this time we do not care about false positives); a detection sketch is given after this list.
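A minimal sketch of such a detection publisher is shown below. It assumes the camera image arrives as a `sensor_msgs/Image` on an `image_raw` topic and that a simple HSV color mask is enough to spot a green ghost; the topic names, HSV bounds, and pixel-count threshold are assumptions to be tuned.

```python
#!/usr/bin/env python3
# Green-ghost detection sketch publishing a String on 'detection'.
# Topic names, HSV bounds and thresholds are assumptions.
import cv2
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import String
from cv_bridge import CvBridge

class GhostDetector(Node):
    def __init__(self):
        super().__init__('ghost_detector')
        self.bridge = CvBridge()
        self.create_subscription(Image, 'image_raw', self.image_callback, 10)
        self.detection_pub = self.create_publisher(String, 'detection', 10)

    def image_callback(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Rough HSV range for "green" (to be tuned on the real ghosts).
        mask = cv2.inRange(hsv, (40, 80, 80), (80, 255, 255))
        if cv2.countNonZero(mask) > 500:  # arbitrary pixel threshold
            self.detection_pub.publish(String(data='green ghost detected'))

def main():
    rclpy.init()
    rclpy.spin(GhostDetector())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```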
Evaluation protocol (for evaluators...)
Here is the evaluation protocol that will be applied. It is highly recommended to run through it yourself before the submission...
- Clone the group’s repository
- Take a look at what is inside the repository and read the `README.md` file.
- Build it (`colcon build` and `source` from the workspace directory) according to the `README.md` instructions (the whole command sequence is summarized after this list).
- Launch the simulation demonstration: `ros2 launch grp_pibotXX simulation_v1_launch.yaml` and appreciate the solution (at this point vision detection should not be activated).
- Stop everything.
- Start the `pibotXX` robot, and connect the camera.
- Launch the Turtlebot demonstration (`ros2 launch grp_pibotXX tbot_v1_launch.yaml`), and appreciate the solution.
- Take a look at the code, starting from the launch files.