Efficient Exploration
The goal of the challenge is to demonstrate a robot's capability to navigate in a cluttered environment and to locate specific objects. The localization requires building a map of the environment.
Expected
- The robot is positioned somewhere in a closed area (i.e. an area bounded by obstacles).
- The robot moves continuously in this area while avoiding obstacles.
- The robot's knowledge is extended with new incoming data; in other words, the robot builds a map and localizes itself in it.
- The robot detects green ghosts in the sensor stream. Messages are sent on two topics: one to state the detection and another to mark the position in the map (cf. marker_msgs).
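For reference, here is a minimal sketch of such a detection node, assuming a simple HSV color threshold; the topic names (/camera/image_raw, /ghost_detected, /ghost_marker) are made up for the example, and the challenge does not prescribe this structure:

```python
#!/usr/bin/env python3
# Hypothetical sketch: detect a green blob in the camera stream and report it.
# Topic names and HSV bounds are assumptions, not part of the challenge spec.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import Bool
from visualization_msgs.msg import Marker
from cv_bridge import CvBridge
import cv2
import numpy as np

class GhostDetector(Node):
    def __init__(self):
        super().__init__('ghost_detector')
        self.bridge = CvBridge()
        self.sub = self.create_subscription(Image, '/camera/image_raw', self.on_image, 10)
        self.detect_pub = self.create_publisher(Bool, '/ghost_detected', 10)
        self.marker_pub = self.create_publisher(Marker, '/ghost_marker', 10)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Rough HSV bounds for "green"; tune against the real ghost.
        mask = cv2.inRange(hsv, np.array([40, 80, 80]), np.array([80, 255, 255]))
        found = cv2.countNonZero(mask) > 500  # arbitrary pixel-count threshold
        self.detect_pub.publish(Bool(data=found))
        if found:
            marker = Marker()
            marker.header.frame_id = 'map'  # the marker must live in the map frame
            marker.header.stamp = msg.header.stamp
            marker.type = Marker.SPHERE
            marker.action = Marker.ADD
            # Pose left at the origin here; a real node would estimate the ghost
            # position (e.g. from depth or laser data) and transform it to 'map'.
            marker.scale.x = marker.scale.y = marker.scale.z = 0.3
            marker.color.g = 1.0
            marker.color.a = 1.0
            self.marker_pub.publish(marker)

def main():
    rclpy.init()
    rclpy.spin(GhostDetector())

if __name__ == '__main__':
    main()
```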
Experiments can be performed with 2 computers: one on the robot (Control PC) and a second for visualization and human control (Operator PC).
Instructions
Each group commits the minimal required files in a specific grp_pibotXX ROS2 package inside their git repository.
Release: Wednesday of week 4 (Wednesday, January 24).
The required files:
- At the root of the repository, a README.md file in Markdown syntax introducing the project.
- A directory grp_pibotXX matching the ROS2 package where the elements will be found (XX matches the number of your pibot robot).
- Inside the grp_pibotXX package, a launch file simulation_v2_launch starting the appropriate nodes for demonstrating SLAM in the simulation.
- A launch file vision_launch starting the appropriate nodes for demonstrating the vision capability (the pibot doesn't move).
- A launch file tbot_v2_launch starting the appropriate nodes for demonstrating with a Turtlebot (SLAM + localization of the green ghost).
- Finally, a launch file operator_launch starting a visualization (enriched map, without connection to the image topic) plus a remote-control solution.
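As an illustration only, operator_launch could be as small as the following sketch; the rviz configuration path and the use of teleop_twist_keyboard for remote control are assumptions, not requirements:

```python
# Hypothetical sketch of an operator_launch file: it only starts rviz2 with a
# saved configuration plus a keyboard teleoperation node on the Operator PC.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rviz2',
            executable='rviz2',
            arguments=['-d', 'config/operator.rviz'],  # pre-configured map + marker displays (path assumed)
        ),
        Node(
            package='teleop_twist_keyboard',
            executable='teleop_twist_keyboard',
            prefix='xterm -e',  # open in its own terminal so it gets keyboard focus
        ),
    ])
```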
The simulation can be configured with either Gazebo or Stage, according to your preference.
Criteria
Minimal:
- The group follows the instructions (i.e. the repository is organized as expected)
- The robot behavior is safe (no collisions with any obstacle)
- The vision part correctly detects green ghosts in the sensor stream.
- rviz2 is started and properly configured on a second PC and displays the built map.
- It is possible to visualize a marker for each detected ghost, at the ghost's position in the map (a tf2 sketch for placing detections in the map frame follows this list).
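For that last criterion, detections typically need to be re-expressed in the map frame. Here is a minimal sketch using tf2; the camera frame name (camera_link) is an assumption about the robot's tf tree:

```python
# Hypothetical sketch: express a ghost detection, initially given in the camera
# frame, in the map frame so that the published marker lands at the right spot.
import rclpy
from rclpy.node import Node
from rclpy.duration import Duration
from geometry_msgs.msg import PointStamped
from tf2_ros import TransformException
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener
import tf2_geometry_msgs  # registers PointStamped support for buffer.transform()

class GhostLocalizer(Node):
    def __init__(self):
        super().__init__('ghost_localizer')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)

    def to_map_frame(self, x, y, z, stamp):
        """Transform a point seen by the camera into map coordinates."""
        point = PointStamped()
        point.header.frame_id = 'camera_link'  # assumed frame name
        point.header.stamp = stamp
        point.point.x, point.point.y, point.point.z = x, y, z
        try:
            return self.tf_buffer.transform(point, 'map', timeout=Duration(seconds=0.5))
        except TransformException as error:
            self.get_logger().warn(f'map transform unavailable: {error}')
            return None
```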
Optional (the order does not matter):
- The robot movement is oriented toward unknown areas to speed up the exploration (see the frontier sketch after this list).
- The green ghosts are identified with a number and the robot is capable of recognizing a ghost on a second pass (it should be easy to count the ghosts in the map or by reading the topic).
- The detection node detects all the ghosts, in any position, while remaining robust to false positives.
- Processes are clearly established (start and stop the robot, save the map, get the list of ghosts, pause the experiment, ...)
- Developed nodes rely on ROS2 Parameters (for speed, obstacle detection, ...; also illustrated in the sketch after this list)
- The Kobuki features are integrated into the scenario (robot buttons, contact with the ground, void (cliff) detection, beeps, ...)
- The robot behavior makes it possible to clearly characterize a green ghost.
- The list is not exhaustive; be inventive!
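As a starting point for the frontier and parameter criteria above, here is a hedged sketch: it declares two ROS2 parameters and scans the occupancy grid for frontier cells (free cells adjacent to unknown space). The /map topic and the OccupancyGrid conventions (-1 unknown, 0 free) are standard, but the node structure, names, and thresholds are invented for the example:

```python
# Hypothetical sketch: ROS2 parameters + naive frontier detection on the map.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import OccupancyGrid

class FrontierExplorer(Node):
    def __init__(self):
        super().__init__('frontier_explorer')
        # ROS2 parameters, overridable at launch time or with `ros2 param set`.
        self.declare_parameter('max_speed', 0.2)        # would bound cmd_vel in a real node
        self.declare_parameter('frontier_min_size', 5)  # minimum frontier cells to act on
        self.create_subscription(OccupancyGrid, '/map', self.on_map, 1)

    def on_map(self, grid):
        width, height, data = grid.info.width, grid.info.height, grid.data
        frontiers = []
        for row in range(1, height - 1):
            for col in range(1, width - 1):
                if data[row * width + col] != 0:   # only free cells can be frontiers
                    continue
                neighbours = (data[(row - 1) * width + col], data[(row + 1) * width + col],
                              data[row * width + col - 1], data[row * width + col + 1])
                if -1 in neighbours:               # free cell touching unknown space
                    frontiers.append((col, row))
        if len(frontiers) >= self.get_parameter('frontier_min_size').value:
            # Convert the first frontier cell to map coordinates; a real node
            # would cluster frontiers and send a navigation goal instead.
            col, row = frontiers[0]
            x = grid.info.origin.position.x + (col + 0.5) * grid.info.resolution
            y = grid.info.origin.position.y + (row + 0.5) * grid.info.resolution
            self.get_logger().info(f'next exploration target: ({x:.2f}, {y:.2f})')

def main():
    rclpy.init()
    rclpy.spin(FrontierExplorer())

if __name__ == '__main__':
    main()
```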
The challenge2 tbot.launch.py launch file may take an initial map.
Video
A 5-minute technical video is expected per group.
The video can be shared somewhere on the internet (YouTube, for example), and a link to it must be added to the group's README.md file.
The video is not a file in the git repository.
Git is not designed to version large binary files.
We encourage each group to share the video publicly, but this is optional.
The video includes a presentation of the challenge (with the IMT logo, CERI SN ;), the developers, the teachers, the philosophy of the solution, and the results.