diff --git a/README.md b/README.md
index 1ae211a..222778d 100644
--- a/README.md
+++ b/README.md
@@ -26,6 +26,7 @@ For questions regarding the workshop, please join the Dronecode Foundation Disco
 This workshop introduces you to PX4’s ROS 2 integration layer and shows how to create your own flight modes, perception pipelines, and control executors.
 You’ll also get a brief introduction to PX4 and a ready-to-use developer environment designed for seamless integration between PX4 and ROS 2.
 By the end of the workshop, you’ll have a complete ROS 2 package capable of controlling a simulated drone, performing navigation tasks, and executing precision landings.
+
 **Note:** Each example includes several exercises designed to reinforce the concepts, deepen understanding, and encourage exploration of the environment.
 Solutions are provided either **commented out in the code** (you can uncomment and rebuild the package to test) or as **separate, individual packages.**
@@ -38,6 +39,7 @@ For more detailed instructions and guidance, please refer to the dedicated **REA
 ### Environment Setup
 
 For detailed environment and Docker setup instructions, see the [docs/README.md](docs/setup.md) guide.
+Please complete this step before you proceed.
 
 ### Control Pipelines
 
@@ -52,8 +54,8 @@ The goal is to compare these two approaches, highlighting their differences and
 For detailed instructions and exercises, refer to the following guides in this repository:
 
-- **Offboard Demo:** [Offboard Demo](offboard_demo/README.md)
-- **Custom Mode Demo:** [Custom Mode Demo](custom_mode_demo/README.md)
+- [Offboard Demo](px4_roscon_25/offboard_demo/README.md)
+- [Custom Mode Demo](px4_roscon_25/custom_mode_demo/README.md)
 
 ### Perception & Applications
 
@@ -62,12 +64,14 @@ In this section, we explore **three practical examples** of perception and contr
 1. **ArUco Marker Detection** – Detect markers using ROS 2 and PX4. No custom flight mode is required.
 2. **Teleoperation** – Ever seen a TurtleBot flying?
    This demo shows how to manually control a drone using a keyboard and to use a LiDAR scan for environmental awareness.
 3. **Precision Landing** – Combine ArUco detection with a **Custom Mode** to perform precision landing.
+   - **Precision Landing with Executor** – A follow-up exercise that incorporates Precision Land into the earlier Custom Modes Demo, where an Executor schedules Waypoints and Precision Land to find and land on the ArUco Marker in the maze.
 
 For more detailed instructions and exercises, refer to the following demos:
 
-- **ArUco Marker Detection:** [ArUco Marker Detection](aruco_tracker/README.md)
-- **Teleoperation:** [Teleoperation](teleop/README.md)
-- **Precision Landing:** [Precision Landing](precision_land/README.md)
+- [ArUco Marker Detection](px4_roscon_25/aruco_tracker/README.md)
+- [Teleoperation](px4_roscon_25/teleop/README.md)
+- [Precision Landing](px4_roscon_25/precision_land/README.md)
+- [Precision Landing with Executor](px4_roscon_25/precision_land_executor/README.md)
 
 ### Q&A, Resources & Hardware Show-and-Tell
diff --git a/px4_roscon_25/aruco_tracker/README.md b/px4_roscon_25/aruco_tracker/README.md
index c6b60a8..470a1c3 100644
--- a/px4_roscon_25/aruco_tracker/README.md
+++ b/px4_roscon_25/aruco_tracker/README.md
@@ -67,5 +67,5 @@ Where:
 
 To access the vehicle's altitude (distance to ground), the node subscribes to PX4's `VehicleLocalPosition` message and uses the `dist_bottom` field, which provides the distance measurement from the vehicle to the ground below.
 
-The solution to the exercise is commented out at the end of the `OffboardDemo.cpp`, `OffboardDemo.hpp` and `offboard_demo.launch.py`.
+The solution to the exercise is commented out at the end of `ArucoTracker.cpp` and `ArucoTracker.hpp`.
 Feel free to uncomment it and recompile the package to unveil it.
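The `aruco_tracker` change above points to PX4's `VehicleLocalPosition.dist_bottom` field as the altitude source. As a rough illustration of why that altitude matters (the struct names, intrinsics, and helper function below are hypothetical, not part of the workshop code), a pinhole-model sketch of turning a detected marker's pixel offset into a metric offset:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical intrinsics; in practice these come from the camera's
// calibration (e.g. a camera_info topic), not hard-coded values.
struct CameraIntrinsics {
  double fx, fy;  // focal lengths in pixels
  double cx, cy;  // principal point in pixels
};

struct Offset {
  double x, y;  // metric offset in the camera frame, in metres
};

// Pinhole model: a pixel offset (u - cx, v - cy) observed at distance
// `dist_bottom` (the altitude above ground) maps to a metric offset of
// (u - cx) * z / fx and (v - cy) * z / fy.
Offset pixelToMetricOffset(double u, double v, double dist_bottom,
                           const CameraIntrinsics& cam) {
  return {(u - cam.cx) * dist_bottom / cam.fx,
          (v - cam.cy) * dist_bottom / cam.fy};
}
```

For example, with focal lengths of 500 px and the marker detected 100 px to the side of the principal point at 5 m altitude, the marker sits roughly 1 m to the side of the vehicle, which is the kind of correction a precision-landing mode can feed to the controller.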
diff --git a/px4_roscon_25/custom_mode_demo/README.md b/px4_roscon_25/custom_mode_demo/README.md
index 3a1a8d5..c6f4212 100644
--- a/px4_roscon_25/custom_mode_demo/README.md
+++ b/px4_roscon_25/custom_mode_demo/README.md
@@ -61,5 +61,5 @@ The `custom_mode_demo.launch.py` can also start the _MicroXrceAgent_ and the _gz
 2. Explore what happens if you want to change the Mode order in the Executor
 3. Add another Mode after yaw, where you change the altitude to 3 m before landing
 
-The solution to the exercise is commented out at the end of the `OffboardDemo.cpp`, `OffboardDemo.hpp` and `offboard_demo.launch.py`.
+The solution to the exercise is commented out at the end of `CustomMode.cpp`, `CustomMode.hpp`, `CustomModeExecutor.cpp` and `CustomModeExecutor.hpp`.
 Feel free to uncomment it and recompile the package to unveil it.
diff --git a/px4_roscon_25/precision_land/README.md b/px4_roscon_25/precision_land/README.md
index e4054a4..7e786ac 100644
--- a/px4_roscon_25/precision_land/README.md
+++ b/px4_roscon_25/precision_land/README.md
@@ -43,3 +43,14 @@ Once it is above the target, the drone will descend.
 ```
 3. Takeoff and then change into `PrecisionLandCustom` flight mode.
+
+## Exercise
+
+Integrate all the concepts from today. Create a ROS 2 package with a Mode Executor for Precision Landing and perform a precision landing in the maze we explored, using teleoperation.
+Incorporate the custom waypoints you recorded along the way (create a CustomWaypoints Mode).
+Hint: You can start by copying the Custom Mode package.
+Replace CustomYaw with PrecisionLand and add the waypoints you collected earlier.
+
+The solution can be found in a separate package:
+
+- [Precision Landing with Executor](../precision_land_executor/README.md)
diff --git a/px4_roscon_25/teleop/README.md b/px4_roscon_25/teleop/README.md
index f37de7c..cd6d06b 100644
--- a/px4_roscon_25/teleop/README.md
+++ b/px4_roscon_25/teleop/README.md
@@ -1,4 +1,4 @@
-# teleop
+# Teleoperation
 
 This packages offers uses PX4 custom modes and executor to teleop a drone.
 
@@ -44,3 +44,11 @@ ros2 run teleop_twist_rpyt_keyboard teleop_twist_rpyt_keyboard
 ```
 Just like in the Custom Mode demo, the teleop one requires you to manually activate it and arm the drone!
+Make sure you launch the maze in Foxglove; for details, see the [Foxglove Instructions](../README.md).
+
+## Exercise
+
+Navigate the drone to the target in Foxglove using teleoperation.
+Do not use the Gazebo simulation; rely solely on the LiDAR (laser scan) data for guidance.
+Hint: Observe and record a few key waypoints along the path; these will be useful for a subsequent exercise.
+See the PX4 [VehicleLocalPosition](https://docs.px4.io/main/en/msg_docs/VehicleLocalPosition) message documentation.
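The teleop exercise above asks you to record a few key waypoints (e.g. from `VehicleLocalPosition`) for reuse in the executor exercise. One simple way to keep that list short is a minimum-spacing filter. The sketch below is illustrative only: the class, the spacing parameter, and the plain-struct samples are invented for this example, and a real node would subscribe to the PX4 topic rather than receive raw samples.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A recorded position sample, in metres (local frame).
struct Waypoint {
  double x, y, z;
};

// Keeps a sample only when it is at least `min_spacing` metres away from the
// last recorded waypoint, so slow teleop motion does not flood the list.
class WaypointRecorder {
 public:
  explicit WaypointRecorder(double min_spacing) : min_spacing_(min_spacing) {}

  // Returns true if the sample was recorded, false if it was dropped.
  bool addSample(const Waypoint& p) {
    if (!waypoints_.empty()) {
      const Waypoint& last = waypoints_.back();
      // std::hypot(x, y, z) computes the 3-D Euclidean distance (C++17).
      double d = std::hypot(p.x - last.x, p.y - last.y, p.z - last.z);
      if (d < min_spacing_) {
        return false;
      }
    }
    waypoints_.push_back(p);
    return true;
  }

  const std::vector<Waypoint>& waypoints() const { return waypoints_; }

 private:
  double min_spacing_;
  std::vector<Waypoint> waypoints_;
};
```

With a 1 m spacing, feeding the recorder samples at 0 m, 0.5 m and 1.5 m along a straight line keeps only the first and the last, which is roughly the granularity a CustomWaypoints Mode would need.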