Chapter 8: Multimodal Perception and Intelligent Decision-Making
Visual perception refers to the process by which machines acquire environmental information through sensors and analyze and interpret images using computer vision techniques. It encompasses tasks such as object detection and recognition, endowing the system with environmental awareness. Obstacle avoidance decision-making relies on environmental information obtained through visual perception, employing environment modeling, path planning, and intelligent decision-making algorithms to formulate behavior strategies that avoid obstacles, prevent collisions, and achieve predefined objectives. These two components are interdependent: visual perception provides environmental data for obstacle avoidance decision-making, while the latter executes actions based on this data. They play a critical role in fields such as autonomous driving and robot navigation, promoting intelligent applications and development of unmanned systems in complex environments.
8.1 Background and Theory
Multimodal perception and intelligent decision-making technologies constitute the core pillars for intelligent unmanned systems to achieve efficient collaboration, autonomous operation, and safety assurance, forming a closed-loop mechanism of “perception–cognition–action.”
8.1.1 Multisource Information Fusion and Robust Perception
Traditional positioning and control schemes for unmanned systems often rely on a single sensor, such as GNSS satellite positioning. In GNSS-degraded environments (forests, urban canyons, over sea surfaces, or indoors), fusing multimodal sensor data (e.g., vision, LiDAR, IMU) becomes essential. Techniques such as SLAM enable the construction of robust, continuous, and dynamically updatable environmental models.
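As a minimal illustration of why fusion helps, the sketch below combines a high-rate but drift-prone IMU channel with a low-rate, drift-free vision position fix using a 1-D complementary filter. This is a teaching toy, not the platform's estimator; all rates and numbers are invented.

```python
def fuse_step(pos, vel, accel, vision_pos, dt, alpha=0.98):
    """One complementary-filter step: dead-reckon with the IMU, then pull
    the position estimate toward the (drift-free but low-rate) vision fix."""
    vel = vel + accel * dt            # integrate IMU acceleration
    pred = pos + vel * dt             # predicted position (drifts over time)
    if vision_pos is None:
        return pred, vel              # no vision this step: trust the IMU
    return alpha * pred + (1.0 - alpha) * vision_pos, vel

# Toy 1-D scenario: constant 1 m/s^2 acceleration from rest, IMU at 100 Hz,
# vision fix at 10 Hz. The estimator starts 3 m off; vision removes the offset.
dt, accel = 0.01, 1.0
pos_est, vel_est = 3.0, 0.0
for k in range(2000):
    t = (k + 1) * dt
    true_pos = 0.5 * accel * t ** 2              # ground truth at end of step
    vision = true_pos if (k + 1) % 10 == 0 else None
    pos_est, vel_est = fuse_step(pos_est, vel_est, accel, vision, dt)
# After 20 s the estimate sits close to the true position of 200 m.
```

The weight `alpha` trades IMU smoothness against vision authority; real systems use full Kalman-style fusion, but the blending idea is the same.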
8.1.2 Obstacle-Avoidance Path Planning and Intelligent Decision-Making
Obstacle-avoidance planning depends on fused perception data and leverages advanced methods—including deep learning, reinforcement learning, behavior trees, graph search, and optimization algorithms—to enable autonomous assessment of obstacle risks, dynamic adjustment of speed and heading, and real-time generation of safe and efficient paths.
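Among the graph-search family mentioned above, A* is the canonical example. The sketch below runs A* on a toy 4-connected occupancy grid; the grid, start, and goal are illustrative assumptions, not platform data.

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* shortest path on a 4-connected occupancy grid
    (0 = free cell, 1 = obstacle). Returns a list of cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    tie = itertools.count()              # breaks heap ties without comparing cells
    frontier = [(h(start), next(tie), 0, start, None)]
    parent, best_g = {}, {start: 0}
    while frontier:
        _, _, g, cur, prev = heapq.heappop(frontier)
        if cur in parent:
            continue                     # already expanded via a shorter route
        parent[cur] = prev
        if cur == goal:                  # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), next(tie), ng, nxt, cur))
    return None

# Toy occupancy grid: the planner must weave around two walls.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (4, 3))
```

Real planners add time, dynamics, and replanning on top, but the risk-aware cost and heuristic structure carry over directly.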
8.2 Framework and Interfaces
This section uses the RflySim toolchain, together with typical development cases, to introduce in detail the platform's supporting capabilities for intelligent perception and decision-making tasks, including sensor interfaces, data acquisition and processing workflows, and typical task algorithm architectures.
8.2.1 Image Acquisition in Virtual Environments
RflySim offers a high-fidelity virtual sensor simulation environment, supporting the generation of multimodal sensor data—including RGB vision, depth images, LiDAR, and IMU—providing realistic test data sources for visual perception algorithms.
8.2.2 Object Detection and Tracking
The platform supports algorithm validation for typical vision tasks, including object detection and tracking, path planning, and obstacle-avoidance strategies. It provides standardized interface frameworks to help developers efficiently transition from simulation validation to real-machine deployment.
8.3 Showcase of Outstanding Cases
Five-UAV Visual Shared SLAM Hardware-in-the-Loop Simulation:
Simulation Algorithm Development and Validation:
8.4 Course-Linked Video Lectures
Public Lecture Replay (Session 7: Multimodal Perception and Intelligent Decision-Making):
8.5 Chapter Experiment Cases
The related verification experiments and guided cases for this chapter are located in the [Installation Directory]\RflySimAPIs\8.RflySimVision folder.
8.5.1 Interface Learning Experiments
These experiments are stored in the 8.RflySimVision\0.ApiExps folder and cover foundational platform interface tutorials as well as general introductions to the various tools.
Experiment 1: Binocular Camera System Calibration
- 📦 Version Requirement:
Free Edition - 📁 File Path: 3-VisionAIAPI/0.BinocularCameraCalib/Readme.pdf
📝 Experiment Overview: Acquire RflySim 3D images via Python interface, demonstrate binocular camera system calibration by altering the position and orientation of the chessboard, and learn visual sensor configuration and camera parameter tuning.
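Once the binocular system is calibrated and the images rectified, depth follows from the pinhole stereo relation Z = f·B/d (focal length in pixels, baseline in meters, disparity in pixels). A minimal check with invented numbers (the focal length, baseline, and disparity below are not RflySim values):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo depth for rectified cameras: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative rig: f = 320 px, baseline = 0.12 m, observed disparity = 8 px.
z = depth_from_disparity(320.0, 0.12, disparity_px=8.0)   # about 4.8 m
```

This is why calibration accuracy matters: errors in f or B scale directly into the recovered depth.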
Experiment 2: Vision Image Capture Interface Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/0.ConfigJsonAPI/Readme.pdf
📝 Experiment Overview:
Acquire RflySim 3D images using the Python interface VisionCaptureApi, learn visual sensor configuration, camera parameter settings, aircraft control, and UE4 control, achieving real-time image capture and flight controller integration.
Experiment 3: NX and PX4 Joint Hardware-in-the-Loop (HIL) Ring-Penetration Simulation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/3.NXwithPX4Config/0.PX4+NX/Readme.pdf
📝 Experiment Overview:
Implement hardware-in-the-loop simulation jointly between NX and Pixhawk6x. Automatically obtain the IP address via the Python interface ReqCopterSim, subscribe to image data using ROS, control the aircraft via MAVROS, and use the OpenCV library to achieve ring-penetration functionality. Learn visual sensor configuration and MAVLink communication setup.
Experiment 4: Visual Development Environment Configuration and Preliminary Knowledge
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/Readme.pdf
📝 Experiment Overview: Configure the RflySim visual development environment, including virtual machine setup, ROS environment configuration, NX and Pixhawk joint simulation, and visual box HIL simulation as preparatory experiments.
Experiment 5: MAVROS Python OFFBOARD Control Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/3.RosDistCtrl/0.RosOffbPyC++/0.PyRosLearn/Readme.pdf
📝 Experiment Overview:
Achieve OFFBOARD mode control of the UAV via the MAVROS Python interface, learning automatic simulation IP acquisition, rospy node programming, aircraft arming, and position setting.
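PX4 only accepts OFFBOARD mode while setpoints are already streaming, so the usual MAVROS bring-up streams setpoints continuously, requests OFFBOARD mode, then arms, spacing repeated service calls a few seconds apart. The ROS-free helper below sketches only that decision logic; the function name and the 5-second retry period are conventional choices, not platform requirements.

```python
def offboard_action(mode, armed, now, last_request, retry_period=5.0):
    """Decide the next MAVROS service call in a typical OFFBOARD bring-up:
    keep streaming setpoints elsewhere; request OFFBOARD mode first, then
    arming, re-issuing each request at most every `retry_period` seconds."""
    if now - last_request < retry_period:
        return None                  # too soon to re-issue a request
    if mode != "OFFBOARD":
        return "set_mode:OFFBOARD"   # would call /mavros/set_mode
    if not armed:
        return "arm"                 # would call /mavros/cmd/arming
    return None                      # flying in OFFBOARD, nothing to do

# Walk through a typical bring-up timeline.
assert offboard_action("POSCTL", False, now=0.0, last_request=0.0) is None
assert offboard_action("POSCTL", False, now=5.0, last_request=0.0) == "set_mode:OFFBOARD"
assert offboard_action("OFFBOARD", False, now=10.0, last_request=5.0) == "arm"
assert offboard_action("OFFBOARD", True, now=15.0, last_request=10.0) is None
```

In the actual rospy node, this function would be evaluated inside the setpoint-publishing loop, with `mode` and `armed` taken from the `/mavros/state` topic.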
Experiment 6: RflySim Vision Interface Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/Readme.pdf
📝 Experiment Overview: Acquire RflySim 3D images and perform real-time control via Python interface, learning the usage of vision interfaces, including acquisition and processing of multi-camera images, depth maps, point clouds, and other visual data.
Experiment 7: RflySim Vision UDP Direct Transmission with PNG Compression Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/1.VisionAPIsTest/1-VisionCapAPI-UE4DirectUDP-PNGConpressed/Readme.pdf
📝 Experiment Overview:
Implement distributed simulation via UDP direct transmission of PNG-compressed images. Images are received on a remote Linux system or another Windows PC, and flight control commands are sent back. Learn configuration of the SendProtocol transmission mode.
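PNG's lossless entropy stage is built on zlib/DEFLATE, so a standard-library zlib round trip illustrates the bandwidth trade-off the compressed transmission mode exploits. This is a stand-in for intuition, not the platform's actual codec, and the synthetic frame is invented.

```python
import zlib

# A synthetic 8-bit grayscale frame with large uniform regions,
# similar to rendered scenes that compress well losslessly.
width, height = 320, 240
frame = bytes((x // 40) * 30 for _ in range(height) for x in range(width))

compressed = zlib.compress(frame, level=6)   # lossless, like PNG's DEFLATE stage
restored = zlib.decompress(compressed)

assert restored == frame                     # bit-exact round trip
ratio = len(compressed) / len(frame)         # fraction of original size on the wire
```

The compression cost is CPU time on both ends, which is exactly the latency-versus-bandwidth trade-off the compressed and uncompressed transmission experiments compare.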
Experiment 8: MAVROS C++ OFFBOARD Control
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/3.RosDistCtrl/0.RosOffbPyC++/1.C++RosLearn/Readme.pdf
📝 Experiment Overview: Switch the aircraft to OFFBOARD mode and arm it via MAVROS C++ program, set fixed coordinate points for position control, and learn key technologies such as ROS nodes, topic publishing, and service calls.
Experiment 9: Visual Box Hardware-in-the-Loop Simulation (Ring-Penetration via Serial Port)
- 📦 Version Requirement:
Free Edition
📝 Experiment Overview:
Implement hardware-in-the-loop simulation for the Vision Box, automatically obtain IP addresses via the Python interface ReqCopterSim, establish co-simulation between RflySim 3D and CopterSim, integrate ROS and MAVROS to control the aircraft for closed-loop flight, and learn visual sensor configuration and the SendProtocol image transmission mode.
Experiment 10: Automatic Generation of AI Training Dataset
- 📦 Version Requirement:
Free Edition - 📁 File Path: 3-VisionAIAPI/1.GenObjectDataSet/Readme.pdf
📝 Experiment Overview:
Automatically generate AI training datasets using the Python interface VisionCaptureApi. Image data is output in VOC format, and point cloud data in KITTI format, suitable for training object detection and recognition models.
Experiment 11: MAVROS C++ Control Interface Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/3.RosDistCtrl/1.MavrosCtrlC++/Readme.pdf
📝 Experiment Overview:
Control the aircraft via a C++ program using the MAVROS interface, achieving MAVLink-ROS message conversion. Demonstrates automatic IP acquisition and multi-mode aircraft control in distributed simulation.
Experiment 12: Multi-Camera Image Acquisition Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/1.ImgSenorAPI/1.MutCameraImageGet/Readme.pdf
📝 Experiment Overview:
Acquire RGB, grayscale, and depth images from three cameras using the Python interface. Covers visual sensor configuration, real-time camera parameter adjustment, and aircraft control, including usage of the VisionCaptureApi interface and UE4 control.
Experiment 13: PX4ApiTest Control Demonstration
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/2.UavDistCtrl/1.PX4ApiTest/Readme.pdf
📝 Experiment Overview:
Demonstrate aircraft control using the Python interface PX4MavCtrlV4.py, including usage of position, velocity, attitude, and acceleration control commands.
Experiment 14: RflySim 3D Object Position Acquisition
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/8.RflySim3DAPI/1.RflySim3DPosGet/Readme.pdf
📝 Experiment Overview:
Acquire position information of dynamically created objects in RflySim 3D using the Python interface. Learn usage of the getUE4Pos function to achieve real-time aircraft position data acquisition.
Experiment 15: Point Cloud Segmentation Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/3.PointCloudAPI/1.SegmentPointCloudDemo/Readme.pdf
📝 Experiment Overview:
Acquire segmented point cloud data via the Python vision interface, enabling real-time point cloud display and processing. Covers visual sensor configuration, Open3D-based point cloud visualization, and drone control.
Experiment 16: UDP Direct Transmission of Point Cloud Data — Introductory Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 8.LidarAPIDemo/1.SharedMemoryClientServer/Readme.pdf
📝 Experiment Overview:
Learn to receive point cloud data sent by RflySim3D via UDP direct transmission using the Python interface VisionCaptureApi.py, and visualize the point cloud in a graphical interface. Master configuration of the SendProtocol transmission mode.
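The general shape of UDP direct transmission can be sketched with the standard library alone: pack XYZ triplets as float32 into a datagram, send it over loopback, and unpack on receipt. The 3 × float32 little-endian layout here is an illustrative assumption, not RflySim's actual wire format.

```python
import socket
import struct

points = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0), (7.0, 8.0, 9.0)]

# Pack XYZ triplets as little-endian float32 (assumed layout, 12 bytes/point).
payload = b"".join(struct.pack("<3f", *p) for p in points)

# Loopback send/receive with a plain UDP socket pair.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))                    # let the OS pick a free port
rx.settimeout(2.0)
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(payload, rx.getsockname())

data, _ = rx.recvfrom(65535)
n = len(data) // 12                          # 3 floats * 4 bytes each
decoded = [struct.unpack_from("<3f", data, i * 12) for i in range(n)]
tx.close()
rx.close()
```

Real point clouds exceed a single datagram, so the platform's protocol additionally fragments and sequences packets; the pack/unpack pattern stays the same.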
Experiment 17: Lightweight UAV Mass-Point Model Control Experiment
- 📦 Version Requirement:
Free Edition
📝 Experiment Overview:
Implements point-mass-based drone control via the Python-based PX4MavCtrl interface, delivering dynamic performance comparable to software/hardware-in-the-loop simulations, significantly reducing CPU resource consumption and enhancing flight stability.
Experiment 18: Getting Started with VMware
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/1.VMwareUbuntu/Readme.pdf
📝 Experiment Overview:
Learn fundamental VMware virtual machine operations, including installation and configuration, network mode selection (Bridged/NAT), basic settings, and login procedures, enabling successful VM startup and network configuration.
Experiment 19: Mid360 LiDAR Simulation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 10.Mid360Demo/Readme.pdf
📝 Experiment Overview:
Simulates the Livox Mid360 LiDAR sensor using RflySim, establishing a complete data pipeline from sensor simulation to ROS-based visualization, and validating PX4 Offboard flight control functionality.
Experiment 20: Timestamp Acquisition
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/10.ReadTimeStmp/Readme.pdf
📝 Experiment Overview:
Acquires timestamp data via Python interfaces, learning to use mav.StartTimeStmplisten and vis.StartTimeStmplisten to monitor aircraft timestamps. Data includes checksum, aircraft ID, simulation start timestamp, current timestamp, and heartbeat counter.
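A packet carrying those fields is naturally decoded with the struct module. The byte layout below is hypothetical (the experiment's Readme documents the actual format); it only demonstrates the pack/unpack pattern for checksum, aircraft ID, timestamps, and heartbeat counter.

```python
import struct

# HYPOTHETICAL layout, not RflySim's actual wire format:
# uint32 checksum, int32 copter_id, double start_stamp,
# double current_stamp, uint32 heartbeat -- little-endian, packed.
FMT = "<IiddI"

packet = struct.pack(FMT, 0x12345678, 3, 1700000000.0, 1700000012.5, 42)
checksum, copter_id, start_s, now_s, heartbeat = struct.unpack(FMT, packet)

elapsed = now_s - start_s   # simulation time since start, in seconds
```

Comparing such timestamps across the IMU and vision listeners is what makes sensor-data synchronization checks possible.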
Experiment 21: RflySim Fisheye Camera Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 11.FishEyeDemo/Readme.pdf
📝 Experiment Overview:
Demonstrates fisheye camera usage in RflySim for vision-based simulation. Covers visual sensor configuration, image acquisition via VisionCaptureApi, and drone control via MAVLink.
Experiment 22: Simulated Pod UI Control System
- 📦 Version Requirement:
Free Edition - 📁 File Path: 12.UAV_PodSimUI/Readme.pdf
📝 Experiment Overview:
Implements a complete UI-based control system for simulated pods in RflySim. Covers control of pod pitch/yaw angles, zoom, and magnification parameters; explores interaction mechanisms between visual sensors and the simulation environment; and introduces principles of AI-based target recognition and tracking.
Experiment 23: Distributed Vision Control
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/Readme.pdf
📝 Experiment Overview:
Transmits image data (PNG/JPG, compressed or uncompressed) directly via UDP to remote Linux or Windows machines, receives images, and sends back aircraft control commands—enabling distributed vision control and multi-simulation testing.
Experiment 24: RflySim Vision API – Direct UDP Uncompressed Image Transmission
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/1.VisionAPIsTest/2-VisionCapAPI-UE4DirectUDP-NoCompress/Readme.pdf
📝 Experiment Overview:
Implements distributed co-simulation via direct UDP transmission of uncompressed PNG images to remote Linux or Windows machines, with image reception and aircraft command feedback. Focuses on configuring SendProtocol to 2 for uncompressed image transmission.
Experiment 25: Camera Calibration
- 📦 Version Requirement:
Free Edition - 📁 File Path: 3-VisionAIAPI/2.CameraCalcDemo/Readme.pdf
📝 Experiment Overview:
Uses the Python VisionCaptureApi interface to capture RflySim 3D images, updates camera parameters in real time, collects calibration images, and performs intrinsic camera parameter calibration using MATLAB—learning camera imaging geometry calibration methods.
Experiment 26: ROS Environment Setup in Ubuntu Virtual Machine
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/2.GenenralUbuntuConfig/Readme.pdf
📝 Experiment Overview:
Master Ubuntu virtual machine configuration methods; learn ROS 1/2 installation and MAVROS configuration; familiarize yourself with installing commonly used libraries such as PCL and OpenCV; understand application scenarios for bridge mode and NAT mode network configurations.
Experiment 27: IMU and Camera Data Acquisition Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/2.ImuSenorAPI/Readme.pdf
📝 Experiment Overview:
This experiment acquires IMU and camera data via Python interface, teaching the use of the VisionCaptureApi interface, including configuring visual sensors, capturing images, dynamically modifying camera parameters in real time, and controlling aircraft flight.
Experiment 28: LiDAR Point Cloud API Display Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/3.PointCloudAPI/2.LidarAPIPointCloudDemo/Readme.pdf
📝 Experiment Overview:
Acquires LiDAR point cloud data via Python interface and displays it in real time. Learn to use the VisionCaptureApi visual interface and the Open3DShow point cloud visualization functionality, mastering platform-based image acquisition and shared-memory-based point cloud visualization techniques.
Experiment 29: Python Mavsdk Control Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/2.UavDistCtrl/2.MavsdkApiTest/Readme.pdf
📝 Experiment Overview:
Demonstrates aircraft control using the Python mavsdk library. Automatically acquires IP addresses via ReqCopterSim, enabling distributed online simulation. Learn MAVLink communication and offboard control methods.
Experiment 30: Image Acquisition Without CopterSim
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/1.ImgSenorAPI/2.NoCopterSimImageGet/Readme.pdf
📝 Experiment Overview:
Acquires RflySim 3D image data via the Python interface VisionCaptureApi without launching CopterSim, and dynamically updates camera parameters (pose, position, FOV, etc.) in real time. Focuses on mastering the usage of visual sensor APIs and camera configuration.
Experiment 31: ROS Image Data Subscription Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/3.RosDistCtrl/2.OpenCVRos/Readme.pdf
📝 Experiment Overview:
Subscribes to and acquires image data from RflySim via ROS. Learn distributed simulation online configuration, automatic IP acquisition via ReqCopterSim, visual sensor configuration, and rospy image topic subscription, enabling cross-platform image data transmission and processing.
Experiment 32: Vision Box Network Port Loopback HIL Simulation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/3.NXwithPX4Config/2.Pixhawk6xNetSim/Readme.pdf
📝 Experiment Overview:
Implements hardware-in-the-loop (HIL) simulation for a vision box via network port. Automatically acquires IP addresses via Python interface, establishes online connection between RflySim3D and CopterSim, and uses ROS and MAVROS to control the aircraft for ring-through missions.
Experiment 33: UDP LiDAR Point Cloud Data Transmission Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 8.LidarAPIDemo/2.UDPDirectClientServer/Readme.pdf
📝 Experiment Overview:
Sends image acquisition requests to RflySim via Python interface VisionCaptureApi and PX4MavCtrler, receives point cloud data in UDP direct transmission mode, and dynamically plots point clouds in the virtual machine. Key focus: configuring SendProtocol transmission mode and mastering automatic IP acquisition via ReqCopterSim.
Experiment 34: RflySim Visual AI Interface Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 3-VisionAIAPI/Readme.pdf
📝 Experiment Overview: A suite of six sub-experiments for the Vision AI interface, covering binocular calibration, camera calibration, AI training dataset generation, YOLO dataset generation, UE4 camera model derivation, and 3D position estimation. These experiments use the Python interface VisionCaptureApi to achieve real-time acquisition and processing of RflySim 3D images.
Experiment 35: Vision Sensor UDP Direct Transmission with JPEG Compression
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/1.VisionAPIsTest/3-VisionCapAPI-UE4DirectUDP-JPEGCompressed/Readme.pdf
📝 Experiment Overview: Implements distributed simulation with direct UDP transmission of JPEG-compressed images. Images are received on a remote Linux system (e.g., WinWSL, a virtual machine, an onboard computer, or an intelligent vision box) or another Windows machine, and aircraft control commands are sent back.
Experiment 36: Derivation of Ideal UE4 Camera Model
- 📦 Version Requirement:
Free Edition - 📁 File Path: 3-VisionAIAPI/3.CameraCalcDemo2/Readme.pdf
📝 Experiment Overview: Uses the Python interface to acquire RflySim 3D images, applies object detection algorithms to derive the ideal UE4 camera model, calculates focal length and intrinsic/extrinsic matrices, and verifies the accuracy of camera parameters under different field-of-view angles.
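For an ideal pinhole render camera with square pixels, the focal length in pixels follows directly from the horizontal field of view: fx = W / (2·tan(HFOV/2)), with the principal point at the image center. The sketch below builds the intrinsic matrix this way; whether the platform's camera matches this ideal model at different FOV angles is exactly what the experiment verifies.

```python
import math

def ue4_intrinsics(width, height, hfov_deg):
    """Ideal pinhole intrinsics for a rendered camera with square pixels:
    fx = W / (2 * tan(HFOV / 2)), principal point at the image center."""
    fx = width / (2.0 * math.tan(math.radians(hfov_deg) / 2.0))
    fy = fx                              # square pixels: same focal in y
    cx, cy = width / 2.0, height / 2.0
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]

K = ue4_intrinsics(640, 480, 90.0)       # a 90-degree HFOV gives fx = 320 px
```

Because tan(45°) = 1, a 90° HFOV at 640 px width yields fx = 320, a convenient sanity check before comparing against detected calibration targets.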
Experiment 37: Vision Hardware-in-the-Loop Simulation – Modified sysID Loop-through
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/3.NXwithPX4Config/3.CustSysID/Readme.pdf
📝 Experiment Overview:
Automatically acquires IP addresses via the Python interface ReqCopterSim.py, establishes co-simulation between RflySim 3D and CopterSim, subscribes to image data via ROS, and controls the aircraft via MAVROS, enabling hardware-in-the-loop loop-through experiments with modified sys_id.
Experiment 38: Depth Map Acquisition
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/1.ImgSenorAPI/3.DepthCameraDemo/Readme.pdf
📝 Experiment Overview:
Configures camera parameters and acquires depth map data via the Python interface. Covers usage of the VisionCaptureApi vision interface, configuration of depth cameras (TypeID=2), real-time modification of camera pose and position, and depth map reading methods.
Experiment 39: Livox LiDAR Point Cloud Visualization
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/3.PointCloudAPI/3.LidarLivoxPointCloudDemo/Readme.pdf
📝 Experiment Overview: Uses the Python interface to control DJI Livox LiDAR scanning, acquires point cloud data, and visualizes it in real time using Open3D.
Experiment 40: Python MAVROS Control Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/2.UavDistCtrl/3.MavRosPyApiTest/Readme.pdf
📝 Experiment Overview:
Demonstrates automatic acquisition of the simulation computer’s IP address and aircraft control via the Python MAVROS interface. Covers usage of the ReqCopterSim interface and rospy topic publishing/subscribing and service calls, enabling aircraft control in distributed simulation.
Experiment 41: Getting Started with Vision Hardware-in-the-Loop Kit
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/3.NXwithPX4Config/Readme.pdf
📝 Experiment Overview: Introduces the basic usage of the vision hardware-in-the-loop kit, including hardware configuration and connection setup for the NVIDIA Jetson NX and PX4 flight controller.
Experiment 42: UDP Direct Transmission of Point Cloud Data
- 📦 Version Requirement:
Free Edition
📝 Experiment Overview:
Send image capture requests via the Python interface VisionCaptureApi.py, receive processed world-coordinate point cloud data in UDP direct transmission mode, and dynamically display the point cloud in the virtual machine.
Experiment 43: MAVROS Vision-Based Ring-Through Control
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/3.RosDistCtrl/3.UavVisionRosCtrl/Readme.pdf
📝 Experiment Overview: Automatically obtain IP addresses via Python interface, establish co-simulation between RflySim 3D and CopterSim, subscribe to image data via ROS, and control the aircraft using MAVROS to achieve drone ring-through functionality based on OpenCV.
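The vision side of such a ring-through controller often reduces to locating the ring in the image and steering toward it. As a dependency-light stand-in for the case's actual OpenCV pipeline, the sketch below computes the centroid of a binary ring mask and normalizes its offset from the image center, which maps naturally onto yaw and height commands.

```python
import numpy as np

def ring_offset(mask):
    """Given a binary mask of the detected ring (1 = ring pixels), return
    the (dx, dy) offset of its centroid from the image center, normalized
    to [-1, 1]. Positive dx means the ring is to the right of center."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                          # ring not visible this frame
    h, w = mask.shape
    dx = (xs.mean() - (w - 1) / 2.0) / ((w - 1) / 2.0)
    dy = (ys.mean() - (h - 1) / 2.0) / ((h - 1) / 2.0)
    return dx, dy

# Toy 9x9 frame with the "ring" offset toward the upper-right corner.
mask = np.zeros((9, 9), dtype=np.uint8)
mask[1:4, 5:8] = 1
dx, dy = ring_offset(mask)   # dx > 0 (ring right of center), dy < 0 (above)
```

A real pipeline would obtain the mask from color thresholding or Hough circle detection and feed (dx, dy) into the MAVROS velocity setpoints.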
Experiment 44: UE4 Direct UDP JPEG-Compressed Distributed Simulation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/1.VisionAPIsTest/4-VisionCapAPI-UE4DirectUDP-JPEGCompressed-2UE4/Readme.pdf
📝 Experiment Overview:
Automatically obtain IP addresses via Python interface ReqCopterSim, establish co-simulation between RflySim3D and CopterSim, and implement multi-window distributed visual transmission of UDP direct-transmitted JPEG-compressed images, including configuration and control of three visual sensors.
Experiment 45: Distributed UDP-Compressed Image Transmission (Automatic IP Acquisition)
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/4.AutoObtainIPRun/Readme.pdf
📝 Experiment Overview:
This experiment implements distributed image transmission and control command feedback from Windows to Linux/WinWSL using UDP compressed image transmission mode, and teaches how to use ReqCopterSim to automatically acquire IP addresses for co-simulation.
Experiment 46: Depth Map to Point Cloud Conversion
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/3.PointCloudAPI/4.DepthPointCloudDemo/Readme.pdf
📝 Experiment Overview: Acquire depth map data via Python vision interface, convert it into point cloud images, and display them in real time. Learn visual sensor configuration, point cloud visualization, and flight controller control interface usage.
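The depth-to-point-cloud conversion itself is plain back-projection through the camera intrinsics: X = (u − cx)·Z/fx, Y = (v − cy)·Z/fy, Z = depth. A minimal NumPy sketch with toy intrinsics (the 4×4 depth map and focal values are invented for illustration):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) to camera-frame XYZ points:
    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# A flat wall 2 m away, seen by a tiny 4x4 depth camera (toy intrinsics).
depth = np.full((4, 4), 2.0)
pts = depth_to_points(depth, fx=4.0, fy=4.0, cx=1.5, cy=1.5)
```

The resulting N×3 array is exactly what Open3D-style viewers consume for real-time display.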
Experiment 47: GetCamObjDemo – Camera Object Information Acquisition
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/1.ImgSenorAPI/4.GetCamObjDemo/Readme.pdf
📝 Experiment Overview:
Acquire information about aircraft, objects, and cameras via Python interface; learn to control aircraft using UE4CtrlAPI.py and PX4MavCtrl.py, and retrieve visual sensor data.
Experiment 48: Accurate 3D Position Acquisition of Objects
- 📦 Version Requirement:
Free Edition - 📁 File Path: 3-VisionAIAPI/4.GetRelativePosDemo/Readme.pdf
📝 Experiment Overview:
Call the sendUE4Pos function via Python interface to generate aircraft and spheres, acquire 3D coordinates of camera, object, and target center, and compute relative positional relationships among objects.
Experiment 49: Python MAVROS Drone Control
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/2.UavDistCtrl/4.MavRosPyID3/Readme.pdf
📝 Experiment Overview:
Demonstrate offboard mode control of drones via Python MAVROS interface; learn automatic IP acquisition via ReqCopterSim and usage of rospy node subscription and publishing.
Experiment 50: Multi-Visual-Box Collaborative Ring-Through Simulation
- 📦 Version Requirement:
Free Edition
📝 Experiment Overview: Conduct joint hardware-in-the-loop simulation using two NX vision boxes, automatically obtain IP addresses via the Python interface, and integrate ROS and MAVROS to control the aircraft for visual navigation experiments involving ring-penetration tasks.
Experiment 51: Point Cloud Data Visualization
- 📦 Version Requirement:
Free Edition - 📁 File Path: 4.Point-CloudVisualize/Readme.pdf
📝 Experiment Overview:
Automatically obtain IP addresses via the Python interface ReqCopterSim, transmit point cloud data via UDP, and visualize the point clouds using the Open3DShow interface.
Experiment 52: Direct UDP Transmission of PNG-Compressed Images
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/4.SendProtocolAPI/Readme.pdf
📝 Experiment Overview:
Configure SendProtocol to enable UDP-based compressed image transmission. Learn to automatically obtain IP addresses via ReqCopterSim, and implement distributed co-simulation where images are received and aircraft control commands are sent back from remote systems such as WSL or virtual machines.
Experiment 53: VisionCapAPI IMU Data Acquisition
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/1.VisionAPIsTest/5-VisionCapAPI-IMUDataGet/Readme.pdf
📝 Experiment Overview:
Acquire IMU data from CopterSim via the Python interface VisionCaptureApi.py, and learn how to configure requests for IMU data transmission and data reading using vision interfaces.
Experiment 54: Automatic YOLO Dataset Generation via RflySim Vision
- 📦 Version Requirement:
Free Edition - 📁 File Path: 3-VisionAIAPI/5.GenVisionDataSet/Readme.pdf
📝 Experiment Overview:
Use the Python interface VisionCaptureApi to retrieve RflySim 3D images and camera parameters, automatically generate datasets in YOLO format, and split them into training and testing sets using maketxt.py.
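The core of YOLO label generation is converting pixel corner boxes (as used in VOC annotations) to YOLO's normalized center/size form. A self-contained sketch; the class id and box values below are invented for illustration:

```python
def voc_to_yolo(box, img_w, img_h):
    """Convert a VOC corner box (xmin, ymin, xmax, ymax) in pixels to the
    YOLO label format: normalized (x_center, y_center, width, height)."""
    xmin, ymin, xmax, ymax = box
    xc = (xmin + xmax) / 2.0 / img_w
    yc = (ymin + ymax) / 2.0 / img_h
    bw = (xmax - xmin) / img_w
    bh = (ymax - ymin) / img_h
    return xc, yc, bw, bh

# A 200x100 px box in a 640x480 image, written as one YOLO label line.
label = voc_to_yolo((100, 100, 300, 200), 640, 480)
line = "0 " + " ".join(f"{v:.6f}" for v in label)   # "<class> xc yc w h"
```

One such line per object goes into the per-image `.txt` file that tools like maketxt.py then split into training and testing sets.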
Experiment 55: Distributed UDP Compressed Image Transmission (Manual IP Configuration)
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/5.ManModifyIPRun/Readme.pdf
📝 Experiment Overview: Transmit PNG-compressed images via UDP to a remote Linux system or another Windows computer, and send aircraft control commands back. This experiment requires manually setting IP addresses instead of automatic acquisition.
Experiment 56: Camera Segmented Image Acquisition
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/1.ImgSenorAPI/5.SegmentImageDemo/Readme.pdf
📝 Experiment Overview:
Acquire RGB and segmented images from RflySim 3D via the Python interface VisionCaptureApi, and learn visual sensor configuration, camera parameter adjustment, and aircraft control.
Experiment 57: Linux Image Reception and ROS Publishing
- 📦 Version Requirement:
Free Edition - 📁 File Path: 5.VisCaptureMergeROSAPI/Readme.pdf
📝 Experiment Overview: Use a Python interface on a remote Linux system to request sensor data from RflySim 3D, receive images and point clouds, forward them into ROS, and perform visualization using RViz in a distributed simulation setup.
Experiment 58: Vision Interface UDP Transmission Latency Test
- 📦 Version Requirement:
Free Edition - 📁 File Path: 2-DistributedSimAPI/1.VisionAPIsTest/6-VisionCapAPI-UE4DirectUDP-DelayTest/Readme.pdf
📝 Experiment Overview:
Acquire IMU and image timestamps via the Python vision interface VisionCaptureApi, compute the minimum achievable image capture latency under UDP network transmission, and evaluate latency performance at a 200 Hz image capture frequency.
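The latency computation itself is simple once capture and receive timestamps share a clock base: subtract pairwise and summarize. A toy trace at the experiment's 200 Hz rate (the delay values below are invented, not measured):

```python
from statistics import mean

def latency_stats(capture_ts, receive_ts):
    """Per-frame latency (receive - capture), with mean and worst case.
    Both timestamp lists are in seconds from the same clock base."""
    lat = [r - c for c, r in zip(capture_ts, receive_ts)]
    return lat, mean(lat), max(lat)

# Toy trace: frames captured every 5 ms (200 Hz), arriving 8-11 ms later.
cap = [k * 0.005 for k in range(4)]
rec = [t + d for t, d in zip(cap, (0.008, 0.010, 0.009, 0.011))]
lat, avg, worst = latency_stats(cap, rec)
```

Keeping the worst case below the frame period (5 ms at 200 Hz) is the practical bar for "real-time" transmission; the measured UDP figures show how close the pipeline gets.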
Experiment 59: ROS System TF Tree Configuration Modification Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 6.ConfigROSTFAPIDemo/Readme.pdf
📝 Experiment Overview:
Customize and modify the frame_id of the ROS system's TF tree via the Config.json configuration file and Python interface, enabling TF coordinate system configuration and modification for sensor data topics. This experiment helps master TF tree construction methods in distributed simulation.
Experiment 60: One-Click PyTorch Environment Installation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/6.Pytorch/Readme.pdf
📝 Experiment Overview: Learn to quickly configure a PyTorch deep learning environment using a one-click script, including automatic installation of dependencies such as CUDA and cuDNN.
Experiment 61: Ranging Sensor Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/1.ImgSenorAPI/6.RangingImageDemo/Readme.pdf
📝 Experiment Overview: Create a laser ranging sensor via the Python interface and acquire ranging data in real time. This experiment covers key topics including visual sensor configuration, distance data acquisition, image display, visual interface usage, sensor parameter configuration, and aircraft control commands.
Experiment 62: Simulation of Three Position Tracking Controllers
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/6.ThreeCtrlModes/Readme.pdf
📝 Experiment Overview:
Use the Python interface PX4MavCtrlV4.py to simultaneously control the aircraft’s target position and forward velocity during visual control. This experiment teaches the usage of three position tracking controllers: PosCtrl, VelCtrlBody, and VelCtrlEarth. Note: This experiment supports execution only in the Windows Python environment.
Experiment 63: AirSim Interface Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/7.AirSimAPITest/Readme.pdf
📝 Experiment Overview: Control the drone using the AirSim API via the Python interface, achieving position and attitude control.
Experiment 64: Infrared Grayscale and Thermal Image Acquisition
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/1.ImgSenorAPI/7.InfraredgrayThermalImageDemo/Readme.pdf
📝 Experiment Overview:
Acquire infrared grayscale and thermal camera images from RflySim 3D using the Python interface VisionCaptureApi, mastering visual sensor configuration and image acquisition methods.
Experiment 65: Livox LiDAR UDP Direct Point Cloud Transmission Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 7.LidarLivoxDemo/Readme.pdf
📝 Experiment Overview: Send image acquisition requests to RflySim 3D via the Python interface, retrieve 10 Hz point cloud data from the DJI Livox LiDAR, and use Open3D to display point clouds in real time, achieving distributed online simulation.
Experiment 66: Ceres and OpenCV Installation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/7.OpencvC++/Readme.pdf
📝 Experiment Overview: Learn how to quickly install the Ceres optimization library and OpenCV vision library on Ubuntu via scripts, including configuration files and one-click execution scripts, helping users rapidly set up a visual algorithm development environment.
Experiment 67: Getting Started with Anaconda
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/8.Anaconda/Readme.pdf
📝 Experiment Overview: Learn to manage Python environments using Anaconda, including basic operations such as environment creation, activation, and package installation and management.
Experiment 68: Point Cloud Data Transmission Experiment
- 📦 Version Requirement:
Free Edition - 📁 File Path: 8.LidarAPIDemo/Readme.pdf
📝 Experiment Overview:
Learn to receive RflySim 3D point cloud data via both shared memory and UDP methods, and master the transmission and processing workflows for LiDAR and world-coordinate-system point clouds.
Experiment 69: UDP Direct Transmission of Camera Gimbal Data Simulation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 9.CameraInfo/Readme.pdf
📝 Experiment Overview:
Run the server on Ubuntu and transmit image data via UDP direct transmission. Process the returned images, subscribe to screenshot emitter view window messages and gimbal control messages, and publish camera and gimbal data topics.
Experiment 70: Serial Port Hardware-in-the-Loop (HITL) Simulation
- 📦 Version Requirement:
Free Edition - 📁 File Path: 1-UsageAPI/9.serial_connect_HITL/Readme.pdf
📝 Experiment Overview:
Implement hardware-in-the-loop simulation with two serial port communications using PX4MavCtrl. Learn to configure serial port parameters, baud rates, and communication connections with the flight controller, and complete control command transmission and flight validation.
Experiment 71: OpenCV 4.10 Source Code Compilation on Ubuntu 22.04
- 📦 Version Requirement:
Free Edition - 📁 File Path: 0.Preparation/7.OpencvC++/opencv/Ubuntu22.04/Readme.pdf
📝 Experiment Overview:
Compile OpenCV 4.10 source code offline in the Ubuntu 22.04 environment, configure CUDA/GPU acceleration parameters, and complete compilation and installation via automated scripts or manual commands. Verify CUDA functionality for both C++ and Python versions.