Tutorials: Yahboom ROSMASTER M1
Yahboom ROSMASTER M1 can be equipped with various peripherals, including a 3D depth camera, a 2MP HD pan-tilt (PTZ) camera (optional), LiDAR, an AI voice module, and a ROS robot expansion board, giving it human-level 3D visual perception and environmental understanding. It supports Raspberry Pi 5, RDK X5, Jetson Nano 4GB, and Jetson Orin Nano 8GB, is fully compatible with ROS2 Humble, and integrates deeply with mainstream AI frameworks. Using an innovative multimodal dual-model collaborative reasoning architecture, it efficiently fuses visual, voice, and text information, enabling human-like capabilities such as continuous dialogue, instant interruption, dynamic scene reasoning, and intent inference.
Whether you are conducting SLAM mapping and navigation, AI visual recognition, or path planning research, or carrying out multimodal human-computer interaction experiments, this robot car can meet your needs.
- Mecanum omnidirectional drive chassis with high-torque 520 encoder motors
- Multi-controller platform compatibility, meeting needs from beginners to researchers
Yahboom ROSMASTER M1 supports multiple computing platforms, including Raspberry Pi 5, RDK X5, Jetson Nano 4GB, and Jetson Orin Nano 8GB. Compatible with ROS2 Humble, it adapts to needs ranging from school teaching and laboratory work to AI research, giving users a high degree of scalability and long-term usability.
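To give a feel for what driving the Mecanum chassis involves, here is a minimal sketch of Mecanum-wheel inverse kinematics: converting a desired body velocity into individual wheel speeds. The wheel radius and chassis dimensions below are illustrative placeholders, not the real ROSMASTER M1 measurements, and the sign convention is one common choice (vendors differ).

```python
# Mecanum inverse kinematics sketch: body twist (vx, vy, wz) -> wheel
# angular velocities. Dimensions are ASSUMED placeholders, not actual
# ROSMASTER M1 specs.

WHEEL_RADIUS = 0.04   # m (assumed)
HALF_LENGTH = 0.10    # m, half the wheelbase (assumed)
HALF_WIDTH = 0.09     # m, half the track width (assumed)

def mecanum_inverse(vx: float, vy: float, wz: float):
    """Return (front_left, front_right, rear_left, rear_right) wheel
    angular velocities in rad/s for a body twist, using a common
    X-configuration Mecanum convention."""
    k = HALF_LENGTH + HALF_WIDTH
    fl = (vx - vy - k * wz) / WHEEL_RADIUS
    fr = (vx + vy + k * wz) / WHEEL_RADIUS
    rl = (vx + vy - k * wz) / WHEEL_RADIUS
    rr = (vx - vy + k * wz) / WHEEL_RADIUS
    return fl, fr, rl, rr
```

Pure forward motion (vx only) yields four equal wheel speeds, while a pure lateral command (vy only) yields opposite-sign pairs on each side, which is what lets a Mecanum chassis strafe sideways without turning.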
- Multi-sensor fusion, building human-like 3D perception
Yahboom ROSMASTER M1 is equipped with a 3D depth camera, a 2MP HD PTZ camera, LiDAR, an AI voice module, and other peripherals, forming a multimodal environmental perception system that supports advanced applications such as visual recognition, SLAM mapping, and environmental understanding.
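As a taste of what processing the LiDAR data looks like, the sketch below filters a flat array of range readings, laid out the way a ROS2 `sensor_msgs/LaserScan` message stores them (`ranges[i]` is the reading at `angle_min + i * angle_increment`). It is plain Python with no ROS dependency; on the robot these values would arrive on a scan topic, and the stop distance and sector width here are illustrative values, not tuned parameters.

```python
import math

def min_range_in_sector(ranges, angle_min, angle_increment,
                        sector_start, sector_end):
    """Smallest finite range (m) whose beam angle falls within
    [sector_start, sector_end] radians; None if no valid reading."""
    best = None
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        if sector_start <= angle <= sector_end and math.isfinite(r):
            if best is None or r < best:
                best = r
    return best

def obstacle_ahead(ranges, angle_min, angle_increment,
                   stop_distance=0.3, half_sector=math.radians(15)):
    """True if any beam within +/- half_sector of straight ahead is
    closer than stop_distance (both values are illustrative)."""
    d = min_range_in_sector(ranges, angle_min, angle_increment,
                            -half_sector, half_sector)
    return d is not None and d < stop_distance
```

A check like this is the simplest building block of reactive obstacle avoidance; full SLAM and navigation stacks layer mapping and planning on top of the same raw scan data.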
- Multimodal human-like interaction with strong AI capabilities
- Highly scalable for research, meeting diverse experimental needs
Yahboom ROSMASTER M1 is suitable for advanced scenarios such as SLAM navigation, AI visual recognition, path planning, multimodal interaction, and embodied-intelligence research. It comes with a wealth of ROS teaching examples and open-source resources, making it easy to use in classroom teaching and research projects.
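To illustrate the path planning topic in that curriculum, here is a toy breadth-first-search planner on an occupancy grid. It is a generic teaching sketch, not code from the M1's software stack; real ROS navigation planners work on the same kind of grid but with costs and heuristics layered on top.

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free,
    1 = occupied), returned as a list of (row, col) cells from start
    to goal; None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # maps each visited cell to its predecessor
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk predecessors back to start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nxt = (nr, nc)
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None
```

On a 3x3 grid with a wall across the middle row, the planner routes around the open end; swapping BFS for A* with a distance heuristic is a natural follow-up exercise.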