STM32 + AI Controlled Robotic Hand Solution
Candidate for: EE Awards - Edge Computing Platform
Gesture Recognition and Control System Based on STM32N6 and STM32MP257
Built on the STM32N6 and STM32MP257 platforms, this system captures hand gesture data in real time via a camera. The STM32N6 uses its integrated NPU to accelerate a MediaPipe deep learning model, extracting the coordinates of 21 hand keypoints at 30 fps. These coordinates are transmitted to a programmable logic controller (PLC) over an RS-485 bus.
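The sketch below illustrates one way this keypoint hand-off could look on the STM32N6 side: the 21 landmarks are packed into a fixed-size binary frame and pushed out through a UART wired to an RS-485 transceiver. The frame layout, header byte, checksum, and HAL handle name are assumptions for illustration, not the project's actual protocol.

```c
/* Minimal sketch: pack 21 MediaPipe hand keypoints into a binary frame
 * and transmit it over a UART attached to an RS-485 transceiver.
 * Frame layout, header byte and checksum are illustrative assumptions. */
#include <stdint.h>
#include <string.h>
#include "stm32n6xx_hal.h"        /* assumed HAL header for the STM32N6 */

#define NUM_KEYPOINTS 21U

typedef struct {
    float x, y, z;                /* normalized landmark coordinates */
} keypoint_t;

extern UART_HandleTypeDef huart2; /* UART feeding the RS-485 driver (assumed) */

void send_keypoints_rs485(const keypoint_t kp[NUM_KEYPOINTS])
{
    uint8_t frame[2 + NUM_KEYPOINTS * sizeof(keypoint_t) + 1];
    uint8_t checksum = 0;

    frame[0] = 0xAA;              /* start-of-frame marker (assumed) */
    frame[1] = NUM_KEYPOINTS;     /* payload count */
    memcpy(&frame[2], kp, NUM_KEYPOINTS * sizeof(keypoint_t));

    for (size_t i = 0; i < sizeof(frame) - 1U; ++i)
        checksum ^= frame[i];     /* simple XOR checksum over header + payload */
    frame[sizeof(frame) - 1U] = checksum;

    /* Blocking transmit: the ~255-byte frame fits comfortably inside the
       33 ms period of a 30 fps update on a 1 Mbit/s RS-485 link. */
    HAL_UART_Transmit(&huart2, frame, sizeof(frame), 10);
}
```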
The dual-core STM32MP257-based PLC runs the CODESYS platform, supports multiple communication protocols, and performs the joint-angle calculations. It converts the received coordinates into robotic hand control commands and drives the RuiYan RY-H1 robotic hand over the CAN protocol. The hand employs a 15-degree-of-freedom linear actuation design controlled by five STM32G4 microcontrollers. Combined with a force-position hybrid algorithm, it achieves biomimetic motion and adaptive grasping without requiring force sensors.
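As a sketch of the joint-angle step, the bend at a finger joint can be estimated from three adjacent hand landmarks as the angle between the two bone vectors meeting at the middle point. The landmark chain, struct layout, and function name below are assumptions for illustration; the project's actual kinematic mapping is not specified here.

```c
/* Minimal sketch: estimate the flexion angle at a finger joint from three
 * adjacent hand keypoints (e.g. MCP -> PIP -> DIP). The struct layout
 * matches the earlier keypoint sketch and is an assumption. */
#include <math.h>

typedef struct { float x, y, z; } keypoint_t;

/* Returns the angle in degrees at the middle point b of the chain a-b-c. */
float joint_angle_deg(keypoint_t a, keypoint_t b, keypoint_t c)
{
    float v1x = a.x - b.x, v1y = a.y - b.y, v1z = a.z - b.z;
    float v2x = c.x - b.x, v2y = c.y - b.y, v2z = c.z - b.z;

    float dot  = v1x * v2x + v1y * v2y + v1z * v2z;
    float len1 = sqrtf(v1x * v1x + v1y * v1y + v1z * v1z);
    float len2 = sqrtf(v2x * v2x + v2y * v2y + v2z * v2z);

    if (len1 == 0.0f || len2 == 0.0f)
        return 0.0f;                      /* degenerate input */

    float cosang = dot / (len1 * len2);
    if (cosang >  1.0f) cosang =  1.0f;   /* clamp numeric noise */
    if (cosang < -1.0f) cosang = -1.0f;

    return acosf(cosang) * 180.0f / 3.14159265f;
}
```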
The system integrates ST's X-CUBE-AI model deployment capabilities and supports industrial communication protocols (EtherCAT/Modbus). Built on a heterogeneous computing architecture, it achieves a millisecond-level closed loop from gesture recognition to mechanical execution. The solution is suited to industrial applications such as automated assembly and intelligent grasping, offering both high precision and strong real-time performance.
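To round out the closed loop, the sketch below shows one way the final execution step could be issued: sending a single joint target to the hand over CAN. It assumes the Linux side of the STM32MP257 exposes the bus via SocketCAN as "can0"; the CAN identifier scheme and payload encoding for the RY-H1 hand are illustrative assumptions, not the device's documented protocol.

```c
/* Minimal sketch: transmit one joint target over CAN via SocketCAN.
 * Interface name, CAN ID base and payload encoding are assumptions. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int send_joint_target(uint8_t joint_id, uint16_t angle_centideg)
{
    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
    if (s < 0) { perror("socket"); return -1; }

    struct ifreq ifr;
    memset(&ifr, 0, sizeof(ifr));
    strncpy(ifr.ifr_name, "can0", IFNAMSIZ - 1);      /* assumed interface */
    if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) { perror("ioctl"); close(s); return -1; }

    struct sockaddr_can addr = {0};
    addr.can_family  = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind"); close(s); return -1;
    }

    struct can_frame frame = {0};
    frame.can_id  = 0x200 + joint_id;   /* assumed per-joint identifier base */
    frame.can_dlc = 3;
    frame.data[0] = joint_id;
    frame.data[1] = (uint8_t)(angle_centideg & 0xFF);   /* target angle, LSB */
    frame.data[2] = (uint8_t)(angle_centideg >> 8);     /* target angle, MSB */

    int ok = (write(s, &frame, sizeof(frame)) == sizeof(frame)) ? 0 : -1;
    close(s);
    return ok;
}
```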