
Face Expression Recognition Robot with Arduino UNO Q | DigiKey

The Arduino UNO Q is a unique hybrid board that blends a Qualcomm QRB2210 application processor with an STM32U585 microcontroller, making it ideal for projects that need both AI/ML inference and real-time control. Unlike typical single-board computers, the UNO Q shines in headless edge-AI applications (e.g. computer vision, local web servers, ONNX/TensorFlow Lite inference, and USB-camera processing) while the MCU handles the precise timing required for servos, LEDs, and sensors. That combination makes it a powerful platform for robotics, mechatronics, smart cameras, and interactive art.

To demonstrate what this hybrid architecture can do, I built a pan/tilt face-tracking robot with a USB webcam and a NeoPixel LED ring. The Qualcomm side runs a custom YOLOv11-nano facial-expression recognition model, while the microcontroller side drives two hobby servos using PID loops for smooth motion. When the robot detects a face, it automatically rotates to follow the largest bounding box (usually the closest face) and then changes its LED color or pattern based on the predicted emotion ("happy," "surprised," "neutral," and so on). Everything runs entirely on-device; no cloud services or external GPU required.

If you'd like to build your own version of this pan/tilt emotion recognition robot, check out the written instructions here: https://www.digikey.com/en/maker/projects/face-expression-detection-robot-with-arduino-uno-q/f3a3ad34a4b34be3a55177b1f893816b

Mechanical files and code for this project can be found here: https://github.com/ShawnHymel/face-expression-detection-robot

To get expression recognition working efficiently on low-power hardware, the YOLOv11-nano model is retrained using transfer learning. Starting from the pretrained COCO weights, the head of the network is fine-tuned on a labeled facial-expression dataset from Roboflow. After training, the model is exported to ONNX, optimized, and deployed directly onto the UNO Q.
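To give a feel for the tracking logic described above, here is a minimal Python sketch (not the project's actual code; the detection tuple format, function name, and RGB color choices are all illustrative assumptions) of how the largest confident face detection can be selected and its predicted expression mapped to an LED color:

```python
# Hypothetical detection format: (x, y, w, h, expression, confidence).
# Real YOLO output would be decoded from the model's raw tensors first.

def pick_target(detections, min_conf=0.5):
    """Return the largest confident face detection, or None if no face is seen."""
    faces = [d for d in detections if d[5] >= min_conf]
    if not faces:
        return None
    # Largest bounding-box area is used as a proxy for "closest face".
    return max(faces, key=lambda d: d[2] * d[3])

# Illustrative expression-to-RGB mapping for the NeoPixel ring.
EXPRESSION_COLORS = {
    "happy": (0, 255, 0),
    "surprised": (255, 255, 0),
    "neutral": (0, 0, 255),
}

def led_color(expression):
    """Fall back to white for expressions without an assigned color."""
    return EXPRESSION_COLORS.get(expression, (255, 255, 255))
```

In the real project, the chosen box's center would then be compared against the image center to produce pan/tilt error terms for the servos.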
A Python script running on the Linux side handles image capture, preprocessing, and inference, then sends servo targets and expression states to the MCU via Arduino's new Bridge RPC system. This separation lets the CPU focus on computer vision while the microcontroller handles movement. The result is a responsive, battery-friendly robot that blends computer vision, machine learning, and real-time robotics into a single compact platform. Whether you want to recreate this exact build or expand it into something more advanced (e.g. gesture-reactive art, smart home sensing, human-robot interaction, mobile robotics), the UNO Q's hybrid design makes it a powerful tool for modern edge-AI projects.
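As a rough illustration of the PID idea used for the servos (in the actual project the control loops run on the STM32 side; the class name, gains, and output limits below are made up for demonstration), a discrete PID controller can be sketched like this:

```python
class PID:
    """Toy discrete PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, out_min=-90.0, out_max=90.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        # No derivative kick on the very first sample.
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        # Clamp to the servo's allowed correction range.
        return max(self.out_min, min(self.out_max, out))
```

Here the error would be the offset between the face's bounding-box center and the image center, and the clamped output a pan or tilt velocity command; the derivative term damps overshoot so the camera settles smoothly on the face instead of oscillating.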

1/2/2026 7:11:11 PM