This is my latest project video. I made two replicas of the XBOT 4000 face (from the Netflix series "Love Death + Robots").
One of these is just a static version hooked up to a Raspberry Pi and a LiDAR sensor. The other is controlled by an NVIDIA Jetson running MTCNN face-tracking software. This version uses a total of ten XL-320 servos driven by an OpenCM 9.04 running custom firmware (I had two of these controllers lying around, so I thought it best to put them to good use :)).
Two servos are used for a camera gimbal and the remaining eight are used for the animatronic eye mechanism.
The code running on the Jetson can detect faces in the video feed at approximately 15 frames per second. If it detects a single face, it looks at the bounding box position relative to the image and attempts to align the camera so that the bounding box is centered. As long as the face is off center, the Jetson sends commands to the OpenCM controller to adjust the camera gimbal. The four servos controlling the eyeball track the camera movement, making it appear as though the robot is looking directly at the person standing in front of it. If the person moves, the robot's eyes seem to follow them.
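The centering step can be sketched in Python. This is a minimal illustration, not the actual firmware: the function name, gain, and deadband values are my own assumptions, and the real code would feed the result into commands sent to the OpenCM controller.

```python
def gimbal_correction(box, frame_w, frame_h, gain=0.1, deadband=0.05):
    """Compute hypothetical pan/tilt adjustments (normalized units) that
    nudge the camera so the face bounding box ends up centered.

    box is (x, y, w, h) in pixels, as returned by a face detector such
    as MTCNN. A small deadband keeps the gimbal still once the face is
    close enough to center, avoiding servo jitter.
    """
    cx = box[0] + box[2] / 2.0
    cy = box[1] + box[3] / 2.0
    # Error in [-0.5, 0.5]: how far the box center sits from frame center.
    err_x = cx / frame_w - 0.5
    err_y = cy / frame_h - 0.5
    pan = -gain * err_x if abs(err_x) > deadband else 0.0
    tilt = -gain * err_y if abs(err_y) > deadband else 0.0
    return pan, tilt
```

A face in the top-left corner yields positive pan and tilt corrections, while a centered face yields (0.0, 0.0), so the loop settles once the bounding box is aligned.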
The four remaining servos are dedicated to the eyelids, which allows for a wide range of expressions (sleepy, surprised, winking, etc.).
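With four independent eyelid servos, expressions reduce to presets of servo goal positions. A rough sketch of how such presets might look, assuming XL-320 goal positions in the 0-1023 range; the specific values and the (upper, lower) ordering are illustrative guesses, since the real numbers depend on how the servo horns are mounted:

```python
# Hypothetical goal positions for the four eyelid servos:
# (left upper, left lower, right upper, right lower).
EXPRESSIONS = {
    "neutral":   (512, 512, 512, 512),
    "sleepy":    (400, 560, 400, 560),   # upper lids drooped
    "surprised": (650, 380, 650, 380),   # lids pulled wide open
    "wink":      (512, 512, 300, 650),   # right eye closed
}

def expression_goals(name):
    """Look up servo goal positions for a named expression,
    falling back to neutral for unknown names."""
    return EXPRESSIONS.get(name, EXPRESSIONS["neutral"])
```

Each preset would then be written to the servos' goal-position registers via the OpenCM firmware.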