
We have started working on the head for Spot Puppy and, although the design is still rudimentary, we now have the basic structure to work with. The head contains three components: an OLED screen to bring Spot Puppy’s eyes alive, a Pi Camera for computer vision processing, and a Time-of-Flight (ToF) sensor to measure distances from the camera.
We suspect the quadruped movement may cause too much motion blur in the camera image, so we have a contingency plan to add an accelerometer and servos to the head design to provide automatic head stabilisation.
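To make the stabilisation idea concrete, here is a minimal sketch of the sort of calculation involved: estimate the body's pitch from the accelerometer and counter-rotate a head servo by the same angle. The axis conventions and the 90-degree servo neutral here are our illustrative assumptions, not a finished design, and a real implementation would also need filtering to cope with acceleration from the gait itself.

```python
import math

# Sketch of the contingency head-stabilisation idea: estimate body pitch
# from the accelerometer (gravity vector) and counter-rotate a head servo.
# Axis conventions and the 90-degree servo neutral are assumptions.

def pitch_degrees(ax: float, ay: float, az: float) -> float:
    """Pitch angle (degrees) from accelerometer readings, valid when the
    robot is moving slowly enough that gravity dominates the reading."""
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

def head_servo_angle(ax: float, ay: float, az: float,
                     neutral: float = 90.0) -> float:
    """Counter the body pitch so the head stays roughly level."""
    return neutral - pitch_degrees(ax, ay, az)
```

In practice the accelerometer alone is noisy while walking, so we would likely blend it with a gyroscope reading (a complementary filter) before driving the servos.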
As you can see below, we have been using cardboard and basic 3D prints for our initial prototypes, which we have now evolved into our generation 1, fully 3D-printed head.
Our design for Spot Puppy uses an Arduino to drive the OLED display, connected via a serial line to the Raspberry Pi. The Pi will control the camera and the ToF sensor, and our aim is to send command messages to the Arduino so that the eyes can react to things Spot Puppy senses in front of its head. We are currently using an Arduino Mega board, but the plan is to replace the Mega with a Nano that will sit inside the head.
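The kind of Pi-to-Arduino command messages we have in mind could look something like the sketch below. The command names and the simple newline-terminated text protocol are illustrative assumptions; we have not finalised the message format yet.

```python
# Sketch of Pi -> Arduino eye-command messages. The action names and the
# "ACTION:VALUE\n" text protocol are assumptions for illustration only.

VALID_ACTIONS = {"BLINK", "LOOK_LEFT", "LOOK_RIGHT", "WIDE", "SLEEP"}

def make_eye_command(action: str, value: int = 0) -> bytes:
    """Encode an eye command as a simple 'ACTION:VALUE\n' message."""
    if action not in VALID_ACTIONS:
        raise ValueError(f"unknown eye action: {action}")
    return f"{action}:{value}\n".encode("ascii")

# On the robot this would be written to the Arduino's serial port with
# pyserial, e.g.:
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 115200)
#   port.write(make_eye_command("BLINK"))
```

Keeping the protocol as plain text makes it easy to debug from the Arduino serial monitor while we experiment.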
The videos below show the first time we ran the computer vision code, the distance-sensing code and the eye animation code all at the same time. The computer vision code is written in Python using OpenCV, and this version is a rudimentary detection system for the Raspberry Pi box. The ToF sensor also seems to be working well, and we intend it to be a secondary input to help us make sense of what the camera is seeing.
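One way the ToF sensor could act as that secondary input is as a sanity check on each camera detection: estimate the distance from the apparent size of the detected box and only accept the detection if it roughly agrees with the ToF reading. The focal length and box width below are made-up placeholder numbers, not our calibration values.

```python
# Sketch: using the ToF reading to cross-check a camera detection.
# FOCAL_LENGTH_PX and BOX_WIDTH_MM are illustrative assumptions; real
# values would come from calibrating the Pi Camera and measuring the box.

FOCAL_LENGTH_PX = 600.0   # assumed camera focal length, in pixels
BOX_WIDTH_MM = 95.0       # assumed real width of the Raspberry Pi box

def camera_distance_mm(bbox_width_px: float) -> float:
    """Estimate distance from the apparent width of the detected box,
    using the pinhole model: distance = focal_length * width / pixels."""
    return FOCAL_LENGTH_PX * BOX_WIDTH_MM / bbox_width_px

def detection_agrees(bbox_width_px: float, tof_mm: float,
                     tolerance: float = 0.25) -> bool:
    """Accept the detection only if the camera-based distance estimate
    is within `tolerance` (fractional) of the ToF measurement."""
    est = camera_distance_mm(bbox_width_px)
    return abs(est - tof_mm) <= tolerance * tof_mm
```

This kind of agreement test should help reject false positives from the rudimentary detector, since a wrongly sized "detection" will disagree with the ToF distance.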
We have a lot more work to do, including spray painting some very large barrels red and green in preparation for the Eco Disaster event.