When we entered Pi Wars our ambition was simply to demonstrate that a walking robot could compete in the event, so we are really proud to have been declared the total score winner. To have also come second in the public vote for favourite robot, just behind the brilliant MacFeegle Prime, was the icing on the cake.
I am really pleased to announce that we have met our Christmas target of creating a version of Spot Puppy that we can control via a Bluetooth controller to navigate around a simple set of obstacles. All of the core components (servos, camera, time-of-flight sensor, accelerometer, battery, Arduino Nano, OLED and of course the Pi) are incorporated in this version, so we are in a good position for Pi Wars 2020.
We made a lot of progress through December, starting with a great deal of excitement from seeing Spot Puppy’s very first steps. However, as you can see, it was far from perfect and at that point we had very little control over the direction (hitting the wall was not intentional!)
We still have a lot of work to do, including perfecting the auxiliary components for barrel pushing, shooting and balloon popping. However, we will be focusing next on the autonomous events, starting with line following.
We have started working on the head for Spot Puppy and although the design is still very basic, we now have the basic structure to work with. The head contains three components: an OLED screen to bring Spot Puppy’s eyes alive, a Pi Camera for computer vision processing, and a Time-of-Flight sensor to measure the distance to whatever is in front of the camera.
We suspect the quadruped movement may cause too much motion blur on the camera so we have a contingency plan to also add an accelerometer and servos to the head design to provide automatic head stabilisation.
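At its simplest, the stabilisation plan is a counter-rotation: estimate the body tilt from the accelerometer and drive the head servo by the opposite angle. The sketch below illustrates the idea only; the function names, the 30-degree servo limit and the 90-degree neutral position are illustrative assumptions, not code from the robot.

```python
import math

def pitch_from_accel(ax, az):
    """Estimate body pitch in degrees from accelerometer X and Z readings (in g)."""
    return math.degrees(math.atan2(ax, az))

def stabilise(tilt_deg, neutral_deg=90.0, limit_deg=30.0):
    """Return the head servo angle that counter-rotates the measured body tilt.

    The correction is clamped to the servo's mechanical range so a big lurch
    from the gait cannot drive the head into its end stops.
    """
    correction = max(-limit_deg, min(limit_deg, -tilt_deg))
    return neutral_deg + correction
```

So a body pitched 10 degrees nose-down would be cancelled by tilting the head 10 degrees the other way; in practice the raw accelerometer signal would also need filtering before it is usable.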
As you can see below we have been using cardboard and basic 3D prints for our initial prototypes which we have now converted to our generation 1, fully 3D printed head.
Our design for Spot Puppy utilises an Arduino to drive the OELD display which is connected via a serial line to the Raspberry Pi. The Pi will control the camera, the ToF sensor and our aim is to send command messages to the Arduino to allow the eyes to react to things that Spot Puppy senses in front of its head. We are currently using a Arduino Mega board but the plan is to replace the Mega with a Nano which will sit inside the head.
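One way to picture the command messages is as a tiny one-line text protocol over the serial link. The framing ("EYE:<name>\n") and the command names below are a sketch of our own, not a finalised protocol:

```python
def make_eye_command(name, *args):
    """Frame an eye command as a single newline-terminated ASCII line.

    Example frame: b"EYE:LOOK,40,12\n" (hypothetical command and arguments).
    """
    parts = [name.upper()] + [str(a) for a in args]
    return ("EYE:" + ",".join(parts) + "\n").encode("ascii")

def parse_eye_command(line):
    """Inverse of make_eye_command, as the Arduino sketch would implement it."""
    body = line.decode("ascii").strip()
    if not body.startswith("EYE:"):
        raise ValueError("not an eye command")
    name, *args = body[4:].split(",")
    return name, args
```

On the Pi side a frame like this would be written to the serial port with pyserial, e.g. `serial.Serial("/dev/ttyUSB0", 115200).write(make_eye_command("BLINK"))` (port name and baud rate are placeholders).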
The videos below show the first time we ran the computer vision code, the distance sensing code and the eye animation code all at the same time. The computer vision code is written in Python utilising OpenCV and this version is a rudimentary detection system for the Raspberry Pi box. The ToF sensor also seems to be working well and we intend this to be a secondary input to help us make sense of what the camera is seeing.
We have a lot more work to do, including spray painting some very large barrels red and green in preparation for the Eco Disaster event.
It is mid November and everything is starting to come together for Spot Puppy. The legs are now attached to our first generation body and the addition of an accelerometer has allowed us to develop an active balance system as you can see in the video below.
We still have some problems with servo shake and we need to investigate damping techniques to try to eliminate it. We would love to hear from you if you have any good ideas on addressing the issue.
The videos below show Spot Puppy prior to the active balance system. As you can see, simply going up and down was fine, but as soon as we attempted walking we realised we needed the accelerometer and much better balance.
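At its core, an active balance system like this is a feedback loop: read roll and pitch from the accelerometer and lengthen or shorten each leg to level the body. A minimal proportional sketch of that loop is below; the gain, sign conventions and leg names are illustrative assumptions, not our tuned values.

```python
def balance_corrections(roll_deg, pitch_deg, gain=0.5):
    """Return per-leg height corrections (mm) that push the body back level.

    Convention here: positive pitch = nose low, positive roll = right side
    low.  Each leg is extended or shortened in proportion to the tilt on its
    corner of the body.  A real controller also needs accelerometer filtering
    and limits on leg travel.
    """
    signs = {
        "front_left": (1, -1), "front_right": (1, 1),
        "back_left": (-1, -1), "back_right": (-1, 1),
    }
    return {leg: gain * (front * pitch_deg + right * roll_deg)
            for leg, (front, right) in signs.items()}
```

With a nose-low pitch of 10 degrees, this extends both front legs and shortens both back legs by the same amount, tilting the body back towards level.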
We are aiming to have the core movement and balance system completed before Christmas, along with all the sensors wired in. Our current plans are to include a camera, a time-of-flight sensor and an OLED screen.
After many iterations of our Spot Puppy leg design, which we documented in the evolution of a leg design blog post, we have finally printed a set of four generation 5 Spot Puppy legs, which we hope will be the final design. The legs are printed on an Ultimaker 3 and we are using metal-geared servos to drive three points of rotation: two at the hip and one at the knee. The dual hip servos allow rotation both in the forward/backward plane and in the perpendicular outward/inward plane.
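With the thigh and knee servos moving the foot in a single plane, placing the foot is a classic two-link inverse kinematics problem, solved with the law of cosines. The sketch below shows the geometry only; the link lengths and angle conventions are placeholders, not our actual leg dimensions or control code.

```python
import math

def leg_angles(x, y, thigh=60.0, shin=60.0):
    """Planar inverse kinematics for the thigh and knee servos.

    Given a foot target (x forward, y down, in mm, relative to the hip),
    return (thigh_angle, knee_angle) in degrees, where a knee angle of 180
    means the leg is fully straight.
    """
    d = math.hypot(x, y)
    if d > thigh + shin:
        raise ValueError("target out of reach")
    # Law of cosines gives the knee angle from the hip-to-foot distance,
    # then the thigh angle is the target direction plus the interior angle
    # at the hip.
    knee = math.degrees(math.acos((thigh**2 + shin**2 - d**2) / (2 * thigh * shin)))
    hip_interior = math.acos((thigh**2 + d**2 - shin**2) / (2 * thigh * d))
    thigh_angle = math.degrees(math.atan2(x, y) + hip_interior)
    return thigh_angle, knee
```

As a check on the conventions: a foot directly below the hip at full extension gives a thigh angle of 0 and a knee angle of 180 (straight leg).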
The sequence of photos below shows the assembly process for the legs, from the original 3D print to first assembly.
And here are the fully built legs, including servo horns and squash-ball feet.
To help with the development of our control library and in particular the creation of different gait routines (walking, trotting, running, etc) we decided to develop a simulator to capture the control signals from the control library and visualise them on a screen. This resulted in a much quicker development cycle as the impact of changes to the control library and the gait routines could be tested immediately. An additional benefit was the running speed of the visualisations could be easily reduced to create a “slo-mo” version which was much easier to analyse.
Before the simulator we would make the changes to the control library, run the changes on the robot while filming, and then study the slo-mo film to try and work out what was going on.
A key design feature of the visualisation process was to ensure the control library is completely unaware of whether it is producing output for a real robot or for the visualiser. Another design feature was to be able to capture the raw control library output to a file so it can be easily replayed and compared to other runs. All the code is written in Python and it makes extensive use of the matplotlib library.
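The shape of that design can be sketched as a common "output" interface with interchangeable sinks: the control library calls send() on whatever sink it was given and never knows whether the frames go to real servos, the visualiser or a capture file. The class names and the JSON-lines capture format below are our own illustration of the pattern, not the actual library code.

```python
import json

class FileOutput:
    """Capture raw control-library output (per-frame joint angles) to a file.

    Frames are written one JSON object per line, so a run can be replayed
    later or diffed line-by-line against another run.
    """
    def __init__(self, path):
        self._file = open(path, "w")

    def send(self, joint_angles):
        self._file.write(json.dumps(joint_angles) + "\n")

    def close(self):
        self._file.close()

def replay(path):
    """Yield each captured frame in order, e.g. to feed a matplotlib visualiser."""
    with open(path) as f:
        for line in f:
            yield json.loads(line)
```

A servo-driving sink or a matplotlib sink would expose the same send() method, so switching between the robot, the visualiser and the recorder is just a matter of which object the control library is handed.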
The leg design of Spot Puppy has already gone through a significant evolution based on iterative experimentation and learning. The early versions were designed to be extremely light, as shown in the picture below. This was mostly due to our concern about the available power of a micro servo. However, after early testing it became clear that the design was too flimsy, with too much flex. We also discovered the micro servos were more powerful than we originally thought.
With the knowledge we had acquired from testing our first generation of the leg we made significant design changes and moved on to generation 2, as shown below. This version of the leg design was much stronger and included screw holes for mounting the servos. We printed a pair of legs and, using a cardboard box for a body and pencils for the back legs, we had our first moving prototype. Through testing we found some more design issues, the biggest of which was the single attachment point for the servos.
With all of our learning from generations 1 & 2 we moved on to generation 3. This version of the leg included attachment points on both sides of the servo, which made it feel much stronger. To see how this version of the leg performed we decided to combine a test with an early version of our control library, so we built a painting robot that used a Spot Puppy leg as its arm. Although we could still see flaws in both the leg design and the control library it was immensely satisfying to see our robot paint a simple picture.
Encouraged by the success of our painting robot we made a small modification to the foot and printed two more generation 3 legs. With the legs assembled we further developed the control library to include an early understanding of walking and running gaits, and set about testing with the very satisfactory result shown below.
We are still making refinements to the leg and completing further experimentation to determine the best foot design but we are very happy with the progress so far.
Spot Puppy will have three servos per leg, as shown in the conceptual pictures below: one for outward motion at the hip, and two for forward and upward motion at the thigh and knee. An onboard Raspberry Pi will run control code written in Python, operating as autonomously as possible. Human control will also be available through a Bluetooth handheld controller.
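To give a flavour of what the control code has to produce, a walking gait boils down to a repeating foot trajectory per leg: the foot swings forward while lifted, then slides back while on the ground, with diagonal leg pairs half a cycle out of phase for a trot. The sketch below is purely illustrative; the stride, lift and phase numbers are made-up values, not anything we have tuned yet.

```python
import math

# Diagonal pairs share a phase for a trot-style gait (illustrative offsets).
PHASE = {"front_left": 0.0, "back_right": 0.0, "front_right": 0.5, "back_left": 0.5}

def foot_position(leg, t, period=1.0, stride=40.0, lift=15.0):
    """Return (x, height) of one foot at time t, in mm relative to neutral.

    First half of the cycle: swing (foot lifted, moving forward).
    Second half: stance (foot on the ground, moving backward).
    """
    phase = (t / period + PHASE[leg]) % 1.0
    if phase < 0.5:
        x = stride * (4 * phase - 1)           # -stride .. +stride
        height = lift * math.sin(2 * math.pi * phase)
    else:
        x = stride * (3 - 4 * phase)           # +stride .. -stride
        height = 0.0
    return x, height
```

Each (x, height) target would then be turned into servo angles by the leg's inverse kinematics, so the gait code never has to reason about joint angles directly.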
We will post frequent updates to this blog about the progress of the build and the actual competition in 2020.