Wednesday, May 25, 2022

Toot Toot

There's a part of Shepherds Pi and the Farmyard Tour that gives points for audio control, so this is our take on it. It doesn't do much: it listens for a whistle and provides a basic interpretation into commands. One whistle 'toot' is a command, as are two and three, and after a brief interlude the last command is cancelled.



For this, a Pi Pico is paired with a microphone....and a whistle...feeding the Pico's ADC with the audio signal. An FFT is then run against the signal to pick out the whistle's frequency, 2.7kHz in this case, and blasts are converted into a logic signal.
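As a rough sketch of the frequency-picking step, here's the idea in desktop numpy (the Pico runs its own ADC code; the sample rate, window length and thresholds here are illustrative guesses, not the real settings):

```python
import numpy as np

SAMPLE_RATE = 10_000   # Hz -- an assumed ADC sample rate, not the Pico's actual one
TARGET_HZ = 2_700      # the whistle frequency quoted above
TOLERANCE_HZ = 150     # how far off the peak may sit and still count

def whistle_present(samples, rate=SAMPLE_RATE):
    """True if the spectrum has a dominant peak near the whistle frequency."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    peak = np.argmax(spectrum)
    dominant = spectrum[peak] > 10 * np.median(spectrum)   # ignore broadband noise
    return bool(dominant and abs(freqs[peak] - TARGET_HZ) < TOLERANCE_HZ)

# synthetic check: 50 ms of 2.7 kHz tone in light noise, then noise alone
t = np.arange(0, 0.05, 1.0 / SAMPLE_RATE)
rng = np.random.default_rng(0)
toot = np.sin(2 * np.pi * TARGET_HZ * t) + rng.normal(0, 0.1, t.size)
print(whistle_present(toot))                        # True
print(whistle_present(rng.normal(0, 0.1, t.size)))  # False
```

Running one window every few tens of milliseconds gives the on/off logic signal the command interpreter consumes.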




As with the other attachments, this supports the WHOU enquiry and GPIO handshake, and returns TOOT, TOO2 and TOO3 as commands. Power and serial comms are via USB for easy attachment.
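Interpreting blasts into commands then amounts to counting rising edges in the logic signal once a command train has ended. A minimal sketch (the command strings TOOT/TOO2/TOO3 are the real ones above; the framing into complete trains is simplified):

```python
COMMANDS = {1: "TOOT", 2: "TOO2", 3: "TOO3"}   # reply strings from the post

def decode_blasts(logic):
    """Map a completed train of whistle blasts to a command string.
    `logic` is the thresholded FFT output, one 0/1 per analysis frame."""
    blasts = sum(1 for prev, cur in zip([0] + logic, logic) if cur and not prev)
    return COMMANDS.get(blasts)   # None for anything unrecognised

print(decode_blasts([0, 1, 1, 0, 0, 1, 1, 1, 0]))  # 'TOO2'
print(decode_blasts([0, 1, 0, 1, 0, 1, 1, 0]))     # 'TOO3'
```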

Seems a bit of a small post, but it's a complete sub-project that contributes.



Monday, May 23, 2022

Using the remote

With so many challenges to solve, it has been easy to forget that the Farmyard Tour is remote controlled and not autonomous, so we needed a solution a bit more dextrous than a keyboard. Using a full RC set with adapters was always an option, but a more personal system was preferred, and an almost universal games controller format was chosen.



It could have been any brand, but it turned out to be a low-cost PS3 controller from a no-name Chinese source. As the main Pi controller just wanted a data feed it could translate, the interface chosen to the controller was Bluetooth, run by a small ESP32 module dedicated to the communication. This was a lot less bother than direct comms and meant that the controller Pi wasn't cluttered with noisy code. Here is a larger dev board for testing, mounted on a test chassis. The communication is via serial over USB and a simple handshaking option is used to request data from the PS3.


One of the issues with this configuration is that when the controller times out and disconnects, the ESP32 code can't reconnect, so an extra pin was dedicated to rebooting the ESP32 back into connection mode.
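The reboot decision is essentially a receive-timeout watchdog. A sketch of the logic (the timeout value is an illustrative guess; on the robot, a positive needs_reboot() would drive a pulse on the dedicated reset pin rather than just return True):

```python
import time

TIMEOUT_S = 2.0   # assumed: no controller data for this long means the link dropped

class LinkWatchdog:
    """Decide when to pulse the ESP32's reset pin after a Bluetooth drop-out."""

    def __init__(self, timeout=TIMEOUT_S, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock          # injectable for testing
        self.last_rx = clock()

    def data_received(self):
        """Call whenever a PS3 data frame arrives over serial."""
        self.last_rx = self.clock()

    def needs_reboot(self):
        """True once the quiet period exceeds the timeout."""
        return self.clock() - self.last_rx > self.timeout
```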

Here's a brief video of the test chassis twirling in the arena..........and negotiating a bit of garden.









Thursday, May 19, 2022

A new chassis

So after lots of testing....well a bit anyway...we needed to build a new chassis, if only because we'd drilled so many holes in the old one! 

First make the battery holder which is also part of the suspension. Here it is.



Well, that's spectacular isn't it! It's very basic: made from a piece of acrylic tube, one end has a fixed plastic plug with a battery contact permanently attached, and the other has a plastic spring holding the other contact in, retained by a screw. The tube itself forms the central pivot of the suspension.

The two halves of the chassis were updated to accommodate the positioning of the camera as well as the main controller and attachment processors.

This is the rear subframe from below, the motors fixed in place either side of a tunnel which houses the battery tube. 

And a picture of the front subframe, housing motors, a camera and illumination LEDs, with the second half of the battery tunnel shown.

Putting them together, with an empty tube in the middle. The two subframes can rotate independently of each other around the battery tube. This isn't of much use in the arena challenges, but when negotiating the farm tour obstacle course it's expected to come into its own to keep the chassis stable and in contact with the ground.

As mentioned previously, part of the rebuild was to accommodate a camera.
The camera is mounted below the attachment deck of the chassis (the bolt holes for attachments can be seen top left and right) and faces down and forward, looking at the arena surface ahead of the robot. The LEDs ensure there is plenty of illumination for the camera so it isn't dependent on ambient light, eliminating an uncertain variable.



Close-up of the two subframes partly rotated against each other to show the suspension in operation. Already, with only the motor, LEDs and camera fitted there are a lot of wires in this robot and cable management is important, hence the prepared holes in the subframes.

From the outset we wanted interchangeable wheels to adapt to the different challenges. These are 'arena' wheels with their adapters. The adapters were always part of the design as with so many different motor shaft attachments available we needed to be able to switch them without having to remake whole wheels.



Now it looks a lot more like something with the wheels attached and wires routed. The front chassis mounts for attachments can be seen, and that growing wiring is getting some management.

And suddenly there's a wire explosion!!! At the rear (right) is the main power switch, and just ahead of it is the controller Pi board, mounted vertically. At the far side is the power regulator. Raw power from the batteries is delivered at around 11V and used by the attachments; the regulator ensures the Pi is supplied with a healthy 5V. 
Nearest the camera is the motor control Pico, which receives commands from the controller and incorporates an LED to provide a simple status display. This connects to the controller via USB. 

And here is the assembled new chassis with the Hungry Cattle hopper attachment. 
There's a lot of parts to this but it does work very well. Only a short video of this chassis working but it gives an idea of the basics.




Next up, a blog describing the architecture of these robots.





Thursday, March 17, 2022

Attachment action

After a bit of a delay, we're now entering the phase of actually testing some solutions.

First up is the Shepherds Pi herding solution. The chassis is working well and to get the basic herder fitted just required some bolts, power and a USB cable. This wasn't just a sudden event and the interface has undergone a bit of refinement.


What looks to be the final interface is to run attachment microcontrollers from USB via a hub, with a separate power supply to run motors and servos, putting the power conversion on the attachment to optimise for that particular attachment, the standard supply being 12V.


This is a picture of the shepherding attachment, controlled by a Pico on a Kitronik control board connected to the arm servos, with a small buck converter power supply and power display. Also shown is a handle to aid picking the chassis up, as it's now getting a significant amount of handling. The control board is a bit over-specified here, but in the next iteration it will have to run two stepper motors.

The shepherding attachment has gone through several phases, and the one shown is good enough for basic positional testing using dead reckoning. Here's a video of a basic operation.


Some erratic movement on the arms due to controller initialisation signals, but this is a good test of a sheep 'fetch'.

While the arms do a good job with the sheep, they do need to do more and so the turbo-shepherd version is in construction and testing. Here's a video of it in test mode. 


Here a stepper motor now gives lift to the arm, allowing it to be moved up out of the way, to open and close the gate, and potentially even to pick up recalcitrant sheep!

With the herding going well, the Hungry Cattle challenge is almost complete.

The montage above shows the feed dispenser attached to the chassis. The feed hoppers are mounted on a turntable between the wheels of the chassis. The robot feeder drives up alongside a trough, the turntable rotates a full hopper over the trough and dispenses feed. The robot then moves on to the next trough while the turntable rotates the next hopper into position. Once feeding is complete, the turntable rotates back to a central position for refilling. The controller here is a Pico on a dedicated bit of stripboard, running three servos and a small stepper.

Finally, a bit of dimension checking. The rules say it all has to fit within the marked rectangles, so here we are, fitting in!
Picture with arms extended

and then with arms parked.
May need to tidy a few wires up!!!! 

That's it until next time, when we'll have some remote control via a PS3 controller, a dog whistle and maybe even some voice commands.....come by Shep!



Friday, February 25, 2022

Joining bits together!

Another team meeting this week to look at where we're going and show our progress. This is more a position update blog, not much to show in a structured way but an example of where we are.


First up is a video of the sheep herder. This is the basic version in operation, not the turbo, just giving us a view of how things will pan out in the arena. The arms and flippers aren't powered and dead reckoning is used for the small amount of navigation shown.


So we have a sheep dog but it needs a bit of training! This is also the first time a chassis and attachment have been mated together, it took only a few minutes and that's how the other attachments should fit!

Also demonstrated was our change to the use of a PS3 controller for remote control operations. We'll be using this for the Farmyard Tour challenge of course, but its main job in the coming weeks is to rehearse the best sequence of operations for the chassis and attachments during testing.

We're still moving on with the apple picking attachment; the laser-cut practice tree shown in the last blog entry is now assembled and needs a bit of weight adding, and the apple picking cup Mk2 is ready for testing.




Finally, the cattle feeding turntable was demonstrated. This challenge is almost ready for full testing, with most of the components finalised and the control code written. This is possibly the simplest attachment but still comprises 36 individual 3D-printed parts, together with many nuts, bolts, servos and a stepper motor. Hidden in there is the custom controller board as well!


The picture has been shown before, but next time there will be a full demo video; we've already written the routine for it! It's controlled via MicroPython on a Pico.


That's it for this time, next time we'll have working sheep dog arms and a trough filling attachment!



Tuesday, February 15, 2022

High Speed Hoppers, bit of apple picking and some scary sheep handlers

The Hungry Cattle solution is still going well and we now have three hoppers mounted on a turntable which reside within the wheelbase when fully loaded, but can selectively swing out to dump their feed into a trough, either at the front of the robot or to the side.

This is the 'closed' position with the weight centralised. This isn't the competition chassis, just the test one.


And then it's swung round to place a hopper over a trough at the side, or at the front if that is advantageous.

New hopper funnels are being made to throw the feed further and try to get a more even spread in the trough, and hold a bit more feed to ensure coverage.

The turntable is run by a small stepper motor, but there isn't any synchronisation fitted yet so it's a bit hit-and-miss as to where it stops. Here's a video of it rotating.


Once synchronised, the turntable will rotate in 90 degree steps. When it stops over a trough, the hopper releases its feed using the drum mechanism and the robot can move on to the next trough. The hoppers rotate back to the inboard position for refilling.

Not a lot of news published on the apple picking. Here we have one of the apple pickers and a tree worthy of Ikea, laser cut from 3mm ply.



Scary carpet eh?


The turbo-shepherd has had the first iteration of the sheep handling arms tested.



This hasn't been that successful but the idea is working ok. To improve its operation, more powerful servo motors will be chosen and the layout of the components optimised to be able to 'herd' the sheep more effectively. The video shows the arms being parked, open wide to gather sheep, and also selectively moving to push sheep to the side. The movement could be a lot faster and smoother but this is primarily to test the concepts.

Here is the updated design being built.


The challenge has been to lose as much of the mechanism in height as possible rather than lose herding capacity, the parked position being less than 225mm wide and 100mm deep, and folding out to 325mm wide to gather the sheep before pushing them into the sheep pen. The arms will also operate the gate mechanism.

Next up, hopefully first views of the apple picker, completing the Hungry Cattle hardware and maybe a bit more of the sheep handler design built. On a personal note, I'm just pleased RS components delivered my reel of solder today :)  


Saturday, February 12, 2022

Start of an arena

 While a lot is happening, somehow there's never much to see for all the effort, but at our last PiWars get together we had the start of our arena to view.


It's made out of flooring board so can be dismantled into three parts and we've marked it out in 250mm squares to start getting the feel for what the space looks like.  No apple tree on view, but some test cardboard sheep and wolves, together with our newly made cattle troughs, fill up the space. 

It was also an opportunity to look at motor speeds and load capacity. This robot is our test bed for a set of four brushless motors. We've loaded it up at the front with a 1kg weight to simulate a full load of cattle feed (rice) to see what its performance is.


So speed tests put the crossing of the arena from standing start to stop at 2 seconds with the full load, which we're happy with. None of the attachments are very heavy so we're ok to go. Might need a bit more grip on the wheels to get better acceleration and ensure a skid free stop, but the work we described in the last blog has paid off, so success. A small accident in control during one of the tests demonstrated it also turns very quickly as well!

Also on demonstration on the arena are the navigation beacons.


These will be used by the vision cameras on the robot chassis to give an accurate position within the arena and provide the navigation references. This picture shows three coded beacons but the arena will be surrounded by them eventually.

As well as the tea and biscuits, a quick view of the kitchen table gives an overview of what's been going on.


In the foreground on the far left is the time synchronisation test rig to provide an accurate common time reference to the independent stereo cameras.

Beside it, in yellow and black, is the modified cattle feed hopper, extended at the top to hold more feed and fitted with a large drum to deliver feed to the trough faster. Also shown are two other hoppers in green without the capacity extension. The need for the extension arose when tests with the accurate 3D-printed troughs showed that we hadn't been delivering enough feed to cover the centre line, so we needed to increase the amount. We could have designed some sort of shaking device to even out the feed in the trough, but just increasing the amount was faster, if unsophisticated.

Between the two green hoppers is the new turntable to rotate hoppers over the side of the robot chassis for dispensing, and then returning them to an inboard position to keep the weight distribution within the robot wheels.

At the rear is a yellow test robot chassis powered by an ESP32, which is used to test attachments, and in front of that a pair of arms for gathering and gripping sheep. We found that the cardboard sheep we'd made were actually too big, so the arms couldn't quite reach round them! It also used fairly low-cost servos which didn't perform well, so it will need a bit of an upgrade before the next demonstration, as well as the lift mechanisms fitting with the new stepper motors.

Next meet will be a test of the Hungry Cattle challenge with remote control, progress! 

Finally, just another picture of the arena with bits in place. We had made three wolves and six sheep, but two sheep were lost; we put the rest in place anyway.


Also on show are the beacons, troughs, stereo cameras, and four test bed chassis!!!!!

Tuesday, February 1, 2022

A New Chassis, new motors and fancy troughs

 So we've been designing a new chassis, based partly on the original, but with a few new ideas added. 



The coffee and biscuits are a key part of the design process though may not be part of the final implementation. These are HLC208 encapsulated brushless motors with the controller electronics built-in. They also have their own direction selection feed as well as an accurate speed output. 

The supplier website gave instructions for testing these out and some sample code, but that did little other than turn the motor. Adding an extra earth and scrapping their example code in favour of hastily written test code got them working nicely: variable speed, direction changes and feedback with very little CPU time involved. Very simple to use when you know how!!!! We'll see how they progress.

Now that we have some stl files for the Hungry Cattle troughs from the organisers, we thought it worth investing some print time in creating three accurate troughs with the halfway lines printed in. And here they are. They look ok, though we did get one line not quite right!! So we'll just have to fill to above the line!


Next up will be the new hopper emptying mechanism, should be a bit faster than last time.

Saturday, January 29, 2022

S.L.A.M

Simultaneous Locating And Mapping summary 16/Jan/2022

Actually, there’s not a lot of mapping, as we build the arena, so we hopefully know where everything is, but locating the robot within the arena is a big deal in PiWars 2022 as there is a lot more stuff about than in 2021. 

General concept: stereo cameras and beacons.



Beacons

The logic chain ...

  • You need identifiable landmarks in a known location.
  • How do you pick them out from the background clutter? If you use LED beacons then you can drastically underexpose the image, leaving only the LEDs showing.
  • How do you identify them? Use different colours.
  • Why not a modulation? Because you have to do this fast on a moving platform, you can’t afford the time to observe the beacon over a time period to see changes.
  • What colours? Well, it turns out that the obvious RGB colours have a problem, which is that the Green is too close to the Blue for rapid distinguishing, so just Red and Blue then.
  • How high? First guess was on the ground with the cameras underslung (leaving the robot top completely clear for attachments). But what about the sheep and troughs obstructing the view, let alone attachments hanging down? So current guess is 110mm up. That means we can have the cameras on the back of the robot unobstructed.
  • What if that’s wrong? They are mounted on 8mm square section carbon fibre tube, so if we need them higher up, we just use longer tubes.
  • What kind of LEDs? First we chose RGB LEDs. This means that if we change our minds about colours we can just solder in some new resistors and get any colour we like. We started out with clear 5mm LEDs with 3D printed HD Glass diffusers, but why make work for yourself when you can get 10mm diffused LEDs?
  • How many LEDs? Given just two colours and four LEDs you get 16 combinations. Each arena wall has a maximum of seven LEDs (if you include the corners) so can then have a unique pattern of beacons. If we need each beacon to be unique in the whole arena we will have to go to three colours or five LEDs.

They are powered at 9V, so could use PP3s, hence the little box at the base.
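The combination arithmetic in the list above is easy to double-check:

```python
from itertools import product

COLOURS = ("R", "B")   # red and blue only, as chosen above

codes = ["".join(p) for p in product(COLOURS, repeat=4)]
print(len(codes))                                # 16 four-LED patterns
print(len(list(product(COLOURS, repeat=5))))     # 32 with five LEDs
print(len(list(product("RGB", repeat=4))))       # 81 with three colours
```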



Beacon identification software

First thought: use OpenCV for both image capture and processing. It’s a bit worrying that it takes 25 seconds to load (not to mention 5 hours to install), but runtime is lightning fast and the loading takes place before the timed run, so should not be a real problem. So we start with a pair of 640x480x3 RGB images (possibly on different computers) captured with OpenCV.

We can get 29 frames per second (FPS) capturing stereo pairs on a single computer (the Stereo Pi). However, it turns out that we can process them in a very basic way just with numpy and get a calculation ‘frame rate’ of 1880 FPS, so simple image processing has no real effect on performance. The killer is reliability. OpenCV just doesn’t control the camera hardware properly. This means that every now and then the image goes green monochrome or the GStreamer is incorrectly invoked. Even after weeks of trying I cannot resolve this, so it’s PiCamera and NumPy for now.

Phase 1

The base image is 640 columns wide, 480 rows high, and with 3 colours (RGB)

Locating beacons
This is done by just looking for at least 5 consecutive bright columns in the image to make a column set.
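A numpy sketch of that column-set search (the brightness threshold and synthetic frame are illustrative):

```python
import numpy as np

MIN_RUN = 5   # at least five consecutive bright columns make a column set

def column_sets(image, threshold=128):
    """Return (start, end) column ranges where at least MIN_RUN consecutive
    columns contain a bright pixel. `image` is H x W (grayscale, underexposed)."""
    bright = (image.max(axis=0) >= threshold).astype(int)   # one flag per column
    edges = np.diff(np.concatenate(([0], bright, [0])))     # +1 at run start, -1 at end
    starts, ends = np.where(edges == 1)[0], np.where(edges == -1)[0]
    return [(s, e) for s, e in zip(starts, ends) if e - s >= MIN_RUN]

# synthetic 480x640 frame with one 8-column-wide beacon at columns 100-107
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:240, 100:108] = 255
print(column_sets(frame))   # [(100, 108)]
```

Each (start, end) range is a candidate beacon; its middle column is what feeds the angle lookup.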

Measuring the Angle

The dreadful barrel distortion of the lens is compensated for by a cosine formula determined experimentally from calibration images. This is then used to create a lookup table to convert the column number of the middle column to an angle, i.e. the bearing from the camera.

Locating LEDs

Look for at least 3 consecutive bright rows in a column set. Note that the LEDs are separated by quite thick separators so that they don’t run into one another in the image. This produces a set of rectangles in the image.

Determining the colours

Because we only have red and blue, we just sum those colours in an LED rectangle; if there’s more red than blue, it’s a red LED, otherwise it’s blue. Note that using OpenCV and YUV encoding we may be able to reliably distinguish green as well, but can’t do that currently.
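The decision itself is a couple of lines (assuming RGB channel order from picamera; with OpenCV's BGR capture the indices would swap):

```python
import numpy as np

def led_colour(rgb_patch):
    """Classify an LED rectangle as 'R' or 'B' by summing the two channels.
    `rgb_patch` is an H x W x 3 RGB array cropped around one LED."""
    red = int(rgb_patch[:, :, 0].sum())
    blue = int(rgb_patch[:, :, 2].sum())
    return "R" if red > blue else "B"

patch = np.zeros((4, 4, 3), dtype=np.uint8)
patch[:, :, 0] = 200   # strongly red
print(led_colour(patch))   # 'R'
```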

Identifying the beacons

We have a database of beacons and their colour codes, so RBBR is a beacon with, from the top, Red, Blue, Blue and Red LEDs. The database records their location in arena coordinates (the garage is 0,0).
The end result of Phase 1 is a set of beacon identifiers and angles. These are written to a database (currently on the Pi Zero, but eventually it will be on the central Pi).

Phase 2 - Getting a Fix

Choosing the bearings

From the bearings table we choose the ones to use. We want bearings of the same beacons from both cameras taken at the same time. From those we want the pair of beacons furthest apart to get the best angles, so from those beacons which occur in both images we choose the leftmost beacon and the rightmost beacon.

Calculating the position

This is some trigonometry, using the cosine rule and the sine rule. The result is the location of the beacons relative to the robot. Translating the co-ordinate systems we calculate the location of the robot relative to the arena.
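A sketch of that fix for a single beacon seen by both cameras (the baseline length and the convention of measuring bearings from the camera baseline are assumptions for illustration; the real code then translates into arena coordinates using the beacon database):

```python
import math

BASELINE = 0.20   # metres between the two cameras -- an assumed figure

def beacon_position(angle_left, angle_right, baseline=BASELINE):
    """Locate a beacon from two bearings using the sine rule.
    Angles are measured in radians from the camera baseline: at the left
    camera towards the beacon, and at the right camera towards the beacon."""
    apex = math.pi - angle_left - angle_right            # angle at the beacon
    d_left = baseline * math.sin(angle_right) / math.sin(apex)   # sine rule
    # position relative to the left camera (x along the baseline)
    return (d_left * math.cos(angle_left), d_left * math.sin(angle_left))

# beacon 1 m ahead, offset 0.1 m along the baseline: both bearings are equal
a = math.atan2(1.0, 0.1)
x, y = beacon_position(a, a)
print(round(x, 3), round(y, 3))   # 0.1 1.0
```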

Next ...

PID control of motors using the location delivered above (PID = Proportional Integral Derivative). Planned path will be a series of locations (arena x,y co-ordinates), plus angle (orientation of the robot relative to the arena).
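For reference, the controller itself is small. A minimal sketch, with placeholder gains that would need tuning on the robot:

```python
class PID:
    """Minimal PID controller; gains are placeholders, not tuned values."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return the control output for this timestep."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# drive a motor demand from the position error delivered by the fix
pid = PID(kp=2.0, ki=0.1, kd=0.5)
print(pid.update(error=0.25, dt=0.05))   # proportional term dominates: ~0.50125
```

One of these would run per controlled quantity (x, y, heading), each fed the error between the planned path point and the latest fix.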

Performance

Cameras

The basic picamera is a very cheap device using a tiny plastic lens. It has bad barrel distortion and you might think that we have to do a complex correction grid, but actually, because of the very specific use case a fairly straightforward correction does the job. So long as the camera sensor is absolutely vertical and at exactly the same height as the middle of the LED beacon the barrel distortion above and below the middle doesn’t affect it.

Timing

Obviously the calculation of location from a stereo pair of images taken from a moving vehicle is dependent on the two images being taken at the same time. Paula has done a study of synchronisation procedures which should solve the problem of clock differences. Because picamera capture cannot be directly triggered (you are picking up frames from a continuous video stream) some more work is required to convert clock synchronicity into camera synchronicity.

Basic Capture Frame Rate

Stereo Pi (= Pi 3), single camera, RGB, picamera
straight capture:  1.7 fps 
capture using video port:  5.0 fps

Stereo Pi (= Pi 3), camera pair, BGR, OpenCV 
capture using video port:  29.3 fps

Single Pi Zero 2 W, single camera, BGR, OpenCV 
capture using video port:  61.7 fps

Geometry Frame Rate

compute location from column sets, using numpy:  1880 fps

Accuracy

This is the big one. To avoid the need for supplementary location systems we need to get pretty close to 1mm accuracy. 10mm might be OK, but 100mm would be a waste of time. At present we are not near that, but there is time for more optimisation and calibration.

Friday, January 14, 2022

Enter Turbo-shepherd

The Shepherds Pi challenge has been looked at a bit, but after a few trial runs with wooden paddles and cardboard sheep, it was obvious a bit of extra manipulation was going to be necessary to do it quickly, and not just shove the sheep and wolves around. Enter a pair of arms to help.





The arms are articulated to fold flat to the chassis, and when folded out they articulate halfway along to allow nudging, flipping and forming a funnel. Each arm can also be raised and lowered so it can be lifted out of the way quickly for easier chassis manoeuvring. Building the arms this way also provides a mechanism for gate opening and closing.

The first prototype for this has been built. It's currently a bit slow for competition, but will be ok for testing the concept and planning movements.


No peace for the busy: the basic Hungry Cattle hopper feeders work, but when a side funnel is fitted the rate of dispensing goes down, and even stops unless the funnel has a steep angle. Experiments have set a good angle at 40 degrees or more, which raises the hopper height by at least 100mm, making the whole robot approach 300mm in height. This is ok within the rules, but there will be a bit of weight in the feed, so it may make the chassis unstable at speed.

So a new concept is under review: rotating hoppers which sit in a carousel and rotate out over the trough to be filled.


In the barn/filling position, the hoppers sit inside the footprint of the chassis, openings pointing upwards.



When alongside the trough, the carousel rotates to position a hopper over the trough, where it empties, and the chassis then moves on to the next trough. While the chassis is moving, the carousel rotates again to position the next hopper, ready to empty into the next trough. Returning to the barn, the carousel rotates back to the starting position.

Because the hoppers now have larger openings, it's expected that the rate of discharge will be faster than the previous design, and more reliable, there being no closing mechanism to jam. Placing the carousel like this also gives a clearer view for the stereo vision system to navigate.

That's all for now, more experiments to do, bits to make and robots to crash!!!