Wednesday, February 7, 2024

New Year Progress - Pins and Needles!

Now well into the new year, we got together again to review progress and where we were. 

We now had two Zyderbots to practice with and a visitor from another team, Phil from PiDrogen, came to play too.

Of course most of the meet was about Zyderbot and exchanging what we'd done so far. 

Getting the right version and configuration of OpenCV working on the Raspberry Pi platforms had been a significant task, if only for the compilation time involved. That's difficult to show in a blog, so here are some pictures of laptops with running code. 


This is our idea of a zombie for training the recognition engine. We'll have to revisit this on the day when we get to calibrate against the actual competition zombies.

We didn't get an updated demonstration of the Zombie gun though; the experimental use of solenoids for the trigger mechanism has now reverted to more powerful servos for reliable operation. 

The line follower for the Lava Palaver has now been reduced to a single camera sensor which detects the white line position, the course 'bump' and the end of the course, sending positional data to the controller on request. 


The picture shows a streamed version of the detector output, here with three detection zones. Streaming actually slows the detector down quite a bit and isn't needed for the actual challenge, but it gives confidence that it's doing what it's meant to; turning the image update and streaming off improves the response time dramatically.
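
For anyone curious about the sort of processing involved, here's a minimal sketch of the three-zone idea. This isn't our actual sensor code; the camera source, threshold and zone count are just illustrative, using OpenCV in Python:

import cv2

def line_positions(frame, zones=3):
    """Estimate the white line's horizontal position in each of several
    horizontal bands of the image. Returns a list of offsets in the range
    -1.0 (far left) .. +1.0 (far right), or None where no line is seen."""
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pick out bright pixels; the threshold would be set during calibration.
    _, mask = cv2.threshold(grey, 200, 255, cv2.THRESH_BINARY)

    h, w = mask.shape
    results = []
    for i in range(zones):
        band = mask[i * h // zones:(i + 1) * h // zones, :]
        m = cv2.moments(band, binaryImage=True)
        if m["m00"] == 0:
            results.append(None)          # no white pixels in this band
        else:
            cx = m["m10"] / m["m00"]      # centroid column of the white line
            results.append((cx - w / 2) / (w / 2))
    return results

# Example use with the camera exposed as a V4L2 device (an assumption):
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(line_positions(frame))

Splitting the image into bands like this gives both the line's position under the robot and a hint of how it curves further ahead.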

The sensor for the Minesweeper challenge is the same design as the line detector but with a different mounting and software settings. It wasn't set up for demonstration at the meeting, but it works on the same principle, reporting the observed relative position of the red square to the controller. The following picture is a test view of what it 'sees'.

This is an uncalibrated image of a red A4 sheet on a noisy floor; we don't have an illuminated 'disco' floor to test it on! It's also surprisingly undistorted for a 160 degree lens, although I'm sure the final version will 'fix' that! Code for this sensor will be on Github here.
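
Until that code lands on Github, here's a rough sketch of the kind of approach, purely illustrative rather than the sensor's actual code: threshold the image for red in HSV, find the largest red blob and report its offset from the image centre. The HSV bounds and minimum area are placeholders that would be tuned in the one-minute calibration period.

import cv2

def red_square_offset(frame):
    """Return the (x, y) offset of the largest red blob from the image centre,
    normalised to -1.0 .. +1.0 on each axis, or None if nothing red is seen."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Red wraps around 0 degrees in HSV, so combine two ranges.
    lower = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 120, 80), (180, 255, 255))
    mask = cv2.bitwise_or(lower, upper)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    biggest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(biggest) < 500:      # ignore specks of red noise
        return None
    x, y, w, h = cv2.boundingRect(biggest)
    ih, iw = mask.shape
    cx, cy = x + w / 2, y + h / 2
    return ((cx - iw / 2) / (iw / 2), (cy - ih / 2) / (ih / 2))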


A lot of the afternoon was occupied with High Noon and we were intending to have some practice with the PiDrogen robot Firefly, but we didn't have a compatible controller so it became a Zyderbot vs Zyderbot challenge. 

Wire coat hangers are becoming more scarce, but we found a couple of ex dry-cleaning hangers and  cut them to size. 

Using bolt cutters might be overkill but works!


We now had the 'official' wire mount and both robots were duly equipped. 






The video is of Phil from PiDrogen vs Colin from EastDevonPirates trying out their skill. We really didn't have enough balloons but we had fun, quite a bit of skill to acquire yet though. It was also a try out of mecanum vs tank steering as the two robots dance inexpertly around each other.


What we now need is more practice and a lot of balloons!


Finally, we took a team picture, might have to do another one, bit of an odd distortion!


Next up will be a smaller team meeting to ensure we have solutions for each challenge, and possibly some technical info!

 



 


Thursday, January 4, 2024

Zyderbot 9 - A Thoughtful Rebuild - PiWars 2024

Firstly, we have now moved from being a PiWars 2024 reserve team to being in the competition.  It may have seemed like we already were, but now it's official.  

These are a few pictures of the build/rebuild of our robot entry Zyderbot. We wanted two so that we could practice for the High Noon challenge and a rebuild also tidies up the ragged edges of quick modifications that accumulate along the way.

To start, a new design and layout for the front component base.

And with the components attached, from the top: 12V distribution, regulator and LiPo battery box.

Beneath this is the front chassis. The two brushless motors are shown mounted, and the central tunnel is the fitting for the rotation axis of the suspension. 

Next up is the fitting of the cab cover with the battery bay exposed.

And again with the bonnet/battery cover screwed on and a couple of wheels fitted.

The rebuild took a break over Christmas, but was soon underway again afterwards. Always good to see a tidy construction area!!!

While Zyderbot will complete most challenges autonomously, remote control will be provided by a helicopter RC controller and associated ICs.

Once breadboarded, the final circuit is constructed on a piece of stripboard. For the curious, this is a FrSky SBUS receiver/transmitter feeding an SN74AHC14N Schmitt trigger inverter at 5 Volts, then a KeeYees level converter to get 3.3 Volts for the Pico.
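
For the even more curious, here's a minimal MicroPython sketch of reading SBUS on a Pico once the inverter has done its work. The UART number, pins and the very naive frame synchronisation are assumptions for illustration, not our actual firmware:

from machine import UART, Pin

# SBUS runs at 100000 baud, 8 data bits, even parity, 2 stop bits.
# The external Schmitt-trigger inverter means the Pico sees normal UART levels.
uart = UART(0, baudrate=100000, bits=8, parity=0, stop=2, tx=Pin(0), rx=Pin(1))

def decode_sbus(frame):
    """Unpack a 25-byte SBUS frame into 16 channel values (0-2047)."""
    if len(frame) != 25 or frame[0] != 0x0F:
        return None
    channels = []
    bits = 0
    bitcount = 0
    for byte in frame[1:23]:          # 22 data bytes hold 16 x 11-bit channels
        bits |= byte << bitcount
        bitcount += 8
        while bitcount >= 11:
            channels.append(bits & 0x7FF)
            bits >>= 11
            bitcount -= 11
    return channels

while True:
    if uart.any() >= 25:
        frame = uart.read(25)
        ch = decode_sbus(frame)
        if ch:
            print(ch[:4])             # first four channels: the handset sticks

A proper version would hunt for the 0x0F start byte to stay in sync rather than trusting byte counts; this is just to show the frame format.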

The rear of Zyderbot is a separate section with its own layout.

And with the components fitted. It can look neat until the wiring starts!


Initial wiring, with USB being used to connect Pi and Pico.
Oh dear, it all does something but covering it up for sanity's sake is advisable. 

Completed, here it is, a new build. On/off switch relocated to the top of the cab for easy access, front headlights to evenly illuminate targets. Bonnet colour changed to avoid conflicts with a 360 degree camera looking for red squares.

The rear view, we like to know what voltage our batteries are at!!!!

Christmas rebuild task complete, on to challenge solutions!


Saturday, December 23, 2023

Show and tell December meeting - PiWars 2024

 And the next update meeting. It might seem to be quick on the heels of the previous meeting but that was just the delay of the blog writer getting on with it.

Not a full meeting this time due to illness (it's been going round the team), so we didn't see any progress on the Nerf gun mount, but we did look at what the zombie ideas might look like and at a trial camera mount for the Minesweeper challenge.

Firstly a small update on the lava palaver challenge.


Apart from just a reprint of the front attachment, the ball bearing skids have been replaced by two jockey wheels to cope better with small steps in the course and for the 'bump'. The microswitch is still in place for the physical bump detection but will be replaced by an optical switch when the hinge is remade.


For this iteration, the line guidance is still using IR detectors in two rows to improve resolution using off-the-shelf parts, but it's likely that a camera interface will be used in the end.
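
As a sketch of why two rows help: staggering the second row by half a sensor pitch doubles the number of distinct positions across the line, and a weighted average of whichever sensors currently see the line then gives a finer position estimate. The layout and numbers below are made up for illustration, not the actual sensor bar:

# Hypothetical layout: two rows of five digital IR sensors, the rear row
# offset by half the 20mm pitch, giving ten effective positions across the line.
FRONT_OFFSETS_MM = [-40, -20, 0, 20, 40]
REAR_OFFSETS_MM  = [-30, -10, 10, 30, 50]

def line_error_mm(front_readings, rear_readings):
    """Weighted average of the offsets of all sensors currently seeing the line.
    Readings are booleans (True = white line detected). Returns None if lost."""
    hits = [off for off, seen in zip(FRONT_OFFSETS_MM, front_readings) if seen]
    hits += [off for off, seen in zip(REAR_OFFSETS_MM, rear_readings) if seen]
    if not hits:
        return None
    return sum(hits) / len(hits)

print(line_error_mm([False, True, True, False, False],
                    [False, True, False, False, False]))  # -> -10.0mm, line is left of centre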

The image recognition for the Zombie Apocalypse and Eco-Disaster challenges moves on with good recognition and aiming now achievable. 


These are recognition tests as well as calibration screens; we're guessing at what zombies look like. 

Finally we have built a basic overhead camera detector for the Minesweeper challenge to check the calculations on being able to 'see' the whole arena.

Here the 160 degree camera is mounted 350mm above the surface, which gives a 600mm radius surrounding view. Raising the gantry to 450mm extends this to an 800mm radius.
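
As a rough sanity check on those figures (our back-of-envelope sum, not anything from the camera datasheet): if the usable field of view is taken as about 120 degrees rather than the nominal 160, the visible floor radius is simply height times the tangent of the half-angle, which lands close to the measured values:

import math

def ground_radius_mm(height_mm, usable_fov_deg=120):
    """Radius of floor visible below a downward-pointing camera (assumed usable FOV)."""
    return height_mm * math.tan(math.radians(usable_fov_deg / 2))

print(round(ground_radius_mm(350)))  # ~606mm, close to the observed 600mm
print(round(ground_radius_mm(450)))  # ~779mm, close to the observed 800mm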


This is a bit ungainly but does demonstrate the approach to view the whole 1600mm square arena in one image for processing. For the next meet we'll be aiming to get that extended as well as some line following and mine hunting code.





Saturday, December 9, 2023

Minesweeper - PiWars 2024

The Challenge

The Minesweeper challenge requires the competing robot to reposition itself to one of sixteen squares in a 4x4 matrix when that square is illuminated red instead of the normal white. Covering any part of the square with the robot counts as covering it, and thus the mine is found. Once the robot has remained over a square for one second, another square is randomly selected and illuminated, to which the robot must move next.


A total of 20 squares can be illuminated for each robot's run, and a run ends after either 5 minutes or when all 20 mines have been found. If all 20 are found then the time taken to complete the run is recorded and points are awarded to the robot according to the formula scoring system.

There are points awarded for each square found: 50 for autonomous robots and 35 for remote controlled robots. All competitors get an extra 150 points for finding more than 10 mines. Just for competing, autonomous robots get 250 points, so entering as autonomous is worthwhile even if the robot can only move briefly. There is therefore a total of 1550 points available for autonomous robots and 1300 for remote controlled robots.

Analysis

Using conventional wheels, driving to a square will require first steering and then driving. With tank steering this would be a simple manoeuvre, but with Ackermann steering it could become more complex and, in the confines of the 4x4 matrix, more long-winded. The use of mecanum or omnidirectional wheels would make moving to a square more direct, as the robot only has to set the drive parameters to move in the correct direction. Sticking to the intersections would mean the robot only has to move the minimum distance to find a square, with covering the central intersection being equally effective. 
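
For illustration, the attraction of mecanum wheels is that moving towards a square is just a matter of mixing the drive components. A standard mixing sketch looks like the following; it isn't Zyderbot's drive code, and the sign conventions depend on roller orientation:

def mecanum_wheel_speeds(vx, vy, omega):
    """Standard mecanum mixing: vx is forward, vy is strafe right,
    omega is clockwise rotation, all in the range -1.0 .. +1.0.
    Returns (front_left, front_right, rear_left, rear_right) speeds."""
    fl = vx + vy + omega
    fr = vx - vy - omega
    rl = vx - vy + omega
    rr = vx + vy - omega
    # Normalise so no wheel is asked for more than full speed.
    biggest = max(1.0, abs(fl), abs(fr), abs(rl), abs(rr))
    return fl / biggest, fr / biggest, rl / biggest, rr / biggest

# Drive diagonally towards a square ahead and to the right, no rotation:
print(mecanum_wheel_speeds(0.5, 0.5, 0.0))  # -> (1.0, 0.0, 0.0, 1.0)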

For both remote control and autonomous robots, once a robot has gained an aligned position over an intersection, the commands could simply be to move 400mm on the required vector instead of analogue steering, simplifying actions and potentially making the robot more reliable.

Each square is 400mm on each side, so a robot positioned on an intersection could cover four squares. With a random selection, if a robot can get itself parked on one of these intersections then there is a 1 in 4 chance that the next illuminated square is already covered. A not unreasonable tactic for an autonomous robot is to drive between these intersection points in a circular motion, stopping when it detects that a covered square is illuminated. This could be a slightly more complex line-following algorithm, ignoring other lines and stopping when the colour change is detected, and the strategy would suit any type of steering. If it takes a robot 10 seconds to do a circuit of the squares (3.2m), it might be expected to complete all 20 squares in less than 100 seconds.
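
Sketched as pseudo-code, the tactic might look something like this; every robot.* call here is a hypothetical placeholder rather than a real function from our code:

import time

def circulate_for_mines(robot, run_seconds=300):
    """Sketch of the 'orbit the inner intersections' tactic: follow the lines
    between intersections in a loop, pausing whenever a red square is seen."""
    start = time.monotonic()
    while time.monotonic() - start < run_seconds:
        if robot.red_square_in_view():
            robot.stop()
            time.sleep(1.2)          # a little over the required 1 second dwell
        else:
            robot.drive_along_line() # small step along the current line, turning
                                     # at intersections to stay on the 3.2m circuit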

Detecting a red square versus a white square for autonomous robots is a matter of both detecting the red and its position. If the action of the robot is to actively search for the red, as in the intersection follower above, then the mechanism is built into the movement, and detectors built into the corners of the robot might be expected to be enough. 

The detector above will give a precise colour measurement. Simpler detectors could be devised, say an LDR covered with a green or blue coloured filter, which would give a good signal over white and no signal over red, dodging the black lines as it goes! A one minute calibration period is allowed for each robot before each competitive run, so any detector can be optimised. 
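
A filtered-LDR detector of that sort could be read with just a few lines of MicroPython; the pin and threshold below are purely illustrative and would be set in the calibration minute:

from machine import ADC

ldr = ADC(26)            # LDR in a potential divider into ADC0 (pin choice illustrative)
RED_THRESHOLD = 30000    # raw 16-bit reading; set during the one-minute calibration

def over_red():
    # Under a green filter, a white square reflects plenty of green light and a
    # red square very little, so the reading swings one way (which way depends
    # on how the divider is wired) when the robot is over red.
    return ldr.read_u16() < RED_THRESHOLD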

Using this method also needs the dimensions of the robot to be taken into account, and the detectors may need to be a permanent feature of the robot.

A popular choice may be a colour camera, which can be used in simple or complex ways. Again, deploying a separate camera at each corner may be an effective way of determining the position of the red square when the robot passes over it, though this may be more complex than imagined. Mounting a camera with a single line of vision would require the robot to reposition itself to find a red square and then drive to it. This may be effective but also time consuming as the robot searches, and illuminated squares close to the robot may be difficult to see. Placing the camera on a rotating platform would speed the process, as only the camera would have to move to locate a red square, and a tilt mechanism could be used to view squares close by, though some additional engineering would be required. 



Placing a camera high above the robot would give a better overall view of the arena and thus make the task of finding an illuminated square easier. If this camera were also a wide angle camera positioned pointing straight down, then it could feasibly observe the whole of the arena and detect a lit red square without movement. For a robot navigating to a square, this camera method makes the task easier because all that is required is for the navigation to move the red square as close to the centre of the camera's view as possible.

 If mounting the camera above the robot doesn't give an adequate view, or the camera angle isn't wide enough, the camera could be mounted on the top of the robot pointing upwards towards either a  plain or convex mirror. 

These are just a few ideas for solving this challenge. We're not sure which we'll use yet, but the camera and mirror might be easiest; we'll see what we do, and we might yet come up with another idea.

Update Meeting - PiWars 2024

 We finally all got together in the same room to see what we'd been up to. The big attraction of the afternoon was the Zombie Apocalypse challenge and where we'd gone with it so far.

This is very much in two parts, the gun platform and the gun.


Here we see a mock-up of the gun, using parts removed from a working Nerf gun and 3D printed parts to adapt them for test usage.




For the demonstration, the flywheel motor was driven by a switched AA battery pack (actually a 6 x AA pack adapted for 4 x AA) and the 'firing mechanism' is an SG90 servo connected to a servo tester.


As this mock-up shows, it works well. To get a feel for what the targets might look like, we have to practice, a few small pictures mocked up onto a cupboard helps. This is a view from the turret camera.


They look a bit small don't they! This was with a Pi Zero fitted camera, but a little upgrade helped.

This helped a lot!!!



The chassis has undergone a few changes as well, and now has a Pi3 inside, as well as the original wiring. The following picture shows what we've fitted inside.


This has meant that for operational testing the chassis has been modified further to get access to the boards inside, both to change the SD cards and to plug in USB cables. 


We also have headlights, which will be very useful for the Lava Palaver challenge where consistent lighting will be important. Here's a video of the chassis in action.

This also demonstrates the control from a standard RC handset via SBus which is working well.

We have been active on building the Line Follower but that demonstration we'll leave until the next meeting.





Monday, November 13, 2023

Nerf Gun Platform - PiWars 2024

      In a team, items come along all the time and if you're writing the blog then either you leave a mountain of things to add, or just publish them there and then. 

     Some things just get decided as the obvious way to go, so building a platform for aiming and firing a Nerf gun seemed the sensible thing to do for the Zombie Apocalypse challenge, though there'll be a blog analysis of it along soon.

Mounted on the rear lid of the chassis, the arm is rather overkill for supporting a Nerf gun, but it has its origins in one cooked up earlier for a drawing robot arm, so it was easier than developing a whole new arm. The servos are AX12 serial servos controlled via the official Dynamixel USB controller by a Pi Zero, which can just be seen mounted under the arm, with a camera for aiming. The gun connector is a Picatinny rail; Nerf fans will appreciate that.
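
For anyone wanting to drive AX12s the same way, the Dynamixel SDK's Python bindings make it fairly painless. This is a minimal sketch with placeholder servo ID, port name and position, not our arm code:

from dynamixel_sdk import PortHandler, PacketHandler

# AX12 servos speak Dynamixel protocol 1.0; these control-table addresses are
# for the AX series. The port name and servo ID are placeholders.
ADDR_TORQUE_ENABLE = 24
ADDR_GOAL_POSITION = 30
PAN_SERVO_ID = 1

port = PortHandler('/dev/ttyUSB0')   # the USB Dynamixel adapter
packet = PacketHandler(1.0)
port.openPort()
port.setBaudRate(1000000)            # AX12 factory default baud rate

packet.write1ByteTxRx(port, PAN_SERVO_ID, ADDR_TORQUE_ENABLE, 1)
# AX12 goal position is 0-1023 over roughly 300 degrees; 512 is centred.
packet.write2ByteTxRx(port, PAN_SERVO_ID, ADDR_GOAL_POSITION, 512)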

The heart of the arm is this bearing. It weighs 220g just on its own, without the rest of the arm, but gives superb glitch-free smooth rotation with any weight, even unbalanced.

The whole truck now weighs 2kg, but it still drives OK.

The Nerf gun we are intending to use is a Nerf Stryfe, which may not be available in shops any longer, but we have one because we like playing with things. NOTE: this is different from many of the newer versions of the electric Nerf guns in that it is screwed together and so can be disassembled easily; some of the newer ones are glued together and so are much more easily damaged when being taken apart, possibly a job for a Dremel if using one.

The following pictures are of the gun mechanism.

Electric motors spin up flywheels just before the barrel of the gun and after the Nerf dart magazine. 

Once the flywheels are up to speed, the trigger is pulled and a mechanism pushes a dart forwards from the magazine (see the arrow in the picture above) until it is caught by the spinning flywheels, which accelerate it out of the gun barrel. The flywheel motors are designed to be run by four 1.5V alkaline batteries (about 15Wh), which are not normally expected to last long, but powering them from the vehicle batteries (31Wh) is expected to be OK for the challenge. 
A test cradle has been designed to hold the active portion of the Nerf gun and dart magazine to facilitate fitting to the aiming mechanism.

Out of the box, these foam dart guns have a surprising range and are accurate over short distances, so they are worth the effort to modify. In the unmodified versions, the flywheels are only powered when a trigger switch is pulled, connecting the batteries, so they are unlikely to be operated for long periods of time; this is an area which needs to be understood in order to ensure the reliability of the mechanism and prevent overheating. Of course, if the mechanism is going to be modded, increasing the voltage, and thus the speed of the flywheels, might increase the accuracy, something to experiment with. The mechanism also has a variety of interlocks to prevent operation when the gun is not complete, all of which will need to be removed or bypassed.
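
Pulling that together, a possible firing sequence, assuming the flywheel motors end up switched by a MOSFET on a GPIO pin and the dart pusher is a servo, might look like the sketch below; the pins and timings are placeholders and this isn't the final design:

from time import sleep
from gpiozero import OutputDevice, Servo

# Assumed wiring: flywheel motor MOSFET on GPIO 27, dart-pusher servo signal
# on GPIO 17. Both pins are placeholders for illustration only.
flywheels = OutputDevice(27)
pusher = Servo(17)

def fire_one_dart():
    flywheels.on()
    sleep(0.5)          # let the flywheels spin up to speed
    pusher.max()        # push the next dart into the flywheels
    sleep(0.3)
    pusher.min()        # retract, ready for the next dart
    flywheels.off()     # don't leave the motors running and overheating

fire_one_dart()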

So, on with the fun, though it might be a bit longer before the next blog post!!!!