Showing posts with label Pico. Show all posts

Wednesday, April 10, 2024

Last before Cambridge

So this is possibly our last blog entry before we're off to Cambridge with Zyderbot, ready or not. I'm sure we'll take part in everything anyway but this is where we're at right now. 

So we had to make a video. We could have done it weeks ago, and although it's one of the challenges, somehow it didn't feel like one. This blog has been going since last September, but even so, the video still feels more real. 

Two of us got together in the kitchen with an old camera (well, old for modern times, but still digital!!!) and, with a basic script, took turns to operate the camera and say a few impromptu words about Zyderbot. We aren't natural publicists so we're a bit awkward, but it's done.


The video became an explanation of some of the key points, with picture-in-picture inserts providing voice-over explanations; this picture shows the emergency fuse that prevents magic smoke emissions.

While it's a competition and we want to compete in all the challenges, we have to enjoy doing it, and the real competition is solving the puzzles we're given. The rest of the pictures here are all from the video of some things we've learnt.

Nerf guns and Stepper motors

Being geeks we had nerf guns, but adapting the mechanism for use in PiWars turned out to have a range of issues. Here it is in all its chaos. In the end, all that remains of the gun is the flywheel acceleration module and the dart magazine.
We had some fun with the aim of the gun, and have fitted a recycled barrel to increase accuracy. The barrel was originally a croquet mallet handle, which then became a battery holder in a previous Zyderbot iteration, and now a nerf gun barrel. You can just see a cat toy laser clipped on underneath.

The flywheel module of the original gun was easily unscrewed and was easy to remount.

The trigger mechanism wasn't so easily recreated, so we ended up using a stepper motor to push the darts into the flywheels. 
Lesson learnt: do not power stepper motors continuously when not in use. Ours was left energised, heated up, and soon melted its plastic mount. We replaced the mount with a metal one and changed the code to leave the coils floating when the motor isn't actually in use.
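The fix in code can be sketched as a small guard that de-energises the coils after a period of inactivity. This is a hypothetical illustration, not our actual Zyderbot code: the enable/disable callables stand in for whatever drives the stepper driver's enable pin.

```python
import time

class StepperGuard:
    """De-energise a stepper's coils after a period of inactivity.

    enable/disable are callables that drive the stepper driver's
    enable pin (names and wiring here are illustrative only).
    """
    def __init__(self, enable, disable, idle_timeout=2.0):
        self.enable = enable
        self.disable = disable
        self.idle_timeout = idle_timeout
        self.last_move = None
        self.energised = False

    def before_move(self, now=None):
        """Call just before stepping; energises the coils if needed."""
        now = time.monotonic() if now is None else now
        if not self.energised:
            self.enable()
            self.energised = True
        self.last_move = now

    def tick(self, now=None):
        """Call periodically; floats the coils once the timeout expires."""
        now = time.monotonic() if now is None else now
        if self.energised and self.last_move is not None \
                and now - self.last_move >= self.idle_timeout:
            self.disable()
            self.energised = False
```

On the robot the callbacks would toggle the driver's enable pin; keeping the logic separate like this also makes it easy to test off-hardware.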

Magnets

We use a lot of magnets for quick assembly and disassembly of Zyderbot between challenges. Here is the main arm, which is reused between the Eco-Disaster and Zombie Apocalypse challenges. The base is attached to the robot's rear using a pair of magnets at each corner, which makes a very secure mount.

 
Four magnets on the arm allow for holding the zombie gun and barrel handling attachments

Four on the attachment line up 

And fully assembled with four matching magnets it's a secure joint.

The nerf gun similarly has four matching magnets

And so has a very secure platform.

Lesson learnt: you don't need four matching magnets for everything. The magnets are strong enough to hold with just a metal washer on one side and a magnet on the other. We fitted four pairs to the battery box and now need a crowbar to get it open. Magnets also aren't always cheap, so using pairs costs twice as much!

Barrel Handler

One of the things we want to avoid in Eco-Disaster is to have barrels rolling around the arena. Partly because they'll be difficult to get into the end zones, and also because they'll potentially get caught under Zyderbot and need a rescue. 


Consequently, we've made some last-minute mods to the barrel handler, adding a plough to push away barrels that accidentally get in the way. This means not lifting the barrels very high, but also not having to worry about dropping them on another barrel, though stacking them would be an interesting challenge (a la toys). We've also fitted a rear bumper to do the same thing.

Configuration
Zyderbot has a set of eight DIP switches on the rear which allows easy configuration between challenges.
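Read as a byte, the switches can select the challenge mode at boot. A minimal sketch with made-up pin ordering and mode numbering (the mode names echo our challenge attachment names, but this isn't Zyderbot's actual mapping):

```python
# Hypothetical: map the low three bits of the DIP switch byte to a
# challenge mode. Numbering and names are illustrative only.
MODES = {
    0: "LINES",   # Lava Palaver
    1: "ZOMBI",   # Zombie Apocalypse
    2: "ECODI",   # Eco-Disaster
    3: "MINES",   # Minesweeper
    4: "ESCAPE",  # Escape Route
    5: "RC",      # Pi Noon / Temple of Doom
}

def decode_switches(bits):
    """bits: list of 8 ints (0/1), switch 1 first.

    Returns (mode_name, raw_byte); unknown patterns fall back to
    manual remote control."""
    raw = 0
    for i, b in enumerate(bits):
        raw |= (b & 1) << i
    mode = MODES.get(raw & 0b111, "RC")
    return mode, raw
```

On the robot, `bits` would be populated by reading one GPIO per switch at startup.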

Also shown in the picture is the battery monitor display, the rear light and the stop/start button. The rear light also doubles as an information display. For the Pi-Noon challenge we'll be covering this with a magnetically attached plate to protect the settings!!!!

Ultrasonic sensor
Lots of us will recognise the HC-SR04 ultrasonic sensor, so we couldn't leave one off. It's very easy to use and adds a bit more of a feature than a 5mm-square laser sensor on a PCB, though we're using one of those as well!!! It might come in handy for barrel detection too.


Lighting
We have very little idea of what the lighting will be like on the actual day and most of our testing has been done in very variable conditions, so having a pair of headlights on the robot helps a lot in colour and line detection.


It's also part of the robot's image; we can't have a lightbar because of the attachments, but these will work well.

Might actually do a blog of the weekend but we'll see how excited we are :) 

Well that's this bit done, just the competition itself, then back to building and demonstrating other robots, but I'm sure we'll find a place for Zyderbot!



Monday, April 1, 2024

Tuning!


So we build solutions, they should work, and then the real world gets in the way. 

Lava Palaver progress

We'd gone for a camera sensor for the Lava Palaver challenge, not because it was a better option for following a line, but because it was automatically immune to the hump in the course. However, this threw up new issues. The competition course is 7m long and painted black, the only surface available for testing is a light wooden floor which makes for very poor contrast.

Early testing was plagued with phantom readings from reflections which confused the robot, so testing moved to using a black line instead, with the intention of switching over to white for competition day. This was much more successful, but space is a limitation again. To make a test worthwhile, we're going to have to go for a bigger course outside, using a long roll of black art paper with white electrical tape down the middle, and hope for dull days!

Illumination was, oddly, a very important feature. Off-the-shelf line-follower attachments tend to be IR-sensor based, with built-in illumination that tends not to be noticed; for the camera, though, even illumination is essential to get good contrast on the line, so the robot's headlights have to light up the track ahead, together with an overhead light illuminating from above. 



The result is a fairly simple bit of code at the end, but beneath it is a lot of background code doing the heavy lifting and interfacing. 
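As an illustration of the sort of heavy lifting involved, here's a minimal sketch (not our actual code) of turning one row of greyscale camera pixels into a steering error:

```python
def line_error(row, threshold=200):
    """Given one row of greyscale pixels (0-255) from the camera,
    return the white line's centre as an error in [-1, 1] relative
    to the image centre, or None if no line is seen.

    A minimal sketch; a real pipeline also has to handle lighting,
    smoothing and the phantom reflections mentioned above.
    """
    bright = [x for x, v in enumerate(row) if v >= threshold]
    if not bright:
        return None
    centre = sum(bright) / len(bright)    # centroid of bright pixels
    half = (len(row) - 1) / 2
    return (centre - half) / half         # -1 = far left, +1 = far right
```

The signed error then feeds the steering loop: positive steers right, negative steers left, None triggers a line search.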

One thing outdoor testing might help with is coping with bright sunlight and shadow. Watching the competition from previous years, it's obvious this can have a detrimental impact on robots using optical sensors, so we'll be looking at that. It hasn't been done yet, but we'll be experimenting with a sunshade for the robot to give more even lighting for the camera and hopefully reduce errors.


Zombie Shooting
For the Zombie Apocalypse, we had an idea to use a butchered standard nerf gun, which looked very promising out in bits on a table (see an earlier blog post), but we discovered a weakness. Using a servo to push the nerf pellet into the gun worked a lot of the time, but it would then jam the gun, requiring it to be dismantled to clear the jam; it wasn't suitable for the competition. This held us up for quite a while, but an unorthodox solution using a stepper motor and a rotating cam came to the rescue, so now we're back in business.


The video shows the cam operating and firing off a magazine of pellets. We then got a cat laser toy and pressed it into service to help with aiming, now how many competitors will be aiming at butterflies?




As the video shows, the gun fires to the left; a bit more work to do, but we're on our way :) 

Minesweeper
Again, this is one we had a solution to some time ago, but when we came to use it, there was a surprise for us. We're using an overhead camera, but processing the image in memory misses off a lot of the arena, a 'feature' of the camera.


The image was always going to be distorted, but missing bits is a problem we haven't time to overcome, so we'll just have to code round it. This means that the robot will sometimes have to search for the red square instead of just being able to see it and move to it. 

The algorithm for this will be basic, something like the following.
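In rough Python terms, the sort of decision loop we have in mind looks like this; the helper below is purely illustrative, not our actual code:

```python
def next_move(robot_xy, red_xy, visited):
    """Decide the robot's next target square on the 4x4 arena.

    robot_xy: current grid square (col, row)
    red_xy:   red square reported by the overhead camera, or None if
              it's in the part of the arena the camera can't see
    visited:  set of squares already known to have been red

    A rough sketch of the search described above.
    """
    if red_xy is not None:
        visited.add(red_xy)        # build up the map of lit squares
        return red_xy              # we can see it: go there
    # Otherwise sweep the unexplored squares, nearest first.
    candidates = [(c, r) for c in range(4) for r in range(4)
                  if (c, r) not in visited]
    if not candidates:
        candidates = [(c, r) for c in range(4) for r in range(4)]
    return min(candidates,
               key=lambda s: abs(s[0] - robot_xy[0]) + abs(s[1] - robot_xy[1]))
```

Each move the robot either drives straight to a visible red square or steps toward the nearest square it hasn't yet seen lit.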



Just how accurate the movement will be isn't known; we were expecting simply to move to the red square wherever it was, but this now requires some idea of where the centre of the arena is. Only testing will tell. We're doing this challenge on mecanum wheels. As the arena is a 4 x 4 matrix, all red squares will be within it, so a map of the arena can be built from squares that have already been red. Whether this will be good enough for the minimum 20 moves required, we'll have to see. Anyway, signing off with an errant robot which finds its square, and then ignores it, ho hum.




 

Friday, March 22, 2024

Days tick by......

 With less than a month to go it seems we have lots of solutions which are all but ready to go. Line followers that can see lines but can they drive seven metres in a straight line? Square finders that can see the squares, but can they drive to one and then the next? All will be revealed on the 21st April in Cambridge but for now here's some progress.

The Lava Palaver line detector works well, but instead of a fixed mount it needed a bit of adjustment, so it has now been given an articulated mount.


This has been 3D printed integral with the magnet-attached battery box cover. More magnets have been deployed on the rear of the robot for quick-change attachments for shooting zombies and collecting barrels.


It's great to see the layout before dozens of wires obscure the view!!!! More uses for magnets have been found in adjusting the barrel handler. 


This also makes it quicker and easier to change the types of attachment to see which works best. 


This is our barrel handler in serious searchlight mode. It obviously works but needs a bit more software behind it to get barrels to their destination. 

More on this to come. 

The Escape Route challenge still needs a bit of work, but we're working from an old formula so it's more tidying up than new development. To get cleaner lines on the robot we've hidden a VL53 laser range finder inside one of the cab windows, along with IR sensors on the sides and an ultrasonic sensor on the front of the robot.


When not in use we'll be closing the window with Blu Tack!!!!

First experiments with the sensors aren't perfect, as the following video shows.


But after a bit of tuning and building a test course, it's starting to look the part.

 

This is our goth themed video selection but we just need to get this running a bit more smoothly.

We've had a solution for Minesweeper for some time, but we've only just now got round to fitting it to the robot. 

It's an overhead gantry mounted camera which reports an XY co-ordinate of the red illuminated square on demand from the robot. The robot will be running on mecanum wheels for this challenge and the gantry is adjustable to keep the robot within the height limit of 450mm. Navigation is all relative to the red square, so, see a red square and drive to it!! To reduce the impact of the gantry on the image, 4mm black carbon fibre rods have been used for support. Also in the picture is the carry handle for the robot, to make handling easier and safer. A better view here.



With a variety of configurations needed for the competition day, a new control panel is being created for the rear of the robot which will allow the robot's Pi to be quickly switched between challenge modes, as well as housing a battery monitor and a start/stop button. To make sure we've got everything covered for each challenge configuration, we've started a checklist; I'm sure it'll get longer soon!!! Here's where it is now.

Challenge: Lava Palaver
  • Wheels: 105mm
  • Bonnet: Line Detector
  • Back cover: Plain
  • Attachment name: LINES
Challenge: Zombie Apocalypse
  • Wheels: 70mm
  • Bonnet: Plain
  • Back cover: Nerf Gun
  • Attachment name: ZOMBI
Challenge: Eco-Disaster
  • Wheels: 70mm
  • Bonnet: ???Not yet determined???
  • Back cover: Barrel Picker
  • Attachment name: ECODI
Challenge: Minesweeper
  • Wheels: Mecanum
  • Bonnet: Plain
  • Back cover: Camera Gantry
  • Attachment name: MINES
Challenge: Escape Route
  • Wheels: 70mm
  • Bonnet: Ultrasonic detector
  • Back cover: Plain
  • Attachment name: None
Challenge: Pi Noon
  • Wheels: Mecanum
  • Bonnet: Balloon attachment
  • Back cover: Plain
  • Attachment name: RC
Challenge: Temple of Doom
  • Wheels: 105mm
  • Bonnet: Plain
  • Back cover: Plain
  • Attachment name: RC
Next time we hope to have more demonstration videos of being fully ready for all challenges.....but then again maybe not!!!!








Saturday, December 23, 2023

Show and tell December meeting - PiWars 2024

And the next update meeting. It might seem quick on the heels of the previous one, but that was just the blog writer's delay in getting on with it.

Not a full meeting this time due to illness (it's been going round the team), so we didn't see any progress on the nerf gun mount, but we did look at what the zombie ideas might look like and a trial camera mount for the minesweeper challenge.

Firstly a small update on the lava palaver challenge.


Apart from a reprint of the front attachment, the ball-bearing skids have been replaced by two jockey wheels to cope better with small steps in the course and with the 'bump'. The microswitch is still in place for physical bump detection but will be replaced by an optical switch when the hinge is remade.


For this iteration, the line guidance still uses IR detectors in two rows to improve resolution with off-the-shelf parts, but it's likely that a camera interface will be used in the end.

The image recognition for the Zombie Apocalypse and Eco-Disaster challenges moves on with good recognition and aiming now achievable. 


These are recognition tests as well as calibration screens, we're guessing at what zombies look like. 

Finally we have built a basic overhead camera detector for the Minesweeper challenge to check the calculations on being able to 'see' the whole arena.

Here the 160-degree camera is mounted 350mm above the surface, which gives a 600mm-radius surrounding view. Raising the gantry to 450mm extends this to an 800mm radius.


This is a bit ungainly but does demonstrate the approach of viewing the whole 1600mm-square arena in one image for processing. For the next meeting we'll be aiming to get that extended, as well as some line following and mine hunting code.





Saturday, November 11, 2023

Chassis development - PiWars 2024

 So we needed a chassis and while we had previous skeleton chassis we could have gone back to, one of us has been developing a more substantial bit of kit.

It started out as a four-wheel-drive 'tonka toy' with big rubber tyres that can handle uneven surfaces. As originally made it looked a bit blank in lemon yellow, but with a bit of paint for windows and a load to carry, it looked a bit better. The chassis is in two halves, articulated in the middle, around a central battery holder for three 18650 LiPos in series. 


The video shows a basic try-out, though not very exciting. The remote control is via a helicopter RC transmitter, converted from SBUS by a Pico PIO program; the code is on GitHub here: EastDevonPirates2024. The voltage display on the rear, as well as an obvious on/off switch, are very useful.
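For anyone curious, the SBUS frame format the PIO program has to deal with is well documented: 25 bytes per frame, with 16 channels of 11 bits packed LSB-first. A sketch of the unpacking step (our real code is in the GitHub repo linked above):

```python
def parse_sbus(frame):
    """Unpack a 25-byte SBUS frame into 16 11-bit channel values.

    SBUS layout: 0x0F header, 22 data bytes (16 x 11 bits, LSB
    first), a flags byte and a footer. Returns None on a frame
    that doesn't look valid.
    """
    if len(frame) != 25 or frame[0] != 0x0F:
        return None
    channels = []
    for ch in range(16):
        value = 0
        for bit in range(11):
            k = ch * 11 + bit            # bit index in the 176-bit stream
            if frame[1 + k // 8] & (1 << (k % 8)):
                value |= 1 << bit
        channels.append(value)
    return channels
```

The PIO side handles the inverted 100000-baud serial; this is just the byte-level unpacking that follows.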



But, never quite happy, we felt it wasn't really up to our PiWars 2024 standard, so it's been remodelled. The wheels are much smaller to meet the PiWars chassis size rules; they may change yet again, but these are the working wheels for now.


Gone are the flashy headlights, and a new detachable bonnet has been fitted to cover the remodelled battery box. 



We haven't remodelled the wiring yet, but we do have a flashy invisibility cloak (lid) so you can't see them!!! As well as being a lid, that will also be a platform for attachments, more on those in later blogs.


All that's in there right now is a Pico and an RC receiver. Independent FIT0441 brushless motors on each wheel, with built-in motor controllers and pulse counting for rotation sensing, provide the driving force.
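The pulse counting gives simple odometry: pulses to revolutions to distance. The figures below are illustrative only; the actual pulses per output revolution depend on the FIT0441's gearing, and our wheel diameter varies between challenges.

```python
import math

# Illustrative figures, not measured values for Zyderbot.
PULSES_PER_REV = 45
WHEEL_DIAMETER_MM = 70

def distance_mm(pulse_count,
                pulses_per_rev=PULSES_PER_REV,
                wheel_diameter_mm=WHEEL_DIAMETER_MM):
    """Distance travelled for a given count of pulses from the
    motor's rotation-sensing output."""
    revs = pulse_count / pulses_per_rev
    return revs * math.pi * wheel_diameter_mm
```

On the Pico, the pulse count itself would come from an interrupt or PIO counter on the motor's feedback pin.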

Fans of this look can get the stl files from github.




Monday, October 23, 2023

Lava Palaver - PiWars 2024

Introduction 

We aren't actually in PiWars 2024, but just a reserve team for the advanced category, which doesn't mean we don't have to do anything! Assessing the challenges and thinking about what's involved has to be done. 

We've all been involved with a robot workshop so haven't had much time to look at these things in depth, but here's a view of the first of the challenges listed, Lava Palaver. The official description of the challenge is here: Lava Palava – Pi Wars

The Challenge

This is a black-painted course 7 metres long and 55cm wide, with walls 7cm high. Part way along is a double chicane where the robot has to turn right then left, followed by a left and then a right. A white line 19mm wide runs along the centre of the course. Without attachments, the maximum width of a robot is 225mm, less than half the width of the course.


A course like this has been used in previous years, but as a change to the layout, a 'speed bump' will be inserted onto the course on the day of the competition, dimensions shown below.

This one feature introduces a range of considerations which also apply to the obstacle course challenge covered later. The overhang of a robot, the part of the chassis ahead of the front wheels or tracks, will need to clear the leading edge of the 'bump', and once over the top, avoid colliding with the level part of the course when coming off it. A robot could, of course, be made sufficiently robust to collide with this and carry on, either with a strengthened chassis, the addition of a skid, or with a leading idler roller or wheel. The table below lists a range of overhang lengths and the clearances required.
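The geometry behind such a table is simple: for a flat ramp of angle θ, an overhang tip needs roughly overhang × tan(θ) of ground clearance to avoid grounding at the ramp's entry and exit. The angle in this worked example is hypothetical, not the official bump dimension.

```python
import math

def clearance_needed_mm(overhang_mm, ramp_angle_deg):
    """Approximate ground clearance an overhang tip needs so the
    chassis doesn't ground out at the ramp breakover.

    Simple flat-ramp geometry with a hypothetical angle; check the
    official bump drawing for the real dimensions.
    """
    return overhang_mm * math.tan(math.radians(ramp_angle_deg))
```

For instance, a 60mm overhang meeting a 15-degree ramp would need around 16mm of clearance at the tip.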


The course is to be navigated autonomously, and 225 points are awarded for each of three runs completed within 5 minutes, with an additional 100 points for each of those runs where the robot doesn't collide with the sides of the course. The combined times of the three runs are compared with those of the other robots, and up to 150 points can be awarded on a decreasing basis for the fastest times according to the PiWars formula, described here: Formula Scoring System – Pi Wars. Finally, 275 points are awarded for the fastest individual run. The maximum awardable is therefore 1400 points. From a strategic viewpoint, completing the course three times without touching the sides and within the time limit gains 975 points, about 70% of the maximum, so from an effort perspective that is a worthwhile target in itself. Once a reliable navigation strategy is achieved, the speed can then be increased to gain the extra points competitively.

Navigation strategies

There are several strategies which come to mind, and may be adopted either individually or combined. These are dead-reckoning, wall following, line following and for the second and third runs, memorised tracks. Collision avoidance, while not a navigation strategy, is desirable so will be included!

Dead-reckoning

As the shape of the course is known, separate routines could be incorporated into the robot's navigation code to drive it in different ways depending on the robot's assumed position. For example:

Drive-forward-2.5-metres
Turn-right-45-degrees
Drive-forward-700mm
Turn-left-45-degrees
Drive-forward-1-metre
Turn-left-45-degrees
Drive-forward-700mm
Turn-right-45-degrees
Drive-forward-3-metres
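The routine above can be expressed as a data-driven script, which keeps the route separate from the motion code. The drive and turn callbacks here are placeholders for a robot's real motion functions:

```python
# The dead-reckoning routine as data: ("drive", mm) and ("turn", deg),
# with positive degrees meaning a right turn. Illustrative only.
ROUTE = [
    ("drive", 2500), ("turn", 45), ("drive", 700), ("turn", -45),
    ("drive", 1000), ("turn", -45), ("drive", 700), ("turn", 45),
    ("drive", 3000),
]

def run_route(route, drive, turn):
    """Execute each step in order; drive(mm) and turn(deg) are
    callbacks supplied by the robot (or a simulator)."""
    for action, amount in route:
        (drive if action == "drive" else turn)(amount)
```

Because the route is just data, the distances could be corrected on the day from a physical measurement of the course without touching the motion code.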

These descriptive routines could successfully navigate the centre of the course, without reference to the 'speed bump'. This is not the shortest route, of course: starting the robot close to the right-side wall, navigating close to the left-hand wall through the chicane, and returning to near the right-side wall on the lead-out would be shorter, and thus faster for a robot with similar speed capabilities. Robots with a chassis width less than the maximum would be able to take best advantage of this strategy.

Dead-reckoning has been done successfully with timed runs, but works more reliably when the wheel dimensions are combined with measured wheel rotations to calculate distance accurately. Similarly, robot direction can be estimated from relative wheel rotations (combined with wheel orientation, depending on the steering method). A robot using mecanum wheels could simply move at a required angle without changing orientation. Gyroscope/accelerometer circuits can be incorporated to provide even more orientation information.

On a simple course such as Lava Palaver, dead-reckoning can provide an effective method of autonomous completion. The assumed measurements in the example could be confirmed on the day of the competition and corrected by physical measurement of the course, including the position of the 'speed bump', to accommodate any speed/power variations which might be needed. Including collision avoidance to correct the estimates along the way will aid the navigation.

Wall following

The course has a wall on either side, and while colliding with either wall might reduce the score (and time) available, the walls do offer a consistent guidance reference throughout. Detecting walls can be done with a variety of non-contact technologies, such as ultrasonic, laser and infrared (IR) distance measurements. 
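Whichever sensing technology is chosen, the simplest use of the wall as a reference is a proportional correction on the side distance. A sketch with illustrative gain and target values that would need tuning against real speed and sensor noise:

```python
def wall_follow_correction(side_distance_mm, target_mm=120,
                           kp=0.01, limit=0.3):
    """Proportional steering correction from a side-facing distance
    sensor. Positive = steer away from the wall, negative = steer
    toward it. Gain, target and clamp are illustrative only.
    """
    error = target_mm - side_distance_mm   # positive when too close
    correction = kp * error
    return max(-limit, min(limit, correction))
```

The clamp stops a single wild reading from commanding a violent swerve, which matters with the noisier sensors described below.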

Ultrasonics

Detectors are mounted on the sides and front of the robot and provide a reading of how long an ultrasonic pulse of sound takes to reflect from a surface. These can either be self-contained, carrying out the measurement and providing the result, or controlled by the robot's controller, which times the pulse and calculates the distance directly. These detectors can be prone to errors due to the angle of the surface they are facing and its reflectivity, with hard surfaces working best. 
This is a very common low-cost ultrasonic sensor, in this case run from a controller.
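When the controller does the timing itself, the distance calculation is just the echo pulse width multiplied by the speed of sound, halved for the round trip:

```python
SPEED_OF_SOUND_MM_PER_US = 0.343   # at roughly 20 degrees C

def echo_to_mm(echo_us):
    """Convert an HC-SR04 echo pulse width (microseconds) into a
    distance (mm). The pulse covers the out-and-back trip, hence
    the divide by two."""
    return echo_us * SPEED_OF_SOUND_MM_PER_US / 2
```

On a Pico, for example, the echo pin can be timed with MicroPython's machine.time_pulse_us after a 10µs trigger pulse; the arithmetic above is the same either way.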



Laser

In recent years, small laser-equipped distance sensors have become available, such as the VL6180X or VL53L0X models, which can provide an accurate and fast measurement provided that the target surface is reflective enough. The course walls are painted black, which may significantly reduce the effectiveness of this type of sensor, but trying it may be a useful lesson. They also cost a bit more than the ultrasonic sensors, which may need to be taken into consideration when adding multiple sensors to a robot.
LIDAR (LIght Detection And Ranging) sensors are not beyond the budget of many robots (in either cost or size) and can provide a detailed map of a robot's surroundings, but they do need to be able to 'see' the course walls, which may prove difficult to engineer a robot to do in this case.
This picture shows a low-cost LIDAR sensor driven by an electric motor to give an all-round view. It costs in the region of a good serial servo, which many roboteers use.



InfraRed(IR)

These sensors rely on the level of reflected IR light from a surface, which is illuminated by an associated IR source. They can be very effective at measuring small distances where the ultrasonic detector would fail completely, but may suffer the same problems as the laser sensors when observing the black sides of the course.
This is a pair of IR sensors with adjustment for triggering sensitivity.

The following is an example sensor layout.

The rectangles describe the locations of the sensors for both wall following and collision avoidance and could be either ultrasonic, IR, or both. One option for the front collision detector is to mount it on a servo to provide a sweep of the area in front of the robot for greater coverage. 



Line Following

The white line down the middle of the course provides an immediate and consistent point of focus for guidance. Line following is a very common entry point to robotics, using error correction to establish a reliable guidance mechanism. Information about the position of the line relative to the robot and its direction is typically obtained via an array of point sensors or from a high-resolution optical camera. There are also low-resolution optical cameras available which provide a much simpler interface and a simpler image to analyse for guidance.

Point Sensor Arrays

These come in various types but are typically a light sensor and light source in adjacent pairs, providing a signal based on the reflectivity of the surface, which can be used to detect a white or black line on a black or white background. 
An individual sensor; this is a TCRT5000

Here, eight sensors have been soldered to a sensor bar and an I2C interface provides access.


Some colour sensors can detect coloured lines, enabling multiple lines to be used for different guidance purposes in the same plane. A basic array would be two such pairs a short distance apart, mounted across the robot's chassis at right angles to the direction of travel. These give basic information: when the right sensor is over the line, turn right; when the left sensor is over the line, turn left. Adding more sensor pairs enables the robot to determine the position of a line more accurately, and placing them closer together gives a greater degree of granularity of control. Using two or more rows of sensors allows more directional information to be gathered, and varying the shape of the sensor array (an arc can be beneficial), together with varying the spacing of sensors to give both coarse and fine positional sensing, can be helpful.
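With several pairs, the line position is typically taken as the weighted average of the sensors that currently see the line, giving a signed error for the steering loop. A minimal sketch with an illustrative sensor spacing:

```python
def bar_error(readings, spacing_mm=10):
    """Line position from a sensor bar, as a mm offset from the bar's
    centre. readings: list of 0/1 values, 1 = sensor over the line.
    Returns None if no sensor sees the line; spacing is illustrative.
    """
    active = [i for i, r in enumerate(readings) if r]
    if not active:
        return None
    centre = (len(readings) - 1) / 2
    mean_index = sum(active) / len(active)   # average of triggered sensors
    return (mean_index - centre) * spacing_mm
```

A negative result means the line is to the left of centre, positive to the right; None triggers a line search.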
Positioning the sensor array ahead of the robot gives more time to make corrections, while placing arrays further back on the chassis reduces over-correction. Arrays can also help with initial alignment at the start of a line-following run, ensuring the robot is positioned as straight as it can be. 
This is a basic two-sensor layout which can be very successful, but the line follower is typically slow as it constantly has to hunt for the line it's following.

Adding a third central sensor provides focus, reduces hunting and increases the speed possible. 

As with the commercial example above, this is an eight sensor bar. If the line is wide enough then the two central sensors can be the focus, but if it is a narrow line, then ignoring one of the outlier sensors and using the fourth sensor as the focus can help performance.

A nine sensor commercial sensor bar is unusual, but automatically provides for a central focus sensor. The wide sensor bar provides for an increased sensor sweep area when negotiating corners or having to perform line finding.

This is an enhanced nine-sensor arrangement with a dense focus in the centre, allowing the robot to line follow using multiple sensors, not necessarily in the centre, while also having outlier sensors to improve corner performance.
This curved sensor is common in competitive line-follower robots, maintaining a focus area and providing depth in the outlier sensors. It is very useful where the robot will encounter many corners in quick succession.




This final layout is more extreme and might be more at home in a commercial robot but can still be useful in smaller robots. The central sensor bar provides the focus and the core steering input. The lead bar provides advance information to allow the controller to take predictive actions, and the trailing sensor bar provides some alignment information to help reduce crabbing and hunting of the robot as well as aiding aligning the robot in a straight line at the start.



High Resolution Cameras

While they can require significantly greater processing power in a robot controller, the cost of adding a camera can be very modest, equivalent to a point sensor array. The processing may be more intense, but it effectively provides the same level of guidance as a multi-row array, giving a degree of lookahead absent from single-row sensors. Cameras can be mounted away from the surface of the course, so avoid being snagged on the 'speed bump'. Using cameras can give a very high quality of control, but does require a significant investment in learning to implement in code. A variation on this is to add a pan feature to the camera to provide additional lookahead capability.
A wide-angle camera such as this can provide a good view for line following, but some compensation for lens distortion might be needed for accuracy.



Combinations

Combining all of these may be difficult, but a few together would be very useful. Wall following can be a complete solution, but combining it with the others for collision avoidance makes the extra points more likely and gives greater confidence in increasing the speed. Line following can achieve the whole navigation of the course, but adding dead-reckoning information can aid speed control: accelerating the robot from the start, slowing as a corner approaches and accelerating afterwards. Without knowing where the 'speed bump' is, a robot must either moderate its speed throughout or risk crashing; dead-reckoning, however, can add a suitable speed reduction to navigate it safely. 

Memorising and recall

Having completed one successful run of the course (we will all be successful!!!), we should be able to make our robot do it again just as easily, but with the extra knowledge of having done it once. Recording the robot's good run means that, even without sensors, it should be able to repeat it, and perhaps faster. The distance to the corners is known, the 'speed bump' has been located, and where the robot can and can't run at full speed has been determined. 
The methods of recording a 'good' run are varied, but the course isn't complicated, so a small array of control points may suffice.
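One simple shape for such an array of control points: record a (distance, speed, heading) tuple whenever the demand changes on the good run, then replay the demands by distance travelled. A hypothetical sketch, not a description of our actual implementation:

```python
class RunRecorder:
    """Record control points on a good run and replay them later.

    A control point is (distance_mm, speed, heading_deg), captured
    whenever the speed or heading demand changes. Illustrative only.
    """
    def __init__(self):
        self.points = []

    def record(self, distance_mm, speed, heading_deg):
        """Store a point only when the demand actually changes."""
        if not self.points or self.points[-1][1:] != (speed, heading_deg):
            self.points.append((distance_mm, speed, heading_deg))

    def demand_at(self, distance_mm):
        """Replay: the (speed, heading) demanded at this distance."""
        current = self.points[0][1:] if self.points else (0, 0)
        for d, speed, heading in self.points:
            if d <= distance_mm:
                current = (speed, heading)
        return current
```

With odometry supplying the distance, the replay run needs no sensors at all, which is exactly what makes it potentially faster.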

What will make for a good robot for this challenge?

     The Lava Palaver is one of the challenges and going all out to win just this one thing may be the sole objective, but in PiWars, a robot chassis will need to be adaptable to the other challenges.

    Adding a line-follower array attachment will perhaps be the easiest option, but some mechanism may be needed to let it navigate the 'speed bump': fitting it with a hinge and either a roller or idler wheel for the time it is in contact with the bump. Fitting this hinged part with a detector would also tell the robot that it had found the bump. Line-following competitions sometimes feature quite extreme bumps which delicate high-performance robots just take in their stride.  

    Placing the array well in front of the robot with this mechanism would also perhaps give the robot a small amount of time to decelerate to navigate the bump safely. However, placing the array far in front of the robot may be a problem for steering depending on the technique involved. 


A few mock-up pictures of a suspended sensor running on ball castors. The spring provides suspension to hold the sensor bar down as well as accommodate the rise and fall of the bar.


     
    A robot can be up to 300mm long in its base configuration, longer than half the width of the course, so a mecanum-wheel-equipped robot could find itself colliding with the sides of the course at corners. Using skid or differential steering would let a robot steer precisely, but only if it slowed down to do so at corners. Ackermann steering would give the best control over the course at speed, but might prove problematic in the other challenges. One consistent point is that full-length robots with attachments will be at a disadvantage in this challenge.

Remember: 70% of the points are available just for finishing the course without errors three times in 5 minutes, which isn't fast, so that alone would be a good result for any entrant.

It's not certain, but I suspect East Devon Pirates will have a camera's eye view of the course with skid steering :)  There's code out there we've used before if you want a look.  uggoth/EastDevonPirates2024: Work towards Pi Wars 2024 by East Devon Pirates (github.com)