
Friday, March 22, 2024

Days tick by......

With less than a month to go, it seems we have lots of solutions which are all but ready to go. Line followers that can see lines, but can they drive seven metres in a straight line? Square finders that can see the squares, but can they drive to one and then the next? All will be revealed on the 21st April in Cambridge, but for now here's some progress.

The Lava Palaver line detector works well, but rather than a fixed mount it needed a bit of adjustment, so it has now been given an articulated mount.


This has been 3D printed to be integral with the magnet-attached battery box cover. More magnets have been deployed on the rear part of the robot, giving quick-change attachments for shooting zombies and collecting barrels.


It's great to see the layout before dozens of wires obscure the view!!!! More uses for magnets have been found in adjusting the barrel handler.


This also makes it quicker and easier to change the types of attachment to see which works best. 


This is our barrel handler in serious searchlight mode. It obviously works but needs a bit more software behind it to get barrels to their destination. 

More on this to come. 

The Escape Route challenge still needs a bit of work, but we're working from an old formula so it's more tidying up than new development. To get cleaner lines on the robot we've hidden a VL53 laser range finder inside one of the cab windows, along with IR sensors on the sides and an ultrasonic sensor on the front of the robot.


When not in use we'll be closing the window with Blu Tack!!!!

First experiments with the sensors aren't perfect, as the following video shows.


But after a bit of tuning and building a test course, it's starting to look the part.

 

This is our goth-themed video selection, but we just need to get this running a bit more smoothly.

We've had a solution for Minesweeper for some time, but we've only just now got round to fitting it to the robot.

It's an overhead gantry-mounted camera which reports the XY co-ordinate of the illuminated red square on demand from the robot. The robot will be running on mecanum wheels for this challenge, and the gantry is adjustable to keep the robot within the height limit of 450mm. Navigation is all relative to the red square, so: see a red square and drive to it!! To reduce the impact of the gantry on the image, 4mm black carbon fibre rods have been used for support. Also in the picture is the carry handle for the robot, to make handling easier and safer. A better view here.
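
In spirit the gantry camera's job is a few lines of OpenCV. Here's a minimal sketch, assuming an overhead BGR frame and that the lit square is the only strong red region; the function name and threshold values are illustrative, not our tuned ones.

import cv2

def red_square_xy(frame):
    # Convert to HSV; red wraps around the hue axis, so combine two ranges.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None                                    # no red square in view
    return m["m10"] / m["m00"], m["m01"] / m["m00"]    # centroid X, Y in pixels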



With a variety of configurations needed for the competition day, a new control panel is being created for the rear of the robot, which will allow the robot's Pi to be quickly switched between challenge modes, and will also carry a battery monitor and a start/stop button. To make sure we've got everything covered for each challenge configuration, we've started a checklist; I'm sure it'll get longer soon!!! Here's where it is now.

Challenge: Lava Palaver
  • Wheels: 105mm
  • Bonnet: Line Detector
  • Back cover: Plain
  • Attachment name: LINES
Challenge: Zombie Apocalypse
  • Wheels: 70mm
  • Bonnet: Plain
  • Back cover: Nerf Gun
  • Attachment name: ZOMBI
Challenge: Eco-Disaster
  • Wheels: 70mm
  • Bonnet: ???Not yet determined???
  • Back cover: Barrel Picker
  • Attachment name: ECODI
Challenge: Minesweeper
  • Wheels: Mecanum
  • Bonnet: Plain
  • Back cover: Camera Gantry
  • Attachment name: MINES
Challenge: Escape Route
  • Wheels: 70mm
  • Bonnet: Ultrasonic detector
  • Back cover: Plain
  • Attachment name: None
Challenge: Pi Noon
  • Wheels: Mecanum
  • Bonnet: Balloon attachment
  • Back cover: Plain
  • Attachment name: RC
Challenge: Temple of Doom
  • Wheels: 105mm
  • Bonnet: Plain
  • Back cover: Plain
  • Attachment name: RC
Next time we hope to have more demonstration videos showing we're fully ready for all challenges.....but then again maybe not!!!!








Saturday, January 29, 2022

S.L.A.M

Simultaneous Locating And Mapping summary 16/Jan/2022

Actually, there’s not a lot of mapping, as we build the arena, so we hopefully know where everything is, but locating the robot within the arena is a big deal in PiWars 2022 as there is a lot more stuff about than in 2021. 

General concept: stereo cameras and beacons.



Beacons

The logic chain ...

  • You need identifiable landmarks in a known location.
  • How do you pick them out from the background clutter? If you use LED beacons then you can drastically underexpose the image, leaving only the LEDs showing.
  • How do you identify them? Use different colours.
  • Why not a modulation? Because you have to do this fast on a moving platform, you can’t afford the time to observe the beacon over a time period to see changes.
  • What colours? Well, it turns out that the obvious RGB colours have a problem, which is that the Green is too close to the Blue for rapid distinguishing, so just Red and Blue then.
  • How high? First guess was on the ground with the cameras underslung (leaving the robot top completely clear for attachments). But what about the sheep and troughs obstructing the view, let alone attachments hanging down? So current guess is 110mm up. That means we can have the cameras on the back of the robot unobstructed.
  • What if that’s wrong? They are mounted on 8mm square section carbon fibre tube, so if we need them higher up, we just use longer tubes.
  • What kind of LEDs? First we chose RGB LEDs. This means that if we change our minds about colours we can just solder in some new resistors and get any colour we like. We started out with clear 5mm LEDs with 3D printed HD Glass diffusers, but why make work for yourself when you can get 10mm diffused LEDs?
  • How many LEDs? Given just two colours and four LEDs you get 16 combinations (see the quick sanity check below). Each arena wall has a maximum of seven beacons (if you include the corners), so each wall can have a unique pattern of beacons. If we need each beacon to be unique in the whole arena we will have to go to three colours or five LEDs.
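
A quick sanity check of the coding capacity, in plain Python:

from itertools import product

# Two colours across four LEDs gives 2**4 distinct codes.
codes = ["".join(c) for c in product("RB", repeat=4)]
print(len(codes))    # 16, i.e. 'RRRR', 'RRRB', ... 'BBBB'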

They are powered at 9V, so could use PP3s, hence the little box at the base.



Beacon identification software

First thought: use OpenCV for both image capture and processing. It's a bit worrying that it takes 25 seconds to load (not to mention 5 hours to install), but runtime is lightning fast and the loading takes place before the timed run, so it should not be a real problem. So we start with a pair of 640x480x3 RGB images (possibly on different computers) captured with OpenCV.

We can get 29 frames per second (FPS) capturing stereo pairs on a single computer (the Stereo Pi). However, it turns out that we can process them in a very basic way just with NumPy and get a calculation 'frame rate' of 1880 FPS, so simple image processing has no real effect on performance. The killer is reliability: OpenCV just doesn't control the camera hardware properly, which means that every now and then the image goes green monochrome or GStreamer is incorrectly invoked. Even after weeks of trying I cannot resolve this, so it's PiCamera and NumPy for now.

Phase 1

The base image is 640 columns wide, 480 rows high, with 3 colours (RGB).

Locating beacons

This is done by just looking for at least 5 consecutive bright columns in the image to make a column set.
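
In NumPy terms the search looks something like this sketch, assuming an underexposed 640x480x3 array in which only the beacons are bright; the threshold is a placeholder for the tuned value.

import numpy as np

def find_column_sets(img, threshold=60, min_run=5):
    # A column is 'bright' if any pixel in it beats the threshold.
    bright = (img.max(axis=2) > threshold).any(axis=0)
    runs, start = [], None
    for col, b in enumerate(bright):
        if b and start is None:
            start = col                        # a run begins
        elif not b and start is not None:
            if col - start >= min_run:
                runs.append((start, col - 1))  # run long enough to keep
            start = None
    if start is not None and len(bright) - start >= min_run:
        runs.append((start, len(bright) - 1))
    return runs    # list of (first_column, last_column) per candidate beacon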

Measuring the Angle

The dreadful barrel distortion of the lens is compensated for by a cosine formula determined experimentally from calibration images. This is used to create a lookup table converting the column number of a column set's middle column to an angle, i.e. the bearing from the camera.
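
As a sketch of the lookup-table idea: the half field of view and correction coefficient below are placeholders, the real numbers come from the calibration images.

import numpy as np

WIDTH = 640
HALF_FOV = np.radians(31.0)              # assumed half field of view

cols = np.arange(WIDTH)
# Linear pixel-to-angle mapping, scaled by an experimentally fitted
# cosine term to undo the barrel distortion.
linear = (cols - (WIDTH - 1) / 2) / ((WIDTH - 1) / 2) * HALF_FOV
K = 1.05                                 # hypothetical calibration coefficient
BEARING = linear * K * np.cos(linear)    # bearing in radians for each column

def column_to_bearing(col):
    return BEARING[int(col)]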

Locating LEDs

Look for at least 3 consecutive bright rows within a column set. Note that the LEDs are separated by quite thick separators so that they don't run into one another in the image. This produces a set of rectangles in the image.
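
The same run-finding idea, rotated 90 degrees; a sketch assuming strip is the NumPy crop covering one column set.

def find_led_rows(strip, threshold=60, min_run=3):
    # A row is 'bright' if any pixel in the strip's row beats the threshold.
    bright = (strip.max(axis=2) > threshold).any(axis=1)
    leds, start = [], None
    for row, b in enumerate(bright):
        if b and start is None:
            start = row
        elif not b and start is not None:
            if row - start >= min_run:
                leds.append((start, row - 1))
            start = None
    if start is not None and len(bright) - start >= min_run:
        leds.append((start, len(bright) - 1))
    return leds    # one (top_row, bottom_row) rectangle per LED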

Determining the colours

Because we only have red and blue, we just sum those colours in an LED rectangle; if there's more red than blue, it's a red LED, otherwise it's blue. Note that using OpenCV and YUV encoding we might be able to reliably distinguish green as well, but we can't do that currently.
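
In code this is just a channel comparison; a sketch assuming an RGB crop (with OpenCV's BGR ordering the channel indices swap).

def led_colour(rect):
    red = int(rect[:, :, 0].sum())     # channel 0 = red in RGB order
    blue = int(rect[:, :, 2].sum())    # channel 2 = blue in RGB order
    return "R" if red > blue else "B"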

Identifying the beacons

We have a database of beacons and their colour codes, so RBBR is a beacon with, from the top, Red, Blue, Blue, and Red LEDs. The database records their locations in arena co-ordinates (the garage is 0,0).

The end result of Phase 1 is a set of beacon identifiers and angles. These are written to a database (currently on the Pi Zero, but eventually it will be on the central Pi).
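
A sketch of the database idea; the codes and co-ordinates below are made up for illustration.

# Colour codes read from the top LED down; positions in arena mm,
# with the garage at (0, 0). All values here are illustrative.
BEACONS = {
    "RBBR": (0, 1200),
    "BRRB": (2200, 0),
    "RRBB": (2200, 2400),
}

def identify(code):
    return BEACONS.get(code)    # None if the pattern isn't a known beacon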

Phase 2 - Getting a Fix

Choosing the bearings

From the bearings table we choose the ones to use: we want bearings of the same beacons from both cameras, taken at the same time, and of those we want the pair of beacons furthest apart (to get the best angles); so from the beacons which occur in both images we choose the leftmost and the rightmost.

Calculating the position

This is some trigonometry, using the cosine rule and the sine rule. The result is the location of the beacons relative to the robot; translating between the co-ordinate systems we then calculate the location of the robot relative to the arena.
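
For the record, one way the sums can go, as a sketch (not necessarily our code verbatim): assuming each bearing is the interior angle between the stereo baseline and that camera's line of sight to the beacon, the sine rule gives the range, and a rigid 2D transform then recovers the robot pose. Names and conventions are illustrative, and the degenerate cases are ignored.

import math

def beacon_in_robot_frame(baseline, ang_left, ang_right):
    # Triangle formed by the two cameras and the beacon.
    gamma = math.pi - ang_left - ang_right     # angle at the beacon
    # Sine rule gives the range from the left camera.
    r_left = baseline * math.sin(ang_right) / math.sin(gamma)
    # Robot frame: origin midway between the cameras, y pointing forward.
    return (r_left * math.cos(ang_left) - baseline / 2,
            r_left * math.sin(ang_left))

def robot_pose(p1, p2, q1, q2):
    # p1, p2: two beacons' known arena positions; q1, q2: the same
    # beacons in robot co-ordinates. Solve the rigid 2D transform.
    ax, ay = p2[0] - p1[0], p2[1] - p1[1]
    bx, by = q2[0] - q1[0], q2[1] - q1[1]
    theta = math.atan2(bx * ay - by * ax, bx * ax + by * ay)   # heading
    c, s = math.cos(theta), math.sin(theta)
    return (p1[0] - (c * q1[0] - s * q1[1]),   # robot x in arena
            p1[1] - (s * q1[0] + c * q1[1]),   # robot y in arena
            theta)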

Next ...

PID control of motors using the location delivered above (PID = Proportional Integral Derivative). The planned path will be a series of locations (arena x,y co-ordinates) plus an angle (the orientation of the robot relative to the arena).
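
For anyone unfamiliar, the core of a PID controller is only a few lines. This is the textbook form, assuming a fixed timestep; the gains are placeholders, not tuned values.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt                  # I term accumulates
        derivative = (error - self.prev_error) / self.dt  # D term reacts
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)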

Performance

Cameras

The basic picamera is a very cheap device using a tiny plastic lens. It has bad barrel distortion, and you might think we would need a complex correction grid; but actually, because of the very specific use case, a fairly straightforward correction does the job. So long as the camera sensor is absolutely vertical and at exactly the same height as the middle of the LED beacon, the barrel distortion above and below the middle doesn't affect it.

Timing

Obviously the calculation of location from a stereo pair of images taken from a moving vehicle is dependent on the two images being taken at the same time. Paula has done a study of synchronisation procedures which should solve the problem of clock differences. Because picamera capture cannot be directly triggered (you are picking up frames from a continuous video stream) some more work is required to convert clock synchronicity into camera synchronicity.
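
Once the clocks agree, one way to turn that into camera synchronicity is to timestamp every frame and keep only pairs that land close together; a sketch, with names invented for illustration.

def pair_frames(left, right, tolerance=0.005):
    # left, right: lists of (timestamp, frame), each sorted by timestamp.
    pairs, j = [], 0
    for t_left, frame_left in left:
        # Advance to the right-hand frame closest in time to this one.
        while (j + 1 < len(right)
               and abs(right[j + 1][0] - t_left) < abs(right[j][0] - t_left)):
            j += 1
        t_right, frame_right = right[j]
        if abs(t_right - t_left) <= tolerance:   # close enough to be a pair
            pairs.append((frame_left, frame_right))
    return pairs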

Basic Capture Frame Rate

Stereo Pi (= Pi 3), single camera, RGB, picamera
straight capture:  1.7 fps 
capture using video port:  5.0 fps

Stereo Pi (= Pi 3), camera pair, BGR, OpenCV 
capture using video port:  29.3 fps

Single Pi Zero 2 W, single camera, BGR, OpenCV 
capture using video port:  61.7 fps

Geometry Frame Rate

computing location from column data, using NumPy:  1880 fps

Accuracy

This is the big one. To avoid the need for supplementary location systems we need to get pretty close to 1mm accuracy. 10mm might be OK, but 100mm would be a waste of time. At present we are not near that, but there is time for more optimisation and calibration.

Wednesday, January 12, 2022

After the holidays

While we've all been paddling furiously beneath the water, there isn't a lot to show for the last few weeks. One under-the-surface development is the synchronisation of the stereo vision system, which combines the output from two cameras connected to two independent computers.

I've slightly edited this to fit, but this is the detailed work team member Paula did as the solution. I'll leave most of it in Paula's own words.



Executive Summary:

Over the Christmas period I assembled the hardware and then commenced testing the accuracy of a Raspberry Pi providing a hardware pulse per second, aiming for an accuracy of under a millisecond so that a pair of Raspberry Pi Zeros, each using a camera, can create stereo pairs for range detection. This was achieved using the pps-gpio overlay module. In the process I discovered that accuracy can be maintained across a reboot or shutdown by using the appropriate driftfile or adjtime, so long as the relevant daemon process is still enabled.

Object:

To find a way of synchronising external Pi Zeros to a hardwired pulse.

Discussion:

A trawl of the internet found that we could use a pulse per second (PPS) signal via the overlays provided in the distribution.

Method:

1: Configure a Raspberry Pi as a source, using a GPS receiver dongle to give the 1PPS on a GPIO pin.
2: Find software resources to measure the uncertainty.
3: Train the internal clock using the supplied Network Time Protocol, with the addition of ntp-tools.
4: Compare the results with both GPS and an RTC chip (DS1307).
5: Report findings, recommendations and conclusions.

Hardware used:

OS Buster 10.5.63 on a Raspberry Pi 2 Model A
Real Time Clock using a DS1307
GPS module MTK3339 as the source for PPS on pin 4

Important considerations:

The pulse is measured on the leading (rising) edge on the pin. Temperature is held fairly constant so that drift of the internal clock is minimal. Unfortunately we cannot control the pressure, but for the period of use in the arena that we plan, this may be considered negligible. Configuring ntpd.conf for the best performance is quite confusing and detailed. Use a static IP address, as using DHCP can lead to higher jitter.

Software changes used:

sudo apt-get update
sudo apt dist-upgrade && sudo rpi-update
(only enable the firmware update, NOT a full update to the latest beta OS)
sudo reboot
sudo apt install pps-tools libcap-dev -y


Optional, for the Real Time Clock (RTC) module only:

Enable I2C in Preferences > Raspberry Pi Configuration > Interfaces.
sudo apt install i2c-tools -y
sudo nano /boot/config.txt - add dtoverlay=i2c-rtc,ds1307 on a new line, and check that the # is removed from dtparam=i2c_arm=on; save and close.


Optional, for GPS display of satellites etc. only:

In Preferences > Raspberry Pi Configuration > Interfaces, disable the serial console.
sudo apt install gpsd gpsd-clients python-gps -y

Now add the following, altering gpiopin to suit:
sudo nano /boot/config.txt - add dtoverlay=pps-gpio,gpiopin=4 on a new line; save and close.


Now type:

echo "pps-gpio" | sudo tee -a /etc/modules

(Note: sudo echo "pps-gpio" >> /etc/modules would fail, because the redirection runs in the unprivileged shell.)

Now reboot by typing:

reboot


On restart, check that the PPS module is loaded and pulses are being received (once connected and the source is started):

dmesg | egrep pps

or, to see the pulses:

sudo ppstest /dev/pps0 (Ctrl+C to quit)

We now have a pulse to align the internal clock to.


How does a Raspberry Pi find the time without an internal real time clock?

In the current kernel, on booting a Pi the date and time are taken from the file /etc/fake-hwclock.data, which is updated at regular hourly intervals. If, and only if, your device is able to receive valid time sources, e.g. Network Time Protocol (NTP) or the newer chronyd, then the internal time is corrected on receipt of a valid string and continually used until you reboot or shut down; hence it can perturb any statistics unless you create a driftfile (more on this later). Incidentally, in the case of the Pico, I believe there is no saved file, hence it starts from a fixed date.

Even if we have no external source, we need to define one in ntp.conf and mark it with prefer for the PPS to work (see below).

Processes:

1) from GPS hardware: 1PPS periodic > pin 4 > NTPD/CHRONYD
2) from GPS software: NMEA messages periodic > GPSD
3) from GPSD: shared memory > NTPD
4) from NTP servers: periodic > NTPD
5) from RTC: on demand


A) Using the Network Time Protocol daemon (NTPD) is very confusing in the beginning, but perseverance is required. Edit the default /etc/ntp.conf file as follows:

# /etc/ntp.conf, configuration for ntpd; see ntp.conf(5) for help

driftfile /var/lib/ntp/ntp.drift

# Enable this if you want statistics to be logged.
statsdir /var/log/ntpstats/
statistics loopstats peerstats clockstats
filegen loopstats file loopstats type day enable
filegen peerstats file peerstats type day enable
filegen clockstats file clockstats type day enable

# You do need to talk to an NTP server or two (or three).
#server ntp.your-provider.example

# pool.ntp.org maps to about 1000 low-stratum NTP servers. Your server will
# pick a different set every time it starts up. Please consider joining the
# pool: <http://www.pool.ntp.org/join.html>
server 0.debian.pool.ntp.org iburst prefer
#server 1.debian.pool.ntp.org iburst
#server 2.debian.pool.ntp.org iburst
#server 3.debian.pool.ntp.org iburst

# Server from shared memory provided by gpsd PLT
#server 127.127.28.0 minpoll 4 maxpoll 4 prefer
#server 127.127.28.0 minpoll 4 maxpoll 4

### Server from Microstack PPS on gpio pin 4 PLT
server 127.127.22.0 minpoll 4 maxpoll 4
fudge 127.127.22.0 refid kPPS
##fudge 127.127.22.0 flag3 1

# Next line just so we can process the NMEA for string offset; note the invert value from ntpq PLT
server 127.127.28.0 minpoll 4 maxpoll 4 iburst
fudge 127.127.28.0 time1 +0.320 refid GPSD flag1 1 stratum 6
#### end of changes PLT

# UK pool servers
pool uk.pool.ntp.org minpoll 10 iburst prefer

# Access control configuration; see /usr/share/doc/ntp-doc/html/accopt.html for
# details. The web page <http://support.ntp.org/bin/view/Support/AccessRestrictions>
# might also be helpful.
#
# Note that "restrict" applies to both servers and clients, so a configuration
# that might be intended to block requests from certain clients could also end
# up blocking replies from your own upstream servers.

# By default, exchange time with everybody, but don't allow configuration.
restrict -4 default kod notrap nomodify nopeer noquery
restrict -6 default kod notrap nomodify nopeer noquery

# Local users may interrogate the ntp server more closely.
restrict 127.0.0.1
restrict ::1

# Clients from this (example!) subnet have unlimited access, but only if
# cryptographically authenticated.
#restrict 192.168.123.0 mask 255.255.255.0 notrust

# If you want to provide time to your local subnet, change the next line.
# (Again, the address is an example only.)
#broadcast 192.168.123.255
broadcast 192.168.1.255

# If you want to listen to time broadcasts on your local subnet, de-comment the
# next lines. Please do this only if you trust everybody on the network!
#disable auth
#broadcastclient

# end of file /etc/ntp.conf

B) As Real Time Clocks are not provided on the board of a Raspberry Pi, we need to add one as in the options above; to read and set it we use the old tool hwclock, as I find the latest tool, timedatectl, a pain.

To set the time for the first time use:

1: sudo hwclock -w (this writes the current Pi time to the RTC)
or 2: timedatectl set-time "yyyy-mm-dd hh:mm:ss"

To read, use:

1: sudo hwclock -r
or 2: timedatectl status

You also have to fiddle about with hwclock-set:

sudo nano /lib/udev/hwclock-set

Comment out the following lines to look like:

#if [ -e /run/systemd/system ] ; then
# exit 0
#fi

Save and return.

Now we can compare results; for more, consult the Spell Foundry webpage in the references below.

To casually look at the RTC performance, use:

timedatectl status

But we really need to make the system learn the drift characteristics of the RTC. To do this we run the system for days while connected to the internet, then use the /etc/adjtime file to store the results; note that it requires a minimum of 4 hours before it records any value! Use this periodically over a few days:

sudo hwclock -w --update-drift -v

There are other parameters we need to change if running independently of the internet; these are outlined in the references below.

Results:

After an hour I get these from /var/log/ntpstats/loopstats:

[loopstats screenshot]

This is as good as we can get with a Pi and a GPS with a limited view of satellites. Note: the vacillating +/- of the accuracy reading in seconds indicates a narrowing of the measurements; further narrowing will take many hours.

And using the GPS tool:

gpsmon -n (exit with q, then return)

[gpsmon screenshot]


Conclusions:

The internal timing on a Raspberry Pi is not sufficient to maintain the needed 1 millisecond accuracy for our purposes. Just the change in ambient temperature or pressure is enough to thwart our goal in stand-alone mode, so the use of a 1PPS signal seems to be the way forward. The results speak for themselves when compared with the raw data from either NTP or GPS; disciplining the local clock drift before we launch, using the above techniques, would be sensible to maintain the required accuracy.


Further reading / References:

1: David Taylor's page https://www.satsignal.eu/ntp/Raspberry-Pi-NTP.html and additions from correspondents on that site.

2: John Watkins's Spell Foundry page https://spellfoundry.com/docs/setting-up-the-real-time-clock-on-raspbian-jessie-or-stretch/

3: 802.1AS - Timing and Synchronisation: https://www.ieee802.org/1/pages/802.1as.html

Paula Taylor 2022109