21 Oct 2020
I’ve previously written about my plan to collect much more data about my house. In the current work-from-home
environment the quality of our internet connection is paramount, and I wanted to be able to monitor it and
potentially be alerted to any degradation before it becomes an issue.
Although I’ve replaced my wifi with a UniFi-based system, I still use the router
that was supplied by my ISP - a ZyXEL VMG1312-B10D. Like most
networking equipment, the ZyXEL supports SNMP,
a protocol for reading and writing stats and configuration on devices, and aggregating them centrally.
On paper it sounds great, but unfortunately SNMP is a nightmare to work with, and you need a mapping file (a MIB) for each
device, which doesn’t exist for this model. After looking into creating this mapping, and integrating it with my preferred
technology stack of Grafana and Prometheus, I decided to change tack
and extract the data myself.
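For context, reading stats over SNMP typically looks like the command below. The community string and OID here are just the conventional defaults and the standard interfaces subtree, not anything specific to this router:

```shell
# Walk the standard interfaces subtree on the router using SNMP v2c.
# 'public' is the conventional default read-only community string.
# Without the vendor's MIB file, results come back as raw numeric OIDs
# rather than meaningful names - which is exactly the mapping problem.
snmpwalk -v2c -c public 192.168.1.1 1.3.6.1.2.1.2
```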
Fortunately, the router UI contains some plain-text data which looks easy to scrape. So, filled with confidence that
this would be an easier approach than learning SNMP, I spun up a GitHub project and got to work cranking out some code.
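As a sketch of the approach - the IP address, page path, and “Downstream rate” label here are hypothetical placeholders, not the VMG1312’s actual interface - scraping a plain-text stats page can be as simple as:

```shell
# Fetch the router's status page and pull out one stat.
# The IP, path, and field label are assumptions for illustration;
# the real page layout will differ.
curl -s 'http://192.168.1.1/statsifc.html' |
  sed -n 's/.*Downstream rate: *\([0-9]*\) kbps.*/\1/p'
```

The same parsing pipeline works on any saved copy of the page, which makes it easy to test offline before pointing it at the real router.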
14 Oct 2020
For a long time now I’ve tracked the weather outside my house with my weather station. I also
have smart electric and gas meters which display my usage on a little screen in my kitchen, but I didn’t try to do anything
useful with that data. Recently I bought an electric car, and given that it’s essentially a giant iPad on wheels, it inspired me
to look into what data I could collect from it, and from elsewhere in my house.
Towards the end of last year, I upgraded my Synology NAS to a newer model which has an Intel
rather than a MIPS processor, partly because the old one was ageing and I was worried about it dying, but mostly so I could run Docker containers
on it. I’ve been running both a Ubiquiti UniFi Controller and PiHole since then,
but I knew as part of this project I’d want to run many more containers so I took the opportunity to tidy up the setup.
Docker Compose is a tool that sits above the normal
docker command and lets you run multiple docker containers while simplifying
the management of images and the options each container needs to work correctly.
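For the two containers mentioned above, a minimal docker-compose.yml might look something like this. The images are the commonly used community and official ones, and the volume paths and host ports are placeholders, not my actual setup:

```yaml
version: "3"
services:
  unifi:
    image: jacobalberty/unifi    # community UniFi Controller image
    restart: always
    volumes:
      - ./unifi:/unifi           # placeholder path for persistent data
  pihole:
    image: pihole/pihole         # official Pi-hole image
    restart: always
    ports:
      - "53:53/udp"              # DNS
      - "8080:80"                # admin UI on host port 8080
    volumes:
      - ./pihole:/etc/pihole     # placeholder path for persistent config
```

With a file like this in place, `docker-compose up -d` starts or recreates everything in one go, which beats remembering a long `docker run` invocation per container.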
10 Oct 2020
Last week the UK government announced more than 12,000 cases
of COVID-19 - more than double the number from the day before. This increase was accompanied by the following
message on the Government’s Data Dashboard.
Due to a technical issue, which has now been resolved, there has been a delay in publishing a number of
COVID-19 cases to the dashboard in England. This means the total reported over the coming days will
include some additional cases from the period between 24 September and 1 October, increasing the number
of cases reported.
This turned out to be an issue with the reporting of positive tests caused by a row limit in Excel. An interesting
part of the story is the way it was initially reported in the media. Their focus is on the number of cases
reported each day, which is never very accurate due to the delay in processing tests, and with this backlog is
a pretty meaningless number. The key measure used to make important decisions is the rolling average of new
cases over the last seven days, and from a single day’s figure you can’t tell at a glance how that average is changing.
What is particularly odd is that it turns out that the government do publish the number of positive cases by
the date the sample was taken. It’s just that for the last few days the media narrative has been “huge number
of cases”, even though that’s largely an artefact of the old incorrect data. Sure they’re high, but they went
up a week before, not last Saturday.
30 Sep 2020
tap, tap, tap Hello, is this thing on?
I started this blog way back in 2008 and was fairly active through to 2012. Unfortunately, apart from a period in 2017, it
has been dormant since then. Working for a hedge fund, who are notoriously secretive, and having children, who are
notoriously good at sucking up all your free time, meant that blogging really wasn’t an option. Now that I work for a
more open company (Ocado Technology) and my kids are a little
older, requiring a bit less of my time, I’m hoping to resurrect my blog.
Originally I created the blog on Wordpress.com, which worked great. It was free, reliable, and easy
to use. Unfortunately, when I came to write a new post, I found the editor had become unusable for me. Perhaps I’m an old
fuddy-duddy, but I just want to be able to write my text, add some simple pictures, and have the editor get out of my way.
Sadly, it quickly became clear that WordPress was going to get in my way, so it was time to give the blog a new home and a
new lick of paint.
For many years I’ve had a Linode server, which I use
to host a few small websites. I’ve always been a fan of Markdown, so a static site generated by Jekyll seemed like the obvious choice.
22 Nov 2017
Docker is a great tool for running your applications in a consistent and
repeatable environment. One issue that I’ve come across occasionally is getting data into and out of the
environment when it’s running.
In this post I want to talk about exposing ports that are published by applications running inside a
container. When you start up the container it’s pretty easy to configure the ports you want to expose using the
-p parameter. It’s followed by the external (host) port number, a colon, and the
internal (container) port number. For example:
docker run --publish 8080:80 myapp
This will publish port 80 from inside the container as port 8080 on the host.
This works great if you know what ports you want to expose before you run the container. Once it’s running,
if you decide you need access to a port, you can’t expose it. Unless, that is, you cheat.
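One common workaround - not necessarily the cheat this post has in mind - is to leave the original container untouched and run a second, throwaway container that publishes the port and relays traffic to the first one’s bridge IP. This assumes the target container (here called `myapp`) is on the default bridge network:

```shell
# Find the running container's IP on the default bridge network.
TARGET_IP=$(docker inspect -f \
  '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' myapp)

# Run a relay that listens on host port 8080 and forwards connections
# to port 80 inside the target container (alpine/socat is a small
# image that wraps the socat relay tool).
docker run -d -p 8080:80 alpine/socat \
  tcp-listen:80,fork,reuseaddr tcp-connect:"${TARGET_IP}":80
```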