Easy and not-so-easy listening · 22 days ago

I’ve been getting back into podcast listening of late. Rather than my usual guitar or tech podcasts, I’ve been trying to listen to a mix of things to improve my understanding of the current issues we find ourselves in around politics and the environment, mixed in with some podcasts that look at more light-hearted topics but are still educational in some way.

Here’s a list of things I can recommend, grouped by topic; it also roughly happens to be the order of priority in which I listen to them when they pop up on my podcast player.

Politics

Polarized by the RSA – this is an attempt to look at why the current political environment is as divisive as it is. It (usually) stays away from the day-to-day politics, and tries to understand the general landscape of why we seem to have ended up in a situation where we have two sides that share no common ground or hope for compromise.

I binge-listened to this from the start over the course of a couple of months, and if you do try this one I recommend listening in order.

Talking Politics – whereas Polarized tries to take a macroscopic look, Talking Politics is more a mix of current affairs analysis and longer term trend reviews, but is always more analytical and thoughtful, avoiding personality politics and focussing on the actual political/legal side of things.

I tend to prefer the longer-term episodes, but in the current run-up to a general election I do find the more microscopic view useful.

Environmental

The Beam Podcast – The Beam is a magazine publication that looks into environmental topics, and their podcast continues that theme. I do like that it takes broader topical looks, but compared to the politics podcasts I find it lacks a little actionable bite – though that’s probably more a reflection of the domain. Still, worth listening to.

General Interest

Reply All – Reply All attempts to explain how the modern Internet impacts life, from a non-technical standpoint. It has a standard set of topics it cycles through, my favourite of which is super-tech-support: here they dig into things like how someone’s Snapchat account got hacked, or how someone trying to listen to relaxation sounds on their Alexa got something with creepy footsteps in it – all of which exposes how interconnected everything is, and how nothing comes from where you expect.

99% Invisible – Each week 99PI picks a different niche topic and takes a detailed look at it. It usually has a slight design bent, but topics range from automating pepper farming (which made me aware of how much automation in farming harms bio-diversity), to the design of call-holding systems, to how placing a garden-store Buddha statue can change crime patterns significantly. It’s not very deep, but it’s usually quite interesting, and makes a nice antidote to the more serious podcasts I listen to.

The Incomparable – The Incomparable is a film/book/game review show, commonly with a science-fiction theme, but not exclusively. I tend to pick and choose which episodes to listen to on this one, as either I’m not interested in the latest Marvel/Star Wars films, or I’m trying to avoid spoilers. However, it has introduced me to some great classic films, like The Thin Man, and I enjoy their annual review of the nominations for the two big sci-fi book awards as a way of finding new reading material.


Remember Grandad · 185 days ago

My Grandad passed away recently – someone who was a large part of my childhood. As I’ve got older and things in life have got in the way, I’d not seen him (or the rest of my family) nearly as often as I should have, but he’ll always be a special person to me, and someone who had a large impact on my life. It’s not just me: he and Nanna had five daughters, so I have a lot of cousins, and we all share similar fond memories of our Grandad.

Of the countless memories I have of him, there are two that for some reason stick out right now, both I guess from when I was around 12 or thereabouts.

The first, I think, captures his fun side. One evening, whilst everyone else had their supper cup of tea in the living room, Grandad and I went into the kitchen to have a biscuit with our tea – he was always a fan of gingernuts – and I realised after nattering with him for a while that we’d eaten the entire packet between us! A slightly mischievous act, and I’ve no idea why it sticks with me. Perhaps because it’s one of the few memories that’s just me and him rather than the larger family unit. But it also shows the sense of child-like fun he had, and this seemed to cement part of that for some reason. To this day I still have a (helpfully) similar sense of childish fun, which I attribute in large part both to him and to my gran on the other side of the family.

The other memory that came to mind is of him being amazed at how brattish I was being about not getting to play an arcade game I really wanted to play (as ever, it was something my “cooler” friends were playing, and so I felt the need to play it to just be on par, but didn’t have the money to do so). He wasn’t being nasty about his observation, just bemused I think, hardly surprising given what his generation would have had to deal with at a similar age. But despite clearly seeing me for the brat I was a lot of the time as a child, he still treated me as someone worthy of attention and playing with. I hope that I can be as inclusive and as generous as he was to others, and I guess this is a textbook example of unconditional love. Dear me, I must have been a major pain in the neck as a child (sorry Mum & Dad), but Grandad still gave me attention like the rest of his grandkids.

His funeral was this Monday gone, and at the get-together of family and friends afterwards it was lovely to see everyone share their happy memories. Grandad disliked dark clothing, so we all tried to wear something bright – thankfully I’m well stocked for bright floral shirts. But the lasting memory will be watching the set of grandkids playing the games he’d play with us all – we brought in the marbles and the dominoes and the other toys he’d spent hours playing with us, and we had some more fun in his memory. To me that’s a near-perfect way to remember his impact on us.

Photo of my Grandad with me and my sister, on a beach, possibly mid-80s

Grandad’s passing was a reminder to me that our time is finite, something that’s easy to forget in the day-to-day. That blue guitar I recently completed, which everyone has said nice things about, had been stuck in limbo waiting for me to finish it as I procrastinated out of fear of things not being perfect. But Grandad’s passing spurred me to just get on with it – stop worrying about the maybes, just do your best and give it a go. So that guitar is there thanks to his memory, and I’ll always think of him now when I think of it.

I was sharing my memories above with my Mum after the funeral, and she remembers me at a similar age complaining for the n-th time that I was bored (I really was a terrible child), and Grandad turning around and saying “life is boring – you have to make it not boring”. Words that didn’t take at the time, but speak to me now. This is definitely one of the reasons I’m very fortunate to have Laura in my life: she helps life not be boring, both by being there and by encouraging me to do things I might not otherwise try.

And that saying is also the broader point of this note: time is limited, and whilst I don’t think you can treat every moment as precious (that’d be as tiring as it is impractical), it’s worth being reminded that you can’t put things off indefinitely. Whatever it is that’s important to you, make sure you make time for it, as only you can make it happen: life is boring, you have to make it not boring.

I’ll try my best Grandad.


Better testing for golang http handlers · 659 days ago

I’m writing this up as it doesn’t seem to be a common testing pattern in the Go projects I’ve seen, so it might prove useful to someone somewhere, as it did for me in a recent project.

One of the things that bugs me about the typical golang http server setup is that it relies on hidden globals. You typically write something like:

package main

import (
    "net/http"
    "fmt"
)

func myHandler(w http.ResponseWriter, r *http.Request) {
     fmt.Fprintf(w, "Hello, world!")
}

func main() {
     http.HandleFunc("/", myHandler)
     http.ListenAndServe(":8080", nil)
}

This is all lovely and simple, but there’s some serious hidden work going on here. The bit that’s always made me uncomfortable is that I set up all this state without any way to track it, which makes it very hard to test, particularly as the http library in golang doesn’t allow for any introspection on the handlers you’ve set up. This means I need to write integration tests rather than unit tests to have confidence that my URL handlers are set up correctly. The best I’ve typically seen done, test-wise, with this setup is to test each handler function individually.

But there is a very easy solution to this; it’s just not really considered something you’d ever do in the golang docs – they explicitly state that no one would ever really do this. Clearly their attitude to testing is somewhat different to mine :)

The solution is in that nil parameter in the last line, of which the golang documentation states:

“ListenAndServe starts an HTTP server with a given address and handler. The handler is usually nil, which means to use DefaultServeMux.”

That handler is a global variable, http.DefaultServeMux, which is the request multiplexer that takes the incoming requests, looks at the paths, and then works out which handler to call (including the default built-in handlers that return 404s etc. when there’s no match). This is all documented extremely well in this article by Amit Saha, which I can highly recommend.

But you don’t need to use the global: you can just instantiate your own multiplexer object and use that. If you do this, your code stops using side effects to set up the http server and suddenly becomes a lot nicer to reason about and test.

package main

import (
    "net/http"
    "fmt"
)

func myHandler(w http.ResponseWriter, r *http.Request) {
     fmt.Fprintf(w, "Hello, world!")
}

func main() {
     mymux := http.NewServeMux()
     mymux.HandleFunc("/", myHandler)
     http.ListenAndServe(":8080", mymux)
}

The above is functionally the same as our first example, but no longer takes advantage of the hidden global state. This in itself may not seem to buy us much, but in reality you’ll have lots of handlers to set up, and so your code can be made to look something more like:

func SetupMyHandlers() *http.ServeMux {
     mux := http.NewServeMux()

    // setup dynamic handlers
     mux.HandleFunc("/", MyIndexHandler)
     mux.HandleFunc("/login", MyLoginHandler)
    // etc.

    // set up static handlers
     mux.Handle("/static/", http.StripPrefix("/static/", http.FileServer(http.Dir("/static/"))))
    // etc.

     return mux
}

func main() {
     mymux := SetupMyHandlers()
     http.ListenAndServe(":8080", mymux)
}

At this point you can start using SetupMyHandlers in your unit tests. Without this, the common pattern I’d seen was:

package main

import (
    "net/http"
    "net/http/httptest"
    "testing"
)

func TestLoginHandler(t *testing.T) {

     r, err := http.NewRequest("GET", "/login", nil)
     if err != nil {
          t.Fatal(err)
     }
     w := httptest.NewRecorder()
     handler := http.HandlerFunc(MyLoginHandler)
     handler.ServeHTTP(w, r)

     resp := w.Result()

     if resp.StatusCode != http.StatusOK {
          t.Errorf("Unexpected status code %d", resp.StatusCode)
     }
}

Here you just wrap your specific handler function directly and call that in your tests. That’s very good for testing that the handler function works, but not so good for checking that someone hasn’t botched the series of handler registration calls in your server. Instead, you can now change one line and get that additional coverage:

package main

import (
    "net/http"
    "net/http/httptest"
    "testing"
)

func TestLoginHandler(t *testing.T) {

     r, err := http.NewRequest("GET", "/login", nil)
     if err != nil {
          t.Fatal(err)
     }
     w := httptest.NewRecorder()
     handler := SetupMyHandlers()  // <---- this is the change :)
     handler.ServeHTTP(w, r)

     resp := w.Result()

     if resp.StatusCode != http.StatusOK {
          t.Errorf("Unexpected status code %d", resp.StatusCode)
     }
}

Same test as before, but now I’m checking that the actual multiplexer used by the HTTP server works too, without having to write an integration test for that. Technically, if someone forgets to pass the multiplexer to the server then that won’t be picked up by my unit tests, so they’re not perfect; but that’s a single-line mistake that breaks every URL handler, so I’m less concerned about that slipping past the developer than about someone forgetting one handler registration in dozens. You’ll also automatically be testing any new http wrapper functions people insert into the chain. This could be a mixed blessing perhaps, but I’d argue it’s better to make sure the wrappers are test friendly than to have less overall coverage.
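
As an aside, here’s a rough sketch of the kind of wrapper function I mean – this is my own illustration rather than part of the setup above, it needs the log package imported, and to use it SetupMyHandlers would return http.Handler and end with return logRequests(mux) rather than return mux:

func logRequests(next http.Handler) http.Handler {
     return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
          // Log each request before handing off to the wrapped handler.
          log.Printf("%s %s", r.Method, r.URL.Path)
          next.ServeHTTP(w, r)
     })
}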

The other win of this approach is that you can also unit test that your static content is being mapped correctly, which you can’t do using the common approach: you can happily test that requests to the static path I set up in SetupMyHandlers return something sensible. Again, that may seem more like an integration-style test than a unit test, but if I add a unit test for it then I’m more likely to find and fix bugs earlier in the dev cycle, rather than wasting time waiting for CI to pick up my mistake.
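
For example, a test along these lines would cover the static route (just a sketch – the css/site.css path is made up, so point it at a file that actually exists in your static directory):

func TestStaticHandler(t *testing.T) {

     r, err := http.NewRequest("GET", "/static/css/site.css", nil)
     if err != nil {
          t.Fatal(err)
     }
     w := httptest.NewRecorder()
     handler := SetupMyHandlers()
     handler.ServeHTTP(w, r)

     resp := w.Result()

     if resp.StatusCode != http.StatusOK {
          t.Errorf("Unexpected status code %d", resp.StatusCode)
     }
}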

In general, if you have global state, you have a testing problem, so I’m surprised this approach isn’t more common. It adds hardly any code complexity, but your test coverage grows a lot as a result.


Some luthier notes · 748 days ago

I’ve spent the week locked in Makespace working on guitars, and thought I’d write up some notes on the things I’ve been working on to give insight into what goes into making guitars. You can see it here on the Electric Flapjack blog.


Managing GOPATH for multiple projects with direnv · 761 days ago

I’ll stop with the golang tips shortly, but here’s another quick time saver in case you’ve not seen this before: you can use direnv to manage your GOPATH settings for each of your projects.

direnv is a small utility that will set/unset environment variables as you enter/leave directories. It’s dead easy to set up, and is in homebrew if you’re on a Mac. This means I can set a GOPATH specifically for each go project, without having to remember to do GOPATH=$PWD each time – direnv just sets it as I change directory into the project, and unsets it when I move away.
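
For the curious, the setup is just a .envrc file in the project root – this is a minimal sketch, adjust to taste:

# .envrc in the project root
export GOPATH=$PWD

The first time you cd into the directory direnv will ask you to run direnv allow to approve the file; after that it exports GOPATH whenever you enter the project and unsets it again when you leave.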

This can be useful for other things too, like setting PYTHONPATH or other project-specific environment variables.

Hat tip to Day Barr for alerting me to that one.


Handling golang third party dependencies robustly · 769 days ago

I wrote recently about my thoughts on golang, concluding that although far from perfect, I quite like the language as it makes solving a certain class of problem much easier than traditional methods.

One of the things I was a bit dismissive of was how it manages packages. Whilst I’m not a fan of its prescriptive nature, its out-of-the-box behaviour is to my mind just not compatible with delivering software repeatedly and reliably for production. However, it’s fairly easy to work around this; I’ve not seen anyone else use this particular approach, so I thought I’d document it for future people searching for a solution.

The problem is this: by default golang has a nice convenience feature that third party packages are referred to by their source location. For example, if I want to use GORM (a lightweight ORM for Go), which is hosted on github, I’ll include it in my program by writing:

import "github.com/jinzhu/gorm"

And as a build stage I’ll need to fetch the package by running the following command:

go get -v github.com/jinzhu/gorm

What this command does is check out the package into your $GOPATH/src directory at $GOPATH/src/github.com/jinzhu/gorm, doing a git clone of whatever their latest master code is.

On one hand this is very nice: how to find and fetch third party dependencies is built in. However, it enforces two things that I don’t want when I’m trying to build production software:

  1. I now rely on a third party service being around at the time I build my software
  2. The go get command always fetches the latest version, so I can’t control what goes into my build

Neither of these is something I’m willing to accept in my production environment, where I want to know I can successfully build at any time, and I want full control over what goes into each build.

There is a feature of the golang build system you can use to solve this; it’s just not that obvious to newcomers, and on its own it isn’t very useful. So here’s my solution, based on the assumption that you’re already using git for version control and have $GOPATH pointed at your project’s root folder:

  1. Clone the project into your own code store repository. I always do this anyway, as you never know when third party projects will vanish or change significantly.
  2. Create a vendor directory in your project. The golang build system will look in $GOPATH/vendor for packages before looking in the $GOPATH/src directory.
  3. Add as a git submodule the project at the appropriate point under vendor. For GORM that’d be vendor/github.com/jinzhu/gorm, similar to how go get would have put it in the src directory.
  4. Replace your go get build step with a git submodule update command.

And voilà, you’re done. Using git submodules means you can control which commit of the third party project you’re using, and by pointing it at your own mirror, you can ensure that as long as your own infrastructure is up you can still deliver software, regardless of external goings-on.
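
For reference, steps 3 and 4 end up being something like the following – a sketch only, as the mirror URL here is made up, so substitute wherever you cloned GORM to in step 1:

# Step 3: add your mirror as a submodule at the path go expects
git submodule add https://git.example.com/mirrors/gorm.git vendor/github.com/jinzhu/gorm

# Step 4: in the build, fetch the pinned submodule commits rather than running go get
git submodule update --init --recursive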

As a friend of mine pointed out, there are tools you can use to try to manage third party code into the vendor location, such as vndr, but the fewer tools I need to install to build a product the better – still, if you want to avoid creating the directories yourself then you should give it a look.


Some thoughts on Golang · 777 days ago

The Go programming language has been around for about a decade now, but in that time I’ve not had much call to create new networked services, so I’d never given it a go (I find I can’t learn new programming languages in the abstract – I need a project, otherwise the learning doesn’t stick). However, I had cause to redo some old code at work that had grown a bit unwieldy in its current Python + web-framework-du-jour form, so this seemed like a chance to try something new.

I was drawn to Go by the promise of picking up some modern programming idioms, particularly around making concurrency manageable. I’m still amazed that technologies like Grand Central Dispatch (GCD), which save programmers from worrying about low-level concurrency primitives (which as weak-minded humans we invariably get wrong), are not more widely adopted – modern machines rely on concurrency to be effective. In the Bromium Mac team we leaned heavily on GCD to avoid common concurrency pitfalls, and even then we created a lot of support libraries to simplify it even further.

Modern web service programming is inherently a problem of concurrency – be it on the input end, where you’re managing many requests to your service at once, or on the back end, where you’re trying to offload long-running and periodic tasks away from the request-serving path. Unfortunately the dominant language for writing web services, Python, is known to be terrible at handling concurrency, so you end up offloading concurrency to other programs (e.g., nginx on the front end, celery on the back end), which works, but means you can only deal with very coarse-grained parallelism.

Go seems to have been designed to solve this problem. It’s a modern language, with some C-like syntax but free of the baggage of memory management and casting (for the most part), and it makes concurrency a first-class citizen in its design. Nothing it does is earth-shatteringly new – the goroutine concurrency primitive is very old, and the channel mechanism used to communicate between these routines is standard IPC fare – but what it seems to pull off is putting these things together in a way that is very easy to leverage. It lacks some of the flexibility of the aforementioned GCD to my mind, but ultimately it is sufficiently expressive that I find it very productive for writing highly concurrent code safely. It actually makes writing web services that have such demands fun again, as you end up with a single binary that does everything you need, removing the deployment tedium of the nginx/python/celery pipeline. You can just worry about your ideas, which is really all I want to do.
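
To give a flavour of that, here’s a minimal sketch of my own (the hostnames and the sleep are just stand-ins for real work): each fetch runs in its own goroutine and reports back over a channel.

package main

import (
    "fmt"
    "time"
)

// fetch pretends to do some slow network work and reports back on a channel.
func fetch(url string, results chan<- string) {
     time.Sleep(100 * time.Millisecond) // stand-in for a real request
     results <- "fetched " + url
}

func main() {
     urls := []string{"a.example.com", "b.example.com", "c.example.com"}
     results := make(chan string)

     // One goroutine per URL; they all run concurrently.
     for _, u := range urls {
          go fetch(u, results)
     }

     // Collect one result per goroutine from the channel.
     for range urls {
          fmt.Println(<-results)
     }
}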

Another nice feature is the pseudo object-orientation system in Go. Go has two mechanisms that lead you in the same direction as traditional OO programming – structs and interfaces. Structs let you define data structures much as you might in C, but you can use composition to get a sort of inheritance if you need it, and interfaces just define a list of method signatures. But an interface isn’t tied to a struct as it might be in a traditional OO language – they’re defined separately. This seems weird at first, but is really quite powerful, and makes writing tests very easy (and again, fun), as it means you can “mock” say the backend object simply by writing an object that satisfies an interface, rather than worrying about actual inheritance. Again, it’s nothing new, it’s just pulled together in a way that is simple and easy to be productive with.
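
Here’s a small sketch of my own to show what I mean – fakeStore satisfies Store purely by having a method with the right signature, with no inheritance declared anywhere:

package main

import "fmt"

// Store describes the behaviour our code needs; any type with a
// matching GetUser method satisfies it automatically.
type Store interface {
     GetUser(id int) (string, error)
}

// fakeStore is the kind of stand-in you'd hand to code under test.
type fakeStore struct{}

func (fakeStore) GetUser(id int) (string, error) {
     return "test user", nil
}

// GreetUser only knows about the interface, so it works just as well
// with a real database-backed store as with the fake above.
func GreetUser(s Store, id int) (string, error) {
     name, err := s.GetUser(id)
     if err != nil {
          return "", err
     }
     return "Hello, " + name, nil
}

func main() {
     greeting, _ := GreetUser(fakeStore{}, 42)
     fmt.Println(greeting)
}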

The final nicety I’ll mention is an idiom that we forced on ourselves in the Mac team at Bromium – explicit error handling, with errors returned explicitly alongside the valid result. This makes writing code that handles errors feel really natural, which matters: programmers are inherently lazy people, and a common cause of bugs is the programmer simply not thinking about error handling. Go’s library design and error type make this easy.

For all this, Go has its flaws. Out of a necessity to allow you to have values that may have no value, Go has a pointer type. But it also makes accessing concrete values and pointers look identical in most cases, so it’s easy to confuse the two, which can occasionally lead to unexpected bugs, particularly when looping over things and taking a pointer to the loop variable rather than to the value it currently holds. The testing framework is deliberately minimal, and the lack of class-based testing means you can’t really use setup and teardown methods, which leads to a lot of boilerplate code in your tests – a shame, as otherwise Go makes writing tests really easy. And let’s not get started on the package system in Go, which is opaque enough to be a pain to use.

It’s also a little behind, say, Python in terms of full-stack framework support. The Go community seems against ORMs and Django-style stacks, which means it’s hard to justify its use if you’re writing a website of any complexity for humans to use. There is at least a usable minimal ORM in the form of GORM that saves you from writing SQL all the time.

But for all its flaws, I really have taken to Go: I’ve written a small but reasonable amount of production-quality code in it now, and I still find it a joy to use because it’s so productive, particularly for writing backend web services. There’s not enough mature framework support yet that I’d use it instead of Django/Python for a full user-facing website, but for IoT backends and the like it’s really neat (in both senses).

If any of this sounds interesting to you then I can recommend The Go Programming Language book. Not only is it easy to read, it gives you good practical examples that let you see the power of its primitives very quickly. If you think I’ve got it wrong with any of my criticisms of the language, then do let me know – I’m still a relative newbie to golang and very happy to be corrected so I can be even more productive with it!


Practice practice · 826 days ago

About 18 months ago I wrote something here about how I was trying to get better at playing guitar, and how I was going to try to post a video to YouTube once a week with a new song snippet as a way of having some discipline. If you do recall that, you’ll also know I didn’t do it (I think I managed one more after that post).

But the reason for not doing it was at least reasonable: I actually started taking lessons, and my teacher makes me practice daily, so the discipline sorted itself out, and saved you, dear reader, from lots of bad cover songs.

Instead you can watch some bad bits of me doing blues style improv from my last daily practice session, warts and all:

Now, I may not be giving Joe Bonamassa cause to question his career choices, but I look at this and am somewhat amazed at how far I’ve managed to come in 18 months thanks to David, my guitar teacher. When I wrote that original post back in May I was just trying to copy bits of other songs, and here I am today able to throw down a 12-bar blues backing track and then ad-lib over it to my heart’s content, even throwing in a bit of wah pedal (albeit in a slightly repetitive and formulaic way :).

Partly this is the direction David and I have been working towards – rather than learning to cover old songs or working towards grades, I’ve just been trying to understand the building blocks of playing the blues: the grammar and vocabulary that make up a song. I may not yet be writing more than basic sentences, and while learning a song might occasionally feel more satisfying in the short term, it’s when I get time to do a little bit of ad-lib like the above that it all pays off. Ask me to play a song and I’m hopeless, but give me a looper pedal and I can entertain myself for an age with things like this.

The closest I get to playing an actual song is things like this, where I’m riffing on the great Jeff Beck Group track Rock My Plimsoul (who in turn were riffing on B.B. King’s Rock Me Baby):

I’ve still a long way to go, of that I’ve no delusion – the open stages of Cambridge are in no danger of seeing me any time soon. But it’s nice to occasionally reflect that one has at least made some progress, even if I can’t play a tune on demand :)


Ocean Waves · 846 days ago

I’ve been blogging a bunch about my brother’s band of late – mostly as there’s a lot happening right now. After the success of their King Tut’s gig, they’ve just launched a Kickstarter to get the song that was one of the stars of the set out as a single:

If you like your rock music, go check this out and give them a nudge to help get the single out.


IKARI at King Tut's · 851 days ago

I’m a very proud brother: my brother Tristan and his friends, known collectively as IKARI, played their first gig at the famous King Tut’s Wah Wah Hut in Glasgow, and really knocked it out of the park.

Photo: Tristan doing what he does (Michael Dales, 500px)

They played a six-song set as part of a four-band line-up at King Tut’s that evening. They executed their songs flawlessly, they sounded great, and the place was packed (by far the biggest audience of the evening). They even had the audience singing along loudly to their single Ghosts and the as-yet-unreleased Ocean Waves.

Photo: A wee bit of metal (Michael Dales, 500px)

I was also a wee bit proud for another reason: Tristan used the guitar I built for him on stage. That guitar, which was a labour of love for the better part of a year, has now featured both in the single IKARI recorded and in their live set: I feel truly honoured that this thing I’ve made has been a part of IKARI’s story. In software we have the concept of shipping our products: I’d say this guitar has definitely shipped :)

