Hail to the Bus Driver

A few friends and I have got together in a new partnership to carry out an end-to-end data project using Transport for London data.  I'll cover that in later blog posts, but work continues apace as we all strive to be legends in our lunch hours and in the gaps between tending to loved ones after hours.

The main way we have been obtaining our data is via APIs.  This represented my first real brush with the practicalities of APIs, having only dealt with them in theoretical terms before.  While I don't have much difficulty with the abstract, I like to trick my brain with short-term goals, so I set myself the target of creating some kind of Internet of Things device using data from the Transport for London API, while avoiding any overly complex electronics chicanery.

I had seen that various people had used Transport for London APIs to build bus tickers, but TFL retired the original Countdown feed in favour of a unified API a year or so ago.  It seemed like an ideal opportunity to contribute to an open source community that would welcome code reintroducing the functionality of such a bus ticker, as well as a useful, practical exercise in pulling down the data feed, parsing it and extracting the features I wanted.  Further, it would help my wife and daughter get to Baby Spanish or Sing-a-Long Sally on time, so it solves a real-world problem too.

I had the data source, but I had to settle on a medium.  I had a Raspberry Pi in the house and it seemed ideal, but I wanted to create something that could run headless, so I placed an order for a UnicornHat HD by Pimoroni.  Physical installation of the UnicornHat HD on arrival was pretty straightforward, since it sits on the GPIO pins of the Raspberry Pi Model B V2.  The code I'll be using is in Python 3, so adapt as necessary if you're still on Python 2.

Installation of the Software

Installation of the software was pretty straightforward.  First, I performed the usual hygiene updates on the Raspberry Pi (running Raspbian as its OS):

sudo apt-get update

sudo apt-get upgrade

which was followed by time passing while the updates completed.  It took about 45 minutes, but I think I had a fair number to get through.

Then a final bit of hygiene:

sudo apt-get dist-upgrade

in order to ensure that the kernel was also up to date.  Since the UnicornHat HD uses the SPI interface, you may find it necessary to turn it on. This can be done with the following commands:

sudo raspi-config nonint do_spi 0

sudo reboot

followed by waiting for the reboot to complete. You can then download and run the full installer for the UnicornHat HD with the following command:

curl https://get.pimoroni.com/unicornhathd | bash

with more information available at the Pimoroni repository.  Once done, my Raspberry Pi's storage had a new addition in the form of a series of demos and examples.  They are worth checking out, particularly the simulation of Conway's Game of Life.  I'd also strongly urge you to use the diffuser included in the packaging; when they say the screen is bright, they aren't kidding.

Drawing Down the Data

From there, the first task was to get the bus ticker working.  Those who want to use the code I've uploaded to my repository (linked later in this blog post) will want to sign up for a TFL Developer Account and include the credentials they are given on signing up in their calls to the API.  It is possible to use the feed without signing up as a developer, but registering is good practice: it avoids the throttling applied to anonymous calls, helps TFL understand who its users are, and gets you better support if you run into problems.  There's also a tech forum set up by TFL, and it's worth searching it to see if someone has had a similar issue before getting in touch with TFL directly.

The first thing you will need to generate a bus ticker is the set of bus stop IDs for the route you want to check.  Unless you plan on changing bus lookups a lot, you'll probably only need to get this information once, and the TFL API offers a useful way to find it by searching for the common name of the bus stop, as sketched below.
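To illustrate that one-off lookup, here's a minimal sketch of searching the unified API's StopPoint search endpoint by common name.  I'm assuming the public api.tfl.gov.uk base URL, a top-level "matches" list in the response, and app_id/app_key query parameters carrying the developer credentials mentioned above; check TFL's API documentation if anything doesn't line up.

`import requests

# Credentials from your TFL Developer Account (placeholders)
APP_ID = 'your_app_id'
APP_KEY = 'your_app_key'

def find_stop_ids(common_name):
    # Search the StopPoint endpoint for stops matching a common name
    r = requests.get('https://api.tfl.gov.uk/StopPoint/Search/' + common_name,
                     params={'app_id': APP_ID, 'app_key': APP_KEY})
    r.raise_for_status()
    # Each match should carry the stop's id, which is what the Arrivals
    # feed expects as its stopPoint reference
    return [(m['name'], m['id']) for m in r.json().get('matches', [])]

# Hypothetical example: list candidate stop ids for a named stop
for name, stop_id in find_stop_ids('Clapham Common Station'):
    print(name, stop_id)`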

APIs are typically a good refresher in data structures.  The TFL Unified API returns data in JavaScript Object Notation (JSON) format; once we make a call to the feed we need, we get back a list of dicts (with some further nested dicts inside each one).  To make the call to the API, we use the requests package, which issues a GET request to obtain data from the specified feed.  A GET request is only used to request data, not to modify it.  A successful call should return a 200 response, meaning data has been provided.  We will of course have additional confirmation that the feed is working in the form of visual feedback on the UnicornHat HD, but more on that later.
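As a quick sanity check of the feed, a minimal sketch of that GET request might look like this, assuming the api.tfl.gov.uk base URL (the stop ID is a placeholder; substitute one found via the search above):

`import requests

stop_id = '490000001A'  # placeholder stopPoint id for the stop you care about

# GET the live arrivals for this stop from the unified API
r = requests.get('https://api.tfl.gov.uk/StopPoint/' + stop_id + '/Arrivals')

print(r.status_code)   # 200 means the request succeeded and data was returned
print(type(r.json()))  # should be a list (of dicts, one per approaching bus)`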

The information we get in each call to the API is a list of dicts.  Each dict within the list relates to a bus, and each key-value pair holds a piece of information about that bus. There are some nested dicts within each bus's dict, but we will not be using these for the purposes of this ticker.  We need the values for the following keys in each dict:

"destinationName" #corresponds to the destination of the bus

"timeToStation" #corresponds to the time in seconds to arrival of the bus at the bus station we are concerned with

Note that at this point we are only looking at the API in the UI provided by TFL, to understand its structure and content - we'll still need to build a script we can run from the command line.

Creating our Code

It may help to map out in pseudocode what we want to do:

  • Create a generalised function that builds the API URL from its base syntax, plus the relevant unique stopPoint/bus stop ID (stored in a variable), plus the suffix of the URL that comes after the unique stopPoint ID
  • Make a get request to the API using the URL
  • Parse the resulting data as a JSON object and store it in a separate variable that we will then manipulate
  • Create an empty list to contain the parsed information from our manipulation of the JSON object
  • Iterate over the JSON object, pulling out the destination name of each bus and the corresponding time to station (dividing the time by 60 to obtain the approximate time to station rounded to the nearest minute) and storing this information as a string
  • Append each string representing a bus to our prepared list and return the list as the product of our fruitful function
  • Create a while loop that continues while true, making calls to the API every 15 seconds (to balance timeliness of information with the ability to display that information on the LED screen of the UnicornHat HD)
  • Make calls to the generalised function for each bus we want to track, assigning the result to a variable, then concatenating each item in the list to display on the UnicornHat HD using suitable styling (I made heavy use of example code supplied by Pimoroni to get the ticker up and running but I plan to tinker with it further)

The resulting main function looks like this:

`import requests

def next_bus(stop_point):
    # Build the Arrivals URL for this stopPoint id
    stop = str(stop_point)
    r = requests.get('https://api-radon.tfl.gov.uk/StopPoint/' + stop + '/Arrivals')

    # Parse the JSON payload: a list of dicts, one per approaching bus
    json_result = r.json()
    all_stops = json_result

    my_buses = []
    for i in all_stops:
        # e.g. "Streatham Hill, 7-min"
        i = '{}, {}-min'.format(str(i['destinationName']), str(round(i['timeToStation'] / 60)))
        my_buses.append(i)

    return my_buses`
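For completeness, the surrounding display loop from the pseudocode boils down to something like the sketch below.  I've stood in a simple print for the UnicornHat HD drawing code, since the styling in my actual script is adapted from Pimoroni's scrolling-text example, and the stop IDs shown are placeholders:

`import time

# Placeholder stopPoint ids for the stops I want to track
STOPS = ['490000001A', '490000002B']

while True:
    ticker_lines = []
    for stop_id in STOPS:
        ticker_lines.extend(next_bus(stop_id))

    # Join into one string for the scrolling display; the real script hands
    # this to the UnicornHat HD text example rather than printing it
    print('   '.join(ticker_lines))

    time.sleep(15)  # poll the API every 15 seconds`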

Reflections

Certainly, the exercise was useful in helping to concretise the interrogation and manipulation of information from APIs.  All too often we become accustomed to dealing with tabular data, and it's a good exercise in wrangling, and in thinking about data structures, to roll up our sleeves and deal with a live stream of data.

The Raspberry Pi running headless is also a useful way to pull down modest amounts of data, and it has got me thinking about how I might build up information for future projects.  I've been playing with time series analysis, including via FB Prophet, and given the recent hot spell I'd like to try my hand at weather prediction in the next few weeks.  More on that later.

The exercise has also set me up for the rest of the end-to-end project with my peers. Onward.

The overall script is available on my GitHub repository, along with details of how to use it.  It's a work in progress, so it'll be updated over the next few weeks.  Have fun.
