Architecting a WiFi Hotspot on a Remote Island

Internet access on Lord Howe Island is very limited, a consequence of the island's extreme remoteness. I am intensely interested in providing affordable, accessible, and reliable internet connections to residents and guests.

The ‘Thornleigh Farm’ Internet (internally code-named ‘Nike’) is a newly launched service that offers public internet access on Lord Howe Island. Here are some of the architectural choices I made in developing the service.

Island Cloud

WiFi hotspot solutions require a server to authenticate, authorise, and account for network access. The current industry trend appears to be toward doing this in the ‘cloud’, i.e. in remote data-centres.

Such a solution is not suitable for Lord Howe Island, because of satellite latency. Signals travel well over 70,000 kilometres through space between transmitting and receiving stations, yielding a practical minimum round-trip latency of around 600ms, and often higher. That sort of latency creates a crappy customer experience during sign-on.
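The physics behind that floor is easy to sanity-check. Here is a back-of-the-envelope version in Python, assuming a nominal geostationary altitude of 35,786km:

    # Rough lower bound on satellite latency: each one-way hop travels
    # up to the satellite and back down to Earth.
    C_KM_PER_S = 299_792            # speed of light in a vacuum
    ONE_WAY_KM = 2 * 35_786         # ground -> satellite -> ground

    one_way_ms = ONE_WAY_KM / C_KM_PER_S * 1000    # ~239ms
    round_trip_ms = 2 * one_way_ms                 # ~477ms, before any
                                                   # switching or processing delay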

Instead, Nike utilises local servers for network control. Power is very expensive on Lord Howe Island, which led to a choice of low-voltage Intel CPUs. Two dual-core Intel ‘NUC’ machines serve as hypervisors for an array of network-control virtual machines.

Intel NUC machines and Ubiquiti switching equipment

Going local means replicating infrastructure we take for granted in the cloud. Nike utilises local DNS (Bind9), database (Postgres), cache (Redis), and web (NGINX) servers. It’s like stepping back in time, and really makes you appreciate Amazon Web Services (AWS)!

DNS Spaghetti

Bringing the Nike application “island-side” meant dealing extensively with the Domain Name System (DNS). Local requests to the application domain, thornleighfarm.com, need to be routed either locally or via satellite, depending on their purpose.

For example, new clients are served the Nike purchase page from a local server. Clients of the Thornleigh Farm Store, which offers food orders, are served from an AWS server via satellite.

A local Bind9 DNS server captures all thornleighfarm.com traffic on our network and punts it to the local Nginx server. Nginx then chooses to proxy the request to local applications, or out over the satellite link to the AWS-hosted thornleighfarm.com origin (resolved via the public Route 53 DNS), depending on the request path.
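A minimal sketch of that split in Nginx configuration terms, assuming a hypothetical aws-origin.thornleighfarm.com hostname for the AWS side (the real paths and upstreams differ):

    server {
        listen 80;
        server_name thornleighfarm.com;

        # Paths backed by island-side services stay local.
        location /internet {
            proxy_pass http://127.0.0.1:8080;      # local Nike application
        }

        # Everything else heads out over the satellite link. A static
        # proxy_pass hostname would be resolved at startup via the local
        # Bind9 (looping straight back to us), so we use a variable, which
        # forces Nginx to resolve at request time via the external resolver.
        location / {
            resolver 8.8.8.8;
            set $origin "aws-origin.thornleighfarm.com";   # hypothetical
            proxy_pass https://$origin;
            proxy_set_header Host thornleighfarm.com;
        }
    }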

An island-side client receiving content served from island servers

This request spaghetti has some cool effects: Clients requesting thornleighfarm.com/internet receive an information page when off the island, and a purchase page when they are on it.

Client Identification

From the outset, I wanted to avoid requiring user accounts. Older customers in particular react very poorly to needing to create a new user account, set a password, remember it, and so on.

Also, I am a privacy psychopath and I want to collect the absolute bare minimum customer data necessary to provide the service.

Instead, Nike identifies clients by device Media Access Control (MAC) address. This is uniquely possible on the Thornleigh network because all public clients are on the same subnet. The Nike application can get the MAC associated with a particular IP address in real time by making a request to the network router.

Part of the Nike codebase that identifies clients by MAC

A small custom HTTP API runs on our Ubiquiti EdgeMAX router; given a client IP address, it looks up the router's ARP table and returns the associated MAC if available.
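On the Nike side, the lookup can be as simple as one HTTP request. A sketch, with a made-up endpoint and response shape standing in for the real custom API:

    from typing import Optional

    import requests

    ROUTER_API = "http://192.168.1.1:8080/arp"   # hypothetical endpoint

    def mac_for_ip(ip: str) -> Optional[str]:
        """Ask the router which MAC its ARP table associates with `ip`."""
        response = requests.get(ROUTER_API, params={"ip": ip}, timeout=2)
        response.raise_for_status()
        return response.json().get("mac")        # e.g. "a4:5e:60:c2:19:07"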

Payments

Stripe is an amazing payments provider, full stop. Their API is fantastically well documented, their customer service is brilliant, and their tools are of exceptional quality. They pay out every day, and offer low fees. I cannot recommend them highly enough.

Nike ran into a minor problem with the Stripe Checkout system: it does not work in Android WebViews. Android presents captive-portal sign-on pages inside a sandboxed WebView, in a manner analogous to the Apple Captive Network Assistant, and in Android's case the sandboxing is strict enough to kill Checkout.

Stripe Elements inside the macOS Captive Network Assistant

This problem was easily solved by moving to Stripe Elements, and building a simple custom payments form utilising existing Nike styling.
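With Elements, the card form lives in our own page: Stripe's JavaScript tokenises the card in the browser, and the token is posted back to Nike, which charges it server-side. A sketch of that server-side half, with placeholder keys and amounts:

    import stripe

    stripe.api_key = "sk_test_..."               # placeholder secret key

    def charge_data_pack(token: str, amount_cents: int) -> stripe.Charge:
        """Charge a card token produced by the Elements form in the browser."""
        return stripe.Charge.create(
            amount=amount_cents,                 # e.g. 1000 == AU$10.00
            currency="aud",
            source=token,
            description="Nike data pack",
        )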

Layer 1

Deploying physical network infrastructure on Lord Howe Island presents a few challenges. First, power is scarce. Second, regulatory approvals for any sort of island-modifying work are very difficult to obtain.

The property that serves as the nexus for Nike, Thornleigh Farm, is hidden inside a shield of palm forest. It is not possible to broadcast any meaningful signal out of the property, though we do offer the Public Internet network across the farm for the use of farm customers.

Fortunately, the property includes a glorious old boat-shed sitting on Lagoon Beach. Even more fortunately, an old copper conduit runs under the forest between farm and boat-shed. This enabled the installation of an optical fibre. The shed then acts as the southernmost network node.

Ubiquiti NanoBeam AC Gen2 radios provide multiple radio links in the Nike Layer 1 network

A 5GHz link then penetrates a treeline to the north, connecting to another island business with whom we have joined forces, and whose premises serve as the northernmost node.

All in all, a mixture of Cat6 copper, 5GHz point-to-point radio, and optical fibre connects the satellite dishes with our server room, and then on to the boat sheds on the beach.

Access Control

The Thornleigh Farm network is mostly built from Ubiquiti UniFi equipment. The WiFi networks, including the Nike ‘Public Internet’ network, are controlled by the proprietary UniFi Controller (UC), running on a local virtual machine.

The UC has a publicly documented API that ostensibly allows fairly fine-grained manipulation of client network access. In practice, the documentation is not developer-friendly, and interacting with the UC was the most difficult part of the project, outside construction of the physical network.

For a while, I flirted with deploying a fully custom system utilising open-source RADIUS and ChilliSpot software. That path did not bear fruit, and I settled back on bashing through the UC API.

An example of some of the calculations that occur while authorising a client

Nike is a Python application that interfaces with the UC whenever it needs to authorise, de-authorise, or check usage for a client. Data-usage tracking is handled by custom code and stored in our local Postgres database.
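The core of that interface is a handful of HTTP calls against the controller. A sketch using the community-documented endpoints, with placeholder host, site, and credentials:

    import requests

    UC = "https://unifi.local:8443"      # placeholder controller address

    session = requests.Session()
    session.verify = False   # the UC ships with a self-signed certificate

    # Authenticate against the controller.
    session.post(UC + "/api/login",
                 json={"username": "nike", "password": "secret"})

    def authorise_guest(mac: str, minutes: int, quota_mb: int) -> None:
        """Authorise a client device for a time window and data quota."""
        session.post(UC + "/api/s/default/cmd/stamgr", json={
            "cmd": "authorize-guest",
            "mac": mac,
            "minutes": minutes,
            "bytes": quota_mb,   # despite the name, the UC reads this in megabytes
        })

    def deauthorise_guest(mac: str) -> None:
        """Kick a client off the guest network."""
        session.post(UC + "/api/s/default/cmd/stamgr",
                     json={"cmd": "unauthorize-guest", "mac": mac})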

The custom implementation allows us to do some fun stuff, like offering refunds of partial usage and letting customers stack multiple data packs on top of each other. Nike continuously updates the UC whenever a client’s remaining quota changes, and the UC internally handles disconnecting the client when they exceed their quota.
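The stacking logic itself is plain arithmetic. An illustrative version, with hypothetical field names rather than the actual Nike schema:

    from dataclasses import dataclass

    @dataclass
    class DataPack:
        megabytes: int                    # size of a purchased pack

    def remaining_megabytes(packs, used_mb: int) -> int:
        """Sum the stacked packs, subtract metered usage, floor at zero."""
        purchased = sum(pack.megabytes for pack in packs)
        return max(purchased - used_mb, 0)

    # Two stacked 1GB packs with 1.5GB used leaves 512MB to report to the UC.
    assert remaining_megabytes([DataPack(1024), DataPack(1024)], 1536) == 512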

Final Thoughts

Isolation, latency, and high operating costs make Lord Howe Island a difficult environment in which to deploy public internet. The internet is, however, an increasingly crucial element of modern life. Participation in modern economic activity requires a reliable connection, and I hope that in the long term Nike can provide a valuable service to residents and guests of Lord Howe Island.

If you’d like to discuss the project, hit me up on Twitter.