I was four years old the first time I saw a computer. It was 1981, inside a room at The Spastic Centre (that's what the school was called, and I see no reason to sanitise it) and the machine was enormous. A green phosphor screen with a blinking cursor. A rectangle of light in a room full of things designed to fix me.
I wasn't allowed to touch it. Too young, they said.
But something locked into place. What I was looking at was a machine that didn't care how you moved. It only cared what you typed. It occurred to me much later that this was the first honest relationship I'd had with a piece of equipment.
"What I was looking at was a machine that didn't care how you moved. It only cared what you typed."
The loudest kid in the room
I have cerebral palsy. The fine motor version. My handwriting was, and remains, an offence against legibility. Many earnest people tried to remedy this. They failed, because some things cannot be remedied, only worked around.
The workaround was a typewriter.
If you've never been the only child in a quiet classroom hammering away on a mechanical typewriter while everyone else writes in silence with pencils, allow me to describe the experience: you are conspicuous, you are loud, and the machine announces your difference with every keystroke. It is not dramatic. It is merely relentless. And that, if you're taking notes, is the authentic texture of disability. Not the catastrophic event, but the daily accumulation of small conspicuousnesses that you learn to absorb without counting.
Magazines, BASIC, and a Commodore 64
At home, the rules were different. My parents got me a Commodore 64 with a tape deck, later an external floppy drive, and I taught myself to code by typing out programs line by line from computer magazines. Games, utilities, whatever the month's issue contained.
Every line had a physical cost. My fingers don't move the way the manufacturer intended, and typing has always been harder than it looks. But here is the critical thing about copying code from a magazine: nobody is watching, nobody is timing you, and the computer is magnificently indifferent to your circumstances. It runs the program or it doesn't. The output is the output. Merit, in its purest form.
That was the first time technology felt like mine. Not assistive technology, not adaptive technology, not any of the categories that would later be invented to make funding bodies comfortable. Just technology. Mine.
A BBS in the bedroom
By 1991 I was fourteen and running a bulletin board system from my bedroom. A 2400 baud modem at first, later 14.4k. The software was RemoteAccess, running on OS/2, which, for those keeping score, was not the casual choice. Most teenage sysops ran DOS with off-the-shelf packages. I preferred doing things properly. This, I've since learned, is a character trait rather than a phase.
The BBS itself was unremarkable: chat, file access, Fidonet newsgroups. What was remarkable was the architecture of the interaction. People dialled in, engaged with text on a screen, and formed opinions about the person on the other end based entirely on what he wrote. Nobody knew I had CP. Nobody heard a typewriter. The green screen had finally kept its promise: a machine that cared only about what you typed.
I got into the BBS scene through a colleague at a video store where I worked Saturdays. He ran a board, showed me how it was done. This is worth emphasising: my on-ramp into online communities was not a disability program, not an assistive technology initiative, not a scheme designed to include me. It was a mate at work. The most effective accommodations are often the ones nobody thought to call accommodations.
Telnet to port 25
I enrolled in computer science at university in 1995, but I was already working in the industry before I graduated. A friend knew someone running a small ISP. This was the mid-nineties, when the entire Australian internet was held together by people who knew someone. I started coding their website during holidays. Eventually I took the job full-time and finished the degree part-time.
Then Netscape came recruiting at the university. They needed someone to write plugins in C for their mail server. I applied. I was told, with what I detected as mild astonishment, that I was the only candidate who could answer one particular question: how can you tell if a mail server is up?
Telnet to port 25.
Everyone else had theory. I had a running mail server I'd been working on at an ISP. The disability was not discussed because there was nothing to discuss. I was simply the candidate who had already done the job, which in my experience is a more reliable career strategy than hoping to be given the chance to learn it.
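The check itself is simple enough to script. A minimal sketch in Python of what "telnet to port 25" amounts to: open a TCP connection and look for the server's "220" greeting, the ready code defined by the SMTP standard. The hostname here is a placeholder, and the helper name is mine, not anything from the original interview.

```python
import socket

def smtp_is_up(host: str, port: int = 25, timeout: float = 5.0) -> bool:
    """Return True if an SMTP server answers with a 220 greeting.

    Mirrors the manual check: `telnet host 25` and watch for the
    banner. Per RFC 5321, a ready server greets with code 220.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.settimeout(timeout)
            banner = sock.recv(512).decode("ascii", errors="replace")
            return banner.startswith("220")
    except OSError:
        # Connection refused, timed out, or reset: the server is not up.
        return False

# Hypothetical usage; substitute your actual mail host:
# smtp_is_up("mail.example.com")
```

The point of the question, of course, was not the code but knowing that a mail server speaks a plain-text protocol you can poke by hand.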
From Netscape I moved to Telstra, working on the Bigpond mail rollout, scaling Netscape Mail Server for millions of Australians. My first real infrastructure project. It is difficult to overstate the satisfaction of building something that works for that many people, particularly when you've spent your life using things that weren't built to work for you.
Borrowed hands
Here is something you will not find in any corporate "disability in the workplace" brochure, and I suspect the HR departments of the world would prefer I didn't mention it: for most of my career, I have relied on the informal, undocumented, and entirely unofficial assistance of whoever happened to be nearby.
Infrastructure work involves physical machines. Racking servers, swapping drives, pulling cables. Tasks that require grip strength, fine motor precision, and the ability to contort yourself into spaces designed by people who assumed a standard-issue body. I do not have a standard-issue body.
So I improvised. If I needed hard drives screwed in, I'd recruit the receptionist. If a power supply needed swapping, I'd find someone with steadier hands than mine. There was no formal accommodation process, no HR ticket, no assistive technology solution. I identified the nearest available resource and deployed it. In data centre terms, they were my remote hands, years before "remote hands" was a service you could bill for.
This was a deliberate choice, and I'd make it again. I built my career on being the person who solves problems, not the person who generates paperwork about them. The distinction matters more than most people realise.
Going solo
In 2002, the dot-com bust arrived with the subtlety of a power outage and I was made redundant from a company called Dingo Blue. The job market, to put it charitably, was inhospitable.
So I started my own business. Rainman IT. Fifteen clients, one engineer, and no safety net.
Running a one-person IT operation means you are the entire stack: sales, support, on-site service, and the person whose phone rings at midnight. When three clients have simultaneous emergencies (and they will, because infrastructure has a gift for coordinated failure) there is nobody to escalate to. You are the escalation.
Add to this the perpetual logistics problem of needing someone at every client site to handle the physical tasks I couldn't do alone, and you begin to understand why I developed a certain efficiency with human resources. The clients never seemed troubled by the arrangement. They cared about uptime, not about who was holding the screwdriver.
Twenty years and the quiet revelation
In 2006 I took a role at CPS Systems, the parent company of several businesses including Superchoice Services, where I still work today. Twenty years. In an industry where the median tenure hovers around two or three years, this requires either extraordinary loyalty or extraordinary stubbornness. I leave it to the reader to decide which.
When I started, it was physical machines running a couple of monolithic Java processes. Now it's hundreds of microservices on Red Hat OpenShift, monitored with Zabbix and Sumo Logic, orchestrated through Azure. The technology transformed completely. But the most important shift had nothing to do with container orchestration.
When COVID arrived and the world was sent home, my productivity didn't just hold. It climbed. Measurably. The reason was not, as management consultants might suggest, the elimination of commute time. It was the elimination of something I hadn't fully understood I was carrying.
Working from home meant an environment configured for my body. Proper ergonomics, the right chair, the right desk height, everything positioned for the body I actually have, rather than the body office furniture assumes. The physical strain I'd been absorbing for fifteen years simply stopped. I hadn't noticed how loud it was until it went quiet.
And something else went quiet too.
My cerebral palsy affects how I walk. To a stranger on a Sydney street, it can look like intoxication. For years, for decades if I'm being precise, I walked through the CBD to and from work while people stared and tried to compute what they were seeing. Twice a day, every working day, for the better part of two decades.
I never tallied the cost. You don't, when it's constant. It's like asking a fish to describe water. It was only after months of working from home that I noticed a low-grade tension I'd been carrying for as long as I could remember had quietly dissolved. It turned out that being misread by strangers on a daily basis is not, in fact, a neutral experience. It is corrosive in a way that accumulates so slowly you mistake it for normal.
"Disabled people had been requesting remote work as a reasonable accommodation for years. Then a virus arrived and every company on earth figured it out in a fortnight. The technology hadn't changed. The willingness had."
The bottleneck breaks
When ChatGPT dropped, I was on it early. Invite-only access, before the world caught up. It was impressive, in the way that a very good parlour trick is impressive.
Claude was different. Claude Code and Opus were different. These were not parlour tricks. These were tools that could hold the architecture of a problem in their context and help me build the solution. And that changed everything.
For thirty years, from BASIC on the Commodore 64 to C plugins to infrastructure at scale, every line of code I produced had a physical cost. My brain could architect the system, but my fingers had to execute every keystroke. The gap between what I could conceive and what I could ship was always, ultimately, a question of what my body could sustain.
AI closed that gap. Not partially. Fundamentally.
I can now build systems that require hundreds or thousands of lines of code without the physical toll of typing every character. I describe what I want, iterate, refine, working at the speed of thought rather than the speed of my hands. The bottleneck that had defined the boundaries of my output for three decades simply ceased to exist.
"For a non-disabled developer, AI coding tools are a productivity enhancement. For me, they are an accessibility breakthrough that nobody in the breathless discourse about AI seems to have noticed."
The entire conversation is about efficiency, job displacement, and whether machines will replace programmers. Almost nobody is asking the question that matters to me: what happens when the people whose bodies were the limitation can finally build at the scale of their ideas?
The first thing I built
When the bottleneck broke, the first thing I built was not a side project, not a tool for my employer, not a portfolio piece. It was a platform for my community.
It's called The Static. It's live. It connects people with cerebral palsy for one-on-one peer mentoring.
The need is straightforward and the existing provision is, to be diplomatic, inadequate. In online disability communities (Facebook groups, forums, the usual suspects) people attempt to share vulnerable, real-life experience. They ask for advice about relationships, about pain management, about navigating an employer who doesn't understand. And they are shouted down, trolled, or subjected to the particular cruelty that anonymous audiences reserve for visible vulnerability.
The Static removes the audience. One-on-one connections. No performance, no trolls, no comments section. Just two people with shared experience, building a relationship that might actually be useful.
I'm currently in the Launcher program at Remarkable, one of Australia's most respected disability startup accelerators. The ambition is to make this work at scale.
Here is what I want you to understand: I have been building software professionally for over twenty-five years. I could not have built this platform alone before AI. The physical cost of producing that volume of code would have made it impractical. AI didn't make me a better architect. I've been doing this since I was copying BASIC out of magazines. It made me a possible one.
Why this newsletter exists
I have spent my career building infrastructure, which means I have spent my career developing a finely calibrated sense for the difference between systems that work and systems that merely demo well. I can read a vendor pitch the way a sommelier reads a wine list: the interesting information is in what they're not telling you.
I also know what it means to need technology to function. Not as a lifestyle upgrade, but as the difference between participation and exclusion. The typewriter, the word processor, the BBS, the remote desktop, the AI coding assistant: for me, each of these was an accessibility tool before it was anything else. I did not have the luxury of evaluating them as novelties.
The AI and robotics wave currently underway is, and I choose this word carefully, a watershed for people with disabilities. Not because the technology is perfect. It is riddled with gaps, assumptions, and the fingerprints of designers who have never had to use their own products under duress. But because for the first time in my working life, the distance between what disabled people can imagine and what they can execute is collapsing at speed.
Most of the coverage, however, is noise. Vendor press releases repackaged as journalism. Demo videos that wouldn't survive ten minutes with an actual user. Products priced for investors rather than for the people who need them. And in Australia, a funding system (the NDIS) that requires a postgraduate degree in bureaucratic navigation to access effectively.
A11y Signals exists because somebody needs to cut through this, and I appear to be the person who is sufficiently irritated to do it. Each week I'll track what's genuinely moving at the intersection of AI, robotics, and disability. I'll test the tools myself. I'll identify the hype. And I'll tell you what you can actually do, this week, with what's actually available.
No padding. No sponsored takes dressed up as editorial. No inspirational narratives about overcoming. Just an infrastructure architect's honest assessment, from someone who has been navigating technology differently for thirty years, and who can tell you with some authority when a system is down.