I wrote a book about service workers. It’s called Going Offline. It was first published by A Book Apart in 2018. Now it’s available to read for free online.
Needless to say the web book works offline. Once you go to goingoffline.adactio.com you can add it to the homescreen of your mobile device or add it to the dock on your Mac. After that, you won’t need a network connection.
The book is free to read. Properly free. Not the kind of “free” where you have to supply an email address first. Why would I make you go to the trouble of generating a burner email account?
The site has no analytics. No tracking. No third-party scripts of any kind whatsoever. By complete coincidence, the site is fast. Funny that.
For the styling of this web book, I tweaked the stylesheet I used for HTML5 For Web Designers. I updated it a little bit to use logical properties, some fluid typography and view transitions.
In the process of converting the book to HTML, I got reacquainted with what I had written almost seven years ago. It was kind of fun to approach it afresh. I think it stands up pretty darn well.
Some buzzwords are useful. They take a concept that would otherwise require a sentence of explanation and package it up into a single word or phrase. Back in the day, “ajax” was a pretty good buzzword.
Some buzzwords are worse than useless. This is when a word or phrase lacks definition. You could say this buzzword in a meeting with five people, and they’d all understand five different meanings. Back in the day, “web 2.0” was a classic example of a bad buzzword—for some people it meant a business model; for others it meant rounded corners and gradients.
The worst kind of buzzwords are the ones that actively set out to obfuscate any actual meaning. “The cloud” is a classic example. It sounds cooler than saying “a server in Virginia”, but it also sounds like the exact opposite of what it actually is. Great for marketing. Terrible for understanding.
“AI” is definitely not a good buzzword. But I can’t quite decide if it’s merely a bad buzzword like “web 2.0” or a truly terrible buzzword like “the cloud”.
The biggest problem with the phrase “AI” is that there’s a name collision.
For years, the term “AI” has been used in science fiction. HAL 9000. Skynet. Examples of artificial general intelligence.
Now the term “AI” is also used to describe large language models. But there is no connection between this use of the term “AI” and the science fictional usage.
This leads to the ludicrous situation of otherwise-rational people wanting to discuss the dangers of “AI”, but instead of talking about the rampant exploitation and energy usage endemic to current large language models, they want to spend the time talking about the sci-fi scenarios of runaway “AI”.
To understand how ridiculous this is, I’d like you to imagine if we had started using a different buzzword in another setting…
Suppose that when ride-sharing companies like Uber and Lyft were starting out, they had decided to label their services as Time Travel. From a marketing point of view, it even makes sense—they get you from point A to point B lickety-split.
Now imagine if otherwise-sensible people began to sound the alarm about the potential harms of Time Travel. Given the explosive growth we’ve seen in this sector, sooner or later they’ll be able to get you to point B before you’ve even left point A. There could be terrible consequences from that—we’ve all seen the sci-fi scenarios where this happens.
Meanwhile the actual present-day harms of ride-sharing services around worker exploitation would be relegated to the sidelines. Clearly that isn’t as important as the existential threat posed by Time Travel.
It sounds ludicrous, right? It defies common sense. Just because a vehicle can get you somewhere fast today doesn’t mean it’s inevitably going to be able to break the laws of physics any day now, simply because it’s called Time Travel.
And yet that is exactly the nonsense we’re being fed about large language models. We call them “AI”, we look at how much they can do today, and we draw a straight line to what we know of “AI” in our science fiction.
This ridiculous situation could’ve been avoided if we had settled on a more accurate buzzword like “applied statistics” instead of “AI”.
It’s almost as if the labelling of the current technologies was more about marketing than accuracy.
I went to FFConf on Friday. It did me the world of good.
To be honest, I haven’t much felt like venturing out over the past few days since my optimism took a big hit. But then when I do go and interact with people, I’m grateful for it.
Like, when I went out to my usual Wednesday evening traditional Irish music session I was prepared for the inevitable discussion of Trump’s election. I was ready to quite clearly let people know that I didn’t want to talk about it. But I didn’t have to. Maybe because everyone else was feeling much the same, we just played and played. It was good.
Still, I was ready for the weekend and I wasn’t really feeling psyched up for FFConf on Friday. But once I got there, I was immediately uplifted.
It was so nice to see so many people I hadn’t seen in quite a while. I had the chance to reconnect with people that I had only been hearing from through my RSS reader:
“Terence, I’m really enjoying your sci-fi short stories!”
“Kirsty, I was on tenterhooks when you were getting Mabel!”
(Mabel is an adorable kitty-cat. In hindsight I probably should’ve also congratulated her on getting married. To a human.)
The talks were really good this year. They covered a wide variety of topics.
There was only one talk about “AI” (unlike most conferences these days, where it dominates the agenda). Léonie gave a superb run-down of the different kinds of machine learning and how they can help or hinder accessibility.
Crucially, Léonie began her talk by directly referencing the exploitation and energy consumption inherent in today’s large language models. It took all of two minutes, but it was two minutes more than the whole day of talks at UX Brighton. Thank you, Léonie!
Some of the other talks covered big topics. Life. Death. Meaning. Purpose.
I enjoyed them all, though I often find something missing from discussions about meaning and purpose. Just about everyone agrees that having a life infused with purpose is what provides meaning. So there’s an understandable quest to seek out what it is that gives you purpose.
But we’re also constantly reminded that every life has intrinsic meaning. “You are enough”, not “you are enough, as long as there’s some purpose to your life.”
I found myself thinking about Winnie Lim’s great post on leading a purposeless life. I think about it a lot. It gives me comfort. Instead of assuming that your purpose is out there somewhere and you’ve got to find it, you can entertain the possibility that your life might not have a purpose …and that’s okay.
I know this all sounds like very heavy stuff, but it felt good to be in a room full of good people grappling with these kinds of topics. I needed it.
Reading Circe kicked off a bit of a reading spree for me. I sought out other retellings of Greek myths. There’s no shortage of good books out there from Pat Barker, Natalie Haynes, Jennifer Saint, Claire Heywood, Claire North, and more.
The obvious difference between these retellings and the older accounts by Homer, Ovid and the lads is the re-centring of the women in these stories. There’s a rich seam of narratives to be mined between the lines of the Greek myths.
But what’s fascinating to me is to see how these modern interpretations differ from one another. Sometimes I’ll finish one book, then pick up another that tells the same story from a very different angle.
The biggest difference I’ve noticed is the presence or absence of supernatural intervention. Some of these writers tell their stories with gods and goddesses front and centre. Others tell the very same stories as realistic accounts without any magic.
Take Perseus. Please.
The excellent Stone Blind by Natalie Haynes tells the story of Medusa. There’s magic aplenty. In fact, Perseus himself is little more than a clueless bumbler who wouldn’t last a minute without divine intervention.
The Shadow Of Perseus by Claire Heywood also tells Medusa’s story. But this time there’s no magic whatsoever. The narrative is driven not by gods and goddesses, but by the force of toxic masculinity.
Pat Barker tells the story of the Trojan war in her Women Of Troy series. She keeps it grounded and gritty. When Natalie Haynes tells the same story in A Thousand Ships, the people in it are little more than playthings of the gods.
Then there are the books with just a light touch of the supernatural. While Madeline Miller’s Circe was necessarily imbued with magic, her first novel The Song Of Achilles keeps it mostly under wraps. The supernatural is there, but it doesn’t propel the narrative.
Claire North has a trilogy of books called the Songs of Penelope, retelling the Odyssey from Penelope’s perspective (like Margaret Atwood did in The Penelopiad). On the face of it, these seem to fall on the supernatural side; each book is narrated by a different deity. But the gods are strangely powerless. Everyone believes in them, but they themselves behave in a non-interventionist way. As though they didn’t exist at all.
It makes me wonder what it would be like to have other shared myths retold with or without magic.
How would the Marvel universe look if it were grounded in reality? Can you retell Harry Potter as the goings-on at a cult school for the delusional? What would Star Wars be like without the Force? (although I guess Andor already answers that one)
Anyway, if you’re interested in reading some modern takes on Greek myths, any of the books mentioned above would be a good place to start.
The quality of the presentations was really good this year, probably the best yet. Usually there are one or two stand-out speakers (like Tom Kerwin last year), but this year, the standard felt very high to me.
But…
The theme of the conference was UX and “AI”, and I’ve never been more disappointed by what wasn’t said at a conference.
Not a single speaker addressed where the training data for current large language models comes from (it comes from scraping other people’s copyrighted creative works).
Not a single speaker addressed the energy requirements for current large language models (the requirements are absolutely mahoosive—not just for the training, but for each and every query).
My charitable reading of the situation yesterday was that every speaker assumed that someone else would cover those issues.
The less charitable reading is that this was a deliberate decision.
Whenever the issue of ethics came up, it was only ever in relation to how we might use these tools: considering user needs, being transparent, all that good stuff. But never once did the question arise of whether it’s ethical to even use these tools.
In fact, the message was often the opposite: words like “responsibility” and “duty” came up, but only in the admonition that UX designers have a responsibility and duty to use these tools! And if that carrot didn’t work, there’s always the stick of scaring you into using these tools for fear of being left behind and having a machine replace you.
I was left feeling somewhat depressed about the deliberately narrow focus. Maggie’s talk was the only one that dealt with any externalities, looking at how the firehose of slop is blasting away at society. But again, the focus was only ever on how these tools are used or abused; nobody addressed the possibility of deliberately choosing not to use them.
If audience members weren’t yet using generative tools in their daily work, the assumption was that they were lagging behind and it was only a matter of time before they’d get on board the hype train. There was no room for the idea that someone might examine the roots of these tools and make a conscious choice not to fund their development.
Always design a thing by considering it in its next larger context. A chair in a room, a room in a house, a house in an environment, an environment in a city plan.
But none of the speakers at UX Brighton chose to examine the larger context of the tools they were encouraging us to use.
One speaker told us “Be curious!”, but clearly that curiosity should not extend to the foundations of the tools themselves. Ignore what’s behind the curtain. Instead look at all the cool stuff we can do now. Don’t worry about the fact that everything you do with these tools is built on a bedrock of exploitation and environmental harm. We should instead blithely build a new generation of user interfaces on the burial ground of human culture.
Whenever I get into a discussion about these issues, it always seems to come back ’round to whether these tools are actually any good or not. People point to the genuinely useful tasks they can accomplish. But that’s not my issue. There are absolutely smart and efficient ways to use large language models—in some situations, it’s like suddenly having a superpower. But as Molly White puts it:
The benefits, though extant, seem to pale in comparison to the costs.
There are no ethical uses of current large language models.
And if you believe that the ethical issues will somehow be ironed out in future iterations, then that’s all the more reason to stop using the current crop of exploitative large language models.
Anyway, like I said, all the talks at UX Brighton were very good. But I just wish just one of them had addressed the underlying questions that any good UX designer should ask: “Where did this data come from? What are the second-order effects of deploying this technology?”
Having a talk on those topics would’ve been nice, but I would’ve settled for having five minutes of one talk, or even one minute. But there was nothing.
There’s one possible explanation for this glaring absence that’s quite depressing to consider. It may be that these topics weren’t covered because there’s an assumption that everybody already knows about them, and frankly, doesn’t care.
To use an outdated movie reference, imagine a raving Charlton Heston shouting that “Soylent Green is people!”, only to be met with indifference. “Everyone knows Soylent Green is people. So what?”
UX London isn’t the only event from Clearleft coming your way in 2025. There’s a brand new spin-off event dedicated to user research happening in February. It’s called Research By The Sea.
I’m not curating this one, though I will be hosting it. The curation is being carried out most excellently by Benjamin, who has written more about how he’s doing it:
We’ve invited some of the best thinkers and doers from the research space to explore how researchers might respond to today’s most gnarly and pressing problems. They’ll challenge current perspectives, tools, practices and thinking styles, and provide practical steps for getting started today to shape a better tomorrow.
If that sounds like your cup of tea, you should put February 27th 2025 in your calendar and grab yourself a ticket.
Although I’m not involved in curating the line-up for the event, I offered Benjamin my swor… my web dev skillz. I made the website for Research By The Sea and I really enjoyed doing it!
I felt like I was truly designing in the browser. Adjusting spacing, playing around with layout, and all that squishy stuff. Some of the best results came from happy accidents—the way that certain elements behaved at certain screen sizes would lead me into little experiments that yielded interesting results.
I took the same approach with Research By The Sea. I had a design language to work with, based on UX London, but with more of a playful, brighter feel. The idea was that the website (and the event) should feel connected to UX London, while also being its own thing.
I kept the typography of the UX London site more or less intact. The page structure is also very similar. That was my foundation. From there I was free to explore some other directions.
I took the opportunity to explore some new features of CSS. But before I talk about the newer stuff, I want to mention the bits of CSS that I don’t consider new. These are the things that are just the way things are done ‘round here.
Custom properties. They’ve been around for years now, and they’re such a life-saver, especially on a project like this where I’m messing around with type, colour, and spacing. Even on a small site like this, it’s still worth having a section at the start where you define your custom properties.
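Here’s a minimal sketch of the kind of section I mean (the names and values are illustrative, not the site’s actual styles):

:root {
  --color-brand: #0b7261;
  --space-s: 0.5rem;
  --space-m: 1rem;
}
.button {
  background-color: var(--color-brand);
  padding: var(--space-s) var(--space-m);
}

Change a value in that one section and it cascades everywhere the property is used.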
Logical properties. Again, they’ve been around for years. At this point I’ve trained my brain to use them by default. Now when I see a left, right, width or height in a style sheet, it looks like a bug to me.
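To illustrate with a hypothetical snippet, the physical properties on top are what read as bugs to me now; the logical equivalents below adapt to the writing mode and direction of the content:

/* Physical: */
.card {
  margin-top: 1rem;
  width: 100%;
}
/* Logical: */
.card {
  margin-block-start: 1rem;
  inline-size: 100%;
}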
Fluid type. It’s kind of a natural extension of responsive design to me. If a website’s typography doesn’t adjust to my viewport, it feels slightly broken. On this project I used Utopia because I wanted different type scales as the viewport increased. On other projects I’ve just used one clamp declaration on the body element, which can also get the job done.
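For the record, the single-declaration approach can be as simple as this (the numbers are illustrative; Utopia generates a more considered set of scales):

body {
  font-size: clamp(1rem, 0.75rem + 1vw, 1.25rem);
}

The font size never drops below 1rem, never exceeds 1.25rem, and scales smoothly with the viewport in between.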
Okay, so those are the things that feel standard to me. So what could I play around with that was new?
View transitions. So easy! Just point to an element on two different pages and say “Hey, do a magic move!” You can see this in action with the logo as you move from the homepage to, say, the venue page. I’ve also added view transitions to the speaker headshots on the homepage so that when you click through to their full page, you get a nice swoosh.
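In CSS it really is that terse. A minimal sketch, assuming a .logo selector (the actual class names on the site may differ): both pages opt in to cross-document view transitions, and any element sharing a view-transition-name gets the magic move:

@view-transition {
  navigation: auto;
}
.logo {
  view-transition-name: logo;
}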
Unless, like me, you’re using Firefox. In that case, you won’t see any view transitions. That’s okay. They are very much an enhancement. Speaking of which…
Scroll-driven animations. You’ll only get these in Chromium browsers right now, but again, they’re an enhancement. I’ve got multiple background images—a bunch of cute SVG shapes. I’m using scroll-driven animations to change the background positions and sizes as you scroll. It’s a bit silly, but hopefully kind of cute.
You might be wondering how I calculated the movements of each background image. Good question. I basically just messed around with the values. I had fun! But imagine what an actually-skilled interaction designer could do.
That brings up an interesting observation about both view transitions and scroll-driven animations: Figma will not help you here. You need to be in a web browser with dev tools popped open. You’ve got to roll up your sleeves and get your hands into the machine. I know that sounds intimidating, but it’s also surprisingly enjoyable and empowering.
Oh, and I made sure to wrap both the view transitions and the scroll-driven animations in a prefers-reduced-motion: no-preference @media query.
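Putting those last two pieces together, a simplified sketch of the pattern looks something like this (the keyframes are illustrative; the real site nudges several background images around):

@media (prefers-reduced-motion: no-preference) {
  @supports (animation-timeline: scroll()) {
    body {
      animation: drift linear both;
      animation-timeline: scroll(root);
    }
    @keyframes drift {
      from { background-position: 50% 0; }
      to { background-position: 50% 100%; }
    }
  }
}

The @supports rule stops non-supporting browsers from running drift as a regular zero-duration animation, and anyone who prefers reduced motion never gets it at all.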
I’m pleased with how the website turned out. It feels fun. More importantly, it feels fast. There is zero JavaScript. That’s the main reason why it’s very, very performant (and accessible).
Smooth transitions across pages; smooth animations as you scroll: it’s great what you can do with just HTML and CSS.
It’ll be back in CodeNode. That’s the venue we tried for the first time this year and it worked out really well.
You can look forward to three days of UX talks and workshops:
Tuesday, June 10th is Discovery Day—user research, content strategy, and planning.
Wednesday, June 11th is Design Day—interaction design, accessibility, and interface design.
Thursday, June 12th is Delivery Day—iteration, design ops, and cross-team collaboration.
I realise that the alliteration of discovery, design, and delivery is a little forced but you get the idea. The flow of the event will follow the process of a typical design project.
The best way to experience UX London is to come for all three days, but each day also works as a standalone event.
I’m now starting the process of curating the line-up for each day: a mix of inspiring talks and hands-on workshops. If you trust me, you can get your ticket already at the super early-bird price.
Now, I’ll be up-front here: if you’re a typical white dude like me, you’re not going to be top of the pile. My priority for UX London is creating a diverse line-up of speakers.
So if you’re not a typical white dude like me and you’ve ever thought about giving a conference talk, fill out that form!
If you don’t fancy speaking, but you want to see your company represented at UX London, check out our sponsorship options.
If you don’t want to speak and you don’t want to sponsor, but you want to be at the best design conference of 2025, get your ticket now.
The use-case is slightly different—this is about personal archives, like paperwork, screenshots, and bookmarks. But we both came up with the same process:
I’m deliberately going low-scale, low-tech. There’s no web server, no build system, no dependencies, and no JavaScript frameworks.
And we share the same hope:
Because this system has no moving parts, and it’s just files on a disk, I hope it will last a long time.
You should read the whole thing, where Alex describes all the other approaches they took before settling on plain ol’ HTML files in a folder:
HTML is low maintenance, it’s flexible, and it’s not going anywhere. It’s the foundation of the entire web, and pretty much every modern computer has a web browser that can render HTML pages. These files will be usable for a very long time – probably decades, if not more.
I’m enjoying this approach, so I’m going to keep using it. What I particularly like is that the maintenance burden has been essentially zero – once I set up the initial site structure, I haven’t had to do anything to keep it working.
They also talk about digital preservation:
I’d love to see static websites get more use as a preservation tool.
I concur! And it’s particularly interesting for Alex to be making this observation in the context of working with the Flickr Foundation. That’s where they’re experimenting with the concept of a data lifeboat.
If you say content-visibility: auto you’re telling the browser not to bother calculating the layout and paint for an element until it needs to. But you need to combine it with the contain-intrinsic-block-size property so that the browser knows how much space to leave for the element.
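In practice it’s a two-line addition. A minimal sketch (the selector and the size estimate are hypothetical):

.long-section {
  content-visibility: auto;
  /* A rough guess at how tall the section will be once rendered: */
  contain-intrinsic-block-size: 20em;
}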
I mentioned the browser support:
Right now content-visibility is only supported in Chrome and Edge. But that’s okay. This is a progressive enhancement. Adding this CSS has no detrimental effect on the browsers that don’t understand it (and when they do ship support for it, it’ll just start working).
But …I think I’ve discovered a little bug in Safari’s implementation.
(I say I think it’s a bug with the browser because, like Jim, I’ve made the mistake in the past of thinking I had discovered a browser bug when in fact it was something caused by a browser extension. And when I say “in the past”, I mean yesterday.)
So here’s the issue: if you apply content-visibility: auto to an element that contains an SVG, and that SVG contains a text element, then Safari never paints that text to the screen.
To see an example, take a look at the fourth setting of Cooley’s reel on The Session archive. There’s a text element with the word “slide” (actually the text is inside a tspan element inside a text element). On Safari, that text never shows up.
I’m using a link to the archive of The Session I created recently rather than the live site because on the live site I’ve removed the content-visibility declaration for Safari until this bug gets resolved.
I’ve also created a reduced test case on Codepen. The only HTML is the element containing the SVGs. The only CSS—apart from the content-visibility stuff—is just a little declaration to push the content below the viewport so you have to scroll it into view (which is when the bug happens).
I’m happy to report that after upgrading to Sequoia, the latest version of macOS, the bug has been fixed! Excellent!
Not only that, but there’s another really great little improvement…
Let’s say you’ve installed a website like The Session by adding it to the dock. Now let’s say you get an email in Apple Mail that includes a link to something on The Session. It used to be that clicking on that link would open it in your default web browser. But now clicking on that link opens it in the installed web app!
It’s a lovely little enhancement that makes the installed website truly feel like a native app.
I wonder if there’s much point using wrappers like Electron any more? I feel like they were mostly aiming to get that parity with native apps in having a standalone application launched from the dock.
Now all you need is a website.
The biggest issue remains discovery. Unless you already know that it’s possible to add a website to the dock, you’re unlikely to find out about it. That’s why I’ve got a page with installation instructions on The Session.
Still, the discovery possibilities on Apple’s desktop devices are waaaaay better than on Apple’s mobile devices.
Apple are doing such great work on their desktop operating system to make websites first-class citizens. Meanwhile, they’re doing less than nothing on their mobile operating system. For a while there, they literally planned to break all websites added to the homescreen. Fortunately they were forced to back down.
But it’s still so sad to see how Apple are doing everything in their power to prevent people from finding out that you can add websites to your homescreen—despite (or perhaps because of) the fact that push notifications on iOS only work if the website has been added to the home screen!
I always like getting together with Remy. We usually end up discussing sci-fi books we’re reading, commiserating with one another about conference-organising, discussing the minutiae of browser APIs, or talking about the big-picture vision of the World Wide Web.
On this train ride we ended up talking about the march of time and how death comes for us all …and our websites.
Take The Session, for example. It’s been running for two and a half decades in one form or another. I plan to keep it running for many more decades to come. But I’m the weak link in that plan.
If I get hit by a bus tomorrow, The Session will keep running. The hosting is paid up for a while. The domain name is registered for as long as possible. But inevitably things will need to be updated. Even if no new features get added to the site, someone’s got to install updates to keep the underlying software safe and secure.
Remy and I discussed the long-term prospects for widening out the admin work to more people. But we also discussed smaller steps I could take in the meantime.
Like, there’s the actual content of the website. Now, I currently share exports from the database every week in JSON, CSV, and SQLite. That’s good. But you need to be a tech nerd to do anything useful with that data.
The more I talked about it with Remy, the more I realised that HTML would be the most useful format for the most people.
There’s a cute acronym in the world of digital preservation: LOCKSS. Lots Of Copies Keep Stuff Safe. If there were multiple copies of The Session’s content out there in the world, then I’d have a nice little insurance policy against some future catastrophe befalling the live site.
With the seed of the idea planted in my head, I waited until I had some time to dive in and see if this was doable.
Fortunately I had plenty of opportunity to do just that on some other train rides. When I was in Spain and France recently, I spent hours and hours on trains. For some reason, I find train journeys very conducive to coding, especially if you don’t need an internet connection.
By the time I was back home, the code was done. Here’s the result:
If you want to grab a copy for yourself, go ahead and download this .zip file. Be warned that it’s quite large! The .zip file is over two gigabytes in size and the unzipped collection of web pages is almost ten gigabytes. I plan to update the content every week or so.
I’ve put a copy up on Netlify and I’m serving it from the subdomain archive.thesession.org if you want to check out the results without downloading the whole thing.
Because this is a collection of static files, there’s no search. But you can use your browser’s “Find in Page” feature to search within the (very long) index pages of each section of the site.
You don’t need a web server to click around between the pages: they should all work straight from your file system. Double-clicking any HTML file should give you a starting point.
I wanted to reduce the dependencies on each page to as close to zero as I could. All the CSS is embedded in the page. Likewise with most of the JavaScript (you’ll still need an internet connection to get audio playback and dynamic maps). This keeps the individual pages nice and self-contained. That means they can be shared around (as an email attachment, for example).
I’ve shared this project with the community on The Session and people are into it. If nothing else, it could be handy to have an offline copy of the site’s content on your hard drive for those situations when you can’t access the site itself.
One of the perks of speaking at conferences is that I get to travel to new and interesting places. I’d say that most of my travel over the past couple of decades was thanks to conferences. Recently though, I’ve been going places for non-work related reasons.
A couple of weeks ago I was in Spain, making my way to the beautiful medieval town of Cáceres for a traditional Irish music festival there. This was the second year that Jessica and I have been.
It’s kind of perfect. Not only is it a beautiful location—the stand-in for King’s Landing in House Of The Dragon—but there are non-stop sessions late into the night, often outdoors. And of course the food is great.
It’s not easy to get to though. Last year we flew into Madrid and then took the train to Cáceres the next day. This year we did it slightly differently and flew into Seville instead. Then we took the four-hour train journey the next day. After the festival, we did it all in reverse.
That meant we had two evenings in Seville to sample its many tapas. On our last night in Seville, we had local guides. Blogger Dirk Hesse and his partner took us to all the best places. Dirk had seen that I was going to be in town and very kindly got in touch with an offer to meet up. I’m very glad we took him up on the offer!
Going to Spain in mid September felt like getting a last blast of Summer sun before returning to Autumn in England. The only downside was that the trip involved flying. But we’ve been on one more journey since then and that was done the civilised way, by train.
Jessica went to a translators’ conference in Strasbourg. I tagged along. We got the train from Brighton straight to Saint Pancras, where we got the Eurostar to Paris. From there it was a super fast connection straight to Strasbourg.
While Jessica was at her event all day, I was swanning around the beautiful streets, sampling the local wine and taking plenty of time to admire the details of Strasbourg’s awesome cathedral.
This seems to be the attitude of many of my fellow nerds—designers and developers—when presented with tools based on large language models that produce dubious outputs based on the unethical harvesting of other people’s work and requiring staggering amounts of energy to run:
This is the future! I need to start using these tools now, even if they’re flawed, because otherwise I’ll be left behind. They’ll only get better. It’s inevitable.
Whereas this seems to be the attitude of those same designers and developers when faced with stable browser features that can be safely used today without frameworks or libraries…
The Session goes through periods of getting spammed with automated sign-ups. I’m not sure why. It’s not like they do anything with the accounts. They’re just created and then they sit there (until I delete them).
In the past I’ve dealt with them in an ad-hoc way. If the sign-ups were all coming from the same IP addresses, I could block them. If the sign-ups showed some pattern in the usernames or emails, I could use that to block them.
Recently though, there was a spate of sign-ups that didn’t have any patterns, all coming from different IP addresses.
I decided it was time to knuckle down and figure out a way to prevent automated sign-ups.
I knew what I didn’t want to do. I didn’t want to put any obstacles in the way of genuine sign-ups. There’d be no CAPTCHAs or other “prove you’re a human” shite. That’s the airport security model: inconvenience everyone to stop a tiny number of bad actors.
The first step I took was the bare minimum. I added two form fields—called “wheat” and “chaff”—that are randomly generated every time the sign-up form is loaded. There’s a connection between those two fields that I can check on the server.
Here’s how I’m generating the fields in PHP:
// A secret string, never revealed on the client.
$saltstring = 'A string known only to me.';
// A random value, freshly generated on every page load.
$wheat = base64_encode(openssl_random_pseudo_bytes(16));
// A hash tying the secret and the random value together.
$chaff = password_hash($saltstring.$wheat, PASSWORD_BCRYPT);
See how the fields are generated from a combination of random bytes and a string of characters never revealed on the client? To keep it from going stale, this string—the salt—includes something related to the current date.
Now when the form is submitted, I can check to see if the relationship holds true:
if (!password_verify($saltstring.$_POST['wheat'], $_POST['chaff'])) {
// Spammer!
}
That’s just the first line of defence. After thinking about it for a while, I came to the conclusion that it wasn’t enough to just generate some random form field values; I needed to generate random form field names.
Previously, the names for the form fields were easily-guessable: “username”, “password”, “email”. What I needed to do was generate unique form field names every time the sign-up page was loaded.
Now I generate the form field names by hashing a random value—a one-time password generated for each page load—together with the known strings (“username”, “password”, “email”) and a salt string known only to me.
(Remember, the name—or the ID—of the form field makes no difference to semantics or accessibility; the accessible name is derived from the associated label element.)
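Here’s a rough sketch of the idea in PHP—not the exact code running on The Session, just an illustration of the technique:

// $saltstring is the same server-side secret as before.
// Generate a fresh one-time password for this page load.
$onetimepassword = base64_encode(openssl_random_pseudo_bytes(16));
$fieldnames = [];
foreach (['username', 'password', 'email'] as $field) {
    // Hashing produces a hex string that's safe to use as a name attribute.
    $fieldnames[$field] = 'f'.hash('sha256', $saltstring.$field.$onetimepassword);
}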
The one-time password also becomes a form field on the client.
If those form fields don’t exist, the sign-up is rejected.
As an added extra, I leave honeypot hidden form fields named “username”, “password”, and “email”. If any of those fields are filled out, the sign-up is rejected.
I put that code live and the automated sign-ups stopped straight away.
It’s not entirely foolproof. It would be possible to create an automated sign-up system that grabs the names of the form fields from the sign-up form each time. But this puts enough friction in the way to make automated sign-ups a pain.
You can view source on the sign-up page to see what the form fields are like.
I used the same technique on the contact page to prevent automated spam there too.
Hook up an input element with a datalist element using the list and id attributes and you’re done. You can even use a bit of Ajax to dynamically update the option elements inside the datalist in response to the user’s input. The browser takes care of all the interaction. If you try to roll your own combobox implementation, it’s almost certainly going to involve a lot of JavaScript and still probably won’t account for all use cases.
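The markup for the basic pattern is pleasingly small—something like this (the values are just for illustration):

<label for="tune">Tune name</label>
<input id="tune" name="tune" list="tunes">
<datalist id="tunes">
  <option value="Cooley's Reel">
  <option value="Drowsy Maggie">
  <option value="The Banshee">
</datalist>

As the user types, the browser offers the options as suggestions; swap out the option elements with a bit of Ajax and you’ve got a live lookup.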
Safari on iOS—and therefore all browsers on iOS—didn’t support datalist for quite a while. But once it finally shipped, it worked really nicely. The options showed up just like autocomplete suggestions above the keyboard.
But that broke a while back.
The suggestions still appeared, but if you tapped on one of them, nothing happened. The input element didn’t get updated. You had to tap on a little downward arrow inside the input in order to see the list of options.
That was really frustrating for anybody on iOS using The Session. By far the most common task on the site is searching for a tune, something that’s greatly (progressively) enhanced with a dynamically-updating datalist.
I just updated to iOS 18 specifically to see if this bug has been fixed, and it has:
Fixed updating the input value when selecting an option from a datalist element.
Hallelujah!
But now there’s some additional behaviour that’s a little weird.
As well as showing the options in the autocomplete list above the keyboard, Safari on iOS—and therefore all browsers on iOS—also pops up the options as a list (as if you had tapped on that downward arrow). If the list is more than a few options long, it completely obscures the input element you’re typing into!
I’m not sure if this is a bug or if it’s the intended behaviour. It feels like a bug, but I don’t know if I should file something.
For now, I’ve updated the datalist elements on The Session to only ever hold three option elements in order to minimise the problem. Seeing as the autosuggest list above the keyboard only ever shows a maximum of three suggestions anyway, this feels like a reasonable compromise.
I went along to this year’s State Of The Browser conference on Saturday. It was great!
Technically I wasn’t just an attendee. I was on the substitution bench. Dave asked if I’d be able to jump in and give my talk on declarative design should any of the speakers have to drop out. “No problem!”, I said. If everything went according to plan, I wouldn’t have to do anything. And if someone did have to pull out, I’d be the hero that sweeps in to save the day. Win-win.
As it turned out, everything went smoothly. All the speakers delivered their talks impeccably and the vibes were good.
Dave very kindly gave shout-outs to lots of other web conferences. Quite a few of the organisers were in the audience too. That offered me a nice opportunity to catch up with some of them, swap notes, and commiserate on how tough it is running an event these days.
Believe me, it’s tough.
One thing I confirmed is that other conference organisers are also experiencing last-minute ticket sales. This is something that happened with UX London this year. For most of the year, ticket sales were trickling along. Then in the last few weeks before the event we sold more tickets than we had sold in the six months previously.
Don’t get me wrong: I’m very happy we sold those tickets. But it was a very stressful few months before that. It felt like playing poker, holding on in the belief that those ticket sales would materialise.
Lots of other conferences are experiencing this. Front Conference had to cancel this year’s event because of the lack of ticket sales in advance. I know for a fact that some upcoming events are feeling the same squeeze.
When I was in Ireland I had a chat with a friend of mine who works at the Everyman Theatre in Cork. They’re experiencing something similar. So maybe it’s not related to the tech industry specifically.
Soon I’ll be gearing up to start curating the line-up for next year’s UX London (I’m very proud of this year’s event and it’s going to be tough to top it). I hope I won’t have to deal with the stress of late ticket sales, but I’m mentally preparing for it.
I’ve noticed a really strange justification from people when I ask them about their use of generative tools that use large language models (colloquially and inaccurately labelled as artificial intelligence).
I’ll point out that the training data requires the wholesale harvesting of creative works without compensation. I’ll also point out the ludicrously profligate energy use required not just for the training, but for the subsequent queries.
And here’s the thing: people will acknowledge those harms but they will justify their actions by saying “these things will get better!”
First of all, there’s no evidence to back that up.
If anything, as the well gets poisoned by their own outputs, large language models may well end up eating their own slop and getting their own version of mad cow disease. So this might be as good as they’re ever going to get.
And when it comes to energy usage, all the signals from NVIDIA, OpenAI, and others are that power usage is going to increase, not decrease.
But secondly, what the hell kind of logic is that?
It’s like saying “It’s okay for me to drive my gas-guzzling SUV now, because in the future I’ll be driving an electric vehicle.”
The logic is completely backwards! If large language models are going to improve their ethical shortcomings (which is debatable, but let’s be generous), then that’s all the more reason to avoid using the current crop of egregiously damaging tools.
You don’t get companies to change their behaviour by rewarding them for it. If you really want better behaviour from the purveyors of generative tools, you should be boycotting the current offerings.
I suspect that most people know full well that the “they’ll get better!” defence doesn’t hold water. But you can convince yourself of anything when everyone around you is telling you that this is the future, baby, and you’d better get on board or you’ll be left behind.
Every time you had an industry campaign against an asbestos ban, they used the same rhetoric. They focused on the potential benefits – cheaper spare parts for cars, cheaper water purification – and doing so implicitly assumed that deaths and destroyed lives were a low price to pay.
This is the same strategy that’s being used by those who today talk about finding productive uses for generative models without even so much as gesturing towards mitigating or preventing the societal or environmental harms.
…the utopian city of Omelas, whose prosperity depends on the perpetual misery of a single child.
Once citizens are old enough to know the truth, most, though initially shocked and disgusted, ultimately acquiesce to this one injustice that secures the happiness of the rest of the city.
It turns out that most people will blithely accept injustice and suffering not for a utopia, but just for some bland hallucinated slop.
Don’t get me wrong: I’m not saying large language models are without their uses. I love seeing what Simon and Matt are doing when it comes to coding. And large language models can be great for transforming content from one format to another, like transcribing speech into text. But the balance sheet just doesn’t add up.
Even as someone who has used them and found them helpful, it’s remarkable to see the gap between what they can do and what their promoters promise they will someday be able to do. The benefits, though extant, seem to pale in comparison to the costs.
Funnily enough, many build tools advertise their superior “Developer Experience” (DX). For my money, there’s no better DX than shipping code straight to the browser and not having to worry about some cryptic node_modules error in between.
Making websites without a build step is a gift to your future self. When you open that project six months or a year or two years later, there’ll be no faffing about with npm updates, installs, or vulnerabilities.
Need to edit the CSS? You edit the CSS. Need to change the markup? You change the markup.
It’s remarkably freeing. It’s also very, very performant.
If you’re thinking that your next project couldn’t possibly be made without a build step, let me tell you about a phrase I first heard in the indie web community: “Manual till it hurts”. It’s basically a two-step process:
Start doing what you need to do by hand.
When that becomes unworkable, introduce some kind of automation.
It’s remarkable how often you never reach step two.
I’m not saying premature optimisation is the root of all evil. I’m just saying it’s premature.
Start simple. Get more complex if and when you need to.
I’ve been on a sabbatical from work for the past six weeks.
At Clearleft, you’re eligible for a sabbatical after five years. For some reason I haven’t taken one until now, 19 years into my tenure at the agency. I am an idiot.
My six-week sabbatical has been lovely, alternating between travel and homebodying.
Belfast
The first week was spent in Belfast at the excellent Belfast Trad Fest. There were workshops in the morning, sessions in the afternoon, and concerts in the evening. Non-stop music!
This year’s event was a little bit special for me. The festival runs an excellent bursary sponsorship programme for young people who otherwise wouldn’t be able to attend:
The bursary secures a place for a young musician to attend and experience a week-long intensive and immersive summertime learning course of traditional music, song and dance and can be transformative.
Starting from today, and for the whole month of April, any donations made to The Session, which normally go towards covering the costs of running the site, will instead go towards sponsoring bursary places for this year’s Belfast Summer school.
Needless to say, I was thrilled! The Trad Fest team were very happy too—they very kindly gave me a media pass for the duration of the event, which meant I could go to any of the concerts for free. I made full use of this.
That said, one of the absolute highlights of the week wasn’t a concert, but a session. Piper Mick O’Connor and fiddler Sean Smyth led a session out at the American Bar one evening that was absolutely sublime. There was a deep respect for the music combined with a lovely laidback vibe.
Brighton
There was no shortage of sessions once Jessica and I returned from Belfast to Brighton. In fact, when we got the train back from Gatwick we hopped in a cab straight to a session instead of going home first. Can’t stop, won’t stop.
The weather hadn’t been great in Belfast, which was fine because we were mostly indoors. But once we got back to Brighton we were treated to a week of glorious sunshine.
Needless to say, Jessica did plenty of swimming. I even went in the ocean myself on one of the hottest days.
I also went into the air. Andy took me up in a light aircraft for a jolly jaunt over the south of England. We flew from Goodwood over the New Forest, and around the Isle of Wight where we landed for lunch. Literally a flying visit.
I can attest that Andy is an excellent pilot. No bumpy landings.
Cork
Our next sojourn took us back to the island of Ireland, but this time we were visiting the Republic. We spent a week in the mightiest of all the Irish counties, Cork.
Our friends Dan and Sue came over from the States and a whole bunch of us went on a road trip down to west Cork, a beautiful part of the country that I shamefully hadn’t visited before. Sue did a magnificent job navigating the sometimes tiny roads in a rental car, despite Dan being a nervous Nellie in the passenger seat.
We had a lovely couple of days in Glengarriff, even though the weather wasn’t great. On the way back to Cork city, we just had to stop off in Baltimore—Dan and Sue live in the other Baltimore. I wasn’t prepared for the magnificent and rugged coastline (quite different to its Maryland counterpart).
Boston
We were back in Brighton for just one day before it was time for us to head to our next destination. We flew to Boston and spent a few days hanging around in Cambridge with our dear friends Ethan and Liz. It was a real treat to just pass the time with good people. It had been far too long.
I did manage to squeeze in an Irish music session in the legendary Druid pub. ’Twas a good night.
After all the excitement of Frostapalooza, Jessica and I went on to spend a week decompressing in Saint Augustine, Florida.
We went down to the beach every day. We went in the water most days. Sometimes the water was a bit too choppy for a proper swim, but it was still lovely and warm. And there was one day when the water was just perfectly calm.
When we weren’t on the beach, we were probably eating shrimp.
It was all very relaxing.
Brighton
I’ve spent the sixth and final week of my sabbatical back in Brighton. The weather has remained good so there’s been plenty of outdoor activities, including a kayaking trip down the river Medway in Kent. I may have done some involuntary wild swimming at one point.
I have very much enjoyed these past six weeks. Music. Travel. Friends. It’s all been quite lovely.
It all started back in July of last year when I got an email from Brad:
Next summer I’m turning 40, and I’m going to use that milestone as an excuse to play a big concert with and for all of my friends and family. It’ll sorta be like The Last Waltz, but with way more web nerds involved.
Originally it was slated for July of 2024, which was kind of awkward for me because it would clash with Belfast Trad Fest but I said to mark me down as interested. Then when the date got moved to August of 2024, it became more doable. I knew that Jessica and I would be making a transatlantic trip at some point anyway to see her parents, so we could try to combine the two.
In fact, the tentative plans we had to travel to the States in April of 2024 for the total solar eclipse ended up getting scrapped in favour of Brad’s shindig. That’s right—we chose rock’n’roll over the cosmic ballet.
Over the course of the last year, things began to shape up. There were playlists. There were spreadsheets. Dot voting was involved.
Anyone with any experience of playing live music was getting nervous. It’s hard enough to rehearse and soundcheck for a four piece, but Brad was planning to have over 40 musicians taking part!
We did what we could from afar, choosing which songs to play on, recording our parts and sending them on to Brad. Meanwhile Brad was practicing like hell with the core band. With Brad on bass and his brother Ian on drums for the whole night, we knew that the rhythm section would be tight.
A few months ago we booked our flights. We’d fly into Boston first to hang out with Ethan and Liz (it had been too long!), then head down to Pittsburgh for Frostapalooza before heading on to Florida to meet up with Jessica’s parents.
When we got to Pittsburgh, we immediately met up with Chris and together we headed over to Brad’s for a rehearsal. We’d end up spending a lot of time playing music with Chris over the next couple of days. I loved every minute of it.
The evening before Frostapalooza, Brad threw a party at his place. It was great to meet so many of the other musicians he’d roped into this.
Then it was time for the big day. We had a whole afternoon to soundcheck, but we needed it. Drums, a percussion station, a horn section …not to mention all the people coming and going on different songs. Fortunately the tech folks at the venue were fantastic and handled it all with aplomb.
We finished soundchecking around 5:30pm. Doors were at 7pm. Time to change into our rock’n’roll outfits and hang out backstage getting nervous and excited.
I wasn’t playing on the first few songs so I got to watch the audience’s reaction as they realised what was in store. Maybe they thought this would be a cute gathering of Brad and his buddies jamming through some stuff. What they got was an incredibly tight powerhouse of energy from a seriously awesome collection of musicians.
I had the honour of playing on five songs over the course of the night. I had an absolute blast! But to be honest, I had just as much fun being in the audience dancing my ass off.
Oh, I was playing mandolin. I probably should’ve mentioned that.
The first song I played on was The Weight by The Band. There was a real Last Waltz vibe as Brad’s extended family joined him on stage, along with me and Chris.
Later I hopped on stage as one excellent song segued into another—Maps by Yeah Yeah Yeahs.
I’ve loved this song since the first time I heard it. In the dot-voting rounds to figure out the set list, this was my super vote.
You know the way it starts with that single-note tremolo on the guitar? I figured that would work on the mandolin. And I know how to tremolo.
Jessica was on bass. Jessi Hall was on vocals. It. Rocked.
I stayed on stage for Radiohead’s The National Anthem complete with horns, musical saw, and two basses played by Brad and Jessica absolutely killing it. I added a little texture over the singing with some picked notes on the mandolin.
Then it got truly epic. We played Wake Up by Arcade Fire. So. Much. Fun! Again, I laid down some tremolo over the rousing chorus. I’m sure no one could hear it but it didn’t matter. Everyone was just lifted along by the sheer scale of the thing.
That was supposed to be it for me. But during the rehearsal the day before, I played a little bit on Fleetwood Mac’s The Chain and Brad said, “You should do that!”
So I did. I think it worked. I certainly enjoyed it!
With that, my musical duties were done and I just danced and danced, singing along to everything.
At the end of the night, everyone got back on stage. It was a tight fit. We then attempted to sing Bohemian Rhapsody together. It was a recipe for disaster …but amazingly, it worked!
That could describe the whole evening. It shouldn’t have worked. It was far too ambitious. But not only did it work, it absolutely rocked!
What really stood out for me was how nice and kind everyone was. There was nary an ego to be found. I had never met most of these people before but we all came together and bonded over this shared creation. It was genuinely special.
Days later I’m still buzzing from it all. I’m so, so grateful to Brad and Melissa for pulling off this incredible feat, and for allowing me to be a part of it.
And then in the middle of this traumatic medical emergency, our mentally-unstable neighbor across the street began accosting my family, flipping off our toddler and nanny, racially harassing my wife, and making violent threats. We fled our home for fear of our safety because he was out in the street exposing himself, shouting belligerence, and threatening violence.
After that, Brad started working with Project Healthy Minds. In fact, all the proceeds from Frostapalooza go to that organisation along with NextStep Pittsburgh.
Just think about that. Confronted with intimidation and racism, Brad and Melissa still managed to see the underlying systemic inequality, and work towards making things better for the person who drove them out of their home.
Good people, man. Good people.
I sincerely hope they got some catharsis from Frostapalooza. I can tell you that I felt frickin’ great after being part of an incredible event filled with joy and love and some of the best music I’ve ever heard.