
Re-dConstruct

From 2005 to 2015 Clearleft ran the dConstruct event here in Brighton (with one final anniversary event in 2022).

I had the great pleasure of curating dConstruct for a while. I’m really proud of the line-ups I put together.

It wasn’t your typical tech event, to put it mildly. You definitely weren’t going to learn practical techniques to bring back into work on Monday morning. If anything, it was the kind of event that might convince you to quit your job on Monday morning.

The talks were design-informed, but with oodles of philosophy, culture and politics.

As you can imagine, that’s not an easy sell. Which is why we stopped running the event. It’s pretty hard to convince your boss to send you to a conference like that.

Sometimes I really miss it though. With everything going on in the tech world right now (and the world in general), it sure would be nice to get together in a room full of like-minded people to discuss the current situation.

Well, here’s the funny thing. There’s a different Clearleft event happening next week. Research By The Sea. On the face of it, this doesn’t sound much like dConstruct. But damn if Benjamin hasn’t curated a line-up of talks that sound very dConstructy!

Those all sound like they’d fit perfectly in the dConstruct archive.

Research By The Sea is most definitely not just for UX researchers—this sounds to me like the event to attend if, like me, you’re alarmed by everything happening right now.

Next Thursday, February 27th, this is the place to be if you’ve been missing dConstruct. See you there!

Making the new Salter Cane website

With the release of a new Salter Cane album I figured it was high time to update the design of the band’s website.

Here’s the old version for reference. As you can see, there’s a connection there in some of the design language. Even so, I decided to start completely from scratch.

I opened up a text editor and started writing HTML by hand. Same for the CSS. No templates. No build tools. No pipeline. Nothing. It was a blast!

And lest you think that sounds like a wasteful way of working, I pretty much had the website done in half a day.

Partly that’s because you can do so much with so little in CSS these days. Custom properties for colours, spacing, and fluid typography (thanks to Utopia). Logical properties. View transitions. None of this takes much time at all.

Because I was using custom properties, it was a breeze to add a dark mode with prefers-color-scheme. I think I might like the dark version more than the default.
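A minimal sketch of how that might look (the property names and colour values here are assumptions, not the site’s actual styles): define the palette as custom properties, then override them inside a prefers-color-scheme media query.

```css
/* Illustrative only: property names and colours are made up */
:root {
    --colour-background: #fffdf8;
    --colour-text: #1a1a1a;
}

@media (prefers-color-scheme: dark) {
    :root {
        --colour-background: #1a1a1a;
        --colour-text: #fffdf8;
    }
}

body {
    background-color: var(--colour-background);
    color: var(--colour-text);
}
```

Because every rule references the custom properties rather than hard-coded colours, the dark mode costs just one extra media query.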

The final stylesheet is pretty short. I didn’t bother with any resets. Browsers are pretty consistent with their default styles nowadays. As long as you’ve got some sensible settings on your body element, the cascade will take care of a lot.
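Those sensible settings might amount to little more than this (a sketch; the values are assumptions):

```css
/* A handful of inherited defaults instead of a reset */
body {
    font-family: Georgia, serif;
    line-height: 1.5;
    max-inline-size: 48em;
    margin-inline: auto;
    padding-inline: 1rem;
}
```

Properties like font-family and line-height inherit, so the cascade carries them down to almost every element for free.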

There’s one little CSS trick I think is pretty clever…

The background image is this image. As you can see, it’s a rectangle that’s wider than it is tall. But the web pages are rectangles that are taller than they are wide.

So how should I position the background image? Centred? Anchored to the top? Anchored to the bottom?

If you open up the website in Chrome (or Safari Technology Preview), you’ll see that the background image is anchored to the top. But if you scroll down you’ll see that the background image is now anchored to the bottom. The background position has changed somehow.

This isn’t just on the home page. On any page, no matter how tall it is, the background image is anchored to the top when the top of the document is in the viewport, and it’s anchored to the bottom when you reach the bottom of the document.

In the past, this kind of thing might’ve been possible with some clever JavaScript that measured the height of the document and updated the background position every time a scroll event was triggered.

But I didn’t need any JavaScript. This is a scroll-driven animation made with just a few lines of CSS.

@keyframes parallax {
    from {
        background-position: top center;
    }
    to {
        background-position: bottom center;
    }
}
@media (prefers-reduced-motion: no-preference) {
    html {
        animation: parallax auto ease;
        animation-timeline: scroll();
    }
}

This works as a nice bit of progressive enhancement: by default the background image stays anchored to the top of the viewport, which is fine.
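The corresponding base rule (sketched here with an assumed image path) supplies that default position, which browsers without scroll-driven animations simply keep:

```css
/* In supporting browsers, the scroll-driven animation overrides this position */
html {
    background-image: url('/images/background.jpg'); /* assumed path */
    background-repeat: no-repeat;
    background-attachment: fixed;
    background-position: top center;
}
```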

Once the site was ready, I spent a bit more time sweating some details, like the responsive images on the home page.

But the biggest performance challenge wasn’t something I had direct control over. There’s a Spotify embed on the home page. Ain’t no party like a third party.

I could put loading="lazy" on the iframe but in this case it’s pretty close to the top of the document, so it’s still going to start loading at the same time as some of my first-party assets.

I decided to try a little JavaScript library called “lazysizes”. Normally this would ring alarm bells for me: solving a problem with third-party code by adding …more third-party code. But in this case, it really did the trick. The library is loading asynchronously (so it doesn’t interfere with the more important assets) and only then does it start populating the iframe.
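The pattern looks something like this (the embed URL is a placeholder): load lazysizes asynchronously, give the iframe the lazyload class, and put the real source in data-src so the browser doesn’t fetch it up front.

```html
<!-- lazysizes loads without blocking first-party assets -->
<script src="/js/lazysizes.min.js" async></script>

<!-- No src attribute, so nothing is fetched until lazysizes swaps in data-src -->
<iframe class="lazyload"
        data-src="https://open.spotify.com/embed/album/PLACEHOLDER"
        title="Spotify player"
        width="100%" height="352"
        allow="encrypted-media"></iframe>
```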

This made a huge difference. The core web vitals went from being abysmal to being perfect.

I’m pretty pleased with how the new website turned out.

Research By The Sea

I’m going to be hosting Research By The Sea on Thursday, February 27th right here in Brighton. I’m getting very excited and nervous about it.

The nervousness is understandable. I want to do a good job. Hosting a conference is like officiating a wedding. You want to put people at ease and ensure everything goes smoothly. But you don’t want to be the centre of attention. People aren’t there to see you. This is not your day.

As the schedule has firmed up, my excitement has increased.

See, I’m not a researcher. It would be perfectly understandable to expect this event to be about the ins and outs of various research techniques. But it’s become clear that that isn’t what Benjamin has planned.

Just as any good researcher or designer goes below the surface to explore the root issues, Research By The Sea is going to go deep.

I mean, just take a look at what Steph will be covering:

Steph discusses approaches in speculative fiction, particularly in the Solarpunk genre, that can help ground our thinking, and provide us—as researchers and designers—tenets to consider our work, and, as humans, to strive towards a better future.

Sign me up!

Michael’s talk covers something that’s been on my mind a lot lately:

Michael will challenge the prevailing belief that as many people as possible must participate in our digital economies.

You just know that a talk called In defence of refusal isn’t going to be your typical conference fare.

Then there are talks about accessibility and intersectionality, indigenous knowledge, designing communities, and navigating organisational complexity. And I positively squealed with excitement when I read Cennydd’s talk description:

The world is crying out for new visions of the future: worlds in which technology is compassionate, not just profitable; where AI is responsible, not just powerful.

See? It’s very much not just for researchers. This is going to be a fascinating day for anyone who values curiosity.

If that’s you, you should grab a ticket. To sweeten the deal, use the discount code JOINJEREMY to get a chunky 20% off the price — £276 for a conference ticket instead of £345.

Be sure to nab your ticket before February 15th when the price ratchets up a notch.

And if you are a researcher, well, you really shouldn’t miss this. It’s kind of like when I’ve run Responsive Day Out and Patterns Day; sure, the talks are great, but half the value comes from being in the same space as other people who share your challenges and experiences. I know that makes it sound like a kind of group therapy, but that’s because …well, it kind of is.

Changing

It always annoys me when a politician is accused of “flip-flopping” when they change their mind on something. Instead of admiring someone for being willing to re-examine previously-held beliefs, we lambast them. We admire conviction, even though that’s a trait that has been at the root of history’s worst atrocities.

When you look at the history of human progress, some of our greatest advances were made by people willing to question their beliefs. Prioritising data over opinion is what underpins the scientific method.

But I get it. It can be very uncomfortable to change your mind. There’s inevitably going to be some psychological resistance, a kind of inertia of opinion that favours the sunk cost of all the time you’ve spent believing something.

I was thinking back to times when I’ve changed my opinion on something after being confronted with new evidence.

In my younger days, I was staunchly anti-nuclear power. It didn’t help that in my younger days, nuclear power and nuclear weapons were conceptually linked in the public discourse. In the intervening years I’ve come to believe that nuclear power is far less destructive than fossil fuels. There are still a lot of issues—in terms of cost and time—which make nuclear less attractive than solar or wind, but I honestly can’t reconcile someone claiming to be an environmentalist while simultaneously opposing nuclear power. The data just doesn’t support that conclusion.

Similarly, I remember in the early 2000s being opposed to genetically-modified crops. But the more I looked into the facts, the less I found—other than vibes—to bolster that opposition. And yet I know many people who’ve maintained their opposition, often the same people who point to the scientific evidence when it comes to climate change. It’s a strange kind of cognitive dissonance that would allow for that kind of cherry-picking.

There are other situations where I’ve gone more in the other direction—initially positive, later negative. Google’s AMP project is one example. It sounded okay to me at first. But as I got into the details, its fundamental unfairness couldn’t be ignored.

I was fairly neutral on blockchains at first, at least from a technological perspective. There was even some initial promise of distributed data preservation. But over time my opinion went down, down, down.

Bitcoin, with its proof-of-work idiocy, is the poster-child of everything wrong with the reality of blockchains. The astoundingly wasteful energy consumption is just staggeringly pointless. Over time, any sufficiently wasteful project becomes indistinguishable from evil.

Speaking of energy usage…

My feelings about large language models have been dominated by two massive elephants in the room. One is the completely unethical way that the training data has been acquired (by ripping off the work of people who never gave their permission). The other is the profligate energy usage in not just training these models, but also running queries on the network.

My opinion on the provenance of the training data hasn’t changed. If anything, it’s hardened. I want us to fight back against this unethical harvesting by poisoning the well that the training data is drawing from.

But my opinion on the energy usage might just be swaying a little.

Michael Liebreich published an in-depth piece for Bloomberg last month called Generative AI – The Power and the Glory. He doesn’t sugar-coat the problems with current and future levels of power consumption for large language models, but he also doesn’t paint a completely bleak picture.

Effectively there’s a yet-to-be-decided battle between Koomey’s law and the Jevons paradox. Time will tell which way this will go.

The whole article is well worth a read. But what really gave me pause was a recent piece by Hannah Ritchie asking What’s the impact of artificial intelligence on energy demand?

When Hannah Ritchie speaks, I listen. And I’m well aware of the irony there. That’s classic argument from authority, when the whole point of Hannah Ritchie’s work is that it’s the data that matters.

In any case, she does an excellent job of putting my current worries into a historical context, as well as laying out some potential futures.

Don’t get me wrong, the energy demands of large language models are enormous and are only going to increase, but we may well see some compensatory efficiencies.

Personally, I’d just like to see these tools charge a fair price for their usage. Right now they’re being subsidised by venture capital. If people actually had to pay out of pocket for the energy used per query, we’d get a much better idea of how valuable these tools actually are to people.

Instead we’re seeing these tools being crammed into existing products regardless of whether anybody actually wants them (and in my anecdotal experience, most people resent this being forced on them).

Still, I thought it was worth making a note of how my opinion on the energy usage of large language models is open to change.

But I still won’t use one that’s been trained on other people’s work without their permission.

Conference line-ups

When I was looking back at 2024, I mentioned that I didn’t give a single conference talk (though I did host three conferences—Patterns Day, CSS Day, and UX London).

I almost spoke at a conference though. I was all set to speak at an event in the Netherlands. But then the line-up was announced and I was kind of shocked at the lack of representation. The schedule was dominated by white dudes like me. There were just four women in a line-up of 30 speakers.

When I raised my concerns, I was told:

We did receive a lot of talks, but almost no women because there are almost no women in this kind of jobs.

Yikes! I withdrew my participation.

I wish I could say that it was a one-off occurrence, but it just happened again.

I was looking forward to speaking at DevDays Europe. I’ve never been to Vilnius but I’ve heard it’s lovely.

Now, to be fair, I don’t think the line-up is finalised, but it’s not looking good.

Once again, I raised my concerns. I was told:

Unfortunately, we do not get a lot of applications from women and have to work with what we have.

Even though I knew I was just proving Brandolini’s law, I tried to point out the problems with that attitude (while also explaining that I’ve curated many conference line-ups myself):

It’s not really conference curation if you rely purely on whoever happens to submit a proposal. Surely you must accept some responsibility for ensuring a good diverse line-up?

The response began with:

I agree that it’s important to address the lack of diversity.

…but then went on:

I just wanted to share that the developer field as a whole tends to be male-dominated, not just among speakers but also attendees.

At this point, I’m face-palming. I tried pointing out that there might just be a connection between the make-up of the attendees and the make-up of the speaker line-up. Heck, if I feel uncomfortable attending such a homogeneous conference, imagine what a woman developer would think!

Then they dropped the real clanger:

While we always aim for a diverse line-up, our main focus has been on ensuring high-quality presentations and providing the best experience for our audience.

Double-yikes! I tried to remain calm in my response. I asked them to stop and think about what they were implying. They’re literally setting up a dichotomy between having a diverse line-up and having a good line-up. Like it’s inconceivable you could have both. As though one must come at the expense of the other. Just think about the deeply embedded bias that would enable that kind of worldview.

Needless to say, I won’t be speaking at that event.

This is depressing. It feels like we’re backsliding to what conferences were like 15 years ago.

I can’t help but spot the commonalities between the offending events. Both of them have multiple tracks. Both of them have a policy of not paying their speakers. Both of them seem to think that opening up a form for people to submit proposals counts as curation. It doesn’t.

Don’t get me wrong. Having a call for proposals is great …as long as it’s part of an overall curation strategy that actually values diversity.

You can submit a proposal to speak at FFconf, for example. But Remy doesn’t limit his options to what people submit. He puts a lot of work into creating a superb line-up that is always diverse, and always excellent.

By the way, you can also submit a proposal for UX London. I’ve had lots of submissions so far, but again, I’m not going to limit my pool of potential speakers to just the people who know about that application form. That would be a classic example of the streetlight effect:

The streetlight effect, or the drunkard’s search principle, is a type of observational bias that occurs when people only search for something where it is easiest to look.

It’s quite depressing to see this kind of minimal-viable conference curation result in such heavily skewed line-ups. Withdrawing from speaking at those events is literally the least I can do.

I’m with Karolina:

What I’m looking for: at least 40% of speakers have to be women speaking on the subject of their expertise instead of being invited to present for the sake of adjusting the conference quotas. I want to see people of colour too. In an ideal scenario, I’d like to see as many gender identities, ethnical backgrounds, ages and races as possible.

25, 20, 15, 10, 5

I have a feeling that 2025 is going to be a year of reflection for me. It’s such a nice round number, 25. One quarter of a century.

That’s also how long myself and Jessica have been married. Our wedding anniversary was last week.

Top tip: if you get married in a year ending in 00, you’ll always know how long ago it was. Just lop off the first 2000 years and there’s the number.

As well as being the year we got married (at a small ceremony in an army chapel in Arizona), 2000 was also the year we moved from Freiburg to Brighton. I never thought we’d still be here 25 years later.

2005 was twenty years ago. A lot of important events happened that year. I went to South by Southwest for the first time and met people who became lifelong friends (including some dear friends no longer with us).

I gave my first conference talk. We had the first ever web conference in the UK. And myself, Rich, and Andy founded Clearleft. You can expect plenty of reminiscence and reflection on the Clearleft blog over the course of this year.

2010 was fifteen years ago. That’s when Jessica and I moved into our current home. For the first time, we were paying off a mortgage instead of paying a landlord. But I couldn’t bring myself to consider us “homeowners” at that time. For me, we didn’t really become homeowners until we paid that mortgage off ten years later.

2015 was ten years ago. It was relatively uneventful in the best possible way.

2020 was five years ago. It was also yesterday. The Situation was surreal, scary and weird. But the people I love came through it intact, for which I’m very grateful.

Apart from all these anniversaries, I’m not anticipating any big milestones in 2025. I hope it will be an unremarkable year.

2024

There goes 2024.

It was a year dominated by Ukraine and Gaza. Utterly horrific and unnecessary death courtesy of Putin and Netanyahu.

For me personally, 2024 was just fine. I was relatively healthy all year. The people I love were relatively healthy too. I don’t take that for granted.

Looking back on what I did and didn’t do during the year, here’s something interesting: I didn’t give a single conference talk. I spoke at a few events but as the host: Patterns Day, CSS Day, and UX London. That’s something I really enjoy and I think I’m pretty darn good at it too.

I was wondering why it was that I didn’t give a talk in 2024. Then Rachel said something:

I really miss An Event Apart.

That’s when I realised that An Event Apart would usually be the impetus behind preparing a conference talk. I’d lock myself away and spend ages crafting a presentation to match the calibre of that event. Then, once I had the talk prepared, I could give modified versions of it at other conferences.

With An Event Apart gone, I guess I just let the talk prep slide. Maybe that’ll continue into 2025 …although I’m kind of itching to give a talk on HTML web components.

In most years, speaking at conferences is what allows me to travel to interesting places. But even without being on the conference circuit I still travelled to lovely places in 2024. Turin, Belfast, Amsterdam, Freiburg, west Cork, Boston, Pittsburgh, Saint Augustine, Seville, Cáceres, Strasbourg, and Galway.

A lot of the travel was motivated by long-standing friendships. Exploring west Cork with Dan and Sue. Celebrating in Freiburg with Schorsch and Birgit. Visiting Ethan and Liz in Boston. And playing music in Pittsburgh with Brad.

Frostapalooza was a high note:

I felt frickin’ great after being part of an incredible event filled with joy and love and some of the best music I’ve ever heard.

Being on sabbatical for all of August was nice. It also meant that I had lots of annual leave to use up by the end of the year, so I ended up taking all of December off too. I enjoyed that.

I played a lot of music in 2024. I played in a couple of sessions for pretty much every week of the year. That’s got to be good for my mandolin playing. I even started bringing the tenor banjo out on occasion.

I ate lots of good food in 2024. Some of it was even food I made. I’ve been doing more and more cooking. I’ve still got a fairly limited range of dishes, but I’m enjoying the process of expanding my culinary repertoire a bit.

I read good books in 2024.

All in all, that’s a pretty nice way to spend a year: some travel, seeing some old friends, playing lots of music, reading books, and eating good food.

I hope for more of the same in 2025.

Books I read in 2024

I’ve been keeping track of the books I’m reading for about seven years now. I do that here on my own website, as well as on bookshop.org.

It has become something of a tradition for me to post an end-of-year summary of the books I’ve read in the previous twelve months. Maybe I should be posting my thoughts on each book right after I finish it instead. Then again, I quite like the act of thinking about a book again after letting it sit and stew for a while.

I should probably stop including stars with these little reviews. They’re fairly pointless—pretty much everything I read is right down the middle in the “good” category. But to recap, here’s how I allocate my scores:

  • One star means a book is meh.
  • Two stars means a book is perfectly fine.
  • Three stars means a book is good—consider it recommended.
  • Four stars means a book is exceptional.
  • Five stars is pretty much unheard of.

No five-star books this year, but also no one-star books.

This year I read about 29 books. A bit of an increase on previous years, but the numbers can be deceptive—not every book is equal in length.

Fiction outnumbered non-fiction by quite a margin. I’m okay with that.

The Wager by David Grann

“A tale of shipwreck, mutiny and murder” is promised on the cover and this book delivers. What’s astonishing is that it’s a true story. If it were fiction it would be dismissed as too far-fetched. It’s well told, and it’s surely only a matter of time before an ambitious film-maker takes on its Rashomon-like narrative.

★★★☆☆

Bridge by Lauren Beukes

I think this might be Lauren’s best book since Zoo City. The many-worlds hypothesis has been mined to depletion in recent years but Bridge still manages to have a fresh take on it. The well-rounded characters ensure that you’re invested in the outcome.

★★★☆☆

The Penelopiad by Margaret Atwood

Part of my ongoing kick of reading retellings of Greek myths, this one focuses on a particularly cruel detail of Odysseus’s return.

★★★☆☆

Elektra by Jennifer Saint

Keeping with the Greek retellings, this was the year that I read most of Jennifer Saint’s books. All good stuff, though I must admit that in my memory it’s all starting to blend together with other books like Costanza Casati’s Clytemnestra.

★★★☆☆

Children Of Memory by Adrian Tchaikovsky

The final book in the trilogy, this doesn’t have the same wham-bam page-turning breathlessness as Children Of Time, but I have to say it’s really stuck with me. Whereas the previous books looked at the possibilities of biological intelligence (in spiders and octopuses), this one focuses inwards.

I don’t want to say any more because I don’t want to spoil the culmination. I’ll just say that I think that by the end it posits a proposition that I don’t recall any other sci-fi work doing before.

Y’know what? Just because of how this one has lodged in my mind I’m going to give it an extra star.

★★★★☆

Stone Blind by Natalie Haynes

I think this is my favourite Natalie Haynes book so far. It also makes a great companion piece to another book I read later in the year…

★★★☆☆

The Great Hunger by Patrick Kavanagh

I picked up this little volume of poems when I was in Amsterdam—they go down surprisingly well with some strong beer and bitterballen. I was kind of blown away by how funny some of these vignettes were. There’s plenty of hardship too, but that’s the human condition for you.

★★★★☆

Europe In Autumn, Europe At Midnight, Europe In Winter, and Europe At Dawn by Dave Hutchinson

I read the Fractured Europe series throughout the year and thoroughly enjoyed it. I’ll readily admit that I didn’t always follow what was going on but that’s part of the appeal. The world-building is terrific. It’s an alternative version of a Brexity Europe that by the end of the first book starts to go in an unexpected direction. Jonathan Strange meets George Smiley.

★★★☆☆

The Odyssey by Homer translated by Robert Fagles

Seeing as I’m reading all the modern retellings, it’s only fair that I have the source material to hand. This is my coffee table book that I dip into sporadically. I’ve got a copy of the prequel too.

I am not going to assign stars to this.

Faith, Hope and Carnage by Nick Cave and Seán O’Hagan

Fairly navel-gazing stuff, and you get the impression that Nick Cave thinks so too. Just as Neil Young would rather talk about his model trains, Nick Cave would rather discuss his pottery. The music stands on its own, but this is still better than most books about music.

★★☆☆☆

Julia by Sandra Newman

Now this is an audacious move! Retelling 1984 from Julia’s perspective. Not only does it work, it also shines a light on some flaws in Orwell’s original (and I say that as someone who’s read everything Orwell ever wrote). I’m happy to say that the execution of this book matches its ambition.

★★★☆☆

Hamnet by Maggie O’Farrell

So if I’ve been reading alternative perspectives on Homer and Orwell, why not Shakespeare too? This is beautifully evocative and rich in detail. It’s also heartbreaking. A gorgeous work.

★★★★☆

Pandora’s Jar: Women in the Greek Myths by Natalie Haynes

I didn’t enjoy this as much as I enjoyed Natalie Haynes’s novels. It’s all good informative stuff, but it feels a bit like a collection of separate essays rather than a coherent piece.

★★☆☆☆

Best Of British Science Fiction 2023 edited by Donna Scott

I was lucky enough to get a pre-release copy of this from one of the authors. I love a good short story collection and this one is very good indeed.

★★★☆☆

Ithaca and House Of Odysseus by Claire North

Remember how I said that some of the Greek retellings started to blend together? Well, no fear of that with this terrific series. Like Margaret Atwood’s retelling, Penelope is the main character here. Each book is narrated by a different deity, and yet there is little to no supernatural intervention. I’m really looking forward to reading the third and final book in the series.

★★★☆☆

The Shadow Of Perseus by Claire Heywood

This is the one I was hinting at above that makes a great companion piece to Natalie Haynes’s Stone Blind. Two different—but equally sympathetic—takes on Medusa. This one is grittily earthbound—no gods here—and it’s a horrifying examination of toxic masculinity. And don’t expect any natural justice here.

★★★☆☆

Dogs Of War by Adrian Tchaikovsky

Adrian Tchaikovsky has a real knack for getting inside the animal mind. This story is smaller in scale than his Children Of Time series but it succeeds in telling its provocative tale snappily.

★★★☆☆

84K by Claire North

I described Dave Hutchinson’s Fractured Europe series as Brexity, but this Claire North book pushes Tory austerity to its dystopian logical conclusion. It’s all-too-believable, if maybe a little over-long. Grim’n’good.

★★★☆☆

Ariadne by Jennifer Saint

The first of Jennifer Saint’s books is also my favourite. There’s a fantastically vivid description of the arrival of Dionysus into the narrative.

★★★☆☆

The Female Man by Joanna Russ

I’ve been meaning to read this one for years, but in the end I didn’t end up finishing it. That’s no slight on the book; I just wasn’t in the right frame of mind for it. I’m actually kind of proud of myself for putting a book down—I’m usually stubbornly completionist, which is stupid because life is too short. I hope to return to this at some future time.

Atalanta by Jennifer Saint

Another vividly-written tale by Jennifer Saint, but maybe suffers from trying to cram in all the varied accounts of Atalanta’s deeds and trials—the character’s motivations are hard to reconcile at different points.

★★★☆☆

Polostan by Neal Stephenson

This was …fine. It’s the first in a series called Bomb Light. Maybe I’ll appreciate it more in its final context. As a standalone work, there’s not quite enough there to carry it (including the cutesiness of making a young Richard Feynman a side character).

★★☆☆☆

Tomorrow, and Tomorrow, and Tomorrow by Gabrielle Zevin

This too was …fine. I know some people really love this, and maybe that raised my expectations, but in the end it was a perfectly good if unremarkable novel.

★★★☆☆

The Fates by Rosie Garland

Pairs nicely with Jennifer Saint’s Atalanta. A decent yarn.

★★★☆☆

Earth Abides by George R. Stewart

I’ve just started this post-apocalyptic classic from 1949. Tune in next year to find out if I end up enjoying it.

Okay, so that was my reading for 2024. Nothing that completely blew me away but nothing that thoroughly disappointed me either. Plenty of good solid books. If I had to pick a favourite it would probably be Maggie O’Farrell’s Hamnet. And that Patrick Kavanagh collection of poems.

If you fancy going back in time, here are my previous round-ups:

Progressively enhancing maps

The Session has been online for over 20 years. When you maintain a site for that long, you don’t want to be relying on third parties—it’s only a matter of time until they’re no longer around.

Some third party APIs are unavoidable. The Session has maps for sessions and other events. When people add a new entry, they provide the address but then I need to get the latitude and longitude. So I have to use a third-party geocoding API.

My code is like a lesson in paranoia: I’ve built in the option to switch between multiple geocoding providers. When one of them inevitably starts enshittifying their service, I can quickly move on to another. It’s like having a “go bag” for geocoding.
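A sketch of what that “go bag” might look like (the provider names and signatures are hypothetical, not the actual code from The Session): one geocode() function that delegates to whichever provider is currently configured, so swapping providers is a one-line change.

```javascript
// Hypothetical provider registry; real entries would call each provider's HTTP API
const providers = {
    nominatim: async (address) => ({ lat: 0, lon: 0, source: 'nominatim' }),
    opencage: async (address) => ({ lat: 0, lon: 0, source: 'opencage' })
};

// The only thing that changes when a provider starts enshittifying
let currentProvider = 'nominatim';

async function geocode(address) {
    // Every caller goes through this one function, never a provider directly
    return providers[currentProvider](address);
}
```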

Things are better on the client side. I’m using other people’s JavaScript libraries—like the brilliant abcjs—but at least I can self-host them.

I’m using Leaflet for embedding maps. It’s a great little library built on top of Open Street Map data.

A little while back I linked to a new project called OpenFreeMap. It’s a mapping provider where you even have the option of hosting the tiles yourself!

For now, I’m not self-hosting my map tiles (yet!), but I did want to switch to OpenFreeMap’s tiles. They’re vector-based rather than bitmap, so they’re lovely and crisp.

But there’s an issue.

I can use OpenFreeMap with Leaflet, but to do that I also have to use the MapLibre GL library. But whereas Leaflet is 148K of JavaScript, MapLibre GL is 800K! Yowzers!

That’s mahoosive by the standards of The Session’s performance budget. I’m not sure the loveliness of the vector maps is worth increasing the JavaScript payload by so much.

But this doesn’t have to be an either/or decision. I can use progressive enhancement to get the best of both worlds.

If you land straight on a map page on The Session for the first time, you’ll get the old-fashioned bitmap map tiles. There’s no MapLibre code.

But if you browse around The Session and then arrive on a map page, you’ll get the lovely vector maps.

Here’s what’s happening…

The maps are embedded using an HTML web component called embed-map. The fallback is a static image between the opening and closing tags. The web component then loads up Leaflet.

Here’s where the enhancement comes in. When the web component is initiated (in its connectedCallback method), it uses the Cache API to see if MapLibre has been stored in a cache. If it has, it loads that library:

caches.match('/path/to/maplibre-gl.js')
.then( responseFromCache => {
    if (responseFromCache) {
        // The library is in the cache, so load maplibre-gl.js
        // (here by injecting a script element)
        const script = document.createElement('script');
        script.src = '/path/to/maplibre-gl.js';
        document.head.append(script);
    }
});

Then when it comes to drawing the map, I can check for the existence of the maplibreGL object. If it exists, I can use OpenFreeMap tiles. Otherwise I use the old Leaflet tiles.
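That decision point can be sketched as a small function. The function name and tile-style URLs here are illustrative, not the site’s actual code:

```javascript
// Hypothetical sketch: pick a tile layer depending on whether
// the MapLibre GL library made it out of the cache.
function chooseTileLayer(hasMapLibre) {
  if (hasMapLibre) {
    // vector tiles from OpenFreeMap, rendered by MapLibre GL
    return { renderer: 'maplibre', style: 'https://tiles.openfreemap.org/styles/liberty' };
  }
  // fall back to bitmap tiles, rendered by Leaflet alone
  return { renderer: 'leaflet', tiles: 'https://tile.openstreetmap.org/{z}/{x}/{y}.png' };
}

// In the browser, this would be driven by checking the global object:
// const layer = chooseTileLayer(typeof maplibregl !== 'undefined');
```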

But how does the MapLibre library end up in a cache? That’s thanks to the service worker script.

During the service worker’s install event, I give it a list of static files to cache: CSS, JavaScript, and so on. That includes third-party libraries like abcjs, Leaflet, and now MapLibre GL.
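Here’s a sketch of what that install handler might look like. The cache name and file paths are assumptions for illustration, not the site’s actual ones:

```javascript
// Assumed cache name and static file list.
const CACHE_NAME = 'static-v1';
const STATIC_FILES = [
  '/css/styles.css',
  '/js/abcjs.js',
  '/js/leaflet.js',
  '/js/maplibre-gl.js' // cached in the background now, loaded lazily later
];

// Hold the install event open until every file is safely cached.
function handleInstall(event) {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(STATIC_FILES))
  );
}

// In the service worker script:
// addEventListener('install', handleInstall);
```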

Crucially this caching happens off the main thread. It happens in the background and it won’t slow down the loading of whatever page is currently being displayed.

That’s it. If the service worker installation works as planned, you’ll get the nice new vector maps. If anything goes wrong, you’ll get the older version.

By the way, it’s always a good idea to use a service worker and the Cache API to store your JavaScript files. As you know, JavaScript is unduly expensive for performance; not only does the JavaScript file have to be downloaded, it then has to be parsed and compiled. But JavaScript stored in a cache during a service worker’s install event is already parsed and compiled.

Cocolingo

This year I decided I wanted to get better at speaking Irish.

Like everyone brought up in Ireland, I sort of learned the Irish language in school. It was a compulsory subject, along with English and maths.

But Irish wasn’t really taught like a living conversational language. It was all about learning enough to pass the test. Besides, if there’s one thing that’s guaranteed to put me off something, it’s making it compulsory.

So for the first couple of decades of my life, I had no real interest in the Irish language, just as I had no real interest in traditional Irish music. They were both tainted by some dodgy political associations. They were both distinctly uncool.

But now? Well, Irish traditional music rules my life. And I’ve come to appreciate the Irish language as a beautiful expressive thing.

I joined a WhatsApp group for Irish language learners here in Brighton. The idea is that we’d get together to attempt some conversation as Gaeilge but we’re pretty lax about actually doing that.

Then there’s Duolingo. I started …playing? doing? Not sure what the verb is.

Duolingo is a bit of a mixed bag. I think it works pretty well for vocabulary acquisition. But it’s less useful for grammar. I was glad that I had some rudiments of Irish from school or I would’ve been completely lost.

Duolingo will tell you what the words are, but it never tells you why. For that I’m going to have to knuckle down with some Irish grammar books, videos, or tutors.

Duolingo is famous for its gamification. It mostly worked on me. I had to consciously remind myself sometimes that the purpose was to get better at Irish, not to score more points and ascend a league table.

Oh, did I ascend that league table!

But I can’t take all the credit. That must go to Coco, the cat.

It’s not that Coco is particularly linguistically gifted. Quite the opposite. She never says a word. But she did introduce a routine that lent itself to doing Duolingo every day.

Coco is not our cat. But she makes herself at home here, for which we feel inordinately honoured.

Coco uses our cat flap to come into the house pretty much every morning. Then she patiently waits for one of us to get up. I’m usually up first, so I’m the one who gives Coco what she wants. I go into the living room and sit on the sofa. Coco then climbs on my lap.

It’s a lovely way to start the day.

But of course I can’t just sit there alone with my own thoughts and a cat. I’ve got to do something. So rather than starting the day with some doomscrolling, I start with some Irish on Duolingo.

After an eleven-month streak, something interesting happened: I finished.

I’m not used to things on the internet having an end. Had I been learning a more popular language I’m sure there would’ve been many more lessons. But Irish has a limited lesson plan.

Of course the Duolingo app doesn’t say “You did it! You can delete the app now!” It tries to get me to do refresher exercises, but we both know that there are diminishing returns and we’d just be going through the motions. It’s time for us to part ways.

I’ve started seeing other apps. Mango is really good so far. It helps that they’ve made some minority languages available for free, Irish included.

I’m also watching programmes on TG4, the Irish language television station that has just about everything in its schedule available online for free anywhere in the world. I can’t bring myself to get stuck into Ros na Rún, the trashy Irish language soap opera, but I have no problem binging on CRÁ, the gritty Donegal crime drama.

There are English subtitles available for just about everything on TG4. I wish that Irish subtitles were also available—it’s really handy to hear and read Irish at the same time—but only a few shows offer that, like the kids’ cartoon Lí Ban.

Oh, and I’ve currently got a book on Irish grammar checked out of the local library. So now when Coco comes to visit in the morning, she can keep me company while I try to learn from that.

Going Offline is online …for free

I wrote a book about service workers. It’s called Going Offline. It was first published by A Book Apart in 2018. Now it’s available to read for free online.

If you want you can read the book as a PDF, an ePub, or .mobi, but I recommend reading it in your browser.

Needless to say the web book works offline. Once you go to goingoffline.adactio.com you can add it to the homescreen of your mobile device or add it to the dock on your Mac. After that, you won’t need a network connection.

The book is free to read. Properly free. Not the kind of “free” where you have to supply an email address first. Why would I make you go to the trouble of generating a burner email account?

The site has no analytics. No tracking. No third-party scripts of any kind whatsoever. By complete coincidence, the site is fast. Funny that.

For the styling of this web book, I tweaked the stylesheet I used for HTML5 For Web Designers. I updated it a little bit to use logical properties, some fluid typography and view transitions.

In the process of converting the book to HTML, I got reacquainted with what I had written almost seven years ago. It was kind of fun to approach it afresh. I think it stands up pretty darn well.

Ethan wrote about his feelings when he put two of his books online, illustrated by that amazing photo that always gives me the feels:

I’ll miss those days, but I’m just glad these books are still here. They’re just different than they used to be. I suppose I am too.

Anyway, if you’re interested in making your website work offline, have a read of Going Offline. Enjoy!


The meaning of “AI”

There are different kinds of buzzwords.

Some buzzwords are useful. They take a concept that would otherwise require a sentence of explanation and package it up into a single word or phrase. Back in the day, “ajax” was a pretty good buzzword.

Some buzzwords are worse than useless. This is when a word or phrase lacks definition. You could say this buzzword in a meeting with five people, and they’d all understand five different meanings. Back in the day, “web 2.0” was a classic example of a bad buzzword—for some people it meant a business model; for others it meant rounded corners and gradients.

The worst kind of buzzwords are the ones that actively set out to obfuscate any actual meaning. “The cloud” is a classic example. It sounds cooler than saying “a server in Virginia”, but it also sounds like the exact opposite of what it actually is. Great for marketing. Terrible for understanding.

“AI” is definitely not a good buzzword. But I can’t quite decide if it’s merely a bad buzzword like “web 2.0” or a truly terrible buzzword like “the cloud”.

The biggest problem with the phrase “AI” is that there’s a name collision.

For years, the term “AI” has been used in science-fiction. HAL 9000. Skynet. Examples of artificial general intelligence.

Now the term “AI” is also used to describe large language models. But there is no connection between this use of the term “AI” and the science fictional usage.

This leads to the ludicrous situation of otherwise-rational people wanting to discuss the dangers of “AI”, but instead of talking about the rampant exploitation and energy usage endemic to current large language models, they want to spend the time talking about the sci-fi scenarios of runaway “AI”.

To understand how ridiculous this is, I’d like you to imagine if we had started using a different buzzword in another setting…

Suppose that when ride-sharing companies like Uber and Lyft were starting out, they had decided to label their services as Time Travel. From a marketing point of view, it even makes sense—they get you from point A to point B lickety-split.

Now imagine if otherwise-sensible people began to sound the alarm about the potential harms of Time Travel. Given the explosive growth we’ve seen in this sector, sooner or later they’ll be able to get you to point B before you’ve even left point A. There could be terrible consequences from that—we’ve all seen the sci-fi scenarios where this happens.

Meanwhile the actual present-day harms of ride-sharing services around worker exploitation would be relegated to the sidelines. Clearly that isn’t as important as the existential threat posed by Time Travel.

It sounds ludicrous, right? It defies common sense. Just because a vehicle can get you somewhere fast today doesn’t mean it’s inevitably going to be able to break the laws of physics any day now, simply because it’s called Time Travel.

And yet that is exactly the nonsense we’re being fed about large language models. We call them “AI”, we look at how much they can do today, and we draw a straight line to what we know of “AI” in our science fiction.

This ridiculous situation could’ve been avoided if we had settled on a more accurate buzzword like “applied statistics” instead of “AI”.

It’s almost as if the labelling of the current technologies was more about marketing than accuracy.

FFConf 2024

I went to FFConf on Friday. It did me the world of good.

To be honest, I haven’t much felt like venturing out over the past few days since my optimism took a big hit. But then when I do go and interact with people, I’m grateful for it.

Like, when I went out to my usual Wednesday evening traditional Irish music session I was prepared for the inevitable discussion of Trump’s election. I was ready to quite clearly let people know that I didn’t want to talk about it. But I didn’t have to. Maybe because everyone else was feeling much the same, we just played and played. It was good.

The session on Thursday was good too. When we chatted, it was about music.

Still, I was ready for the weekend and I wasn’t really feeling psyched up for FFConf on Friday. But once I got there, I was immediately uplifted.

It was so nice to see so many people I hadn’t seen in quite a while. I had the chance to reconnect with people that I had only been hearing from through my RSS reader:

“Terence, I’m really enjoying your sci-fi short stories!”

“Kirsty, I was on tenterhooks when you were getting Mabel!”

(Mabel is an adorable kitty-cat. In hindsight I probably should’ve also congratulated her on getting married. To a human.)

The talks were really good this year. They covered a wide variety of topics.

There was only one talk about “AI” (unlike most conferences these days, where it dominates the agenda). Léonie gave a superb run-down of the different kinds of machine learning and how they can help or hinder accessibility.

Crucially, Léonie began her talk by directly referencing the exploitation and energy consumption inherent in today’s large language models. It took all of two minutes, but it was two minutes more than the whole day of talks at UX Brighton. Thank you, Léonie!

Some of the other talks covered big topics. Life. Death. Meaning. Purpose.

I enjoyed them all, though I often find something missing from discussions about meaning and purpose. Just about everyone agrees that having a life infused with purpose is what provides meaning. So there’s an understandable quest to seek out what it is that gives you purpose.

But we’re also constantly reminded that every life has intrinsic meaning. “You are enough”, not “you are enough, as long as there’s some purpose to your life.”

I found myself thinking about Winnie Lim’s great post on leading a purposeless life. I think about it a lot. It gives me comfort. Instead of assuming that your purpose is out there somewhere and you’ve got to find it, you can entertain the possibility that your life might not have a purpose …and that’s okay.

I know this all sounds like very heavy stuff, but it felt good to be in a room full of good people grappling with these kinds of topics. I needed it.

Dare I say it, perhaps my optimism is returning.

Myth and magic

I read Madeline Miller’s Circe last year. I loved it. It was my favourite fiction book I read that year.

Reading Circe kicked off a bit of a reading spree for me. I sought out other retellings of Greek myths. There’s no shortage of good books out there from Pat Barker, Natalie Haynes, Jennifer Saint, Claire Heywood, Claire North, and more.

The obvious difference between these retellings and the older accounts by Homer, Ovid and the lads is to re-centre the women in these stories. There’s a rich seam of narratives to be mined between the lines of the Greek myths.

But what’s fascinating to me is to see how these modern interpretations differ from one another. Sometimes I’ll finish one book, then pick up another that tells the same story from a very different angle.

The biggest difference I’ve noticed is the presence or absence of supernatural intervention. Some of these writers tell their stories with gods and goddesses front and centre. Others tell the very same stories as realistic accounts without any magic.

Take Perseus. Please.

The excellent Stone Blind by Natalie Haynes tells the story of Medusa. There’s magic a-plenty. In fact, Perseus himself is little more than a clueless bumbler who wouldn’t last a minute without divine intervention.

The Shadow Of Perseus by Claire Heywood also tells Medusa’s story. But this time there’s no magic whatsoever. The narrative is driven not by gods and goddesses, but by the force of toxic masculinity.

Pat Barker tells the story of the Trojan war in her Women Of Troy series. She keeps it grounded and gritty. When Natalie Haynes tells the same story in A Thousand Ships, the people in it are little more than playthings of the gods.

Then there are the books with just a light touch of the supernatural. While Madeline Miller’s Circe was necessarily imbued with magic, her first novel The Song Of Achilles keeps it mostly under wraps. The supernatural is there, but it doesn’t propel the narrative.

Claire North has a trilogy of books called the Songs of Penelope, retelling the Odyssey from Penelope’s perspective (like Margaret Atwood did in The Penelopiad). On the face of it, these seem to fall on the supernatural side; each book is narrated by a different deity. But the gods are strangely powerless. Everyone believes in them, but they themselves behave in a non-interventionist way. As though they didn’t exist at all.

It makes me wonder what it would be like to have other shared myths retold with or without magic.

How would the Marvel universe look if it were grounded in reality? Can you retell Harry Potter as the goings-on at a cult school for the delusional? What would Star Wars be like without the Force? (although I guess Andor already answers that one)

Anyway, if you’re interested in reading some modern takes on Greek myths, here’s a list of books for you:

Unsaid

I went to the UX Brighton conference yesterday.

The quality of the presentations was really good this year, probably the best yet. Usually there are one or two stand-out speakers (like Tom Kerwin last year), but this year, the standard felt very high to me.

But…

The theme of the conference was UX and “AI”, and I’ve never been more disappointed by what wasn’t said at a conference.

Not a single speaker addressed where the training data for current large language models comes from (it comes from scraping other people’s copyrighted creative works).

Not a single speaker addressed the energy requirements for current large language models (the requirements are absolutely mahoosive—not just for the training, but for each and every query).

My charitable reading of the situation yesterday was that every speaker assumed that someone else would cover those issues.

The less charitable reading is that this was a deliberate decision.

Whenever the issue of ethics came up, it was only ever in relation to how we might use these tools: considering user needs, being transparent, all that good stuff. But never once did the question arise of whether it’s ethical to even use these tools.

In fact, the message was often the opposite: words like “responsibility” and “duty” came up, but only in the admonition that UX designers have a responsibility and duty to use these tools! And if that carrot didn’t work, there’s always the stick of scaring you into using these tools for fear of being left behind and having a machine replace you.

I was left feeling somewhat depressed about the deliberately narrow focus. Maggie’s talk was the only one that dealt with any externalities, looking at how the firehose of slop is blasting away at society. But again, the focus was only ever on how these tools are used or abused; nobody addressed the possibility of deliberately choosing not to use them.

If audience members weren’t yet using generative tools in their daily work, the assumption was that they were lagging behind and it was only a matter of time before they’d get on board the hype train. There was no room for the idea that someone might examine the roots of these tools and make a conscious choice not to fund their development.

There’s a quote by Finnish architect Eliel Saarinen that UX designers like repeating:

Always design a thing by considering it in its next larger context. A chair in a room, a room in a house, a house in an environment, an environment in a city plan.

But none of the speakers at UX Brighton chose to examine the larger context of the tools they were encouraging us to use.

One speaker told us “Be curious!”, but clearly that curiosity should not extend to the foundations of the tools themselves. Ignore what’s behind the curtain. Instead look at all the cool stuff we can do now. Don’t worry about the fact that everything you do with these tools is built on a bedrock of exploitation and environmental harm. We should instead blithely build a new generation of user interfaces on the burial ground of human culture.

Whenever I get into a discussion about these issues, it always seems to come back ’round to whether these tools are actually any good or not. People point to the genuinely useful tasks they can accomplish. But that’s not my issue. There are absolutely smart and efficient ways to use large language models—in some situations, it’s like suddenly having a superpower. But as Molly White puts it:

The benefits, though extant, seem to pale in comparison to the costs.

There are no ethical uses of current large language models.

And if you believe that the ethical issues will somehow be ironed out in future iterations, then that’s all the more reason to stop using the current crop of exploitative large language models.

Anyway, like I said, all the talks at UX Brighton were very good. But I wish just one of them had addressed the underlying questions that any good UX designer should ask: “Where did this data come from? What are the second-order effects of deploying this technology?”

Having a talk on those topics would’ve been nice, but I would’ve settled for having five minutes of one talk, or even one minute. But there was nothing.

There’s one possible explanation for this glaring absence that’s quite depressing to consider. It may be that these topics weren’t covered because there’s an assumption that everybody already knows about them, and frankly, doesn’t care.

To use an outdated movie reference, imagine a raving Charlton Heston shouting that “Soylent Green is people!”, only to be met with indifference. “Everyone knows Soylent Green is people. So what?”

Making the website for Research By The Sea

UX London isn’t the only event from Clearleft coming your way in 2025. There’s a brand new spin-off event dedicated to user research happening in February. It’s called Research By The Sea.

I’m not curating this one, though I will be hosting it. The curation is being carried out most excellently by Benjamin, who has written more about how he’s doing it:

We’ve invited some of the best thinkers and doers from the research space to explore how researchers might respond to today’s most gnarly and pressing problems. They’ll challenge current perspectives, tools, practices and thinking styles, and provide practical steps for getting started today to shape a better tomorrow.

If that sounds like your cup of tea, you should put February 27th 2025 in your calendar and grab yourself a ticket.

Although I’m not involved in curating the line-up for the event, I offered Benjamin my swor… my web dev skillz. I made the website for Research By The Sea and I really enjoyed doing it!

These one-day events are a great chance to have a bit of fun with the website. I wrote about how enjoyable it was making the website for this year’s Patterns Day:

I felt like I was truly designing in the browser. Adjusting spacing, playing around with layout, and all that squishy stuff. Some of the best results came from happy accidents—the way that certain elements behaved at certain screen sizes would lead me into little experiments that yielded interesting results.

I took the same approach with Research By The Sea. I had a design language to work with, based on UX London, but with more of a playful, brighter feel. The idea was that the website (and the event) should feel connected to UX London, while also being its own thing.

I kept the typography of the UX London site more or less intact. The page structure is also very similar. That was my foundation. From there I was free to explore some other directions.

I took the opportunity to explore some new features of CSS. But before I talk about the newer stuff, I want to mention the bits of CSS that I don’t consider new. These are the things that are just the way things are done ‘round here.

Custom properties. They’ve been around for years now, and they’re such a life-saver, especially on a project like this where I’m messing around with type, colour, and spacing. Even on a small site like this, it’s still worth having a section at the start where you define your custom properties.

Logical properties. Again, they’ve been around for years. At this point I’ve trained my brain to use them by default. Now when I see a left, right, width or height in a style sheet, it looks like a bug to me.

Fluid type. It’s kind of a natural extension of responsive design to me. If a website’s typography doesn’t adjust to my viewport, it feels slightly broken. On this project I used Utopia because I wanted different type scales as the viewport increased. On other projects I’ve just used one clamp() declaration on the body element, which can also get the job done.
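The simpler approach can be as little as one declaration (the values here are illustrative):

```css
/* One clamp() on the body: 16px at narrow viewports,
   growing fluidly with the viewport width up to 20px */
body {
  font-size: clamp(1rem, 0.75rem + 1vw, 1.25rem);
}
```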

Okay, so those are the things that feel standard to me. So what could I play around with that was new?

View transitions. So easy! Just point to an element on two different pages and say “Hey, do a magic move!” You can see this in action with the logo as you move from the homepage to, say, the venue page. I’ve also added view transitions to the speaker headshots on the homepage so that when you click through to their full page, you get a nice swoosh.
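For a multi-page site like this one, the pattern really is small. The `.logo` selector here is an assumption, not the site’s actual code:

```css
/* Opt the multi-page site in to cross-document view transitions */
@view-transition {
  navigation: auto;
}

/* Give the logo the same view-transition-name on both pages
   and the browser morphs it from one page to the next */
.logo {
  view-transition-name: logo;
}
```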

Unless, like me, you’re using Firefox. In that case, you won’t see any view transitions. That’s okay. They are very much an enhancement. Speaking of which…

Scroll-driven animations. You’ll only get these in Chromium browsers right now, but again, they’re an enhancement. I’ve got multiple background images—a bunch of cute SVG shapes. I’m using scroll-driven animations to change the background positions and sizes as you scroll. It’s a bit silly, but hopefully kind of cute.
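A scroll-driven background animation can be sketched like this (the keyframe values are illustrative). Note that `animation-timeline` has to come after the `animation` shorthand, which would otherwise reset it:

```css
/* Animate the background, driven by the scroll position
   of the page rather than by time (Chromium-only for now) */
@keyframes drift {
  to {
    background-position: 50% 100%;
    background-size: 150%;
  }
}

body {
  animation: drift linear both;
  animation-timeline: scroll();
}
```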

You might be wondering how I calculated the movements of each background image. Good question. I basically just messed around with the values. I had fun! But imagine what an actually-skilled interaction designer could do.

That brings up an interesting observation about both view transitions and scroll-driven animations: Figma will not help you here. You need to be in a web browser with dev tools popped open. You’ve got to roll up your sleeves and get your hands into the machine. I know that sounds intimidating, but it’s also surprisingly enjoyable and empowering.

Oh, and I made sure to wrap both the view transitions and the scroll-driven animations in a prefers-reduced-motion: no-preference @media query.
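The wrapping looks something like this (the selector and animation name are illustrative):

```css
/* Motion styles only apply for people who haven't opted out */
@media (prefers-reduced-motion: no-preference) {
  .logo {
    view-transition-name: logo;
  }
  body {
    animation: drift linear both;
    animation-timeline: scroll();
  }
}
```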

I’m pleased with how the website turned out. It feels fun. More importantly, it feels fast. There is zero JavaScript. That’s the main reason why it’s very, very performant (and accessible).

Smooth transitions across pages; smooth animations as you scroll: it’s great what you can do with just HTML and CSS.

Announcing UX London 2025

Is it too early to start planning for 2025 already? Perhaps. But you might want to add some dates to your calendar:

June 10th, 11th, and 12th, 2025.

That’s when UX London will return!

It’ll be back in CodeNode. That’s the venue we tried for the first time this year and it worked out really well.

You can look forward to three days of UX talks and workshops:

  1. Tuesday, June 10th is Discovery Day—user research, content strategy, and planning.
  2. Wednesday, June 11th is Design Day—interaction design, accessibility, and interface design.
  3. Thursday, June 12th is Delivery Day—iteration, design ops, and cross-team collaboration.

I realise that the alliteration of discovery, design, and delivery is a little forced but you get the idea. The flow of the event will follow the process of a typical design project.

The best way to experience UX London is to come for all three days, but each day also works as a standalone event.

I’m now starting the process of curating the line-up for each day: a mix of inspiring talks and hands-on workshops. If you trust me, you can get your ticket already at the super early-bird price.

If you reckon you’d be a good addition to the line-up, here’s a form you can fill out.

Now, I’ll be up-front here: if you’re a typical white dude like me, you’re not going to be top of the pile. My priority for UX London is creating a diverse line-up of speakers.

So if you’re not a typical white dude like me and you’ve ever thought about giving a conference talk, fill out that form!

If you don’t fancy speaking, but you want to see your company represented at UX London, check out our sponsorship options.

If you don’t want to speak and you don’t want to sponsor, but you want to be at the best design conference of 2025, get your ticket now.

Archives

Speaking of serendipity, not long after I wrote about making a static archive of The Session for people to download and share, I came across a piece by Alex Chan about using static websites for tiny archives.

The use-case is slightly different—this is about personal archives, like paperwork, screenshots, and bookmarks. But we both came up with the same process:

I’m deliberately going low-scale, low-tech. There’s no web server, no build system, no dependencies, and no JavaScript frameworks.

And we share the same hope:

Because this system has no moving parts, and it’s just files on a disk, I hope it will last a long time.

You should read the whole thing, where Alex describes all the other approaches they took before settling on plain ol’ HTML files in a folder:

HTML is low maintenance, it’s flexible, and it’s not going anywhere. It’s the foundation of the entire web, and pretty much every modern computer has a web browser that can render HTML pages. These files will be usable for a very long time – probably decades, if not more.

I’m enjoying this approach, so I’m going to keep using it. What I particularly like is that the maintenance burden has been essentially zero – once I set up the initial site structure, I haven’t had to do anything to keep it working.

They also talk about digital preservation:

I’d love to see static websites get more use as a preservation tool.

I concur! And it’s particularly interesting for Alex to be making this observation in the context of working with the Flickr Foundation. That’s where they’re experimenting with the concept of a data lifeboat:

What should we do when a digital service sinks?

This is something that George spoke about at the final dConstruct in 2022. You can listen to the talk on the dConstruct archive.

content-visibility in Safari

Earlier this year I wrote about some performance improvements to The Session using the content-visibility property in CSS.

If you say content-visibility: auto you’re telling the browser not to bother calculating the layout and paint for an element until it needs to. But you need to combine it with the contain-intrinsic-block-size property so that the browser knows how much space to leave for the element.
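The pairing looks like this in CSS (the selector and the reserved size are illustrative, not The Session’s actual values):

```css
/* Let the browser skip layout and paint for off-screen items,
   while reserving roughly the right amount of vertical space */
.result {
  content-visibility: auto;
  contain-intrinsic-block-size: 25rem;
}
```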

I mentioned the browser support:

Right now content-visibility is only supported in Chrome and Edge. But that’s okay. This is a progressive enhancement. Adding this CSS has no detrimental effect on the browsers that don’t understand it (and when they do ship support for it, it’ll just start working).

Well, that’s happened! Safari 18 supports content-visibility. I didn’t have to do a thing and it just started working.

But …I think I’ve discovered a little bug in Safari’s implementation.

(I say I think it’s a bug with the browser because, like Jim, I’ve made the mistake in the past of thinking I had discovered a browser bug when in fact it was something caused by a browser extension. And when I say “in the past”, I mean yesterday.)

So here’s the issue: if you apply content-visibility: auto to an element that contains an SVG, and that SVG contains a text element, then Safari never paints that text to the screen.

To see an example, take a look at the fourth setting of Cooley’s reel on The Session archive. There’s a text element with the word “slide” (actually the text is inside a tspan element inside a text element). On Safari, that text never shows up.

I’m using a link to the archive of The Session I created recently rather than the live site because on the live site I’ve removed the content-visibility declaration for Safari until this bug gets resolved.

I’ve also created a reduced test case on Codepen. The only HTML is the element containing the SVGs. The only CSS—apart from the content-visibility stuff—is just a little declaration to push the content below the viewport so you have to scroll it into view (which is when the bug happens).
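The reduced test case boils down to something like this (a minimal sketch of my own, not the exact Codepen markup):

```html
<!-- Pushed below the viewport so the element starts off-screen. -->
<div style="margin-top: 200vh;
            content-visibility: auto;
            contain-intrinsic-block-size: auto 100px;">
  <svg viewBox="0 0 100 40" width="100" height="40">
    <!-- The text that never gets painted in Safari -->
    <text x="10" y="25"><tspan>slide</tspan></text>
  </svg>
</div>
```

Scroll the div into view and the word “slide” should appear, and it does in other browsers; in Safari 18 it never shows up.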

I’ve filed a bug report. I know it’s a fairly niche situation, but there are some other issues with Safari’s implementation of content-visibility so it’s possible that they’re all related.

Docks and home screens

Back in June I documented a bug on macOS in how Spaces (or whatever they call their desktop management thingy now) works with websites added to the dock.

I’m happy to report that after upgrading to Sequoia, the latest version of macOS, the bug has been fixed! Excellent!

Not only that, but there’s another really great little improvement…

Let’s say you’ve installed a website like The Session by adding it to the dock. Now let’s say you get an email in Apple Mail that includes a link to something on The Session. It used to be that clicking on that link would open it in your default web browser. But now clicking on that link opens it in the installed web app!

It’s a lovely little enhancement that makes the installed website truly feel like a native app.

Websites in the dock also support the badging API, which is really nice!
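Setting a badge is a one-liner, though it’s worth feature-detecting first. This little wrapper is my own sketch, not code from The Session:

```javascript
// Set (or clear) the app badge where the Badging API is supported.
// Pass 0 to clear the badge; any positive integer displays it.
function updateBadge(nav, count) {
  if (nav && typeof nav.setAppBadge === 'function') {
    return nav.setAppBadge(count);
  }
  // Graceful no-op in browsers without the Badging API.
  return Promise.resolve();
}

// In a page you'd call it with the real navigator object:
// updateBadge(navigator, 3);
```

Because unsupported browsers just fall through to the no-op, this is safe to call unconditionally — a textbook progressive enhancement.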

Like I said at the time:

I wonder if there’s much point using wrappers like Electron any more? I feel like they were mostly aiming to get that parity with native apps in having a standalone application launched from the dock.

Now all you need is a website.

The biggest issue remains discovery. Unless you already know that it’s possible to add a website to the dock, you’re unlikely to find out about it. That’s why I’ve got a page with installation instructions on The Session.

Still, the discovery possibilities on Apple’s desktop devices are waaaaay better than on Apple’s mobile devices.

Apple are doing such great work on their desktop operating system to make websites first-class citizens. Meanwhile, they’re doing less than nothing on their mobile operating system. For a while there, they literally planned to break all websites added to the homescreen. Fortunately they were forced to back down.

But it’s still so sad to see how Apple are doing everything in their power to prevent people from finding out that you can add websites to your homescreen—despite (or perhaps because of) the fact that push notifications on iOS only work if the website has been added to the home screen!

So while I’m really happy to see the great work being done on installing websites for desktop computers, I remain disgusted by what’s happening on mobile:

At this point I’ve pretty much given up on Apple ever doing anything about this pathetic situation.