Server Timing

Harry wrote a really good article all about the performance measurement Time To First Byte, called Time To First Byte: What It Is and Why It Matters:

While a good TTFB doesn’t necessarily mean you will have a fast website, a bad TTFB almost certainly guarantees a slow one.

Time To First Byte has been the chink in my armour over at thesession.org, especially on the home page. Every time I ran Lighthouse, or some other performance testing tool, I’d get a high score …with some points deducted for taking too long to get that first byte from the server.

Harry’s proposed solution is to set up some Server Timing headers:

With a little bit of extra work spent implementing the Server Timing API, we can begin to measure and surface intricate timings to the front-end, allowing web developers to identify and debug potential bottlenecks previously obscured from view.

I remembered that Drew wrote an excellent article on Smashing Magazine last year called Measuring Performance With Server Timing:

The job of Server Timing is not to help you actually time activity on your server. You’ll need to do the timing yourself using whatever toolset your backend platform makes available to you. Rather, the purpose of Server Timing is to specify how those measurements can be communicated to the browser.
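
For reference, the header itself is just a comma-separated list of metrics, each with an optional duration (in milliseconds) and a human-readable description. The metric names here are purely illustrative:

```http
Server-Timing: db;dur=53.8;desc="Database", tpl;dur=12.2;desc="Templating"
```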

He even provides some PHP code, which I was able to take wholesale and drop into the codebase for thesession.org. That let me put start/stop points in my code to measure how long certain operations were taking, and then output the results of those measurements as Server Timing headers that I could inspect in the “Network” tab of a browser’s dev tools (Chrome is particularly good at displaying Server Timing, so that’s what I used while conducting this experiment).
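
Drew’s article has the actual implementation; this is just a minimal sketch of the shape of the thing in PHP, with hypothetical function names standing in for the real work:

```php
<?php
// Collect named timings as the page is built.
$timings = [];

$start = microtime(true);
$discussions = get_latest_discussions(); // hypothetical database call
$timings['db'] = (microtime(true) - $start) * 1000; // milliseconds

$start = microtime(true);
$html = render_homepage($discussions); // hypothetical templating step
$timings['tpl'] = (microtime(true) - $start) * 1000;

// Output one Server-Timing header before any body content is sent.
$metrics = [];
foreach ($timings as $name => $duration) {
    $metrics[] = sprintf('%s;dur=%0.1f', $name, $duration);
}
header('Server-Timing: ' . implode(', ', $metrics));

echo $html;
```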

I started with overall database requests. Sure enough, that was where most of the time in time-to-first-byte was being spent.

Then I got more granular. I put start/stop points around specific database calls. By doing this, I was able to zero in on which operations were particularly costly. Once I had done that, I had to figure out how to make the database calls go faster.
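
The granular version is the same idea applied per query: wrap each database call in its own timer and give it its own metric name. Here’s a hypothetical sketch using PDO (the metric names and queries are placeholders, not the real schema):

```php
<?php
// Hypothetical helper: time an individual query and record it under its own name.
function timed_query(PDO $db, string $name, string $sql, array &$timings): array
{
    $start = microtime(true);
    $rows = $db->query($sql)->fetchAll(PDO::FETCH_ASSOC);
    $timings[$name] = (microtime(true) - $start) * 1000; // milliseconds

    return $rows;
}

$timings = [];
$tunes  = timed_query($db, 'db-tunes',  'SELECT ...', $timings); // placeholder query
$events = timed_query($db, 'db-events', 'SELECT ...', $timings); // placeholder query
// Each entry in $timings then becomes its own metric in the Server-Timing header.
```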

Spoiler: I did it by adding an extra index on one particular table. It’s almost always indexes, in my experience, that make the biggest difference to database performance.

I don’t know why it took me so long to get around to messing with Server Timing headers. It has paid off in spades. I wish I had done it sooner.

And now thesession.org is positively zipping along!

Related posts

Progressively enhancing maps

How I switched to high-resolution maps on The Session without degrading performance.

Fidinpamp

A small-scale conspiracy theory from the innards of Google.

PageSpeed Insights bookmarklet

With this bookmarklet you’re only ever one click away from the Lighthouse results for a page.

The intersectionality of web performance

Business, sustainability, and inclusivity.

JavaScript

Inside me there are two wolves. They’re both JavaScript.

Related links

Moving on from React, a Year Later

Many interactions are not possible without JavaScript, but that doesn’t mean we should look to write more than we have to. The server doing something useful is a requirement for building an interesting business. The client doing something is often a nice-to-have.

There’s also this:

It’s really fast

One of the arguments for a SPA is that it provides a more reactive customer experience. I think that’s mostly debunked at this point, due to the performance creep and complexity that comes in with a more complicated client-server relationship.

Website Speed Test

Here’s a handy free tool from Calibre that’ll give your website a performance assessment.

Pivoting From React to Native DOM APIs: A Real World Example - The New Stack

One dev team made the shift from React’s “overwhelming VDOM” to modern DOM APIs. They immediately saw speed and interaction improvements.

Yay! But:

…finding developers who know vanilla JavaScript and not just the frameworks was an “unexpected difficulty.”

Boo!

Also, if you have a similar story to tell about going cold turkey on React, you should share it with Richard:

If you or your company has also transitioned away from React and into a more web-native, HTML-first approach, please tag me on Mastodon or Threads. We’d love to share further case studies of these modern, dare I say post-React, approaches.

Faster Connectivity !== Faster Websites - Jim Nielsen’s Blog

The bar to overriding browser defaults should be way higher than it is.

Amen!

Standing still - a performance tinker | Trys Mudford

What Trys describes here mirrors my experience too—it really is worth occasionally taking a little time to catch the low-hanging fruit of your site’s web performance (and accessibility):

I’ve shaved nearly half a megabyte off the page size and improved the accessibility along the way. Not bad for an evening of tinkering.

Previously on this day

7 years ago I wrote Flexibility

Web design is a two-way street. And that’s okay.

10 years ago I wrote dConstruct 2015 podcast: Ingrid Burrington

Time travel, terminators, and network infrastructure.

11 years ago I wrote Responding

A pair of responsive design events in London.

12 years ago I wrote August in America, day seven

Philadelphia, Pennsylvania.

13 years ago I wrote Laboratory conditions

The testlab setup.

22 years ago I wrote Reflections

There’s a new picture by Jessica up at The Mirror Project.

23 years ago I wrote Tales of the Plush Cthulhu

"After vigintillions of years plush Cthulhu was loose once more, and ravening for delight. How He slavered and gibbered! And the stuffed animals fled or went mad at the sight of Him."

23 years ago I wrote The Spiders

From E-Sheep comes one of the best comics I’ve read on or off the internet: The Spiders.