Yesterday I wrote this on Twitter and it got a bunch of love:
How you could tell a site was well-crafted:
2002 all-CSS layout
2003 nice URLs
2016 works offline w/ serviceworker
— Adrian Holovaty (@adrianholovaty) April 27, 2016
I’d like to expand on the idea here, given the lack of a 140-character limit.
In web development, often a new technology or technique comes out — one that’s obviously an improvement in how we do things — but it takes years to become mainstream. Developers need to get educated, tools/frameworks need to catch up, and new browser versions need adoption.
This period between a web technology’s creation and its mainstream standardization is quite interesting. If you know what to look for, you can get a sense of how any given website, or company, values web craftsmanship (and, to some extent, their tolerance for bleeding-edge technology). You can also get a taste of the future.
In 18 years as a web developer, I’ve come to love these subtle hints: Oooh, nice URLs on this site. Or: Lovely job making the site responsive on smaller screens. If you’re a web developer or designer, how many times have you resized your browser window while looking at a site you didn’t make, just to admire the responsiveness?
Over time, technology stabilizes and the techniques become expected. 10+ years ago, in the era of .asp, I remember geeking out with Simon Willison about beautiful URL structures we’d seen. No file extensions! Readable! Hackable!
To us, they were signals that a web development team sweated the small stuff. It’s like the famous Steve Jobs story about making the inside of the hardware look just as nice as the outside, even if nobody ever sees it, because you have pride in your work.
With this in mind, I put together a list of these hints, as I remember them. Perhaps it’s of some historical interest, or maybe it’s just fun nostalgia. Almost all of these have become mainstream by now.
(Note: This isn’t to say that a site can’t be well-crafted without these things. It’s just that these things are strong signals that a site is made by craftsmen — people who care about getting the details right. And it’s also my own list, from my own perspective; I’m curious what other people’s “hints” have been over the years!)
1990s: Dynamic websites
Back in the day, we’d manually create HTML files and copy them, one by one, to an FTP server. Server-side includes helped cut redundancy, but it was still quite manual and time-consuming.
When I started seeing URLs like example.com/page.cgi?id=123, I was quite impressed. It was a sign that the site wasn’t just a collection of pages — it was being generated dynamically!
Today, this is completely mainstream, and manually FTPing pages would be like using a rotary phone.
2002: All-CSS layouts
Back in the day, web designers used <table> tags and spacer GIFs to lay out their pages. It worked, and it gave you decent control over layout, but it was ugly and mixed presentation with page semantics.
Though CSS existed at the time, most people (including myself) used it merely to style text. Using it for page layout was complicated, requiring lots of CSS hacks that relied on bugs in browsers’ CSS-parsing engines.
Around this time, I remember having a “Disable CSS” bookmarklet that I’d frequently use on sites I came across. If I clicked the bookmarklet and nothing happened, that meant they used tables for layout. If I clicked the bookmarklet and the page turned into a text-only mish-mash, that meant they were using CSS for layout — awesome!
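A “Disable CSS” bookmarklet along those lines is easy to reconstruct. This is my own sketch, not the exact one I used back then:

```javascript
// Reconstruction of a "Disable CSS" bookmarklet: flip every stylesheet's
// disabled flag. On a table-layout site, nothing visible changes; on a
// CSS-layout site, the design collapses into plain text.
function toggleStylesheets(doc) {
  let toggled = 0;
  for (const sheet of doc.styleSheets) {
    sheet.disabled = !sheet.disabled;
    toggled++;
  }
  return toggled;
}

// As a bookmarklet, the same idea gets squeezed into a javascript: URL:
// javascript:for(const s of document.styleSheets)s.disabled=!s.disabled;void 0
if (typeof document !== 'undefined') {
  toggleStylesheets(document);
}
```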
Today, all-CSS layouts are the norm, no longer a sign that a web design team is ahead of the curve. Unfortunately, <table> tags are still used for layout in HTML emails.
2003: Nice URLs
Back in the day, a website would “advertise” the technology it used, directly in its URLs. With PHP, URLs might look like example.com/page.php?id=234. With ASP, they’d look like example.com/page.aspx?id=234. And don’t get me started on URLs used by the expensive CMS called Vignette. (Commas in URLs?!)
Not only were these URLs ugly, but changing server-side technologies meant breaking all the URLs on your site. “Cool URIs don’t change” was a commonly cited document among web developers of the day who cared about craftsmanship.
Thanks to tools such as .htaccess files and mod_rewrite, people began considering the design of their URL structure. The original approach merely rewrote example.com/page.php?id=123 behind the scenes. Then server-side frameworks came along, with their own URL routing logic, giving you full control.
[It’s safe to say that one of the reasons we started our own Python framework (which became Django), rather than using one of the dozens of existing Python frameworks, was that we couldn’t find one that had pretty URLs.]
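The routing idea behind those frameworks can be sketched in a few lines. Here’s a minimal illustration in JavaScript — the patterns and handler names are my own inventions, not from Django or any other framework:

```javascript
// A minimal URL router sketch: map nice, technology-free URL patterns to
// handlers, so the URL never exposes the underlying implementation.
// All names and patterns here are illustrative.
const routes = [
  {
    pattern: /^\/articles\/(\d{4})\/(\d{2})\/$/,
    handler: (year, month) => `Archive for ${year}-${month}`,
  },
  {
    pattern: /^\/articles\/(\d+)\/$/,
    handler: (id) => `Article #${id}`,
  },
];

function resolve(path) {
  for (const route of routes) {
    const match = route.pattern.exec(path);
    if (match) return route.handler(...match.slice(1));
  }
  return null; // would become a 404 in a real app
}
```

The point is that the URL scheme is designed first, as part of the site’s public face, and the framework maps it to code — rather than the file layout on the server dictating the URLs.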
Over time, more and more developers began designing nice URL schemes. Around 2002 and 2003, it was still a fringe thing. In the mid 2000s it became common knowledge that nice URLs provided a clear SEO benefit — which I imagine is a big reason more and more sites adopted this practice.
Today, URLs are generally good-looking and technology-free, though SEO concerns have made them overly long and spammy in places.
2005: Ajax
Back in the day, if you wanted your page to update with new data, you’d need a page refresh. This, combined with the slower Internet connections of the era, made for especially irritating experiences with sites like MapQuest, which required a page refresh each time you panned or zoomed the map (!!).
XMLHttpRequest didn’t get a lot of adoption until the idea got a better name: Ajax. Jesse James Garrett’s famous blog post was successful because it (1) let people know that this technique existed and (2) gave it a much easier-to-pronounce name.
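The core technique was just a handful of lines: fire off an XMLHttpRequest in the background and update the page when the response arrives. A minimal sketch, in modernized syntax — the URL and element id are made-up examples:

```javascript
// Classic Ajax: fetch an HTML fragment in the background and swap it
// into the page without a reload. (Syntax modernized; the era used
// function expressions, but the API calls are the same.)
function loadFragment(url, elementId) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.onreadystatechange = () => {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(elementId).innerHTML = xhr.responseText;
    }
  };
  xhr.send();
}

// A period-accurate companion trick: append a timestamp so aggressive
// caches don't serve stale responses to GET requests.
function cacheBust(url) {
  return url + (url.includes('?') ? '&' : '?') + '_=' + Date.now();
}

// In a browser: loadFragment(cacheBust('/latest-comments/'), 'comments');
```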
I recall lots of indie sites/blogs started using Ajax for stuff at the time, as it was quite a novel experience for a web user to see stuff change without a page reload. In this era, when I saw a site use Ajax, it was a clear “well-crafted” hint.
Ajax became mainstream thanks to two high-profile projects: Gmail and Google Maps. Though saying this must make me sound like an old fogey to the youngest generation of web developers, I will never forget the first time I experienced Google Maps. The map panned and zoomed without a page reload! To this day, I consider it one of the best jumps forward in web technology.
2009: Custom web fonts
Back in the day, web designers had six choices for fonts. For serif, you could use Times New Roman or Georgia. For sans-serif, it was Verdana, Arial or Trebuchet. For fixed-width, you had Courier.
That’s because there was no way for web pages to embed custom fonts. That didn’t become practical until the late 2000s, when CSS’s @font-face gained broad browser support. Until then, we could only rely on fonts that had a high probability of being installed on users’ computers (the aforementioned six). Oh, and techniques like sIFR that relied on Flash.
When websites were finally capable of using custom fonts, I spent about a year doing wide-eyed double-takes. My eyes were so accustomed to the Big Six fonts — I could not only tell you something was in Verdana, but whether it was 10px Verdana vs. 11px vs. 12px — that introducing such a fundamental change was like magically getting access to a previously unseen set of colors in the rainbow.
During this period, whenever I saw a site use custom fonts, it was a clear “this site is well-crafted” hint. It took a while for this technique to become mainstream. Significant moments for me were when the New York Times and New Yorker changed their headline fonts to match the distinctive look from their print editions.
Today, custom fonts are common, and it’s hard to imagine we were so limited — such a relatively short time ago!
2010: Responsive web design
Back in the day, when designing your site, you’d choose between fluid layout or fixed layout. For fixed layouts, you’d choose a target screen width — say, 1024 pixels — and design everything around that. Given the prevalence of smaller screen resolutions (remember 640x480?), perhaps you’d reserve the right-most column of your design for things that weren’t as important, such as ads, because they’d be cut off, hidden behind a horizontal scrollbar, on smaller screen resolutions.
Fluid layout — where content stretches to fit your browser window, regardless of width — had been around for as long as I can remember, but the classic approach was a bit extreme. This very blog had a fluid design circa 2002, and I remember thinking: “The nice thing about this technique is that people can resize their browser window to achieve whatever their preferred column width for reading is.” In retrospect, it was awfully arrogant to expect people to resize browser windows to find a comfortable width just for this one website.
Responsive web design, where sites are designed for optimal viewing regardless of screen size, started getting traction around 2010. It’s sort of a hybrid of the old-school fluid-vs-fixed approaches, where you carefully consider which approach is best at each screen width.
At some point, I got in the habit of resizing my desktop browser’s window, upon visiting an interesting new site, to see how responsive it was. This was another clue that the developers cared about their craft.
Today, responsive design is quite common, and perhaps even expected.
Timeless hints
Some things never go out of style. None of the following is tied to a particular time or event, but each is a sign a website was made by people who care about their craft:
- Semantic markup
- Following accessibility standards
2016 and beyond
Which brings us to the modern day. What’s a big hint that a site is crafted by forward-looking web developers?
I’d say it’s service workers, the most interesting thing happening in web development. I haven’t been this excited about a new web technology since Ajax.
Back in the day, accessing websites required an Internet connection. If you were offline, or just on a spotty connection, you couldn’t access a page. (In some cases, browsers fall back to cached versions when you’re offline, but that behavior is unreliable.)
It’s clear to me, studying the history, that service workers are the next big “well-crafted” hint. The signs and similarities are all there. Today, they’re used mostly by fringe sites, in an experimental fashion. It’s inevitable that in a short generation they’ll become common and expected. Progressive Web Apps are clearly the future. The tools will catch up, developers will educate themselves and browser support will increase.
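A service worker is just a script the browser installs and runs in the background, sitting between your pages and the network. A minimal cache-first sketch — the cache name and URL list are made up, and in practice this would live in its own sw.js file:

```javascript
// Minimal service worker sketch: precache a few URLs at install time,
// then answer fetches cache-first, falling back to the network.
// The cache name and URL list are illustrative.
const CACHE_NAME = 'site-cache-v1';
const PRECACHE_URLS = ['/', '/styles.css', '/app.js'];

function onInstall(event) {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
  );
}

function onFetch(event) {
  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
}

// Only attach handlers inside an actual service worker context.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('install', onInstall);
  self.addEventListener('fetch', onFetch);
}
```

On the page side, you opt in with navigator.serviceWorker.register('/sw.js'); from then on, the worker intercepts requests and the site keeps working offline.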
For a good example of what service workers make possible, watch this short video demo of my own music-education product Soundslice. When you see it in action, you can’t help but think: “What? That’s possible on the web?”
Seek out the “hints”
As I write and reflect on this, I’m surprised at how far we’ve come.
Web development is often depressing and frustrating. If you want to use bleeding-edge web technologies, you get smacked in the face with buggy or nonexistent browser support, sparse documentation/tutorials, or just risk aversion and lack of buy-in at your organization.
But considering this history of improvements to the web platform, it’s clear we’ve got it so much better today, and things continue to get even more amazing.
Seek out those “well-crafted” hints. They’re inspiring and motivating.