Tips for designing bare-bones pages

Written by Adrian Holovaty on September 10, 2002

At ajc.com, I've been designing a bare-bones home page in case something terrible happens tomorrow (on the one-year anniversary of the Sept. 11, 2001, U.S. terror attacks) and our site is flooded with traffic. If there were ever a perfect case for using a CSS-based layout, this'd be it. A few pointers on making such a page:

  • Separate all content from presentation using a style sheet. When millions of people are hitting your home page frequently, you don't want them to have to download lousy FONT tags and TABLEs over and over again. Don't embed formatting in the page. Rather, design it so users only have to download straight content. My opinion is: When it comes down to it, pretty background colors and fonts don't matter when there's a huge breaking news story -- but if you're intent on formatting content, use an external style sheet. That way, users will only have to download it once, saving bandwidth for your site and your users. (Subsequent page views will use the cached style sheet.)
  • Use correct structural markup. For example, use HTML tags that give meaning to your document structure -- like H1/H2/H3 for headlines, UL for lists, P for paragraphs, etc. If you do this, your page will be accessible to PDAs and alternate Internet devices. And I'm betting that in the time of an emergency, many people will use whatever means they can (cellphones, PDAs, etc.) to get news from your Web site. Make it accessible to them. (More information on correct structure is available at Dan's Web tips, Web Design Group and HTMLSource.)
  • Keep the page size as low as possible. This tip goes without saying, but last Sept. 11 when I viewed the source of many news sites' stripped-down, supposedly "fast-loading" home pages, I wasn't too impressed. There's a difference between keeping the size of visible content low and keeping the size of code low. A few tips: Delete all newline characters (i.e., line breaks). Do this right before you upload, and save a local copy of the file for yourself with readable code. Trim unnecessary quotes from HTML attributes (unless you're using XHTML, in which case they're required). Don't close tags that don't need to be closed. (I hesitate to say this, because it goes completely against proper coding technique, but desperate times call for desperate measures if your site gets lots of traffic...) And strip out all unnecessary gunk, like META tags and JavaScript calls.
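As a rough illustration of the first two tips, a bare-bones page might look something like this — the markup carries only structure, and any formatting lives in a single cacheable external file (the file name emergency.css is just a placeholder):

```html
<!-- Bare-bones page: structural tags only, no FONT tags or layout TABLEs -->
<html>
<head>
<title>Breaking News</title>
<!-- One external style sheet; browsers cache it after the first request -->
<link rel="stylesheet" type="text/css" href="emergency.css">
</head>
<body>
<h1>Breaking News</h1>
<p>Lead paragraph of the top story goes here.</p>
<h2>Latest updates</h2>
<ul>
<li>First update</li>
<li>Second update</li>
</ul>
</body>
</html>
```

Because it relies only on H1/H2, P and UL for meaning, the page stays readable on a PDA or in a screen reader even if the style sheet never loads.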
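And a sketch of the third tip's desperate-measures treatment applied to the same markup — newlines stripped, optional attribute quotes dropped, and optional end tags omitted (legal in HTML 4, though not in XHTML):

```html
<html><head><title>Breaking News</title><link rel=stylesheet type=text/css href=emergency.css></head><body><h1>Breaking News</h1><p>Lead paragraph of the top story goes here.<h2>Latest updates</h2><ul><li>First update<li>Second update</ul>
```

In HTML 4, attribute values made up only of letters, digits, hyphens and periods may be left unquoted, and end tags like </p> and </li> are optional. Keep the readable version on your own machine and minify only the copy you upload.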

Any other tips? Leave a comment.

Comments

Posted by Wohleber on September 10, 2002, at 11 p.m.:

Just today I gave my sermon on CSS and the separation of content and presentation, but I never thought to tie it in to 9/11.

I did mention accessibility for blind users as one reason for maintaining a logical document structure, and that proper use of headline levels would be particularly helpful for people navigating a page with a screen reader. But I was wondering to myself, "Is that actually true? What do I know about how blind people use the web?" So I was happy to come across your Aug. 27 Interview with blind Internet user.

Posted by Jay Small on September 11, 2002, at 7:24 p.m.:

As always, excellent points. I'll add a couple of things.

First, sites that, on normal days, front-load their home pages with hundreds of links (rather than creating a layered, well-organized site architecture) will always have the hardest time directing a crush of site traffic for breaking news.

Even if they slim down the code of their home pages, and whittle down the home page content and number of things they link to, they may well have an inadequate "next layer down" of index pages to help people find their way to all those things NOT directly related to the breaking news.

Second, sites where the content management architecture is set to dynamically build most pages may suffer disproportionately under a crush of traffic to oft-changing breaking news. Even with a good caching scheme, if you're updating stories rapidly the benefits of the cache on server performance may be compromised.

The alternatives are: a CMS that writes cache files on the fly every time articles are published, so that few or no pages are dynamically served; or an emergency system for publishing static content (with the assumptions that you'll add it later to the DB for archiving, and you'll publish it in a place where it can be permalinked).

Posted by Nathan on September 11, 2002, at 9 p.m.:

The other idea I've been kicking around for a while is trying to figure out some way to cache unchanging elements of the page -- like the left rail, headers and footers -- within the user's browser. It might not help very much in extreme situations (I wonder if Adrian's bare-bones ajc.com page includes the site's left rail at all), but on any normal day it's still ridiculous that I have to download the output of (on most sites) a bunch of server-side includes on every single page. How about "client-side includes"?

I think this could either be done with JavaScript or with IFRAME tags. The problem is the former would be inaccessible to people without JavaScript and the latter may require setting a width and a height on the embedded "frame." Does anyone know of a way this could actually work?
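For the JavaScript route, here's one possible sketch (the file name leftrail.js is hypothetical): each page references a small external script, which the browser caches like any other file, and the script writes the shared rail into the page as it loads:

```html
<!-- In every page: the browser fetches leftrail.js once, then reuses the cached copy -->
<script type="text/javascript" src="leftrail.js"></script>

<!-- Contents of leftrail.js: writes the shared navigation into the page -->
<!--
document.write('<ul>');
document.write('<li><a href="/news/">News</a><\/li>');
document.write('<li><a href="/sports/">Sports<\/a><\/li>');
document.write('<\/ul>');
-->
```

The obvious downside is the one noted above: visitors with JavaScript turned off see no navigation at all.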

Also, in the department of "unnecessary gunk," I want to add HTML comments to the list, although I agree that bloated but irrelevant META keywords are the worst offenders.

Posted by Adrian on September 12, 2002, at 6:44 p.m.:

Nathan: You left out an obvious solution -- normal frames. Of course, those should be avoided at all costs. :-)

There was some talk about a client-side include on one of the W3C lists a while back. Otherwise, I can't think of any other solution besides the ones you mentioned.

Posted by Wohleber on September 12, 2002, at 8:43 p.m.:

Nathan, Adrian: What about using JavaScript for standing content, with a server-side include in a <noscript> tag for those without JavaScript? Seems a bit messy, and means a longer download for those more likely to have a slow connection, but it should work.

Posted by Adrian on September 12, 2002, at 9:46 p.m.:

Unfortunately, the contents of the <noscript> tag are transferred, along with the rest of the document, regardless of whether a user's browser supports JavaScript -- so I think that might not solve the cache/bandwidth problem.

Posted by Wohleber on September 13, 2002, at 4 a.m.:

Doh! The server would process the SSI whether it was in noscript tags or not, so that would make for even more code to download, and perhaps a fractional increase in response time for the server to parse the file.

Comments have been turned off for this page.