We web developers are a proud people who want to build the best websites every single time. To help us identify areas that could be improved, I’ve compiled a list of various automated and manual checks that can help us make better websites. My professional pride urges me to do the entire list, but I must admit that I sometimes skip some of the checks due to time constraints. I did, however, just finish a project where I did all the checks, and that felt very satisfying.

Validation

Mark-up validation

Choose the right DOCTYPE and make sure that the mark-up you write conforms to it. Whether you choose HTML or XHTML, always go for a strict DOCTYPE, since it forces all browsers to render in standards-compliant mode. Otherwise, you might end up with different renderings. Here is a list of DOCTYPEs to choose from. Remember that real men use XHTML 1.0 Strict or XHTML 1.1.
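For example, the XHTML 1.0 Strict DOCTYPE goes at the very top of the document and looks like this:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
```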

I use the Total Validator or the W3C validator to validate the mark-up against the DOCTYPE.

Stylesheet validation

It’s always a good idea to validate your stylesheets against the W3C standards. You will probably find a lot of small things you can easily fix. This is also to ensure better browser compatibility, but it is not as important as mark-up validation. In some cases you deliberately put hacks in your stylesheets to cope with browser differences and that might break the validation. The W3C CSS validator works for me.

RSS and Atom feed validation

There are literally hundreds of feed aggregators on the market today, and that doesn’t include the custom aggregation done by blog engines, content management systems etc. To ensure they can all read your RSS and Atom feeds, make sure to validate them against the standards.

Accessibility validation

Make sure your mark-up is structured in a way that is readable for everyone. There are two standards widely used today: Section 508, used in the United States, and the WAI guidelines, which are more comprehensive. It is pretty easy to make your site comply with the rules of Section 508, so you should do that as a minimum.

The WAI guidelines are split into three levels: A, AA and AAA, where A is the simplest and AAA the most comprehensive. If your site validates against Section 508, you should easily be able to validate against WAI-A. In Europe, all government websites must conform to WAI-AA. If you are hardcore and stubborn, you should go for a WAI-AAA valid website. For a regular website this is not easy, but it can be done.

You can perform the validation using the Total Validator or Cynthia Says.

Performance

YSlow/Fiddler

Before finishing your website, make sure to check the YSlow grade. It should be as high as possible. Also use Fiddler to analyze the requests and responses made by Internet Explorer. Using those two tools can make your site twice as fast with just minor tweaks.

Use general website analysis

It is a little redundant to use the Website Analyzer after you’ve optimized the page using YSlow and Fiddler, but it almost always gives you inspiration on where to do further performance optimizations.

No errors

Broken links check

This one requires no explanation. Run an automated check for broken links before releasing.

Browser compatibility

Even though you have produced valid mark-up and CSS, there is always a need to go through all the pages of the website to make sure they look good in Internet Explorer 6, Firefox 2, Opera 9, Safari 2.5, Chrome and newer versions thereof. The number of different browsers you need to check varies with the type of website and the target group. This check is unfortunately manual and can take a loooong time.

Add meaning to content

SEO

You don’t need to know an awful lot about search engine optimization to get the basics right. I suggest you at least install the senSEO add-on for Firefox. It will provide you with a lot of useful tips and tricks by analyzing your mark-up and suggesting improvements.

Semantic Extractor

The Semantic Extractor can help you find inconsistencies in the structure of the different tags on your website. It lets you see how the search engine crawlers understand your site.

P3P Policy

If you set cookies, then you need to have a P3P policy HTTP header as a minimum. The header briefly describes how the website behaves with regard to privacy. You might also want to add a more complete privacy statement in a certain XML format. The benefit is that your website will receive elevated privileges under stricter security settings in Internet Explorer. Learn more about P3P. Remember also to validate your P3P policy.
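As a sketch, the compact policy header can be added from Global.asax. Note that the CP tokens below are only example tokens – use a P3P generator to produce the ones that actually describe your privacy policy:

```csharp
// In Global.asax.cs - adds a P3P compact policy header to every response.
// The tokens in CP are placeholders; they must match your real privacy policy.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    System.Web.HttpContext.Current.Response.AddHeader("P3P", "CP=\"CAO PSA OUR\"");
}
```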

PICS-label

The PICS-label is also an HTTP header, but it describes the type of content on your website. The PICS-label is used by various tools built to protect kids on the web, such as NetNanny. Learn more about PICS.

Use semantic mark-up where it makes sense

If you are listing events or contact information, wrap them in meaningful mark-up such as microformats. This adds extra meaning to your content and opens the door to data portability. Here is an introduction to semantic mark-up.
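For instance, contact information can be wrapped in the hCard microformat. The values below are of course made up:

```html
<div class="vcard">
  <span class="fn">John Doe</span>
  <a class="email" href="mailto:john@example.com">john@example.com</a>
  <span class="tel">555-0100</span>
</div>
```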

Go mobile

This is relevant if your website targets mobile clients such as PDAs and smartphones.

MobileOK Checker

Run your website through this online tool and it will give you a lot of good feedback on your code. The W3C made it to promote a set of best practices for mobile web apps.

XHTML Basic 1.1 DOCTYPE

This DOCTYPE can be difficult to code against if your website is targeted both mobile and richer browsers. However, if you are creating a mobile-only website then this DOCTYPE is for you. It’s basically a stripped down version of the XHTML 1.1 DOCTYPE with some mobile specific enhancements.

Input mode

One of the enhancements in the XHTML Basic 1.1 DOCTYPE is the inputmode attribute on textarea and text input fields. It allows you to specify the type of input that is best suited for the field, such as digits, Latin lowercase letters or Greek letters. Devices that understand the attribute will adjust their input mode accordingly.
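A sketch of how the attribute is used – the exact token names are defined in the XHTML Basic 1.1 specification, so double-check them before relying on these:

```html
<input type="text" name="zipcode" inputmode="digits" />
<input type="text" name="username" inputmode="latin lowerCase" />
```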

Checklist for high quality websites part 2

Just about every web project I’ve been involved in has, at one time or another, needed to present some text to the visitor through JavaScript. It could be in an alert box or some other way, and the problem has always been to localize that text into different languages using resource files or satellite assemblies.

There are many ways of localizing the keys, but most of them involve writing out variables on a page with the localized text and then letting the .js include files read those variables. That’s not a good solution. It would be much better if the .js files could be rendered with the localized text directly.
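The typical workaround looks something like this hypothetical sketch: the page emits the localized strings as global variables, and the .js files read them.

```html
<!-- Emitted server-side into every page that needs the string -->
<script type="text/javascript">
  var nameOfPage = 'Sidens navn';
</script>
<!-- messages.js then uses the global: alert(nameOfPage); -->
```

This couples every page to the scripts it includes, which is exactly the coupling the handler below gets rid of.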

At work, I’ve written exactly such a mechanism and here is a cleaned up, plug ‘n play version you can use in your own web project. It’s an HttpHandler that intercepts the requests to the .js files and performs the localization based on regular expressions. Here’s how it works.

Resource files

It doesn’t matter whether you use .resx files or satellite assemblies for localizing your text, because both methods work with the System.Resources.ResourceManager class.

A resource file contains keys and values. The key is always the same, but the value varies for each language. Each language is represented in its own .resx file, and the name of the file decides which language it contains, as shown below.


If no language is specified in the file name, it automatically becomes the default localization, normally English. The three .resx files each contain one line with the key nameOfPage. The value in each file is the localized text that can be referenced by the key.

text.resx: Name of page
text.da.resx: Sidens navn
text.es.resx: Namo de la pago (Sorry, my Spanish is slightly rusty)
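A minimal sketch of how the ResourceManager picks the right value, assuming the .resx files above compile into the resource base name "Resources.text" (the base name depends on your project setup):

```csharp
using System.Globalization;
using System.Resources;
using System.Threading;

public static class TranslateDemo
{
    public static string Lookup()
    {
        // Pretend the visitor's browser asked for Danish.
        Thread.CurrentThread.CurrentUICulture = new CultureInfo("da-DK");

        ResourceManager manager = new ResourceManager(
            "Resources.text", typeof(TranslateDemo).Assembly);

        // Returns the value from text.da.resx ("Sidens navn") and
        // falls back to text.resx if no Danish entry exists.
        return manager.GetString("nameOfPage");
    }
}
```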

The script method

In the .js script files you need a way to specify a certain text as localizable. I’ve chosen to pick a syntax that looks like this:

<script type="text/javascript">
  alert('Translate(nameOfPage)');
</script>

Notice that the Translate method above takes the localizable key as a parameter. This is the syntax the regular expression in the HttpHandler looks for.

The HttpHandler

The HttpHandler does a couple of things. It reads the content of the script file, localizes the text, caches the response and compresses it using HTTP compression. It does all this in about 150 lines of easily readable code.
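To give an idea of what the handler does, here is a simplified sketch of the core logic. This is not the downloadable code itself – caching, compression and error handling are left out, and the resource base name "Resources.text" is an assumption:

```csharp
using System.IO;
using System.Resources;
using System.Text.RegularExpressions;
using System.Web;

public class ScriptTranslator : IHttpHandler
{
    // Matches occurrences of Translate(key) in the script text.
    private static readonly Regex TranslateRegex =
        new Regex(@"Translate\((?<key>[^)]+)\)", RegexOptions.Compiled);

    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // /scripts/messages.js.axd maps to /scripts/messages.js on disk.
        string file = context.Request.PhysicalPath.Replace(".axd", string.Empty);
        string script = File.ReadAllText(file);

        ResourceManager manager = new ResourceManager(
            "Resources.text", typeof(ScriptTranslator).Assembly);

        // Replace each Translate(key) with the localized value for the
        // current UI culture; leave the match untouched if the key is unknown.
        script = TranslateRegex.Replace(script, delegate(Match m)
        {
            string value = manager.GetString(m.Groups["key"].Value);
            return value ?? m.Value;
        });

        context.Response.ContentType = "text/javascript";
        context.Response.Write(script);
    }
}
```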

To make it work, you need to update your script references a little bit. Instead of pointing to /scripts/messages.js you need to append .axd to the end of the reference so it becomes /scripts/messages.js.axd. Then you need to add this line to the web.config so that the new script reference actually works.

<httpHandlers>
  <add verb="*" path="*.js.axd" type="ScriptTranslator" />
</httpHandlers>

If your site isn’t yet localized, you probably need to add the uiCulture attribute to the globalization element in the web.config and set its value to auto.

<globalization uiCulture="auto" />

This will trigger the correct .resx file to be used based on the visiting browser’s language. If your .resx files aren’t called text.resx or you use satellite assemblies, then you need to update the instantiation of the ResourceManager in the TranslateScript method in the handler.

Download code and sample

Download the code below and place ScriptTranslator.cs in your App_Code folder. Then update your web.config with the values found in the zip file. The entire contents of the zip can be opened directly in Visual Studio, so you can easily try it out.

Localization.zip (8,37 kb)