For the past six months or so, I’ve been working on a really cool website in ASP.NET 2.0. As soon as the last bugs and design tweaks have been dealt with, we are releasing it to the world – we’re almost there.

Yesterday, the webmaster wanted to make some changes to the website himself. He knows a little HTML and has managed sites before, including the old version of the site we’re building. So, I gave him the FTP login information and he connected from Dreamweaver as he is used to. However, Dreamweaver does not understand ASP.NET 2.0 Master Pages and screwed everything up.

He knows Dreamweaver and feels comfortable using it, but he had to switch to Visual Web Developer Express (VWDE) to make it work, and that didn’t please him. We finally connected using VWDE and everything worked fine. Or at least we thought it did…

A new problem that I hadn’t foreseen presented itself in the form of the complexity of the .aspx, .ascx and .cs files. For a person used to managing old ASP pages, this turned out to be a nightmare. With classic ASP it was fairly easy to manage the different #include files and find the text you wanted to change, but in ASP.NET, the text could be located in various places, including in C#/VB.NET code from another assembly or in an embedded resource file.

I would imagine that this issue is shared throughout the community, but it was my first encounter with it.

Back when ASP.NET 1.0 was released, Microsoft said that it was now possible for developers and designers to work together on the same project because of the separation of HTML and the code-behind. In light of the issue mentioned above, I do not see how this is possible unless the designer/webmaster knows his way around assemblies, resource files and C#/VB.NET, which would make him a developer and not a webmaster.

Also, the webmaster uses the VWDE designer, which is something I NEVER do, so I made no effort to support design-time databinding and the like, which makes the page look miserable in the designer.

It is easier for non-developers to manage a classic ASP website, and that’s a shame because, let’s be honest, classic ASP sucked.

The more you separate code into user and server controls, the more difficult it becomes to manage. So, the next question naturally follows: if the website is to be managed by a non-developer, should you avoid the separation?

A solution could be to change the password for the FTP site so the webmaster can’t get access, but that would only lead to extra work for me, so it doesn’t count as a solution.

In light of the immense popularity AJAX has received over the last couple of months, and the emerging tools like Atlas and AJAX.NET, I thought it was the right time to talk about the implications for search engine behaviour on AJAX-enabled websites.

In this post I’ll split websites into two categories – the public and the protected. A public website is accessible to everyone and does not require a login of any kind. It’s the most common type of website out there. A protected website could be an intranet site or a password-protected membership site. In other words, search engines index the public sites and not the protected ones.

When developing a protected site, you can do just about anything without worrying about search engine ranking. When developing a public website, you do not have that kind of liberty. The public site has to be search engine friendly.

So, when AJAX-enabling a public website, you have to make sure to keep the search engines happy at all times. If they aren’t happy with your website, neither should you be.

That’s why I’ve made a quick little list of Do’s and Don’ts for AJAX-enabling your public website without losing the search engines in the process.

Do’s

Do use AJAX for user specific actions
Set cookies, track sessions and log actions as long as the content isn’t dependent on them. Search engines will have no trouble indexing your content.

Do use AJAX to save content
When a user enters information in a form field and hits the save button, you can use AJAX as much as you like. Search engines will never push the save button anyway, so they are unaware of the use of AJAX.

Do use AJAX to do form field validation
When validating form fields, you can use AJAX to validate the input without disturbing the search engines. Search engines do not fill out forms so that won’t be a problem.
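As a minimal sketch of what this might look like: a cheap check that runs entirely in the browser, plus a server round trip via XMLHttpRequest for things only the server knows. The endpoint name `/checkusername.ashx` and the “taken” response are assumptions for the example, not part of any real API.

```javascript
// Pure client-side check; runs instantly, no server involved.
function looksLikeEmail(value) {
  return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(value);
}

// Server-side check via AJAX. Search engines never fill out forms,
// so this call simply never happens for a crawler.
// The URL and response format are hypothetical.
function checkUsernameTaken(name, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/checkusername.ashx?name=" + encodeURIComponent(name), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(xhr.responseText === "taken");
    }
  };
  xhr.send(null);
}
```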

Do use AJAX to display status messages
Displaying status messages of any kind based on user actions is no problem for search engines, because they do not execute the JavaScript needed, and status messages are not important content to index anyway.

Don’ts

Don’t use AJAX for displaying static text content
By static content I mean the main text content of a page, not simple information like the number of currently active sessions. The main text content of a page is the single most important thing for search engines, so never use AJAX for this purpose.

Don’t use AJAX for paging a table or list
If the table is filled with numbers with no search engine relevance, you can skip this point. But if your table or list contains book reviews, chances are you want them indexed correctly. If your paging is AJAX-enabled, the search engines will only index the first page of the table.
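One way out is to render real, crawlable paging links on the server and only layer AJAX on top of them. A sketch of the link-building part, assuming a hypothetical `{page}` URL template – the URL format is my own invention for the example:

```javascript
// Build one plain hyperlink URL per page so crawlers can reach every
// page of the table. AJAX enhancement can then hijack the clicks for
// regular browsers without hurting indexing.
function buildPagerLinks(totalItems, pageSize, urlTemplate) {
  var pages = Math.ceil(totalItems / pageSize);
  var links = [];
  for (var i = 1; i <= pages; i++) {
    links.push(urlTemplate.replace("{page}", i));
  }
  return links;
}
```

Each of these URLs goes into an ordinary `<a href>` in the markup, so the search engine sees every page of reviews even though JavaScript users never do a full postback.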

Don’t use AJAX for navigational purposes
This is not AJAX specific; the same rule applies to plain JavaScript links as well. Search engines don’t follow JavaScript links, so they will get stuck on the entry page and leave again without indexing the rest of your site.
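The usual fix is progressive enhancement: give every link a real `href` that crawlers can follow, and let script take over only when it is actually running. A minimal sketch – `enhanceLink` and the loader callback are names I made up for the example:

```javascript
// The markup keeps a plain, crawlable link:
//   <a id="reviews-link" href="/reviews.aspx">Book reviews</a>
//
// Script then hijacks the click for JavaScript-capable browsers.
function enhanceLink(link, loader) {
  link.onclick = function () {
    loader(link.href);   // fetch and inject the content via AJAX
    return false;        // cancel the normal navigation for JS users
  };
}
```

Crawlers never run the `onclick` handler, so they simply follow `/reviews.aspx` like any other link, while browsers get the AJAX behaviour.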

The list isn’t complete, but I think it covers the basics and will help you avoid the biggest pitfalls.