I was recently made aware of a peculiar bug in BlogEngine.NET that would delete all posts, comments and pages. Now, this specific issue is not new, but it was new to me so I thought I would share it with you. Maybe it’s new to you too.

The scenario is extremely rare, which is why I’ve never come across it before. Here are the steps to reproduce the issue:

  1. Sign in to your BlogEngine.NET installation using Internet Explorer.
  2. Open Microsoft Visio and use its reverse-engineering feature to generate a sitemap of the blog.
  3. All your posts, comments and pages are now deleted.

The reason you need to use Internet Explorer is that Visio and Internet Explorer share the same cookie container behind the scenes. The authentication cookie you got when you signed in with Internet Explorer is still present when you open Visio, so you are effectively still signed in when you use Visio.

Ok, so now you are signed in using Visio, and you start Visio’s crawling feature and point it to your blog address. All the delete-links under each post, comment or page get crawled, and thereby you delete them all.
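To see why a simple crawl is that destructive, consider what the code behind such a delete-link might look like. This is just a sketch of the pattern, not BlogEngine.NET’s actual implementation, and DeletePost is a made-up helper:

[code:c#]

// Hypothetical code-behind for a page that renders delete-links.
protected void Page_Load(object sender, EventArgs e)
{
  // The delete fires on a plain GET request. Any client that follows
  // the link and sends the authentication cookie triggers it; a
  // browser, Visio's crawler, anything.
  int id;
  if (Page.User.Identity.IsAuthenticated &&
      int.TryParse(Request.QueryString["delete"], out id))
  {
    DeletePost(id); // made-up helper that removes the post
  }
}

[/code]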

The protection

It’s very easy to protect against this kind of bug. Just change the delete-links. This is an example of an unprotected link:

[code:html]

<a href="?delete=1234" onclick="return confirm('Are you sure?')">Delete</a>

[/code]

And this is the protected version:

[code:html]

<a href="#" onclick="if (confirm('Are you sure?')) location.href='?delete=1234'; return false;">Delete</a>

[/code]

The difference is that now you can only delete if the client supports JavaScript, which of course Visio doesn’t. Remember that this is only an issue if you are signed in, so it is not something just anybody can do, and that is why I’ve never come across it before. In other words, it is not a dangerous bug at all, and by fixing the links you are really just protecting yourself from yourself.

The point is that if you expose delete-links on your page, make sure they are protected from Visio and other applications that share a cookie container with Internet Explorer.

FYI, this has been corrected in the upcoming 1.2.5 release of BlogEngine.NET due in about a week.

Every application ever built has an architect designing it. That architect could be a software architect, a lead developer or just a regular code monkey, and probably all of them at one point or another. The point is that someone has made a lot of important decisions about how an application is structured. For me, that’s the best part of the project life cycle: the challenge of making qualified decisions that have to withstand time and changes.

For a regular component library, business or data logic layer, service endpoints and so on, this is very challenging, but there is a lot of written material you can turn to for help until you get it down. Dare I mention design patterns? In my experience, the initial design and architecture phase of a component or class library project is highly prioritized by managers and project leads. That’s good, because if the overall architecture is wrong, the project will ultimately fail.

ASP.NET web projects are a little different when it comes to architecture. First of all, the initial architecture doesn’t get prioritized as highly as it needs to be. Secondly, very different rules apply to web projects. They are hard, if not impossible, to unit test without spending an insane amount of time writing click-test scripts. The output also varies from browser to browser, which can cause trouble for even the most experienced web developer and ultimately prolong development.

When the ASP.NET developers start hammering on their keyboards, things get complicated again. For every little feature, decisions have to be made to optimize the implementation. Adding a business object to your business layer is much simpler by comparison: you “just” need to create a class derived from your business base class, add some properties and persist its state to the database. I’m simplifying, but a lot of the time that’s all you need to do if you have a decent base class. That’s because the architecture was done up front.
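To illustrate, here is a minimal sketch of that scenario. BusinessBase and its Save method are hypothetical stand-ins for whatever base class your own business layer provides:

[code:c#]

// Hypothetical base class that gives derived types persistence for free.
public abstract class BusinessBase
{
  public virtual void Save()
  {
    // Map the derived object's properties to the database here.
  }
}

// Adding a new business object is then just a matter of deriving
// and declaring properties.
public class Product : BusinessBase
{
  public int Id { get; set; }
  public string Name { get; set; }
  public decimal Price { get; set; }
}

// Usage: create the object, set its state and let the base class
// handle the database round trip.
Product product = new Product();
product.Name = "Widget";
product.Price = 9.95m;
product.Save();

[/code]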

A new feature in a web project, on the other hand, needs much more architecting by the individual developer. Will you use user controls, server controls or AJAX? If you use AJAX, will it be best to use Prototype with an HttpHandler, ASP.NET AJAX or client callbacks? Will you lazy load your page or control, and what about caching? Will you store state in ViewState, session or the cache? How will you handle exceptions, since you can’t bubble them up the stack, and what about usability and accessibility? Will you use a Repeater or manually write out HTML to a PlaceHolder, and what do you do to mitigate SQL injection and cross-site scripting attacks? These decisions and many more have to be made for every new feature.
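To make one of those decisions concrete, here is a sketch of choosing the application cache over ViewState or session for data that is expensive to load but the same for every visitor. LoadCommentsFromDatabase is a made-up data-layer call:

[code:c#]

// Cache vs. ViewState vs. session: this data is shared and expensive,
// so the application cache is the natural fit here.
private DataTable GetRecentComments()
{
  DataTable comments = Cache["recentComments"] as DataTable;
  if (comments == null)
  {
    comments = LoadCommentsFromDatabase(); // hypothetical data-layer call
    // Shared by all users and refreshed every five minutes.
    Cache.Insert("recentComments", comments, null,
                 DateTime.Now.AddMinutes(5),
                 System.Web.Caching.Cache.NoSlidingExpiration);
  }
  return comments;
}

[/code]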

It seems that ASP.NET architecture is backwards compared to conventional software architecture. You spend fewer hours up front, but for every little feature you have to spend a lot of time architecting. It’s the total opposite of conventional development in that regard, and it leaves the decisions in the hands of the developers instead of the architect. The more time you get from management to architect your ASP.NET project, the more control you’ll have over the end result the developers implement, but in the end it might be enough to write thorough guidelines and accept that a lot of decisions are left with the developers.

I’ve done both conventional architecture and ASP.NET architecture over the years, and my experience tells me that ASP.NET is more prone to diversity in implementation because of all the decisions that are left to the individual developers. Maybe that’s just the nature of web development.