Denial of Service (DoS) attacks are an increasingly common way to bring websites and servers down. They are easy to mount and hard to protect against, which is why they are so popular. The only thing you can do to prevent such an attack is to block the response to the attacker. You have no control over the incoming requests, so you have to catch the attacker as early as possible after the request has been received by the web server.

There are two challenges to blocking the attacks:

  • Identify the attackers
  • Block the response only to the attackers

To catch the request as early as possible, an HttpModule is the right place. It executes before any page or any other handler, so the impact on the server is kept to a minimum. This HttpModule monitors all requests and blocks requests coming from IP addresses that make many requests in a short period of time. After a while, the attacking IP address is released from the block.

The module is a high-performance, lightweight protection against DoS attacks and is very easy to implement.
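
The downloadable file below contains the actual implementation. Purely to illustrate the technique, a minimal sketch could look like the following; the class, field, and method names are assumptions for the example, not the contents of DosAttackModule.cs:

using System;
using System.Collections.Generic;
using System.Threading;
using System.Web;

// Illustrative sketch only: count requests per IP address, ban an address
// that exceeds the threshold, and release all bans on a fixed interval.
public class DosAttackModuleSketch : IHttpModule
{
    private const int BannedRequests = 10;              // max requests per reduction interval
    private const int ReductionInterval = 1000;         // decrement the counters every second
    private const int ReleaseInterval = 5 * 60 * 1000;  // release banned addresses after 5 minutes

    private static readonly Dictionary<string, int> _requests = new Dictionary<string, int>();
    private static readonly List<string> _banned = new List<string>();
    private static readonly object _syncRoot = new object();
    private static Timer _reductionTimer;
    private static Timer _releaseTimer;

    public void Init(HttpApplication application)
    {
        application.BeginRequest += OnBeginRequest;
        lock (_syncRoot)
        {
            if (_reductionTimer == null)
            {
                _reductionTimer = new Timer(ReduceCounts, null, ReductionInterval, ReductionInterval);
                _releaseTimer = new Timer(ReleaseBans, null, ReleaseInterval, ReleaseInterval);
            }
        }
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        string ip = HttpContext.Current.Request.UserHostAddress;
        lock (_syncRoot)
        {
            if (_banned.Contains(ip))
            {
                HttpContext.Current.Response.StatusCode = 403;
                HttpContext.Current.Response.End(); // throws, so nothing below executes
            }

            int count;
            _requests.TryGetValue(ip, out count);
            _requests[ip] = ++count;

            if (count > BannedRequests)
            {
                _banned.Add(ip);
                _requests.Remove(ip);
            }
        }
    }

    // Runs every second: lower each counter and drop addresses that reach
    // zero, so only active (or banned) IPs stay in memory.
    private static void ReduceCounts(object state)
    {
        lock (_syncRoot)
        {
            List<string> keys = new List<string>(_requests.Keys);
            foreach (string key in keys)
            {
                _requests[key]--;
                if (_requests[key] <= 0)
                    _requests.Remove(key);
            }
        }
    }

    // Runs every five minutes: release every banned address.
    private static void ReleaseBans(object state)
    {
        lock (_syncRoot)
        {
            _banned.Clear();
        }
    }

    public void Dispose() { }
}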

Implementation

Download the DosAttackModule.cs file below and put it into the App_Code folder of your website. Then add the following lines to the web.config’s <system.web> section:

<httpModules>
  <add type="DosAttackModule" name="DosAttackModule" />
</httpModules>
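
For reference, if the site runs under IIS 7's integrated pipeline (discussed in the comments below), the registration would instead go in the <system.webServer> section; this is the general IIS 7 convention, not something specific to this download:

<system.webServer>
  <modules>
    <add name="DosAttackModule" type="DosAttackModule" />
  </modules>
</system.webServer>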

Download

DosAttackModule.zip (1,13 KB)

Comments

 Dave Frank

private const int BANNED_REQUESTS = 10;
private const int REDUCTION_INTERVAL = 1000; // 1 second
private const int RELEASE_INTERVAL = 5 * 60 * 1000; // 5 minutes

I see that the threshold is 10 requests in one second, which is certainly more than a normal human would make, but is it more than, say, Google would make? The last thing you want to do is block Google or another search engine from spidering your site, so either we need to know what settings are reasonable, or we need some sort of whitelist for search engines (though I don't even know how that would be possible, considering the overwhelming number of spiders and spider IPs out there that we would have to match against). Also, I'm wondering what the performance impact would be on a high-volume site - granted, you're only storing 5 minutes' worth of IP addresses ....

NinjaCross

I think too that this is a good point to consider. Maybe Mads is using this value only for sample purposes, because I don't think an absolute value can be established for all conditions. More realistically, a suitable value should be calculated for each specific case. Think, for example, of all those SOAP/non-SOAP service-oriented web sites that are built to handle tens/hundreds of calls from a restricted pool of callers... they probably shouldn't block those IPs even at much more than 10 calls.

Mads Kristensen

I understand your questions, so here is some explanation. The threshold is 10 requests per second, but that can easily be adjusted to suit the individual site, as NinjaCross points out. Google and other search engines have a delay of 1 second per request, so that should not be a problem at all. If we are lucky, spam spiders are not so kind as to delay their requests, so they might get blocked. That means no search engine whitelist is needed. The only IP addresses stored for 5 minutes are the banned ones, and only them. All other IPs are removed from memory within 10 seconds at most.

 Dave Frank

OK, so what threshold would prevent a DoS attack yet not limit access to, say, an AOL proxy server...

Mads Kristensen

Good question. I have no idea. Maybe a whitelist is the way to go for AOL customers. There is probably an IP range that belongs to those proxies, so that would be fairly easy. As NinjaCross wrote, the threshold is something different for every website/service.
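
For what it's worth, a range-based whitelist along those lines could be sketched like this; the CIDR range used is the reserved documentation block 192.0.2.0/24, a placeholder rather than a real AOL range, and the check would run before the counting logic:

using System.Net;
using System.Net.Sockets;

// Sketch of an IPv4 range whitelist. The range below is a placeholder.
public static class ProxyWhitelist
{
    private static readonly string[] Cidrs = { "192.0.2.0/24" };

    public static bool IsWhitelisted(string address)
    {
        IPAddress ip;
        if (!IPAddress.TryParse(address, out ip) ||
            ip.AddressFamily != AddressFamily.InterNetwork)
            return false;

        foreach (string cidr in Cidrs)
        {
            string[] parts = cidr.Split('/');
            uint network = ToUInt32(IPAddress.Parse(parts[0]));
            int prefix = int.Parse(parts[1]);
            uint mask = prefix == 0 ? 0u : uint.MaxValue << (32 - prefix);

            if ((ToUInt32(ip) & mask) == (network & mask))
                return true;
        }
        return false;
    }

    private static uint ToUInt32(IPAddress ip)
    {
        byte[] b = ip.GetAddressBytes(); // network byte order (big-endian)
        return (uint)(b[0] << 24 | b[1] << 16 | b[2] << 8 | b[3]);
    }
}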

 Oskar Austegard

Shouldn't a return; statement be added inside the if block to prevent CheckIpAddress(ip) from being executed on a banned IP? HttpContext.Current.Response.End won't kill the running of the method, will it (I mean, won't it just continue running after the response has been sent back)?

private void context_BeginRequest(object sender, EventArgs e)
{
    string ip = HttpContext.Current.Request.UserHostAddress;
    if (_Banned.Contains(ip))
    {
        HttpContext.Current.Response.StatusCode = 403;
        HttpContext.Current.Response.End();
    }
    CheckIpAddress(ip);
}

Mads Kristensen

Oskar, Response.End() will stop the execution of the rest of the method.
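
For background, Response.End() stops execution by throwing a ThreadAbortException, which is what aborts the rest of the method. If that exception is unwanted, one alternative (a sketch, not what the module does) is to short-circuit the pipeline instead:

HttpApplication application = (HttpApplication)sender; // the event's sender is the application
application.Context.Response.StatusCode = 403;
application.CompleteRequest(); // skips the remaining pipeline events without throwing
return;                        // unlike Response.End(), we must leave the method ourselves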

Mads F

Great idea, Mads. Although the requests will reach the application, they will be terminated early in the request pipeline. I'll be looking forward to testing this on IIS7 - as I understand it, it's possible to add HTTP modules to IIS7 directly, and therefore prevent the attacker's requests from ever reaching the application. I'm thinking that a configuration section would probably be a good idea, so it would be easy to configure the threshold and some kind of whitelist. Have there been any studies on DoS attacks? I for one have no idea how many requests a typical attack makes per IP.

Mads Kristensen

I don't know of any studies on DoS attacks, but I would love to see a more detailed pattern in how they operate. You can easily add a configuration section to the module if you'd like.
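
One simple way to do that, sketched below under the assumption that plain appSettings entries are acceptable (the key names are invented for the example), is to read the thresholds from web.config with fallbacks to the current defaults:

using System.Configuration;

// Sketch: read the module's thresholds from <appSettings>, falling back
// to the original hard-coded defaults. The key names are invented here.
internal static class DosAttackSettings
{
    public static readonly int BannedRequests = Read("DosAttack.BannedRequests", 10);
    public static readonly int ReductionInterval = Read("DosAttack.ReductionInterval", 1000);
    public static readonly int ReleaseInterval = Read("DosAttack.ReleaseInterval", 5 * 60 * 1000);

    private static int Read(string key, int defaultValue)
    {
        string raw = ConfigurationManager.AppSettings[key];
        int value;
        return int.TryParse(raw, out value) ? value : defaultValue;
    }
}

The matching entries would then go under <appSettings> in web.config, e.g. <add key="DosAttack.BannedRequests" value="10" />.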

 Shiva

Your code is not thread safe, and that's a crime when it comes to web applications. There are numerous ways in which it'll throw exceptions due to race conditions, thereby failing the user request.

NinjaCross

Unfortunately Shiva is right; in fact the module throws an InvalidOperationException in mscorlib.dll on each timer tick. This is a big problem that makes the module useless, because even if those exceptions seem not to interfere (AFAIK) with normal page execution, the presence of those log records in the Output Console makes debugging hard. Mads, could you please check the code and fix this problem? It would be very appreciated :) (BTW, this blog's comment form still requires a double send to work correctly :S)

Markus R.

Lock _Banned and _IpAdresses to get rid of the InvalidOperationException, and copy the keys into a string array first:

string[] tmpKeys = new string[_IpAdresses.Keys.Count];
int i = 0;
foreach (string key in _IpAdresses.Keys)
{
    tmpKeys[i++] = key;
}

foreach (string key in tmpKeys)
{
    _IpAdresses[key]--;
    if (_IpAdresses[key] == 0)
        _IpAdresses.Remove(key);
}

// TODO: performance hit of using lock and copying the keys
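
Putting Shiva's and Markus's points together, a thread-safe version of the timer callback might be sketched like this, assuming _IpAdresses is the module's dictionary of per-IP request counts and that the same lock object also wraps the collection access in context_BeginRequest (the method name here is a placeholder):

private static readonly object _syncRoot = new object();

// Timer callback, sketched with locking. Copying the keys first avoids
// the InvalidOperationException thrown when a collection is modified
// while it is being enumerated.
private void TimerElapsed(object state)
{
    lock (_syncRoot)
    {
        string[] tmpKeys = new string[_IpAdresses.Keys.Count];
        _IpAdresses.Keys.CopyTo(tmpKeys, 0);

        foreach (string key in tmpKeys)
        {
            _IpAdresses[key]--;
            if (_IpAdresses[key] == 0)
                _IpAdresses.Remove(key);
        }
    }
}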

Josh Stodola

Mads, how can we make this code thread safe? How can we avoid the InvalidOperationException? I plan to use this to help block trackback spam, except I will be using the URL parameter as the identifier instead of the IP address. In my logs, I will have 20 or 30 consecutive trackback attempts, all from the same URL but with different IP addresses. My trackback handler is extremely similar to the one you wrote for BlogEngine.NET, and that is quite a performance hit on the server. Do you think using this module would perform better (overall) than going through all of those trackback spam prevention techniques over and over?

Andrey Leybovich

Hi Mads, why not use Context.Cache for storing banned IPs? That way you don't need to care about thread safety (the Cache is thread-safe), and you don't need to explicitly use a Timer - you can set an expiration on an object when you add it to the cache. I'm thinking of implementing such a module; do you think there might be any problems with using Context.Cache here?
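
Sketched out, that idea might look like the following; HttpRuntime.Cache is the same cache that Context.Cache exposes, and the key prefix is invented for the example:

using System;
using System.Web;
using System.Web.Caching;

public static class BanList
{
    // Ban an IP by inserting a marker into the ASP.NET cache with an
    // absolute expiration; the cache removes it (releasing the ban) itself.
    public static void Ban(string ip, TimeSpan duration)
    {
        HttpRuntime.Cache.Insert(
            "dos-ban:" + ip,              // hypothetical key prefix
            true,                         // the value is irrelevant; presence = banned
            null,                         // no cache dependency
            DateTime.Now.Add(duration),   // absolute expiration = automatic release
            Cache.NoSlidingExpiration);
    }

    public static bool IsBanned(string ip)
    {
        return HttpRuntime.Cache["dos-ban:" + ip] != null;
    }
}

One caveat: ASP.NET may evict cache entries early under memory pressure, so a ban is not guaranteed to last the full interval, which is probably acceptable here.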
