If you run websites that serve end users, you are likely to encounter DoS attacks of some kind, to some extent. As modern society relies more and more on the internet, the consequences of DoS attacks have become severe. DoS attacks can cause extended network outages, preventing end users from reaching the affected websites. Some attackers also use DoS probes to gauge the inner workings of a target system and then devise more lethal attacks.

The definition of a DoS attack is as follows (from Wikipedia):

“A denial-of-service attack (also, DoS attack) is an attack on a computer system or network that causes a loss of service to users, typically the loss of network connectivity and services by consuming the bandwidth of the victim network or overloading the computational resources of the victim system.”

Unlike a computer virus, a DoS attack does not typically inject malicious code into the victim's system, which makes it harder to detect and to prevent. A great deal of research has been done to cope with DoS attacks, yet most out-of-the-box operating systems and HTTP servers remain inherently vulnerable to them. The best first line of defense is to design your website and infrastructure to minimize the risk in the first place.

  • Close unnecessary ports. This is the simplest method. Historically, many DoS attacks were carried out by large numbers of compromised PCs (zombies) performing "ping" floods. Closing all unused ports prevents this class of attack.
  • Caching. Even the simplest caching scheme (cache for a predefined duration, without varying by any parameters) prevents the same page from being regenerated by the server on every request, and can significantly increase the number of simultaneous requests a server can handle. This is especially useful for pages that involve heavy server-side processing (e.g. database operations).
  • Paging. Similar to caching, paging can significantly reduce the size of working sets and thus allow more pages to be served over the same bandwidth.
  • Force human interactions. Pages such as search results that consume significant server resources can further benefit from requiring human interaction. An example can be found at Marketleap Search Engine Marketing Tools: in order to use a resource-intensive service, the user must enter a string shown in a dynamically generated image. This prevents the vast majority of automated exploits, since recognizing the randomly generated images programmatically is difficult and resource intensive.
  • Load balancing. For large corporate web servers, a correctly designed web farm can dynamically allocate computing resources across multiple servers and thus minimize the risk of any single server being taken down by an attack.
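To make the caching point concrete, here is a minimal sketch of time-based caching in Python. The decorator name `timed_cache` and the `front_page` renderer are illustrative, not from any particular framework, and the sketch assumes a single-threaded server and a page that does not vary by request parameters:

```python
import time

def timed_cache(ttl_seconds):
    """Cache a zero-argument page renderer for a fixed duration.

    Minimal sketch: one cached value, no parameter variation,
    no locking (single-threaded use assumed).
    """
    def decorator(render):
        state = {"value": None, "expires": 0.0}
        def wrapper():
            now = time.monotonic()
            if now >= state["expires"]:
                # Cache miss or expired: do the expensive work once.
                state["value"] = render()
                state["expires"] = now + ttl_seconds
            return state["value"]
        return wrapper
    return decorator

render_count = 0

@timed_cache(ttl_seconds=60)
def front_page():
    """Stand-in for an expensive page build (e.g. database queries)."""
    global render_count
    render_count += 1
    return "<html>...expensive page...</html>"

front_page()
front_page()  # second request is served from the cache
```

Even this crude scheme means an attacker hammering the front page only costs the server one expensive render per minute; the rest is served from memory.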
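The paging idea reduces to slicing the result set before it leaves the server. A hypothetical helper (the name `page` and the 20-item default are assumptions for illustration):

```python
def page(results, page_number, page_size=20):
    """Return one page of results (1-indexed) instead of the full set.

    Sending 20 rows per response instead of the whole result set
    shrinks each response and the server's working set.
    """
    start = (page_number - 1) * page_size
    return results[start:start + page_size]

all_results = list(range(100))   # stand-in for a 100-row query result
first = page(all_results, 1)     # rows 0..19
second = page(all_results, 2)    # rows 20..39
```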
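The load-balancing bullet can be sketched with the simplest dispatch policy, round-robin. Real web farms use dedicated balancers with health checks and weighting; the backend hostnames below are placeholders:

```python
import itertools

# Hypothetical backend pool; in practice this would come from
# the farm's configuration and be filtered by health checks.
backends = ["web1.example.com", "web2.example.com", "web3.example.com"]
_rotation = itertools.cycle(backends)

def route_request(request_path):
    """Pick the next backend in round-robin order for this request,
    spreading load so no single server absorbs a whole attack."""
    return next(_rotation)

targets = [route_request("/index") for _ in range(6)]
```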

By combining the methods above, the raw network traffic reaching the servers can be minimized, fending off all but the most dedicated attacks.
