And this time it's more sophisticated than mere traffic overloading.

A lesser-known part of the recent ARP attack against H D Moore's MetaSploit site was an attempted Denial of Service attack that coincided with it.

Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks are almost a mainstay of background Internet traffic and have become an accepted part of hosting content online.

A pattern has been emerging from the background noise over the last few weeks, suggesting that something is underway that is producing an increasing number of successful attacks against moderate to large sites.

In recent weeks, there have been reports of attacks against sites such as IMDB, Television3, CNN, and Radio Free Europe, among others. Rumours have even circulated that Amazon's outage in early June was at least partially due to a DDoS attack, correlating with the attack against IMDB, but this has yet to be confirmed. The odd errors that users encountered when trying to access the Amazon site during the outage seem to point to some sort of network traffic related problem for the site.

Somewhat surprisingly, there were sporadic reports of users recently finding innocuous Google searches returning odd errors - claiming that the user was demonstrating bot-like behaviour (a fairly rare Google response) - only for the errors to disappear on subsequent searches.

If there is a common group or technology driving this recent spate of DDoS attacks, confirmation will probably take some time to emerge. It is just as likely that whoever is responsible simply doesn't want to be found or have their capabilities explored - which itself assumes some level of co-ordination behind the attacks.

It has been some years since basic attacks of this type could take down a major site, but the growing use of botnets and other malicious networks now allows anybody with a grudge and a little money to rent a botnet and aim a significant hose of network traffic at a target for very little effort.

Some attacking networks rely upon people manually activating a piece of software or their own attack tools, or manually visiting a site, to achieve the same aim. A number of Chinese hacking groups have been observed using this particular technique to co-ordinate and manage proposed attacks against targets, as was seen with the recent targeting of CNN. In their day, a slashdotting, digging, or redditing could achieve a similar, though non-malicious, effect - and they still can against poorly designed and implemented websites.

The biggest reason the trend has moved from DoS to DDoS attacks is that, to successfully complete a traditional DoS attack, the attacker required access to bandwidth equivalent to that available to the victim's system. This became problematic as hosting providers were soon able to offer their customers more incoming bandwidth than most attackers had outgoing.

Variations on the simple flood attack (using up all available bandwidth) included abusing different parts of the TCP handshake, leaving the targeted system waiting for connections that would never complete and forcing it to exhaust its pool of available connections - all without the attacker actually pushing enormous amounts of data through the available bandwidth.
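As a rough illustration of this half-open connection exhaustion - not real network code, and with an arbitrary backlog size and timeout chosen purely for the sketch - a toy model might look like:

```python
# Toy model of half-open connection exhaustion (SYN-flood style).
# BACKLOG and SYN_TIMEOUT are illustrative assumptions, not real defaults.

BACKLOG = 128          # server's half-open connection queue size (assumed)
SYN_TIMEOUT = 75.0     # seconds a half-open entry is held (assumed)

class ToyServer:
    def __init__(self, backlog=BACKLOG):
        self.backlog = backlog
        self.half_open = {}            # source address -> time SYN arrived

    def on_syn(self, src, now):
        """Queue a half-open connection; drop the SYN if the queue is full."""
        self._expire(now)
        if len(self.half_open) >= self.backlog:
            return False               # a legitimate client sees this as a drop
        self.half_open[src] = now
        return True

    def _expire(self, now):
        # Entries whose handshake never completed eventually time out.
        self.half_open = {s: t for s, t in self.half_open.items()
                          if now - t < SYN_TIMEOUT}

server = ToyServer()

# Attacker: fill the queue with SYNs from spoofed sources, never ACKing.
for i in range(BACKLOG):
    server.on_syn(f"spoofed-{i}", now=0.0)

# A legitimate client arriving moments later is refused, even though the
# attack consumed almost no bandwidth - only table entries and timers.
print(server.on_syn("legit-client", now=1.0))
```

The point of the sketch is the asymmetry: each tiny SYN pins a server-side resource for the full timeout window, so the queue fills long before any bandwidth limit is approached.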

Other attacks relied upon repeatedly requesting large media files or other content - a simple means of sidestepping the site having more bandwidth than the attacker, by forcing the site to saturate its own outbound bandwidth or exhaust its system resources just to meet the numerous demands.
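A back-of-envelope sketch shows why this asymmetry favours the attacker; every figure below is an illustrative assumption, not a measurement:

```python
# All numbers are illustrative assumptions for the sake of the arithmetic.
request_bytes = 400                    # one small HTTP GET request
response_bytes = 5 * 1024 * 1024       # a 5 MB media file in response

# Each request byte costs the site thousands of response bytes.
amplification = response_bytes / request_bytes
print(f"amplification: {amplification:,.0f}x")

attacker_uplink = 1_000_000 // 8       # a modest 1 Mbit/s uplink, in bytes/s
requests_per_second = attacker_uplink / request_bytes
victim_outbound = requests_per_second * response_bytes   # bytes/s served

print(f"attacker sends ~{requests_per_second:.0f} req/s; "
      f"victim must push ~{victim_outbound * 8 / 1e9:.1f} Gbit/s outbound")
```

Even a single modest connection can, in principle, demand far more outbound capacity from the site than the attacker spends sending requests.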

As defences were created to counter these attack types, attackers began to utilise botnets and social hacking networks to spread their attacks across multiple source points on the Internet. This had two immediate benefits. First, it hid the true origin of an attack unless a defender was an active part of the attack network and able to access or observe its command and control traffic. Second, it made it harder for defenders to isolate attack traffic from legitimate traffic, since the attacker could shape the requests from each attacking IP to match legitimate request rates and make up the difference with the sheer volume of IPs under their control.
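A similar sketch, with assumed per-user rates and botnet size, shows why naive per-source rate filtering struggles against this shaping:

```python
# Illustrative numbers only: what a naive per-IP rate filter sees when
# the attacker shapes each bot's traffic to look like a normal user.

legit_user_rate = 2.0       # req/s a typical user might generate (assumed)
filter_threshold = 10.0     # per-IP req/s at which a naive filter blocks (assumed)

bots = 50_000               # IPs in a rented botnet (assumed)
per_bot_rate = legit_user_rate   # each bot deliberately stays "legitimate"

aggregate_rate = bots * per_bot_rate
looks_legitimate = per_bot_rate < filter_threshold

print(f"per-bot rate: {per_bot_rate} req/s "
      f"(under the filter threshold: {looks_legitimate})")
print(f"aggregate hitting the site: {aggregate_rate:,.0f} req/s")
```

No individual source crosses the filter's threshold, yet the aggregate load is far beyond what the site was provisioned to serve - the defender has nothing per-IP to block.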

To help address this problem, a number of companies have established themselves in a specialised niche protecting sites against DoS/DDoS attacks - from distributed content hosting providers such as Akamai, to service providers that aggressively filter and manage network traffic at the hosting provider level, dropping attack packets while still allowing legitimate users through.

What does the future hold for DDoS attacks and defences? Perhaps we are already observing the first of the next generation of attack techniques. It is difficult to speculate what might succeed the current DDoS types, but if it follows the same evolution that took DoS to DDoS, it will be a distributed attack where requests are not perfectly timed with each other, are varied in content and request type, and do not exceed the threshold that a normal site user would create.

It will be an attack that blends so well with the background noise that it is almost impossible to isolate, identifiable primarily by the excessive traffic spike it causes.