Evaluating Intrusion Prevention Systems


With intrusion prevention systems (IPS) fast becoming as essential a purchase as the ubiquitous firewall, the choice grows ever more bewildering as more and more vendors scurry to bring new products to market.

Some of these vendors come from a solid IDS (intrusion detection system) background, while others are essentially hardware manufacturers (of switches or attack-mitigation devices) crossing over into the IPS world. The resulting products are often quite different.

For example, the largely software-based IDS products tend to turn into software-based IPS products running on standard Intel hardware. While their performance can be perfectly adequate, you can never expect them to match dedicated ASIC/FPGA-based hardware devices, which can yield near switch-like latencies and handle a gigabit or more of 64-byte packets without blinking.

On the other hand, the new kids on the block might be able to boast superior performance, but they are often starting from scratch when it comes to signature coverage and resistance to evasion techniques, areas in which the more established IDS/IPS vendors excel.

Of course, these distinctions are disappearing as the market matures, and in the latest round of IPS testing in our labs we noted a much-improved success rate, with more products passing our stringent tests to achieve NSS Approved awards.

Using hardware accelerators, for example, can provide a much-needed performance boost for the software-based products, whilst sheer experience (along with the creation or expansion of an internal security research team) can usually improve signature coverage and quality in the newer products.

Quality v. Quantity

Quality is really the watchword here, rather than quantity. It is possible to throw tens or even hundreds of signatures at a problem when you are not limited by hardware performance, but that does not necessarily mean those signatures are good. A single, well-written signature (or protocol decoder) can often provide more comprehensive coverage of an entire class of exploits than dozens of narrowly focused pattern matches.

It is important, for example, that signatures are written to detect not only the specific exploits currently in the wild, but also the underlying vulnerability of which those exploits take advantage. That way, the next time a new exploit appears riding on the back of that particular vulnerability, it will be detected and blocked immediately, without requiring a signature specific to that piece of exploit code.
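To make the distinction concrete, here is a minimal sketch in plain Python (not any vendor's actual signature language); the shellcode marker and the 256-byte buffer limit are hypothetical:

    # Exploit-specific check: matches bytes unique to one known exploit.
    EXPLOIT_SHELLCODE = b"\x90\x90\x90\xcc"   # hypothetical shellcode marker

    # Vulnerability-based check: the (hypothetical) vulnerable buffer holds
    # 256 bytes, so ANY longer field can trigger the underlying flaw.
    MAX_SAFE_FIELD = 256

    def exploit_signature(payload: bytes) -> bool:
        """Fires only on the one exploit currently in the wild."""
        return EXPLOIT_SHELLCODE in payload

    def vulnerability_signature(field: bytes) -> bool:
        """Fires on any input long enough to overflow the buffer,
        catching future exploits of the same flaw automatically."""
        return len(field) > MAX_SAFE_FIELD

    # A new exploit with different shellcode evades the first check, not the second:
    new_exploit = b"A" * 300
    print(exploit_signature(new_exploit), vulnerability_signature(new_exploit))  # False True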

Similarly, it should not be possible to evade the IPS detection capability by any common means such as URL obfuscation, TCP segmentation, IP fragmentation, and so on.
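URL obfuscation is the simplest of these to illustrate. A signature that matches the literal text of a known attack URL is trivially evaded by percent-encoding or directory self-references, so the IPS must canonicalise each request before matching. A rough Python sketch of that canonicalisation (illustrative only; real engines do this inside their protocol decoders, and the attack URL is hypothetical):

    import posixpath
    from urllib.parse import unquote

    def canonicalise(url_path: str) -> str:
        """Collapse common obfuscations to one canonical form before matching."""
        path = unquote(unquote(url_path))   # undo single and double percent-encoding
        path = path.replace("\\", "/")      # normalise backslash separator tricks
        return posixpath.normpath(path)     # resolve ./ and ../ sequences

    # Every variant below reduces to "/cgi-bin/attack.cgi":
    for variant in ("/cgi-bin/attack.cgi",
                    "/%63gi-bin/attack.cgi",         # percent-encoded 'c'
                    "/cgi-bin/./attack.cgi",         # self-referencing directory
                    "/tmp/../cgi-bin/attack.cgi"):   # directory traversal
        print(canonicalise(variant))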

The quality of the signatures also has a bearing on the susceptibility of the device to raising false-positive alerts. With IDS devices, false positives are a nuisance, but only that. With IPS devices, installed in-line and in blocking mode, a false positive can have a detrimental effect on the user experience, as legitimate traffic is mistakenly dropped.

This is, therefore, a key area to investigate when planning your own trial deployments. All the lab tests in the world cannot tell you how any IPS product is going to perform when subjected to your traffic on your network.

Test, Test, Test …

This is a key point: no matter how much research you do using reports such as the ones we produce, you should never use those reports as the sole basis for your buying decisions. You should always set aside the time, budget, and technical resources to perform a full bake-off in-house between all the vendors on your short-list.

This means deploying all the devices at key points in your network (in-line but in detect-only mode to begin with, to minimize disruption), along with all the necessary management software. And don’t rely on the single-device Web interface if you know you will eventually need the full-blown enterprise management product.

It will never be possible to vet all of the signatures in a vendor’s database, and it is just a waste of effort to try. Independent testing should give you a good idea of the quality and extent of coverage.

It is more important to run your own traffic through the device and monitor the effects. Are you seeing a large number of alerts raised against what you know to be legitimate traffic?

This could point to problems with the signature database, or could highlight where traffic from custom applications in your own organization genuinely resembles exploit traffic. The latter case is easily handled, but large numbers of false positives from clean traffic indicate a potential problem, especially once the device is placed in blocking mode.
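During a detect-only trial it helps to rank the noisiest signatures so each can be checked against traffic you know to be legitimate. A quick sketch, assuming a hypothetical CSV alert export (every product's log format will differ):

    import csv
    from collections import Counter

    def top_alerting_signatures(log_path: str, n: int = 10):
        """Tally alerts by signature across a detect-only evaluation period."""
        counts = Counter()
        with open(log_path, newline="") as f:
            for row in csv.DictReader(f):   # assumed columns: sig_id, sig_name, ...
                counts[(row["sig_id"], row["sig_name"])] += 1
        return counts.most_common(n)

    for (sig_id, name), hits in top_alerting_signatures("ips_alerts.csv"):
        print(f"{hits:6d}  {sig_id}  {name}")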

Performance testing is also important. NSS tests push devices to the extreme, but if you can accurately categorise the make-up of traffic on your own network, you may find that you would be happy with a much lower-performing device at a much more reasonable cost. The impact of latency, meanwhile, can be surprisingly subjective.

A device which we identify as having higher-than-normal latency for internal deployments may well have no noticeable effect when installed at the perimeter of your network. Do some simple user-based testing, such as downloading large files both with and without the IPS in-line, and note the difference.
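Even that test can be scripted in a few lines. A minimal sketch: run it several times with the device in-line and again with it bypassed, then compare the averages (the URL is a placeholder for a large file on your own network):

    import time
    import urllib.request

    TEST_FILE = "http://fileserver.example.com/testfile-100MB.bin"  # placeholder

    def timed_download(url: str) -> float:
        """Stream the file and return elapsed wall-clock seconds."""
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            while resp.read(64 * 1024):   # read in 64 KB chunks until EOF
                pass
        return time.perf_counter() - start

    print(f"Elapsed: {timed_download(TEST_FILE):.2f} s")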

At least part of the evaluation period should also be performed with blocking enabled. It is not unknown for devices which work perfectly well in detect-only mode to fail completely once placed into blocking mode.

While this type of testing could be considered “disruptive,” it is better to discover such a failing before committing to a major purchasing decision.

You can reduce the risk of nasty surprises and major failures during evaluation by short-listing those devices which have achieved NSS Approved status. You can be sure that we have tested these devices extensively in-line in both detect-only and full blocking mode, with a wide range of exploits and evasion techniques, and under a wide range of network loads and traffic conditions.

A thorough bake-off in your own network, however, will allow you to assess more accurately the effect of these devices when subjected to your own traffic, and is likely to create some unique challenges for the vendors taking part.

Bob Walder is director of The NSS Group security testing labs in the south of France. With over 25 years in the industry, he brings broad experience to the testing environment.