Faith in Firewalls?

Even the least tech-savvy individuals will likely have heard of firewalls and understand the purpose they serve: keeping computers from being attacked or debilitated by external threats. For a long time, firewalls were reliably effective for the most part, blocking malicious traffic before it could reach the machine itself or the network it was part of.

Like any solid Canadian web hosting provider, those of us here at 4GoodHosting don’t need to be convinced of the need for these safeguards. Given the nature of our business, however, we also have a front-row view of how firewalls are no longer the far-reaching, reliable solution they once were.

A new report called the Dirty Secrets of Network Firewalls found that one in four IT managers could not identify around 70% of their network traffic, and that on average 45% of network traffic goes unidentified. The most troubling finding of the survey, however, was that most firewalls are failing to do their job adequately. With this growing lack of visibility into network traffic comes a more threatening reality: administrators cannot control what they cannot see.

84% of the IT professionals who responded admitted to real concerns about security stemming from this lack of visibility into network traffic, agreeing that poor application visibility is a serious security concern for businesses and one that can undermine effective network management. The cost of that kind of looseness? Ransomware, malware, data breaches and other advanced threats, for starters.

Reasons for Less Reliability

Major increases in the use of encryption, browser emulation, and advanced evasion techniques are the primary factors eroding a network firewall’s visibility into application traffic.
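
As a rough, purely illustrative sketch of the visibility problem (the port map and flow records below are invented, not taken from the report): a firewall that classifies traffic only by ports and packet headers has little to go on once payloads are encrypted, so encrypted flows land in an "unidentified" bucket regardless of which application generated them.

```python
# Toy example: port/header-based classification loses meaning once traffic is encrypted.
# The port map and flow records are invented for illustration.

PORT_MAP = {80: "http", 53: "dns", 25: "smtp"}

flows = [
    {"dst_port": 53,  "encrypted": False},  # plain DNS: identifiable by port
    {"dst_port": 443, "encrypted": True},   # could be SaaS, video, or malware traffic...
    {"dst_port": 443, "encrypted": True},   # ...a port-based rule cannot tell them apart
]

def classify(flow):
    """Classify a flow by destination port; an encrypted payload offers no further clue."""
    if flow["encrypted"]:
        return "unidentified (encrypted application traffic)"
    return PORT_MAP.get(flow["dst_port"], "unidentified")

for f in flows:
    print(f["dst_port"], "->", classify(f))
```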

The report also states that organizations spend an average of seven working days each month remediating infected machines. Even small enterprises spent an average of five working days on the task, while larger ones averaged ten working days to remediate 20 machines per month. That’s a sign of both ineffectiveness and forced inefficiency, to say nothing of the productivity squandered elsewhere.

The organizations polled were all looking for an integrated network and endpoint security solution that would put an end to these threats. 99% of IT managers wanted firewall technology that automatically isolates infected computers, while 79% simply wanted their current firewall to serve them better. 97% of respondents also expected firewall protection from the same vendor as their endpoint protection, allowing direct sharing of security status information.
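
To make the idea of automatic isolation concrete, here is a minimal sketch assuming a Linux gateway that filters traffic with iptables; the isolate_host() helper and the alert record are hypothetical stand-ins for whatever interface a real firewall or endpoint product exposes.

```python
# Minimal sketch of automatically isolating an infected host, assuming a Linux
# gateway that forwards traffic through iptables. Names here are hypothetical,
# not a vendor API; root privileges are needed to actually modify firewall rules.

import subprocess

def isolate_host(ip_address: str) -> None:
    """Drop all forwarded traffic to and from a host flagged as infected."""
    for direction in ("-s", "-d"):  # block the host as both source and destination
        subprocess.run(
            ["iptables", "-I", "FORWARD", direction, ip_address, "-j", "DROP"],
            check=True,
        )

# Example: an endpoint agent reports an infection and the gateway reacts.
alert = {"host": "10.0.0.42", "verdict": "ransomware"}
if alert["verdict"] != "clean":
    isolate_host(alert["host"])
    print(f"Isolated {alert['host']} pending remediation.")
```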

Lack of Visibility Into Network Traffic

Beyond the severity of these security risks, the lack of visibility is also a major operational concern. 52% of IT managers reported that a lack of network visibility had negative implications for business productivity, primarily because they could not prioritize bandwidth for critical applications. Industries that rely on custom software to meet specific business needs are finding that the inability to prioritize these mission-critical applications over less important traffic is becoming quite costly. Underlining the point, 50% of the respondents who had invested in custom applications were unable to identify that traffic, and the insufficiently informed decisions that follow have significantly impacted their return on investment.
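
As a simplified, assumed example of why identification and prioritization go hand in hand (the application names and priority classes below are invented): traffic the firewall cannot identify falls into a default best-effort class, so an unrecognized mission-critical application is treated no better than bulk traffic.

```python
# Toy example: unidentified flows drop to the lowest-priority class.
# Application names and class numbers are invented for illustration.

PRIORITY = {"voip": 0, "custom-erp": 1, "web": 2}  # lower number = higher priority
DEFAULT_PRIORITY = 3                               # best effort for unknown traffic

flows = ["voip", "custom-erp", "unidentified", "web", "unidentified"]

for app in sorted(flows, key=lambda a: PRIORITY.get(a, DEFAULT_PRIORITY)):
    print(f"{app:>14} -> class {PRIORITY.get(app, DEFAULT_PRIORITY)}")
```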

The survey also found that:

  • An average of 45% of network traffic was going unidentified, and therefore could not be controlled.
  • 84% of organizations are concerned about security.
  • 53% of organizations are concerned about productivity.
  • 79% of IT pros wish for better protection than their current firewall provides.
  • Organizations dealt with 10-20 infections per month.

It’s unlikely that anyone ever saw the first incarnations of firewalls as offering any sort of ‘set it and forget it’ promise, but it’s clear now that a big-picture review of their design, function and, most importantly, real-world results is very much due. It shouldn’t take much for this to become a priority issue in the web and digital security world.
