May 1, 2018

Reading Time: 4 minutes

Even the least tech-savvy individuals will likely have heard of firewalls and understand the purpose they serve in keeping computers safe from being attacked or debilitated by external sources. For a long time, firewalls were reliably effective for the most part, defending against the entry of malicious threats to the machine itself or the network it was a part of. Like any solid Canadian web hosting provider, those of us here at 4GoodHosting don’t need to be convinced of the need for these safeguards. Given the nature of our business, however, we’re also front and centre for seeing how firewalls are no longer the far-reaching and reliable solution they once were.

A new report, The Dirty Secrets of Network Firewalls, found that one in four IT managers could not identify around 70% of their network traffic, and that on average 45% of network traffic goes unidentified. The most crucial finding of the survey, however, was that most firewalls are failing to do their job adequately. Along with this ever-growing lack of visibility into network traffic comes the more threatening reality that these individuals cannot control what they can’t see. 84% of the IT professionals who responded admitted to having real concerns about security due to their lack of visibility into network traffic, and a similar percentage agreed that this lack of application visibility is a serious security concern for businesses and can impact effective network management. The results of such looseness? Ransomware, malware, data breaches and other advanced threats, for starters.

Reasons for Less Reliability

Major increases in the use of encryption, browser emulation, and advanced evasion techniques are the primary factors detracting from a network firewall’s ability to provide sufficient visibility into application traffic. The report also states that organizations spend an average of seven working days each month remediating infected machines. Even small enterprises spent an average of five working days doing the same, while larger ones averaged ten working days to remediate 20 machines per month. That’s a sign of both ineffectiveness and forced inefficiency, to say nothing...
