
Just three years ago, technology headlines were rife with articles stating that the firewall was obsolete. And maybe that prophecy would have come true if we were stuck with the same old firewalls that could only perform simple packet filtering. But even as the way we define network perimeter evolves and blurs, firewalls maintain a prominent position in the enterprise security stack. That’s because every time the firewall seems like it’s about to lose its spot, another innovation emerges to refresh its usefulness and effectiveness.

The prevalence of firewalls is quantifiable: 30 percent of respondents to FireMon’s State of the Firewall 2019 survey reported that they manage 100 or more firewalls, and 95 percent said the firewall will be as critical or more critical than ever in the next five years.

Yet today, as network security risks continue to escalate and hybrid environments grow ever more complex, technology leaders don’t have confidence that they have visibility into their networks. They’ve purchased all kinds of firewalls – internal firewalls, application firewalls, mobile device firewalls, and more – but because they lack a central way to manage them or verify that one firewall’s rules aren’t conflicting with another, they can’t tell if these firewalls are protecting their infrastructures or not.


Why Complexity is Unavoidable

Not very long ago, the standard advice for enterprises seeking security guidance was “Keep it simple.” Certainly, a simple infrastructure is easier to manage than a complex one, but what enterprise can avoid complexity today? Business imperatives require digital transformation, Nth-tier supply chains, and connectivity for workforces that are now mostly remote. That’s the world we live in, and if we were to prioritize a “keep it simple” approach, we’d all be out of business.

Consider the scenario where a DMZ is established to permit access to public servers. The services cannot be limited to only those that are publicly available, because administrators still need to monitor and manage the servers. So management services such as SSH, SNMP, and backup agents are enabled, and each one increases the attack surface.

This could be addressed by using native host firewalls to limit access to services from only known hosts – access control filtering at the host level. But to control access effectively, every host must be individually configured. Standardizing configurations may seem like an easy fix, but it does not reflect the reality of how these devices are often used.
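To make the host-level approach concrete, here is a minimal Python sketch of per-host allowlists for management services. All hostnames, IP addresses, and services are invented for illustration. The point is the per-host state: every box carries its own allowlist, so every change to who may administer the DMZ must touch every box.

```python
# Hypothetical sketch of host-level access-control filtering for DMZ
# management services. Hostnames, IPs, and ports are invented.

MGMT_SERVICES = {"ssh": 22, "snmp": 161, "backup": 10514}

# Each DMZ host carries its own allowlist of admin sources. This
# duplicated, per-host state is what makes the approach hard to scale.
host_allowlists = {
    "web-01": {"ssh": {"10.0.5.10"}, "snmp": {"10.0.5.20"}},
    "web-02": {"ssh": {"10.0.5.10"}, "snmp": {"10.0.5.20"}},
}

def is_allowed(host: str, service: str, src_ip: str) -> bool:
    """Return True if src_ip may reach the given management service on host."""
    allowlist = host_allowlists.get(host, {})
    return src_ip in allowlist.get(service, set())

print(is_allowed("web-01", "ssh", "10.0.5.10"))    # known admin host
print(is_allowed("web-01", "ssh", "203.0.113.7"))  # unknown host
```

Note that adding one new admin workstation means editing the allowlist on every host, which is exactly the operational burden the paragraph above describes.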

What Prevents Network Visibility?

With so many firewalls in the typical enterprise, it’s no surprise that there are many vendors in the mix. Each vendor delivers a product with its own default rules and its own way of structuring those rules. Managing all of them is overwhelming – if each one contains 2000 lines of code, there could be 200,000 lines of code or more to sift through every time there’s a rule change. And rule changes happen all the time.
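As a rough illustration of what a rule conflict looks like in practice, here is a minimal Python sketch of shadowing, where an earlier, broader rule makes a later rule unreachable. The rule data is invented, and real firewall rules also match on ports, protocols, and zones, which this sketch ignores.

```python
import ipaddress

# Hypothetical sketch: detecting shadowed rules. A later rule is
# shadowed when an earlier rule already matches everything it would
# match, so the later rule can never fire. Rule data is invented.

rules = [
    {"id": 1, "src": "10.0.0.0/16", "action": "deny"},
    {"id": 2, "src": "10.0.5.0/24", "action": "allow"},  # never matched
]

def shadowed(rules):
    """Return ids of rules fully covered by an earlier rule's source."""
    hits = []
    for i, later in enumerate(rules):
        later_net = ipaddress.ip_network(later["src"])
        for earlier in rules[:i]:
            if later_net.subnet_of(ipaddress.ip_network(earlier["src"])):
                hits.append(later["id"])
                break
    return hits

print(shadowed(rules))  # → [2]
```

Even this toy check is quadratic in the number of rules; run it across a hundred firewalls with thousands of rules each and the case for tooling makes itself.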

So, is the problem that firewalls are too complicated and disparate? Or is the problem that many organizations don’t have the right resources to manage so much dynamic technology?

Clearly, managing changes at this scale cannot be accomplished with spreadsheets. Most enterprises have two or more teams involved in processing or approving each change request, and even then, they struggle to keep up.

The solution is to implement an automated firewall management solution. That’s pretty straightforward, yet 65 percent of technology managers report that they do not use automation to manage their environments.

Firewall Security Issues

The biggest problems with firewalls are not the technology itself. They are human and situational: misconfigurations and policy complexity.

Misconfigurations are a nice way to say human error. There’s no way to program mistakes out of existence – as long as a person has to key in any information, mistakes are going to happen. And boy, are they happening. In 2018, there was a 424 percent increase in data breaches caused by cloud misconfigurations traced back to human error, and Gartner predicts that human error will be responsible for 99 percent of firewall breaches going forward.

There are a number of reasons humans make mistakes: they’re not trained properly, they’re overworked and unable to focus, or they’re overwhelmed by out-of-control policies and rules. The first two causes are self-explanatory. The last, however, is worth examining.

According to Gartner, “Through 2022, at least 95% of cloud security failures will be the customer’s fault.”

Policies and rules build up over time. Legacy technology is already bloated with them, an acquisition or merger can introduce another load, and then there are the normal additions that accumulate in the daily course of business. Problems arise when the growing pile of policies and rules is not proactively managed. Un-optimized, outdated, and redundant rules can conflict with others and create vulnerabilities that attackers can exploit.

Then there are bad rules. An IT professional is asked to provide access but isn’t given complete information, such as where the order came from or which services, ports, or applications are relevant. They do the best they can with what they have and create an overly permissive rule, intending to fix it later when they have time to do more research. But more rule requests keep coming in, and they never circle back to make the fix. The overly permissive rule is embedded in a policy where it might remain for years – and if a malicious actor finds it, it can provide entry into the network.
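One way a tool can catch the “fix it later” rule is to score each allow rule by how many wildcard fields it contains. The following Python sketch is hypothetical; the rule format and the “any” convention are invented for illustration.

```python
# Hypothetical sketch: flagging overly permissive firewall rules by
# counting wildcard fields. Rule data and field names are invented.

rules = [
    {"id": 101, "src": "10.0.0.0/24", "dst": "10.1.0.5", "port": "443", "action": "allow"},
    {"id": 102, "src": "any", "dst": "any", "port": "any", "action": "allow"},  # the "fix it later" rule
    {"id": 103, "src": "any", "dst": "10.1.0.9", "port": "22",  "action": "deny"},
]

def permissiveness(rule):
    """Count wildcard fields; 3 means any source to any destination on any port."""
    return sum(1 for field in ("src", "dst", "port") if rule[field] == "any")

# Allow rules with two or more wildcards are worth a human review.
flagged = [r["id"] for r in rules
           if r["action"] == "allow" and permissiveness(r) >= 2]
print(flagged)  # → [102]
```

A real product would weigh more dimensions (protocols, zones, hit counts, rule age), but the principle is the same: surface the riskiest rules automatically instead of hoping someone remembers to circle back.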

Best Practice: Implement Security Automation

These problems can all be solved with automation. Automation can reduce time spent on rule management by 90 percent and reduce firewall misconfigurations by 80 percent. It works by constantly monitoring changes to the environment, mitigating common conflicts, and referring only exceptional cases to a human.
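The triage logic just described can be sketched in a few lines of Python. Everything here is hypothetical: the change fields, risk levels, and conflict categories are invented to show the shape of the workflow, not any particular product’s behavior.

```python
# Hypothetical sketch of automated change triage: routine changes are
# auto-approved, known conflict patterns are remediated automatically,
# and everything else is escalated to a human. All names are invented.

KNOWN_FIXES = {"duplicate-rule", "shadowed-rule"}

def process_change(change):
    """Decide how an incoming rule-change request should be handled."""
    if change["risk"] == "low" and not change["conflicts"]:
        return "auto-approved"
    if change["conflicts"] and all(c in KNOWN_FIXES for c in change["conflicts"]):
        return "auto-remediated"
    return "escalated to human review"

changes = [
    {"id": 1, "risk": "low",  "conflicts": []},
    {"id": 2, "risk": "low",  "conflicts": ["duplicate-rule"]},
    {"id": 3, "risk": "high", "conflicts": ["overlapping-nat"]},
]

for ch in changes:
    print(ch["id"], process_change(ch))
```

The design point is the last line of the function: humans only see the cases the automation cannot confidently resolve, which is how the 90 percent time savings becomes plausible.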

But if automation is that good, why aren’t more organizations managing their firewalls with network security automation?

One problem is the sunk cost fallacy. As businesses move into the cloud, they tend to use the same tool sets they were using on-premises. Those tools don’t necessarily translate to the cloud, but they get used anyway because business leaders want a decent return on investment for those purchases. Meanwhile, the costs of change request management and compliance are easy to overlook because they’re buried in man-hours.

The cost of a new automation solution can be a hurdle. People may reason that their staff can do the work manually, so there’s no reason to purchase another product. But adherence to manual processes is a false economy. Not only are the man-hours expensive, but so are the associated risks of downtime and data breaches. Automation can save a business with 200 firewalls $1.7 million each year.

Another reason businesses don’t adopt automation is that they’re simply not aware a solution exists to solve their problem. They assume automation can still only perform repetitive, low-level tasks and don’t know that automation and artificial intelligence are now combined to produce powerful solutions.

Firewalls are Still an Essential Piece of a Strong Security Posture

A firewall is not a silver bullet. There are no silver bullets in security, and an over-reliance on firewalls would be foolish and put the network at severe risk. But using a firewall as a core, central device to limit risk, in concert with other security technologies, is an appropriate implementation.
