Last week I had the privilege of speaking at the United Security Summit. I spoke about risk analysis, comparing the world of automobile traffic engineering to network security risk analysis. In my discussions with attendees after the session, it was clear that two key elements are required for effective and relevant risk analysis in the enterprise environments most of us work in today. For this post, I want to focus on the first key: context.
Full awareness and context of your network topology, and of the security controls that have been put in place, are required to understand your organization's true risk posture. When automobile traffic engineers attempt to solve a congestion problem, they often lay data cables across a section of highway to measure the number of cars, their frequency, and the distance between them. However, they only get raw data for the section of road where they happened to deploy the cable. They don't see the entire 30 miles of the highway, the on-ramps, or the feeder roads leading to those ramps. The data cable does not account for weather, day of week, time, holidays, and so on. Many elements go into understanding the full context of why traffic congests on a given multi-lane highway. Unfortunately, many decisions about resolving congestion are made based on the raw data captured from the deployed cable, and don't account for the full context of what is causing the congestion in the first place. This is why the default response of adding more capacity often doesn't solve the problem.
Much like the traffic engineer, in network security we tend to rely on our own data-cable solution. Most enterprises today, when assessing risk, simply run a vulnerability scanner. In the large enterprise environments we now work in, this can produce a list of thousands of vulnerabilities that need to be addressed. Often, the engineer reviewing these results will simply choose what they think are the important patches to fix and assume they have reduced risk to the organization. Without the full context of the entire network topology and the security controls put in place to govern data flow, the vulnerability results have no frame of reference. If the results flag a SQL vulnerability on a high-value web server as severe, one might assume it needs to be addressed immediately. However, the scan results aren't aware that the firewall cluster fronting the web services prevents SQL traffic from reaching the web server from any internal or external source. So is this truly a severe risk that needs to be addressed immediately? Without full network context, vulnerability scanners by themselves cannot give you an accurate picture of the risk to your network environment.
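To make the idea concrete, here is a minimal sketch of context-aware scoring. The function, field names, and the 0.1 downgrade factor are all illustrative assumptions, not any vendor's actual model; the point is simply that a raw severity score should be tempered when a security control blocks every known attack path to the host.

```python
# Hypothetical sketch: adjusting a raw scanner severity by network context.
# The scoring model and downgrade factor are assumptions for illustration.

def contextual_risk(severity: float, path_blocked: bool) -> float:
    """Downgrade a raw vulnerability severity when every known
    attack path to the host is blocked by a security control."""
    return 0.1 * severity if path_blocked else severity

# A "severe" SQL-injection finding on a web server whose fronting firewall
# blocks SQL traffic from all sources scores far lower in context than the
# same finding on a host with an open path.
findings = [
    {"host": "web-01", "vuln": "SQL injection", "severity": 9.8, "blocked": True},
    {"host": "db-02",  "vuln": "SQL injection", "severity": 9.8, "blocked": False},
]
for f in findings:
    f["contextual"] = contextual_risk(f["severity"], f["blocked"])
```

In this sketch, both hosts look identical on a raw scan report, but only the host with a reachable attack path keeps its severe score once the firewall context is applied.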
In our next post, we'll discuss the importance of speed in network risk analysis.
So you’ve purchased a new firewall. Now what?
You’ve got to decide which access is allowed, which isn’t, and whether your rules comply with internal and regulatory standards.
Things are running along smoothly, and then comes the dreaded “change.” A user submits a new access request and the fun begins. Is this access necessary? Safe? Compliant? And what happens when it’s time to retire unused rules?
How Effective Security Management Can Help Teams Cover the Exponentially Increasing Gap between Technology & the Resources Available to Manage It
Security teams today are under tremendous pressure due to the rising frequency and impact of breaches and a business that wants to move faster and faster. The answer to both of these challenges has always been to add more technology and staff resources.
However, each new technology added creates complexity. More rules are created and more data is generated. As networks continue to evolve, this complexity will only grow. And while staff resources may increase, they will never match the exponential growth of technology.
FireMon calls this phenomenon The Complexity Gap and has set out to help security teams close it.
Join us for this webinar with Frost & Sullivan where we’ll explore the causes of “The Gap” and how workforce multipliers such as intelligence and automation help staff manage their security more efficiently and more effectively.
Helping Enterprise Security Teams Improve Resource Efficiency & Reduce Overall Risk Exposure
Firewall technology has come a long way since its initial, most rudimentary forms. Next-Generation Firewalls (NGFWs) are the latest development, and organizations are accelerating adoption of the new technology. But NGFWs aren’t a fix-all solution.