Get to know us better! Gain valuable insights into how we think by visiting our blog, or take a look at the industry events we're frequenting on our events page. You can also geek out with us by attending one of our security management webinars, or dive head first into the products and solutions we provide in our Resource Library. There's lots to keep you busy!
My years of experience managing security programs across a broad spectrum of industries have given me a greater understanding of how technology and people both play a critical role in influencing the overall security posture of any organization.
If Software Defined Networking (SDN) becomes the open ubiquitous technology that I think it will, everything changes.
That sounds dramatic, but I believe that SDN will change many aspects of how we deploy and manage networks. It also creates a completely new paradigm for security enforcement and an opportunity to think differently.
I think it will be amazing for people, for the industry, and for everything we try to do in security. It will power the Internet of Things (IoT) and forever elevate the value of data anytime, anywhere. I see SDN as the next critical step that no one will ever know happened.
When is this amazing change supposed to happen, you ask? It's already started and it will be ongoing for many years to come. It's not something where you can just flick a switch and suddenly it's all there and running; there's still lots of work to do.
But we can flick the switch ahead of time when thinking about how to build an SDN strategy, and ultimately a secure one. To do this, you have to drop all current expectations of the technologies you're running today and think about what SDN is meant to change at all levels.
To get in the right state of mind for this exercise, consider a situation where you've been running a library for many years. It's stacked full of books, magnificent collections for anyone to access and read via a book tracking system that you've spent millions on, essentially putting the Dewey Decimal System online.
Then tragedy strikes one night and the entire collection, along with the building, burns. The insurance money comes in and we are left with a real question. Does it make any sense to rebuild a building full of books, knowing what we already know about technology? Is there still a place for this? Before, due to a long history of value, this option was assumed. But when presented with the chance, or in fact forced, to recreate the library, does the design and deployment of a building of books make any sense?
I ask this question because you have to go into SDN with just that frame of mind. Ask yourself if what you're doing today makes any sense in this new design, then go a step further. Ask yourself what you need to do to empower SDN instead of looking at it from the perspective of how it might work based on how you do things today.
What It Takes
Let's flick that switch now and consider how SDN is evolving the network by walking through an SDN-enabled infrastructure from network to application.
SDN extracts network intelligence directly from switches into a centralized controller. This controller contains all the objects in the environment, from switches to applications, and everything between. The controller can send commands such as put, get, forward, and delete, as well as take in data about the state of any forwarding tables (and that's without getting into the technical details, which is another blog unto itself).
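To make the controller-centric model concrete, here is a minimal sketch of that idea in Python. Everything here is illustrative: the class names, the match/action structure, and the put/get/delete methods are assumptions for the sake of the example, not any real SDN API such as OpenFlow or a specific controller platform.

```python
# Hypothetical sketch: a centralized controller that owns every switch's
# forwarding table and programs it with simple match/action rules.

class FlowRule:
    def __init__(self, match, action):
        self.match = match      # e.g. {"dst": "app-frontend"}
        self.action = action    # e.g. "forward:port-7" or "drop"

class Controller:
    """Holds the state of every forwarding table in the environment."""

    def __init__(self):
        self.tables = {}        # switch name -> list of FlowRule

    def put(self, switch, rule):
        self.tables.setdefault(switch, []).append(rule)

    def get(self, switch):
        return list(self.tables.get(switch, []))

    def delete(self, switch, match):
        self.tables[switch] = [
            r for r in self.tables.get(switch, []) if r.match != match
        ]

ctrl = Controller()
ctrl.put("switch-1", FlowRule({"dst": "app-frontend"}, "forward:port-7"))
```

The point of the sketch is the inversion it shows: the intelligence lives in one place, and switches become simple executors of whatever rules the controller pushes down.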
Consider a network where you can make forwarding decisions based on far more than IP data. I'm talking about simply knowing where the connection needs to be and forwarding it across any infrastructure to any application, against any security controls that you may need. Maybe you rewrite the IP header as it moves across physical connections, but that's not even necessary to consider when working with SDN as the process is abstracted away from us.
Think about what you could do with the power to forward packets based on a myriad of possible scenarios from network to application, and being able to track and protect that flow on demand. Running out of CPU and memory in one datacenter? Send the flow over to another. That one getting tapped out? Push it out into a cloud infrastructure.
New version of your application going online? No problem, as the next flow will be directed to the virtual machine running the new code. Problem with the new code? OK then, the next flow goes back to the previous version of the service, all on demand and orchestrated. I can't wait to see the creative things that people do with this level of programmability and control.
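That on-demand steering between application versions can be sketched in a few lines. This is a toy model under stated assumptions: the backend names, the health-check callable, and the "newest healthy backend wins" policy are all illustrative, not how any particular controller decides.

```python
# Illustrative sketch: direct each new flow to the newest application
# version that is still healthy, falling back to the previous version
# automatically when the new code has a problem.

def choose_backend(backends, healthy):
    """Return the newest backend (last in the list) that passes the check."""
    for backend in reversed(backends):
        if healthy(backend):
            return backend
    raise RuntimeError("no healthy backend available")

backends = ["app-v1", "app-v2"]              # v2 just came online
status = {"app-v1": True, "app-v2": True}

assert choose_backend(backends, status.get) == "app-v2"  # new code serves the flow
status["app-v2"] = False                     # problem with the new code
assert choose_backend(backends, status.get) == "app-v1"  # next flow rolls back
```

Because the decision is made per flow by the orchestration layer, the rollback requires no redeployment: the very next connection simply lands on the previous version.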
The Security Perspective
How is security affected by all of this?
For starters, security is simply abstracted into a service, with policy eventually moving into the orchestration of that service. Don't get me wrong: security policy management remains relevant, but it shifts from a dictated security policy to a monitored one, just not right away. And over time, traditional enterprise security policies will become less relevant. To show you what I mean, we can jump ahead to the concept of a monitored policy as part of this exercise.
Let's say that an application request comes in the form of a network call to a Web service to return data for a custom application, perhaps a new wearable armband health application. The network then checks its table to see where to send the connection, tags it accordingly, and forwards it on. In turn, the controller knows an application request is on its way for this particular service, and most likely already has a server up and running, ready to service it.
Since the controller knows how many clients it can service per virtual machine, with defined CPU and memory, it keeps spinning up new virtual machines and redirecting traffic accordingly. Including security in this process becomes a simple task. There's no need to deploy hardware and create choke points, as security simply becomes another application to the abstracted network.
For example, we can forward the data based on any decision, not just the network setup, and offload a copy for traffic validation; essentially run an on-demand security scan on the same flow and let the controller know if there's a problem. Based on the orchestration decisions, the controller can have the traffic flow quarantined, blocked, redirected or just plain dropped; how and why will be tied to the value and risk of the service.
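A rough sketch of that offload-and-decide loop follows. To be clear about assumptions: the `scan` stand-in, the verdict strings, and the mapping from verdict and risk to an action are all hypothetical, invented to illustrate the pattern of security as just another orchestrated service.

```python
# Hedged sketch: a copy of each flow is offloaded to an on-demand
# traffic-validation service; the controller then chooses an action
# for the flow based on the verdict and the risk of the service.

def scan(flow_copy):
    """Stand-in for an on-demand security scan of the offloaded copy."""
    return "malicious" if b"attack" in flow_copy else "clean"

def orchestrate(flow, risk):
    verdict = scan(flow)          # scan runs on the copy, out of band
    if verdict == "clean":
        return "forward"
    # How aggressively we react is tied to the value and risk of the service.
    return "quarantine" if risk == "high" else "drop"

assert orchestrate(b"normal traffic", risk="high") == "forward"
assert orchestrate(b"attack payload", risk="high") == "quarantine"
assert orchestrate(b"attack payload", risk="low") == "drop"
```

The design choice worth noticing is that the scan never sits inline as a hardware choke point: it consumes a copy of the flow, and only the controller's decision touches the live traffic.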
This is the point where we move from security policy management to security policy monitoring. As applications are defined and brought online, information will be collected on what data is handled by which users and corresponding threat scanning can scale up or down accordingly. It will be this on-demand delivery of security services that will enable rapid scaling of new applications.
While I'm excited about all of these possibilities, I'm fearful of the potential nosedive that could occur if vendors try to create some form of lock-in. SDN as a technology can't be stopped by this and will emerge no matter what; it's just a matter of how long it takes. Being realistic, it's just going to take a few generations of equipment to get there.
However, if we truly enable SDN from networks, along with security, and into the application, many of our current challenges go away. Not to say we won't have new issues to consider, but I'll save that discussion for another time.
We encourage you to share your thoughts, and we look forward to reading your comments. We invite you to subscribe to our blog to keep up with the latest posts of our new series.
So you’ve purchased a new firewall. Now what?
You’ve got to decide which access is allowed, which isn’t, and whether rules are compliant with internal and regulatory standards.
Things are running along smoothly, and then comes the dreaded “change.” A user submits a new access request and the fun begins. Is this access necessary? Safe? Compliant? And what happens when it’s time to retire unused rules?
How Effective Security Management Can Help Teams Cover the Exponentially Increasing Gap between Technology & the Resources Available to Manage It
Security teams today are under tremendous pressure due to the rising frequency and impact of breaches and a business that wants to move faster and faster. The answer to both of these challenges has always been to add more technology and staff resources.
However, each new technology added creates complexity. More rules are created and more data is generated. As networks continue to evolve, this complexity will only grow. And while staff resources may increase, they will never match the exponential growth of technology.
FireMon calls this phenomenon The Complexity Gap and has set out to help security teams close it.
Join us for this webinar with Frost & Sullivan where we’ll explore the causes of “The Gap” and how workforce multipliers such as intelligence and automation help staff manage their security more efficiently and more effectively.
Today, one of the main challenges is preparing security networks not only to face threats, but also to meet compliance requirements. On January 26, the LEY GENERAL DE PROTECCIÓN DE DATOS PERSONALES EN POSESIÓN DE SUJETOS OBLIGADOS (General Law on the Protection of Personal Data Held by Obligated Parties) was published in the Diario Oficial.
Is your network prepared?
Do you have the processes necessary for compliance?
In this digital era, the personal data of our customers and suppliers travels over a network and is stored in a database. By law, this data must be protected through systems and processes. One of the objectives of this law is to establish the conditions for handling personal data and to foster a culture of data protection.
The data protection law is much more than a simple privacy notice; it describes rights and obligations whose breach can be penalized. Attend this webinar to learn more and prepare. We will show you:
In the fall of 2016, we sought the answer to a very simple question: What benefits do users who have a firewall management tool deployed with their firewalls see over nonusers? To find out, we commissioned Forrester Consulting to survey 188 IT security decision makers.
In their study, “Automate Zero Trust Policy & Enforcement,” Forrester Consulting found that organizations with firewall auditing and configuration tools realize more benefits than those without, including:
In this webinar, guest speaker Josh Zelonis, Senior Analyst with Forrester, will review and discuss the results of the study with FireMon CTO Paul Calatayud, who will bring his own experiences and best practices for deploying firewall management tools to improve productivity and reduce risk.
Helping Enterprise Security Teams Improve Resource Efficiency & Reduce Overall Risk Exposure
Firewall technology has come a long way since its initial, most rudimentary forms. Next-Generation Firewalls (NGFW) are the latest development, and organizations are accelerating adoption of the new technology. But NGFWs aren’t a fix-all solution.