Web Filtering 101
There are a variety of motivating factors for filtering web content, depending on the business at hand. For example, a school may be required under the federal Children’s Internet Protection Act (CIPA) to limit access to obscene content, while a business concerned with data loss may choose to limit access to open storage sites such as Box or Dropbox. Whatever the motivation for pursuing web content filtering, there are several options for architecting a solution, and different environments will favor different ones.
I’ll discuss dedicated proxy deployment options, next-generation firewall (NGFW) inline web filtering, and OpenDNS filtering. Lastly, I’ll discuss SSL decryption at a high level; I may write another blog post solely on that topic, considering the complexities that should be weighed before deployment.
As a 101 topic, I’d first like to set the stage by explaining what web content filtering is. In a nutshell, web filtering is very much as it sounds: the ability to filter which websites can be accessed by the endpoints whose traffic flows through the enforcement point. I explicitly state the latter part of that description to make the point that if a user’s device isn’t being inspected, it obviously can’t be filtered. I see this time and time again when decisions are made to block non-work-related content like social media. The idea is that, if blocked, users will be more focused on work-related tasks. I’ve learned that users will always find a way to Facebook. Typically they’ll reach for their smartphones, bypassing the filter entirely by using an LTE or 3G connection to get the same distracting content.

In the same way the enforcement point can’t filter traffic it can’t see, it also can’t filter traffic it doesn’t understand. In an effort not to get too nerdy, I’ll try to briefly explain the limits of filtering encrypted traffic. Encrypted flows, by design, are not readable in transit. As a result, the granularity of configuration options available for blocking encrypted traffic is limited compared to cleartext (HTTP) traffic, where full URL paths are readable.
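To make that last point concrete, here’s a minimal sketch of what an inline filter can typically read for a given URL. The function name is my own invention for illustration; the underlying point is that with plain HTTP the full URL travels in the clear, while with HTTPS (absent SSL decryption) the filter generally sees only the destination hostname, for example via the TLS SNI field or the preceding DNS lookup:

```python
from urllib.parse import urlparse

def visible_to_filter(url: str) -> str:
    """Illustrative helper: return what an inline filter can typically read.

    Plain HTTP: the full URL, path included, is readable in cleartext.
    HTTPS without decryption: generally only the hostname is exposed.
    """
    parsed = urlparse(url)
    if parsed.scheme == "http":
        return url                   # full URL path readable in cleartext
    return parsed.hostname or ""     # only the hostname is visible
```

So a policy like "block `/games/` but allow the rest of the site" is feasible for HTTP, but for HTTPS the filter can only make an allow-or-block decision on the whole hostname.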
Now that I’ve covered what web filtering does, how do we deploy the solution? The answer, as it always is: “it depends.” Large deployments that prefer a point product will often buy a dedicated appliance to solve their web filtering needs. On the other hand, web filtering capabilities are included as an add-on service on most next-generation firewall platforms. If money is of no concern, I’d recommend the point-product web filter proxy. In my opinion, the purpose-built appliance offers a more robust solution: the categorization of URLs is more accurate and detailed, the caching ability provides better performance, it’s easier to scale since you don’t need to factor in all the other features of an NGFW, and it offers more flexibility in deployment as a dedicated proxy. That being said, the dedicated solution isn’t cheap, and if a customer already has an NGFW, the cost can be difficult to justify. Again, I stand by the statement “it depends.” Decision makers will need to balance the additional cost against the bells and whistles of a point product.
OpenDNS addresses content filtering in a different way. Not all user traffic is required to flow through the solution, only DNS requests. OpenDNS keeps a very large database of website categorization and applies filtering when a DNS request is made. If the website is allowed by policy, the name resolves to its real IP as expected; if it’s blocked, the name is never resolved to the true IP. This model is very effective in combination with an NGFW rule that blocks access to alternate DNS resolvers, which would otherwise offer an easy bypass. The service is also very quick to identify new threats thanks to its metadata analysis of roughly 80 billion (that’s with a “B”) requests daily. The service is licensed per user and managed through a cloud-based web portal.
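The DNS-layer model above can be sketched in a few lines. This is an illustration of the concept only, not OpenDNS’s actual implementation; the domains, categories, and IPs below are made-up example values (the sinkhole address comes from the RFC 5737 documentation range):

```python
# Illustrative sketch of DNS-layer filtering; all values are hypothetical.
CATEGORY_DB = {
    "facebook.com": "social-media",
    "dropbox.com": "file-storage",
    "example.edu": "education",
}
BLOCKED_CATEGORIES = {"social-media", "file-storage"}
SINKHOLE_IP = "203.0.113.1"  # RFC 5737 documentation address, stands in for a block page

def answer_dns_query(domain: str, true_ip: str) -> str:
    """Return the true IP for allowed domains, a sinkhole IP otherwise."""
    category = CATEGORY_DB.get(domain, "uncategorized")
    if category in BLOCKED_CATEGORIES:
        return SINKHOLE_IP  # the client never learns the real address
    return true_ip
```

The key property is that the client never learns the blocked site’s real address, so no further traffic inspection is needed for that flow.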
Each of the aforementioned deployment models offers the same basic functionality. In my experience, initial requirements are pretty simple when deciding on a filtering solution. Every model offers the ability to categorize websites for ease of management; for example, all of them will block pornography without the need to block every pornographic website individually. All of the solutions allow policy-based control of who can get to which category. An example would be allowing HR recruiters to access social media while denying other users. Each solution also allows blacklisting or whitelisting of individual websites. This is where an administrator spends most of their time: classifications are never perfect, and some websites will need to be allowed or denied outside the built-in categories.
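The precedence described above, where individual whitelist/blacklist entries override category policy, might look like this. The policy structure, group names, and site names are hypothetical, chosen only to mirror the HR-recruiting example:

```python
# Hypothetical policy structure for illustration only.
POLICY = {
    "whitelist": {"partner-site.example"},   # individual sites always allowed
    "blacklist": {"bad-site.example"},       # individual sites always blocked
    "blocked_categories": {
        "hr-recruiting": set(),              # HR recruiters may browse social media
        "default": {"social-media"},         # everyone else may not
    },
}

def is_allowed(group: str, site: str, category: str, policy=POLICY) -> bool:
    """Individual overrides win; otherwise fall back to category policy."""
    if site in policy["whitelist"]:
        return True
    if site in policy["blacklist"]:
        return False
    blocked = policy["blocked_categories"].get(
        group, policy["blocked_categories"]["default"]
    )
    return category not in blocked
```

Checking the overrides first is what lets an administrator fix a misclassified site without touching the category database.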
I alluded to SSL decryption earlier. I’ll be upfront about the fact that I’m not the biggest fan of the technology, but I certainly understand why it is becoming a standard tool in the security world. To me, there is something wrong with educating users that the little lock icon in the address bar means safe, encrypted, free-from-prying-eyes surfing, all the while acting as a corporate man-in-the-middle attacker against our own employees. Yeah, yeah, acceptable use policy…I get it. The value of SSL decryption is visibility into encrypted flows, which allows the same protections enforceable on cleartext HTTP traffic. With more and more exploits leveraging encryption, I concede that SSL decryption is a necessary solution in many verticals. Point products and NGFWs both support SSL decryption, but with significant overhead. An SSL decryption strategy should be developed before jumping in head first, and it should include the scope of categories and users planned to require decryption. The results of that exercise will likely alter the sizing of the web filtering solution; as I mentioned, decryption is quite CPU intensive.
In conclusion, what do I suggest for 95% of companies facing web filtering challenges? I personally like an NGFW paired with a DNS filtering solution. I think the point products do a great job, but their cost is hard to justify when compared with my original suggestion. The next question revolves around SSL decryption. Depending on the vertical, it may or may not be needed; that is a judgment call that comes down to cost and the level of risk acceptance. If the decision is made to proceed with SSL decryption, I suggest tackling it during a hardware refresh in order to get the horsepower needed to keep the CPU at bay. Whew…that was a lot of typing. I need a Facebook break.
Jeremy has built his career around protecting assets in the most critical IT sectors.