Article first published on March 16th 2017 in Information Security Buzz.
An interview with Nicolas Bouthors, Chief Technical Officer at Enea Qosmos, about the key role of Deep Packet Inspection (DPI) in strengthening cybersecurity solutions with application-level traffic visibility and detailed metadata.
Erik Larsson, VP Marketing, Qosmos division of Enea (EL): There is a lot of talk about Deep Packet Inspection in the cybersecurity market. What is it all about?
Nicolas Bouthors (NB): Many developers of security products need to understand network traffic in detail. The typical example is a next-generation firewall, which must have built-in real-time traffic visibility to do its job of blocking and alerting. Other solutions such as malware protection, data loss prevention and threat intelligence platforms also require detailed traffic visibility in order to be effective. DPI provides the required insight.
EL: Exactly what type of visibility do you get with DPI?
NB: DPI-based software classifies traffic flows and extracts additional information in the form of metadata, which gives security systems detailed visibility of all packets traversing a network, up to the application layer.
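To make this concrete, here is a minimal sketch of what per-flow DPI output might look like. The record shape, field names and the toy `classify()` function are hypothetical illustrations, not the Qosmos API; a real DPI engine classifies flows using signatures, heuristics and statistical analysis across hundreds of protocols.

```python
# Hypothetical sketch of DPI output: record shape and classify() are
# illustrative only, not a real DPI engine's API.
from dataclasses import dataclass, field

@dataclass
class FlowMetadata:
    """Example of the per-flow output a DPI engine might expose."""
    five_tuple: tuple          # (src_ip, src_port, dst_ip, dst_port, proto)
    l7_protocol: str           # e.g. "HTTP", "TLS", "DNS"
    application: str           # e.g. a web service identified from the flow
    attributes: dict = field(default_factory=dict)  # extracted metadata

def classify(packets) -> FlowMetadata:
    """Toy classifier: inspects payload bytes to label the flow."""
    payload = b"".join(packets)
    if payload.startswith(b"GET ") or payload.startswith(b"POST "):
        host = b""
        for line in payload.split(b"\r\n"):
            if line.lower().startswith(b"host:"):
                host = line.split(b":", 1)[1].strip()
        return FlowMetadata(
            five_tuple=("10.0.0.1", 52100, "93.184.216.34", 80, "TCP"),
            l7_protocol="HTTP",
            application=host.decode() or "unknown",
            attributes={"http.host": host.decode()},
        )
    return FlowMetadata(("?",) * 5, "unknown", "unknown")

meta = classify([b"GET /index.html HTTP/1.1\r\nHost: www.example.com\r\n\r\n"])
print(meta.l7_protocol, meta.attributes["http.host"])  # expected: HTTP www.example.com
```

The point is that the output goes beyond port numbers: the consuming security system receives the Layer 7 protocol, the application and extracted attributes for every flow.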
EL: This need for visibility into traffic flows is confirmed by SDxCentral’s 2016 Next-Gen Infrastructure Security Report, which shows that “lack of visibility” is cited by 49% of respondents as the top security challenge.
EL: So how do developers embed this visibility inside their products?
NB: Some security vendors who have been around for a while have built their own internal teams specializing in technology such as DPI. A more modern approach is to leverage a DPI engine from a specialist like Qosmos; this is why we get an increasing number of enquiries from security innovators and startups. For these vendors, using off-the-shelf, industrial-strength DPI software means that they can spend more resources on developing their complete solution and less on enabling technologies such as DPI, not to mention the constant updates required to keep identifying new protocols and applications accurately. They get more focus and shorter time to market.
EL: What is the impact of virtualization on cybersecurity, in particular inside data centers?
NB: Data centers are typically protected using perimeter security technologies such as firewalls and IDS/IPS. These products focus on north-south traffic, in and out of the data center. While they are very effective at protecting the perimeter, they are not built for securing east-west traffic within the data center. This is becoming an issue since east-west traffic can represent five times the amount of north-south traffic, because of all the traffic generated by web, application, and database servers. This means that if malware penetrates the outer security perimeter, it can launch further attacks inside a vulnerable data center.
EL: What’s the solution?
NB: One way is to use a technique called “micro-segmentation” to divide the data center into smaller zones, which can be protected separately. The advantage is that in case of a breach, the damage can quickly be contained to a small number of compromised devices. This new approach requires a real-time association between applications and security policies. Therefore, east-west traffic between VMs must be analyzed in real-time, up to Layer 7, the application layer. A solution is to integrate a DPI engine inside the hypervisor, providing visibility all the way up to Layer 7 and correlating traffic with policy so that the vSwitch can strengthen access control rules between VMs based on application traffic.
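The zone-pair policy check described above can be sketched as follows. The zone names, policy table and `is_allowed()` function are hypothetical illustrations, not a real vSwitch or hypervisor API; they only show how a DPI-supplied application label feeds an east-west access decision.

```python
# Hypothetical micro-segmentation sketch: zone names, the policy table
# and is_allowed() are illustrative, not a real vSwitch API.
POLICY = {
    # (source zone, destination zone): set of permitted L7 applications
    ("web", "app"): {"HTTP", "HTTPS"},
    ("app", "db"):  {"MySQL"},
}

def is_allowed(src_zone: str, dst_zone: str, l7_app: str) -> bool:
    """Allow east-west traffic only if the DPI-classified application
    matches the policy for this zone pair; default-deny otherwise."""
    return l7_app in POLICY.get((src_zone, dst_zone), set())

print(is_allowed("web", "app", "HTTP"))   # expected: True
print(is_allowed("web", "db", "MySQL"))   # expected: False (no web-to-db rule)
```

Default-deny is the essential property here: a compromised web-tier VM cannot open a database connection directly, so a breach stays contained within its zone.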
EL: SDxCentral’s Next-Gen Infrastructure Security Report shows the rate of adoption for the different security technologies. Which of these use embedded DPI?
NB: Several of these security systems use firewalling technology, and for them, the challenge is to effectively classify traffic and manage rulesets based on application identification. A DPI engine classifies all flows so that developers can implement effective security policy management.
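A ruleset keyed on DPI application identification might look like the sketch below. The rule table and `apply_ruleset()` are hypothetical, shown only to illustrate the idea of managing firewall rules by application name rather than by port.

```python
# Hypothetical application-aware ruleset sketch; names are illustrative.
RULES = [
    # (application pattern, action), evaluated top-down, first match wins
    ("BitTorrent", "block"),
    ("Dropbox",    "alert"),
    ("*",          "allow"),   # default rule
]

def apply_ruleset(application: str) -> str:
    """Return the action for a DPI-classified application."""
    for pattern, action in RULES:
        if pattern == "*" or pattern == application:
            return action
    return "block"   # implicit deny if no rule matches

print(apply_ruleset("BitTorrent"))  # expected: block
print(apply_ruleset("Office365"))   # expected: allow
```

Because DPI identifies the application regardless of the port it runs over, rules like these keep working even when traffic is tunneled over port 80 or 443.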
EL: What about more sophisticated solutions such as malware protection, data loss prevention and forensics?
NB: To prevent breaches, these products need to dig deep into the payload of network traffic, extract full content and reconstruct selected parts of the traffic. This enables the solution to expose file movements at the network level to track potential malware and data exfiltration. A good DPI engine will extract raw traffic content and metadata to reconstruct emails, attached files, images, videos, transferred files (uploaded or downloaded via FTP, HTTP, Dropbox), websites, etc.
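At its core, reconstructing a transferred file means reassembling payload segments that may arrive out of order. The toy `reassemble()` below is a hypothetical simplification; a production DPI engine must additionally handle retransmissions, overlapping segments, many transport and application protocols, and encoding layers.

```python
# Toy reassembly sketch (hypothetical): orders segments by their byte
# offset in the stream and joins them, trimming simple overlaps.
def reassemble(segments):
    """segments: list of (seq_offset, payload_bytes), possibly out of order."""
    ordered = sorted(segments, key=lambda s: s[0])
    stream = bytearray()
    for offset, payload in ordered:
        # keep only bytes beyond what we already have (handles overlap)
        if offset < len(stream):
            payload = payload[len(stream) - offset:]
        stream += payload
    return bytes(stream)

# Out-of-order segments of a small "file" transfer:
parts = [(6, b"world!"), (0, b"Hello "), (12, b" (reassembled)")]
print(reassemble(parts))  # expected: b'Hello world! (reassembled)'
```

Once the byte stream is rebuilt, the engine can hand the reconstructed file to a malware scanner or a data loss prevention policy check.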
EL: What about Security Information and Event Management (SIEM) and security analytics?
NB: SIEM solutions and other security analytics products rely on network visibility to provide a timeline around actors and actions. This timeline is then used to model and analyze behavior. Netflow and IDS events have been used to accomplish this, but the results are not always satisfactory: Netflow lacks the L7 protocol depth, and IDS logs and events tend to focus on alerts instead of actions. DPI software provides protocol classification throughout network sessions to identify, track and categorize all network activity down to the most minute detail. This improves the accuracy of machine-learning-based detection: searching is more fine-grained, and alerts are more accurate, with fewer false positives.
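The enrichment step described here can be sketched as merging DPI-derived Layer 7 fields into a NetFlow-style record before it reaches the SIEM. All field names below are hypothetical, chosen only to illustrate the shape of an enriched event.

```python
# Hypothetical enrichment sketch: field names are illustrative, not a
# real NetFlow/IPFIX or SIEM schema.
netflow_record = {
    "src": "10.0.0.5", "dst": "172.16.1.9",
    "dst_port": 443, "bytes": 18432,
}

dpi_metadata = {
    "l7_protocol": "TLS",
    "application": "Office365",
    "tls.sni": "outlook.office365.com",
}

def enrich(flow: dict, dpi: dict) -> dict:
    """Merge L7 metadata into the flow record; DPI fields are namespaced
    so analytics queries can filter on them directly."""
    enriched = dict(flow)
    enriched.update({f"dpi.{k}": v for k, v in dpi.items()})
    return enriched

event = enrich(netflow_record, dpi_metadata)
print(event["dpi.application"])  # expected: Office365
```

With enriched events, an analyst (or a machine-learning model) can distinguish "18 KB of TLS to port 443" from "18 KB to a specific cloud service", which is what makes behavioral baselining and alerting more precise.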
EL: So how do you see the evolution of DPI in cybersecurity?
NB: The need for detailed traffic visibility is here to stay: we see new use cases each year, and the move to virtualized architectures, SDN and NFV amplifies this trend!
The full report, SDxCentral 2016 Next-Gen Infrastructure Security Report, is available for download from SDxCentral.