A year ago this month Napatech predicted that 2016 would be the year of the 100G network. So we thought now might be a good time to check in with the company to see what it’s been up to and what trends the company is spotting now.
Napatech is the leading provider of Smarter Data Delivery solutions. What does this mean?
For network availability and security solution providers, Napatech provides the right data, at the right time, to the right place. Unlike other data delivery solutions, Napatech’s Smarter Data Delivery ensures faster, more efficient, and more reliable delivery of data.
Napatech sells solutions that address the challenge of getting the right data, at the right time, to the right place within the network management and cybersecurity space. We do this through our three solution areas:
• Napatech accelerators, which address the challenges of accurate real-time monitoring of high-speed networks;
• Napatech recorders, which are designed to capture 100 percent network data with zero packet loss and allow retrieval of relevant data on demand; and
• Napatech virtualization solutions, which are designed to monitor data in real time while efficiently and cost-effectively supporting multiple virtual networking, storage, and compute acceleration solutions.
Who does Napatech see as its closest competitors?
Our biggest competition comes from firms providing standard NIC cards and solutions, which are designed to address broader challenges but do not meet the specific needs of customers who require higher performance. We are also seeing increasing competition from in-house solution developers; but, again, they lack the resources needed to keep developing their solutions as data demands increase.
What are some of the most important developments you’re seeing relative to cloud and the data center?
The increasing need for cloud services in several industries has led to a boom in the number of data centers that are springing up. But at the same time, firms are struggling with the soaring costs of rack space in the data centers and the associated power and cooling costs. As a solution to this, we see component manufacturers and technology providers developing smaller, sleeker solutions that take up less space and use less power.
Napatech recently announced the launch of its 2x100G compact network accelerator card, the NT200A01. Why this product now?
The market for 100G is developing fast, which means there is a need for more compact network accelerator form factors to complement the full-throughput 200G solution we have today, which is based on two full-length PCIe cards. We have several customers who need multiple 100G ports per server (as many as eight accelerators in some cases), and it is difficult to find servers with more than one or two full-length slots. The NT200A01 provides the ideal solution in this case, allowing all PCIe slots in the server to be used.
We are also seeing more solutions that establish backup paths, where the second port serves as a standby for the first. The ability to provide 2x100G ports with just one active at a time makes the NT200A01 the ideal solution here as well. Combined with our existing 100G network accelerator cards, Napatech can now provide a compelling portfolio to fit any use case.
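A back-of-the-envelope calculation suggests why a 2x100G card with one port active at a time is a natural fit for a single slot. This sketch assumes the card uses a PCIe Gen3 x16 host interface, which the interview does not state:

```python
# Rough check: can a single PCIe Gen3 x16 slot carry 2x100G at line rate?
# Assumption (not stated in the interview): the card uses PCIe Gen3 x16.

PCIE_GEN3_LANE_GBPS = 8.0          # raw line rate per Gen3 lane, Gbit/s
ENCODING_EFFICIENCY = 128 / 130    # PCIe Gen3 uses 128b/130b encoding
LANES = 16

usable_gbps = PCIE_GEN3_LANE_GBPS * ENCODING_EFFICIENCY * LANES
print(f"Usable PCIe Gen3 x16 bandwidth: ~{usable_gbps:.0f} Gbit/s")
# ~126 Gbit/s before protocol overhead: enough for one 100G port at
# line rate, but not for both ports of a 2x100G card simultaneously,
# which is consistent with the active/backup usage described above.
```

Under that assumption, the host bus, not the card, is the limiting factor when only one 100G port needs to be active at a time.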
What are some of the most important developments you’re seeing relative to security?
We are seeing two strong trends: the importance of threat detection, and the move to 100G.
Advanced threat detection was identified as early as 2015 as an important complement to traditional perimeter security solutions, such as firewalls and intrusion prevention systems, for detecting breaches that have successfully bypassed the perimeter. This includes sandboxing as well as network and user behavior analysis based on full packet capture. These solutions are crucial to quickly detecting and reacting to breaches.
At the same time, we are seeing a lot of interest from end users who want or need to build 100G security solutions themselves, based on open source tools like Suricata, Bro, or Snort, because there is a lack of commercial 100G offerings truly capable of handling the speed and volume of data on 100G connections. We expect this need to increase dramatically in 2017.
Tell me about the Pandion network recorder. Who uses it and for what?
There is a growing focus on network security, policy, and regulatory compliance, along with the sheer complexity of managing the data deluge. The Napatech Pandion network recorder captures 100 percent of network data with zero packet loss and allows retrieval on demand for security and network management applications. With the Pandion network recorder, users can be assured that all the data available for analysis is 100 percent accurate.
With the Pandion-Splunk integration, network security managers can now get complete visibility into network traffic when a breach occurs. This is a major advance in operational intelligence, where lack of visibility into data traffic hinders network security efforts both before and after an event. The integration allows security personnel to reduce event resolution times: the Splunk software generates alarms when unusual activity occurs in the network traffic, and with the Pandion recorder, data relating to the event can be quickly retrieved and sent for further analysis and investigation. Overall, the integration reduces the cost and resources required to maintain users’ cybersecurity operations.
What are some of the most important developments you’re seeing relative to the telecom operators?
The major development is the evolution to SDN/NFV-based networks, which will be driven in earnest by plans to deploy 5G and IoT connectivity solutions. Even though work on SDN/NFV has been underway for close to five years, there is still a lot to do. Napatech is working with top-tier NFV infrastructure vendors, both TEMs and major server vendors, to ensure that the NFV infrastructure can provide the performance that is needed without compromising the flexibility to deploy and move virtual functions cost-efficiently.
Facebook recently unveiled its Voyager solution and contributed the specs to the Telecom Infra Project. What does this say about what’s happening relative to the data center and networking at large?
Facebook and the OCP movement in general are good examples of the open source mentality migrating from software to hardware, which will force everyone to re-assess the value they bring and how that value is delivered. We work with a lot of these kinds of customers, who are often the first to see future challenges, such as security at 100G. That forces them to react faster than most market participants and build their own solutions. The scale of their operations provides the basis for a positive ROI on this activity, but they realize that not all internet and cloud service companies can do the same. By contributing their designs to the community, they are helping the entire market keep up with the latest challenges, which is commendable. In turn, the market can react faster and close the gap more quickly, so companies like Facebook won’t need to perform such developments from the ground up themselves; they can rely on available open source hardware, or even source solutions from third parties. At the end of the day, Facebook is not an equipment vendor and is making these products out of necessity rather than choice.
What are some of the most important developments you’re seeing relative to network management?
If SDN and NFV never succeed in reaching their ultimate visions, they will at least have done the community one major service: highlighting the need for more intelligent management of networks. The automation and orchestration solutions being developed to power SDN and NFV are also having a positive impact on legacy physical network management, because processes can be automated and manual intervention kept to a minimum. This is vital when moving to faster networks like 100G, where as little as a few nanoseconds separate one packet from the next, and when considering the advent of billions of IoT devices, each dynamically entering and leaving the network. That is more than manual intervention can handle. In addition, the recognized need for real-time network behavior analysis to support network management and security, along with artificial intelligence and big data analysis, promises to make network operation much more agile and economical in the future.
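The "few nanoseconds between one packet and the next" can be made concrete. On a fully loaded 100G link carrying minimum-size frames, the per-packet time budget works out as:

```python
# Time budget per packet on a fully loaded 100G link (minimum-size frames).
LINK_BPS = 100e9                   # 100 Gbit/s
bits_per_packet = (64 + 20) * 8    # 64B min frame + 20B preamble/inter-frame gap

ns_per_packet = bits_per_packet / LINK_BPS * 1e9
print(f"~{ns_per_packet:.2f} ns per packet")   # ~6.72 ns
```

A budget of under 7 nanoseconds per packet is why automated, machine-driven network management is a necessity rather than an optimization at these speeds.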