infoTECH Feature

February 18, 2015

Rethinking Network Acceleration in a Third Platform World

To build more robust networks in today’s fast-paced environments and stay ahead of the ever-expanding data growth curve, network management and security appliances must, now more than ever, keep pace with advancing network speeds to ensure that applications run quickly, video streams smoothly and end-user data stays secure and accessible.

At the end of 2014, Forrester released its Top 15 Emerging Technologies of 2015. Its predictions identified trends that enterprises today must consider when addressing new digital competitors. The firm identified “Software acceleration platforms and tools” as an emerging technology necessary to maximize engagement and value for organizations looking to enable innovation and remain ahead of the data explosion.

IDC’s predictions for 2014 touted the battle for dominance, and survival, on the 3rd Platform: the next-generation IT software foundation that includes cloud computing, mobile, Big Data and social engagement, all of which generate data that needs to be analyzed, secured and managed in real time.

The common theme across many of these recent IT predictions is the sheer volume of data being produced by cloud, mobile, Big Data and social technologies. As the analyst firms monitor the ever-changing IT landscape, it is clear that regardless of the platform, or the means of delivery, the volume, variety and velocity of data in networks continues to grow at explosive rates.

In light of these reports, there is a definitive need in today’s infrastructure for platforms and tools that accelerate access to data. As network engineers work to deliver these massive data streams in real time, performance and application monitoring is turning into a pressure cooker, with multiple usage crises dragging down network performance at any given time.

Rethinking Appliance Design

Whether for cloud, mobile or Big Data, there is a need for software acceleration and support across a variety of platforms. To address this need, hardware acceleration must both abstract and de-couple hardware complexity from the software and provide performance acceleration.

De-coupling the network layer from the application layer achieves this, while at the same time opening appliances up to new functions that are not normally associated with their original design.

By implementing high-performance network adapters, administrators can identify well-known applications in hardware by examining layer-one to layer-four header information at line speed. Clearly delineating what is performed in hardware and what is performed in application software allows more network functions to be offloaded to hardware, leaving the application software to focus on application intelligence and freeing up CPU cycles so that more analysis can be performed at greater speeds.
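To make the idea concrete, here is a minimal Python sketch of this kind of header-based classification, done in software purely for readability; the adapters described above perform the equivalent lookup in hardware at line speed. The port-to-application table and the synthetic frame are illustrative assumptions, not a vendor API.

```python
import struct

# Illustrative port-to-application table (an assumption for this sketch;
# real classifiers use far richer signatures than destination ports).
WELL_KNOWN_PORTS = {80: "HTTP", 443: "HTTPS", 53: "DNS", 25: "SMTP"}

def classify(frame: bytes):
    """Extract the 5-tuple from an Ethernet/IPv4 frame and name a well-known app."""
    if struct.unpack("!H", frame[12:14])[0] != 0x0800:    # EtherType: IPv4 only
        return None
    ihl = (frame[14] & 0x0F) * 4                          # IPv4 header length in bytes
    proto = frame[23]                                     # 6 = TCP, 17 = UDP
    if proto not in (6, 17):
        return None
    src_ip = ".".join(str(b) for b in frame[26:30])
    dst_ip = ".".join(str(b) for b in frame[30:34])
    l4 = 14 + ihl
    src_port, dst_port = struct.unpack("!HH", frame[l4:l4 + 4])
    app = WELL_KNOWN_PORTS.get(dst_port, "unknown")
    return src_ip, dst_ip, src_port, dst_port, proto, app

# Minimal synthetic frame: Ethernet + IPv4 + the first bytes of a TCP header.
eth = b"\x00" * 12 + b"\x08\x00"
ip  = bytes([0x45, 0, 0, 40, 0, 0, 0, 0, 64, 6, 0, 0,
             10, 0, 0, 1, 10, 0, 0, 2])
tcp = struct.pack("!HH", 51515, 443) + b"\x00" * 16
print(classify(eth + ip + tcp))   # ('10.0.0.1', '10.0.0.2', 51515, 443, 6, 'HTTPS')
```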

In addition, hardware that provides this information can be used to identify flows and distribute them across up to 32 server CPU cores, allowing massively parallel processing of data. All of this should be achieved with low CPU utilization.
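The principle behind such flow distribution can be illustrated with a simple symmetric hash over the 5-tuple: both directions of a conversation map to the same core index, so each of the up to 32 cores sees complete flows. The sketch below is conceptual and does not represent the adapter’s actual hashing algorithm.

```python
import zlib

NUM_CORES = 32   # the adapters described above can spread flows across up to 32 cores

def core_for_flow(src_ip: str, dst_ip: str, src_port: int, dst_port: int, proto: int) -> int:
    """Map a flow to a core index; symmetric so both directions land on the same core."""
    # Sort the two endpoints so (A -> B) and (B -> A) produce the same key.
    a, b = sorted([(src_ip, src_port), (dst_ip, dst_port)])
    key = f"{a}|{b}|{proto}".encode()
    return zlib.crc32(key) % NUM_CORES

# Both directions of the same TCP conversation are pinned to one core:
fwd = core_for_flow("10.0.0.1", "10.0.0.2", 51515, 443, 6)
rev = core_for_flow("10.0.0.2", "10.0.0.1", 443, 51515, 6)
assert fwd == rev
print("flow pinned to core", fwd)
```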

Appliance designers should consider features that preserve as much processing power and memory as possible, and should identify the applications that require memory-intensive packet payload processing.

Accelerating the Third Platform

There are a significant number of tools on the market that address downstream analytics in high-volume environments; however, their ability to perform real-time analysis and alerting is limited by their performance. Solutions used to extract, transform and load data into downstream systems tend to increase the latency between data collection and data analysis. Moreover, the volume and variety of data being ingested makes it extremely difficult for analysts and decision makers to locate the data they need across the various analysis platforms.

Improving real-time analysis capabilities by pushing intelligence to the point of data ingress will accelerate “third platform” activities. Some best practices include:

Real-time Alerting

Real-time alerting is the ability to know what data is entering the system, in real time, before it reaches decision-making tools. Stakeholders can then receive intelligent alerts informing them that new data of interest to their area of responsibility has arrived.
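As a minimal sketch of the idea, incoming records can be matched against stakeholder-defined interest rules at the moment of ingest, before any downstream tool sees them. The rule format, record fields and recipient names below are invented for illustration.

```python
# Hypothetical stakeholder interest rules: each maps a predicate to a recipient.
ALERT_RULES = [
    {"recipient": "security-team", "match": lambda r: r.get("type") == "auth_failure"},
    {"recipient": "net-ops",       "match": lambda r: r.get("latency_ms", 0) > 200},
]

def send_alert(recipient: str, record: dict) -> None:
    # Placeholder delivery: a real system would use email, chat or a ticketing API.
    print(f"ALERT -> {recipient}: {record}")

def on_ingest(record: dict) -> None:
    """Called once per record as it enters the system, before downstream tools."""
    for rule in ALERT_RULES:
        if rule["match"](record):
            send_alert(rule["recipient"], record)

on_ingest({"type": "auth_failure", "user": "alice"})
on_ingest({"type": "flow_summary", "latency_ms": 350})
```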

In-Line Analytics

Making use of perishable insights, that is, data whose value declines rapidly over time, requires that organizations analyze data at the very moment it is received. Doing so ensures that an organization can begin acting immediately on what is happening.
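A minimal sketch of in-line analysis, assuming a stream of numeric measurements: each value is folded into a running statistic the instant it arrives, so sharp deviations are visible while the insight is still fresh rather than after a batch job completes.

```python
class InlineAnalyzer:
    """Maintain a running mean and flag values that deviate sharply, per arrival."""

    def __init__(self, threshold: float = 3.0):
        self.count = 0
        self.mean = 0.0
        self.m2 = 0.0          # sum of squared deviations (Welford's algorithm)
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        """Update statistics with one value; return True if it looks anomalous."""
        if self.count >= 2:
            std = (self.m2 / (self.count - 1)) ** 0.5
            anomalous = std > 0 and abs(value - self.mean) > self.threshold * std
        else:
            anomalous = False
        self.count += 1
        delta = value - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (value - self.mean)
        return anomalous

analyzer = InlineAnalyzer()
for sample in [100, 102, 99, 101, 100, 103, 250]:   # the last value is an outlier
    if analyzer.observe(sample):
        print("acting immediately on anomalous value:", sample)
```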

Intelligent Data Flow

By inspecting data immediately upon ingress, appliances can make data-flow decisions that direct data to the right downstream consumers at line rate. This minimizes the unnecessary flow of data through downstream brokers and processing engines.
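A conceptual sketch of ingress-time routing, with invented consumer names and record fields: each record is inspected once on arrival and handed only to the downstream consumers that actually need it, instead of being broadcast through every broker.

```python
from collections import defaultdict
from queue import Queue

# Hypothetical downstream consumers, each with its own queue.
consumers = defaultdict(Queue)

def route_on_ingress(record: dict) -> list[str]:
    """Inspect a record once at ingress and choose only the consumers that need it."""
    targets = []
    if record.get("proto") == "DNS":
        targets.append("dns-analytics")
    if record.get("bytes", 0) > 1_000_000:
        targets.append("capacity-planning")
    if record.get("flagged"):
        targets.append("security-forensics")
    for name in targets:
        consumers[name].put(record)
    return targets

print(route_on_ingress({"proto": "DNS", "bytes": 120}))
print(route_on_ingress({"proto": "TCP", "bytes": 5_000_000, "flagged": True}))
```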

Future-proofing the Network

Specialized network applications can be incredibly expensive, making scaling to meet demand a costly proposition for telcos, carriers, cloud providers and enterprises alike. Even worse, if the market shifts toward adoption of novel network hardware, these organizations must bear the cost of updating legacy infrastructure in order to stay competitive.

By de-coupling network and application data processing and building flexibility and scalability into the design, appliance designers can now introduce a powerful, high-speed platform into the network that is capable of capturing data with zero packet loss at speeds up to 100 Gbps.

The analysis stream provided by the hardware platform can support multiple applications, not just performance monitoring. Multiple applications can run on multiple cores within the same physical server, with software ensuring that each application has access to the same data stream as it is captured.
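The sharing model can be pictured as a simple fan-out: one capture loop delivers every packet record to a per-application queue, so each application sees the full stream independently. The queue-based Python sketch below only illustrates the concept; in practice the adapter and its driver stack share captured data far more efficiently, without copying it per application.

```python
import threading
from queue import Queue

def fan_out(source, app_queues):
    """Deliver every captured item to each application's queue."""
    for item in source:
        for q in app_queues:
            q.put(item)
    for q in app_queues:
        q.put(None)              # sentinel: end of stream

def run_app(name, q):
    count = 0
    while (item := q.get()) is not None:
        count += 1               # a real application would analyze the packet here
    print(f"{name} processed {count} packets")

queues = [Queue(), Queue(), Queue()]
apps = [threading.Thread(target=run_app, args=(f"app-{i}", q))
        for i, q in enumerate(queues)]
for t in apps:
    t.start()
fan_out(range(10_000), queues)   # stand-in for a captured packet stream
for t in apps:
    t.join()
```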

This transforms the performance monitor into a universal appliance for any application requiring a reliable packet capture data stream. With this capability, it is possible to incorporate more functions in the same physical server, increasing the value of the appliance.

New technologies are setting the stage to enable organizations to manage the ever-increasing data loads without compromise. By scaling with increasing connectivity speeds, as well as accelerating network management and security applications, enterprises will have greater success navigating the third platform and beyond.

About the Author

Daniel Joseph Barry is VP Positioning and Chief Evangelist at Napatech and has over 20 years of experience in the IT and Telecom industry. Prior to joining Napatech in 2009, Dan Joe was Marketing Director at TPACK, a leading supplier of transport chip solutions to the Telecom sector. From 2001 to 2005, he was Director of Sales and Business Development at optical component vendor NKT Integration (now Ignis Photonyx), following various positions in product development, business development and product management at Ericsson. Dan Joe joined Ericsson in 1995 from a position in the R&D department of Jutland Telecom (now TDC). He has an MBA and a BSc degree in Electronic Engineering from Trinity College Dublin.

Edited by Stefania Viscusi