ExtraHop recently released the results of a survey showing that organizations are increasingly using “Big Data” for “IT Operations Analytics,” or ITOA. In particular, according to the survey, they are using it for network performance monitoring, application performance monitoring, root-cause analysis, and IT security. This raises the question: What is new here?
Network engineers in particular have always had to drink from a fire hose of data from many different sources. It’s always been “Big Data,” and calling it such doesn’t make it any more significant or challenging. The real issue isn’t so much the sources of the data – another focus of the ExtraHop survey – but helping network engineers analyze the data to derive the intelligence needed to make informed decisions.
Network management and monitoring software vendors have been focusing on this for years, advancing the ability to parse raw data and deliver relevant information to network groups – reducing the need for a protocol-sniffing Svengali while delivering valuable analyses to multiple constituents within the network engineering and operations teams. The software continues to evolve to ensure that new technologies such as SDN – yet another fire hose of real-time data that humans cannot analyze manually – don’t obscure management intelligence or bypass network engineers altogether. After all, just because something can be done automatically in the network doesn’t mean it should be.
As Brian Boyko wrote in a recent blog post: “Real-time SDN analytics are critical to enabling engineers to make good decisions. They are also vital to allowing the network software itself to make good ‘decisions.’ If a link performs poorly, an SDN network can route around it – if it knows that the link is indeed performing poorly and what the next best route is. But if the information is incorrect or misleading, the computer will blithely go through its programming, making the ‘right’ decisions for the wrong scenario. Truth be told, a human being could also make the same mistake, given the same data, but computers have the ability to make billions of mistakes per second.”
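To make that point concrete, here is a minimal, hypothetical sketch (not Packet Design’s implementation) of how a controller might “route around” a link it believes is degraded. The topology, link names, and latency figures are illustrative assumptions; the point is that the controller’s decision is only as good as the telemetry it is fed.

```python
# Hypothetical sketch: an SDN controller rerouting based on reported link latency.
# Topology and numbers are made up for illustration.
import networkx as nx

def best_path(graph, src, dst):
    # Pick the path with the lowest total reported latency.
    return nx.shortest_path(graph, src, dst, weight="latency")

net = nx.Graph()
net.add_edge("A", "B", latency=5)
net.add_edge("B", "D", latency=5)
net.add_edge("A", "C", latency=20)
net.add_edge("C", "D", latency=20)

# Telemetry reports A-B as degraded, so the controller raises its cost
# and routes around it.
net["A"]["B"]["latency"] = 500
print(best_path(net, "A", "D"))   # ['A', 'C', 'D']

# If that report was wrong and the A-B link is actually healthy, the controller
# still dutifully picks the slower path: the "right" decision for the wrong data.
```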
The need to manage this latest fire hose is why we’ve been focusing on SDN management using our always-current network routing models, traffic matrices, and performance analytics. Analyzing these three data sources – unique to Packet Design – is required for intelligent, real-time orchestration by SDN controllers. The same analysis is also needed for the operational monitoring of overlay networks and dynamic applications such as traffic engineering, bandwidth calendaring, and virtualized network functions.
With these capabilities, we will launch our SDN Service Assurance platform this fall, along with the first independent SDN application, called Software Defined Traffic Engineering (SD-TE), which will automatically optimize and provision traffic-engineered tunnels. The app will let organizations verify whether the network can accommodate requested TE tunnels’ bandwidth and latency requirements without adversely impacting other applications and services. It will calculate the best paths and provision the tunnels via the OpenDaylight controller.
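As a rough illustration of the kind of constrained path computation such a traffic-engineering app performs – drop links without enough spare bandwidth, then choose the lowest-latency path that remains – here is a simplified sketch. The node names, bandwidth, and latency values are assumptions for illustration; this is not the SD-TE product’s actual algorithm or API.

```python
# Simplified sketch of a constrained TE path computation (illustrative only).
import networkx as nx

def te_path(graph, src, dst, needed_mbps):
    # Keep only links with enough unreserved bandwidth for the new tunnel.
    usable = nx.Graph(
        (u, v, d) for u, v, d in graph.edges(data=True)
        if d["avail_mbps"] >= needed_mbps
    )
    try:
        # Among feasible links, minimize end-to-end latency.
        return nx.shortest_path(usable, src, dst, weight="latency_ms")
    except (nx.NetworkXNoPath, nx.NodeNotFound):
        return None  # The request cannot be met without impacting other traffic.

topo = nx.Graph()
topo.add_edge("PE1", "P1", avail_mbps=800, latency_ms=2)
topo.add_edge("P1", "PE2", avail_mbps=300, latency_ms=3)
topo.add_edge("PE1", "P2", avail_mbps=900, latency_ms=10)
topo.add_edge("P2", "PE2", avail_mbps=900, latency_ms=12)

# The P1 leg lacks headroom for 500 Mbps, so the longer but feasible path wins.
print(te_path(topo, "PE1", "PE2", 500))  # ['PE1', 'P2', 'PE2']
```

In practice the calculation draws on the always-current routing models and traffic matrices described above, and the resulting tunnels would be provisioned through the OpenDaylight controller rather than printed to a console.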
The world’s largest service providers, enterprises, and government entities already use our Explorer products to improve network availability and performance, deliver new services faster and more economically, and improve customer satisfaction. We could claim the Big Data mantle, but we won’t. Loads of disparate data by any other name are still loads of data. It’s all about what you do with it.