Comments on Dean Bubley's Disruptive Wireless: Debunking the Network QoS myth

Andrew von Nagy (2017-11-07 20:46 UTC):

Hi Dean, good article as usual. I have been looking at Martin Geddes's work, along with Predictable Network Solutions, very carefully, because I think it has immediate impact and utility in characterizing a network path's quality over time. As a long-time IT professional specializing in networking and Wi-Fi, I have seen how hard network monitoring, measurement, and insight are to get right. We have been stuck looking at interface counters, bandwidth, utilization, and related buffer and packet-loss statistics on individual nodes in isolation. There is a real difficulty in understanding how a network performs in aggregate, end to end, and then quickly diagnosing where, why, and when a problem occurred. Often, on modern networks, everything appears to be running fine, yet a user-experience issue manifests sporadically and is difficult to track down. That's what I like about their approach with delta-Q: give me an operational tool to characterize and baseline a network's (or a network path's) quality. Then at least I know what to expect. More importantly, we can then start to bridge the application-development and IT-networking worlds with concrete information about what applications should be able to expect from the network. If I have concrete data on the delay and loss characteristics of a network path or traffic path, then I can have an informed discussion with an application developer who is writing a new app, or who is troubleshooting a user-experience issue with an existing application already deployed. We have such metrics for other IT systems, such as databases and storage. It's well past time we were able to characterize network performance, end to end, in an operational fashion.

Will we try to engineer a level of quality? Absolutely. But there will be limits to what is realistic based on effort and value: the old 80/20 rule.

Will we try to sell a level of quality? As you stated, probably not. It's too complex and fragmented, with many modern applications traversing multiple network operators end to end. It could be achievable for specific use cases in private networks, and that holds tremendous value for businesses. But I doubt it will develop into a marketplace in any fashion, as you have stated.

Thanks for your analysis.

Cheers,
Andrew von Nagy
@revolutionwifi
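
P.S. To make the "operational tool" point concrete, here is a rough sketch of the kind of baseline I mean: sample a path's delay and loss and report a distribution rather than a single average. It uses only the Python standard library; the target host, port, probe count, and pacing are illustrative assumptions, and this is only a crude stand-in for the actual delta-Q method, which decomposes quality attenuation far more rigorously than raw end-to-end samples.

# Crude path-quality baseline: sample delay and loss, report a distribution.
# NOT the delta-Q toolchain; just the two raw observables it starts from.
import socket
import statistics
import time

HOST, PORT = "example.com", 443   # illustrative probe target (assumption)
SAMPLES, TIMEOUT_S = 100, 1.0     # a probe is treated as lost after TIMEOUT_S

delays, lost = [], 0
for _ in range(SAMPLES):
    start = time.monotonic()
    try:
        # TCP connect time as a cheap, unprivileged proxy for path delay
        with socket.create_connection((HOST, PORT), timeout=TIMEOUT_S):
            delays.append(time.monotonic() - start)
    except OSError:
        lost += 1                 # timeouts and refusals counted as loss
    time.sleep(0.1)               # pace probes so we don't congest the path ourselves

print("loss: {:.1%}".format(lost / SAMPLES))
if len(delays) >= 2:
    q = statistics.quantiles(delays, n=100)   # 99 cut points: q[49] ~ median, q[98] ~ p99
    print("median delay: {:.1f} ms, p99: {:.1f} ms".format(q[49] * 1000, q[98] * 1000))

Even something this crude gives an application developer a p99 delay and a loss fraction to design against, which is exactly the informed conversation I want to be able to have.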