
Zuck’s call for stronger network protocols published in prestigious CS journal


Research Professor Lenore Zuck coauthored a paper encouraging collaboration between the networking community and formal methods researchers to develop formal languages, tools, and methodologies for network protocols. The paper was published in the August issue of Communications of the ACM.

She, along with her collaborators David Basin, Nate Foster, Kenneth L. McMillan, Kedar S. Namjoshi, Cristina Nita-Rotaru, Jonathan M. Smith, and Pamela Zave, argues that while oft-used and well-understood internet protocols are assumed to be mostly correct and secure, this is simply not true.

Network protocols are sets of established rules that dictate how data is formatted, transmitted, and received between devices in a network, allowing for effective communication between devices with different internal structures. Vulnerabilities in older protocols are frequently found and exploited, for example through denial-of-service (DoS) attacks, in which a particular domain's servers are flooded with information requests to disrupt or disable the domain.
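To make the idea of a protocol rule concrete, here is an illustrative sketch (not from the paper): a format rule for a hypothetical toy protocol, written as an executable check. All field names below are invented for illustration.

```python
# Hypothetical sketch: a protocol is a set of rules about message
# format and ordering. This toy rule checks one format constraint:
# every message must carry a known version, a valid type, and a
# bytes payload.

def well_formed(msg: dict) -> bool:
    """Return True if msg satisfies the toy protocol's format rules."""
    required = {"version", "type", "payload"}
    if not required <= msg.keys():
        return False
    if msg["version"] != 1:  # only version 1 is defined in this sketch
        return False
    if msg["type"] not in {"request", "response"}:
        return False
    return isinstance(msg["payload"], bytes)

print(well_formed({"version": 1, "type": "request", "payload": b"hi"}))  # True
print(well_formed({"version": 2, "type": "request", "payload": b"hi"}))  # False
```

A real protocol specification pins down many such rules, for both message formats and the order in which messages may be exchanged; ambiguity in any of them is a potential source of the vulnerabilities described above.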

In the paper, the researchers say that despite the limited use of formal methods in the design of its protocols, the internet mostly works. But “the sheer number of emerging technologies and the potential risk to critical infrastructure and human lives demand that future protocols be designed, analyzed, and validated with care.”

The next generation of computing is rapidly moving away from wired networks connecting conventional computers and toward internet of things devices and sensors, robotic systems, and autonomous vehicles, and is expanding to include remote healthcare, smart infrastructure, and space-based communications. These more advanced communications require additional layers of protocols and increase the likelihood of errors.

Zuck was asked to figure out what scientists should be prepared for in the next generation of computing. She and other networking experts concluded that network specifications are simply not rigorous enough.

“The stakes have gotten higher, much higher with technology,” Zuck said. “Now it’s more dire.”

For example, Zuck pointed to the QUIC protocol, which, according to her, carries over 40% of browser traffic. Her work with McMillan on testing this standard with lightweight formal methods found severe mistakes in multiple versions of it and its implementations, including several security vulnerabilities.

“We looked at the standards as a rule and examined if each round of the protocols satisfied this property,” Zuck said. “We let them play against each other.”
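The adversarial, round-by-round style of testing Zuck describes can be sketched in vastly simplified form. The toy handshake protocol, its safety property, and all names below are illustrative assumptions, not taken from the QUIC work: a random driver "plays against" a server implementation and checks a property on every round.

```python
# Hypothetical sketch of lightweight, specification-based adversarial
# testing: fire random messages at a toy server and check a safety
# property on every generated trace.
import random

def server_step(state: str, msg: str) -> tuple[str, str]:
    """Toy server: given current state and a message, return (new_state, reply)."""
    if state == "idle" and msg == "HELLO":
        # The server is free to accept or reject; model both behaviors.
        if random.random() < 0.5:
            return ("open", "ACCEPT")
        return ("idle", "REJECT")
    if state == "open" and msg == "DATA":
        return ("open", "OK")
    return (state, "ERROR")

def satisfies_property(trace) -> bool:
    """Safety property: no DATA may be answered with OK before an ACCEPT."""
    accepted = False
    for msg, reply in trace:
        if reply == "ACCEPT":
            accepted = True
        if msg == "DATA" and reply == "OK" and not accepted:
            return False
    return True

# Adversarial driver: the "opponent" chooses messages at random, and
# the checker validates the property after each complete run.
random.seed(0)
for _ in range(1000):
    state, trace = "idle", []
    for _ in range(6):
        msg = random.choice(["HELLO", "DATA"])
        state, reply = server_step(state, msg)
        trace.append((msg, reply))
    assert satisfies_property(trace), trace
print("property held on all random traces")
```

Real tools in this space check far richer temporal properties against actual implementations over the wire, but the shape is the same: a formal property plus an automated adversary exploring the protocol's rounds.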

The communities that develop internet protocols, the Internet Engineering Task Force (IETF) and the IEEE, aim for correct and secure protocols, but current descriptions of the protocols don't allow for rigorous analysis. These documents are often intentionally written to give implementers a degree of freedom in protocol development, yet the cost of that freedom is flaws and security vulnerabilities.

Moreover, current tools require a great deal of expertise and take a long time to master, existing domain-specific tools are often tested only by their own developers, and analyzing a single protocol may require mastery of multiple tools.

Zuck and her colleagues believe that formal specifications, supported by lightweight formal tools, can improve the correctness of protocols as well as the interactions between them. Formal methods researchers can provide the tools, but they need protocol designers to work with them and articulate what those tools must do.

Their paper, “It Takes a Village: Bridging the Gaps between Current and Formal Specifications for Protocols,” includes a video interview with Zuck.