Please note: This PhD seminar will take place online.
Joseph Scott, PhD candidate
David R. Cheriton School of Computer Science
Supervisors: Professors Jo Atlee, Vijay Ganesh
We introduce Pierce, a versatile and extensible testing tool aimed at solvers for the neural network verification (NNV) problem. At its core, Pierce implements a fuzzing engine over the Open Neural Network Exchange (ONNX), a standardized model format for deep learning and classical machine learning, and VNN-LIB, a specification standard for the input-output behavior of machine learning systems. Pierce supports the entirety of VNN-LIB and most of ONNX v18. The API of Pierce is designed to enable users to create a variety of software testing tools, such as performance and mutation fuzzers, as well as delta debuggers, with relative ease. For example, Pierce provides a rich generator for computation graphs and specifications that allows users to easily specify a wide variety of configurations, as well as mutators that guarantee mutated computation graphs remain well-formed.
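For readers unfamiliar with the specification side of an NNV instance: VNN-LIB specifications use SMT-LIB2 syntax, with the convention that `X_i` names network inputs and `Y_j` names network outputs. As an illustration (not drawn from the talk itself), a minimal specification asserting that, for inputs in a small box, output `Y_0` should not fall below `Y_1` might look like:

```
; Hypothetical VNN-LIB specification for a network
; with two inputs (X_0, X_1) and two outputs (Y_0, Y_1).
(declare-const X_0 Real)
(declare-const X_1 Real)
(declare-const Y_0 Real)
(declare-const Y_1 Real)

; Input region: a small box around a point of interest.
(assert (>= X_0 0.0))
(assert (<= X_0 0.1))
(assert (>= X_1 0.0))
(assert (<= X_1 0.1))

; Property to falsify: the solver searches for an input
; in the box where Y_0 < Y_1 (a counterexample).
(assert (<= Y_0 Y_1))
```

A fuzzer such as Pierce generates pairs of an ONNX computation graph and a specification in this style, then hands them to the solvers under test.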
Using Pierce, we build a reinforcement learning (RL) driven relative performance fuzzer. With this fuzzer, we expose performance issues in four state-of-the-art solvers, namely Marabou, ERAN, MIPVerify, and nnenum, observing up to a 13.3x slowdown in cumulative PAR-2 score in the target solvers relative to reference solvers. Further, we leverage Pierce to create a diverse benchmark suite of 10,000 competition-grade NNV instances for the community.