Ph.D. Defence Notice: Addressing Performance Regressions in DevOps: Can We Escape from System Performance Testing?

Wednesday, June 19, 2024 12:00 pm - 2:00 pm EDT (GMT -04:00)

Candidate: Lizhi Liao

Title: Addressing Performance Regressions in DevOps: Can We Escape from System Performance Testing?

Date: June 19, 2024

Time: 12:00 PM

Place: REMOTE ATTENDANCE

Supervisor(s): Shang, Weiyi

Abstract:

Performance regression is an important type of performance issue in software systems. It means that the performance of the same features in a new version of the system is worse than in previous versions, for example, increased response time or higher resource utilization. To prevent performance regressions, current practices often rely on conducting extensive system performance testing and deciding, based on the testing results, whether to release the system into production.

However, because system performance testing demands considerable resources and time, such approaches are often difficult to adopt for software systems, especially large-scale systems, that follow fast-paced development and release practices such as DevOps. In addition, software performance regressions are often addressed late in the software development and release cycle, for instance, after the system is built, integrated, or even released. Not only does this make it laborious for developers to detect, locate, and fix performance regressions, but it can also have adverse effects on users and companies.

To tackle these key challenges, this thesis focuses on addressing software performance regressions in DevOps without relying on expensive system performance tests. In the first part of the thesis, we propose a series of automated approaches that help developers and operators first detect performance regressions and then locate their root causes directly from field operational data, i.e., without the need for system performance testing. We also adapt existing performance analytics techniques to a real-life database-centric system. In the second part of the thesis, we leverage small-scale performance testing and architectural models to predict the impact of changes in small software components on the end-to-end performance of the entire system, so that performance regressions can be detected as early as the software development phase.

Our work has been validated through case studies on open-source projects and successfully adopted by our industrial research collaborator; we anticipate that it offers helpful insights to researchers and practitioners interested in addressing software performance regressions in DevOps without expensive system performance testing.