In recent years, web application development has seen increasing adoption of continuous delivery principles, enabling rapid delivery of new functionality to end users. However, despite faster releases, the correctness of the released software remains crucial. In practice, the final steps of testing a new release involve detecting regressions relative to the previous release; these steps are typically performed manually and are therefore time-consuming and unreliable. In this thesis, we address this problem and develop a solution for automated regression detection based on shadowing of production requests. Our solution transparently duplicates web requests into a shadow environment running the new release, and compares the responses from the production and shadow environments in order to detect content and performance regressions. We integrate our solution with the deployment pipeline of an existing web application and evaluate it. We demonstrate that introducing our solution into the deployment pipelines of existing applications requires little overhead and additional infrastructure, while at the same time enabling more thorough testing of numerous boundary cases and use cases that the existing types of testing in those pipelines do not cover. To the best of our knowledge, unlike Diffy, the only other comparable alternative, our solution enables shadowing of both safe and unsafe (state-modifying) web requests.
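For illustration only, the following is a minimal Python sketch of the shadowing-and-comparison idea described above, not the thesis implementation: it replays a single request against hypothetical production and shadow endpoints and flags content or latency differences. The endpoint URLs, the comparison of full response bodies, and the 2x latency threshold are all assumptions made for the sketch.

```python
import time
import urllib.request

# Hypothetical endpoints; these URLs are assumptions, not part of the thesis.
PRODUCTION = "https://prod.example.com"
SHADOW = "https://shadow.example.com"


def fetch(base_url: str, path: str):
    """Fetch a path from one environment; return (status, body, elapsed seconds)."""
    start = time.monotonic()
    with urllib.request.urlopen(base_url + path) as resp:
        status, body = resp.status, resp.read()
    return status, body, time.monotonic() - start


def shadow_and_compare(path: str) -> None:
    """Duplicate one request to both environments and diff the responses."""
    prod_status, prod_body, prod_time = fetch(PRODUCTION, path)
    shadow_status, shadow_body, shadow_time = fetch(SHADOW, path)

    # Content regression: the new release answers differently than production.
    if (prod_status, prod_body) != (shadow_status, shadow_body):
        print(f"content regression on {path}: {prod_status} vs {shadow_status}")

    # Performance regression: hypothetical threshold of 2x production latency.
    if shadow_time > 2 * prod_time:
        print(f"performance regression on {path}: "
              f"{prod_time:.3f}s vs {shadow_time:.3f}s")


if __name__ == "__main__":
    shadow_and_compare("/")
```

In a real deployment the duplication would happen transparently at the proxy layer rather than in a client script, and unsafe (state-modifying) requests would require isolating the shadow environment's state from production.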