This thesis systematically analyzes the impact of data quantization on
machine learning models. It shows that the effect of quantization is highly
context-dependent: on clean data, quantization typically reduces predictive
performance, whereas on noisy data it can act as a beneficial regularizer and
improve generalization. The thesis also evaluates the effectiveness of
mitigation strategies and warns of the risk of information leakage when
quantization is applied carelessly. The results offer insight and practical
guidelines for the informed use of quantization.
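A minimal sketch of the leakage pitfall the abstract warns about, assuming quantile binning as the quantization scheme (the function names, the four-bin setup, and the leakage mechanism shown are illustrative assumptions, not details taken from the thesis):

```python
import numpy as np

def fit_bins(x, n_bins=4):
    # Estimate quantile-based bin edges from the given data only.
    qs = np.linspace(0, 1, n_bins + 1)[1:-1]
    return np.quantile(x, qs)

def quantize(x, edges):
    # Map each value to its bin index (0 .. n_bins - 1).
    return np.digitize(x, edges)

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
x_train, x_test = x[:800], x[800:]

# Correct: edges are estimated on the training split only,
# then applied unchanged to the test split.
edges = fit_bins(x_train)
q_train = quantize(x_train, edges)
q_test = quantize(x_test, edges)

# Pitfall (one form of information leakage): estimating the edges on the
# full dataset lets test-set statistics influence the preprocessing.
leaky_edges = fit_bins(x)  # do NOT do this before splitting
```

Fitting the edges on the training split and reusing them at test time mirrors the usual fit/transform discipline of preprocessing pipelines; the "leaky" variant is the careless application the abstract cautions against.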