Protecting privacy in today's digital landscape has become a pressing challenge for both individuals and society at large. Despite expressing concerns about digital privacy, individuals often struggle to align their attitudes with their actual behaviour, a phenomenon known as the privacy paradox. Drawing on the predictive processing approach, this master's thesis aims to offer fresh insights into the privacy paradox and its relationship to the approach's functional principles. The methodology involves developing a conceptual account of the privacy paradox from a predictive processing perspective, synthesising the core principles of the approach with a conceptual analysis of the paradox. The proposed account suggests that the paradox may emerge because users generate abstract, contextually non-specific predictions about digital privacy that fail to translate into specific digital contexts. This discrepancy is compounded by the fact that data disclosure occurs in contexts that often lack clear privacy-related sensory signals or present users with misleading cues. The absence of such signals prevents prediction error minimisation, through which users' beliefs about digital privacy could otherwise be updated to align with their attitudes. The account is compared with existing approaches to the privacy paradox and evaluated through a case-study analysis of existing empirical evidence. Finally, practical implications for individuals and policymakers working on digital privacy protection, as well as directions for future research, are proposed.
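
The thesis develops its account conceptually, and the abstract gives no formal model. Purely to illustrate the prediction error minimisation mechanism it invokes, the sketch below shows a standard precision-weighted Gaussian belief update: when a privacy-related sensory cue is precise, the resulting prediction error shifts the user's belief towards the evidence; when no reliable cue is available (modelled here as a near-zero-precision signal), the abstract prior about digital privacy persists largely unrevised. All names, values, and the modelling choices are hypothetical and are not taken from the thesis.

# Illustrative sketch only: a minimal precision-weighted belief update of the
# kind predictive processing accounts describe. Hypothetical names and values.

def update_belief(prior_mean: float, prior_precision: float,
                  signal: float, signal_precision: float) -> tuple[float, float]:
    """Update a Gaussian belief about 'how private this context is' given a sensory signal.

    The prediction error (signal minus prior mean) is weighted by the relative
    precision (inverse variance) of the incoming signal.
    """
    prediction_error = signal - prior_mean
    posterior_precision = prior_precision + signal_precision
    learning_rate = signal_precision / posterior_precision
    posterior_mean = prior_mean + learning_rate * prediction_error
    return posterior_mean, posterior_precision


# A user holds an abstract prior that digital contexts are fairly private (0.8 on a 0-1 scale).
belief, precision = 0.8, 4.0

# Case 1: the interface provides a clear, precise cue suggesting low privacy (0.2);
# the large prediction error pulls the belief towards the evidence.
print(update_belief(belief, precision, signal=0.2, signal_precision=8.0))

# Case 2: no clear privacy cue is available, modelled as an extremely imprecise signal;
# the prediction error is effectively ignored and the abstract prior persists,
# mirroring the account's claim that absent or weak cues block belief updating.
print(update_belief(belief, precision, signal=0.2, signal_precision=0.01))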