This thesis presents an interactive system that combines depth image
acquisition using the Kinect sensor with particle visualization guided by a
gradient field. The goal of the system is to create a visual installation where
particles adapt dynamically to the surface structure as well as to user movements.
The methodology involves capturing depth frames, computing the gradient
field, and updating particle positions based on local changes in the image.
The system also allows adjustment of key parameters such as particle speed,
damping, and particle count. Experimental results demonstrate that the
approach achieves real-time performance and produces visually convincing
effects.
The main contribution of the thesis lies in showing that a simple gradient-based
model can be effectively applied to interactive visualizations, making
it suitable for artistic installations and experimental user interfaces.
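To make the update rule described above concrete, the following NumPy sketch shows one possible form of a gradient-based particle step; it is an illustration under assumptions, not the thesis implementation, and the function and parameter names (update_particles, speed, damping, dt) are hypothetical.

```python
import numpy as np

def update_particles(depth, positions, velocities, speed=1.0, damping=0.95, dt=1.0):
    """Advance particles one step along the gradient of a depth frame.

    Hypothetical sketch: names and update rule are assumptions, not the thesis code.
    depth      : 2-D array holding one Kinect depth frame
    positions  : (N, 2) array of particle (x, y) positions in pixel coordinates
    velocities : (N, 2) array of particle velocities
    """
    # Gradient field of the depth image (change along rows and columns).
    gy, gx = np.gradient(depth.astype(np.float32))

    # Sample the gradient at each particle's current pixel.
    xi = np.clip(positions[:, 0].astype(int), 0, depth.shape[1] - 1)
    yi = np.clip(positions[:, 1].astype(int), 0, depth.shape[0] - 1)
    force = np.stack([gx[yi, xi], gy[yi, xi]], axis=1)

    # Velocity update scaled by speed, attenuated by damping, then integrated.
    velocities = damping * (velocities + speed * force * dt)
    positions = positions + velocities * dt
    return positions, velocities
```

In this sketch, speed, damping, and the particle count (the length of positions) correspond to the adjustable parameters mentioned in the abstract.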