Virtual reality (VR) has recently gained widespread popularity, and a growing number of everyday mobile devices are now powerful enough to run VR applications. As a result, the market for VR glasses, which project the image from a mobile device's screen into the user's eyes, is expanding; these glasses come from different manufacturers and in different shapes. Many VR glasses provide no additional controllers for interacting with the mobile device. Once the device is inserted into the eyewear, the user is left without any means of interacting with the virtual world, since the device's screen and buttons are no longer accessible. Currently, VR applications work around this problem by using the user's gaze direction: for example, the user opens a door by staring at it for five seconds.

The goal of this thesis is to develop new interaction techniques that let users interact with a virtual world through natural body movements. The task is to create an algorithm that detects the user's gestures using only the sensors already built into mobile devices. This would replace touch pads and buttons on VR glasses, as well as the additional Bluetooth controllers that provide extra control options. It would also allow developers to easily integrate the algorithm into their applications, giving them more options for building new applications and new ways of gaming.
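As a rough illustration of the gesture-detection idea, consider a minimal sketch that flags a head nod from gyroscope pitch-rate samples: a fast downward rotation followed by an upward one within a short window. The function name, thresholds, and sampling assumptions here are illustrative only, not the algorithm developed in this thesis.

```python
# Minimal sketch (assumption: gyroscope pitch-rate samples in rad/s at a
# fixed sampling rate). A "nod" is a downward rotation followed by an
# upward one within a short window; thresholds are illustrative, not tuned.

def detect_nod(pitch_rates, threshold=1.5, window=20):
    """Return True if a down-then-up pitch motion occurs within `window` samples."""
    down_index = None
    for i, rate in enumerate(pitch_rates):
        if rate < -threshold:                # fast downward rotation starts a candidate nod
            down_index = i
        elif down_index is not None and rate > threshold:
            if i - down_index <= window:     # upward rotation soon after: a nod
                return True
            down_index = None                # too late; discard the candidate
    return False

# Synthetic samples: idle, a quick downward spike, then an upward spike.
samples = [0.0] * 10 + [-2.0, -2.5] + [0.0] * 5 + [2.2] + [0.0] * 10
print(detect_nod(samples))  # → True
```

A real implementation would additionally filter sensor noise and fuse accelerometer and gyroscope data, but the same threshold-and-window principle applies.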