In this paper we present both discrete-time deterministic and stochastic control. We give an intuitive account of such models and the motivation for using them. Under reasonable assumptions, we develop solution procedures based on dynamic programming and the Hamilton--Jacobi--Bellman equation for both finite- and infinite-horizon discrete-time problems. To aid understanding of the theory, we introduce numerous examples, ranging from very simple ones to more complex models, which illuminate the topic and can serve as a basis for developing more sophisticated models. We also include graphs obtained by simulating the examples, which provide a clear graphic demonstration of the benefits of discrete-time stochastic control.
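As a minimal illustration of the finite-horizon dynamic-programming recursion mentioned above, the sketch below runs backward induction on a small, entirely hypothetical stochastic control problem (the states, transition probabilities, and costs are made up for demonstration and are not drawn from the paper's examples). At each stage it applies the Bellman recursion $V_t(x) = \min_a \big[ c(x,a) + \sum_y P(y \mid x,a)\, V_{t+1}(y) \big]$.

```python
import numpy as np

# Toy finite-horizon stochastic control problem (hypothetical data):
# 3 states, 2 actions, horizon T.
n_states, n_actions, T = 3, 2, 5

# P[a][x, y] = probability of moving from state x to state y under action a.
P = np.array([
    [[0.9, 0.1, 0.0], [0.1, 0.8, 0.1], [0.0, 0.1, 0.9]],  # action 0: mostly stay put
    [[0.2, 0.6, 0.2], [0.2, 0.2, 0.6], [0.1, 0.2, 0.7]],  # action 1: push toward state 2
])
# Stage cost c[x, a]; action 1 costs more but drifts toward the cheap state 2.
c = np.array([[2.0, 3.0], [1.0, 2.0], [0.0, 1.0]])
terminal = np.array([4.0, 2.0, 0.0])  # terminal cost g(x)

# Backward induction: V_t(x) = min_a [ c(x,a) + sum_y P(y|x,a) V_{t+1}(y) ]
V = terminal.copy()
policy = np.zeros((T, n_states), dtype=int)
for t in reversed(range(T)):
    # Q[x, a] = immediate cost plus expected cost-to-go under action a.
    Q = c + np.stack([P[a] @ V for a in range(n_actions)], axis=1)
    policy[t] = Q.argmin(axis=1)
    V = Q.min(axis=1)

print("V_0 =", V)               # optimal expected cost-to-go from each state at t=0
print("policy at t=0:", policy[0])
```

The same backward sweep works for any finite state and action space; the infinite-horizon case instead iterates the Bellman operator to a fixed point (value iteration).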