Many processes in agricultural food production require plant growth conditions to be maintained with relatively high accuracy, which makes sensors, controllers and the analysis of sensor data well suited to such processes. Sensors measuring temperature, humidity, carbon dioxide and light intensity, as well as optical sensors, play an important role in improving agricultural processes and reducing the consumption of various resources. The aim of the technology is to enable farmers to monitor and manage factors ranging from the availability of plant nutrients and soil acidity to the presence of pests and weather conditions, through to the health and general well-being of plants. Once these elements are interconnected, farmers can make informed decisions based on the collected data. In this study, a time series forecasting model based on the Random Forest machine learning algorithm [1] was implemented in an IoT (Internet of Things) prototype and used to perform experiments. The experiments with the acquired data showed how the frequency of data acquisition affects the accuracy of the forecasting model, as well as how the sizes of the training and testing sets affect its forecast accuracy.
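The paper does not give implementation details, but the described approach — Random Forest applied to time series forecasting of sensor readings, with a configurable train/test split — can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the synthetic sine-wave series stands in for real IoT temperature data, and the lag count, split ratio and hyperparameters are assumed values chosen for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
# Synthetic "hourly temperature" series standing in for real sensor data:
# a daily cycle plus noise (assumption for illustration only).
t = np.arange(500)
series = 20 + 5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)

def make_lagged(series, n_lags):
    """Build supervised pairs: n_lags past readings -> the next reading."""
    X = np.column_stack(
        [series[i : i + len(series) - n_lags] for i in range(n_lags)]
    )
    y = series[n_lags:]
    return X, y

n_lags = 24  # assumed window: one day of hourly readings
X, y = make_lagged(series, n_lags)

# Chronological split (no shuffling), so the model is tested on future data;
# varying this ratio corresponds to the train/test-size experiments described.
split = int(0.8 * len(X))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)
mae = np.mean(np.abs(pred - y_test))
print(f"test MAE: {mae:.3f}")
```

Changing the sampling step of the series would model the data-acquisition-frequency experiments, and changing the 0.8 ratio would model the training/testing set size experiments.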