From linear model to machine learning algorithms: performance analysis of horizontal GNSS velocities of coastal southern Scandinavia

Abstract:
Accurate determination of Global Navigation Satellite System (GNSS) velocities plays a critical role in describing tectonic processes, crustal deformation, and many other geodetic applications. This study compares the performance of linear regression (LR) models with machine learning (ML) algorithms, namely Decision Trees (DTs), Random Forest (RF), and Gaussian process regression (GPR), in estimating horizontal GNSS velocities. To this end, data from 20 Continuously Operating Reference Stations (CORSs) on the southern Scandinavian Peninsula were used. The findings show that the RF and GPR techniques clearly outperform the LR models, yielding R² values above 0.94 together with lower root-mean-square error (RMSE) and mean absolute error (MAE). The velocities estimated with DTs are comparable to the LR results, demonstrating that DTs are applicable to velocity estimation from GNSS time series; moreover, DTs capture nonlinear patterns better than the LR-based approach. By addressing the limitations of classical models, this study improves the accuracy of horizontal velocity estimates derived from GNSS time series and highlights the potential of ML techniques for GNSS velocity estimation. These findings provide a useful framework for improving geophysical and geodetic analyses in infrastructure planning, crustal movement monitoring, and disaster risk reduction.

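The model comparison described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: it fits LR, DT, RF, and GPR models to a synthetic daily position series for a single hypothetical station and reports R², RMSE, and MAE for each. The trend value, annual-signal amplitude, noise level, and all hyperparameters are assumptions chosen only to make the sketch run.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic daily east-component series for one hypothetical CORS (in mm):
# linear trend (the velocity of interest) + annual signal + observation noise.
t = np.arange(0, 3, 1 / 365.25)            # 3 years, decimal years
velocity = 15.0                            # assumed trend, mm/yr
series = (velocity * t
          + 2.0 * np.sin(2 * np.pi * t)    # assumed annual signal, mm
          + rng.normal(0.0, 1.5, t.size))  # assumed noise, mm

X = t.reshape(-1, 1)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, series, test_size=0.3, random_state=0)

# Candidate models; hyperparameters are illustrative, not from the study.
models = {
    "LR":  LinearRegression(),
    "DT":  DecisionTreeRegressor(max_depth=6, random_state=0),
    "RF":  RandomForestRegressor(n_estimators=200, random_state=0),
    "GPR": GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                    normalize_y=True),
}

# Fit each model and report the three accuracy metrics used in the study.
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: R2={r2_score(y_te, pred):.3f}  "
          f"RMSE={mean_squared_error(y_te, pred) ** 0.5:.3f} mm  "
          f"MAE={mean_absolute_error(y_te, pred):.3f} mm")
```

On real CORS data the input would be the station position time series rather than this synthetic trend-plus-seasonal signal, but the fit-and-score loop has the same shape.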