The introduction of fifth-generation (5G) and sixth-generation (6G) networks has opened new possibilities, enabling advanced technologies such as drones and virtual and augmented reality. However, these networks require dynamic radio resource management (RRM), which in turn depends on tracking current network indicators and predicting their future values.
Researchers have therefore turned to artificial intelligence (AI) and machine learning (ML) algorithms to forecast mobile network profiles accurately. Applying AI and ML in 5G networks supports effective and rational network planning and management. A prominent ML application in 5G and 6G networks is network traffic forecasting, which monitors user demand and analyzes user behavior across applications.
Against this backdrop, researchers at RUDN University recently studied traffic forecasting. They explored two popular time-series analysis models: the Holt-Winters model and the Seasonal Autoregressive Integrated Moving Average (SARIMA) model. They used a Portuguese mobile operator's dataset of hourly aggregated download and upload traffic statistics. One of the researchers emphasized that the growing number of connected devices has led to a sharp rise in traffic volume, causing issues such as network congestion, degraded quality of service, delays, data loss, and the blocking of new connections. Network architectures must therefore adapt to the increasing traffic volume and accommodate several types of traffic with different requirements.
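For readers who want to experiment with this kind of setup, a minimal sketch of the SARIMA side using Python's statsmodels library might look as follows. The file name, column names, and model orders here are illustrative assumptions, not the study's actual data or tuned configuration:

```python
# Minimal sketch (not the authors' code): fit a SARIMA model to hourly
# mobile traffic with a 24-hour seasonal period using statsmodels.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical hourly user-to-base-station (uplink) traffic volumes,
# indexed by timestamp.
uplink_mb = pd.read_csv("uplink_hourly.csv", index_col="timestamp",
                        parse_dates=True)["traffic_mb"]

# SARIMA(p, d, q)(P, D, Q)_s: the orders below are illustrative, not the
# paper's hyperparameters; s=24 captures the daily cycle in hourly data.
model = SARIMAX(uplink_mb, order=(1, 1, 1), seasonal_order=(1, 1, 1, 24))
fit = model.fit(disp=False)

# One-step-ahead forecast: the traffic expected in the following hour.
print(fit.forecast(steps=1))
```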
The researchers found that both models were highly accurate at forecasting traffic for the following hour. SARIMA proved better at predicting user-to-base-station (uplink) traffic, with an average error of just 11.2%; the researchers attribute this to its capacity to capture temporal patterns, which makes it effective at tracking transient variations in mobile network traffic. In contrast, the Holt-Winters model performed better at estimating base-station-to-user (downlink) traffic, with an error of no more than 4%, a result the researchers attribute to its ability to handle the intricate seasonality and trend components of some traffic datasets.
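The Holt-Winters side can be sketched in the same spirit with statsmodels' triple exponential smoothing. Again, the data source and the additive trend and seasonality settings are assumptions for illustration, not the paper's configuration:

```python
# Minimal sketch: Holt-Winters (triple exponential smoothing) for hourly
# base-station-to-user (downlink) traffic, with additive trend and a
# 24-hour additive seasonal component.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical hourly downlink traffic, analogous to `uplink_mb` above.
downlink_mb = pd.read_csv("downlink_hourly.csv", index_col="timestamp",
                          parse_dates=True)["traffic_mb"]

hw_model = ExponentialSmoothing(downlink_mb, trend="add",
                                seasonal="add", seasonal_periods=24)
hw_fit = hw_model.fit()

# Forecast the next hour's downlink traffic.
print(hw_fit.forecast(steps=1))
```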
The researchers used multiple criteria to measure the models' performance: Mean Square Error (MSE), Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), and Mean Squared Logarithmic Error (MSLE). They noted that while both models worked well, performance could be further improved by fine-tuning specific hyperparameters, and that no single method is universally applicable to all situations. The researchers intend to combine statistical models with AI and ML techniques to refine predictions and detect anomalies promptly.
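These five criteria are standard and easy to reproduce. A short illustrative implementation, with hypothetical `y_true` and `y_pred` arrays and assuming strictly positive traffic values so MAPE and MSLE are well defined, could look like this:

```python
# Illustrative implementations of the five evaluation criteria on a
# held-out hour-by-hour test set.
import numpy as np

def evaluate(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    mse = np.mean((y_true - y_pred) ** 2)
    return {
        "MSE": mse,
        "RMSE": np.sqrt(mse),
        "MAE": np.mean(np.abs(y_true - y_pred)),
        # Expressed as a percentage, matching the error rates quoted above.
        "MAPE": np.mean(np.abs((y_true - y_pred) / y_true)) * 100,
        "MSLE": np.mean((np.log1p(y_true) - np.log1p(y_pred)) ** 2),
    }

# Hypothetical actual vs. forecast hourly traffic volumes.
print(evaluate([120.0, 95.0, 130.0], [110.0, 98.0, 125.0]))
```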
In conclusion, this study showed that with AI and ML algorithms, 5G and 6G network providers can effectively anticipate and respond to evolving traffic dynamics. As the researchers work to improve the approach's efficiency, and with it user satisfaction, the pursuit of accurate traffic forecasting and anomaly detection to maximize 5G and 6G network performance continues.
Check out the Paper. All credit for this research goes to the researchers of this project.
Rachit Ranjan is a consulting intern at MarktechPost. He is currently pursuing his B.Tech at the Indian Institute of Technology (IIT) Patna. He is actively shaping his career in the field of Artificial Intelligence and Data Science and is passionate about and dedicated to exploring these fields.