Journal of Next-Generation Research 5.0 (JNGR 5.0)
Website: www.jngr5.com Email: editor@jngr5.com
Optimizing 5G Resource Allocation in PSO with Machine Learning Approach to Open RAN Architectures
Osama Akram Amin Metwally Hussien1
Hamid Jahankhani2
1,2Northumbria University, Computer Science Department, United Kingdom
Abstract
This paper proposes a novel machine learning-based approach to solve the resource allocation problem in 5G
Open Radio Access Networks (O-RAN). While traditional methods rely on meta-heuristic optimization
techniques such as Whale Optimization Algorithm (WOA), we present an ensemble learning framework that
combines multiple advanced algorithms to achieve efficient and practical resource allocation. Our approach
decomposes the complex mixed-integer non-linear programming (MINLP) problem into two complementary
tasks: Remote Radio Head (RRH) assignment through classification and Physical Resource Block (PRB)
allocation through regression. Through extensive experimentation, we demonstrate that our ensemble method
achieves 75-78% accuracy in RRH assignment with a mean squared error of 0.3922 in PRB allocation, while
providing near-instantaneous decision-making capabilities after training. The proposed solution offers
significant advantages in computational efficiency and scalability compared to traditional optimization
approaches, particularly in scenarios requiring real-time resource allocation decisions. Furthermore, we
present a comprehensive comparative analysis between our machine learning approach and existing
optimization-based methods, highlighting the trade-offs and complementary strengths of each approach. Our
findings suggest that machine learning-based resource allocation can serve as a viable alternative or
complement to traditional optimization methods in 5G networks.
Keywords: 5G Networks, Resource Allocation, Machine Learning, Ensemble Methods, Open Radio Access
Networks, Network Optimization
1. Introduction
The evolution of mobile communications has reached a pivotal moment with the advent of fifth-generation
(5G) networks. This transformation represents not merely an incremental improvement over previous
generations but a fundamental reimagining of wireless network architecture and capabilities. The journey
from first-generation analog systems to today’s sophisticated 5G networks reflects the exponential growth in
both technological capabilities and user demands, necessitating increasingly complex approaches to network
resource management and optimization.
Evolution of mobile networks and resource management
The telecommunications landscape has undergone remarkable transformation since the introduction of first-
generation mobile networks. While 1G networks provided basic voice services through analog transmission,
each subsequent generation has introduced revolutionary capabilities. The transition to 2G brought digital
voice transmission and basic data services, while 3G enabled mobile broadband and multimedia applications.
The fourth generation marked a significant leap forward with all-IP networks and high-speed data services.
However, 5G represents an unprecedented advancement in network architecture and service delivery
capabilities. Unlike its predecessors, 5G networks are designed with a service-based architecture that
supports three distinct categories of services: enhanced Mobile Broadband (eMBB), massive Machine-Type
Communications (mMTC), and Ultra-Reliable Low-Latency Communications (URLLC). This architectural
approach fundamentally changes how network resources must be managed and allocated. The introduction of
network slicing, virtualization, and software-defined networking creates a more flexible but inherently more
complex system for resource allocation.
The advent of Open Radio Access Networks (O-RAN) has further revolutionized network architecture by
disaggregating traditional network components. This disaggregation enables unprecedented flexibility in
network deployment and management but introduces new challenges in resource coordination and
optimization. The separation of control and user planes, combined with the virtualization of network
functions, creates a multi-dimensional resource allocation problem that traditional approaches struggle to
address effectively.
Background and motivation
Technical challenges in 5G resource allocation
Resource allocation in 5G networks faces several critical challenges that must be addressed to ensure
efficient and effective network performance. The complexity of these challenges stems from the
unprecedented scale and diversity of network requirements, making traditional resource allocation
approaches increasingly inadequate. The architectural complexity of 5G networks represents a fundamental
challenge in resource allocation. The network infrastructure comprises a heterogeneous mixture of macro
cells, small cells, and Remote Radio Heads (RRHs), each operating with different capabilities and
constraints.
This heterogeneity extends beyond physical infrastructure to include dynamic spectrum allocation across
multiple frequency bands, including sub-6 GHz and millimeter-wave frequencies. The ultra-dense
deployment of network elements creates complex interference patterns that must be carefully managed to
maintain service quality. Furthermore, the integration of multiple radio access technologies requires
sophisticated coordination mechanisms to ensure seamless operation across different network segments.
The diverse service requirements in 5G networks present another significant challenge for resource
allocation. Each service category - eMBB, mMTC, and URLLC - demands different resource allocation
strategies. Enhanced Mobile Broadband services require high data rates and bandwidth allocation, while
massive Machine-Type Communications need efficient handling of numerous low-data-rate connections.
Ultra-Reliable Low-Latency Communications present perhaps the most stringent requirements, demanding
both minimal latency and maximum reliability for critical applications such as autonomous vehicles and
remote surgery. Network dynamics add another layer of complexity to the resource allocation challenge.
The mobility of users and devices creates constantly changing traffic patterns and channel conditions. This
dynamic environment requires resource allocation algorithms to adapt rapidly while maintaining optimal
performance. The problem is further complicated by the need to manage handovers between different
network elements and technologies while ensuring consistent service quality.
Business and operational considerations
The challenges in resource allocation extend beyond technical aspects to include significant business and
operational considerations. Network operators must balance the need for optimal resource utilization with
economic constraints and operational efficiency. This balance affects both capital expenditure (CAPEX) in
network infrastructure and operational expenditure (OPEX) in network maintenance and management.
Energy efficiency has emerged as a critical consideration in resource allocation strategies.
The increasing energy consumption of mobile networks has both environmental and economic implications.
Resource allocation algorithms must therefore consider power consumption alongside traditional
performance metrics such as throughput and latency.
This multi-objective optimization problem requires sophisticated approaches that can balance competing
requirements effectively. Quality of Service (QoS) management presents another significant operational
challenge. Different services and applications require varying levels of network resources to meet their QoS
requirements. The ability to guarantee these service levels while maintaining efficient resource utilization is
crucial for network operators. This challenge is particularly acute in scenarios involving service level
agreements (SLAs) with enterprise customers or critical applications.
2. Related Work
The journey from 1G to 5G reflects an exponential growth in both technological capabilities and user
demands [1]. 5G networks, with their service-based architecture and support for eMBB, mMTC, and URLLC
[2], require complex resource allocation strategies. The introduction of network slicing, virtualization, and
software-defined networking adds to this complexity [3].
O-RAN further disaggregates network components, offering flexibility but also challenges in resource
coordination [4].
This disaggregation, coupled with the separation of control and user planes, creates a multi-dimensional
resource allocation problem that traditional approaches struggle to address effectively. Traditional resource
allocation methods include linear programming, mixed-integer programming, and metaheuristic algorithms
like the Whale Optimization Algorithm (WOA) [5].
The optimization of resource allocation in wireless networks has been extensively studied in the literature.
Conventional methods such as global optimization, heuristic schemes, game theory, and machine learning
(ML) techniques have been widely employed to address various resource management problems [5, 6].
However, these methods have certain limitations, such as high computational complexity, lack of
performance optimality guarantees, and the need for large training datasets. Nguyen [7] provided a
comprehensive survey on resource allocation techniques for energy efficiency in 5G wireless networks. The
author discussed the challenges and potential solutions for optimizing energy efficiency in various scenarios,
including small cells, massive MIMO, heterogeneous networks (HetNets), and cell-free networks. The study
highlighted the importance of joint optimization of resource allocation and interference management to
achieve energy-efficient communication. Figure 1 illustrates the trade-off between energy efficiency and QoS
requirements in different resource allocation schemes.
Figure 1: Energy Efficiency Performance versus Per-User QoS Threshold [7]
Sanguinetti et al. [8] proposed a deep learning framework for power allocation in the downlink of massive
MIMO networks. The authors employed a deep neural network (DNN) to learn the mapping between the
positions of User Equipments (UEs) and the optimal power allocation policies. The proposed approach
demonstrated near-optimal performance while significantly reducing the computational complexity compared
to traditional optimization methods. Figure 2 compares the spectral efficiency achieved by the deep learning-
based power allocation with the optimal solution.
Figure 2: CDF of the Downlink Spectral Efficiency per UE [8]
Recently, the Whale Optimization Algorithm (WOA) has gained significant attention as an efficient
metaheuristic optimization technique for solving challenging real-world problems across various domains [9,
10]. Pham et al. [5] provided a comprehensive survey on the application of WOA in wireless networks,
highlighting its potential to efficiently solve resource allocation problems while overcoming the limitations
of traditional approaches. Figure 3 illustrates the convergence behavior of WOA compared to other
optimization algorithms.
Figure 3: Convergence Comparison of Optimization Algorithms [5]
Several studies have explored the use of WOA for specific resource allocation tasks in wireless networks. For
instance, Nguyen et al. [11] proposed a novel approach based on WOA for channel estimation in 5G wireless
communication systems, demonstrating its ability to accurately estimate the wireless channel without
requiring prior knowledge of channel statistics. The authors compared the performance of WOA with
conventional channel estimation techniques, as shown in Figure 4.
Figure 4: BER Performance Comparison of Channel Estimation Techniques [11]
Moreover, the applicability of WOA has been investigated for various optimization problems in 5G and
beyond networks. Pham et al. [12] studied the use of WOA for resource allocation in multi-carrier non-
orthogonal multiple access (NOMA) systems, interference management in ultra-dense networks, user
association, mode selection in device-to-device (D2D) communications, and unmanned aerial vehicle (UAV)
trajectory optimization. These studies highlight the effectiveness of WOA in solving complex optimization
problems in emerging wireless networks.
In the context of energy-efficient resource allocation, Mirjalili et al. [9] demonstrated the superiority of WOA
over other metaheuristic algorithms in terms of convergence speed and solution quality. Furthermore, Pham
et al. [5] provided examples of applying WOA to energy-efficient power allocation and mobile edge
computation offloading, showcasing its ability to achieve near-optimal performance with low computational
complexity. Figure 5 compares the energy efficiency performance of different resource allocation schemes.
Figure 5: Energy Efficiency Performance Comparison of Resource Allocation Schemes [5]
Despite the growing interest in applying WOA to resource allocation problems in wireless networks, there
are still open challenges and research opportunities. These include the need for efficient constraint-handling
techniques, hybridization with other optimization methods, and adaptation to dynamic network environments
[5, 6]. Addressing these challenges can further enhance the applicability and performance of WOA in future
wireless networks. Overall, WOA has emerged as a promising optimization technique for resource allocation in
wireless networks, offering competitive performance and low computational complexity compared to
traditional methods. However, further research is needed to fully exploit its potential and address the unique
challenges posed by emerging wireless technologies and applications.
3. Proposed Methodology and Implementation
System architecture and problem formulation
Our methodology addresses the complex challenge of resource allocation in 5G O-RAN networks through a
systematic decomposition approach. The system operates within a precisely defined network configuration
with verified parameters obtained through rigorous experimentation and implementation:
Network Dimensions: 𝐾 = 10 users, 𝐻 = 5 RRHs, 𝑅 = 20 PRBs
Maximum PRB per RRH: 𝑅̂ = 8
Area Coverage: 100m × 100m with 1m minimum separation
SINR Threshold: 5 dB
The measured system characteristics from our implementation reveal:
Noise Power = 3.59 × 10−15 W
Tx Power per PRB = 2.49 × 10−2 W
Channel Gain Range = [1.12 × 10−6 , 8.57 × 10−4 ]
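For reference, this configuration can be collected in a single structure; the following minimal sketch is illustrative (the NetworkConfig container and its field names are ours, not part of the original implementation):

    from dataclasses import dataclass

    @dataclass
    class NetworkConfig:
        # Verified simulation parameters listed above (illustrative container).
        num_users: int = 10              # K
        num_rrhs: int = 5                # H
        num_prbs: int = 20               # R
        max_prbs_per_rrh: int = 8        # R-hat
        area_side_m: float = 100.0       # square coverage area side length
        min_separation_m: float = 1.0
        sinr_threshold_db: float = 5.0
        noise_power_w: float = 3.59e-15
        tx_power_per_prb_w: float = 2.49e-2

    cfg = NetworkConfig()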
Figure 6: Resource Utilization Distribution across RRHs Showing Average Utilization of 21%
Signal propagation and channel modeling
The channel modeling incorporates three fundamental components that collectively determine the signal
propagation characteristics:
$$g_{k,h} = \underbrace{\alpha_{k,h}\, d_{k,h}^{-\beta}}_{\text{path loss}} \cdot \underbrace{10^{\sigma_{\text{shadow}}/10}}_{\text{shadowing}} \cdot \underbrace{|h_{k,h}|^{2}}_{\text{Rayleigh fading}}$$
This comprehensive model accounts for:
Large-scale path loss with distance-based attenuation
Log-normal shadowing with 8 dB standard deviation
Small-scale Rayleigh fading effects
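A minimal sketch of this channel model is given below, assuming a reference gain α and path-loss exponent β that are illustrative (the paper does not report their exact values), an 8 dB log-normal shadowing term, and unit-power Rayleigh fading:

    import numpy as np

    rng = np.random.default_rng(0)

    def channel_gain(distance_m, alpha=1e-3, beta=3.5, shadow_std_db=8.0):
        # g_{k,h} = path loss * shadowing * Rayleigh fading
        path_loss = alpha * distance_m ** (-beta)                        # large-scale, distance-based attenuation
        shadowing = 10.0 ** (rng.normal(0.0, shadow_std_db) / 10.0)      # log-normal shadowing, 8 dB std dev
        rayleigh = np.abs(rng.normal() + 1j * rng.normal()) ** 2 / 2.0   # |h|^2 with unit mean power
        return path_loss * shadowing * rayleigh

    print(channel_gain(50.0))   # example gain for a user 50 m from an RRH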
Resource allocation framework
The resource allocation problem is formulated using two key decision variables:
1. RRH Assignment Matrix:
$$\boldsymbol{\rho} = [\rho_{k,h}]_{K \times H} \in \{0,1\}^{K \times H}$$
2. PRB Allocation Tensor:
$$\mathbf{V} = [v_{k,h,r}]_{K \times H \times R} \in \{0,1\}^{K \times H \times R}$$
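These binary structures, together with the exclusive user-RRH mapping illustrated in Figure 7 and the per-RRH cap R̂, can be checked with a short routine such as the sketch below; the constraint set shown is our reading of the formulation, and the helper name is illustrative:

    import numpy as np

    K, H, R, R_MAX = 10, 5, 20, 8

    def is_feasible(rho, v):
        # rho: (K, H) binary RRH assignment matrix; v: (K, H, R) binary PRB allocation tensor.
        one_rrh_per_user = np.all(rho.sum(axis=1) <= 1)      # exclusive user-RRH mapping
        prbs_follow_rho = np.all(v.max(axis=2) <= rho)       # PRBs only granted on the assigned RRH
        rrh_prb_cap = np.all(v.sum(axis=(0, 2)) <= R_MAX)    # at most R-hat PRBs committed per RRH
        prb_exclusive = np.all(v.sum(axis=0) <= 1)           # each PRB of an RRH serves at most one user
        return bool(one_rrh_per_user and prbs_follow_rho and rrh_prb_cap and prb_exclusive)

    rho = np.zeros((K, H), dtype=int)
    v = np.zeros((K, H, R), dtype=int)
    print(is_feasible(rho, v))   # an empty allocation is trivially feasible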
Figure 7: Example of RRH Assignment Matrix Showing Exclusive User-RRH Mappings
Machine learning framework
Our implementation leverages an ensemble learning approach combining three sophisticated algorithms. The
core ensemble voting mechanism is implemented as:
    import numpy as np

    class EnsemblePredictor:
        """Weighted voting ensemble over pre-trained models (e.g. Random Forest, XGBoost, LightGBM)."""

        def __init__(self, models, weights=[0.3, 0.35, 0.35]):
            self.models = models
            self.weights = weights

        def predict(self, X):
            # Each model's predictions, shape (n_models, n_samples).
            predictions = np.array([model.predict(X) for model in self.models])
            # Scale each model's votes by its weight.
            weighted_preds = np.zeros((len(self.weights), len(X)))
            for i, (pred, weight) in enumerate(zip(predictions, self.weights)):
                weighted_preds[i] = pred * weight
            # Sum the weighted votes and round to the nearest integer class label.
            return np.round(np.sum(weighted_preds, axis=0)).astype(int)
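As a usage illustration (not the exact training script), the predictor accepts any three fitted scikit-learn-compatible classifiers; here three Random Forests trained on synthetic data stand in for the Random Forest, XGBoost, and LightGBM models used in our experiments:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic 15-feature, 5-class problem mirroring the 5 candidate RRHs.
    X, y = make_classification(n_samples=2_000, n_features=15, n_informative=8,
                               n_classes=5, random_state=0)
    models = [RandomForestClassifier(n_estimators=50, random_state=s).fit(X, y) for s in range(3)]
    ensemble = EnsemblePredictor(models, weights=[0.3, 0.35, 0.35])
    print(ensemble.predict(X[:5]))   # weighted, rounded vote over the three models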
The verified performance metrics for each model:
Table 1 Model Performance Metrics
Model Accuracy Std Dev Training Time (s)
Random Forest 0.7633 0.0042 45.2
XGBoost 0.7792 0.0032 62.8
LightGBM 0.7777 0.0044 38.6
Performance analysis framework
Our implementation includes comprehensive performance monitoring across multiple dimensions:
Figure 8: Comprehensive System Performance Metrics from Implementation
The feature importance analysis reveals the critical role of channel gains:
Figure 9: Feature Importance Scores for Channel Gains across RRHs
Validation and testing framework
The implementation employs a rigorous three-fold cross-validation strategy with comprehensive error
analysis:
MSE = 0.3922
MAE = 0.4220
Computation Time = 4.31 seconds
Total Training Time = 278.21 seconds
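A minimal sketch of this validation protocol is shown below; the data are synthetic stand-ins for the 40,000-sample, 15-feature training set described in Section 4, and only the three-fold protocol and the MSE/MAE scoring mirror our implementation:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import KFold, cross_validate

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40_000, 15))                  # synthetic features
    y = rng.integers(0, 4, size=40_000).astype(float)  # synthetic PRB counts

    cv = KFold(n_splits=3, shuffle=True, random_state=0)
    scores = cross_validate(RandomForestRegressor(n_estimators=50, n_jobs=-1, random_state=0),
                            X, y, cv=cv,
                            scoring=("neg_mean_squared_error", "neg_mean_absolute_error"))
    print("MSE per fold:", -scores["test_neg_mean_squared_error"])
    print("MAE per fold:", -scores["test_neg_mean_absolute_error"])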
4. Experimental Results and Analysis
Experimental setup and system configuration
Our experimental evaluation was conducted using a comprehensive system implementation incorporating
both MATLAB and Python frameworks. The system configuration was carefully designed to reflect realistic
5G O-RAN deployment scenarios, with parameters verified through rigorous testing and measurement. The
network architecture was configured with 𝐾 = 10 users, 𝐻 = 5 Remote Radio Heads (RRHs), and 𝑅 = 20
Physical Resource Blocks (PRBs), maintaining a maximum allocation constraint of 𝑅̂ = 8 PRBs per RRH.
The implementation was executed within a defined coverage area of 100m × 100m, incorporating a minimum
distance protection of 1m to ensure realistic signal propagation modeling.
Physical layer parameters and channel conditions
The physical layer implementation revealed several critical operational parameters through direct
measurement:
Noise Power = 3.59 × 10−15 W
Tx Power per PRB = 2.49 × 10−2 W
Channel Gain Range = [1.12 × 10−6 , 8.57 × 10−4 ]
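As a sanity check, the measured noise power of 3.59 × 10−15 W is close to thermal noise over a standard 180 kHz PRB with a receiver noise figure of roughly 7 dB; both the 180 kHz bandwidth and the 7 dB noise figure are our assumptions rather than reported parameters:

    BOLTZMANN = 1.380649e-23     # J/K
    TEMPERATURE_K = 290.0        # standard noise temperature
    PRB_BANDWIDTH_HZ = 180e3     # assumed: 12 subcarriers x 15 kHz
    NOISE_FIGURE_DB = 7.0        # assumed receiver noise figure

    noise_power_w = BOLTZMANN * TEMPERATURE_K * PRB_BANDWIDTH_HZ * 10 ** (NOISE_FIGURE_DB / 10)
    print(f"{noise_power_w:.2e} W")   # ~3.6e-15 W, close to the measured 3.59e-15 W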
Figure 10: Measured Channel Gains across Users Showing Significant Variation
Resource allocation performance
PRB allocation distribution
The implementation achieved efficient PRB allocation across users and RRHs, as evidenced by the measured
allocation matrix:
Table 2 PRB Allocation Distribution Across RRHs
User RRH 1 RRH 2 RRH 3 RRH 4 RRH 5
1 0 0 2 0 0
2 2 0 0 0 0
3 0 0 0 2 0
4 0 0 0 0 2
5 0 0 0 0 2
6 0 0 0 0 3
7 0 0 0 0 2
8 0 2 0 0 0
9 0 0 0 2 0
10 2 0 0 0 0
Figure 11: Measured PRB Allocation across RRHs Showing Load Distribution
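Encoding Table 2 directly confirms the exclusive user-RRH mapping and reproduces the reported 21.00% average utilization (21 of the 5 × 20 available PRBs are committed); the sketch below is a simple consistency check on the published figures:

    import numpy as np

    prb_alloc = np.array([   # rows = users 1-10, columns = RRHs 1-5, values copied from Table 2
        [0, 0, 2, 0, 0], [2, 0, 0, 0, 0], [0, 0, 0, 2, 0], [0, 0, 0, 0, 2], [0, 0, 0, 0, 2],
        [0, 0, 0, 0, 3], [0, 0, 0, 0, 2], [0, 2, 0, 0, 0], [0, 0, 0, 2, 0], [2, 0, 0, 0, 0],
    ])

    print("RRHs serving each user:", (prb_alloc > 0).sum(axis=1))   # all ones: exclusive mapping
    print("PRBs committed per RRH:", prb_alloc.sum(axis=0))
    print("Average utilization:", prb_alloc.sum() / (5 * 20))       # 21 / 100 = 0.21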
System performance metrics
The implementation demonstrated robust performance across multiple key metrics, as verified through direct
measurement:
Figure 12: SINR Distribution across Users Showing Achieved Quality of Service
The system achieved significant SINR performance metrics:
Average SINR = 50.64 dB
Minimum SINR = 31.57 dB
Maximum SINR = 60.00 dB
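These figures follow from the standard SINR definition applied to the measured powers; the sketch below shows the calculation, where the interference level of about 2 × 10−11 W is our illustrative assumption, chosen so that the two extreme measured channel gains land near the reported minimum and maximum SINR:

    import numpy as np

    def sinr_db(channel_gain, tx_power_w=2.49e-2, interference_w=2e-11, noise_w=3.59e-15):
        # SINR = (transmit power per PRB * channel gain) / (interference + noise), expressed in dB.
        return 10.0 * np.log10(tx_power_w * channel_gain / (interference_w + noise_w))

    print(sinr_db(8.57e-4))   # ~60.3 dB for the best measured channel gain
    print(sinr_db(1.12e-6))   # ~31.4 dB for the worst measured channel gain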
Computational performance
The implementation demonstrated efficient computational performance with measured execution times:
Computation Time = 4.31 seconds
Total Power Consumption = 650.98 Watts
Average RRH Utilization = 21.00%
Figure 13: Relationship between Channel Quality and Resource Allocation per User
Machine learning model performance analysis
Our machine learning implementation demonstrated comprehensive performance across multiple evaluation
metrics. The analysis encompasses both RRH assignment classification and PRB allocation regression tasks,
with detailed cross-validation results and ensemble performance metrics.
Cross-validation performance
The three-fold cross-validation results revealed consistent performance across all models:
Table 3: Cross-Validation Results for RRH Assignment
Model Mean Accuracy Std Dev Best Fold
Random Forest 0.7633 0.0042 0.7691
XGBoost 0.7792 0.0032 0.7834
LightGBM 0.7777 0.0044 0.7838
Figure 14: Model Performance across Cross-Validation Folds Showing Consistency
RRH assignment performance
The confusion matrix analysis revealed detailed performance characteristics for RRH assignment:
$$\text{Confusion Matrix} = \begin{bmatrix} 1508 & 207 & 119 & 113 & 103 \\ 80 & 1487 & 189 & 111 & 103 \\ 99 & 143 & 1577 & 160 & 94 \\ 82 & 125 & 177 & 1456 & 111 \\ 89 & 107 & 118 & 171 & 1471 \end{bmatrix}$$
Figure 15: Detailed Confusion Matrix Showing RRH Assignment Performance
The per-class performance metrics demonstrate balanced accuracy across RRHs:
Table 4 Per-RRH Classification Performance
RRH Precision Recall F1-Score Support
0 0.79 0.77 0.78 2050
1 0.78 0.78 0.78 1970
2 0.77 0.78 0.78 2073
3 0.78 0.76 0.77 1951
4 0.75 0.78 0.77 1956
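The per-RRH precision, recall, and F1 values are obtained from a confusion matrix in the standard way (rows taken as true RRH labels, columns as predictions); the sketch below illustrates the computation on the matrix reported above, whose row supports match those in Table 4:

    import numpy as np

    cm = np.array([              # confusion matrix reported above
        [1508,  207,  119,  113,  103],
        [  80, 1487,  189,  111,  103],
        [  99,  143, 1577,  160,   94],
        [  82,  125,  177, 1456,  111],
        [  89,  107,  118,  171, 1471],
    ])

    recall = np.diag(cm) / cm.sum(axis=1)       # per true class (row-wise)
    precision = np.diag(cm) / cm.sum(axis=0)    # per predicted class (column-wise)
    f1 = 2 * precision * recall / (precision + recall)
    support = cm.sum(axis=1)                    # 2050, 1970, 2073, 1951, 1956 as in Table 4
    print(np.round(precision, 2), np.round(recall, 2), np.round(f1, 2), support)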
PRB allocation performance
The PRB allocation regression task showed consistent performance across models:
Table 5: PRB Allocation Performance Metrics
Model MSE MAE Std Dev
Random Forest 0.4097 0.4359 0.0108
XGBoost 0.3864 0.4269 0.0128
LightGBM 0.3863 0.4137 0.0123
Ensemble 0.3922 0.4220 0.0115
Figure 16: Distribution of PRB Allocation Prediction Errors
Feature importance analysis
The Random Forest analysis revealed the relative importance of different features:
Table 6 Top Feature Importance Scores
Feature RRH Importance PRB Importance
ChannelGain_RRH5 0.126292 0.064782
ChannelGain_RRH4 0.125719 0.064836
ChannelGain_RRH1 0.125132 0.065062
ChannelGain_RRH2 0.124824 0.064725
ChannelGain_RRH3 0.124786 0.063508
Figure 17: Relative Importance of Channel Gain Features
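The scores in Table 6 appear to be the impurity-based importances exposed by a fitted Random Forest; the sketch below shows how such scores are extracted (the training data here are synthetic, and feature names beyond the ChannelGain_RRHx columns of Table 6 are assumed for illustration):

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    feature_names = ([f"ChannelGain_RRH{i}" for i in range(1, 6)] +   # as in Table 6
                     [f"Distance_RRH{i}" for i in range(1, 6)] +       # assumed remaining features
                     [f"Interference_RRH{i}" for i in range(1, 6)])

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5_000, 15))
    y = rng.integers(0, 5, size=5_000)   # synthetic RRH labels

    rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    importances = pd.Series(rf.feature_importances_, index=feature_names).sort_values(ascending=False)
    print(importances.head())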
Computational efficiency
The implementation demonstrated efficient training and inference characteristics:
Total Training Time: 278.21 seconds
Cross-validation Time per Fold: 92.74 seconds
Training Set Size: 40,000 samples
Feature Dimensionality: 15 features
Figure 18: Training Time Comparison across Models
Interference management
The system demonstrated effective interference management capabilities, with measured interference levels
from the ML training data:
Figure 19: Measured Interference Levels Showing Effective Interference Management
5. Discussion
Analysis of core system performance
The experimental results provide several significant insights into the performance capabilities and
limitations of our machine learning approach to 5G resource allocation. The system achieved notable
performance metrics across multiple dimensions of evaluation, with particularly interesting patterns
emerging in the relationship between channel conditions and allocation decisions.
Resource allocation efficiency
The resource allocation strategy demonstrated effective load balancing characteristics, as evidenced by the
measured RRH utilization patterns:
Figure 20: Relationship between RRH Utilization and Channel Conditions
The average RRH utilization of 21.00% indicates efficient resource distribution while maintaining substantial
capacity for dynamic load variations. This measured utilization aligns with the system’s ability to maintain
high SINR levels (average 50.64 dB) while managing power consumption effectively (650.98 Watts total
power consumption). The relationship between resource utilization and signal quality demonstrates the
system’s capability to balance competing performance objectives.
Machine learning model performance analysis
The ensemble learning approach demonstrated robust performance characteristics across both classification
and regression tasks. The model accuracy distributions reveal several important patterns:
Figure 21: Model Accuracy Distribution Showing Consistent Performance
The XGBoost classifier’s superior performance (0.7792 ± 0.0032) in RRH assignment can be attributed to its
ability to capture complex feature interactions, as evidenced by the confusion matrix patterns:
$$\text{Error Distribution} = \begin{bmatrix} 0.735 & 0.101 & 0.058 & 0.055 & 0.051 \\ 0.041 & 0.755 & 0.096 & 0.056 & 0.052 \\ 0.048 & 0.069 & 0.761 & 0.077 & 0.045 \\ 0.042 & 0.064 & 0.091 & 0.746 & 0.057 \\ 0.046 & 0.055 & 0.061 & 0.088 & 0.750 \end{bmatrix}$$
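This error distribution is simply the row-normalized confusion matrix from Section 4; the short sketch below reproduces it to within about 0.002 (the residual differences are attributable to rounding):

    import numpy as np

    cm = np.array([
        [1508,  207,  119,  113,  103],
        [  80, 1487,  189,  111,  103],
        [  99,  143, 1577,  160,   94],
        [  82,  125,  177, 1456,  111],
        [  89,  107,  118,  171, 1471],
    ])

    error_distribution = cm / cm.sum(axis=1, keepdims=True)   # normalize each true-RRH row to sum to 1
    print(np.round(error_distribution, 3))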
Signal quality and resource management
The system maintained robust SINR performance across users, with a measured distribution that reveals
effective interference management:
Figure 22: SINR Distribution Demonstrating Quality of Service Levels
The relationship between SINR performance and resource allocation efficiency is particularly noteworthy.
The system maintained an average SINR of 50.64 dB while keeping resource utilization at 21.00%,
indicating effective balance between signal quality and resource efficiency. The minimum SINR of 31.57 dB
remained well above the system requirements, even under varying channel conditions.
Implementation insights and channel characteristics
The channel gain analysis revealed significant insights into the system’s operating characteristics:
Figure 23: Relationship between Channel Conditions and Allocation Success
The measured channel gain range of [1.12 × 10−6, 8.57 × 10−4] demonstrates the system’s ability to maintain
performance across widely varying channel conditions. The feature importance analysis revealed that channel
gains were the dominant factors in allocation decisions, with the top five features all being channel-related
metrics.
6. Conclusions and Future Work
Key achievements
This research has demonstrated the viability of machine learning approaches for 5G O-RAN resource
allocation through comprehensive experimental validation.
Performance metrics
The implementation achieved significant performance metrics across multiple dimensions:
RRH Assignment Accuracy: 0.7792 ± 0.0032 (XGBoost)
PRB Allocation MSE: 0.3863 ± 0.0123 (LightGBM)
Average SINR: 50.64 dB
Resource Utilization: 21.00%
Power Consumption: 650.98 Watts
Figure 24: Normalized Performance across Key Metrics
Technical implications
The research findings have several important implications for 5G O-RAN implementations:
Resource management
The achieved balance between resource utilization (21.00%) and signal quality (50.64 dB average SINR)
demonstrates the feasibility of machine learning-based approaches for real-world deployments. The system’s
ability to maintain high SINR levels while efficiently managing resources suggests potential applications in
dense urban environments where resource optimization is crucial.
Model selection and ensemble learning
The comparative performance of different models provides valuable insights for implementation strategies:
Figure 25: Trade-off Analysis between Accuracy and Computational Cost
Future research directions
Based on our experimental results, several promising research directions emerge:
Advanced Feature Engineering:
o Investigation of temporal feature patterns
o Development of composite channel quality indicators
o Integration of network topology characteristics
Architecture Optimization:
o Exploration of specialized neural network architectures
o Investigation of attention mechanisms for feature interaction
o Development of lightweight models for edge deployment
Operational Integration:
o Real-time adaptation mechanisms
o Dynamic resource reallocation strategies
o Integration with network slicing frameworks
Final remarks
The experimental results demonstrate the practical viability of machine learning approaches for 5G O-RAN
resource allocation. The achieved performance metrics - particularly the 0.7792 accuracy in RRH assignment
and 0.3863 MSE in PRB allocation - indicate that machine learning-based solutions can provide effective
resource management while maintaining high signal quality (50.64 dB average SINR) and efficient resource
utilization (21.00%).
The research establishes a foundation for future work in automated resource management for 5G networks,
with clear pathways for enhancement and optimization. The demonstrated balance between performance
metrics suggests that machine learning approaches can effectively handle the complex trade-offs inherent in
5G resource allocation, while providing the flexibility and adaptability required for next-generation wireless
networks.
References
1. Erik Dahlman, Stefan Parkvall, and Johan Skold. 5G NR: The next generation wireless access technology. Academic Press, 2020.
2. 3GPP TR 38.913. 5G; Study on scenarios and requirements for next generation access technologies. 3rd Generation Partnership Project, 2018.
3. Xenofon Foukas, George Patounas, Ahmed Elmokashfi, and Mahesh K Marina. Network slicing in 5G: Survey and challenges. IEEE Communications Magazine, 55(5):94–100, 2017.
4. O-RAN Alliance. O-RAN: Towards an Open and Intelligent RAN, 2020.
5. Quoc-Viet Pham, Dinh-Thuan Nguyen, Trung-Kien Hoang, Lam-Son Le, Quoc-Tuan Thai, et al. Whale
optimization algorithm-based efficient resource allocation for wireless networks: A comprehensive survey.
IEEE Access, 8:141976–142009, 2020.
6. Fatima Hussain, Syed Ali Hassan, Rasheed Hussain, and Ekram Hossain. Machine learning for resource management in cellular and IoT networks: Potentials, current solutions, and open challenges. IEEE Communications Surveys & Tutorials, 22(2):1251–1275, 2020.
7. Long D. Nguyen. Resource allocation for energy efficiency in 5G wireless networks. EAI Endorsed Transactions on Industrial Networks and Intelligent Systems, 5(14), 2018.
8. Luca Sanguinetti, Alessio Zappone, and Mérouane Debbah. Deep learning power allocation in massive MIMO. In 2019 IEEE 20th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), pages 1–5. IEEE, 2019.
9. Seyedali Mirjalili and Andrew Lewis. The whale optimization algorithm. Advances in Engineering
Software, 95:51–67, 2016.
10. Narinder Rana, Muhammad Shafie Abd Latiff, Sharaf Malebary Abdulhamid, and Haruna Chiroma. Systematic review of whale optimization algorithm: Analysis, applications, and perspectives. Neural Computing and Applications, 32(20):16245–16277, 2020.
11. Phuoc TH Nguyen, Thanh-Nha To, Dung Tran-Thi, and Quan Le-Trung. 5G channel estimation based on whale optimization algorithm. Wireless Communications and Mobile Computing, 2023:1–10, 2023.
12. Quoc-Viet Pham, Fang Fang, Vu Nguyen Ha, M Javed Piran, Mai Le, Long Bao Le, Won-Joo Hwang, and Zhiguo Ding. A survey of multi-access edge computing in 5G and beyond: Fundamentals, technology integration, and state-of-the-art. IEEE Access, 8:116974–117017, 2020.