Summary: | Machine learning has become a popular approach for predicting future demand. In optical access networks, machine learning can be used to predict bandwidth demand and thereby reduce delays. This paper presents a machine learning approach that learns the queueing time in XGPON from the traffic load, number of frames, and packet size. Queueing time contributes to upstream delay, so predicting it accurately can help improve network performance. The regression output R obtained from the trained ANN is close to 1, and the mean squared error (MSE) is significantly low, which shows that machine learning-based queueing time analysis offers another dimension of delay analysis on top of numerical analysis. © 2021, Universiti Malaysia Perlis. All rights reserved.
|
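The abstract does not include the authors' implementation. As a rough illustration only, the following minimal sketch (using synthetic data and scikit-learn's MLPRegressor, both hypothetical choices not taken from the paper) fits an ANN that predicts queueing time from traffic load, number of frames, and packet size, then reports MSE and the correlation R between targets and predictions:

```python
# Hypothetical sketch: train an ANN to predict XGPON queueing time from
# traffic load, number of frames, and packet size, then report MSE and R.
# The data below is synthetic; the paper's actual dataset, network topology,
# and ANN architecture are not described in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 2000
traffic_load = rng.uniform(0.1, 0.9, n)    # normalised offered load (assumed range)
num_frames   = rng.integers(1, 50, n)      # frames queued per cycle (assumed range)
packet_size  = rng.integers(64, 1518, n)   # packet size in bytes (Ethernet range)

# Placeholder relation standing in for the real measured queueing time
queueing_time = (
    120 * traffic_load / (1 - traffic_load)
    + 0.5 * num_frames
    + 0.01 * packet_size
    + rng.normal(0, 2, n)
)

X = np.column_stack([traffic_load, num_frames, packet_size])
X_train, X_test, y_train, y_test = train_test_split(X, queueing_time, random_state=0)

# Small feed-forward ANN with feature scaling (architecture is illustrative)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
mse = mean_squared_error(y_test, y_pred)
r = np.corrcoef(y_test, y_pred)[0, 1]      # R close to 1 indicates a good fit

print(f"MSE: {mse:.4f}")
print(f"R:   {r:.4f}")
```

In this sketch, a low MSE and an R close to 1 on held-out data would mirror the kind of result the abstract reports for the trained ANN.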