Machine Learning-Based Queueing Time Analysis in XGPON

Bibliographic Details
Published in: International Journal of Nanoelectronics and Materials
Main Author: Ismail N.A.; Idrus S.M.; Iqbal F.; Zin A.M.; Atan F.; Ali N.
Format: Article
Language: English
Published: Universiti Malaysia Perlis 2021
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85126105720&partnerID=40&md5=076f8bf3d544331f38ddc000eda19849
Description
Summary: Machine learning has become a popular approach for predicting future demand. In optical access networks, machine learning can predict bandwidth demand and thereby reduce delays. This paper presents a machine learning approach to learning queueing time in XGPON given the traffic load, number of frames, and packet size. Queueing time contributes to upstream delay, so predicting it accurately can help improve network performance. The regression output R obtained from the trained artificial neural network (ANN) is close to 1, and the mean squared error (MSE) is significantly low, showing that machine learning-based queueing time analysis offers another dimension of delay analysis on top of numerical analysis. © 2021, Universiti Malaysia Perlis. All rights reserved.
ISSN: 1985-5761
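
As a rough illustration of the approach described in the summary, the sketch below trains a small feed-forward ANN regressor on the three reported input features (traffic load, number of frames, packet size) and reports MSE and R², the two quality measures mentioned in the abstract. The synthetic data, feature ranges, and network size are assumptions made for illustration only; the paper's actual dataset, ANN architecture, and training procedure are not given in this record.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-in features: traffic load, number of frames, packet size (bytes).
# Ranges are assumptions, not values from the paper.
X = np.column_stack([
    rng.uniform(0.1, 1.0, n),      # traffic load (normalised)
    rng.integers(1, 50, n),        # number of queued frames
    rng.integers(64, 1518, n),     # packet size in bytes
])
# Assumed target relationship, purely to make the training loop runnable
y = 0.5 * X[:, 0] + 0.01 * X[:, 1] + 1e-4 * X[:, 2] + rng.normal(0, 0.02, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small feed-forward ANN regressor; features are standardised before training
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("MSE:", mean_squared_error(y_test, pred))  # low MSE indicates a good fit
print("R^2:", r2_score(y_test, pred))            # values close to 1 indicate a good fit

The evaluation mirrors what the abstract reports for the trained ANN: an R value close to 1 and a significantly low MSE are taken as evidence that the learned model captures queueing-time behaviour.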