Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method

As society advances, machine learning holds increasing significance. Optimization, a crucial aspect of machine learning, has garnered considerable research attention. Addressing optimization challenges has become pivotal as models grow in complexity alongside the exponential rise in data volume. In existing algorithms such as stochastic gradient descent (SGD), a common practice is to gradually reduce the step size or to adjust it manually, which is imprecise and time-consuming. To address this issue, researchers have made significant efforts, for example by adopting the Barzilai-Borwein (BB) method. However, the BB method has its drawbacks: the denominator of its step size can approach zero or even become negative. To address this problem, this study adopts the Positive Defined Stabilized Barzilai-Borwein (PDSBB) method and combines it with the SGD algorithm to create a new algorithm, SGD-PDSBB. The convergence of the new algorithm is then analyzed, and its effectiveness is confirmed through numerical experiments in which it is compared with the original SGD algorithm and with SGD-BB in terms of step size, sub-optimality, and classification accuracy. The experiments indicate that the new algorithm matches the numerical performance of SGD and SGD-BB on some datasets and outperforms them on others.
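
The abstract hinges on a concrete numerical failure mode: with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}, the classical BB step size eta_k = ||s_{k-1}||^2 / (s_{k-1}^T y_{k-1}) has a denominator s^T y that can shrink toward zero or turn negative, especially with noisy stochastic gradients. The sketch below shows the general shape of an SGD-BB-style epoch loop with a forced-positive denominator; the abs-plus-epsilon guard, the 1/m scaling, and the `delta` step cap are illustrative assumptions in the spirit of stabilized BB, not the exact PDSBB update, which this record does not specify.

```python
import numpy as np

def sgd_pdsbb_sketch(grad, x0, n_epochs=20, m=100, eta0=0.1,
                     eps=1e-8, delta=10.0, rng=None):
    """Illustrative SGD loop with a BB-type step size whose denominator
    is forced positive. The abs(.)+eps guard, the 1/m scaling, and the
    `delta` cap are stand-ins, NOT the exact PDSBB rule from the paper.

    grad(x, rng) must return a stochastic gradient estimate at x.
    """
    rng = rng or np.random.default_rng(0)
    x = x0.astype(float).copy()
    eta = eta0                       # initial step size, used until BB kicks in
    x_prev = g_prev = None           # previous epoch's start point / avg gradient
    for _ in range(n_epochs):
        x_start = x.copy()
        g_avg = np.zeros_like(x)     # running average of this epoch's gradients
        for _ in range(m):           # one epoch of plain SGD updates
            g = grad(x, rng)
            g_avg += g / m
            x -= eta * g
        if x_prev is not None:
            s = x_start - x_prev                 # iterate difference between epochs
            y = g_avg - g_prev                   # averaged-gradient difference
            denom = abs(float(s @ y)) + eps      # forced-positive denominator
            eta_bb = float(s @ s) / (m * denom)  # BB1-style step, scaled by 1/m
            eta = min(eta_bb,                    # stabilization: cap the step length
                      delta / (float(np.linalg.norm(g_avg)) + eps))
        x_prev, g_prev = x_start, g_avg          # `eta` applies from the next epoch on
    return x
```

Here grad, eta0, eps, and delta are hypothetical names chosen for illustration. Recomputing the step once per epoch from averaged gradients and scaling by 1/m follows the SGD-BB scheme the abstract compares against, and the cap delta/||g|| follows the general stabilized-BB idea of bounding the step length; the paper's PDSBB formula may differ in both respects.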

Bibliographic Details
Published in:IAENG International Journal of Computer Science
Main Authors: Shi W.; Shuib A.; Alwadood Z.
Format: Article
Language:English
Published: International Association of Engineers 2025
Online Access:https://www.scopus.com/inward/record.uri?eid=2-s2.0-85216921813&partnerID=40&md5=1411b8590ec006be16f0d923177a499a
id 2-s2.0-85216921813
spelling 2-s2.0-85216921813
Shi W.; Shuib A.; Alwadood Z.
Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
2025
IAENG International Journal of Computer Science
52
2

https://www.scopus.com/inward/record.uri?eid=2-s2.0-85216921813&partnerID=40&md5=1411b8590ec006be16f0d923177a499a
International Association of Engineers
1819-656X
English
Article

author Shi W.; Shuib A.; Alwadood Z.
spellingShingle Shi W.; Shuib A.; Alwadood Z.
Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
author_facet Shi W.; Shuib A.; Alwadood Z.
author_sort Shi W.; Shuib A.; Alwadood Z.
title Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
title_short Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
title_full Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
title_fullStr Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
title_full_unstemmed Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
title_sort Stochastic Gradient Descent with Positive Defined Stabilized Barzilai-Borwein method
publishDate 2025
container_title IAENG International Journal of Computer Science
container_volume 52
container_issue 2
doi_str_mv
url https://www.scopus.com/inward/record.uri?eid=2-s2.0-85216921813&partnerID=40&md5=1411b8590ec006be16f0d923177a499a
description As society advances, machine learning holds increasing significance. Optimization, a crucial aspect of machine learning, has garnered considerable research attention. Addressing optimization challenges has become pivotal as models grow in complexity alongside the exponential rise in data volume. In existing algorithms such as stochastic gradient descent (SGD), a common practice is to gradually reduce the step size or to adjust it manually, which is imprecise and time-consuming. To address this issue, researchers have made significant efforts, for example by adopting the Barzilai-Borwein (BB) method. However, the BB method has its drawbacks: the denominator of its step size can approach zero or even become negative. To address this problem, this study adopts the Positive Defined Stabilized Barzilai-Borwein (PDSBB) method and combines it with the SGD algorithm to create a new algorithm, SGD-PDSBB. The convergence of the new algorithm is then analyzed, and its effectiveness is confirmed through numerical experiments in which it is compared with the original SGD algorithm and with SGD-BB in terms of step size, sub-optimality, and classification accuracy. The experiments indicate that the new algorithm matches the numerical performance of SGD and SGD-BB on some datasets and outperforms them on others. © 2025 International Association of Engineers. All rights reserved.
publisher International Association of Engineers
issn 1819-656X
language English
format Article
accesstype
record_format scopus
collection Scopus
_version_ 1825722575720808448