Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling
The goal of automatic text simplification is to reorganize complex text structures into simpler, more comprehensible texts while retaining their original meaning. The automatic text simplification model, coupled with a personalization element, makes it an indispensable tool for assisting students...
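The abstract describes a two-stage pipeline: a pre-trained classifier first flags complex structures and phrases, then a neural generation model produces simplified text conditioned on the predicted complexity. A minimal sketch of that flow is below; both stages are toy stand-ins (a length-based heuristic classifier and a dictionary substitution "simplifier"), introduced here purely for illustration and not the models used in the paper:

```python
def classify_complexity(sentence: str) -> str:
    """Stand-in for the paper's pre-trained complexity classifier.

    Flags a sentence as 'complex' when it is long or uses long words --
    a crude proxy for the structural and phrasal complexity the paper
    predicts with a trained model.
    """
    words = sentence.split()
    avg_len = sum(len(w) for w in words) / max(len(words), 1)
    return "complex" if len(words) > 15 or avg_len > 6 else "simple"


def simplify(sentence: str, level: str) -> str:
    """Stand-in for the paper's neural generation model.

    Conditions output on the predicted complexity level; here it just
    substitutes a few hard words from a toy lexicon, leaving sentences
    already classified as 'simple' untouched.
    """
    lexicon = {"utilize": "use", "comprehend": "understand",
               "approximately": "about"}
    if level != "complex":
        return sentence
    return " ".join(lexicon.get(w.lower(), w) for w in sentence.split())


def personalize_text(sentences):
    """Joint pipeline: classify each sentence, then simplify adaptively."""
    return [simplify(s, classify_complexity(s)) for s in sentences]
```

In the proposed co-modeling, the classifier's output is what personalizes generation: only material predicted to exceed the student's level is rewritten, so the rest of the text stays intact.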
Published in: | 2022 12th IEEE Symposium on Computer Applications and Industrial Electronics, ISCAIE 2022 |
---|---|
Main Author: | Sukiman S.A.; Azura Husin N. |
Format: | Conference paper |
Language: | English |
Published: | Institute of Electrical and Electronics Engineers Inc., 2022 |
Online Access: | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85133479468&doi=10.1109%2fISCAIE54458.2022.9794534&partnerID=40&md5=d3fd294b388e3c9f59c9c746c6e73c5c |
id |
2-s2.0-85133479468 |
---|---|
spelling |
2-s2.0-85133479468 Sukiman S.A.; Azura Husin N. Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling 2022 2022 12th IEEE Symposium on Computer Applications and Industrial Electronics, ISCAIE 2022 10.1109/ISCAIE54458.2022.9794534 https://www.scopus.com/inward/record.uri?eid=2-s2.0-85133479468&doi=10.1109%2fISCAIE54458.2022.9794534&partnerID=40&md5=d3fd294b388e3c9f59c9c746c6e73c5c The goal of automatic text simplification is to reorganize complex text structures into simpler, more comprehensible texts while retaining their original meaning. The automatic text simplification model, coupled with a personalization element, makes it an indispensable tool for assisting students with learning disabilities who struggle to comprehend expository texts found in school textbooks. In recent years, neural networks have been widely embraced for simplified text generation, with most earlier researchers focusing on Long Short-Term Memory (LSTM), Recurrent Neural Network (RNN), and Transformer models. However, most of these efforts produced simple, generic texts, and their models lacked cognition-based personalization elements. In this paper, we present the concept of generating personalized and simplified expository texts by joining pre-trained classification and neural network models. The pre-trained classifier predicts complex text structures and phrases that are challenging for students with learning disabilities to comprehend, while the neural network model then generates simplified expository texts based on the predicted text complexity. The advantage of these joint models is the ability to generate simplified expository texts adapted to the cognitive level of students with learning disabilities. This opens up opportunities for continuous, personalized learning, eases their struggles, and increases their motivation to stay competitive with their peers. © 2022 IEEE. Institute of Electrical and Electronics Engineers Inc. English Conference paper |
author |
Sukiman S.A.; Azura Husin N. |
spellingShingle |
Sukiman S.A.; Azura Husin N. Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling |
author_facet |
Sukiman S.A.; Azura Husin N. |
author_sort |
Sukiman S.A.; Azura Husin N. |
title |
Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling |
title_short |
Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling |
title_full |
Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling |
title_fullStr |
Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling |
title_full_unstemmed |
Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling |
title_sort |
Towards Personalized and Simplified Expository Texts: Pre-trained Classification and Neural Networks Co-Modeling |
publishDate |
2022 |
container_title |
2022 12th IEEE Symposium on Computer Applications and Industrial Electronics, ISCAIE 2022 |
container_volume |
|
container_issue |
|
doi_str_mv |
10.1109/ISCAIE54458.2022.9794534 |
url |
https://www.scopus.com/inward/record.uri?eid=2-s2.0-85133479468&doi=10.1109%2fISCAIE54458.2022.9794534&partnerID=40&md5=d3fd294b388e3c9f59c9c746c6e73c5c |
description |
The goal of automatic text simplification is to reorganize complex text structures into simpler, more comprehensible texts while retaining their original meaning. The automatic text simplification model, coupled with a personalization element, makes it an indispensable tool for assisting students with learning disabilities who struggle to comprehend expository texts found in school textbooks. In recent years, neural networks have been widely embraced for simplified text generation, with most earlier researchers focusing on Long Short-Term Memory (LSTM), Recurrent Neural Network (RNN), and Transformer models. However, most of these efforts produced simple, generic texts, and their models lacked cognition-based personalization elements. In this paper, we present the concept of generating personalized and simplified expository texts by joining pre-trained classification and neural network models. The pre-trained classifier predicts complex text structures and phrases that are challenging for students with learning disabilities to comprehend, while the neural network model then generates simplified expository texts based on the predicted text complexity. The advantage of these joint models is the ability to generate simplified expository texts adapted to the cognitive level of students with learning disabilities. This opens up opportunities for continuous, personalized learning, eases their struggles, and increases their motivation to stay competitive with their peers. © 2022 IEEE. |
publisher |
Institute of Electrical and Electronics Engineers Inc. |
issn |
|
language |
English |
format |
Conference paper |
accesstype |
|
record_format |
scopus |
collection |
Scopus |
_version_ |
1809678026090217472 |