Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114317
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Chinese and Bilingual Studies | - |
| dc.creator | Huang, Y | - |
| dc.creator | Li, D | - |
| dc.creator | Cheung, AKF | - |
| dc.date.accessioned | 2025-07-24T02:01:42Z | - |
| dc.date.available | 2025-07-24T02:01:42Z | - |
| dc.identifier.uri | http://hdl.handle.net/10397/114317 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Elsevier Ltd. | en_US |
| dc.subject | EFL/ESL | en_US |
| dc.subject | Linguistic complexity | en_US |
| dc.subject | LLMs | en_US |
| dc.subject | Machine translation | en_US |
| dc.subject | Neural machine translation | en_US |
| dc.title | Evaluating the linguistic complexity of machine translation and LLMs for EFL/ESL applications : an entropy weight method | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.volume | 4 | - |
| dc.identifier.issue | 3 | - |
| dc.identifier.doi | 10.1016/j.rmal.2025.100229 | - |
| dcterms.abstract | English as a Foreign and Second Language (EFL/ESL) learners are increasingly using machine translation (MT) tools such as neural machine translation systems (NMTs) and large language models (LLMs) to support their language learning and translation, because of their accuracy and their cost and time efficiency compared with human translation. Given the distinct linguistic features exhibited by NMTs and LLMs, it is crucial to assess the linguistic complexity of texts produced by these tools to optimize their use in EFL/ESL teaching and learning. This study examines two forms of absolute linguistic complexity, namely lexical complexity and syntactic complexity, that influence EFL/ESL activities. Lexical complexity affects vocabulary recognition and semantic processing, while syntactic complexity influences sentence parsing and the internalization of grammatical rules. Because both dimensions are multi-faceted and involve numerous indices that may vary in different directions (e.g., high values on some measures and lower values on others), an entropy weight method (EWM) is employed to assign data-driven weights and derive a balanced holistic complexity score. This approach enables a systematic comparison of translation outputs from NMTs (Google Translate, DeepL) and LLMs (ChatGPT-4o, OpenAI-o1). The findings reveal that LLMs generally produce translations with higher holistic linguistic complexity, whereas NMTs tend to produce simpler translations. Pedagogically, LLM-translated texts may serve as more effective input for advanced EFL/ESL learners, while NMT outputs may be more suitable for learners with lower proficiency. | - |
| dcterms.accessRights | embargoed access | en_US |
| dcterms.bibliographicCitation | Research methods in applied linguistics, Dec. 2025, v. 4, no. 3, 100229 | - |
| dcterms.isPartOf | Research methods in applied linguistics | - |
| dcterms.issued | 2025-12 | - |
| dc.identifier.scopus | 2-s2.0-105009297693 | - |
| dc.identifier.eissn | 2772-7661 | - |
| dc.identifier.artn | 100229 | - |
| dc.description.validate | 202507 bcch | - |
| dc.identifier.FolderNumber | a3936 | en_US |
| dc.identifier.SubFormID | 51730 | en_US |
| dc.description.fundingSource | Others | en_US |
| dc.description.fundingText | The Hong Kong Polytechnic University, Department of Chinese and Bilingual Studies (Grant Numbers: P0043559; P0044493) | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.date.embargo | 2027-12-31 | en_US |
| dc.description.oaCategory | Green (AAM) | en_US |
| Appears in Collections: | Journal/Magazine Article | - |
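
The entropy weight method (EWM) named in the abstract is a standard objective-weighting technique: each complexity index is weighted by how strongly it discriminates between the compared texts, and the weighted, normalized indices are summed into a single holistic score. The Python sketch below is a minimal illustration only, not the article's implementation; the function name `entropy_weight_scores`, the min-max normalization step, the epsilon smoothing, and the toy index values are assumptions made for the example.

```python
import numpy as np

def entropy_weight_scores(X):
    """Entropy weight method (EWM) sketch.

    X: (n_texts, m_indices) array of raw complexity indices,
       where larger values are read as "more complex".
    Returns (index_weights, holistic_scores).
    """
    X = np.asarray(X, dtype=float)
    n, m = X.shape

    # 1. Min-max normalize each index to [0, 1] so indices are comparable.
    col_min, col_max = X.min(axis=0), X.max(axis=0)
    Z = (X - col_min) / np.where(col_max > col_min, col_max - col_min, 1.0)

    # 2. Convert each column to proportions p_ij; a tiny epsilon avoids log(0).
    eps = 1e-12
    P = (Z + eps) / (Z + eps).sum(axis=0)

    # 3. Entropy of each index: e_j = -(1 / ln n) * sum_i p_ij * ln p_ij.
    e = -(P * np.log(P)).sum(axis=0) / np.log(n)

    # 4. Weights from the degree of divergence (1 - e_j), normalized to sum to 1.
    d = 1.0 - e
    w = d / d.sum()

    # 5. Holistic complexity score: weighted sum of normalized indices per text.
    scores = Z @ w
    return w, scores

# Hypothetical example: four MT outputs scored on three complexity indices
# (e.g., a lexical diversity index, mean word length, mean clause length).
X = np.array([
    [0.62, 4.8, 9.1],   # e.g., an NMT output
    [0.58, 4.6, 8.4],   # e.g., another NMT output
    [0.71, 5.3, 11.0],  # e.g., an LLM output
    [0.69, 5.1, 10.2],  # e.g., another LLM output
])
weights, scores = entropy_weight_scores(X)
print("index weights:", weights.round(3))
print("holistic complexity scores:", scores.round(3))
```

Under this weighting scheme, an index whose values barely differ across the compared outputs contributes little to the holistic score, while an index that varies strongly receives a larger weight, which is what makes the aggregation data-driven rather than reliant on manually chosen weights.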
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.