Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/114180
Title: Brain-computer interface for shared controls of unmanned aerial vehicles
Authors: Bi, Z; Mikkola, A; Ip, AWH; Yung, KL; Luo, C
Issue Date: Aug-2024
Source: IEEE Transactions on Aerospace and Electronic Systems, Aug. 2024, v. 60, no. 4, p. 3860-3871
Abstract: To control an intelligent system in an unstructured environment, it is desirable to synergize human and machine intelligence to deal with change and uncertainty cost-effectively. Shared control exploits the complementary strengths of humans and computers in decision-making, which improves the adaptability, agility, reliability, responsiveness, and resilience of the system. Because the decision spaces of human thinking and machine intelligence differ substantially, fusing human and machine intelligence effectively is challenging. A brain–computer interface (BCI) can bridge human and machine intelligence; however, traditional BCIs are unidirectional and support interaction in only one of two scenarios: either the human and the machine act at different control layers, or only one of the two acts at a time. There is an emerging need to close the loop of BCI-based control to mitigate the adverse effects of a machine's error or a human's mistake. In this article, available technologies for the acquisition, processing, and mining of brain signals are reviewed, the need to integrate human capability into the control of unmanned aerial vehicles (UAVs) is elaborated, and research challenges in advancing BCI toward shared human–machine control are discussed with respect to data acquisition, the mapping of human and machine decision spaces, and the fusion of human and machine intelligence in automated control. To address open problems in these areas, we propose a new platform that uses a BCI for human–machine interaction, with three innovations: first, an advanced BCI that acquires multimodal brain signals and extracts features related to motion intentions and the operator's quantified affective state; second, an arbitrating mechanism in system control that determines the weight of the human's decisions based on that quantified affective state; and third, a decision support system capable of seamlessly fusing human and machine decisions from different decision spaces to control a UAV with real-time performance.
Publisher: Institute of Electrical and Electronics Engineers
Journal: IEEE Transactions on Aerospace and Electronic Systems
ISSN: 0018-9251
EISSN: 1557-9603
DOI: 10.1109/TAES.2024.3368402
Rights: © 2024 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The following publication Z. Bi, A. Mikkola, A. W. H. Ip, K. L. Yung and C. Luo, "Brain–Computer Interface for Shared Controls of Unmanned Aerial Vehicles," in IEEE Transactions on Aerospace and Electronic Systems, vol. 60, no. 4, pp. 3860-3871, Aug. 2024 is available at https://doi.org/10.1109/TAES.2024.3368402.
Appears in Collections: Journal/Magazine Article
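The arbitrating mechanism described in the abstract, in which the weight of the human's decisions is determined by a quantified affective state, can be illustrated with a minimal sketch. This is not the paper's implementation: the `arbitrate` function, the use of a scalar `affect_score` in [0, 1] as the blending weight, and the velocity-command representation are all illustrative assumptions.

```python
import numpy as np

def arbitrate(human_cmd, machine_cmd, affect_score):
    """Blend human and machine UAV velocity commands.

    affect_score is a hypothetical scalar in [0, 1] standing in for the
    quantified affective state (1.0 = fully calm/confident operator).
    The human's command is weighted by this score; the machine fills in
    the remainder. The actual arbitration in the paper may be richer.
    """
    w_human = float(np.clip(affect_score, 0.0, 1.0))
    human = np.asarray(human_cmd, dtype=float)
    machine = np.asarray(machine_cmd, dtype=float)
    # Convex combination keeps the fused command within the span of the inputs.
    return w_human * human + (1.0 - w_human) * machine

# Example: a moderately confident operator (affect_score = 0.75).
fused = arbitrate([1.0, 0.0, 0.2], [0.5, 0.1, 0.0], 0.75)
# fused == [0.875, 0.025, 0.15]
```

A convex combination is the simplest closed-loop fusion rule: when confidence in the operator drops (e.g., under stress), authority shifts smoothly toward the machine rather than switching abruptly between the two, which is the failure mode of unidirectional BCIs noted in the abstract.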