The Role of Artificial Intelligence in Decoding Speech from EEG Signals: A Scoping Review

Sep 23, 2022 · Sensors (Basel, Switzerland)

Using Artificial Intelligence to Interpret Speech from Brain Signals

AI simplified

Abstract

A total of 34 studies met the eligibility criteria for inclusion in this review of EEG signal-based brain-computer interfaces.

  • Brain traumas, mental disorders, and vocal abuse can lead to speech impairment.
  • EEG devices are becoming increasingly accessible and have shown potential in medical applications.
  • The most frequently used EEG devices in the reviewed studies had 64 electrodes.
  • Common methods for processing EEG signals included bandpass filtering and wavelet-based feature extraction.
  • Machine learning techniques, particularly support vector machines, were most commonly employed, while convolutional neural networks were the leading deep learning approach.
  • EEG signal-based brain-computer interfaces may provide a means for individuals with voice impairments to communicate directly from their brain.
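The processing chain the abstract describes (bandpass filtering, feature extraction, then a support vector machine classifier) can be sketched on synthetic data. This is a minimal illustrative toy, not the pipeline of any reviewed study: the sampling rate, band limits, class structure, and log-variance features (a simple stand-in for the wavelet-based features the review mentions) are all assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

FS = 250  # assumed sampling rate in Hz

def bandpass(x, lo=8.0, hi=30.0, fs=FS, order=4):
    """Zero-phase Butterworth bandpass, a common EEG preprocessing step."""
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=-1)

def band_power_features(x):
    """Log-variance per channel of the filtered signal -- a simple proxy
    for the wavelet-based feature extraction used in many studies."""
    return np.log(np.var(x, axis=-1))

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 120, 8, 2 * FS  # 2-second epochs

# Synthetic two-class data: class 1 carries extra 20 Hz (beta-band) power.
t = np.arange(n_samples) / FS
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)
X_raw[y == 1] += 0.8 * np.sin(2 * np.pi * 20 * t)

# Filter -> features -> SVM, mirroring the pipeline described above.
X = band_power_features(bandpass(X_raw))  # shape: (trials, channels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

Real EEG decoding is far noisier than this toy; the point is only the shape of the pipeline, not the accuracy figure.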

Key numbers

34
Total Studies Included
Total studies meeting eligibility criteria for inclusion.
64
Most Common Electrode Count
Number of electrodes on the EEG devices used in most studies.

Full Text

What this is

  • This scoping review examines the use of artificial intelligence (AI) in decoding speech from electroencephalography (EEG) signals.
  • It highlights the potential of brain-computer interfaces (BCIs) to assist individuals with speech impairments.
  • The review categorizes studies based on AI techniques and summarizes methods for data acquisition and feature extraction.

Essence

  • AI techniques, particularly machine learning and deep learning, are increasingly applied to decode speech from EEG signals. This review identifies key methods and challenges in developing effective brain-computer interfaces for speech communication.

Key takeaways

  • Machine learning and deep learning are the primary AI techniques used for decoding speech from EEG signals. Support vector machines and convolutional neural networks are among the most frequently employed algorithms.
  • The majority of studies collected data with 64-electrode devices, a common standard in the field. However, more studies focusing on continuous word prediction are needed to enhance communication for individuals with cognitive disabilities.

Caveats

  • The review is limited to studies that focus on isolated word and sentence prediction from EEG signals, potentially overlooking relevant research using other modalities such as ECoG or fMRI.
  • Only English-language studies were included, which may have excluded significant findings published in other languages.

Definitions

  • Brain-computer interface (BCI): A system that detects and translates brain signals into commands for external devices, facilitating communication for individuals with speech impairments.
  • Electroencephalography (EEG): A non-invasive method for recording electrical activity in the brain, commonly used in BCI applications.
