Apple is working on a human brain signal reader

30 November 19:08

Apple has developed a technology codenamed PARS that analyzes electroencephalograms (EEGs) using AI and headphones. Unlike traditional methods, which require expensive manual data labeling by specialists, the new system relies on self-supervised learning from unlabeled raw data. The algorithm learns to predict the time intervals between different segments of brain waves, which lets it capture deep structure in neural activity. This was reported by ITHome, as relayed by "Komersant Ukrainian".
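
The report does not describe the model itself, but the pretext task it mentions, predicting the temporal gap between two segments of the same recording, is a known self-supervised approach for EEG. Below is a minimal, hypothetical sketch in PyTorch; every name, shape, and hyperparameter is an illustrative assumption, not Apple's actual design.

```python
# Hypothetical sketch of the self-supervised pretext task described above:
# a small network is trained to predict the time gap between two EEG windows.
# All shapes and hyperparameters are illustrative assumptions, not Apple's design.
import torch
import torch.nn as nn

WINDOW = 256    # samples per EEG segment (assumed)
CHANNELS = 4    # number of in-ear electrode channels (assumed)
MAX_GAP = 10    # gap classes: 0..9 windows apart (assumed)

class GapPredictor(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared encoder turns one EEG window into a 32-dim embedding.
        self.encoder = nn.Sequential(
            nn.Conv1d(CHANNELS, 16, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        # Classifier predicts how far apart the two windows are in time.
        self.head = nn.Linear(64, MAX_GAP)

    def forward(self, a, b):
        za, zb = self.encoder(a), self.encoder(b)
        return self.head(torch.cat([za, zb], dim=1))

def sample_pair(recording):
    """Cut two windows from one unlabeled recording; the label is their gap."""
    limit = recording.shape[1] - WINDOW - MAX_GAP * WINDOW
    start = torch.randint(0, limit, (1,)).item()
    gap = torch.randint(0, MAX_GAP, (1,)).item()
    a = recording[:, start:start + WINDOW]
    b = recording[:, start + gap * WINDOW:start + gap * WINDOW + WINDOW]
    return a, b, gap

model = GapPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
recording = torch.randn(CHANNELS, 60_000)  # stand-in for raw unlabeled EEG

for step in range(100):
    a, b, gap = sample_pair(recording)
    logits = model(a.unsqueeze(0), b.unsqueeze(0))
    loss = loss_fn(logits, torch.tensor([gap]))
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The point of such a task is that no specialist labels are needed: the supervision signal (the gap between windows) comes for free from the recording itself, and the encoder learned this way can later be reused for downstream tasks like sleep staging.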

The technology has already shown impressive results in tests, outperforming or matching current methods on three of the four reference EEG datasets. Of particular interest is the study's use of the EESM17 dataset, which contains data collected with in-ear brain activity monitors during sleep. This confirms that key neurological indicators, including sleep stages and epileptic patterns, can be recorded through sensors placed in the ear canal.

Apple’s 2023 patent documentation reveals the technical details of the implementation: the system places an array of redundant electrodes around the AirPods ear cushions, and intelligent algorithms select the sensor combinations with the best signal in real time. This approach addresses the problem of maintaining consistent skin contact despite the unique anatomy of each ear.
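
Neither the article nor the patent summary specifies how the selection works. One simple way to picture it is to score each candidate electrode pairing by a signal-quality metric and keep the best one; the sketch below, including the band-power metric and all parameters, is a hypothetical illustration, not Apple's implementation.

```python
# Illustrative sketch of redundant-electrode selection: score each candidate
# bipolar electrode pair by a crude signal-quality metric and keep the best.
# The metric, sampling rate, and electrode count are all assumptions.
import numpy as np

def signal_quality(x, fs=250.0):
    """Quality score: fraction of power in the 1-40 Hz EEG band."""
    spectrum = np.abs(np.fft.rfft(x - x.mean())) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = spectrum[(freqs >= 1.0) & (freqs <= 40.0)].sum()
    return band / (spectrum.sum() + 1e-12)

def select_best_pair(electrode_signals, fs=250.0):
    """Pick the electrode pair whose difference signal scores highest."""
    n = electrode_signals.shape[0]
    best, best_score = None, -np.inf
    for i in range(n):
        for j in range(i + 1, n):
            score = signal_quality(electrode_signals[i] - electrode_signals[j], fs)
            if score > best_score:
                best, best_score = (i, j), score
    return best, best_score

# Example: 8 redundant electrodes, 1 second of data each at 250 Hz.
signals = np.random.randn(8, 250)
pair, score = select_best_pair(signals)
print(f"best electrode pair: {pair}, quality score: {score:.3f}")
```

In a real device this scoring would presumably run continuously, so the system can switch pairs as the earbud shifts and individual electrodes lose skin contact.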

ITHome notes that this direction of development is a logical extension of Apple’s growing interest in the medical device market.

Anna Tkachenko
Editor