Determining Gaze Information from Steady-State Visually-Evoked Potentials

dc.contributor.author: Sayılgan, Ebru
dc.contributor.author: Yüce, Yilmaz Kemal
dc.contributor.author: Isler, Yalcin
dc.date.accessioned: 2026-01-24T12:01:17Z
dc.date.available: 2026-01-24T12:01:17Z
dc.date.issued: 2020
dc.department: Alanya Alaaddin Keykubat Üniversitesi
dc.description.abstract: A Brain-Computer Interface (BCI) is a communication system that enables individuals who, for various reasons, lack control of their muscular and nervous systems to interact with the outside world. A BCI enables its user to communicate with electronic devices by processing signals generated during brain activity. This study attempts to detect gaze information within Electroencephalogram (EEG) signals through classification. For this purpose, three datasets of EEG signals recorded by researchers from the Autonomous University were adopted. The EEG signals in these datasets were collected in a setting where subjects' gaze at five boxes shown on a computer screen was recognized through a Steady-State Visually Evoked Potential-based BCI. Classification was performed using the Naive Bayes, Extreme Learning Machine, and Support Vector Machines algorithms. Three feature sets, Autoregressive, Hjorth, and Power Spectral Density, were extracted from the EEG signals. Using Autoregressive features, classifiers performed between 45.67% and 78.34%; for Hjorth features, classification performance was within 43.34-75.25%; and for Power Spectral Density features, between 57.36% and 83.42%. Grouped by classification algorithm, performance varied between 52.23% and 79.15% for Naive Bayes, 56.32-83.42% for Extreme Learning Machine, and 43.34-72.27% for Support Vector Machines. Among the achieved accuracies, the best, 83.42%, was obtained by pairing Power Spectral Density features with the Extreme Learning Machine.
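Of the three feature sets named in the abstract, the Hjorth descriptors have a compact closed form. The sketch below is an illustrative NumPy computation of those descriptors, not the authors' implementation; the function name and the use of discrete differences as derivatives are my assumptions.

```python
import numpy as np

def hjorth_parameters(x):
    """Hjorth descriptors of a 1-D signal (illustrative sketch).

    activity   -- variance of the signal (signal power)
    mobility   -- sqrt(var(dx) / var(x)), a mean-frequency-like measure
    complexity -- mobility of dx divided by mobility of x (bandwidth-like)
    """
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)    # first derivative, approximated by differences
    ddx = np.diff(dx)  # second derivative
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity
```

For a pure sinusoid, complexity is close to 1 (its derivative has the same spectral shape), while broadband signals such as white noise yield larger complexity; per-channel triples like these can then be fed to a classifier.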
dc.identifier.doi: 10.7212/zkufbd.v10i2.1588
dc.identifier.endpage: 157
dc.identifier.issn: 2146-4987
dc.identifier.issn: 2146-7277
dc.identifier.issue: 2
dc.identifier.startpage: 151
dc.identifier.trdizinid: 424305
dc.identifier.uri: https://search.trdizin.gov.tr/tr/yayin/detay/424305
dc.identifier.uri: https://doi.org/10.7212/zkufbd.v10i2.1588
dc.identifier.uri: https://hdl.handle.net/20.500.12868/4168
dc.identifier.volume: 10
dc.indekslendigikaynak: TR-Dizin
dc.language.iso: en
dc.relation.ispartof: Karaelmas Fen ve Mühendislik Dergisi
dc.relation.publicationcategory: Article - National Peer-Reviewed Journal - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/openAccess
dc.snmz: KA_TR-Dizin_20260121
dc.subject: Computer Science
dc.subject: Artificial Intelligence
dc.title: Determining Gaze Information from Steady-State Visually-Evoked Potentials
dc.type: Article
