
      The PREP pipeline: standardized preprocessing for large-scale EEG analysis

      research-article


          Abstract

          The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.
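The core of the multi-stage robust referencing scheme mentioned in the abstract is an iteration between noisy-channel detection and reference estimation: the noisy-channel test depends on the reference, and the reference should not be contaminated by noisy channels. The NumPy sketch below illustrates that loop only. It is not the PREP implementation (which is a MATLAB library and combines several detection criteria, including correlation and RANSAC tests); the function name, the single deviation-based criterion, and the thresholds are illustrative assumptions.

    import numpy as np

    def robust_average_reference(data, z_thresh=5.0, max_iter=4):
        # data: (n_channels, n_samples) EEG array.
        # Iterate: estimate the reference from channels currently judged clean,
        # re-run the noisy-channel test against that reference, and repeat
        # until the flagged set stops changing.
        n_channels = data.shape[0]
        noisy = np.zeros(n_channels, dtype=bool)
        reference = np.zeros(data.shape[1])
        for _ in range(max_iter):
            reference = data[~noisy].mean(axis=0)   # average of clean channels only
            referenced = data - reference
            # Single illustrative criterion: robust z-score of each channel's
            # median absolute deviation (PREP itself combines several criteria).
            dev = np.median(np.abs(referenced - np.median(referenced, axis=1, keepdims=True)), axis=1)
            mad = np.median(np.abs(dev - np.median(dev))) + 1e-12
            z = (dev - np.median(dev)) / (1.4826 * mad)
            new_noisy = z > z_thresh
            if np.array_equal(new_noisy, noisy):
                break
            noisy = new_noisy
        return data - reference, noisy

Iterating matters because, as the abstract notes, the identification of noisy channels itself depends on the reference; a single pass of ordinary average referencing lets noisy channels leak into the reference and contaminate every other channel.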


Most cited references (31)


          BCI2000: a general-purpose brain-computer interface (BCI) system.

Many laboratories have begun to develop brain-computer interface (BCI) systems that provide communication and control capabilities to people with severe motor disabilities. Further progress and realization of practical applications depend on systematic evaluations and comparisons of different brain signals, recording methods, processing algorithms, output formats, and operating protocols. However, the typical BCI system is designed specifically for one particular BCI method and is, therefore, not suited to the systematic studies that are essential for continued progress. In response to this problem, we have developed a documented general-purpose BCI research and development platform called BCI2000. BCI2000 can incorporate, alone or in combination, any brain signals, signal processing methods, output devices, and operating protocols. This report is intended to describe to investigators, biomedical engineers, and computer scientists the concepts that the BCI2000 system is based upon and to give examples of successful BCI implementations using this system. To date, we have used BCI2000 to create BCI systems for a variety of brain signals, processing methods, and applications. The data show that these systems function well in online operation and that BCI2000 satisfies the stringent real-time requirements of BCI systems. By substantially reducing labor and cost, BCI2000 facilitates the implementation of different BCI systems and other psychophysiological experiments. It is available with full documentation and free of charge for research or educational purposes and is currently being used in a variety of studies by many research groups.

            ADJUST: An automatic EEG artifact detector based on the joint use of spatial and temporal features.

            Abstract A successful method for removing artifacts from electroencephalogram (EEG) recordings is Independent Component Analysis (ICA), but its implementation remains largely user-dependent. Here, we propose a completely automatic algorithm (ADJUST) that identifies artifacted independent components by combining stereotyped artifact-specific spatial and temporal features. Features were optimized to capture blinks, eye movements, and generic discontinuities on a feature selection dataset. Validation on a totally different EEG dataset shows that (1) ADJUST's classification of independent components largely matches a manual one by experts (agreement on 95.2% of the data variance), and (2) Removal of the artifacted components detected by ADJUST leads to neat reconstruction of visual and auditory event-related potentials from heavily artifacted data. These results demonstrate that ADJUST provides a fast, efficient, and automatic way to use ICA for artifact removal.
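The general pattern ADJUST describes, computing stereotyped spatial and temporal features for each independent component and thresholding them, can be sketched as follows. This is an illustrative toy, not ADJUST's actual feature set or its automatic threshold estimation; the feature definitions, function names, and fixed z-score cutoff are assumptions for demonstration (ADJUST derives its thresholds automatically from the data).

    import numpy as np
    from scipy.stats import kurtosis

    def flag_artifact_components(activations, mixing, z_thresh=3.0):
        # activations: (n_components, n_samples) independent component time courses.
        # mixing:      (n_channels, n_components) component scalp maps.
        # Temporal feature: kurtosis of the time course (blink-like transients
        # produce heavy-tailed, high-kurtosis activations).
        temporal_kurtosis = kurtosis(activations, axis=1, fisher=True)
        # Spatial feature: peak-to-mean ratio of the absolute scalp map
        # (a few dominant channels suggest a focal, artifact-like source).
        absmap = np.abs(mixing)
        focality = absmap.max(axis=0) / (absmap.mean(axis=0) + 1e-12)
        def z(x):
            return (x - x.mean()) / (x.std() + 1e-12)
        # Flag a component when either feature is an outlier across components.
        return (z(temporal_kurtosis) > z_thresh) | (z(focality) > z_thresh)

Flagged components would then be removed before reconstructing the channel data, which is the ICA-based cleanup step that PREP deliberately leaves to later stages of processing.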

              FASTER: Fully Automated Statistical Thresholding for EEG artifact Rejection.

              Electroencephalogram (EEG) data are typically contaminated with artifacts (e.g., by eye movements). The effect of artifacts can be attenuated by deleting data with amplitudes over a certain value, for example. Independent component analysis (ICA) separates EEG data into neural activity and artifact; once identified, artifactual components can be deleted from the data. Often, artifact rejection algorithms require supervision (e.g., training using canonical artifacts). Many artifact rejection methods are time consuming when applied to high-density EEG data. We describe FASTER (Fully Automated Statistical Thresholding for EEG artifact Rejection). Parameters were estimated for various aspects of data (e.g., channel variance) in both the EEG time series and in the independent components of the EEG: outliers were detected and removed. FASTER was tested on both simulated EEG (n=47) and real EEG (n=47) data on 128-, 64-, and 32-scalp electrode arrays. FASTER was compared to supervised artifact detection by experts and to a variant of the Statistical Control for Dense Arrays of Sensors (SCADS) method. FASTER had >90% sensitivity and specificity for detection of contaminated channels, eye movement and EMG artifacts, linear trends and white noise. FASTER generally had >60% sensitivity and specificity for detection of contaminated epochs, vs. 0.15% for SCADS. FASTER also aggregates the ERP across subject datasets, and detects outlier datasets. The variance in the ERP baseline, a measure of noise, was significantly lower for FASTER than either the supervised or SCADS methods. ERP amplitude did not differ significantly between FASTER and the supervised approach. Copyright 2010 Elsevier B.V. All rights reserved.
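The statistical thresholding FASTER applies to channels can be illustrated with a short sketch. Only two of the channel parameters are shown, and the function name and parameter choices are assumptions for illustration; FASTER computes additional parameters (such as the Hurst exponent) and applies the same z-score logic to epochs and independent components.

    import numpy as np

    def detect_bad_channels(data, z_thresh=3.0):
        # data: (n_channels, n_samples) EEG array.
        # Parameter 1: per-channel variance (dead or saturated channels are outliers).
        variance = data.var(axis=1)
        # Parameter 2: mean absolute correlation with the other channels
        # (a channel that correlates with nothing is suspect).
        corr = np.corrcoef(data)
        np.fill_diagonal(corr, np.nan)
        mean_corr = np.nanmean(np.abs(corr), axis=1)
        def z(x):
            return (x - x.mean()) / (x.std() + 1e-12)
        # A channel is marked contaminated if any z-scored parameter is an outlier.
        return (np.abs(z(variance)) > z_thresh) | (np.abs(z(mean_corr)) > z_thresh)

The appeal of this family of methods, and the reason PREP compares against them, is that a fixed statistical threshold removes the per-dataset manual tuning that makes large-scale automated processing impractical.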

                Author and article information

Journal
Frontiers in Neuroinformatics (Front. Neuroinform.), Frontiers Media S.A.
ISSN: 1662-5196
Published: 18 June 2015
Volume 9, Article 16
                Affiliations
1. Syntrogi Inc., San Diego, CA, USA
2. Swartz Center for Computational Neuroscience, University of California San Diego, La Jolla, CA, USA
3. Department of Computer Science, University of Texas at San Antonio, San Antonio, TX, USA
                Author notes

                Edited by: Arjen Van Ooyen, VU University Amsterdam, Netherlands

                Reviewed by: Adam R. Ferguson, University of California San Francisco, USA; Andreas Widmann, University of Leipzig, Germany

*Correspondence: Kay A. Robbins, Department of Computer Science, University of Texas at San Antonio, One UTSA Circle, San Antonio, TX 78249, USA. kay.robbins@utsa.edu
Article
DOI: 10.3389/fninf.2015.00016
PMCID: PMC4471356
PMID: 26150785
                Copyright © 2015 Bigdely-Shamlo, Mullen, Kothe, Su and Robbins.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 22 March 2015
Accepted: 02 June 2015
                Page count
                Figures: 9, Tables: 5, Equations: 0, References: 38, Pages: 20, Words: 12715
                Funding
                Funded by: Army Research Laboratory
                Award ID: W911NF-10-2-0022
                Funded by: NIH
                Award ID: 1R01MH084819-03
                Categories
                Neuroscience
                Methods

Neurosciences
eeg, artifact, preprocessing, eeglab, bcilab, machine learning, big data
