Decoding full epochs
We now want to use machine learning to test whether two selected conditions in one of our experiments can be distinguished from the signals. We will load the epochs (the epochs saved before running ICA or SSP). For example, we can look at the difference between auditory left and auditory right stimulation.

We first train a classifier: we show it a set of labelled epochs, telling it "this one is auditory left, this one is auditory right", so that it learns the spatio-temporal patterns that distinguish the two conditions. When we then show it epochs it has not seen before, it should be able to tell us which condition each one most likely belongs to.
The classification pipeline combines three steps:

- the MNE `Scaler()`, which is aware of the different channel types, so it knows how to treat magnetometer, gradiometer, and EEG data and scale each of them properly;
- a `Vectorizer()`, which reshapes the data into the 2D (samples × features) array that the classifier expects;
- a `LogisticRegression()` classifier.

Finally, we use a cross-validation scheme, meaning the data are automatically split into training and test sets several times, and we use the area under the ROC curve (AUC) to evaluate classifier performance.
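The pipeline described above can be sketched as follows. To keep the example self-contained and runnable, it uses synthetic data and plain scikit-learn, with a reshape plus `StandardScaler` standing in for MNE's channel-type-aware `Scaler` and its `Vectorizer`; in the app itself, the MNE classes are applied to the loaded epochs instead.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic "epochs": (n_epochs, n_channels, n_times), two conditions
n_epochs, n_channels, n_times = 100, 20, 50
X = rng.standard_normal((n_epochs, n_channels, n_times))
y = np.repeat([0, 1], n_epochs // 2)  # 0 = "auditory left", 1 = "auditory right"

# Inject a small class-dependent spatial pattern so decoding is above chance
X[y == 1, :5, :] += 0.5

# "Vectorizer" step: flatten (channels x times) into one feature vector per epoch
X_vec = X.reshape(n_epochs, -1)

# Scale the features, then classify with logistic regression
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Cross-validation: several train/test splits, scored by area under the ROC curve
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X_vec, y, cv=cv, scoring="roc_auc")
print(f"Mean AUC across folds: {scores.mean():.2f}")
```

An AUC of 0.5 corresponds to chance level; values approaching 1.0 indicate that the spatio-temporal patterns reliably separate the two conditions.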
Links to the App:
https://github.com/zahransa/app-decoding-full-epochs
https://brainlife.io/app/62ab109aab3e669780634426
Inputs of the App:
| File | Format | Datatype | Description |
|---|---|---|---|
| Epochs | .fif | neuro/meeg/mne/epochs | Epoched data |
| Configuration parameter | Type | Description |
|---|---|---|
| event_condition | string | The event conditions to compare (e.g. auditory left vs. auditory right). |
Outputs of the App: