Commit c4b0d77
Update documentation
1 parent a29eba0 commit c4b0d77

28 files changed: +132 −125 lines changed
4 binary files not shown.

_sources/examples/dynamic_stopping.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -819,7 +819,7 @@ Additionally, the evaluation is performed on a single participant only.
 
 .. rst-class:: sphx-glr-timing
 
-**Total running time of the script:** (0 minutes 14.976 seconds)
+**Total running time of the script:** (0 minutes 15.041 seconds)
 
 
 .. _sphx_glr_download_examples_dynamic_stopping.py:

_sources/examples/ecca.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -564,7 +564,7 @@ Analyse multiple participants
 
 .. rst-class:: sphx-glr-timing
 
-**Total running time of the script:** (0 minutes 5.391 seconds)
+**Total running time of the script:** (0 minutes 5.302 seconds)
 
 
 .. _sphx_glr_download_examples_ecca.py:

_sources/examples/epoch_cca_lda.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -613,7 +613,7 @@ Ideally, this parameter is explored, to estimate a so-called decoding curve.
 
 .. rst-class:: sphx-glr-timing
 
-**Total running time of the script:** (0 minutes 8.174 seconds)
+**Total running time of the script:** (0 minutes 8.076 seconds)
 
 
 .. _sphx_glr_download_examples_epoch_cca_lda.py:

_sources/examples/etrca.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -573,7 +573,7 @@ Analyse multiple participants
 
 .. rst-class:: sphx-glr-timing
 
-**Total running time of the script:** (0 minutes 16.858 seconds)
+**Total running time of the script:** (0 minutes 16.547 seconds)
 
 
 .. _sphx_glr_download_examples_etrca.py:

_sources/examples/fbrcca.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -222,7 +222,7 @@ filterbank.
 
 .. rst-class:: sphx-glr-timing
 
-**Total running time of the script:** (0 minutes 48.992 seconds)
+**Total running time of the script:** (0 minutes 46.491 seconds)
 
 
 .. _sphx_glr_download_examples_fbrcca.py:

_sources/examples/rcca.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -711,7 +711,7 @@ Analyse multiple participants
 
 .. rst-class:: sphx-glr-timing
 
-**Total running time of the script:** (1 minutes 45.912 seconds)
+**Total running time of the script:** (1 minutes 39.472 seconds)
 
 
 .. _sphx_glr_download_examples_rcca.py:

_sources/examples/sg_execution_times.rst.txt

Lines changed: 7 additions & 7 deletions
@@ -6,7 +6,7 @@
 
 Computation times
 =================
-**03:20.302** total execution time for 6 files **from examples**:
+**03:10.930** total execution time for 6 files **from examples**:
 
 .. container::
 
@@ -33,20 +33,20 @@ Computation times
   - Time
   - Mem (MB)
 * - :ref:`sphx_glr_examples_rcca.py` (``rcca.py``)
-  - 01:45.912
+  - 01:39.472
   - 0.0
 * - :ref:`sphx_glr_examples_fbrcca.py` (``fbrcca.py``)
-  - 00:48.992
+  - 00:46.491
   - 0.0
 * - :ref:`sphx_glr_examples_etrca.py` (``etrca.py``)
-  - 00:16.858
+  - 00:16.547
   - 0.0
 * - :ref:`sphx_glr_examples_dynamic_stopping.py` (``dynamic_stopping.py``)
-  - 00:14.976
+  - 00:15.041
   - 0.0
 * - :ref:`sphx_glr_examples_epoch_cca_lda.py` (``epoch_cca_lda.py``)
-  - 00:08.174
+  - 00:08.076
   - 0.0
 * - :ref:`sphx_glr_examples_ecca.py` (``ecca.py``)
-  - 00:05.391
+  - 00:05.302
   - 0.0

_sources/index.rst.txt

Lines changed: 20 additions & 41 deletions
@@ -1,13 +1,9 @@
 PyntBCI
 =======
 
-The Python Noise-Tagging Brain-Computer interface (PyntBCI) library is a Python toolbox for the noise-tagging
-brain-computer interfacing (BCI) project developed at the Donders Institute for Brain, Cognition and Behaviour, Radboud
-University, Nijmegen, the Netherlands. PyntBCI contains various signal processing steps and machine learning algorithms
-for BCIs that make use of evoked responses of the electroencephalogram (EEG), specifically code-modulated responses such
-as the code-modulated visual evoked potential (c-VEP). For a constructive review of this field, see [7]_.
+The Python Noise-Tagging Brain-Computer interface (PyntBCI) library is a Python toolbox for the noise-tagging brain-computer interfacing (BCI) project developed at the Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, the Netherlands. PyntBCI contains various signal processing steps and machine learning algorithms for BCIs that make use of evoked responses of the electroencephalogram (EEG), specifically code-modulated responses such as the code-modulated visual evoked potential (c-VEP). For a constructive review of this field, see [mar2021]_.
 
-When using PyntBCI, please reference at least one of the following: [1]_, [2]_, [5]_.
+When using PyntBCI, please reference at least one of the following articles: [thi2015]_, [thi2017]_, [thi2021]_.
 
 Installation
 ------------
@@ -19,57 +15,40 @@ To install PyntBCI, use:
 Getting started
 ---------------
 
-Various tutorials and example analysis pipelines are provided in the `tutorials/` (under Getting Started) and
-`examples/` (under Examples) folder, which operate on limited preprocessed data as provided with PyntBCI. Furthermore,
-please find various pipelines for several of the datasets referenced below in the `pipelines/` folder.
+Various tutorials and example analysis pipelines are provided in the `tutorials/` (under Getting Started) and `examples/` (under Examples) folder, which operate on limited preprocessed data as provided with PyntBCI. Furthermore, please find various pipelines for several of the datasets referenced below in the `pipelines/` folder.
 
 References
 ----------
 
-.. [1]: Thielen, J., van den Broek, P., Farquhar, J., & Desain, P. (2015). Broad-Band visually evoked potentials:
-re(con)volution in brain-computer interfacing. PLOS ONE, 10(7), e0133797.
-DOI: `10.1371/journal.pone.0133797<https://doi.org/10.1371/journal.pone.0133797>`_
+.. [thi2015] Thielen, J., van den Broek, P., Farquhar, J., & Desain, P. (2015). Broad-Band visually evoked potentials: re(con)volution in brain-computer interfacing. PLOS ONE, 10(7), e0133797. DOI: `10.1371/journal.pone.0133797 <https://doi.org/10.1371/journal.pone.0133797>`_
 
-.. [2]: Thielen, J., Marsman, P., Farquhar, J., & Desain, P. (2017). Re(con)volution: accurate response prediction for
-broad-band evoked potentials-based brain computer interfaces. In Brain-Computer Interface Research (pp. 35-42).
-Springer, Cham. DOI: `10.1007/978-3-319-64373-1_4<https://doi.org/10.1007/978-3-319-64373-1_4>`_
+.. [thi2017] Thielen, J., Marsman, P., Farquhar, J., & Desain, P. (2017). Re(con)volution: accurate response prediction for broad-band evoked potentials-based brain computer interfaces. In Brain-Computer Interface Research (pp. 35-42). Springer, Cham. DOI: `10.1007/978-3-319-64373-1_4 <https://doi.org/10.1007/978-3-319-64373-1_4>`_
 
-.. [3]: Desain, P. W. M., Thielen, J., van den Broek, P. L. C., & Farquhar, J. D. R. (2019). U.S. Patent No. 10,314,508.
-Washington, DC: U.S. Patent and Trademark Office. `Link<https://patentimages.storage.googleapis.com/40/a3/bb/65db00c7de99ec/US10314508.pdf>`_
+.. [des2019] Desain, P. W. M., Thielen, J., van den Broek, P. L. C., & Farquhar, J. D. R. (2019). U.S. Patent No. 10,314,508. Washington, DC: U.S. Patent and Trademark Office. `Link <https://patentimages.storage.googleapis.com/40/a3/bb/65db00c7de99ec/US10314508.pdf>`_
 
-.. [4]: Ahmadi, S., Borhanazad, M., Tump, D., Farquhar, J., & Desain, P. (2019). Low channel count montages using sensor
-tying for VEP-based BCI. Journal of Neural Engineering, 16(6), 066038.
-DOI: `10.1088/1741-2552/ab4057<https://doi.org/10.1088/1741-2552/ab4057>`_
+.. [ahm2019] Ahmadi, S., Borhanazad, M., Tump, D., Farquhar, J., & Desain, P. (2019). Low channel count montages using sensor tying for VEP-based BCI. Journal of Neural Engineering, 16(6), 066038. DOI: `10.1088/1741-2552/ab4057 <https://doi.org/10.1088/1741-2552/ab4057>`_
 
-.. [5]: Thielen, J., Marsman, P., Farquhar, J., & Desain, P. (2021). From full calibration to zero training for a
-code-modulated visual evoked potentials for brain–computer interface. Journal of Neural Engineering, 18(5), 056007.
-DOI: `10.1088/1741-2552/abecef<https://doi.org/10.1088/1741-2552/abecef>`_
+.. [thi2021] Thielen, J., Marsman, P., Farquhar, J., & Desain, P. (2021). From full calibration to zero training for a code-modulated visual evoked potentials for brain–computer interface. Journal of Neural Engineering, 18(5), 056007. DOI: `10.1088/1741-2552/abecef <https://doi.org/10.1088/1741-2552/abecef>`_
 
-.. [6]: Verbaarschot, C., Tump, D., Lutu, A., Borhanazad, M., Thielen, J., van den Broek, P., ... & Desain, P. (2021). A
-visual brain-computer interface as communication aid for patients with amyotrophic lateral sclerosis. Clinical
-Neurophysiology, 132(10), 2404-2415. DOI: `10.1016/j.clinph.2021.07.012<https://doi.org/10.1016/j.clinph.2021.07.012>`_
+.. [ver2021] Verbaarschot, C., Tump, D., Lutu, A., Borhanazad, M., Thielen, J., van den Broek, P., ... & Desain, P. (2021). A visual brain-computer interface as communication aid for patients with amyotrophic lateral sclerosis. Clinical Neurophysiology, 132(10), 2404-2415. DOI: `10.1016/j.clinph.2021.07.012 <https://doi.org/10.1016/j.clinph.2021.07.012>`_
 
-.. [7]: Martínez-Cagigal, V., Thielen, J., Santamaría-Vázquez, E., Pérez-Velasco, S., Desain, P., & Hornero, R. (2021).
-Brain–computer interfaces based on code-modulated visual evoked potentials (c-VEP): a literature review. Journal of
-Neural Engineering. DOI: `10.1088/1741-2552/ac38cf<https://doi.org/10.1088/1741-2552/ac38cf>`_
+.. [mar2021] Martínez-Cagigal, V., Thielen, J., Santamaría-Vázquez, E., Pérez-Velasco, S., Desain, P., & Hornero, R. (2021). Brain–computer interfaces based on code-modulated visual evoked potentials (c-VEP): a literature review. Journal of Neural Engineering. DOI: `10.1088/1741-2552/ac38cf <https://doi.org/10.1088/1741-2552/ac38cf>`_
 
-.. [8]: Thielen, J. (2023). Effects of Stimulus Sequences on Brain-Computer Interfaces Using Code-Modulated Visual
-Evoked Potentials: An Offline Simulation. In International Work-Conference on Artificial Neural Networks (pp. 555-568).
-Cham: Springer Nature Switzerland. DOI: `10.1007/978-3-031-43078-7_45<https://doi.org/10.1007/978-3-031-43078-7_45>`_
+.. [thi2023] Thielen, J. (2023). Effects of Stimulus Sequences on Brain-Computer Interfaces Using Code-Modulated Visual Evoked Potentials: An Offline Simulation. In International Work-Conference on Artificial Neural Networks (pp. 555-568). Cham: Springer Nature Switzerland. DOI: `10.1007/978-3-031-43078-7_45 <https://doi.org/10.1007/978-3-031-43078-7_45>`_
 
 Datasets
 --------
 
-On the Radboud Data Repository (`RDR<https://data.ru.nl/>`_):
-- Thielen et al. (2018) Broad-Band Visually Evoked Potentials: Re(con)volution in Brain-Computer Interfacing.
-DOI: `10.34973/1ecz-1232<https://doi.org/10.34973/1ecz-1232>`_
-- Ahmadi et al. (2018) High density EEG measurement. DOI: `10.34973/psaf-mq72<https://doi.org/10.34973/psaf-mq72>`_
-- Ahmadi et al. (2019) Sensor tying. DOI: `10.34973/ehq6-b836<https://doi.org/10.34973/ehq6-b836>`_
-- Thielen et al. (2021) From full calibration to zero training for a code-modulated visual evoked potentials brain
-computer interface. DOI: `10.34973/9txv-z787<https://doi.org/10.34973/9txv-z787>`_
+On the Radboud Data Repository (`RDR <https://data.ru.nl/>`_):
 
-On Mother of all BCI Benchmarks (`MOABB<https://moabb.neurotechx.com/docs/index.html>`_):
-- c-VEP dataset from Thielen et al. (2021) `link<https://moabb.neurotechx.com/docs/generated/moabb.datasets.Thielen2021.html#moabb.datasets.Thielen2021>`_
+.. [thi2018rdr] Thielen et al. (2018) Broad-Band Visually Evoked Potentials: Re(con)volution in Brain-Computer Interfacing. DOI: `10.34973/1ecz-1232 <https://doi.org/10.34973/1ecz-1232>`_
+.. [ahm2018rdr] Ahmadi et al. (2018) High density EEG measurement. DOI: `10.34973/psaf-mq72 <https://doi.org/10.34973/psaf-mq72>`_
+.. [ahm2019rdr] Ahmadi et al. (2019) Sensor tying. DOI: `10.34973/ehq6-b836 <https://doi.org/10.34973/ehq6-b836>`_
+.. [thi2021rdr] Thielen et al. (2021) From full calibration to zero training for a code-modulated visual evoked potentials brain computer interface. DOI: `10.34973/9txv-z787 <https://doi.org/10.34973/9txv-z787>`_
+
+On Mother of all BCI Benchmarks (`MOABB <https://moabb.neurotechx.com/docs/index.html>`_):
+
+.. [thi2021moabb] c-VEP dataset from Thielen et al. (2021). `Link <https://moabb.neurotechx.com/docs/generated/moabb.datasets.Thielen2021.html#moabb.datasets.Thielen2021>`_
 
 Contact
 -------
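Beyond the timing refresh, the `index.rst.txt` diff above fixes two recurring reStructuredText problems: numeric citation labels (``[1]_``) are replaced by mnemonic ones (``[thi2015]_``), the stray colon after the citation marker (``.. [1]:``) is dropped (with the colon, docutils will likely not recognize the line as a citation at all), and a space is added between the link text and the angle-bracketed URL, without which docutils does not parse the construct as an embedded URI. A minimal sketch of the corrected syntax (entry text abbreviated, not the full reference):

```rst
References
----------

.. [thi2015] Thielen, J., et al. (2015). Broad-Band visually evoked potentials:
   re(con)volution in brain-computer interfacing. PLOS ONE, 10(7), e0133797.
   DOI: `10.1371/journal.pone.0133797 <https://doi.org/10.1371/journal.pone.0133797>`_

When using PyntBCI, please reference [thi2015]_.
```

Note the space in `` `text <url>`_ ``: without it the whole string is treated as a plain reference name rather than link text plus target, which is why the old entries rendered without working hyperlinks.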

_sources/sg_execution_times.rst.txt

Lines changed: 8 additions & 8 deletions
@@ -6,7 +6,7 @@
 
 Computation times
 =================
-**03:49.669** total execution time for 7 files **from all galleries**:
+**03:37.163** total execution time for 7 files **from all galleries**:
 
 .. container::
 
@@ -33,23 +33,23 @@ Computation times
   - Time
   - Mem (MB)
 * - :ref:`sphx_glr_examples_rcca.py` (``../examples/rcca.py``)
-  - 01:45.912
+  - 01:39.472
   - 0.0
 * - :ref:`sphx_glr_examples_fbrcca.py` (``../examples/fbrcca.py``)
-  - 00:48.992
+  - 00:46.491
   - 0.0
 * - :ref:`sphx_glr_tutorials_tutorial_getting_started.py` (``../tutorials/tutorial_getting_started.py``)
-  - 00:29.366
+  - 00:26.234
   - 0.0
 * - :ref:`sphx_glr_examples_etrca.py` (``../examples/etrca.py``)
-  - 00:16.858
+  - 00:16.547
   - 0.0
 * - :ref:`sphx_glr_examples_dynamic_stopping.py` (``../examples/dynamic_stopping.py``)
-  - 00:14.976
+  - 00:15.041
   - 0.0
 * - :ref:`sphx_glr_examples_epoch_cca_lda.py` (``../examples/epoch_cca_lda.py``)
-  - 00:08.174
+  - 00:08.076
   - 0.0
 * - :ref:`sphx_glr_examples_ecca.py` (``../examples/ecca.py``)
-  - 00:05.391
+  - 00:05.302
   - 0.0

_sources/tutorials/sg_execution_times.rst.txt

Lines changed: 2 additions & 2 deletions
@@ -6,7 +6,7 @@
 
 Computation times
 =================
-**00:29.366** total execution time for 1 file **from tutorials**:
+**00:26.234** total execution time for 1 file **from tutorials**:
 
 .. container::
 
@@ -33,5 +33,5 @@ Computation times
   - Time
   - Mem (MB)
 * - :ref:`sphx_glr_tutorials_tutorial_getting_started.py` (``tutorial_getting_started.py``)
-  - 00:29.366
+  - 00:26.234
   - 0.0

_sources/tutorials/tutorial_getting_started.rst.txt

Lines changed: 1 addition & 1 deletion
@@ -517,7 +517,7 @@ so-called learning curve, which shows the performance as a function of the amoun
 
 .. rst-class:: sphx-glr-timing
 
-**Total running time of the script:** (0 minutes 29.366 seconds)
+**Total running time of the script:** (0 minutes 26.234 seconds)
 
 
 .. _sphx_glr_download_tutorials_tutorial_getting_started.py:

examples/dynamic_stopping.html

Lines changed: 1 addition & 1 deletion
@@ -710,7 +710,7 @@ <h2>Overall comparison<a class="headerlink" href="#overall-comparison" title="Li
 <img src="../_images/sphx_glr_dynamic_stopping_011.png" srcset="../_images/sphx_glr_dynamic_stopping_011.png" alt="comparison of dynamic stopping methods averaged across folds" class = "sphx-glr-single-img"/><div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Text(0.5, 1.0, &#39;comparison of dynamic stopping methods averaged across folds&#39;)
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 14.976 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 15.041 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-examples-dynamic-stopping-py">
 <div class="sphx-glr-download sphx-glr-download-jupyter docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/e6d03f3e62b6aaa70fbb647e28a129b9/dynamic_stopping.ipynb"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Jupyter</span> <span class="pre">notebook:</span> <span class="pre">dynamic_stopping.ipynb</span></code></a></p>

examples/ecca.html

Lines changed: 1 addition & 1 deletion
@@ -493,7 +493,7 @@ <h2>Analyse multiple participants<a class="headerlink" href="#analyse-multiple-p
 <img src="../_images/sphx_glr_ecca_008.png" srcset="../_images/sphx_glr_ecca_008.png" alt="Decoding performance full dataset" class = "sphx-glr-single-img"/><div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Average accuracy: 0.92
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 5.391 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 5.302 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-examples-ecca-py">
 <div class="sphx-glr-download sphx-glr-download-jupyter docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/3887a9ed47244b68cee76c93140624fd/ecca.ipynb"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Jupyter</span> <span class="pre">notebook:</span> <span class="pre">ecca.ipynb</span></code></a></p>

examples/epoch_cca_lda.html

Lines changed: 1 addition & 1 deletion
@@ -524,7 +524,7 @@ <h2>Epoch to trial decoding with CCA and LDA<a class="headerlink" href="#epoch-t
 Text(0.5, 1.0, &#39;CCA+LDA: classification accuracy (trial): avg=0.99 std=0.02&#39;)
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 8.174 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 8.076 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-examples-epoch-cca-lda-py">
 <div class="sphx-glr-download sphx-glr-download-jupyter docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/f105c771db3c7944db6dbbb77d7163b0/epoch_cca_lda.ipynb"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Jupyter</span> <span class="pre">notebook:</span> <span class="pre">epoch_cca_lda.ipynb</span></code></a></p>

examples/etrca.html

Lines changed: 1 addition & 1 deletion
@@ -501,7 +501,7 @@ <h2>Analyse multiple participants<a class="headerlink" href="#analyse-multiple-p
 <img src="../_images/sphx_glr_etrca_008.png" srcset="../_images/sphx_glr_etrca_008.png" alt="Decoding performance full dataset" class = "sphx-glr-single-img"/><div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Average accuracy: 0.73
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 16.858 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 16.547 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-examples-etrca-py">
 <div class="sphx-glr-download sphx-glr-download-jupyter docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/7bab6d21a7a703cf2852f1876e32612e/etrca.ipynb"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Jupyter</span> <span class="pre">notebook:</span> <span class="pre">etrca.ipynb</span></code></a></p>

examples/fbrcca.html

Lines changed: 1 addition & 1 deletion
@@ -264,7 +264,7 @@ <h2>Analyse multiple participants<a class="headerlink" href="#analyse-multiple-p
 rCCA 30.0-60.0: 0.39 +/- 0.04
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 48.992 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (0 minutes 46.491 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-examples-fbrcca-py">
 <div class="sphx-glr-download sphx-glr-download-jupyter docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/0d6ec0989ccfcbad1497538b456ab21e/fbrcca.ipynb"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Jupyter</span> <span class="pre">notebook:</span> <span class="pre">fbrcca.ipynb</span></code></a></p>

examples/rcca.html

Lines changed: 1 addition & 1 deletion
@@ -586,7 +586,7 @@ <h2>Analyse multiple participants<a class="headerlink" href="#analyse-multiple-p
 <img src="../_images/sphx_glr_rcca_011.png" srcset="../_images/sphx_glr_rcca_011.png" alt="Decoding performance full dataset" class = "sphx-glr-single-img"/><div class="sphx-glr-script-out highlight-none notranslate"><div class="highlight"><pre><span></span>Average accuracy: 0.98
 </pre></div>
 </div>
-<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (1 minutes 45.912 seconds)</p>
+<p class="sphx-glr-timing"><strong>Total running time of the script:</strong> (1 minutes 39.472 seconds)</p>
 <div class="sphx-glr-footer sphx-glr-footer-example docutils container" id="sphx-glr-download-examples-rcca-py">
 <div class="sphx-glr-download sphx-glr-download-jupyter docutils container">
 <p><a class="reference download internal" download="" href="../_downloads/da0a2a0786140090a1e5b7c21a1e9d27/rcca.ipynb"><code class="xref download docutils literal notranslate"><span class="pre">Download</span> <span class="pre">Jupyter</span> <span class="pre">notebook:</span> <span class="pre">rcca.ipynb</span></code></a></p>
