visible three level 3, checking that everything works
Mattehub committed Nov 9, 2023
1 parent ab5fbd3 commit 9d02b48
Showing 3 changed files with 14 additions and 3 deletions.
2 changes: 1 addition & 1 deletion docs/overview/index.rst
@@ -4,7 +4,7 @@ Overview
This section contains References and Mathematical background about information-theory and HOI

.. toctree::
-   :maxdepth: 2
+   :maxdepth: 3

ovw_theory
ovw_refs
4 changes: 2 additions & 2 deletions docs/overview/ovw_theory.rst
@@ -43,8 +43,8 @@ Where:
:math:`H(X,Y)` is the joint entropy of :math:`X` and :math:`Y`.
The MI between two variables quantifies how much knowing one variable reduces the uncertainty about the other, and thus measures their interdependency. If the variables are independent, :math:`H(X,Y)=H(X)+H(Y)` and hence :math:`MI(X,Y)=0`. Since MI can be written as a signed sum of entropies, the problem of estimating MI from continuous data reduces to the problem, discussed above, of estimating entropies. A recently developed estimator with interesting properties for computing MI is the Gaussian copula estimator :cite:`ince2017statistical`. It is based on the statistical theory of copulas and is proven to provide a lower bound to the true value of MI, which is one of its main advantages: when computing MI, the Gaussian copula estimator avoids false positives. Note, however, that it is mainly suited to investigating monotonic relationships between two variables.
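
For readers who want to see the estimator in action, here is a minimal, self-contained sketch of a Gaussian-copula MI estimate in Python. The names ``copnorm`` and ``gc_mi`` are illustrative, not the toolbox's own API, and the bias corrections used in practice are omitted here.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copnorm(x):
    """Map samples onto standard-normal quantiles via their empirical ranks."""
    ranks = rankdata(x) / (len(x) + 1)
    return norm.ppf(ranks)

def gc_mi(x, y):
    """Gaussian-copula estimate of MI(X; Y) in bits for two 1D variables."""
    cx, cy = copnorm(x), copnorm(y)
    rho = np.corrcoef(cx, cy)[0, 1]
    # MI of a bivariate Gaussian with correlation rho, in bits
    return -0.5 * np.log2(1.0 - rho ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.exp(x) + 0.1 * rng.normal(size=5000)   # monotonic, nonlinear dependence on x
print(gc_mi(x, y))                            # clearly positive
print(gc_mi(x, rng.normal(size=5000)))        # close to zero for independent data
```

Because the rank transform keeps only the ordering of the samples, the estimate is invariant to monotonic transformations of each variable, which is exactly why the estimator is well suited to monotonic relationships.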

-From pairwise to higher-order interactions - higher-order metrics
-+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+From pairwise to higher-order interactions
+++++++++++++++++++++++++++++++++++++++++++

The information-theoretic metrics involved in this work are all based, in principle, on the concepts of Shannon entropy and mutual information. Given a set of variables, a common approach to investigating their interactions is to compare the entropy and information of the joint probability distribution of the whole set with the entropy and information of different subsets. This can be done in many different ways, unveiling different aspects of HOIs :cite:`timme2014synergy, varley2023information`. The metrics implemented in the toolbox fall into two main categories: one group focuses on the relationship between a set of source variables and a target variable, while the other measures the interactions within a set of variables. In the following, we go through the metrics implemented in the toolbox, providing some insight into their theoretical foundations and possible interpretations.
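
As an illustration of the second category (interactions within a set of variables), here is a hedged sketch of one of the simplest such comparisons, the total correlation, computed under a Gaussian assumption. The function names and the Gaussian shortcut are assumptions made for the example, not the toolbox's API.

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy (bits) of a Gaussian with covariance matrix `cov`."""
    cov = np.atleast_2d(cov)
    d = cov.shape[0]
    return 0.5 * np.log2((2 * np.pi * np.e) ** d * np.linalg.det(cov))

def total_correlation(data):
    """Sum of marginal entropies minus the joint entropy (Gaussian assumption).

    `data` has shape (n_samples, n_variables); zero means the variables are
    independent, larger values mean stronger collective dependence.
    """
    cov = np.cov(data, rowvar=False)
    h_joint = gaussian_entropy(cov)
    h_marg = sum(gaussian_entropy(cov[i, i]) for i in range(cov.shape[0]))
    return h_marg - h_joint

rng = np.random.default_rng(0)
shared = rng.normal(size=(2000, 1))
data = shared + 0.5 * rng.normal(size=(2000, 3))      # three variables driven by a common source
print(total_correlation(data))                        # > 0: the triplet is interdependent
print(total_correlation(rng.normal(size=(2000, 3))))  # ~ 0: independent variables
```

Total correlation only captures overall statistical dependence within the set; the metrics discussed next refine this kind of joint-versus-subsets comparison in different directions.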

11 changes: 11 additions & 0 deletions docs/refs.bib
@@ -240,6 +240,17 @@ @article{moon1995estimation
publisher={APS}
}

+@article{ince2017statistical,
+title={A statistical framework for neuroimaging data analysis based on mutual information estimated via a gaussian copula},
+author={Ince, Robin AA and Giordano, Bruno L and Kayser, Christoph and Rousselet, Guillaume A and Gross, Joachim and Schyns, Philippe G},
+journal={Human brain mapping},
+volume={38},
+number={3},
+pages={1541--1573},
+year={2017},
+publisher={Wiley Online Library}
+}

@article{kraskov2004estimating,
title={Estimating mutual information},
author={Kraskov, Alexander and St{\"o}gbauer, Harald and Grassberger, Peter},
