From 9d02b48efad4d16181eb577e8ba54e15b7cf5ae7 Mon Sep 17 00:00:00 2001
From: Mattehub
Date: Thu, 9 Nov 2023 15:33:24 +0100
Subject: [PATCH] visible three level 3, checking that everything works

---
 docs/overview/index.rst      |  2 +-
 docs/overview/ovw_theory.rst |  4 ++--
 docs/refs.bib                | 11 +++++++++++
 3 files changed, 14 insertions(+), 3 deletions(-)

diff --git a/docs/overview/index.rst b/docs/overview/index.rst
index 81c151d6..2f2408df 100644
--- a/docs/overview/index.rst
+++ b/docs/overview/index.rst
@@ -4,7 +4,7 @@ Overview
 This section contains References and Mathematical background about information-theory and HOI
 
 .. toctree::
-   :maxdepth: 2
+   :maxdepth: 3
 
    ovw_theory
    ovw_refs
diff --git a/docs/overview/ovw_theory.rst b/docs/overview/ovw_theory.rst
index ecdf8be2..d79facf6 100644
--- a/docs/overview/ovw_theory.rst
+++ b/docs/overview/ovw_theory.rst
@@ -43,8 +43,8 @@ Where: :math:`H(X,Y)` is the joint entropy of :math:`X` and :math:`Y`.
 
 MI between two variables quantifies how much knowing one variable reduces the uncertainty about the other, and thus measures the interdependency between the two variables. If they are independent, we have :math:`H(X,Y)=H(X)+H(Y)`, hence :math:`MI(X,Y)=0`. Since MI can be written as a signed sum of entropies, the problem of how to estimate MI from continuous data reduces to the problem, discussed above, of how to estimate entropies. A recently developed estimator with interesting properties for computing MI is the Gaussian Copula estimator :cite:`ince2017statistical`. This estimator is based on the statistical theory of copulas and is proven to provide a lower bound on the true value of MI. This is one of its main advantages: when computing MI, the Gaussian copula estimator avoids false positives. Note, however, that it is mainly suited to investigating monotonic relationships between two variables.
 
-From pairwise to higher-order interactions - higher-order metrics
-+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+From pairwise to higher-order interactions
+++++++++++++++++++++++++++++++++++++++++++
 
 The information theoretic metrics involved in this work are all based, in principle, on the concepts of Shannon entropy and mutual information. Given a set of variables, a common approach to investigating their interactions is to compare the entropy and information of the joint probability distribution of the whole set with the entropy and information of different subsets. This can be done in many different ways, unveiling different aspects of HOIs :cite:`timme2014synergy, varley2023information`. The metrics implemented in the toolbox can be divided into two main categories: one group of metrics focuses on the relationship between a set of source variables and a target variable, while another group measures the interactions within a set of variables. In the following, we go through all the metrics implemented in the toolbox, providing some insight into their theoretical foundations and possible interpretations.
 
diff --git a/docs/refs.bib b/docs/refs.bib
index 4c43dea0..87ab1a8c 100644
--- a/docs/refs.bib
+++ b/docs/refs.bib
@@ -240,6 +240,17 @@ @article{moon1995estimation
   publisher={APS}
 }
 
+@article{ince2017statistical,
+  title={A statistical framework for neuroimaging data analysis based on mutual information estimated via a Gaussian copula},
+  author={Ince, Robin AA and Giordano, Bruno L and Kayser, Christoph and Rousselet, Guillaume A and Gross, Joachim and Schyns, Philippe G},
+  journal={Human Brain Mapping},
+  volume={38},
+  number={3},
+  pages={1541--1573},
+  year={2017},
+  publisher={Wiley Online Library}
+}
+
 @article{kraskov2004estimating,
   title={Estimating mutual information},
   author={Kraskov, Alexander and St{\"o}gbauer, Harald and Grassberger, Peter},
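
To make the Gaussian copula estimator referenced in docs/overview/ovw_theory.rst concrete, below is a minimal, illustrative sketch of the core idea from :cite:`ince2017statistical`: each variable is rank-transformed to a standard normal margin, and MI is then obtained from the correlation :math:`r` of the transformed variables as :math:`-\tfrac{1}{2}\log_2(1-r^2)`. This is not the toolbox's estimator nor the bias-corrected version from the paper; the helper names ``copnorm`` and ``gc_mi`` are hypothetical, and NumPy/SciPy are assumed to be available.

.. code-block:: python

    import numpy as np
    from scipy.stats import norm, rankdata

    def copnorm(x):
        # Hypothetical helper: empirical copula transform (ranks mapped to (0, 1))
        # followed by the inverse standard-normal CDF.
        return norm.ppf(rankdata(x) / (len(x) + 1))

    def gc_mi(x, y):
        # Gaussian-copula MI between two 1-D variables, in bits.
        # For jointly Gaussian variables MI = -0.5 * log2(1 - r**2); applied after
        # copnorm this yields a lower-bound estimate of the true MI.
        cx, cy = copnorm(x), copnorm(y)
        r = np.corrcoef(cx, cy)[0, 1]
        return -0.5 * np.log2(1 - r ** 2)

    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y_dep = x ** 3 + 0.1 * rng.normal(size=5000)  # monotonic, nonlinear dependence
    y_ind = rng.normal(size=5000)                 # independent of x

    print(gc_mi(x, y_dep))  # clearly positive
    print(gc_mi(x, y_ind))  # close to zero

As stated in the text, independent variables give :math:`MI(X,Y)=0` (up to sampling noise), while a monotonic dependence yields a clearly positive estimate.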