Direct information flow and complex networks
from high-dimensional time series
Kugiumtzis Dimitris
Department of Electrical and Computer Engineering,
Faculty of Engineering, Aristotle University of Thessaloniki, Greece
e-mail: [email protected], http://users.auth.gr/dkugiu
Complex Systems Seminar, University of Western Australia
Perth, 23 February 2017
Outline
1. Interdependence (Granger causality) in multivariate time series and complex networks
2. Dimension reduction on information causality measures (PMIME)
3. Applications of PMIME to neuroscience and finance
[Schematic: two observed variables X_t and Y_t]

correlation: r(X_t ; Y_t)

causality X → Y: r(X_t ; Y_{t+1}), or better r(X_t ; Y_{t+1} | Y_t)

direct vs indirect causality, conditioning on a third observed variable Z_t:
r(X_t ; Y_{t+1} | Y_t , Z_t)
Granger causality measures

Linear, X → Y:
- Granger Causality Index (GCI)
- Directed Coherence (DC) [Wang & Takigawa 1992]
- Directed Transfer Function (DTF) [Kaminski & Blinowska 1991]

Linear, X → Y | Z:
- Conditional / Partial GCI (CGCI/PGCI) [Geweke 1982; Guo et al 2008]
- Restricted Granger Causality Index (RGCI) [Siggiridou & Kugiumtzis 2016]
- Partial DC (PDC) [Baccala & Sameshima 2001]
- Restricted PDC (RPDC) [Siggiridou & Kugiumtzis]
- direct Directed Transfer Function (dDTF) [Korzeniewska et al 2003]

Nonlinear, X → Y:
- Phase Directionality Index (PDI) [Rosenblum & Pikovsky 2001]
- Phase Slope Index (PSI) [Nolte et al 2008]
- Interdependence measures (InterDep) [Arnhold et al 1999; Chicharro & Andrzejak 2009]
- Mean Conditional Recurrence (MCR) [Romano et al 2007]
- Prediction Improvement (PredImprov) [Faes et al 2008]
- Transfer Entropy (TE) [Schreiber 2000]
- Symbolic Transfer Entropy (STE) [Staniek & Lehnertz 2008]
- Transfer Entropy on Rank Vectors (TERV) [Kugiumtzis 2012]
- Mutual Information from Mixed Embedding (MIME) [Vlachos & Kugiumtzis 2010]

Nonlinear, X → Y | Z:
- Partial TE (PTE) [Vakorin et al 2009; Papana et al 2011]
- Partial STE (PSTE) [Papana et al 2014]
- Partial TERV (PTERV) [Kugiumtzis 2013]
- Partial MIME (PMIME) [Kugiumtzis 2013]
Granger causality measures

Bivariate time series {x_t, y_t}, t = 1, ..., n
driving system: X, response system: Y

Idea of Granger causality X → Y [Granger 1969]:
Y is predicted better when X is included in the regression model.

The Granger causality index (GCI) quantifies the extent to which
E[y_{t+1} | y_t, x_t] improves on E[y_{t+1} | y_t].

Evaluate with respect to the distribution rather than just its mean:
p(y_{t+1} | y_t, x_t) vs p(y_{t+1} | y_t)

where
x_t = [x_t, x_{t−τx}, ..., x_{t−(m_x−1)τx}]', embedding parameters m_x, τ_x
y_t = [y_t, y_{t−τy}, ..., y_{t−(m_y−1)τy}]', embedding parameters m_y, τ_y
y_{t+1}: future state of Y
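As a minimal sketch of this definition (my own illustration, not the implementation behind the talk's results): fit the restricted model of Y on its own lags and the full model that adds lags of X by ordinary least squares, and take the log-ratio of residual variances, GCI = ln(s²_restricted / s²_full). The model order p and the toy system are assumptions.

```python
import numpy as np

def gci(x, y, p=2):
    """Granger causality index X -> Y via two least-squares AR fits:
    GCI = ln(residual variance restricted / residual variance full)."""
    n = len(y)
    Y = y[p:]                                         # target y_{t+1}
    lags_y = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    lags_x = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    ones = np.ones((n - p, 1))
    Xr = np.hstack([ones, lags_y])                    # restricted: own lags only
    Xf = np.hstack([ones, lags_y, lags_x])            # full: adds lags of X
    res_r = Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]
    res_f = Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]
    return np.log(np.var(res_r) / np.var(res_f))

# toy system: X drives Y with one-step delay
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
print(gci(x, y), gci(y, x))  # X->Y clearly larger than Y->X
```

With this toy coupling the X → Y index is large while Y → X stays near zero, which is the asymmetry the measure is built to detect.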
Nonlinear causality measures (direct and indirect)

Transfer Entropy (TE) [Schreiber, 2000]
Measures the effect of X on Y one time step ahead, accounting (conditioning) for the effect of Y's own current state:

TE_{X→Y} = I(y_{t+1}; x_t | y_t)
         = H(x_t, y_t) − H(y_{t+1}, x_t, y_t) + H(y_{t+1}, y_t) − H(y_t)
         = Σ p(y_{t+1}, x_t, y_t) log [ p(y_{t+1} | x_t, y_t) / p(y_{t+1} | y_t) ]

Joint entropies (and distributions) can have high dimension!
Entropy estimates from nearest neighbors [Kraskov et al, 2004]

TE is equivalent to GCI when the stochastic process (X, Y) is Gaussian [Barnett et al, PRE 2009].
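A hedged sketch of the TE definition above, with scalar embeddings (m_x = m_y = 1, τ = 1) and a simple plug-in histogram estimator rather than the nearest-neighbor estimator of [Kraskov et al, 2004]; the bin count and toy system are my choices:

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in TE X->Y with scalar embeddings:
    TE = H(x_t,y_t) - H(y_{t+1},x_t,y_t) + H(y_{t+1},y_t) - H(y_t)."""
    def entropy(*cols):
        # joint Shannon entropy (nats) from a multidimensional histogram
        h, _ = np.histogramdd(np.column_stack(cols), bins=bins)
        p = h.ravel() / h.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    yf, xt, yt = y[1:], x[:-1], y[:-1]
    return entropy(xt, yt) - entropy(yf, xt, yt) + entropy(yf, yt) - entropy(yt)

# toy system: X drives Y one step ahead
rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = np.concatenate([[0.0], 0.9 * x[:-1]]) + 0.2 * rng.normal(size=5000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```

The histogram estimator is biased in high dimension, which is exactly the problem the slide flags; it is usable only for this low-dimensional toy.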
Nonlinear causality measures (direct)

driving system: X, response system: Y,
conditioning on the system Z = {Z_1, Z_2, ..., Z_{K−2}};
join all K−2 z-reconstructed vectors: Z_t = [z_{1,t}, ..., z_{K−2,t}]

Partial Transfer Entropy (PTE) [Vakorin et al, 2009; Papana et al, 2012]
Measures the effect of X on Y one time step ahead, accounting (conditioning) for the effect of Y's own current state and the current state of all other observed variables except X:

PTE_{X→Y|Z} = I(y_{t+1}; x_t | y_t, Z_t)
            = H(x_t, y_t | Z_t) − H(y_{t+1}, x_t, y_t | Z_t) + H(y_{t+1}, y_t | Z_t) − H(y_t | Z_t)

Joint entropies (and distributions) can have very high dimension!
Example: coupled Henon maps [Politi & Torcini, 1992]

x_{1,t+1} = 1.4 − x_{1,t}² + 0.3 x_{1,t−1}
x_{i,t+1} = 1.4 − (0.5 C (x_{i−1,t} + x_{i+1,t}) + (1 − C) x_{i,t})² + 0.3 x_{i,t−1},  i = 2, ..., K−1
x_{K,t+1} = 1.4 − x_{K,t}² + 0.3 x_{K,t−1}

C: coupling strength

[Figure: chain network structure for K = 5]
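The system above can be simulated directly; a minimal sketch (chain of K maps with uncoupled endpoints; the transient discard and the small random initial conditions are my assumptions):

```python
import numpy as np

def coupled_henon(K=5, C=0.2, n=2000, discard=500, seed=0):
    """Simulate a chain of K coupled Henon maps; interior maps are driven
    by both neighbours with strength C. Returns an (n, K) array."""
    rng = np.random.default_rng(seed)
    ntot = n + discard
    x = np.zeros((ntot, K))
    x[:2] = 0.1 * rng.random((2, K))        # small random initial conditions
    for t in range(1, ntot - 1):
        x[t + 1, 0] = 1.4 - x[t, 0] ** 2 + 0.3 * x[t - 1, 0]
        x[t + 1, K - 1] = 1.4 - x[t, K - 1] ** 2 + 0.3 * x[t - 1, K - 1]
        for i in range(1, K - 1):
            drive = 0.5 * C * (x[t, i - 1] + x[t, i + 1]) + (1 - C) * x[t, i]
            x[t + 1, i] = 1.4 - drive ** 2 + 0.3 * x[t - 1, i]
    return x[discard:]

data = coupled_henon()
print(data.shape)  # (2000, 5)
```

Applying a bivariate measure such as TE to every ordered pair of columns of `data` reproduces the kind of experiment shown in the next slides.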
[Figures: TE profiles for the coupled Henon maps, K = 5, K = 10 and K = 20]

The curse of dimensionality
• Bivariate measures are likely to produce false couplings (indirect connections).
[Figures: PTE profiles for the coupled Henon maps, K = 4 and K = 8]
The curse of dimensionality
• Bivariate measures are likely to produce false couplings (indirect connections).
• Multivariate measures are likely to produce false couplings. They require long time series:
e.g. PTE_{X→Y|Z} = I(y_{t+1}; x_t | y_t, Z_t) requires the estimation of the entropy of [y_{t+1}, x_t, y_t, Z_t]' of dimension 1 + Km.
Dimension reduction with PMIME [Kugiumtzis, 2013]

Partial Mutual Information from Mixed Embedding (PMIME) applies dimension reduction using mutual information. The idea [Vlachos & Kugiumtzis, 2010]:

1. Find the subset w_y of lagged variables from X, Y and Z that best explains the future of Y.
2. Quantify the information on the future of Y that is explained by the X-components in this subset.

If there are no components of X in w_y, then PMIME = 0.

Similar approaches based on this idea: [Faes et al, PRE 2011; Stramaglia et al, PRE 2012; Runge et al, PRL 2012; Wibral et al, PLoS ONE 2013; Runge et al, PRE 2015; edited book by Wibral, Vicente and Lizier, "Directed Information Measures in Neuroscience", Springer, 2014]
PMIME - 2
The mixed embedding scheme

Start with an empty embedding vector w_t^0, the future vector of Y, y_{t+1}, and a maximum lag L (or L_x for X, L_y for Y, etc.):
W_t = {x_t, ..., x_{t−L+1}, y_t, ..., y_{t−L+1}, z_{1,t}, ..., z_{K−2,t−L+1}}

First embedding cycle: w_{t1} = argmax_{w ∈ W_t} I(y_{t+1}; w), and w_t^1 = (w_{t1}).

At embedding cycle j, suppose w_t^{j−1} = (w_{t1}, w_{t2}, ..., w_{t,j−1}).
Add the w_{tj} ∈ W_t \ w_t^{j−1} that maximizes the mutual information with y_{t+1}, conditioned on the current w_t^{j−1}:
w_{tj} = argmax_{w ∈ W_t \ w_t^{j−1}} I(y_{t+1}; w | w_t^{j−1})

The progressive vector building stops at cycle j (giving w_t = w_t^{j−1}):
- criterion of hard threshold: I(y_{t+1}; w_t^{j−1}) / I(y_{t+1}; w_t^j) > A (here A = 0.95)
- criterion of adaptive threshold: randomization significance test on I(y_{t+1}; w_t^j | w_t^{j−1})
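A toy rendering of this greedy scheme, assuming equiprobable discretization and a plug-in conditional-MI estimator (the actual PMIME uses the k-nearest-neighbor estimator of [Kraskov et al, 2004]); the hard-threshold stopping rule with A = 0.95 is as above:

```python
import numpy as np

def _disc(v, nb=4):
    # equiprobable (quantile) binning of a scalar series into nb symbols
    edges = np.quantile(v, np.linspace(0, 1, nb + 1)[1:-1])
    return np.digitize(v, edges)

def _cmi(a, b, cond):
    """Discrete conditional mutual information I(a; b | cond);
    cond is a (possibly empty) list of discrete columns."""
    joint = np.column_stack([a, b] + cond)
    def ent(idx):
        _, counts = np.unique(joint[:, idx], axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))
    c = list(range(2, joint.shape[1]))
    return (ent([0] + c) + ent([1] + c)
            - ent(list(range(joint.shape[1]))) - (ent(c) if c else 0.0))

def mixed_embedding(series, target, L=3, A=0.95):
    """Greedy mixed embedding for predicting series[:, target] one step
    ahead; returns the selected list of (variable, lag) pairs."""
    n, K = series.shape
    yfut = _disc(series[L:, target])
    cands = {(k, l): _disc(series[L - l:n - l, k])
             for k in range(K) for l in range(1, L + 1)}
    chosen, mi_prev = [], 0.0
    while True:
        cond = [cands[c] for c in chosen]
        best = max((c for c in cands if c not in chosen),
                   key=lambda c: _cmi(yfut, cands[c], cond))
        mi_new = mi_prev + _cmi(yfut, cands[best], cond)
        if chosen and mi_prev / mi_new > A:     # hard-threshold stopping
            break
        chosen.append(best)
        mi_prev = mi_new
    return chosen

# toy system: variable 0 drives variable 1 at lag 1
rng = np.random.default_rng(2)
x = rng.normal(size=(3000, 2))
for t in range(1, 3000):
    x[t, 1] = 0.9 * x[t - 1, 0] + 0.1 * x[t, 1]
sel = mixed_embedding(x, target=1, L=3)
print(sel)  # should contain (0, 1): variable 0 at lag 1
```

The selected pairs then split into the X-, Y- and Z-components of the mixed embedding vector used in the measure on the next slide.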
PMIME - 3

The non-uniform mixed embedding vector of lags of all X, Y, Z for explaining y_{t+1}:
w_t = (x_{t−τ_{x,1}}, ..., x_{t−τ_{x,m_x}}, y_{t−τ_{y,1}}, ..., y_{t−τ_{y,m_y}}, z_{t−τ_{z,1}}, ..., z_{t−τ_{z,m_z}}) = (w_t^x, w_t^y, w_t^z)

The causality measure PMIME:
R_{X→Y|Z} = I(y_{t+1}; w_t^x | w_t^y, w_t^z) / I(y_{t+1}; w_t)

R_{X→Y|Z}: the information on the future of Y explained only by the X-components of the embedding vector (given the components of Y and Z), normalized by the mutual information of the future of Y and the whole embedding vector.

If w_t contains no components from X, then R_{X→Y|Z} = 0 and X has no direct effect on the future of Y.
PMIME - 4
Three main advantages of PMIME:

- R_{X→Y|Z} = 0 when no significant causality is present, and R_{X→Y|Z} > 0 when it is present [no significance test, no issues with multiple testing!]
- the mixed embedding for all variables is formed as part of the measure [it does not require the determination of embedding parameters for each variable]
- the inclusion of more confounding variables only slows the computation and has no effect on statistical accuracy [no "curse of dimensionality" for any dimension of Z, only slower computation]

⇒ a good candidate for causality analysis with many variables
Example: coupled Mackey-Glass

Coupled identical Mackey-Glass delay differential equations:

ẋ_i(t) = −0.1 x_i(t) + Σ_{j=1}^{K} C_{ij} x_j(t − ∆) / (1 + x_j(t − ∆)^{10}),  i = 1, ..., K

K = 5

[Figures: PMIME profiles for C = 0.2, with ∆ = 20 and ∆ = 100]
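Such delay-differential data can be generated with a crude Euler scheme and a delay buffer; a sketch under assumptions of my own (step size h, a ring coupling matrix, and diagonal self-feedback 0.2 so that each unit remains a Mackey-Glass oscillator), not the integrator behind the talk's results:

```python
import numpy as np

def mackey_glass_coupled(C_mat, delta=20.0, h=0.1, n_steps=20000, seed=0):
    """Euler integration of coupled Mackey-Glass equations
    dx_i/dt = -0.1 x_i(t) + sum_j C_ij x_j(t-delta) / (1 + x_j(t-delta)^10)."""
    K = C_mat.shape[0]
    d = int(round(delta / h))                  # delay in integration steps
    rng = np.random.default_rng(seed)
    x = np.zeros((n_steps + d, K))
    x[:d] = 0.5 + 0.1 * rng.random((d, K))     # near-constant random history
    for t in range(d, n_steps + d - 1):
        lagged = x[t - d]
        drive = C_mat @ (lagged / (1.0 + lagged ** 10))
        x[t + 1] = x[t] + h * (-0.1 * x[t] + drive)
    return x[d:]

# K = 5 ring: each unit driven by its left neighbour with strength C = 0.2
K, C = 5, 0.2
C_mat = 0.2 * np.eye(K)          # self-feedback (illustrative assumption)
for i in range(K):
    C_mat[i, (i - 1) % K] = C
traj = mackey_glass_coupled(C_mat)
print(traj.shape)
```

A proper study would use a higher-order DDE solver and the coupling matrix of the intended network type; the sketch only shows the mechanics of the delay buffer.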
Mackey-Glass: true/estimated network [Kugiumtzis & Kimiskidis, 2015]

[Figures: for K = 5 and K = 15, the true coupling network next to the networks estimated by PMIME for ∆ = 20 and for ∆ = 100]
Can different network structures be detected?

Simulation: three types of networks for the generating system.
Generating system: coupled Mackey-Glass system, K = 25, ∆ = 100, C = 0.2, with the coupling structure defined by the network type.
Causality measure: PMIME
[Figures: estimation of the Random, Small-World and Scale-Free networks]
Structural Change

Simulation example: the network structure undergoes structural change at specific time points:
Random ⇒ Small-World ⇒ Scale-Free

- Estimation of networks with PMIME at sliding windows
- Estimation of network characteristics on the PMIME networks
- Structural change detection: [Slow], [Middle], [Fast], [Very fast]
EEG and Transcranial Magnetic Stimulation (TMS)

Jointly with Vassilis Kimiskidis, Medical School, AUTh
[Kimiskidis et al, IJNS, 2013]
[Siggiridou et al, Sensors, 2014]
[Kugiumtzis and Kimiskidis, IJNS, 2015]
TMS-EEG: brain connectivity analysis

How does TMS act on epileptic brain connectivity?
{X_1, X_2, ..., X_K}: K EEG channels, each representing a (sub)system

Practical problems to overcome:
- application on small time windows ⇒ limited data size
- scalp EEG ⇒ many channels ⇒ many variables in Z to account for
- the brain system is complex: the connectivity measure has to deal with high dimensionality, nonlinearity(?), and sensitivity to free parameters(?)

PMIME addresses all these problems!
Example: compare PMIME to other measures on EEG [Kugiumtzis, PRE, 2013]

One epileptiform discharge (ED) episode terminated by transcranial magnetic stimulation (TMS); 45 channels in total.
- Select randomly a subset of channels.
- Compute the connectivity measures on the subset at each sliding window.
- Compute the average connectivity strength at each sliding window.
- Repeat the steps above a number of times (here 12)...
- ...for subsets of 5, 15, 25 and 35 channels, and once for all 45 channels.
Epileptiform discharges induced by TMS [One Episode]

Preprocessing:
- replacement of the TMS artifact, high-order FIR filter
- rejection of channels with artifacts
- reference to infinity (REST) [Qin et al, ClinNeuroph 2010]
- overlapping windows of 2 s, sliding step of 0.5 s
Subject 1 with focal seizure, ED episode ends with TMS [Kugiumtzis & Kimiskidis, 2015]

[Figures: average strength / degree — In-Strength, Out-Strength, In-Out-Strength]
Subject 1 with focal seizure, 13 episodes, average degree
TMS terminates the ED prematurely and restores the network structure as if the ED had terminated spontaneously.
Subject 2 with focal seizure, 9 episodes, average degree
TMS terminates the ED prematurely and restores the network structure as if the ED had terminated spontaneously.
Many network indices (78 in total) computed on the PMIME-causality networks,
for both subjects and for the pairs preED - ED and ED - postED.

Many network indices discriminate well between preED, ED and postED.
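The simplest of such indices are the node strengths of the weighted directed network with entries R[i, j] = PMIME value for i → j; a minimal sketch (the function name is mine):

```python
import numpy as np

def node_strengths(R):
    """In-, out- and total strength per node of a weighted directed
    network; R[i, j] is the weight of the edge i -> j."""
    R = np.array(R, dtype=float, copy=True)
    np.fill_diagonal(R, 0.0)          # ignore self-loops
    out_strength = R.sum(axis=1)      # total outgoing weight per node
    in_strength = R.sum(axis=0)       # total incoming weight per node
    return in_strength, out_strength, in_strength + out_strength

R = np.array([[0.0, 0.4, 0.0],
              [0.0, 0.0, 0.3],
              [0.1, 0.0, 0.0]])
s_in, s_out, s_tot = node_strengths(R)
print(s_in, s_out)  # [0.1 0.4 0.3] [0.4 0.3 0.1]
```

Averaging `s_tot` over nodes gives the average strength tracked across sliding windows in the plots above.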
Subject 1 with genetic generalized epilepsy (GGE), ED induced by TMS [PMIME on 2s windows] [Kugiumtzis et al, 2016]

[Figures: one episode; 29 episodes — average strength and average clustering coefficient]
[Figures: PMIME on 2s windows; RGPDC(α) on 1s windows; iCoh(α) on 1s windows; Synchronization Likelihood (SL) on 1s windows]
Subject 1 with GGE, ED induced by TMS, SL on 1s windows

[Figures: one episode; 29 episodes — average strength for SL, iCOH(α), and SL (other parameters)]
Financial markets

5 top stocks for each of the 8 sectors of the US economy.
Daily closing index in the period 30/12/2002 - 28/9/2012.

Basic materials: XOM (Exxon Mobil Corp), CVX (Chevron Corporation), SLB (Schlumberger N.V.), COP (ConocoPhillips Co.), OXY (Occidental Petroleum)
Healthcare: JNJ (Johnson & Johnson), PFE (Pfizer Inc.), MRK (Merck & Company), BMY (Bristol-Myers), AMGN (Amgen Inc.)
Conglomerates: UTX (United Technologies), MMM (3M), CAT (Caterpillar Inc.), DOW (Dow Chemical Company), MGT (MGT Capital Investments)
Industrials: GE (General Electric), HON (Honeywell), CAT (Caterpillar Inc.), EMR (Emerson), LMT (Lockheed Martin)
Consumer: PG (Procter & Gamble), KO (Coca Cola), PM (Philip Morris Int), PEP (Pepsico Inc.), MO (Altria Group Inc.)
Services: WMT (Wal-Mart Stores), AMZN (Amazon.com), DIS (Walt Disney Company), HD (Home Depot), CMCSA (Comcast Corporation)
Financials: BRK-B (Berkshire Hathaway), WFC (Wells Fargo), JPM (JP Morgan Chase), C (Citigroup), BAC (Bank of America)
Technology: AAPL (Apple Inc.), GOOG (Google), MSFT (Microsoft), IBM (IBM), T (AT&T)

Excluded:
- CAT (Caterpillar Inc.): appears twice (Conglomerates and Industrials)
- PM (Philip Morris Int): index starts 31/3/2008
- GOOG (Google): index starts 19/8/2004
Structural change in 2008, e.g. change point on 22/2/2008 [Dehling et al, 2013]

Causality measures are computed on log-returns, at windows of 400 days with a sliding step of 100 days.
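The log-returns used here are a one-line transform of the closing prices; a minimal sketch on synthetic numbers (the actual stock data are not reproduced):

```python
import numpy as np

def log_returns(prices):
    """Log-returns r_t = ln(p_t / p_{t-1}) of a price series."""
    prices = np.asarray(prices, dtype=float)
    return np.diff(np.log(prices))

prices = np.array([100.0, 101.0, 99.0, 99.0])
r = log_returns(prices)
print(r)  # ln(101/100), ln(99/101), 0.0
```

The causality measures are then applied to the return series, window by window.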
Windows of w = 400 days, sliding step s = 100 days:
[CGCI, P = 5, FDR at α = 0.05], [RCGCI, P = 5, FDR at α = 0.05], [PMIME, Lmax = 5]
[Figures: networks from CGCI(P = 5), RCGCI(P = 5) and PMIME]
Average strength of PMIME, CGCI and RCGCI

[Figures: w = 400, s = 100 (raw and normalized); w = 300, s = 100 (raw and normalized)]
All networks
Average strength of PMIME, CGCI and
RCGCI
Kugiumtzis Dimitris
Direct information flow and complex networks
All networks
Average strength of PMIME, CGCI and
RCGCI
Kugiumtzis Dimitris
Average strength of autocorrelation
Direct information flow and complex networks
All networks
Average strength of PMIME, CGCI and
RCGCI
Average strength of autocorrelation
Average strength of cross-correlation
Kugiumtzis Dimitris
Direct information flow and complex networks
All networks
• Average strength of PMIME, CGCI and RCGCI
• Average strength of autocorrelation
• Average strength of cross-correlation
• Average strength of partial correlation
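The correlation-based networks can be sketched as follows; this is a minimal illustration assuming lag-0 cross-correlation and precision-matrix partial correlation (the slides do not specify the estimators), with autocorrelation omitted since it is a per-variable, per-lag quantity:

```python
import numpy as np

def correlation_networks(X):
    """Lag-0 cross-correlation and partial-correlation matrices of X (n x K)."""
    X = np.asarray(X, dtype=float)
    R = np.corrcoef(X, rowvar=False)            # cross-correlation matrix
    P = np.linalg.inv(np.cov(X, rowvar=False))  # precision matrix
    d = np.sqrt(np.outer(np.diag(P), np.diag(P)))
    pc = -P / d                                 # partial correlations
    np.fill_diagonal(pc, 1.0)
    return np.abs(R), np.abs(pc)

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5))   # toy data: 500 samples, 5 variables
R, PC = correlation_networks(X)
# Average strength: mean over nodes of summed off-diagonal weights
avg_strength = (R.sum(axis=1) - 1.0).mean()
```

Thresholding or weighting these matrices in each sliding window would give the correlation networks compared against the causality networks above.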
Summary
• Conditional mutual information can be used to capture the
inter-dependence structure of a multivariate complex system /
stochastic process, but ...
when many variables are observed, dimension reduction should be
applied.
• PMIME is model-free and almost parameter-free and can estimate
nonlinear causal effects in the presence of many variables, but ...
it requires longer time series than linear measures using dimension
reduction, and
it requires large computation time.
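For intuition on the building block named in the summary, here is a minimal binned (histogram) estimate of the conditional mutual information I(X; Y | Z). The bin count and toy data are illustrative assumptions; PMIME itself relies on nearest-neighbor estimators with non-uniform embedding, not this simple binning:

```python
import numpy as np

def _entropy(counts):
    """Plug-in Shannon entropy (in nats) from histogram counts."""
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def cmi(x, y, z, bins=8):
    """Binned estimate of I(X; Y | Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)."""
    hxyz, _ = np.histogramdd(np.column_stack([x, y, z]), bins=bins)
    hxz, _ = np.histogramdd(np.column_stack([x, z]), bins=bins)
    hyz, _ = np.histogramdd(np.column_stack([y, z]), bins=bins)
    hz, _ = np.histogram(z, bins=bins)
    return _entropy(hxz) + _entropy(hyz) - _entropy(hz) - _entropy(hxyz)

rng = np.random.default_rng(2)
n = 2000
x = rng.standard_normal(n)
z = rng.standard_normal(n)
y_dep = x + 0.1 * rng.standard_normal(n)   # strongly driven by x
y_ind = rng.standard_normal(n)             # independent of x
cmi_dep = cmi(x, y_dep, z)                 # clearly positive
cmi_ind = cmi(x, y_ind, z)                 # near zero (small positive bias)
```

The estimator's bias and its poor scaling with the dimension of the conditioning set Z are exactly why dimension reduction is needed when many variables are observed.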
Thanks to:
Aristotle University of Thessaloniki

Collaborators in the research team:
• Alkis Tsimpiris, PostDoc
• Christos Koutlis, PostDoc
• Elsa Siggiridou, PhD candidate
• Maria Papapetrou, PhD candidate

Other collaborators:
• Vassilios K. Kimiskidis, Neurologist, Assoc. Prof., Aristotle University
• Paal G. Larsson, Neurophysiologist, Oslo Rikshospitalet
• Aggeliki Papana, PostDoc, University of Macedonia
References
Koutlis C & Kugiumtzis D (2016) “Discrimination of coupling structures using causality networks from multivariate
time series”, Chaos, 26: 093120 Identification of network structures with PMIME
Kugiumtzis D (2012) “Transfer entropy on rank vectors”, Journal of Nonlinear Systems and Applications, 3(2):
73-81 TERV (bivariate)
Kugiumtzis D (2013) “Partial transfer entropy on rank vectors”, The European Physical Journal Special Topics,
222(2): 401-420 PTERV (multivariate)
Kugiumtzis D (2013) “Direct coupling information measure from non-uniform embedding”, Physical Review E, 87:
062918 PMIME (multivariate)
Kugiumtzis D & Kimiskidis VK (2015) “Direct causal networks for the study of transcranial magnetic stimulation
effects on focal epileptiform discharges”, International Journal of Neural Systems, 25: 1550006 TMS-EEG, Focal
epilepsy, connectivity with PMIME
Kugiumtzis D, Koutlis C, Tsimpiris A & Kimiskidis VK (2016) “Dynamics of epileptiform discharges induced by
transcranial magnetic stimulation in genetic generalized epilepsy”, International Journal of Neural Systems, under
revision TMS-EEG, genetic generalized epilepsy, connectivity
Papana A, Kugiumtzis D & Larsson PG (2012) “Detection of direct causal effects and application in the analysis of
electroencephalograms from patients with epilepsy”, International Journal of Bifurcation and Chaos, 22(9):
1250222 EEG, epilepsy, PTE (multivariate)
Papapetrou M & Kugiumtzis D (2013) “Markov chain order estimation with conditional mutual information”,
Physica A, 392: 1593-1601 Randomization CI test for Markov chain order estimation (univariate)
Papapetrou M & Kugiumtzis D (2015) “Markov chain order estimation with parametric significance tests of
conditional mutual information”, Simulation Modelling Practice and Theory, 61: 1-13 Parametric CI tests for
Markov chain order estimation (univariate)
Siggiridou E, Koutlis C, Tsimpiris A, Kimiskidis VK & Kugiumtzis D (2015) “Causality networks from multivariate
time series and application to epilepsy”, in Engineering in Medicine and Biology Society (EMBC), 37th Annual
International Conference of the IEEE, 4041-4044 Large scale comparison of connectivity measures
Siggiridou E & Kugiumtzis D (2016) “Granger causality in multi-variate time series using a time ordered restricted
vector autoregressive model”, IEEE Transactions on Signal Processing, 2016 Dimension reduction on the linear
Granger causality measure (multivariate)
Vlachos I & Kugiumtzis D (2010) “Non-uniform state space reconstruction and coupling detection”, Physical
Review E, 82: 016207 MIME (bivariate)