Chain Graphs: Properties and Learning Algorithms

Mohammad Ali Javidian
Department of Computer Science and Engineering
University of South Carolina
October 23, 2019
Outline
- Introduction
  - Probabilistic Graphical Models
  - A Motivational Example
- CG Interpretations
  - LWF Chain Graphs
  - MVR Chain Graphs
  - AMP Chain Graphs
  - CGs and Symmetric Associations
- On the Properties of MVR CGs
- Separators in CGs
- Learning CGs
  - Order-Independent PC-like Algorithm for Learning MVR CGs
  - Order-dependent PC-like Algorithm for Learning MVR CGs
  - Learning MVR CGs via Decomposition
Probabilistic Graphical Models (PGMs)
A PGM consists of:
- a graph, and
- a joint probability distribution.
Applications: probabilistic reasoning and decision making.
Commonly Used PGMs
- Markov networks
- Bayesian networks
Creating a Graphical Model from Expert Knowledge
Figure: Lappenschaar et al., Qualitative chain graphs and their application, IJAR (2014).
Chain Graphs (CGs)
Chain graphs:
- admit both directed and undirected edges, and
- contain no partially directed cycles.
A partially directed cycle is a sequence of n distinct vertices v1, v2, . . . , vn (n ≥ 3), with vn+1 ≡ v1, such that
- for all i (1 ≤ i ≤ n), either vi − vi+1 or vi → vi+1, and
- there exists a j (1 ≤ j ≤ n) such that vj → vj+1.
Example:
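The no-partially-directed-cycle condition can be checked mechanically: a mixed graph has no partially directed cycle exactly when contracting every connected component of its undirected part (the chain components) leaves an acyclic directed quotient graph. A minimal Python sketch, with an ad-hoc edge-list representation that is an illustration rather than anything from the slides:

```python
def is_chain_graph(nodes, undirected, directed):
    """Check the chain-graph condition: no partially directed cycles.

    Equivalent test: contract each connected component of the
    undirected part into one chain component; the resulting
    directed quotient graph must be acyclic.
    """
    # 1. Union-find over undirected edges -> chain components.
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    for a, b in undirected:
        parent[find(a)] = find(b)

    # 2. Build the quotient directed graph over components.
    succ = {find(v): set() for v in nodes}
    for a, b in directed:            # a -> b
        ca, cb = find(a), find(b)
        if ca == cb:                 # directed edge inside a component
            return False             # => partially directed cycle
        succ[ca].add(cb)

    # 3. Cycle check on the quotient via iterative DFS (3-coloring).
    color = {c: 0 for c in succ}     # 0 = white, 1 = grey, 2 = black
    for start in succ:
        if color[start]:
            continue
        stack = [(start, iter(succ[start]))]
        color[start] = 1
        while stack:
            node, it = stack[-1]
            nxt = next(it, None)
            if nxt is None:
                color[node] = 2
                stack.pop()
            elif color[nxt] == 1:
                return False         # back edge => directed cycle
            elif color[nxt] == 0:
                color[nxt] = 1
                stack.append((nxt, iter(succ[nxt])))
    return True
```

For instance, a → b together with b − c and c → a forms a partially directed cycle, so that graph is rejected.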
LWF Chain Graphs
Global Markov property: for all A, B, C ⊆ V such that C separates A and B in (G_{An(A∪B∪C)})^m (the moral graph of the subgraph induced by the ancestral set of A ∪ B ∪ C), we have A ⊥⊥ B | C.
Example: a ⊥⊥ b, a ⊥⊥ d | {b, c}, and b ⊥⊥ c | {a, d}.
Figure: The LWF CG G with chain components T = {T1 = {a}, T2 = {b}, T3 = {c, d}}.
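The property above turns into a small separation test: build the moral graph of the ancestral subgraph, delete C, and check connectivity. The sketch below assumes the example CG is a → c, b → d with c − d (the figure itself did not survive extraction, but this graph matches the listed independencies):

```python
from collections import deque
from itertools import combinations


def ancestral_set(targets, directed, undirected):
    """Vertices with a descending path (edges -> or -) into `targets`."""
    back = {}
    for a, b in directed:                # a -> b : reach a from b
        back.setdefault(b, set()).add(a)
    for a, b in undirected:              # a - b : both directions
        back.setdefault(b, set()).add(a)
        back.setdefault(a, set()).add(b)
    seen, queue = set(targets), deque(targets)
    while queue:
        v = queue.popleft()
        for u in back.get(v, ()):
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return seen


def lwf_separated(A, B, C, directed, undirected):
    """A separated from B given C by the LWF global Markov property."""
    keep = ancestral_set(set(A) | set(B) | set(C), directed, undirected)
    dirs = [(a, b) for a, b in directed if a in keep and b in keep]
    undirs = [(a, b) for a, b in undirected if a in keep and b in keep]
    # chain components of the ancestral subgraph = undirected connectivity
    comp = {v: v for v in keep}

    def find(v):
        while comp[v] != v:
            comp[v] = comp[comp[v]]
            v = comp[v]
        return v

    for a, b in undirs:
        comp[find(a)] = find(b)
    # moralize: marry all parents of each chain component, drop directions
    parents = {}
    for a, b in dirs:
        parents.setdefault(find(b), set()).add(a)
    moral = {v: set() for v in keep}

    def link(x, y):
        moral[x].add(y)
        moral[y].add(x)

    for a, b in dirs + undirs:
        link(a, b)
    for ps in parents.values():
        for x, y in combinations(sorted(ps), 2):
            link(x, y)
    # u-separation: remove C, then BFS from A
    blocked = set(C)
    seen = {a for a in A if a not in blocked}
    queue = deque(seen)
    while queue:
        v = queue.popleft()
        for u in moral[v]:
            if u not in blocked and u not in seen:
                seen.add(u)
                queue.append(u)
    return not (seen & set(B))
```

On the assumed graph this confirms a ⊥⊥ b and a ⊥⊥ d | {b, c}, while a and d become connected given {c} alone (moralization marries the parents a and b of the chain component {c, d}).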
Multivariate Regression CGs (MVR CGs)
Global Markov property: for all A, B, C ⊆ V such that C separates A and B in (G_{An(A∪B∪C)})^a (the augmented graph of the subgraph induced by the ancestral set of A ∪ B ∪ C), we have A ⊥⊥ B | C.
Example: a ⊥⊥ b, a ⊥⊥ d, and b ⊥⊥ c.
Figure: The MVR CG G with chain components T = {T1 = {a}, T2 = {b}, T3 = {c, d}}.
AMP Chain Graphs
Global Markov property: for all A, B, C ⊆ V such that C separates A and B in (G[A ∪ B ∪ C])^a, we have A ⊥⊥ B | C.
Example: a ⊥⊥ b, a ⊥⊥ b | c, a ⊥⊥ b | d, a ⊥⊥ d, a ⊥⊥ d | b, b ⊥⊥ c, and b ⊥⊥ c | a.
Figure: The AMP CG G with chain components T = {T1 = {a}, T2 = {b}, T3 = {c, d}}.
Chain Graphs and Symmetric Associations
- MVR CGs: there is an unmeasured common cause of both X and Y.
- LWF CGs: there is feedback between X and Y; they arrive at a stochastic equilibrium as time goes to infinity.
- AMP CGs: X and Y are both causes of some 'selection' variable.
Creating a CG from Expert Knowledge
Figure: Lappenschaar et al., Qualitative chain graphs and their application, IJAR (2014).
Are the Proposed Markov Properties of MVR CGs Equivalent? (Javidian & Valtorta, WPGM 2018)

Theorem
Let G be an MVR chain graph. For an independence model over the node set of G, the following conditions are equivalent:
(i) it satisfies the global Markov property (based on the pathwise m-separation criterion) w.r.t. G (Richardson & Spirtes, 2002);
(ii) it satisfies the global Markov property (based on the augmentation separation criterion) w.r.t. G (Richardson & Spirtes, 2002);
(iii) it satisfies the block-recursive Markov property w.r.t. G (Drton, 2009);
(iv) it satisfies the MR Markov property w.r.t. G (Marchetti & Lupparelli, 2011);
(v) it satisfies the ordered local Markov property w.r.t. G (Richardson, 2003).
An Alternative Factorization for MVR Chain Graphs
Factorization [(Drton, 2009) and (Marchetti & Lupparelli, 2011)]:
p(x) = ∏_{T ∈ 𝒯} p(x_T | x_{pa_D(T)}).

Theorem (Javidian & Valtorta, WPGM 2018)
Let G be an MVR chain graph with chain components (T | T ∈ 𝒯). If a probability distribution P obeys the global Markov property for G, then
p(x) = ∏_{T ∈ 𝒯} p(x_T | x_{pa_G(T)}).

Example: p = p(x1234 | x56) p(x56 | x7) p(x7) vs. p = p(x1234 | x5) p(x56 | x7) p(x7).
Figure: An MVR CG with T = {T1 = {1, 2, 3, 4}, T2 = {5, 6}, T3 = {7}}.
Main Theorems: Theorem 1 (Javidian & Valtorta, PGM 2018 & WUAI 2018; Javidian, Valtorta & Jamshidi, JMLR 2019)

Theorem
The problem of finding a minimal separating set for X and Y in an LWF/MVR/AMP chain graph G is equivalent to the problem of finding a minimal separating set for X and Y in the undirected graph (G_{An(X∪Y)})^m / (G_{An(X∪Y)})^a / (G_{ant(X∪Y)})^a, respectively.
Main Theorems: Theorem 2

Theorem
Given two nodes X and Y in an LWF/MVR chain graph G and a set S of nodes not containing X and Y, there exists some subset of S that separates X and Y if and only if the set S′ = S ∩ An(X ∪ Y) separates X and Y.

Note that in the case of AMP CGs, S′ = S ∩ ant(X ∪ Y).
Algorithm for Finding a Minimal Separator

Algorithm 1: Restricted separation
Input: A set S of nodes not containing X and Y in the LWF chain graph G.
Output: If some subset of S separates X from Y, the algorithm returns Z ⊆ S that separates X from Y; otherwise it returns FALSE.
1 Construct G_{An(X∪Y)};
2 Construct (G_{An(X∪Y)})^m;
3 Set S′ = S ∩ An(X ∪ Y);
4 Remove S′ from (G_{An(X∪Y)})^m;
5 Starting from X, run BFS;
6 if Y is met then
7   return FALSE
8 else
9   return Z = S′
10 end
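A minimal Python sketch of the restricted-separation idea, assuming the moral graph (G_{An(X∪Y)})^m has already been computed and is passed in as an undirected adjacency dict together with the ancestral set An(X ∪ Y):

```python
from collections import deque


def restricted_separation(X, Y, S, moral, ancestors):
    """Restricted separation sketch: `moral` is the undirected moral
    graph of the ancestral set of {X, Y} as an adjacency dict, and
    `ancestors` is An(X ∪ Y). Returns a separating Z ⊆ S, or False."""
    Z = set(S) & set(ancestors)        # step 3: restrict S to An(X ∪ Y)
    seen, queue = {X}, deque([X])      # step 5: BFS from X, skipping Z
    while queue:
        v = queue.popleft()
        for u in moral.get(v, ()):
            if u == Y:
                return False           # Y reachable => no subset of S works
            if u not in Z and u not in seen:
                seen.add(u)
                queue.append(u)
    return Z
```

Example use, with the moral graph of the four-node LWF example (edges a−b, a−c, b−d, c−d): {b, c} is returned as a separator of a and d, while {b} alone is not enough.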
PC-Like Algorithm for Learning MVR CGs (Sonntag & Peña, 2012)

Figure: The procedure of learning the structure of an essential MVR chain graph from a faithful distribution: observational data → skeleton recovery → v-structure recovery → essential graph recovery (illustrated on a five-node example with nodes a, b, c, d, e).
Markov Equivalence Criterion
The skeleton (underlying graph) of an MVR CG G is obtained
from G by changing all directed/bidirected edges of G into
undirected edges.
A v-structure in a chain graph is an induced subgraph of the
form a → c ← b.
Theorem
Two MVR chain graphs G and H are Markov equivalent if and only
if they have the same skeletons and the same v-structures
(Wermuth & Sadeghi, 2012).
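The criterion suggests a direct equivalence test: compare skeletons and unshielded colliders. A sketch, assuming a mark-based edge representation (u, mark_u, mark_v, v) with '>' for an arrowhead and '-' for a tail (this encoding is an illustration, not from the slides):

```python
def skeleton(edges):
    """Undirected adjacency from a mark-based edge list.
    u -> v is (u, '-', '>', v); u <-> v is (u, '>', '>', v)."""
    adj = {}
    for u, mu, mv, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return adj


def v_structures(edges):
    """Unshielded colliders a *-> c <-* b with a, b nonadjacent."""
    adj = skeleton(edges)
    heads = {}   # node -> set of neighbors with an arrowhead at that node
    for u, mu, mv, v in edges:
        if mv == '>':
            heads.setdefault(v, set()).add(u)
        if mu == '>':
            heads.setdefault(u, set()).add(v)
    vs = set()
    for c, into in heads.items():
        for a in into:
            for b in into:
                if a < b and b not in adj[a]:
                    vs.add((a, c, b))
    return vs


def markov_equivalent(g1, g2):
    """Wermuth & Sadeghi (2012): same skeleton and same v-structures."""
    return (skeleton(g1) == skeleton(g2)
            and v_structures(g1) == v_structures(g2))
```

For example, a → c ← b and a ↔ c ← b are equivalent (same skeleton, same collider at c), while a → c → b has the same skeleton but no collider, so it is not.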
Skeleton Recovery
Algorithm 2: Skeleton recovery of MVR chain graphs
Input: A set V of nodes and a probability distribution p faithful to an unknown MVR CG G.
Output: The skeleton of the corresponding MVR CG G.
1 Let H denote the complete undirected graph over V;
2 for i ← 0 to |V_H| − 2 do
3   while possible do
4     Select any ordered pair of nodes A and B in H such that A ∈ ad_H(B) and |ad_H(A) \ {B}| ≥ i;
5     if there exists S ⊆ (ad_H(A) \ {B}) s.t. |S| = i and A ⊥⊥_p B | S (i.e., A is independent of B given S in the probability distribution p) then
6       Set S_AB = S_BA = S;
7       Remove the edge A − B from H;
8     end
9   end
10 end
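The algorithm can be sketched as follows, with the CI tests abstracted into an oracle `indep(a, b, S)` (in practice a statistical test on data). As in the original PC algorithm, the output may depend on the order in which pairs are visited:

```python
from itertools import combinations


def recover_skeleton(nodes, indep):
    """PC-like skeleton recovery sketch.
    `indep(a, b, S)` is a conditional-independence oracle.
    Returns the learned adjacency dict and the separation sets."""
    adj = {v: set(nodes) - {v} for v in nodes}   # complete graph
    sep = {}
    i = 0
    # keep raising the separator size i while some pair still qualifies
    while any(len(adj[a] - {b}) >= i for a in nodes for b in adj[a]):
        for a in nodes:
            for b in list(adj[a]):
                if len(adj[a] - {b}) < i:
                    continue
                for S in combinations(sorted(adj[a] - {b}), i):
                    if indep(a, b, set(S)):
                        sep[(a, b)] = sep[(b, a)] = set(S)
                        adj[a].discard(b)
                        adj[b].discard(a)
                        break
        i += 1
    return adj, sep
```

With an oracle encoding only a ⊥⊥ c | {b} (a Markov chain a − b − c), the edge a − c is removed at level i = 1 and the chain skeleton remains.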
Skeleton Recovery: Example
Given order: a, b, c, d, e.
Start with the complete graph H = K5. The recorded separators all have size one, so no edges are removed at level i = 0; at level i = 1 the CI tests a ⊥⊥ d | {b}, a ⊥⊥ e | {c}, b ⊥⊥ e | {c}, and d ⊥⊥ e | {c} remove the corresponding edges.
Figure: the true MVR CG G over {a, b, c, d, e}; the graph after i = 0 (still K5); and the learned skeleton after i = 1.
v-Structure Recovery

Algorithm 3: Pattern recovery of MVR chain graphs
Input: H, the skeleton of the MVR CG G.
Output: The pattern of the MVR CG G.
1 for each m-separator S_uv do
2   if u ◦− w −◦ v appears in the skeleton and w is not in S_uv then
      /* u ◦− w means u ← w or u − w. Also, w −◦ v means w → v or w − v. */
3     Determine a v-structure u ◦→ w ←◦ v;
4   end
5 end
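Algorithm 3 reduces to scanning unshielded triples. A sketch, assuming the skeleton is an adjacency dict and the separators come from the skeleton-recovery phase; arrowheads are recorded as (tail, head) pairs instead of the ◦ marks of the pseudocode:

```python
def recover_v_structures(adj, sep):
    """Orient unshielded triples u - w - v as colliders u -> w <- v
    whenever the middle node w lies outside the separator S_uv.
    `adj` is the learned skeleton; `sep` maps (u, v) pairs to S_uv."""
    arrows = set()                       # arrowhead marks (tail, head)
    for (u, v), S in sep.items():
        # common neighbors w of the separated pair form triples u-w-v
        for w in adj.get(u, set()) & adj.get(v, set()):
            if w not in S:
                arrows.add((u, w))
                arrows.add((v, w))
    return arrows
```

For the collider skeleton a − c − b with a ⊥⊥ b | ∅, c is outside the empty separator, so both arrowheads point into c.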
v-Structure Recovery: Example
Given order: a, b, c, d, e.
After skeleton recovery (start with H = K5; at i = 1 remove edges using a ⊥⊥ d | {b}, a ⊥⊥ e | {c}, b ⊥⊥ e | {c}, and d ⊥⊥ e | {c}), consider the pair a, d: since S_ad = {b} and c ∉ S_ad, the unshielded triple a ◦− c −◦ d is oriented as the v-structure a ◦→ c ←◦ d.
Figure: the true MVR CG G, the learned skeleton, and the pattern after v-structure recovery (node set {a, b, c, d, e}).
Essential Graph Recovery

Algorithm 4: Essential graph recovery of MVR chain graphs
Input: The pattern of the MVR CG G.
Output: The essential graph of the MVR CG G.
1 Apply rules 1-3 (Sonntag & Peña, 2012) in the following figure while possible;
Essential Graph Recovery: Example
Applying rules 1 and 3 to the learned pattern yields the essential graph.
Figure: the pattern obtained from v-structure recovery, and the essential graph obtained by applying rules 1 and 3 (node set {a, b, c, d, e}).
Ex.: Order-Dependent Skeleton of the PC-like Algorithm

Figure: (a) the DAG G with minimal sep. sets a ⊥⊥ d | {b, c} and a ⊥⊥ e | {b, c}; (b) the skeleton returned by OPC with order1(V) = (d, e, a, c, b).

Table: The trace table of OPC for i = 3 and order1(V) = (d, e, a, c, b).

Ordered pair (u, v) | ad_H(u)      | S_uv      | Is S_uv ⊆ ad_H(u) \ {v}? | Is u − v removed?
(e, a)              | {a, b, c, d} | {b, c, d} | Yes                      | Yes
(e, c)              | {b, c, d}    | {a, b, d} | No                       | No
(c, e)              | {a, b, d, e} | {a, b, d} | Yes                      | Yes
Ex.: Order-Dependent Skeleton of the PC-like Algorithm

Figure: (a) the DAG G with minimal sep. sets a ⊥⊥ d | {b, c} and a ⊥⊥ e | {b, c}; (c) the skeleton returned by OPC with order2(V) = (d, c, e, a, b).

Table: The trace table of OPC for i = 3 and order2(V) = (d, c, e, a, b).

Ordered pair (u, v) | ad_H(u)      | S_uv      | Is S_uv ⊆ ad_H(u) \ {v}? | Is u − v removed?
(c, e)              | {a, b, d, e} | {a, b, d} | Yes                      | Yes
(e, a)              | {a, b, d}    | {b, c, d} | No                       | No
(a, e)              | {b, c, e}    | {b, c, d} | No                       | No
Ex.: Order-Dependent Skeleton of the PC-like Algorithm

Figure: (a) the DAG G with minimal sep. sets a ⊥⊥ d | {b, c} and a ⊥⊥ e | {b, c}; (b) the skeleton returned by OPC with order1(V) = (d, e, a, c, b); (c) the skeleton returned by OPC with order2(V) = (d, c, e, a, b).
Algorithm 5: The order-independent (stable) skeleton recovery of MVR CGs (Javidian et al., SUM 2019)
Input: A set V of nodes and a probability distribution p faithful to an unknown MVR CG G, and an ordering order(V) on the variables.
Output: The skeleton of the corresponding MVR CG G.
1 Let H denote the complete undirected graph over V = {v1, . . . , vn};
2 for i ← 0 to |V_H| − 2 do
3   for j ← 1 to |V_H| do
4     Set a_H(vj) = ad_H(vj);
5   end
6   while possible do
7     Select any ordered pair of nodes u and v in H such that u ∈ a_H(v) and |a_H(u) \ {v}| ≥ i, using order(V);
8     if there exists S ⊆ (a_H(u) \ {v}) s.t. |S| = i and u ⊥⊥_p v | S (i.e., u is independent of v given S in the probability distribution p) then
9       Set S_uv = S_vu = S;
10      Remove the edge u − v from H;
11    end
12  end
13 end
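The stable variant differs from Algorithm 2 only in freezing the adjacency sets a_H(v), and hence the candidate separator sets, at the start of each level i, so that edge deletions made during a level cannot influence later tests at the same level. A sketch with the same CI-oracle abstraction as before:

```python
from itertools import combinations


def stable_skeleton(nodes, indep):
    """Order-independent (stable) skeleton recovery sketch: at each
    level i the adjacency sets are frozen before any edge is deleted,
    so the output no longer depends on the variable ordering."""
    adj = {v: set(nodes) - {v} for v in nodes}   # complete graph
    sep = {}
    for i in range(len(nodes) - 1):              # i = 0 .. |V| - 2
        frozen = {v: set(adj[v]) for v in nodes}  # a_H(v), fixed per level
        for u in nodes:
            for v in list(adj[u]):
                if len(frozen[u] - {v}) < i:
                    continue
                for S in combinations(sorted(frozen[u] - {v}), i):
                    if indep(u, v, set(S)):
                        sep[(u, v)] = sep[(v, u)] = set(S)
                        adj[u].discard(v)
                        adj[v].discard(u)
                        break
    return adj, sep
```

Running the sketch under two different node orderings returns the same skeleton, which is the point of the modification.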
Ex.: Order-Independent Skeleton of the PC-like Algorithm
After CI tests at some significance level α: a ⊥⊥ d | {b, c}, a ⊥⊥ e | {b, c, d}, and c ⊥⊥ e | {a, b, d}. When i = 3: a_H(a) = a_H(d) = {b, c, e}, and a_H(v) = V \ {v} for v ∈ {b, c, e}.

Figure: (a) the DAG G with minimal sep. sets a ⊥⊥ d | {b, c} and a ⊥⊥ e | {b, c}; (b) the skeleton returned by the stable OPC under both orderings order1(V) = (d, e, a, c, b) and order2(V) = (d, c, e, a, b).
Evaluation: OPC vs. SPC

Figure: Performance of OPC vs. SPC. Top row (P = 50, N = 2): TDR, TPR, and SHD as a function of sample size (500, 1000, 5000, 10000). Bottom row (P = 1000, N = 2): TDR, TPR, and SHD as a function of the significance level (p-values 0.05, 0.01, 0.005, 0.001).
A Decomposition-Based Algorithm for Learning MVR
CGs (Javidian & Valtorta, submitted to IJAR, 2019)
Local Skeleton Recovery from an m-Separation Tree

Theorem
Let T be an m-separation tree for a CG G. Vertices u and v are m-separated by some S ⊆ V in G if and only if (i) u and v are not contained together in any node C of T, or (ii) there exists a node C of T containing both u and v such that some subset S of C m-separates u and v.
Algorithm 6: Pattern recovery algorithm for MVR chain graphs
Input: An m-separation tree T with node set C = {C1, . . . , CH}.
Output: The pattern of the MVR CG G.
1 Set S = ∅;
2 for h ← 1 to H do
3   Start from a complete undirected graph Ḡ_h with vertex set C_h;
4   for each vertex pair {u, v} ⊆ C_h do
5     if ∃ S_uv ⊆ C_h such that u ⊥⊥ v | S_uv then
6       Delete the edge (u, v) in Ḡ_h;
7       Add S_uv to S;
8     end
9   end
10 end
11 Initialize the edge set Ē_V of Ḡ_V as the union of all edge sets of Ḡ_h, h = 1, . . . , H;
12 for each vertex pair {u, v} contained in more than one tree node and with (u, v) ∈ Ḡ_V do
13   if ∃ C_h such that {u, v} ⊆ C_h and {u, v} ∉ Ē_h then
14     Delete the edge (u, v) in Ḡ_V;
15   end
16 end
17 for each m-separator S_uv in the list S do
18   if u ◦− w −◦ v appears in the global skeleton and w is not in S_uv then
      /* u ◦− w means u ← w or u − w. Also, w −◦ v means w → v or w − v. */
19    Determine a v-structure u ◦→ w ←◦ v;
20  end
21 end
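Steps 1-16 can be sketched as follows. The m-separation tree is given simply as a list of node sets, and the CI tests are abstracted into an oracle; the final v-structure scan (lines 17-21) mirrors Algorithm 3 and is omitted to keep the sketch short:

```python
from itertools import combinations


def decomposition_skeleton(tree_nodes, indep):
    """Decomposition-based skeleton sketch: learn a local skeleton
    inside every node C_h of an m-separation tree, take the union,
    then delete any edge {u, v} absent from some tree node that
    contains both endpoints. Returns the skeleton and the separators."""
    local_skeletons, seps = [], {}
    for Ch in tree_nodes:
        edges = {frozenset(p) for p in combinations(sorted(Ch), 2)}
        for u, v in combinations(sorted(Ch), 2):
            rest = set(Ch) - {u, v}
            for i in range(len(rest) + 1):
                S = next((set(c) for c in combinations(sorted(rest), i)
                          if indep(u, v, set(c))), None)
                if S is not None:
                    edges.discard(frozenset({u, v}))
                    seps[(u, v)] = seps[(v, u)] = S
                    break
        local_skeletons.append((set(Ch), edges))
    # union of the local skeletons ...
    skel = set().union(*(e for _, e in local_skeletons))
    # ... minus edges missing in some tree node containing both endpoints
    skel = {e for e in skel
            if all(e in edges for Ch, edges in local_skeletons if e <= Ch)}
    return skel, seps
```

Note the locality: with tree nodes {a, b} and {b, c}, the pair (a, c) is never tested at all, which is where the computational savings come from.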
Essential MVR CG Recovery
Apply rules 1-3 in the following figure while possible:
Figure: The rules for converting a pattern to an essential MVR CG (Sonntag & Peña, 2012).
Evaluation: PC vs. LCD

Figure: Performance of the PC-like and LCD algorithms as a function of sample size (s = 300, 1000, 3000, 10000) for α = 0.05, 0.01, and 0.005: TPR, FPR, ACC, and SHD (number of differences).
Summary
- We studied all Markov properties proposed in the literature for MVR chain graphs and proved that they are all equivalent (except for the pairwise Markov properties).
- We proposed a new factorization formula for MVR CGs that is more concise and informative than the one proposed in the literature.
- We proposed an efficient algorithm for finding minimal separators in chain graphs under all three interpretations.
- We proposed a PC-like algorithm for structure learning of LWF CGs, as well as an order-independent version of the PC-like algorithm for all three interpretations.
- We proposed a decomposition-based algorithm for learning the structure of MVR/AMP CGs.
List of publications for each chapter
LWF CGs:
- Finding Minimal Separators in LWF Chain Graphs (Javidian & Valtorta, PGM 2018) [section 2.3].
- Learning LWF Chain Graphs: An Order-Independent Algorithm (Javidian, Valtorta & Jamshidi, submitted to AISTATS 2020) [section 2.4].
MVR CGs:
- On the Properties of MVR Chain Graphs (Javidian & Valtorta, WPGM 2018) [section 3.2].
- Finding Minimal Separators in MVR Chain Graphs (Javidian & Valtorta, WUAI 2018) [section 3.3].
- Order-Independent Structure Learning of Multivariate Regression Chain Graphs (Javidian, Valtorta & Jamshidi, SUM 2019) [section 3.4].
- A Decomposition-Based Algorithm for Learning the Structure of MVR Chain Graphs (Javidian & Valtorta, submitted to IJAR, 11 Apr 2019) [section 3.5].
AMP CGs:
- AMP CGs: Minimal Separators and Structure Learning Algorithms (Javidian, Valtorta & Jamshidi, submitted to JMLR, 23 Jul 2019).
Bayesian hypergraphs:
- On a Hypergraph Probabilistic Graphical Model (Javidian, Wang, Lu & Valtorta, submitted to Ann Math Artif Intell, 19 Apr 2019) [Chapter 5].
- The Causal Interpretations of Bayesian Hypergraphs (Wang, Javidian, Lu & Valtorta, AAAI Spring Symposium 2019) [section 5.4].
Causal transfer learning:
- Transfer Learning for Performance Modeling of Configurable Systems: A Causal Analysis (Javidian, Jamshidi & Valtorta, AAAI Spring Symposium 2019) [Chapter 6].
List of other publications
- Mohammad Ali Javidian, Pooyan Jamshidi, and Rasoul Ramezanian. "Avoiding Social Disappointment in Elections." Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems (AAMAS 2019, Montreal), May 13-17, 2019, pages 2039-2041.
- Mohammad Ali Javidian, Marco Valtorta, and Pooyan Jamshidi. "Learning Bayesian Networks in the Presence of Unmeasured Confounders: An Efficient Approach Based on Markov Blankets." (submitted to AAAI 2020).
AMP CGs and Symmetric Associations
- X and Y are both causes of some 'selection' variable.
- Every AMP CG is Markov equivalent to some DAG with error and selection nodes, under marginalization of the error nodes and conditioning on the selection nodes (Peña, 2014).
Figure: I_AMP(G) obtained from I_AMP(G′) by marginalization (of the error nodes) and conditioning (on the selection nodes S).