Interactive Learning of Bayesian Networks
Andrés Masegosa, Serafín Moral
Department of Computer Science and Artificial Intelligence - University of
Granada
andrew@decsai.ugr.es
Utrecht, July 2012
Motivation The Bayesian Framework
Outline
1 Motivation
2 The Bayesian Framework
3 Interactive integration of expert knowledge for model
selection
4 Applications:
Learning the parent set of a variable in a BN.
Learning Markov Boundaries.
Learning complete Bayesian Networks.
5 Conclusions
Utrecht (Netherlands)
Bayesian Networks
Excellent models to graphically represent the dependency structure (Markov condition and d-separation) of the underlying distribution in multivariate domain problems: a very relevant source of knowledge.
Inference tasks: computing marginals, evidence propagation, abductive-inductive reasoning, etc.
Learning Bayesian Networks from Data
Learning Algorithms
Learning Bayesian networks from data is quite challenging: the DAG space is super-exponential.
There are usually several models which explain the data similarly well.
Bayesian framework: look for models with high posterior probability given the data.
Learning Bayesian Networks from Data
Uncertainty in Model Selection
This situation is especially common in problem domains with a high number of variables and low sample sizes.
Integration of Domain/Expert Knowledge
Find the best statistical model by combining data and expert/domain knowledge.
This is an emerging approach in gene expression data mining:
There is a growing number of biological knowledge databases.
The model space is usually huge and the number of samples is low (they are costly to collect).
Many approaches have shifted from being purely data-oriented to trying to include domain knowledge.
Integration of Expert Knowledge
Previous Works
There have been many attempts to introduce expert knowledge when learning BNs from data.
Via prior distributions: use specific prior distributions over the possible graph structures to integrate expert knowledge:
The expert assigns higher prior probabilities to the most likely edges.
Via structural restrictions: the expert codifies his/her knowledge as structural restrictions:
The expert defines the existence/absence of arcs and/or edges and causal ordering restrictions.
The retrieved model should satisfy these restrictions.
Integration of Domain/Expert Knowledge
Limitations of "Prior" Expert Knowledge:
We are forced to include expert/domain knowledge for each of the elements of the model (e.g. for every possible edge of a BN).
The expert could be biased towards providing the "clearest" knowledge, which may also be the easiest to find in the data.
The system does not help the user to introduce information about the BN structure.
Integration of Domain/Expert Knowledge
Interactive Integration of Expert Knowledge
The data is analyzed first.
The system only queries the expert about the most uncertain elements, considering the information present in the data.
Benefits:
The expert is only asked a small number of questions.
We explicitly show the expert the elements for which the data do not provide enough evidence to make a reliable model selection.
Integration of Domain/Expert Knowledge
Active Interaction with the Expert
Strategy: ask the expert about the presence of the edges that most reduce the model uncertainty.
Method: a framework that allows an efficient and effective interaction with the expert.
The expert is only asked about these controversial structural features.
Notation
Let us denote by X = (X1, ..., Xn) a vector of random variables and by D a fully observed set of instances x = (x1, ..., xn) ∈ Val(X).
Let M be a model and M the set of all possible models. M may define:
A joint probability distribution: P(X|M), in the case of a Bayesian network.
A conditional probability distribution for a target variable: P(T|X, M), in the case of a Markov blanket.
Examples of Models
Each model M is structured:
It is defined by a vector of elements M = (m1, ..., mK), where K is the number of possible components of M.
Examples (figures omitted):
Bayesian network over X1, ..., X4: M = (1, 0, 0, 1, 0, -1)
Markov blanket of target T over X1, ..., X5: M = (1, 1, 0, 0, 1)
The Model Selection Problem: Bayesian Framework
Define a prior probability P(M) over the space of alternative models.
It is not the classic uniform prior (due to the multiplicity problem).
For each model, its Bayesian score is computed:
score(M|D) = P(D|M)P(M)
where P(D|M) = ∫ P(D|θ, M)P(θ|M) dθ is the marginal likelihood of the model.
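As an illustrative sketch (not the authors' code), the marginal likelihood in the score above has a closed form in the simplest conjugate case: a single Bernoulli variable with a Beta prior over its parameter. The function names and the Beta(1, 1) default are our own choices:

```python
from math import lgamma, exp

def log_marginal_likelihood(heads, tails, alpha=1.0, beta=1.0):
    """Closed-form log P(D|M) for a Bernoulli variable with a Beta(alpha, beta)
    prior on its parameter theta: the integral over theta reduces to a ratio
    of Beta functions, written here with log-gamma for numerical stability."""
    return (lgamma(alpha + beta) - lgamma(alpha) - lgamma(beta)
            + lgamma(alpha + heads) + lgamma(beta + tails)
            - lgamma(alpha + beta + heads + tails))

def log_score(heads, tails, log_prior):
    """Bayesian score in log space: log P(D|M) + log P(M)."""
    return log_marginal_likelihood(heads, tails) + log_prior
```

For instance, one head and one tail give P(D|M) = 1/6 under the uniform Beta(1, 1) prior.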
Benefits of the Full Bayesian Approach
We assume that we are able to approximate the posterior by means of a Monte Carlo method:
P(M|D) = P(D|M)P(M) / Σ_{M′ ∈ Val(M)} P(D|M′)P(M′)
We can compute the posterior probability of any of the elements of a model:
P(mi|D) = Σ_M P(M|D) I_M(mi)
where I_M(mi) is the indicator function: 1 if mi is present in M and 0 otherwise.
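A minimal sketch of the Monte Carlo estimate of P(mi|D): given sampled models (as 0/1 vectors over the structural elements) and their unnormalized weights P(D|M)P(M), the indicator sum above becomes a weighted average. All names here are hypothetical:

```python
def element_posterior(samples, weights, k):
    """Estimate P(m_k = 1 | D) from Monte Carlo samples of models.
    samples: list of 0/1 tuples, one entry per structural element m_k;
    weights: unnormalized posterior weights P(D|M)P(M) of each sample."""
    total = sum(weights)
    return sum(w for m, w in zip(samples, weights) if m[k] == 1) / total
```

With three sampled models (1,0), (1,1), (0,1) and weights 1, 1, 2, this gives P(m0|D) = 0.5 and P(m1|D) = 0.75.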
Examples
Highly probable edges: P(B → X|D) = 1.0 and P(A → X|D) = 0.8
Edges with low probability: P(C → X|D) = 0.0
Uncertain edges: P(D → X|D) = 0.55
Interactive Integration of Expert/Domain Knowledge:
Expert/domain knowledge is given for particular elements mk of the models M:
Whether a variable is present or not in the final variable selection.
Whether there is an edge between any two variables in a BN.
Expert/domain knowledge may not be fully reliable.
Our Approach for Expert Interaction
[Figure: influence diagram linking the decision Askk, the model M, the data D, the expert reliability E, the expert answer mk and the utility U.]
E represents the expert's reliability: Ω(E) = {Right, Wrong}.
If the expert is wrong, we assume a random answer.
Our goal is to infer the model structure:
U(Askk, mk, M, D) = log P(M|mk, Askk, D)
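The "random answer when wrong" assumption can be sketched as a simple answer likelihood. Here `p_right` stands for P(E = Right) and is an illustrative parameter, not a value from the slides:

```python
def answer_likelihood(answer, true_mk, p_right=0.9, n_values=2):
    """P(expert answers `answer` | true element value, reliability).
    With probability p_right the expert reports the true value; otherwise
    the answer is uniform over the n_values possible values, i.e. the
    'random answer when wrong' assumption."""
    p = p_right * (1.0 if answer == true_mk else 0.0)
    return p + (1.0 - p_right) / n_values
```

So a binary question with p_right = 0.9 yields likelihood 0.95 for a correct-looking answer and 0.05 otherwise.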
Our Approach for Expert Interaction
The expected utilities of asking and not asking are:
V(Askk) = Σ_{mk} Σ_M P(M|D) P(mk|M, Askk, D) log P(M|mk, Askk, D)
V(¬Askk) = Σ_M P(M|D) log P(M|D)
The difference between both actions, V(Askk) − V(¬Askk), is the information gain function:
IG(M : mk|D) = H(M|D) − Σ_{mk} P(mk|D) H(M|mk, D)
It can be shown that the element with the highest information gain is the one with the highest entropy:
m⋆k = argmaxk IG(M : mk|D) = argmaxk [H(mk|D) − H(Ek)]
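Ignoring the constant H(Ek) term (the expert-reliability entropy is the same for every question in this sketch), selecting the highest-entropy element reduces to a few lines over the posterior edge probabilities; the function names are our own:

```python
from math import log

def entropy(p):
    """Binary entropy H(m_k|D) in nats; zero at the endpoints."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log(p) - (1 - p) * log(1 - p)

def next_query(edge_posteriors):
    """Pick the structural element with the highest entropy, i.e. the most
    controversial edge, as the next question for the expert."""
    return max(edge_posteriors, key=lambda k: entropy(edge_posteriors[k]))
```

With the Example I posteriors {A → X: 0.8, B → X: 0.455, C → X: 0.75}, the selected query is B → X, the edge closest to probability 0.5.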
Our Approach for Expert Interaction
1 Approximate P(M|D) by means of a Monte Carlo technique.
2 Ask the expert about the element mk with the highest entropy.
a = a ∪ a(mk).
Update P(M|D, a), which is equivalent to:
P(M|D, a) ∝ P(D|M)P(M|a)
3 Stop condition: H(mk|D, a) < λ.
4 Otherwise:
Option 1: Go to Step 2 and ask the expert again.
Option 2: Go to Step 1 and sample now using P(M|a).
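The steps above can be sketched as a loop (Option 1). One simplification to flag: instead of re-approximating P(M|D, a) by sampling, this sketch simply clamps the queried element to the expert's answer, and `oracle` is a hypothetical callable standing in for the expert:

```python
from math import log

def entropy(p):
    """Binary entropy in nats; zero at the endpoints."""
    return 0.0 if p <= 0.0 or p >= 1.0 else -p * log(p) - (1 - p) * log(1 - p)

def interactive_loop(posteriors, oracle, lam=0.3):
    """Simplified interaction loop: repeatedly ask the expert about the
    highest-entropy element, record the answer a(m_k), and stop once every
    remaining element satisfies the stop condition H(m_k|D, a) < lam."""
    answers = {}
    posteriors = dict(posteriors)  # work on a copy
    while True:
        k = max(posteriors, key=lambda e: entropy(posteriors[e]))
        if entropy(posteriors[k]) < lam:
            return posteriors, answers      # stop condition reached
        answers[k] = oracle(k)              # a = a ∪ a(m_k)
        posteriors[k] = float(answers[k])   # clamp: a stand-in for resampling
```

An always-affirmative oracle on the Example I posteriors ends up confirming all three edges.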
Example I
Probability of the edges:
P(A → X|D) = 0.8
P(B → X|D) = 0.455
P(C → X|D) = 0.75
Example II
The expert says that B → X is present in the model:
P(A → X|D) = 0.88
P(B → X|D) = 1.0
P(C → X|D) = 0.77
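The shift in Example II can be reproduced in miniature by conditioning Monte Carlo samples on the expert's assertion: drop the weighted models that lack the asserted edge and renormalize. This assumes a fully reliable expert (the model in the slides also covers the Wrong state), and the sample data in the usage below are made up:

```python
def condition_on_answer(samples, weights, k):
    """Re-estimate all element posteriors after the expert asserts element k
    is present: keep only the weighted models containing m_k, renormalize,
    mirroring P(M|D, a) ∝ P(D|M) P(M|a) for a fully reliable expert."""
    kept = [(m, w) for m, w in zip(samples, weights) if m[k] == 1]
    total = sum(w for _, w in kept)
    n = len(samples[0])
    return [sum(w for m, w in kept if m[j] == 1) / total for j in range(n)]
```

For equally weighted samples (1,1,0), (0,1,1), (1,0,1), (1,1,1), asserting element 1 raises its posterior to 1.0 and shifts the others to 2/3 each.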
Developments
This methodology has been applied to the following model selection problems:
Induce a Bayesian network conditioned on a previously given causal order of the variables.
Induce the Markov blanket of a target variable (feature selection).
Induce Bayesian networks without any restriction.
Conclusions
We have developed a general method for model selection which allows the inclusion of expert knowledge.
The method is robust even when the expert knowledge is wrong.
The number of interactions is minimized.
It has been successfully applied to different model selection problems.
Future Work
Develop a new score to measure the impact of the interaction in model selection.
Extend this methodology to other probabilistic graphical models.
Evaluate the impact of the prior over the parameters:
Preference among models may change with the parameter prior.
Detect the problem and let the user choose.
Employ alternative ways to introduce expert knowledge:
E.g., in a BN we ask about edges: direct causal probabilistic relationships.
Much domain knowledge is about correlations and conditional independencies.
Bagging Decision Trees on Data Sets with Classification Noise
 
lassification with decision trees from a nonparametric predictive inference p...
lassification with decision trees from a nonparametric predictive inference p...lassification with decision trees from a nonparametric predictive inference p...
lassification with decision trees from a nonparametric predictive inference p...
 
Locally Averaged Bayesian Dirichlet Metrics
Locally Averaged Bayesian Dirichlet MetricsLocally Averaged Bayesian Dirichlet Metrics
Locally Averaged Bayesian Dirichlet Metrics
 
Application of a Selective Gaussian Naïve Bayes Model for Diffuse-Large B-Cel...
Application of a Selective Gaussian Naïve Bayes Model for Diffuse-Large B-Cel...Application of a Selective Gaussian Naïve Bayes Model for Diffuse-Large B-Cel...
Application of a Selective Gaussian Naïve Bayes Model for Diffuse-Large B-Cel...
 
An interactive approach for cleaning noisy observations in Bayesian networks ...
An interactive approach for cleaning noisy observations in Bayesian networks ...An interactive approach for cleaning noisy observations in Bayesian networks ...
An interactive approach for cleaning noisy observations in Bayesian networks ...
 
Learning classifiers from discretized expression quantitative trait loci
Learning classifiers from discretized expression quantitative trait lociLearning classifiers from discretized expression quantitative trait loci
Learning classifiers from discretized expression quantitative trait loci
 
Split Criterions for Variable Selection Using Decision Trees
Split Criterions for Variable Selection Using Decision TreesSplit Criterions for Variable Selection Using Decision Trees
Split Criterions for Variable Selection Using Decision Trees
 
A Semi-naive Bayes Classifier with Grouping of Cases
A Semi-naive Bayes Classifier with Grouping of CasesA Semi-naive Bayes Classifier with Grouping of Cases
A Semi-naive Bayes Classifier with Grouping of Cases
 
Combining Decision Trees Based on Imprecise Probabilities and Uncertainty Mea...
Combining Decision Trees Based on Imprecise Probabilities and Uncertainty Mea...Combining Decision Trees Based on Imprecise Probabilities and Uncertainty Mea...
Combining Decision Trees Based on Imprecise Probabilities and Uncertainty Mea...
 
A Bayesian approach to estimate probabilities in classification trees
A Bayesian approach to estimate probabilities in classification treesA Bayesian approach to estimate probabilities in classification trees
A Bayesian approach to estimate probabilities in classification trees
 
A Bayesian Random Split to Build Ensembles of Classification Trees
A Bayesian Random Split to Build Ensembles of Classification TreesA Bayesian Random Split to Build Ensembles of Classification Trees
A Bayesian Random Split to Build Ensembles of Classification Trees
 
An Experimental Study about Simple Decision Trees for Bagging Ensemble on Dat...
An Experimental Study about Simple Decision Trees for Bagging Ensemble on Dat...An Experimental Study about Simple Decision Trees for Bagging Ensemble on Dat...
An Experimental Study about Simple Decision Trees for Bagging Ensemble on Dat...
 
Selective Gaussian Naïve Bayes Model for Diffuse Large-B-Cell Lymphoma Classi...
Selective Gaussian Naïve Bayes Model for Diffuse Large-B-Cell Lymphoma Classi...Selective Gaussian Naïve Bayes Model for Diffuse Large-B-Cell Lymphoma Classi...
Selective Gaussian Naïve Bayes Model for Diffuse Large-B-Cell Lymphoma Classi...
 
Evaluating query-independent object features for relevancy prediction
Evaluating query-independent object features for relevancy predictionEvaluating query-independent object features for relevancy prediction
Evaluating query-independent object features for relevancy prediction
 
Effects of Highly Agreed Documents in Relevancy Prediction
Effects of Highly Agreed Documents in Relevancy PredictionEffects of Highly Agreed Documents in Relevancy Prediction
Effects of Highly Agreed Documents in Relevancy Prediction
 
Conference poster 6
Conference poster 6Conference poster 6
Conference poster 6
 

Kürzlich hochgeladen

SUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdf
SUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdfSUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdf
SUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdfsantiagojoderickdoma
 
KeyBio pipeline for bioinformatics and data science
KeyBio pipeline for bioinformatics and data scienceKeyBio pipeline for bioinformatics and data science
KeyBio pipeline for bioinformatics and data scienceLayne Sadler
 
Applied Biochemistry feedback_M Ahwad 2023.docx
Applied Biochemistry feedback_M Ahwad 2023.docxApplied Biochemistry feedback_M Ahwad 2023.docx
Applied Biochemistry feedback_M Ahwad 2023.docxmarwaahmad357
 
Role of Herbs in Cosmetics in Cosmetic Science.
Role of Herbs in Cosmetics in Cosmetic Science.Role of Herbs in Cosmetics in Cosmetic Science.
Role of Herbs in Cosmetics in Cosmetic Science.ShwetaHattimare
 
Contracts with Interdependent Preferences (2)
Contracts with Interdependent Preferences (2)Contracts with Interdependent Preferences (2)
Contracts with Interdependent Preferences (2)GRAPE
 
MARSILEA notes in detail for II year Botany.ppt
MARSILEA  notes in detail for II year Botany.pptMARSILEA  notes in detail for II year Botany.ppt
MARSILEA notes in detail for II year Botany.pptaigil2
 
Lehninger_Chapter 17_Fatty acid Oxid.ppt
Lehninger_Chapter 17_Fatty acid Oxid.pptLehninger_Chapter 17_Fatty acid Oxid.ppt
Lehninger_Chapter 17_Fatty acid Oxid.pptSachin Teotia
 
Pests of tenai_Identification,Binomics_Dr.UPR
Pests of tenai_Identification,Binomics_Dr.UPRPests of tenai_Identification,Binomics_Dr.UPR
Pests of tenai_Identification,Binomics_Dr.UPRPirithiRaju
 
Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...
Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...
Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...Sérgio Sacani
 
3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...
3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...
3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...PirithiRaju
 
Pests of Redgram_Identification, Binomics_Dr.UPR
Pests of Redgram_Identification, Binomics_Dr.UPRPests of Redgram_Identification, Binomics_Dr.UPR
Pests of Redgram_Identification, Binomics_Dr.UPRPirithiRaju
 
TORSION IN GASTROPODS- Anatomical event (Zoology)
TORSION IN GASTROPODS- Anatomical event (Zoology)TORSION IN GASTROPODS- Anatomical event (Zoology)
TORSION IN GASTROPODS- Anatomical event (Zoology)chatterjeesoumili50
 
SCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptx
SCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptxSCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptx
SCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptxROVELYNEDELUNA3
 
Principles & Formulation of Hair Care Products
Principles & Formulation of Hair Care  ProductsPrinciples & Formulation of Hair Care  Products
Principles & Formulation of Hair Care Productspurwaborkar@gmail.com
 
Pests of ragi_Identification, Binomics_Dr.UPR
Pests of ragi_Identification, Binomics_Dr.UPRPests of ragi_Identification, Binomics_Dr.UPR
Pests of ragi_Identification, Binomics_Dr.UPRPirithiRaju
 
Krishi Vigyan Kendras - कृषि विज्ञान केंद्र
Krishi Vigyan Kendras - कृषि विज्ञान केंद्रKrishi Vigyan Kendras - कृषि विज्ञान केंद्र
Krishi Vigyan Kendras - कृषि विज्ञान केंद्रKrashi Coaching
 

Kürzlich hochgeladen (20)

SUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdf
SUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdfSUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdf
SUKDANAN DIAGNOSTIC TEST IN PHYSICAL SCIENCE ANSWER KEYY.pdf
 
Applying Cheminformatics to Develop a Structure Searchable Database of Analyt...
Applying Cheminformatics to Develop a Structure Searchable Database of Analyt...Applying Cheminformatics to Develop a Structure Searchable Database of Analyt...
Applying Cheminformatics to Develop a Structure Searchable Database of Analyt...
 
Cheminformatics tools and chemistry data underpinning mass spectrometry analy...
Cheminformatics tools and chemistry data underpinning mass spectrometry analy...Cheminformatics tools and chemistry data underpinning mass spectrometry analy...
Cheminformatics tools and chemistry data underpinning mass spectrometry analy...
 
KeyBio pipeline for bioinformatics and data science
KeyBio pipeline for bioinformatics and data scienceKeyBio pipeline for bioinformatics and data science
KeyBio pipeline for bioinformatics and data science
 
Applied Biochemistry feedback_M Ahwad 2023.docx
Applied Biochemistry feedback_M Ahwad 2023.docxApplied Biochemistry feedback_M Ahwad 2023.docx
Applied Biochemistry feedback_M Ahwad 2023.docx
 
Role of Herbs in Cosmetics in Cosmetic Science.
Role of Herbs in Cosmetics in Cosmetic Science.Role of Herbs in Cosmetics in Cosmetic Science.
Role of Herbs in Cosmetics in Cosmetic Science.
 
Contracts with Interdependent Preferences (2)
Contracts with Interdependent Preferences (2)Contracts with Interdependent Preferences (2)
Contracts with Interdependent Preferences (2)
 
MARSILEA notes in detail for II year Botany.ppt
MARSILEA  notes in detail for II year Botany.pptMARSILEA  notes in detail for II year Botany.ppt
MARSILEA notes in detail for II year Botany.ppt
 
Lehninger_Chapter 17_Fatty acid Oxid.ppt
Lehninger_Chapter 17_Fatty acid Oxid.pptLehninger_Chapter 17_Fatty acid Oxid.ppt
Lehninger_Chapter 17_Fatty acid Oxid.ppt
 
Pests of tenai_Identification,Binomics_Dr.UPR
Pests of tenai_Identification,Binomics_Dr.UPRPests of tenai_Identification,Binomics_Dr.UPR
Pests of tenai_Identification,Binomics_Dr.UPR
 
Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...
Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...
Legacy Analysis of Dark Matter Annihilation from the Milky Way Dwarf Spheroid...
 
3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...
3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...
3.2 Pests of Sorghum_Identification, Symptoms and nature of damage, Binomics,...
 
Pests of Redgram_Identification, Binomics_Dr.UPR
Pests of Redgram_Identification, Binomics_Dr.UPRPests of Redgram_Identification, Binomics_Dr.UPR
Pests of Redgram_Identification, Binomics_Dr.UPR
 
TORSION IN GASTROPODS- Anatomical event (Zoology)
TORSION IN GASTROPODS- Anatomical event (Zoology)TORSION IN GASTROPODS- Anatomical event (Zoology)
TORSION IN GASTROPODS- Anatomical event (Zoology)
 
SCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptx
SCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptxSCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptx
SCIENCE 6 QUARTER 3 REVIEWER(FRICTION, GRAVITY, ENERGY AND SPEED).pptx
 
Principles & Formulation of Hair Care Products
Principles & Formulation of Hair Care  ProductsPrinciples & Formulation of Hair Care  Products
Principles & Formulation of Hair Care Products
 
Data delivery from the US-EPA Center for Computational Toxicology and Exposur...
Data delivery from the US-EPA Center for Computational Toxicology and Exposur...Data delivery from the US-EPA Center for Computational Toxicology and Exposur...
Data delivery from the US-EPA Center for Computational Toxicology and Exposur...
 
Pests of ragi_Identification, Binomics_Dr.UPR
Pests of ragi_Identification, Binomics_Dr.UPRPests of ragi_Identification, Binomics_Dr.UPR
Pests of ragi_Identification, Binomics_Dr.UPR
 
Krishi Vigyan Kendras - कृषि विज्ञान केंद्र
Krishi Vigyan Kendras - कृषि विज्ञान केंद्रKrishi Vigyan Kendras - कृषि विज्ञान केंद्र
Krishi Vigyan Kendras - कृषि विज्ञान केंद्र
 
Cheminformatics tools supporting dissemination of data associated with US EPA...
Cheminformatics tools supporting dissemination of data associated with US EPA...Cheminformatics tools supporting dissemination of data associated with US EPA...
Cheminformatics tools supporting dissemination of data associated with US EPA...
 

Interactive Learning of Bayesian Networks

  • 1. Interactive Learning of Bayesian Networks Andrés Masegosa, Serafín Moral Department of Computer Science and Artificial Intelligence - University of Granada andrew@decsai.ugr.es Utrecht, July 2012
  • 2. Outline: 1 Motivation. 2 The Bayesian Framework. 3 Interactive integration of expert knowledge for model selection. 4 Applications: learning the parent set of a variable in a BN; learning Markov boundaries; learning complete Bayesian networks. 5 Conclusions.
  • 4. Bayesian Networks. Excellent models to graphically represent the dependency structure (Markov condition and d-separation) of the underlying distribution in multivariate domain problems: a very relevant source of knowledge. Inference tasks: computing marginals, evidence propagation, abductive-inductive reasoning, etc.
  • 5. Learning Bayesian Networks from Data. Learning Bayesian networks from data is quite challenging: the DAG space is super-exponential, and there are usually several models which explain the data similarly well. Bayesian framework: models with high posterior probability given the data.
  • 6. Uncertainty in Model Selection. This situation is especially common in problem domains with a high number of variables and low sample sizes.
  • 8. Integration of Domain/Expert Knowledge. Find the best statistical model by combining data and expert/domain knowledge. An emerging approach in gene expression data mining: there is a growing number of biological knowledge databases; the model space is usually huge and the number of samples is low (they are costly to collect); many approaches have shifted from being purely data-oriented to trying to include domain knowledge.
  • 10. Integration of Expert Knowledge: Previous Works. There have been many attempts to introduce expert knowledge when learning BNs from data. Via prior distribution: use of specific prior distributions over the possible graph structures to integrate expert knowledge; the expert assigns higher prior probabilities to the most likely edges. Via structural restrictions: the expert codifies his/her knowledge as structural restrictions, defining the existence/absence of arcs and/or edges and causal ordering restrictions; the retrieved model should satisfy these restrictions.
  • 13. Limitations of "prior" expert knowledge: We are forced to include expert/domain knowledge for each of the elements of the model (e.g. for every possible edge of a BN). The expert could be biased towards providing the "clearest" knowledge, which may also be the easiest to find in the data. The system does not help the user to introduce information about the BN structure.
  • 15. Interactive Integration of Expert Knowledge. The data is analyzed first, and the system only queries the expert about the most uncertain elements, considering the information present in the data. Benefits: the expert is asked a smaller number of times, and we explicitly show the expert the elements for which the data does not provide enough evidence to make a reliable model selection.
  • 18. Active Interaction with the Expert. Strategy: ask the expert about the presence of the edges that most reduce the model uncertainty. Method: a framework that allows an efficient and effective interaction with the expert; the expert is only asked about these controversial structural features.
  • 21. Notation. Let X = (X1, ..., Xn) denote a vector of random variables and D a fully observed set of instances x = (x1, ..., xn) ∈ Val(X). Let M be a model and M the set of all possible models. M may define: a joint probability distribution, P(X|M), in the case of a Bayesian network; or a conditional probability distribution for a target variable, P(T|X, M), in the case of a Markov blanket.
  • 24. Examples of Models. Each model M is structured: it is defined by a vector of elements M = (m1, ..., mK), where K is the number of possible components of M. Examples: a Bayesian network over X1, X2, X3, X4 with M = (1, 0, 0, 1, 0, -1); a Markov blanket of T over X1, ..., X5 with M = (1, 1, 0, 0, 1).
  • 26. The Model Selection Problem: Bayesian Framework. Define a prior probability P(M) over the space of alternative models. It is not the classic uniform prior (the multiplicity problem). For each model, its Bayesian score is computed: score(M|D) = P(D|M)P(M), where P(D|M) = ∫ P(D|θ, M)P(θ|M) dθ is the marginal likelihood of the model.
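For discrete networks with Dirichlet parameter priors, the marginal likelihood P(D|M) has a well-known closed form per variable and parent set. The sketch below is one common instantiation, not necessarily the exact score used in the talk; the helper name, the dict-based data layout, and the uniform equivalent-sample-size prior `ess` are illustrative assumptions.

```python
import math
from collections import Counter

def log_marginal_likelihood(data, child, parents, arities, ess=1.0):
    """Closed-form log P(D | child, parents) under a Dirichlet prior
    (Dirichlet-multinomial); summing this over all variables gives
    log P(D|M) for a discrete Bayesian network."""
    r = arities[child]
    q = 1                          # number of parent configurations
    for p in parents:
        q *= arities[p]
    a_j = ess / q                  # prior mass per parent configuration
    a_jk = ess / (q * r)           # prior mass per (configuration, state) cell
    # N_jk: counts per (parent configuration, child state)
    n_jk = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    n_j = Counter()                # N_j: counts per parent configuration
    for (cfg, _), n in n_jk.items():
        n_j[cfg] += n
    ll = sum(math.lgamma(a_j) - math.lgamma(a_j + n) for n in n_j.values())
    ll += sum(math.lgamma(a_jk + n) - math.lgamma(a_jk) for n in n_jk.values())
    return ll

# A parent that perfectly predicts the child should score higher.
data = [{'A': 0, 'X': 0}, {'A': 0, 'X': 0}, {'A': 1, 'X': 1}, {'A': 1, 'X': 1}]
with_parent = log_marginal_likelihood(data, 'X', ['A'], {'A': 2, 'X': 2})
no_parent = log_marginal_likelihood(data, 'X', [], {'A': 2, 'X': 2})
```

In log space, adding log P(M) to this quantity yields the Bayesian score log score(M|D) from the slide.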
  • 28. Benefits of the Full Bayesian Approach. We assume that we are able to approximate the posterior by means of a Monte Carlo method: P(M|D) = P(D|M)P(M) / Σ_{M′ ∈ Val(M)} P(D|M′)P(M′). We can then compute the posterior probability of any of the elements of a model: P(mi|D) = Σ_M P(M|D) I_M(mi), where I_M(mi) is the indicator function: 1 if mi is present in M and 0 otherwise.
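Given weighted model samples from any Monte Carlo scheme, the element posteriors P(mi|D) are just weighted indicator averages. A minimal sketch; the sample set and its weights are made up purely for illustration:

```python
# Hypothetical Monte Carlo output: each sampled model is its edge set,
# paired with a normalized posterior weight approximating P(M|D).
samples = [
    (frozenset({('B', 'X'), ('A', 'X')}), 0.45),
    (frozenset({('B', 'X')}),             0.35),
    (frozenset({('B', 'X'), ('D', 'X')}), 0.20),
]

def edge_posterior(samples, edge):
    """P(m_i|D) = sum_M P(M|D) * I_M(m_i): the total weight of the
    sampled models that contain the edge."""
    return sum(w for model, w in samples if edge in model)

p_b = edge_posterior(samples, ('B', 'X'))   # appears in every sample
p_d = edge_posterior(samples, ('D', 'X'))   # appears in one sample only
```

Edges present in every sampled model get posterior 1, while edges carried by only a few models keep an intermediate probability.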
  • 32. Examples. Highly probable edges: P(B → X|D) = 1.0 and P(A → X|D) = 0.8. Improbable edges: P(C → X|D) = 0.0. Uncertain edges: P(D → X|D) = 0.55.
  • 34. Interaction with Expert/Domain Knowledge. Expert/domain knowledge is given for particular elements mk of the models M: whether a variable is present or not in the final variable selection, or whether there is an edge between two given variables in a BN. Expert/domain knowledge may not be fully reliable.
  • 37. Our Approach for Expert Interaction. [Influence diagram over Ask_k, m_k, M, E, D and the utility U.] E represents the expert's reliability: Ω(E) = {Right, Wrong}; if the expert is wrong, we assume a random answer. Our goal is to infer the model structure, so the utility is U(Ask_k, m_k, M, D) = log P(M|m_k, Ask_k, D).
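The "wrong expert answers at random" assumption gives a simple likelihood for Bayesian updating of a single edge after an answer. A sketch under that noise model; the function name and the default reliability of 0.9 are illustrative choices, not values from the talk:

```python
def update_with_answer(p_edge, says_present, reliability=0.9):
    """Posterior of one edge after an expert answer, under the noise
    model on the slide: with probability `reliability` the expert is
    right (E = Right); a wrong expert answers at random (1/2 each way)."""
    # Likelihood of the observed answer given the edge is present/absent.
    like_present = (reliability if says_present else 0.0) + (1 - reliability) * 0.5
    like_absent = (0.0 if says_present else reliability) + (1 - reliability) * 0.5
    num = like_present * p_edge
    return num / (num + like_absent * (1 - p_edge))
```

With these numbers, an uncertain edge at P = 0.55 jumps to roughly 0.96 after a "present" answer, but not to 1, precisely because the expert may be wrong.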
  • 42. Our Approach for Expert Interaction. The expected utilities of asking and of not asking are: V(Ask_k) = Σ_{m_k} Σ_M P(M|D) P(m_k|M, Ask_k, D) log P(M|m_k, Ask_k, D), and V(¬Ask_k) = Σ_M P(M|D) log P(M|D). The difference between both actions, V(Ask_k) − V(¬Ask_k), is the information gain: IG(M : m_k|D) = H(M|D) − Σ_{m_k} P(m_k|D) H(M|m_k, D). It can be shown that the element with the highest information gain is the one with the highest entropy: m⋆_k = arg max_k IG(M : m_k|D) = arg max_k [H(m_k|D) − H(E_k)].
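In the binary-element case the selection rule m⋆_k = arg max_k H(m_k|D) amounts to scanning the per-element posteriors for the one closest to 0.5. A sketch using the edge posteriors from the earlier example slide (the constant expert-entropy term H(E_k) drops out of the arg max):

```python
import math

def entropy(p):
    """Binary entropy H(p) in nats, with H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def most_informative_question(edge_posteriors):
    """Pick the structural feature with the highest entropy H(m_k|D):
    the most controversial edge, hence the next query to the expert."""
    return max(edge_posteriors, key=lambda e: entropy(edge_posteriors[e]))

posteriors = {('A', 'X'): 0.8, ('B', 'X'): 1.0,
              ('C', 'X'): 0.0, ('D', 'X'): 0.55}
```

Here the edges the data already settles (probability 0 or 1) have zero entropy and are never asked about; the uncertain edge D → X is queried first.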
  • 48. Our Approach for Expert Interaction. 1 Approximate P(M|D) by means of a MC technique. 2 Ask the expert about the element m_k with the highest entropy, set a = a ∪ a(m_k), and update P(M|D, a), which is equivalent to P(M|D, a) ∝ P(D|M)P(M|a). 3 Stop condition: H(m_k|D, a) < λ. 4 Otherwise, Option 1: go to Step 2 and ask the expert again; Option 2: go to Step 1 and sample now using P(M|a).
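The steps above can be sketched as a toy loop. For illustration it tracks per-edge posteriors independently (a simplification of updating the full P(M|D, a)), simulates a fully reliable expert, and uses made-up numbers; none of this is the talk's actual implementation:

```python
import math

def entropy(p):
    """Binary entropy in nats, with H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def interactive_selection(posteriors, oracle, lam=0.3):
    """Steps 2-4 over per-edge posteriors P(m_k|D, a): repeatedly query
    the highest-entropy edge until every edge falls below the entropy
    threshold `lam`. `oracle(edge) -> bool` plays a fully reliable
    expert, so an answer clamps the posterior to 0 or 1."""
    answers = {}
    while True:
        edge = max(posteriors, key=lambda e: entropy(posteriors[e]))
        if entropy(posteriors[edge]) < lam:           # stop condition
            return posteriors, answers
        answers[edge] = oracle(edge)                  # ask the expert
        posteriors[edge] = 1.0 if answers[edge] else 0.0

true_edges = {('D', 'X')}
post = {('A', 'X'): 0.95, ('B', 'X'): 1.0, ('C', 'X'): 0.0, ('D', 'X'): 0.55}
final, asked = interactive_selection(post, lambda e: e in true_edges)
```

Only the genuinely controversial edge triggers a question; the edges the data already settles never reach the expert.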
  • 49. Example I. Probabilities of the edges: P(A → X|D) = 0.8, P(B → X|D) = 0.455, P(C → X|D) = 0.75.
  • 50. Example II. The expert says that B → X is present in the model: P(A → X|D) = 0.88, P(B → X|D) = 1.0, P(C → X|D) = 0.77.
  • 54. Developments. This methodology has been applied to the following model selection problems: inducing a Bayesian network conditioned on a previously given causal order of the variables; inducing the Markov blanket of a target variable (feature selection); and inducing Bayesian networks without any restriction.
  • 55. Conclusions. We have developed a general method for model selection which allows the inclusion of expert knowledge. The method is robust even when the expert knowledge is wrong, and the number of interactions is minimized. It has been successfully applied to different model selection problems.
  • 56. Future Work. Develop a new score to measure the impact of the interaction on model selection. Extend this methodology to other probabilistic graphical models. Evaluate the impact of the prior over the parameters: the preference among models may change with the parameter prior; detect the problem and let the user choose. Employ alternative ways to introduce expert knowledge: e.g. in a BN we ask about edges (direct causal probabilistic relationships), but much domain knowledge is about correlations and conditional independencies.