Model Selection with Piecewise Regular Gauges

Gabriel Peyré
www.numerical-tours.com

Joint work with: Samuel Vaiter, Charles Deledalle, Jalal Fadili, Joseph Salmon
Overview
• Inverse Problems
• Gauge Decomposition and Model Selection
• L2 Stability Performances
• Model Stability Performances
Inverse Problems
Recovering $x_0 \in \mathbb{R}^N$ from noisy observations $y = \Phi x_0 + w \in \mathbb{R}^P$.
Examples: inpainting, super-resolution, compressed sensing.
Estimators
Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$.
Regularized inversion:
$x^\star(y) \in \operatorname{argmin}_{x \in \mathbb{R}^N} \tfrac{1}{2}\|y - \Phi x\|^2 + \lambda J(x)$ $\quad(\mathcal{P}_\lambda(y))$
(data fidelity + regularity)
Goal: performance analysis. Criteria on $(x_0, \|w\|, \lambda)$ to ensure:
• $L^2$ error stability: $\|x^\star(y) - x_0\| = O(\|w\|)$.
• Promoted subspace ("model") stability.
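As a concrete instance of this estimator, here is a minimal numerical sketch for the special case $J = \|\cdot\|_1$ (the Lasso), solved by plain proximal gradient descent (ISTA). The problem sizes, the Gaussian $\Phi$ and the choice of $\lambda$ are illustrative assumptions, not part of the talk.

```python
import numpy as np

def ista(Phi, y, lam, n_iter=500):
    """Proximal gradient (ISTA) for 0.5*||y - Phi x||^2 + lam*||x||_1."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = x - Phi.T @ (Phi @ x - y) / L    # gradient step on the data fidelity
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

# Compressed-sensing style usage on a synthetic sparse x0
rng = np.random.default_rng(0)
P, N = 50, 100
Phi = rng.standard_normal((P, N)) / np.sqrt(P)
x0 = np.zeros(N)
x0[rng.choice(N, 5, replace=False)] = rng.standard_normal(5)
w = 0.01 * rng.standard_normal(P)
x_star = ista(Phi, Phi @ x0 + w, lam=0.02)
```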
Union of Linear Models for Data Processing
Union of models: linear spaces $T \in \mathcal{T}$.
• Synthesis sparsity: coefficients $x$ → image $\Phi x$.
• Structured sparsity.
• Analysis sparsity: image $x$ → gradient $D^* x$ (see the sketch below).
• Low-rank, e.g. multi-spectral imaging: $x_{i,\cdot} = \sum_{j=1}^{r} A_{i,j} S_{j,\cdot}$.
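To make the analysis-sparsity model concrete, here is a tiny sketch (not from the slides) where $D$ is the finite-difference dictionary, so that $D^* x$ is sparse exactly when $x$ is piecewise constant.

```python
import numpy as np

# For a piecewise-constant signal x, the analysis coefficients D^* x
# (finite differences) are sparse.
x = np.concatenate([np.ones(5), 3.0 * np.ones(5)])
D = np.zeros((10, 9))                 # columns are finite-difference atoms
for i in range(9):
    D[i, i], D[i + 1, i] = -1.0, 1.0
print(D.T @ x)                        # a single nonzero, at the jump
```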
Gauges for Union of Linear Models
Gauge: a convex $J : \mathbb{R}^N \to \mathbb{R}^+$ with $J(\alpha x) = \alpha J(x)$ for all $\alpha \in \mathbb{R}^+$.
$J(x) = \gamma_C(x) = \inf\{\rho > 0 : x \in \rho C\}$, where $C = \{x : J(x) \leq 1\}$ (assuming $0 \in C$).
Piecewise regular ball $C$ $\Leftrightarrow$ union of linear models $(T)_{T \in \mathcal{T}}$.
• $J(x) = \|x\|_1$: $T$ = sparse vectors.
• $J(x) = |x_1| + \|x_{2,3}\|$: $T$ = block-sparse vectors.
• $J(x) = \|x\|_*$: $T$ = low-rank matrices.
• $J(x) = \|x\|_\infty$: $T$ = anti-sparse vectors.
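As an illustration of the definition of $\gamma_C$ (not from the slides): when $C$ is the convex hull of finitely many atoms, the gauge can be evaluated by a small linear program. The helper `gauge()` below is hypothetical; with atoms $\{\pm e_i\}$ it recovers the $\ell^1$ norm.

```python
import numpy as np
from scipy.optimize import linprog

def gauge(atoms, x):
    """Gauge of C = conv(atoms): min sum(c) s.t. atoms^T c = x, c >= 0."""
    A = np.asarray(atoms).T                   # columns are the atoms
    res = linprog(np.ones(A.shape[1]), A_eq=A, b_eq=x,
                  bounds=[(0, None)] * A.shape[1])
    return res.fun if res.success else np.inf

# With atoms {+-e_i}, C is the l1 ball and the gauge is the l1 norm
N = 4
atoms = np.vstack([np.eye(N), -np.eye(N)])
x = np.array([1.0, -2.0, 0.0, 0.5])
print(gauge(atoms, x), np.abs(x).sum())       # both 3.5
```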
Subdifferentials and Models
$\partial J(x) = \{\eta \in \mathbb{R}^N : \forall y,\ J(y) \geq J(x) + \langle \eta, y - x \rangle\}$
Example: $J(x) = \|x\|_1$ with $I = \operatorname{supp}(x) = \{i : x_i \neq 0\}$:
$\partial \|x\|_1 = \{\eta : \eta_I = \operatorname{sign}(x_I),\ \forall j \notin I,\ |\eta_j| \leq 1\}$
Definition: $T_x = \operatorname{VectHull}(\partial J(x))^\perp$ and $e_x = \operatorname{Proj}_{T_x}(\partial J(x))$.
For $\ell^1$: $T_x = \{\eta : \operatorname{supp}(\eta) \subset I\}$ and $e_x = \operatorname{sign}(x)$.
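A minimal sketch of these two objects in the $\ell^1$ case (the helper `l1_model` is hypothetical):

```python
import numpy as np

def l1_model(x, tol=1e-10):
    """T_x (represented by its support set) and e_x = sign(x) for J = ||.||_1."""
    I = np.flatnonzero(np.abs(x) > tol)   # I = supp(x)
    e = np.zeros_like(x)
    e[I] = np.sign(x[I])                  # e_x = sign(x)
    return I, e

x = np.array([0.0, 3.0, 0.0, -1.0])
I, e = l1_model(x)
print(I, e)   # [1 3] [ 0.  1.  0. -1.]
```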
Examples
• $\ell^1$ sparsity: $J(x) = \|x\|_1$; $e_x = \operatorname{sign}(x)$, $T_x = \{z : \operatorname{supp}(z) \subset \operatorname{supp}(x)\}$.
• Structured sparsity: $J(x) = \sum_b \|x_b\|$; $e_x = (N(x_b))_{b \in B}$ with $N(a) = a/\|a\|$, $T_x = \{z : \operatorname{supp}(z) \subset \operatorname{supp}(x)\}$.
• Nuclear norm: $J(x) = \|x\|_*$ with SVD $x = U \Lambda V^*$; $e_x = U V^*$, $T_x = \{z : U_\perp^* z V_\perp = 0\}$.
• Anti-sparsity: $J(x) = \|x\|_\infty$ with $I = \{i : |x_i| = \|x\|_\infty\}$; $e_x = |I|^{-1} \operatorname{sign}(x)$, $T_x = \{y : y_I \propto \operatorname{sign}(x_I)\}$.
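For the nuclear norm, $e_x$ and the projector onto $T_x$ follow directly from the SVD; a sketch, assuming the standard tangent-space projector formula for low-rank matrices:

```python
import numpy as np

def nuclear_model(X, tol=1e-10):
    """e_x = U V^* and the projector onto T_x for J = ||.||_*."""
    U, s, Vt = np.linalg.svd(X)
    r = int(np.sum(s > tol))
    U_r, V_r = U[:, :r], Vt[:r, :].T
    e = U_r @ V_r.T
    def proj_T(Z):
        # P_T(Z) = U U^* Z + Z V V^* - U U^* Z V V^*
        PU, PV = U_r @ U_r.T, V_r @ V_r.T
        return PU @ Z + Z @ PV - PU @ Z @ PV
    return e, proj_T

X = np.outer([1.0, 2.0], [3.0, 0.0, 1.0])   # a rank-1 matrix
e, proj_T = nuclear_model(X)
```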
Dual Certificate and L2 Stability
Noiseless recovery: $\min_{\Phi x = \Phi x_0} J(x)$ $\quad(\mathcal{P}_0)$
Dual certificates: $\mathcal{D} = \operatorname{Im}(\Phi^*) \cap \partial J(x_0)$.
Tight dual certificates: $\bar{\mathcal{D}} = \operatorname{Im}(\Phi^*) \cap \operatorname{ri}(\partial J(x_0))$.
Proposition: $\exists\, \eta \in \mathcal{D} \iff x_0$ is a solution of $(\mathcal{P}_0)$.
Theorem [Fadili et al. 2013]: if $\exists\, \eta \in \bar{\mathcal{D}}$ and $\ker(\Phi) \cap T_{x_0} = \{0\}$, then for $\lambda \sim \|w\|$ one has $\|x^\star - x_0\| = O(\|w\|)$.
→ The constants depend on $N$...
Related results: [Grasmair, Haltmeier, Scherzer 2010]: $J = \|\cdot\|_1$; [Grasmair 2012]: $J(x^\star - x_0) = O(\|w\|)$.
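This linear error scaling can be checked empirically; a sketch reusing `Phi`, `x0` and `ista()` from the Lasso example above, with $\lambda$ proportional to the noise level:

```python
# Reuses Phi, x0, P, rng and ista() from the Lasso sketch above.
for sigma in [0.001, 0.01, 0.1]:
    w = sigma * rng.standard_normal(P)
    x_star = ista(Phi, Phi @ x0 + w, lam=2 * sigma)
    # The error should scale roughly linearly with the noise level sigma
    print(sigma, np.linalg.norm(x_star - x0))
```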
Minimal-norm Certificate
Let $T = T_{x_0}$ and $e = e_{x_0}$. We assume $\ker(\Phi) \cap T = \{0\}$ and $J$ piecewise regular.
$\eta \in \mathcal{D} \iff \eta = \Phi^* q,\ \eta_T = e$ and $J^\circ(\eta) \leq 1$.
Minimal-norm pre-certificate:
$\eta_0 = \operatorname{argmin}_{\eta = \Phi^* q,\ \eta_T = e} \|q\|$
Proposition: one has $\eta_0 = \Phi^* (\Phi_T^+)^* e$.
Theorem [Vaiter et al. 2013]: if $\eta_0 \in \bar{\mathcal{D}}$, $\|w\| = O(\nu_{x_0})$ and $\lambda \sim \|w\|$, then the unique solution $x^\star$ of $\mathcal{P}_\lambda(y)$ for $y = \Phi x_0 + w$ satisfies $T_{x^\star} = T_{x_0}$ and $\|x^\star - x_0\| = O(\|w\|)$.
Previous special cases: [Fuchs 2004]: $J = \|\cdot\|_1$; [Bach 2008]: $J = \|\cdot\|_{1,2}$ and $J = \|\cdot\|_*$; [Vaiter et al. 2011]: $J = \|D^* \cdot\|_1$.
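The proposition makes $\eta_0$ directly computable. A sketch for $J = \|\cdot\|_1$, where $\Phi_T$ is just the columns of $\Phi$ indexed by the support (the helper `min_norm_precert` is hypothetical):

```python
import numpy as np

def min_norm_precert(Phi, I, e_I):
    """eta_0 = Phi^*(Phi_T^+)^* e for J = ||.||_1, where T is the set of
    vectors supported on I and e_I is the sign pattern on I."""
    Phi_T = Phi[:, I]                      # Phi restricted to the model T
    q0 = np.linalg.pinv(Phi_T).T @ e_I     # q0 = (Phi_T^+)^* e
    return Phi.T @ q0                      # eta_0 = Phi^* q0

rng = np.random.default_rng(1)
P, N = 50, 100
Phi = rng.standard_normal((P, N)) / np.sqrt(P)
I = np.array([3, 10, 42])
eta0 = min_norm_precert(Phi, I, np.array([1.0, -1.0, 1.0]))
Ic = np.setdiff1d(np.arange(N), I)
print(np.max(np.abs(eta0[Ic])))  # < 1 here means eta0 certifies model stability
```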
Example: 1-D Sparse Deconvolution
$J(x) = \|x\|_1$, $\Phi x = \sum_i x_i \varphi(\cdot - i\Delta)$.
Increasing $\Delta$: reduces correlation, reduces resolution.
$\eta_0 \in \bar{\mathcal{D}}(x_0) \iff \|\eta_{0,I^c}\|_\infty < 1 \iff$ support recovery, where $I = \{j : x_0(j) \neq 0\}$.
[Figure: $\|\eta_{0,I^c}\|_\infty$ as a function of $\Delta$; support is recovered once the curve falls below 1.]
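A sketch of this experiment, assuming a Gaussian filter and reusing `min_norm_precert()` from above; the filter width and spike spacings are illustrative:

```python
# Reuses numpy and min_norm_precert() from the sketch above.
t = np.arange(100)
phi = lambda c: np.exp(-(t - c) ** 2 / (2 * 2.0 ** 2))  # Gaussian atom, width 2
Phi = np.stack([phi(c) for c in t], axis=1)             # columns = shifted atoms
for delta in [4, 8, 16]:
    I = np.arange(20, 80, delta)                        # spikes spaced delta apart
    eta0 = min_norm_precert(Phi, I, np.ones(len(I)))
    Ic = np.setdiff1d(t, I)
    # The off-support sup-norm tends to fall below 1 as delta grows
    print(delta, np.max(np.abs(eta0[Ic])))
```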
Example: 1-D TV Denoising
$J(x) = \|\nabla x\|_1$ with $(\nabla x)_i = x_i - x_{i-1}$, $\Phi = \mathrm{Id}$, $I = \{i : (\nabla x_0)_i \neq 0\}$.
$\eta_0 = \operatorname{div}(\alpha_0)$ where $\forall j \notin I,\ (\Delta \alpha_0)_j = 0$.
$\|\alpha_{0,I^c}\|_\infty < 1$: support stability.
$\|\alpha_{0,I^c}\|_\infty = 1$: $\ell^2$ stability only.
[Figure: $\alpha_0$ is piecewise affine with values in $[-1,+1]$; support stability when $|\alpha_0| < 1$ off $I$, only $\ell^2$ stability when it saturates at $\pm 1$.]
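A rough sketch of the $\alpha_0$ described here, assuming zero boundary values: it affinely interpolates the jump signs (so its discrete Laplacian vanishes off $I$), and its sup-norm off $I$ decides support stability.

```python
import numpy as np

# x0 piecewise constant with two jumps
x0 = np.concatenate([np.zeros(30), 2 * np.ones(40), np.ones(30)])
g = np.diff(x0)                                   # (grad x0)_i
I = np.flatnonzero(g)                             # jump set
knots = np.concatenate([[0], I, [len(g) - 1]])    # jump positions + boundary
vals = np.concatenate([[0.0], np.sign(g[I]), [0.0]])
alpha0 = np.interp(np.arange(len(g)), knots, vals)  # affine between knots
Ic = np.setdiff1d(np.arange(len(g)), I)
print(np.max(np.abs(alpha0[Ic])))                 # < 1 here: support stability
```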
Conclusion
Gauges: encode linear models as singular points.
Tight dual certificates: enable $L^2$ stability.
Piecewise smooth gauges: enable model recovery $T_{x^\star} = T_{x_0}$.
Open problems:
– Approximate model recovery $T_{x^\star} \approx T_{x_0}$.
– Infinite-dimensional problems (measures, TV, etc.).