A computational framework based on Transported Meshfree methods

P.G. LeFloch (CNRS), J.M. Mercier (MPG-Partners)

16 January 2020
Foundations: local integration with Monte-Carlo methods

Monte-Carlo estimation: consider the following family of (worst-case) error estimates, where µ is a probability measure and Y = (y^1, ..., y^N) ∈ R^{N×D}:
\[
\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu - \frac{1}{N} \sum_{n=1}^{N} \varphi(y^n) \Big| \le \mathcal{E}(Y, H_\mu)\, \|\varphi\|_{H_\mu},
\]
where H_µ is a µ-weighted Hilbert (or Banach) functional space.

1. Classical example 1: Y i.i.d. → E(Y, H_µ) ∼ 1/√N and H_µ ∼ L²(R^D, |x|² dµ) (law of large numbers): the most used convergence rate in the finance industry.
2. Classical example 2: Y a Sobol sequence, µ = dx on Ω = [0, 1]^D. Then H_K = BV(Ω) (bounded variation) and E(Y, H_K) ≥ ln(N)^{D−1}/N (Koksma–Hlawka sharp-estimate conjecture).
3. Other examples: quantizers, wavelets, deep feed-forward neural networks, ...
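The i.i.d. rate of example 1 is easy to observe numerically. A minimal sketch (our illustration, not from the talk; the integrand and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_error(phi, exact, N, trials=200):
    """Root-mean-square Monte-Carlo error for integrating phi over [0, 1]
    with N i.i.d. uniform samples, averaged over independent trials."""
    errs = [np.mean(phi(rng.random(N))) - exact for _ in range(trials)]
    return np.sqrt(np.mean(np.square(errs)))

# integrate phi(x) = x^2 over [0, 1]; exact value is 1/3
e1 = mc_error(lambda x: x**2, 1/3, N=100)
e2 = mc_error(lambda x: x**2, 1/3, N=10000)
print(e1, e2, e1 / e2)   # the ratio should be close to sqrt(10000/100) = 10
```

Multiplying N by 100 divides the root-mean-square error by roughly 10, consistent with the 1/√N rate.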
A general approach using kernel methods

1. You have a problem involving a probability measure µ and you guess that the solution belongs to a weighted functional space H_µ.
2. Identify the admissible kernel K(x, y) generating it (RKHS theory): H_µ ≡ H_K. Examples of classically used kernels: ReLU, convolutional kernels, Wendland functions, ...
3. Pick (i.i.d.) samples y^1, ..., y^N. Then you can measure your integration error using
\[
\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu - \frac{1}{N} \sum_{n=1}^{N} \varphi(y^n) \Big| \le \mathcal{E}(Y, H_K)\, \|\varphi\|_{H_K},
\]
where
\[
\mathcal{E}^2(Y, H_K) = \int_{\mathbb{R}^{2D}} K(x, y)\, d\mu(x)\, d\mu(y) + \frac{1}{N^2} \sum_{n,m=1}^{N} K(y^n, y^m) - \frac{2}{N} \sum_{n=1}^{N} \int_{\mathbb{R}^D} K(x, y^n)\, d\mu(x).
\]
4. You can optimize the error by computing sharp discrepancy sequences and the optimal discrepancy error:
\[
\bar{Y} = \arg\inf_{Y \in \mathbb{R}^{N \times D}} \mathcal{E}(Y, H_K), \qquad \mathcal{E}_{H_K}(N, D) = \mathcal{E}(\bar{Y}, H_K).
\]
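Whenever the kernel's integrals against µ are available in closed form, E(Y, H_K) is computable exactly and candidate point sets can be compared. A minimal sketch (our illustration, not the authors' code) for µ uniform on [0, 1] and K(x, y) = min(x, y), using ∫₀¹ min(x, y) dx = y − y²/2 and ∬ min(x, y) dx dy = 1/3:

```python
import numpy as np

def discrepancy_sq(y):
    """E^2(Y, H_K) = iint K dmu dmu + (1/N^2) sum_{n,m} K(y_n, y_m)
                     - (2/N) sum_n int K(x, y_n) dmu(x),
    for K(x, y) = min(x, y) and mu uniform on [0, 1]."""
    y = np.asarray(y, dtype=float)
    N = len(y)
    term1 = 1.0 / 3.0                              # iint min(x, y) dx dy
    term2 = np.minimum.outer(y, y).sum() / N**2    # pairwise kernel average
    term3 = (2.0 / N) * np.sum(y - y**2 / 2)       # int_0^1 min(x, y_n) dx = y_n - y_n^2/2
    return term1 + term2 - term3

rng = np.random.default_rng(0)
N = 64
e_rand = np.sqrt(discrepancy_sq(rng.random(N)))
e_mid = np.sqrt(discrepancy_sq((np.arange(N) + 0.5) / N))
print(e_rand, e_mid)   # the regular grid has much smaller discrepancy
```

A regular midpoint grid, a crude stand-in for a sharp discrepancy sequence, already improves on i.i.d. sampling by roughly a factor √N in this one-dimensional example.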
Our local kernels: lattice-based and transported kernels

For our purposes, we crafted two kinds of kernels:

1. Lattice-based kernels (suited to Lebesgue-type measures µ = dx on Ω). Let L be a lattice and L* its dual lattice. Consider any discrete function satisfying φ(α*) ∈ ℓ¹(L*), φ(α*) ≥ 0, φ(0) = 1, and define
\[
K_{per}(x, y) = \frac{1}{|L|} \sum_{\alpha^* \in L^*} \varphi(\alpha^*)\, e^{2i\pi \langle x - y, \alpha^* \rangle}.
\]
[Figure: surface plots of lattice-based Matern, Multiquadric, Gaussian, and Truncated kernels.]

2. Transported kernels: given a transport map S : Ω → R^D, set K_{tra}(x, y) = K(S(x), S(y)).
[Figure: surface plots of transported Matern, Gaussian, Multiquadric, and Truncated kernels.]
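In dimension one with L = Z (so L* = Z and |L| = 1), the lattice kernel reduces to a cosine series. A minimal sketch with hypothetical spectral weights φ(k) = 1/(1 + k²), which satisfy the ℓ¹, positivity, and φ(0) = 1 requirements:

```python
import numpy as np

def k_per(x, y, kmax=200):
    """1D lattice kernel with L = Z: K(x, y) = sum_{k in Z} phi(k) e^{2 i pi k (x - y)}.
    With even weights phi(k) = 1/(1 + k^2) the sum is real:
    K(x, y) = 1 + 2 sum_{k>=1} cos(2 pi k (x - y)) / (1 + k^2), truncated at kmax."""
    k = np.arange(1, kmax + 1)
    return 1.0 + 2.0 * np.sum(np.cos(2 * np.pi * k * (x - y)) / (1 + k**2))

# sanity checks: symmetry and positive semi-definiteness of a Gram matrix
pts = np.linspace(0, 1, 8, endpoint=False)
G = np.array([[k_per(a, b) for b in pts] for a in pts])
print(np.allclose(G, G.T), np.linalg.eigvalsh(G).min() >= -1e-8)
```

By construction the kernel is periodic and translation-invariant, and the non-negative spectral weights make every Gram matrix positive semi-definite.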
Example I: Monte-Carlo integration with a Matern kernel

1. Kernel, random and computed sequences Y (N = 256, D = 2).
[Figure: Matern kernel surface; 256 random points vs. computed points for the lattice Matern kernel on [0, 1]².]

2. Optimal discrepancy error → Koksma–Hlawka-type estimate:
\[
\mathcal{E}_{H_K}(N, D) \sim \frac{1}{N} \sum_{n > N} \varphi(\alpha^{*n}) \sim \frac{\ln(N)^{D-1}}{N}, \qquad \varphi(\alpha) = \prod_{d=1}^{D} \frac{2}{1 + 4\pi^2 \alpha_d^2 / \tau_D^2}.
\]

3. E(Y, H_K) for random Y vs. computed Y vs. theoretical E_{H_K}(N, D):

   Random Y:       D=1     D=16    D=128
     N=16          0.228   0.304   0.319
     N=128         0.117   0.111   0.115
     N=512         0.035   0.054   0.059

   Computed Y:     D=1     D=16    D=128
     N=16          0.062   0.211   0.223
     N=128         0.008   0.069   0.077
     N=512         0.002   0.034   0.049

   Theoretical:    D=1     D=16    D=128
     N=16          0.062   0.288   0.323
     N=128         0.008   0.077   0.105
     N=512         0.002   0.034   0.043
Application: Machine Learning

1. Setting: consider a set of observations (y^1, P^1), ..., (y^N, P^N) ∈ (R^D × R^M)^N.
2. Interpolation: pick a kernel K(x, y), denote by H_K its native space, and consider a continuous function P(y) such that
\[
\langle P, \delta_{y^n} \rangle = P(y^n) \sim P^n.
\]
One can further optimize by computing Y (∼ learning).
3. Extrapolation: one can then extrapolate with the error bound
\[
\Big| \int_{\mathbb{R}^D} P(x)\, d\mu - \frac{1}{N} \sum_{n=1}^{N} P(y^n) \Big| \le \mathcal{E}_{H_K}(N, D)\, \|P\|_{H_K}, \qquad \text{i.e. } \mu \sim \frac{1}{N} \sum_{n=1}^{N} \delta_{y^n}.
\]
4. Two very similar applications:
   1. (y^1, P^1), ..., (y^N, P^N) are prices and implied volatilities (e.g. call options under the SABR model): pricing.
   2. (y^1, P^1), ..., (y^N, P^N) are pictures of dogs and cats: a classifier.
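The interpolation step can be sketched with one of the kernel families shown in the talk (a Matern-1/2 kernel here; the data and length scale are illustrative choices of ours): solve the Gram system for the coefficients, then evaluate the interpolant anywhere.

```python
import numpy as np

def matern_kernel(x, y, scale=0.2):
    """Matern-1/2 (exponential) kernel, one of the families shown in the talk."""
    return np.exp(-np.abs(x[:, None] - y[None, :]) / scale)

y = np.linspace(0, 1, 12)        # observation points y^n
P = np.sin(2 * np.pi * y)        # observed values P^n

G = matern_kernel(y, y)          # Gram matrix K(y^n, y^m)
c = np.linalg.solve(G, P)        # interpolation coefficients

def P_hat(x):
    """Kernel interpolant P(x) = sum_n c_n K(x, y^n); exact at the nodes."""
    return matern_kernel(np.atleast_1d(x), y) @ c

x = np.linspace(0, 1, 101)
err = np.max(np.abs(P_hat(x) - np.sin(2 * np.pi * x)))
print(err)   # small uniform error away from the nodes
```

The "learning" refinement mentioned above would additionally move the nodes y^n to minimize the discrepancy; here they are simply fixed.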
Application to time-dependent PDEs

[Video: Navier–Stokes simulation.]

1. Consider a time-dependent probability measure µ(t, x) and a kernel K_t(x, y). We can define sharp discrepancy sequences t → (y^1(t), ..., y^N(t)).
2. For PDEs, we can try to compute these sequences directly. For instance, consider the Navier–Stokes equations (hyperbolic equations)
\[
\partial_t \mu + \nabla \cdot (\mu v) = 0, \qquad \partial_t (\mu v) + \nabla \cdot (\mu v \otimes v) = -\nabla p + \nabla \cdot (\mu \Sigma),
\]
\[
\nabla \cdot v = 0 \quad \text{(or energy conservation for non-Newtonian fluids)},
\]
together with Dirichlet / Neumann boundary conditions. We obtain a numerical scheme sharing some similarities with SPH (smoothed particle hydrodynamics): these are LAGRANGIAN MESHFREE METHODS.
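The Lagrangian character of such schemes can be caricatured with a toy example (ours, far simpler than a Navier–Stokes solver): particles advected by a prescribed divergence-free velocity field, so the empirical measure is carried along without any mesh.

```python
import numpy as np

def transport(pts, T=np.pi / 2, dt=1e-3):
    """Advect particles with forward Euler along v(x1, x2) = (-x2, x1),
    a divergence-free (rigid rotation) field: a toy stand-in for the moving grid."""
    for _ in range(int(T / dt)):
        v = np.stack([-pts[:, 1], pts[:, 0]], axis=1)
        pts = pts + dt * v
    return pts

pts0 = np.array([[1.0, 0.0], [0.0, 0.5], [-0.3, 0.4]])
pts1 = transport(pts0)
r0 = np.linalg.norm(pts0, axis=1)
r1 = np.linalg.norm(pts1, axis=1)
print(r1 - r0)   # radii nearly preserved: the flow is measure-preserving
```

A quarter turn rotates each particle while (nearly) preserving its radius; in the actual scheme the velocity is not given but computed from the particles themselves.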
Application to industrial finance

Consider µ(t, x) solution to a non-linear hyperbolic-parabolic Fokker–Planck equation
\[
\partial_t \mu - L\mu = 0, \qquad L\mu = \nabla \cdot (b\mu) + \nabla^2 \cdot (A\mu), \qquad A := \frac{1}{2} \sigma \sigma^T.
\]

1. FORWARD: compute µ(t) ∼ (1/N)(δ_{y^1(t)} + ... + δ_{y^N(t)}) as sharp discrepancy sequences:
\[
\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu(t, x) - \frac{1}{N} \sum_{n=1}^{N} \varphi(y^n(t)) \Big| \le \mathcal{E}(Y(t), H_{K_t})\, \|\varphi\|_{H_{K_t}}.
\]
2. CHECK the optimal rate: E(Y(t), H_{K_t}) ∼ E_{H_{K_t}}(N, D).
3. BACKWARD: interpret t → y^n(t), n = 1, ..., N, as a moving, transported PDE grid (a tree). Solve the Kolmogorov equation on it, with error estimate
\[
\Big| \int_{\mathbb{R}^D} P(t, \cdot)\, d\mu(t, \cdot) - \frac{1}{N} \sum_{n=1}^{N} P(t, y^n(t)) \Big| \le \mathcal{E}_{H_{K_t}}(N, D)\, \|P(t, \cdot)\|_{H_{K_t}}.
\]
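The FORWARD step can be caricatured with plain i.i.d. particles instead of sharp discrepancy sequences (so the error rate is only 1/√N): an Euler–Maruyama scheme for dX = b dt + σ dW transports the empirical measure (1/N) Σ δ_{y^n(t)}, whose law solves the Fokker–Planck equation. A sketch with a hypothetical Ornstein–Uhlenbeck drift b(x) = −x and σ = 1:

```python
import numpy as np

rng = np.random.default_rng(1)

def forward_particles(b, sigma, x0, T=5.0, dt=0.01, N=20000):
    """Euler-Maruyama particles y^n(t); their empirical measure approximates
    the Fokker-Planck solution mu(t, x) with drift b and diffusion sigma."""
    y = np.full(N, float(x0))
    for _ in range(int(T / dt)):
        y = y + b(y) * dt + sigma * np.sqrt(dt) * rng.standard_normal(N)
    return y

# Ornstein-Uhlenbeck drift b(x) = -x: mu(t) tends to N(0, sigma^2 / 2)
y = forward_particles(b=lambda x: -x, sigma=1.0, x0=2.0)
print(y.mean(), y.var())   # close to 0 and 0.5
```

The talk's scheme replaces these i.i.d. particles by sharp discrepancy sequences, which is what restores the faster ln(N)^{D−1}/N rate.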
Illustration: the 2D SABR process, widely used in finance

SABR process:
\[
d \begin{pmatrix} F_t \\ \alpha_t \end{pmatrix} = \rho \begin{pmatrix} \alpha_t F_t^{\beta} & 0 \\ 0 & \nu \alpha_t \end{pmatrix} \begin{pmatrix} dW_t^1 \\ dW_t^2 \end{pmatrix}, \qquad 0 \le \beta \le 1, \quad \nu \ge 0, \quad \rho \in \mathbb{R}^{2 \times 2}.
\]
The Fokker–Planck equation associated to SABR is
\[
\partial_t \mu + L^* \mu = 0, \qquad L^* \mu = \nabla^2 \cdot \left( \frac{1}{2}\, \rho \begin{pmatrix} x_2^2 x_1^{2\beta} & 0 \\ 0 & \nu^2 x_2^2 \end{pmatrix} \rho^T \mu \right).
\]

[Video: SABR particle simulation.]
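A minimal particle simulation of the SABR dynamics above (the parameters are hypothetical, the matrix ρ is realized through a scalar correlation between the two Brownian motions, and we use plain Monte-Carlo particles rather than sharp discrepancy sequences):

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical SABR parameters: F0, alpha0, beta, vol-of-vol nu, correlation r
F0, a0, beta, nu, r = 1.0, 0.2, 0.5, 0.3, -0.5
T, steps, N = 1.0, 200, 20000
dt = T / steps

F = np.full(N, F0)
a = np.full(N, a0)
for _ in range(steps):
    z1 = rng.standard_normal(N)
    z2 = r * z1 + np.sqrt(1 - r**2) * rng.standard_normal(N)  # correlated Brownians
    # full-truncation Euler for F (absorbed at 0 so F^beta stays well defined)
    F = np.maximum(F + a * np.maximum(F, 0.0) ** beta * np.sqrt(dt) * z1, 0.0)
    a = a * np.exp(nu * np.sqrt(dt) * z2 - 0.5 * nu**2 * dt)  # exact step for d(alpha) = nu alpha dW
print(F.mean(), a.mean())   # both are martingales: close to F0 = 1 and a0 = 0.2
```

The terminal particles (F, α) form the empirical approximation of µ(T, ·); in the talk's framework they would in addition be rearranged into a sharp discrepancy sequence at each time step.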
The curse of dimensionality

CURSE of dimensionality in finance: price and manage a complex option written on several underlyings.

1. Step 1: compute a measure solution µ(t, x) to a Fokker–Planck equation in large dimension, and calibrate it.
2. Step 2: solve a Kolmogorov equation backward in large dimension; denote the solution P(t, x).
3. Step 3: compute various metrics on the solution, for instance VaR or XVA for regulatory purposes, or future deltas / gammas / implied vols for hedging purposes.
4. Result? We can compute the solution P(t, x) at any order of accuracy:
\[
\Big| \int_{\mathbb{R}^D} P(t, \cdot)\, d\mu(t, \cdot) - \frac{1}{N} \sum_{n=1}^{N} P(t, y^n(t)) \Big| \le \frac{\|P(t, \cdot)\|_{H_K}}{N^{\alpha}},
\]
where α ≥ 1/2 is any number. Choose it according to your desired electricity bill! But beware of smoothing effects in high dimensions: H_K contains less information as the dimension rises. Some problems, for instance optimal stopping problems, are intrinsically cursed.
Academic tests, business cases

1. Academic works: finance, non-linear hyperbolic systems
   1. Revisiting the method of characteristics via a convex hull algorithm: explicit solutions to high-dimensional conservation laws with non-convex fluxes.
   2. Numerical results using CoDeFi: a benchmark of TMM methods on classical pricing problems.
2. Business cases (done)
   1. Hedging Strategies for Net Interest Income and Economic Values of Equity (http://dx.doi.org/10.2139/ssrn.3454813, with S. Miryusupov).
   2. Computing metrics for a big portfolio of autocalls depending on several underlyings (unpublished).
3. Under work
   1. McKean–Vlasov equations (stochastic volatility modeling).
   2. ISDA Standard Initial Margin: XVA computations based on sensitivities (delta / vega / gamma).
   3. Transition from IBOR to RFR rates à la Lyashenko–Mercurio.
   4. Strategies for liquidity risk: Hamilton–Jacobi–Bellman equations in high dimensions.
Summary and Conclusions

We presented in this talk:

1. New sharp estimates for Monte-Carlo methods...
2. ...that can be used in a wide variety of contexts to perform a sharp error analysis.
3. A new method for the numerical simulation of PDEs: Transported Meshfree Methods...
4. ...that can be used in a wide variety of applications (hyperbolic / parabolic equations, artificial intelligence, etc.)...
5. ...for which the error analysis applies: we can guarantee a worst-case error estimate, and we can check that this error matches an optimal convergence rate.
6. ...Thus we can argue that our numerical methods reach nearly optimal algorithmic complexity.

 
Optimization tutorial
Optimization tutorialOptimization tutorial
Optimization tutorial
 

Kürzlich hochgeladen

Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationMichael W. Hawkins
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphNeo4j
 
How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?XfilesPro
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...HostedbyConfluent
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksSoftradix Technologies
 
Azure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAzure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAndikSusilo4
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhisoniya singh
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdfhans926745
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 

Kürzlich hochgeladen (20)

Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
GenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day PresentationGenCyber Cyber Security Day Presentation
GenCyber Cyber Security Day Presentation
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
Neo4j - How KGs are shaping the future of Generative AI at AWS Summit London ...
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
 
How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
 
Benefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other FrameworksBenefits Of Flutter Compared To Other Frameworks
Benefits Of Flutter Compared To Other Frameworks
 
Azure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAzure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & Application
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf[2024]Digital Global Overview Report 2024 Meltwater.pdf
[2024]Digital Global Overview Report 2024 Meltwater.pdf
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 

Pres metabief2020jmm

  • 1. A computational framework based over Transported Meshfree methods. P.G. LeFloch (CNRS), J.M. Mercier (MPG-Partners). 16 01 2020.
  • 2–4. Foundations: local integration with Monte-Carlo methods. Consider the following family of worst-case error estimates ($\mu$ a probability measure, $Y = (y^1, \dots, y^N) \in \mathbb{R}^{N \times D}$):
    $$\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu - \frac{1}{N} \sum_{n=1}^{N} \varphi(y^n) \Big| \le \mathcal{E}(Y, H_\mu)\, \|\varphi\|_{H_\mu},$$
    where $H_\mu$ is a $\mu$-weighted Hilbert (or Banach) functional space.
    1. Classical example 1: $Y$ i.i.d. $\Rightarrow$ $\mathcal{E}(Y, H_\mu) \sim 1/\sqrt{N}$ and $H_\mu \sim L^2(\mathbb{R}^D, |x|^2 d\mu)$ (law of large numbers): the most used convergence rate in the finance industry.
    2. Classical example 2: $Y$ a Sobol sequence, $\mu = dx$ on $\Omega = [0,1]^D$. Then $H_K = BV(\Omega)$ (bounded variation) and $\mathcal{E}(Y, H_K) \ge \ln(N)^{D-1}/N$ (the sharp Koksma-Hlawka estimate conjecture).
    3. Other examples: quantizers, wavelets, deep feed-forward neural networks, ...
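  The $1/\sqrt{N}$ rate of classical example 1 can be checked numerically. A minimal sketch (the integrand $\varphi(x) = x^2$ against $\mu = N(0,1)$, and all function names, are illustrative choices, not part of the talk):

  ```python
  import numpy as np

  def mc_error(phi, sampler, exact, N, reps=200, seed=0):
      """Mean absolute Monte-Carlo integration error over `reps` replications."""
      rng = np.random.default_rng(seed)
      errs = [abs(phi(sampler(rng, N)).mean() - exact) for _ in range(reps)]
      return float(np.mean(errs))

  # Integrate phi(x) = x^2 against mu = N(0, 1); the exact value is 1.
  phi = lambda y: y ** 2
  sampler = lambda rng, N: rng.standard_normal(N)

  e_small = mc_error(phi, sampler, exact=1.0, N=100)
  e_large = mc_error(phi, sampler, exact=1.0, N=10_000)
  print(e_small / e_large)  # roughly 10: error ~ 1/sqrt(N) and N grew by 100x
  ```

  Averaging over replications smooths out the randomness; a hundredfold increase in $N$ should shrink the error by roughly a factor of ten.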
  • 5–8. A general approach using kernel methods.
    1. You have a problem involving a probability measure $\mu$ and you guess that the solution belongs to a weighted functional space $H_\mu$.
    2. Identify the admissible kernel $K(x, y)$ generating it (RKHS theory): $H_\mu \equiv H_K$. Examples of classically used kernels: ReLU, convolutional kernels, Wendland functions, ...
    3. Pick (i.i.d.) samples $y^1, \dots, y^N$. Then you can measure your integration error using
    $$\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu - \frac{1}{N} \sum_{n=1}^{N} \varphi(y^n) \Big| \le \mathcal{E}(Y, H_K)\, \|\varphi\|_{H_K},$$
    where
    $$\mathcal{E}^2(Y, H_K) = \int_{\mathbb{R}^{2D}} K(x, y)\, d\mu(x)\, d\mu(y) + \frac{1}{N^2} \sum_{n,m=1}^{N} K(y^n, y^m) - \frac{2}{N} \sum_{n=1}^{N} \int_{\mathbb{R}^D} K(x, y^n)\, d\mu(x).$$
    4. You can optimize your error by computing sharp discrepancy sequences and the optimal discrepancy error:
    $$\bar{Y} = \arg\inf_{Y \in \mathbb{R}^{D \times N}} \mathcal{E}(Y, H_K), \qquad \mathcal{E}_{H_K}(N, D) = \mathcal{E}(\bar{Y}, H_K).$$
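  The discrepancy formula $\mathcal{E}^2(Y, H_K)$ above is computable whenever the kernel embeddings $\int K(x, y)\, d\mu(x)$ are known. A sketch for the Gaussian kernel against $\mu = N(0,1)$ in 1D, where both integrals have closed forms (the quantile construction below is only an illustrative stand-in for a true sharp discrepancy sequence):

  ```python
  import numpy as np
  from math import erf

  def gaussian_kernel_discrepancy(y, s=1.0):
      """E(Y, H_K) for K(x, y) = exp(-(x - y)^2 / (2 s^2)) and mu = N(0, 1),
      using the closed-form Gaussian kernel embeddings:
        int  K(x, y) dmu(x)         = s / sqrt(1 + s^2) * exp(-y^2 / (2 (1 + s^2)))
        iint K(x, y) dmu(x) dmu(y)  = s / sqrt(2 + s^2)."""
      double = s / np.sqrt(2.0 + s ** 2)
      single = s / np.sqrt(1.0 + s ** 2) * np.exp(-y ** 2 / (2.0 * (1.0 + s ** 2)))
      gram = np.exp(-(y[:, None] - y[None, :]) ** 2 / (2.0 * s ** 2))
      e2 = double + gram.mean() - 2.0 * single.mean()
      return float(np.sqrt(max(e2, 0.0)))

  def norm_ppf(p):
      """Standard normal quantile by bisection on the CDF (stdlib only)."""
      lo, hi = -8.0, 8.0
      for _ in range(60):
          mid = 0.5 * (lo + hi)
          if 0.5 * (1.0 + erf(mid / np.sqrt(2.0))) < p:
              lo = mid
          else:
              hi = mid
      return 0.5 * (lo + hi)

  rng = np.random.default_rng(0)
  N = 200
  e_iid = gaussian_kernel_discrepancy(rng.standard_normal(N))
  # Quantile points: a crude deterministic stand-in for a sharp discrepancy sequence.
  e_quant = gaussian_kernel_discrepancy(np.array([norm_ppf((n + 0.5) / N) for n in range(N)]))
  print(e_quant, e_iid)
  ```

  The better-spread quantile points yield a visibly smaller worst-case integration error than i.i.d. samples, which is exactly what optimizing $Y$ exploits.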
  • 9–10. Our local kernels: lattice-based and transported kernels. For our purposes we crafted two kinds of kernels:
    1. Lattice-based kernel (suited to Lebesgue-type measures $\mu = dx$ on $\Omega$). Let $L$ be a lattice and $L^*$ its dual lattice. Consider any discrete function satisfying $\varphi \in \ell^1(L^*)$, $\varphi(\alpha^*) \ge 0$, $\varphi(0) = 1$, and define
    $$K_{per}(x, y) = \frac{1}{|L|} \sum_{\alpha^* \in L^*} \varphi(\alpha^*)\, e^{2i\pi \langle x - y, \alpha^* \rangle}.$$
    [Plots: Matern, multiquadric, Gaussian, and truncated kernels.]
    2. Transported kernel: given a transport map $S: \Omega \to \mathbb{R}^D$, set $K_{tra}(x, y) = K(S(x), S(y))$.
    [Plots: the same four kernels after transport.]
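  A lattice-based kernel of the form above can be assembled directly from a truncated Fourier sum. A sketch on $\Omega = [0, 1)$ with $L = \mathbb{Z}$ and a Matern-like symbol (the symbol and truncation level are illustrative choices):

  ```python
  import numpy as np

  def lattice_kernel(x, y, tau=5.0, M=50):
      """Periodic kernel on [0, 1): K(x, y) = sum_{|a| <= M} phi(a) exp(2 i pi (x - y) a),
      with a Matern-like symbol phi(a) = 1 / (1 + 4 pi^2 a^2 / tau^2), so that phi is
      summable, nonnegative and phi(0) = 1 (the admissibility conditions above)."""
      a = np.arange(-M, M + 1)
      phi = 1.0 / (1.0 + 4.0 * np.pi ** 2 * a ** 2 / tau ** 2)
      diff = np.subtract.outer(x, y)[..., None]          # pairwise x - y
      return np.real(np.sum(phi * np.exp(2j * np.pi * diff * a), axis=-1))

  x = np.linspace(0.0, 1.0, 8, endpoint=False)
  K = lattice_kernel(x, x)
  print(np.allclose(K, K.T))                             # symmetric
  print(np.all(np.linalg.eigvalsh(K) > -1e-9))           # positive semi-definite
  print(np.allclose(np.diag(K), K[0, 0]))                # translation invariant on the torus
  ```

  Because $\varphi \ge 0$, the Gram matrix is positive semi-definite by construction, which is what makes the kernel admissible.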
  • 11–13. Example I: Monte-Carlo integration with the Matern kernel.
    1. Kernel, random, and computed sequences $Y$ ($N = 256$, $D = 2$). [Plots: the Matern kernel, 256 random points, and 256 computed points for the lattice Matern kernel.]
    2. Optimal discrepancy error $\to$ a Koksma-Hlawka-type estimate:
    $$\mathcal{E}_{H_K}(N, D) \sim \frac{1}{N} \sum_{n > N} \varphi(\alpha^{*n}) \sim \frac{\ln(N)^{D-1}}{N}, \qquad \varphi(\alpha) = \prod_{d=1}^{D} \frac{2}{1 + 4\pi^2 \alpha_d^2 / \tau^2}.$$
    3. $\mathcal{E}(Y, H_K)$ for random $Y$, vs. $\mathcal{E}(\bar{Y}, H_K)$ for computed $\bar{Y}$, vs. the theoretical rate $\mathcal{E}_{H_K}(N, D)$:

    Random Y:        D=1      D=16     D=128
      N=16           0.228    0.304    0.319
      N=128          0.117    0.111    0.115
      N=512          0.035    0.054    0.059
    Computed Y:      D=1      D=16     D=128
      N=16           0.062    0.211    0.223
      N=128          0.008    0.069    0.077
      N=512          0.002    0.034    0.049
    Theoretical:     D=1      D=16     D=128
      N=16           0.062    0.288    0.323
      N=128          0.008    0.077    0.105
      N=512          0.002    0.034    0.043
  • 14–17. Application: machine learning.
    1. Setting: consider a set of observations $(y^1, P^1), \dots, (y^N, P^N) \in (\mathbb{R}^D \times \mathbb{R}^M)^N$.
    2. Interpolation: pick a kernel $K(x, y)$, denote by $H_K$ its native space, and consider a continuous function $P(y)$ such that $\langle P, \delta_{y^n} \rangle = P(y^n) \sim P^n$. One can further optimize by computing $\bar{Y}$ ($\sim$ learning).
    3. Extrapolation: one can then extrapolate with the error bound
    $$\Big| \int_{\mathbb{R}^D} P(x)\, d\mu - \frac{1}{N} \sum_{n=1}^{N} P(y^n) \Big| \le \mathcal{E}_{H_K}(N, D)\, \|P\|_{H_K}, \qquad \text{i.e. } \mu \sim \frac{1}{N} \sum_{n=1}^{N} \delta_{y^n}.$$
    4. Here are two very similar applications:
      1. $(y^n, P^n)$ are prices and implied volatilities (e.g. call options under the SABR model): pricing.
      2. $(y^n, P^n)$ are pictures of dogs and cats: a classifier.
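  The interpolation step can be sketched as solving the Gram system $K(Y, Y)\, c = P$ and evaluating $P(x) = \sum_n c_n K(x, y^n)$. A minimal 1D example (the Gaussian kernel, target function, and regularization level are illustrative assumptions):

  ```python
  import numpy as np

  def kernel_interpolate(y, p, kernel, reg=1e-8):
      """Fit P(x) = sum_n c_n K(x, y^n) with P(y^n) ~ p^n by solving the Gram
      system; a tiny Tikhonov term `reg` keeps it numerically well posed."""
      G = kernel(y[:, None], y[None, :]) + reg * np.eye(len(y))
      c = np.linalg.solve(G, p)
      return lambda x: kernel(np.asarray(x)[:, None], y[None, :]) @ c

  # Gaussian kernel with lengthscale 0.1 (an illustrative choice).
  gauss = lambda a, b: np.exp(-(a - b) ** 2 / (2.0 * 0.1 ** 2))

  y = (np.arange(40) + 0.5) / 40.0                  # observation sites on [0, 1]
  P = kernel_interpolate(y, np.sin(2.0 * np.pi * y), gauss)

  x = np.linspace(0.0, 1.0, 101)                    # held-out evaluation grid
  err = float(np.max(np.abs(P(x) - np.sin(2.0 * np.pi * x))))
  print(err)  # small: the interpolant generalizes off the observation sites
  ```

  The same fit-then-evaluate pattern covers both applications above; only the observations $(y^n, P^n)$ change.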
  • 18–19. Application to time-dependent PDEs (video: Navier-Stokes).
    1. Consider a time-dependent probability measure $\mu(t, x)$ and a kernel $K_t(x, y)$. We can define sharp discrepancy sequences $t \mapsto (y^1(t), \dots, y^N(t))$.
    2. For PDEs, we can try to compute these sequences. For instance, consider the Navier-Stokes equations (hyperbolic equations)
    $$\partial_t \mu = \nabla \cdot (v \mu), \qquad \partial_t(\mu v) + \nabla \cdot (\mu v \otimes v) = -\nabla p + \nabla \cdot (\mu \Sigma), \qquad \nabla \cdot v = 0$$
    (or energy conservation for non-Newtonian fluids), together with Dirichlet/Neumann boundary conditions. We obtain a numerical scheme sharing some similarities with SPH (smoothed particle hydrodynamics): these are LAGRANGIAN MESHFREE METHODS.
  • 20–22. Application to industrial finance. Consider $\mu(t, x)$ a solution to the non-linear hyperbolic-parabolic Fokker-Planck equation
    $$\partial_t \mu - L\mu = 0, \qquad L\mu = \nabla \cdot (b \mu) + \nabla^2 \cdot (A \mu), \qquad A := \tfrac{1}{2} \sigma \sigma^T.$$
    1. FORWARD: compute $\mu(t) \sim \frac{1}{N}\big(\delta_{y^1(t)} + \dots + \delta_{y^N(t)}\big)$ as sharp discrepancy sequences:
    $$\Big| \int_{\mathbb{R}^D} \varphi(x)\, d\mu(t, x) - \frac{1}{N} \sum_{n=1}^{N} \varphi(y^n(t)) \Big| \le \mathcal{E}\big(Y(t), H_{K_t}\big)\, \|\varphi\|_{H_{K_t}}.$$
    2. CHECK the optimal rate: $\mathcal{E}(Y(t), H_{K_t}) \sim \mathcal{E}_{H_{K_t}}(N, D)$.
    3. BACKWARD: interpret $t \mapsto y^n(t)$, $n = 1, \dots, N$, as a moving, transported PDE grid (a tree), and solve the Kolmogorov equation on it. Error estimate:
    $$\Big| \int_{\mathbb{R}^D} P(t, \cdot)\, d\mu(t, \cdot) - \frac{1}{N} \sum_{n=1}^{N} P(t, y^n(t)) \Big| \le \mathcal{E}_{H_{K_t}}(N, D)\, \|P(t, \cdot)\|_{H_{K_t}}.$$
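  The FORWARD step transports particles so that their empirical measure tracks $\mu(t)$. A minimal sketch using plain Euler-Maruyama on the underlying SDE (i.i.d. particles rather than a sharp discrepancy sequence, so this illustrates only the transport, not the optimal rate):

  ```python
  import numpy as np

  def transport_particles(y0, b, sigma, T, dt, rng):
      """Euler-Maruyama transport of N particles under dX = b(X) dt + sigma dW,
      so that mu(t) ~ (1/N) sum_n delta_{y^n(t)} tracks the Fokker-Planck flow
      with drift b and A = sigma^2 / 2 (i.i.d. particles, illustrative only)."""
      y = y0.copy()
      for _ in range(int(round(T / dt))):
          y = y + b(y) * dt + sigma * np.sqrt(dt) * rng.standard_normal(y.shape)
      return y

  rng = np.random.default_rng(2)
  y0 = np.full(5000, 3.0)                  # all particles start far from equilibrium
  y = transport_particles(y0, b=lambda x: -x, sigma=np.sqrt(2.0), T=5.0, dt=0.01, rng=rng)
  # Ornstein-Uhlenbeck with b(x) = -x, sigma^2 = 2 relaxes to the standard normal.
  print(y.mean(), y.var())
  ```

  After a few relaxation times the particle cloud's mean and variance match the stationary law, i.e. the empirical measure has converged to $\mu(\infty)$.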
  • 23. Illustration: the 2D SABR process, widely used in finance. The SABR process is
    $$d \begin{pmatrix} F_t \\ \alpha_t \end{pmatrix} = \rho \begin{pmatrix} \alpha_t F_t^{\beta} & 0 \\ 0 & \nu \alpha_t \end{pmatrix} \begin{pmatrix} dW_t^1 \\ dW_t^2 \end{pmatrix}, \qquad 0 \le \beta \le 1, \; \nu \ge 0, \; \rho \in \mathbb{R}^{2 \times 2}.$$
    The Fokker-Planck equation associated to SABR is
    $$\partial_t \mu + L^* \mu = 0, \qquad L^* \mu = \nabla^2 \cdot \Big( \frac{x_2^2}{2}\, \rho \begin{pmatrix} x_1^{2\beta} & 0 \\ 0 & \nu^2 \end{pmatrix} \rho^T \mu \Big).$$
    (Video: a 2D SABR simulation.)
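  The SABR dynamics above can be simulated directly. A minimal Euler-Maruyama sketch (a scalar correlation $\rho$ instead of the full $2 \times 2$ matrix, and absorption of $F$ at 0, are simplifying assumptions):

  ```python
  import numpy as np

  def sabr_paths(f0, a0, beta, nu, rho, T, steps, n, rng):
      """Euler-Maruyama sketch of SABR: dF = alpha F^beta dW1, dalpha = nu alpha dW2,
      corr(dW1, dW2) = rho. The alpha step uses the exact lognormal update; F is
      absorbed at 0 to keep F^beta defined."""
      dt = T / steps
      F = np.full(n, f0)
      A = np.full(n, a0)
      for _ in range(steps):
          z1 = rng.standard_normal(n)
          z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)
          F = np.maximum(F + A * F ** beta * np.sqrt(dt) * z1, 0.0)
          A = A * np.exp(nu * np.sqrt(dt) * z2 - 0.5 * nu ** 2 * dt)
      return F, A

  rng = np.random.default_rng(3)
  F, A = sabr_paths(f0=1.0, a0=0.3, beta=0.7, rho=-0.3, nu=0.4, T=1.0, steps=200, n=20000, rng=rng)
  print(F.mean())  # F is a martingale, so this stays close to f0 = 1
  ```

  The particle cloud $(F, A)$ is exactly the kind of empirical approximation of $\mu(t)$ that the forward/backward machinery above works with.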
  • 24–27. The curse of dimensionality. The curse of dimensionality in finance: price and manage a complex option written on several underlyings.
    1. Step 1: compute a measure solution $\mu(t, x)$ to a Fokker-Planck equation in large dimension, and calibrate it.
    2. Step 2: solve a Kolmogorov equation backward in large dimension; denote the solution $P(t, x)$.
    3. Step 3: compute various metrics on the solution, for instance VaR or XVA for regulatory purposes, or future deltas/gammas/implied volatilities for hedging purposes.
    4. Result? We can compute the solution $P(t, x)$ at any order of accuracy:
    $$\Big| \int_{\mathbb{R}^D} P(t, \cdot)\, d\mu(t, \cdot) - \frac{1}{N} \sum_{n=1}^{N} P(t, y^n(t)) \Big| \le \frac{\|P(t, \cdot)\|_{H_K}}{N^{\alpha}},$$
    where $\alpha \ge 1/2$ is any number: choose it according to your desired electricity bill! But beware of smoothing effects in high dimensions: $H_K$ contains less information as the dimension rises. Some problems, for instance optimal stopping problems, are intrinsically cursed.
  • 28–30. Academic tests, business cases.
    1. Academic work: finance, non-linear hyperbolic systems.
      1. Revisiting the method of characteristics via a convex hull algorithm: explicit solutions to high-dimensional conservation laws with non-convex fluxes.
      2. Numerical results using CoDeFi: a benchmark of TMM methods for classical pricing problems.
    2. Business cases, done:
      1. Hedging Strategies for Net Interest Income and Economic Values of Equity (http://dx.doi.org/10.2139/ssrn.3454813, with S. Miryusupov).
      2. Computing metrics for a big portfolio of autocalls depending on several underlyings (unpublished).
    3. Under work:
      1. McKean-Vlasov equations (stochastic volatility modeling).
      2. ISDA Standard Initial Margin: XVA computations based on sensitivities (delta/vega/gamma).
      3. The IBOR/RFR rates transition a la Lyashenko-Mercurio.
      4. Strategies for liquidity risk: Hamilton-Jacobi-Bellman equations in high dimensions.
  • 31–36. Summary and conclusions. We presented in this talk:
    1. New, sharp estimates for Monte-Carlo methods...
    2. ...that can be used in a wide variety of contexts to perform a sharp error analysis.
    3. A new method for the numerical simulation of PDEs: Transported Meshfree Methods...
    4. ...that can be used in a wide variety of applications (hyperbolic/parabolic equations, artificial intelligence, etc.)...
    5. ...for which the error analysis applies: we can guarantee a worst-case error estimate, and we can check that this error matches an optimal convergence rate.
    6. ...Thus we can argue that our numerical methods reach nearly optimal algorithmic complexity.