Complex Support Vector Machines for Quaternary Classification

P. Bouboulis, E. Theodoridou, S. Theodoridis
Department of Informatics and Telecommunications
University of Athens
Athens, Greece
23-09-2013
Outline
1 Introduction
Reproducing Kernel Hilbert Spaces
Complex RKHS
2 Support Vector Machines
Linear SVMs
Non-linear SVM
3 The Complex Case
Complex Hyperplanes
Problem formulation
Experiments
Reproducing Kernel Hilbert Spaces
Consider a linear class H of real (complex) valued functions f defined on a set X (in particular, H is a Hilbert space), for which there exists a function (kernel) κ : X × X → R (C) with the following two properties:
1 For every x ∈ X, κ(x, ·) belongs to H.
2 κ has the so-called reproducing property, i.e.,
f(x) = ⟨f, κ(x, ·)⟩_H, for all f ∈ H, x ∈ X.        (1)
Then H is called a Reproducing Kernel Hilbert Space (RKHS) associated with the kernel κ.
Kernel Trick
The notion of RKHS is a popular tool for treating non-linear learning tasks.
Usually this is attained by the so-called "kernel trick".
If
X ∋ x → Φ(x) := κ(x, ·) ∈ H,
X ∋ y → Φ(y) := κ(y, ·) ∈ H,
then the inner product in H is given as a function computed on X:
κ(x, y) = ⟨κ(x, ·), κ(y, ·)⟩_H        (kernel trick)
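A minimal numerical illustration of the kernel trick (added here; the particular kernel and feature map are our choice, not part of the slides): for the degree-2 homogeneous polynomial kernel on R², the kernel evaluation κ(x, y) = (xᵀy)² coincides with the inner product of the explicit feature maps Φ(x) = (x₁², √2·x₁x₂, x₂²).

```python
import numpy as np

def poly2_kernel(x, y):
    """Degree-2 homogeneous polynomial kernel: k(x, y) = (x^T y)^2."""
    return float(np.dot(x, y)) ** 2

def poly2_feature_map(x):
    """Explicit feature map Phi(x) whose inner products reproduce poly2_kernel."""
    return np.array([x[0] ** 2, np.sqrt(2.0) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

k_val = poly2_kernel(x, y)                                            # computed on X
ip_val = float(np.dot(poly2_feature_map(x), poly2_feature_map(y)))    # inner product in H

print(k_val, ip_val)               # both equal 1.0 here: (1*3 + 2*(-1))^2 = 1
assert np.isclose(k_val, ip_val)   # the kernel trick identity
```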
Developing Learning Algorithms in RKHS
The black box approach:
Develop the learning algorithm in X.
Express it, if possible, in terms of inner products.
Choose a kernel function κ.
Replace inner products with kernel evaluations according to the kernel trick.
Alternatively, work directly in the RKHS, assuming that the data have been mapped and live in the RKHS H, i.e., X ∋ x → Φ(x) := κ(x, ·) ∈ H.
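As a toy instance of this recipe (our own sketch, not from the slides), the kernel perceptron below is obtained by writing the classical perceptron entirely in terms of inner products and then replacing them with evaluations of a chosen kernel, here a real Gaussian one.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Real Gaussian kernel."""
    return np.exp(-np.sum((x - y) ** 2) / sigma ** 2)

def kernel_perceptron(X, d, kernel, epochs=10):
    """Perceptron written purely via inner products, kernelized by the kernel trick:
    the decision value is f(x) = sum_m alpha_m d_m k(x_m, x)."""
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        for n in range(len(X)):
            f_n = sum(alpha[m] * d[m] * kernel(X[m], X[n]) for m in range(len(X)))
            if d[n] * f_n <= 0:        # misclassified sample -> update its coefficient
                alpha[n] += 1.0
    return alpha

# Toy data: a blob around the origin (class +1) against a surrounding ring (class -1).
rng = np.random.default_rng(0)
X_in = rng.normal(scale=0.3, size=(20, 2))
theta = rng.uniform(0, 2 * np.pi, 20)
X_out = np.c_[2 * np.cos(theta), 2 * np.sin(theta)]
X, d = np.vstack([X_in, X_out]), np.array([1] * 20 + [-1] * 20)

alpha = kernel_perceptron(X, d, gaussian_kernel)
score = lambda x: sum(alpha[m] * d[m] * gaussian_kernel(X[m], x) for m in range(len(X)))
print(np.sign(score(np.array([0.0, 0.0]))), np.sign(score(np.array([2.0, 0.0]))))  # expected: 1.0 -1.0
```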
Advantages
Advantages of kernel-based learning tasks:
The original nonlinear task is transformed into a linear one.
Different types of nonlinearities can be treated in a unified way.
Complex RKHS
Although the theory of RKHS holds for complex spaces too, most of the kernel-based learning techniques were designed to process real data only.
Moreover, complex kernel functions have largely been ignored in the related literature.
Recently, however, a unified kernel-based framework, which is able to treat complex data, has been presented.
This machinery transforms the input data into a complex RKHS, i.e.,
Φ(z) = κ_C(·, z),
and employs Wirtinger's calculus to derive the respective gradients.
Complex RKHS
Definitions:
ℍ denotes a complex RKHS, while H denotes a real RKHS.
The complex RKHS can be expressed as ℍ = H + iH.
ℍ is isomorphic to the doubled real space H².
Complex Kernels
The complex Gaussian kernel: κ(z, w) = exp( −Σ_{i=1}^{d} (z_i − w_i*)² / σ² ),
The Szego kernel: κ(z, w) = 1 / (1 − wᴴz),
The Bergman kernel: κ(z, w) = 1 / (1 − wᴴz)².
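A small numpy sketch of these kernels (ours, for illustration; the function names are our own). Note that the complex Gaussian kernel is evaluated on (z_i − w_i*)², not on |z_i − w_i|², so its values are in general complex.

```python
import numpy as np

def complex_gaussian_kernel(z, w, sigma=1.0):
    """Complex Gaussian kernel: exp(-sum_i (z_i - conj(w_i))^2 / sigma^2)."""
    z, w = np.asarray(z, dtype=complex), np.asarray(w, dtype=complex)
    return np.exp(-np.sum((z - np.conj(w)) ** 2) / sigma ** 2)

def szego_kernel(z, w):
    """Szego kernel: 1 / (1 - w^H z); np.vdot conjugates its first argument."""
    return 1.0 / (1.0 - np.vdot(np.asarray(w, dtype=complex), np.asarray(z, dtype=complex)))

def bergman_kernel(z, w):
    """Bergman kernel: 1 / (1 - w^H z)^2."""
    return szego_kernel(z, w) ** 2

z = np.array([0.3 + 0.2j, -0.1 + 0.4j])
w = np.array([0.1 - 0.3j, 0.2 + 0.1j])
print(complex_gaussian_kernel(z, w, sigma=2.0))   # a complex number in general
print(szego_kernel(z, w), bergman_kernel(z, w))
```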
Wirtinger Calculus
Complex differentiability is a very strict notion.
In learning tasks that involve complex data, we often encounter functions (e.g., the cost functions, which take values in R) that ARE NOT complex differentiable.
Example: f(z) = |z|² = zz*.
In these cases one has to express the cost function in terms of its real part f_r and its imaginary part f_i, and use real differentiation with respect to f_r, f_i.
Wirtinger's Calculus
This approach usually leads to cumbersome and tedious calculations.
Wirtinger's calculus provides an equivalent alternative formulation.
It is based on simple rules and principles.
These rules bear a great resemblance to the rules of the standard complex derivative.
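As a short worked example of such a rule (added here; it follows the standard definition of the Wirtinger derivatives), take the non-complex-differentiable cost f(z) = |z|² = zz* with z = x + iy:

```latex
% Wirtinger derivatives of f(z) = |z|^2 = z z^*, with z = x + iy, f = x^2 + y^2.
\frac{\partial f}{\partial z}
  = \frac{1}{2}\left(\frac{\partial f}{\partial x} - i\frac{\partial f}{\partial y}\right)
  = \frac{1}{2}\,(2x - i\,2y) = z^{*},
\qquad
\frac{\partial f}{\partial z^{*}}
  = \frac{1}{2}\left(\frac{\partial f}{\partial x} + i\frac{\partial f}{\partial y}\right)
  = \frac{1}{2}\,(2x + i\,2y) = z .
```

The same results are obtained by formally treating z and z* as independent variables, which is the kind of simple, derivative-like rule the slides refer to.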
The primal problem
Suppose we are given training data, which belong to two separate classes C+, C−, i.e., {(x_n, d_n); n = 1, . . . , N} ⊂ X × {±1}, where if d_n = +1, then the n-th sample belongs to C+, while if d_n = −1, then the n-th sample belongs to C−.
For example:
Figure: Training points belonging to two classes.
The primal problem
The goal of the SVM task is to estimate the maximum margin hyperplane (wᵀx + c = 0) that separates the points of the two classes as best as possible:

minimize_{w ∈ X, c ∈ R}  (1/2)‖w‖² + (C/N) Σ_{n=1}^{N} ξ_n
subject to  d_n (wᵀx_n + c) ≥ 1 − ξ_n,  ξ_n ≥ 0,  for n = 1, . . . , N.

Note that C is chosen a priori.
Physical justification
The same optimization task can be read geometrically: maximizing the margin corresponds to minimizing ‖w‖², while the slack variables ξ_n penalize points that fall inside the margin or on the wrong side of the hyperplane.
Figure: Linear SVM.
The dual problem
To solve this task, we usually consider the dual problem derived from the Lagrangian:

maximize_{a ∈ R^N}  Σ_{n=1}^{N} a_n − (1/2) Σ_{n,m=1}^{N} a_n a_m d_n d_m x_mᵀ x_n
subject to  Σ_{n=1}^{N} a_n d_n = 0  and  a_n ∈ [0, C/N].
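For reference, a minimal scikit-learn sketch of this linear soft-margin task on toy data (our own illustration; the slides do not prescribe a solver). Note that scikit-learn weights the slack term by C directly, whereas the formulation above uses C/N, so the two parameters differ by the factor N.

```python
import numpy as np
from sklearn.svm import SVC

# Toy (almost) linearly separable data in R^2 with labels d_n in {+1, -1}.
rng = np.random.default_rng(1)
X_pos = rng.normal(loc=[2.0, 2.0], scale=0.8, size=(30, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=0.8, size=(30, 2))
X = np.vstack([X_pos, X_neg])
d = np.array([1] * 30 + [-1] * 30)

C, N = 5.0, len(X)
clf = SVC(kernel="linear", C=C / N)   # sklearn uses C * sum(xi); the slides use (C/N) * sum(xi)
clf.fit(X, d)

w, c = clf.coef_[0], clf.intercept_[0]          # hyperplane w^T x + c = 0
print("w =", w, "c =", c)
print("support vectors:", len(clf.support_))
print("training accuracy:", clf.score(X, d))
```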
The kernel trick
Choose a positive definite kernel κ_R.
In the dual problem, replace the inner products x_nᵀ x_m with the respective kernel evaluations, i.e., κ_R(x_n, x_m).
The application of the kernel trick leads to the non-linear SVM:

maximize_{a ∈ R^N}  Σ_{n=1}^{N} a_n − (1/2) Σ_{n,m=1}^{N} a_n a_m d_n d_m κ_R(x_m, x_n)
subject to  Σ_{n=1}^{N} a_n d_n = 0  and  a_n ∈ [0, C/N].
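In practice, the kernelized dual can be handed to any standard SVM solver through a precomputed Gram matrix; a sketch of this (ours, using scikit-learn's precomputed-kernel interface and a real Gaussian kernel) is shown below. The same mechanism is reused later for the complex case.

```python
import numpy as np
from sklearn.svm import SVC

def gaussian_gram(A, B, sigma=2.0):
    """Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / sigma^2)."""
    sq = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / sigma ** 2)

# Toy non-linearly separable data: an inner blob vs. a surrounding ring.
rng = np.random.default_rng(2)
X_in = rng.normal(scale=0.5, size=(40, 2))
theta = rng.uniform(0, 2 * np.pi, 40)
X_out = np.c_[3 * np.cos(theta), 3 * np.sin(theta)] + rng.normal(scale=0.2, size=(40, 2))
X = np.vstack([X_in, X_out])
d = np.array([1] * 40 + [-1] * 40)

K_train = gaussian_gram(X, X)
clf = SVC(kernel="precomputed", C=2.0)
clf.fit(K_train, d)

X_test = np.array([[0.0, 0.0], [3.0, 0.0]])
K_test = gaussian_gram(X_test, X)              # rows: test points, columns: training points
print(clf.predict(K_test))                     # expected: [ 1 -1 ]
```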
Mapping to the feature space
The application of the kernel trick to the dual problem is equivalent to the following procedure:
Choose a positive definite kernel κ_R that is associated with a specific RKHS H.
Map the points x_n to Φ(x_n) ∈ H, n = 1, . . . , N.
Solve the linear SVM task in the infinite dimensional RKHS H, for the training data {(Φ(x_n), d_n); n = 1, . . . , N}.
A toy example
Figure: Non-linear SVM classification, C = 2, Gaussian kernel (σ = 2).
Figure: Non-linear SVM classification, C = 5, Gaussian kernel (σ = 2).
Figure: Non-linear SVM classification, C = 15, Gaussian kernel (σ = 2).
Real Hyperplanes
Recall that in any real Hilbert space H, a hyperplane consists of all the elements f ∈ H that satisfy
⟨f, w⟩_H + b = 0,        (2)
for some w ∈ H, b ∈ R.
Moreover, any hyperplane of H divides the space into two parts, H+ = {f ∈ H; ⟨f, w⟩_H + b > 0} and H− = {f ∈ H; ⟨f, w⟩_H + b < 0}.
Generalization
The difficulty is that the set of complex numbers is not an ordered one, and thus one may not assume that a complex version of the aforementioned relation divides the space into two parts, as H+ and H− cannot be defined.
Due to this constraint, all the efforts so far to generalize real SVMs to more general algebras (quaternions, Clifford algebras, etc.) have been designed so that:
The output variable y is retained to be real.
The set of functions considered is, in one way or another, of a special structure, so that the inner product is real.
Generalization
In order to generalize the SVM rationale to complex spaces, we first need to develop an appropriate definition of a complex hyperplane.
We begin by considering the following two relations,
Re(⟨f, w⟩_ℍ + c) = 0,
Im(⟨f, w⟩_ℍ + c) = 0,
for some w ∈ ℍ, c ∈ C, where f ∈ ℍ.
It is not difficult to see that this couple of relations represents two orthogonal hyperplanes of the doubled real space, i.e., H².
Generalization
To overcome this constraint and be able to define arbitrarily placed hyperplanes, we need to employ the so-called widely linear estimation functions, i.e.,
Re(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) = 0,
Im(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) = 0,
for some w, v ∈ ℍ, c ∈ C, where f ∈ ℍ.
Depending on the values of w, v, these hyperplanes may be placed arbitrarily on H². We define this complex couple of hyperplanes as the set of all f ∈ ℍ that satisfy either one of the two relations, for some w, v ∈ ℍ, c ∈ C.
Generalization
The aforementioned arguments demonstrate the significant difference between complex linear and widely linear estimation functions, which has been pointed out by many other authors in the context of regression tasks.
In the current context of classification, we have just seen that confining ourselves to complex linear modeling is quite restrictive, as the corresponding couple of complex hyperplanes is always orthogonal.
On the other hand, the widely linear case is more general and covers all cases.
Generalization
The complex couple of hyperplanes divides the space ℍ (or H²) into four parts, i.e.,
ℍ++ = {f ∈ ℍ; Re(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) > 0, Im(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) > 0},
ℍ+− = {f ∈ ℍ; Re(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) > 0, Im(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) < 0},
ℍ−+ = {f ∈ ℍ; Re(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) < 0, Im(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) > 0},
ℍ−− = {f ∈ ℍ; Re(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) < 0, Im(⟨f, w⟩_ℍ + ⟨f*, v⟩_ℍ + c) < 0}.
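A quick numerical check of these geometric claims for ℍ = C (added here for illustration; it assumes the inner product convention ⟨f, g⟩ = f g*): with v = 0 the two hyperplanes are always orthogonal lines in the plane, while a nonzero v lets their normals form an arbitrary angle.

```python
import numpy as np

def hyperplane_normals(w, v, c):
    """Normals (in R^2) of the two lines Re(f w* + f* v* + c) = 0 and
    Im(f w* + f* v* + c) = 0, writing f = x + iy and assuming <f, g> = f g* in C."""
    a, b = w.real, w.imag
    p, q = v.real, v.imag
    n_re = np.array([a + p, b - q])     # coefficients of x, y in the real-part equation
    n_im = np.array([-(b + q), a - p])  # coefficients of x, y in the imaginary-part equation
    return n_re, n_im

def angle(u, v):
    return np.degrees(np.arccos(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))))

w, c = 1.0 + 2.0j, 0.5 - 0.3j

# Complex linear case (v = 0): the two hyperplanes are orthogonal.
n_re, n_im = hyperplane_normals(w, 0.0 + 0.0j, c)
print(angle(n_re, n_im))        # 90 degrees

# Widely linear case (v != 0): the angle between them is no longer fixed.
n_re, n_im = hyperplane_normals(w, 0.7 - 0.4j, c)
print(angle(n_re, n_im))        # generally different from 90 degrees
```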
Generalization
Figure: A complex couple of hyperplanes separates the space of complex numbers (i.e., ℍ = C) into four parts.
The problem
Suppose we are given training data, which belong to four separate classes C++, C+−, C−+, C−−, i.e., {(z_n, d_n); n = 1, . . . , N} ⊂ X × {±1 ± i}. If d_n = +1 + i, then the n-th sample belongs to C++, i.e., z_n ∈ C++; if d_n = 1 − i, then z_n ∈ C+−, etc.
As z_n is complex, we denote its real and imaginary parts by x_n and y_n respectively, i.e., z_n = x_n + i y_n, n = 1, . . . , N.
Our objective is to develop an SVM rationale for the complex training data.
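A tiny helper (hypothetical naming, for illustration only) that encodes the four classes onto the complex labels {±1 ± i} and decodes a label back from the signs of its real and imaginary parts:

```python
import numpy as np

# Hypothetical encoding of the four classes onto the complex labels {+-1 +- i}.
CLASS_TO_LABEL = {
    "C++": 1 + 1j,
    "C+-": 1 - 1j,
    "C-+": -1 + 1j,
    "C--": -1 - 1j,
}

def decode(d):
    """Recover the class of a complex label from the signs of its parts."""
    key = ("+" if d.real > 0 else "-") + ("+" if d.imag > 0 else "-")
    return "C" + key

d = np.array([CLASS_TO_LABEL["C+-"], CLASS_TO_LABEL["C-+"]])
print(d.real, d.imag)            # the real/imaginary parts feed the two real SVM tasks below
print([decode(x) for x in d])    # ['C+-', 'C-+']
```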
Complex formulation
Consider the complex RKHS ℍ, with respective kernel κ_C. Following a rationale similar to the real case, we transform the input data from X to ℍ via the feature map Φ_C.
The goal of the SVM task is to estimate a complex couple of maximum margin hyperplanes that separates the points of the four classes as best as possible. To this end, we formulate the primal complex SVM as

min_{w,v,c}  (1/2)‖w‖²_ℍ + (1/2)‖v‖²_ℍ + (C/N) Σ_{n=1}^{N} (ξ_n^r + ξ_n^i)
s. to  d_n^r Re(⟨Φ_C(z_n), w⟩_ℍ + ⟨Φ_C*(z_n), v⟩_ℍ + c) ≥ 1 − ξ_n^r,
       d_n^i Im(⟨Φ_C(z_n), w⟩_ℍ + ⟨Φ_C*(z_n), v⟩_ℍ + c) ≥ 1 − ξ_n^i,
       ξ_n^r, ξ_n^i ≥ 0,  for n = 1, . . . , N,        (3)

for some C > 0.
Complex formulation
Consequently, the Lagrangian function becomes

L(w, v, c, ξ^r, ξ^i, a, b, η, θ) = (1/2)‖w‖²_ℍ + (1/2)‖v‖²_ℍ + (C/N) Σ_{n=1}^{N} (ξ_n^r + ξ_n^i)
    − Σ_{n=1}^{N} a_n (d_n^r Re(⟨Φ_C(z_n), w⟩_ℍ + ⟨Φ_C*(z_n), v⟩_ℍ + c) − 1 + ξ_n^r)
    − Σ_{n=1}^{N} b_n (d_n^i Im(⟨Φ_C(z_n), w⟩_ℍ + ⟨Φ_C*(z_n), v⟩_ℍ + c) − 1 + ξ_n^i)
    − Σ_{n=1}^{N} η_n ξ_n^r − Σ_{n=1}^{N} θ_n ξ_n^i,

where a_n, b_n, η_n, θ_n are the positive Lagrange multipliers of the respective inequalities, for n = 1, . . . , N.
Complex formulation
Employing Wirtinger's calculus to derive the respective gradients and exploiting the saddle point conditions of the Lagrangian function, it turns out that the dual problem can be split into two separate maximization tasks:

maximize_{a}  Σ_{n=1}^{N} a_n − (1/2) Σ_{n,m=1}^{N} a_n a_m d_n^r d_m^r κ_C^r(z_m, z_n)
subject to  Σ_{n=1}^{N} a_n d_n^r = 0,  0 ≤ a_n ≤ C/N,  for n = 1, . . . , N,        (4a)

and

maximize_{b}  Σ_{n=1}^{N} b_n − (1/2) Σ_{n,m=1}^{N} b_n b_m d_n^i d_m^i κ_C^r(z_m, z_n)
subject to  Σ_{n=1}^{N} b_n d_n^i = 0,  0 ≤ b_n ≤ C/N,  for n = 1, . . . , N.        (4b)
Complex formulation
We observe that these problems are equivalent to two distinct real SVM (dual) tasks employing the induced real kernel κ_C^r:

κ_C^r(z, z′) = 2 Re(κ_C(z, z′)).        (5)

One may
split the (output) data into their real and imaginary parts,
solve two real SVM tasks, employing any one of the standard algorithms, and, finally,
combine the solutions to obtain the complex labeling function:

g(z) = sign_i( Σ_{n=1}^{N} (a_n d_n^r + i b_n d_n^i) κ_C^r(z_n, z) + c^r + i c^i ),

where sign_i(z) = sign(Re(z)) + i sign(Im(z)).
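The following sketch (ours, with hypothetical helper names) implements this recipe on toy data: it builds the induced real kernel of eq. (5) from the complex Gaussian kernel, trains two real SVMs on the real and imaginary parts of the labels via scikit-learn's precomputed-kernel interface, and combines the two decisions with sign_i. Note that scikit-learn's C parameter plays the role of C/N in (4a)-(4b), and its decision function carries its own bias and dual coefficients, so this reproduces the spirit of g(z) rather than the exact expression.

```python
import numpy as np
from sklearn.svm import SVC

def complex_gaussian_gram(Z, W, sigma=3.0):
    """Complex Gaussian Gram matrix K[i, j] = exp(-sum_k (Z[i,k] - conj(W[j,k]))^2 / sigma^2)."""
    diff = Z[:, None, :] - np.conj(W)[None, :, :]
    return np.exp(-np.sum(diff ** 2, axis=2) / sigma ** 2)

def induced_real_kernel(Z, W, sigma=3.0):
    """Induced real kernel of eq. (5): 2 Re(kappa_C(z, z'))."""
    return 2.0 * np.real(complex_gaussian_gram(Z, W, sigma))

# Toy quaternary data: one complex feature per sample, classes = quadrants,
# complex labels d_n in {+-1 +- i}.
rng = np.random.default_rng(3)
centers = np.array([2 + 2j, 2 - 2j, -2 + 2j, -2 - 2j])
labels = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j])
Z = np.concatenate([c + 0.5 * (rng.normal(size=40) + 1j * rng.normal(size=40)) for c in centers])
Z = Z[:, None]                                   # shape (160, 1): d = 1 complex feature
d = np.repeat(labels, 40)

K = induced_real_kernel(Z, Z)
svm_re = SVC(kernel="precomputed", C=1.0).fit(K, np.sign(d.real))   # task (4a)
svm_im = SVC(kernel="precomputed", C=1.0).fit(K, np.sign(d.imag))   # task (4b)

def sign_i(z):
    return np.sign(z.real) + 1j * np.sign(z.imag)

def predict(Z_new):
    K_new = induced_real_kernel(Z_new, Z)        # rows: new points, columns: training points
    return sign_i(svm_re.decision_function(K_new) + 1j * svm_im.decision_function(K_new))

Z_test = np.array([[1.5 + 1.8j], [-2.2 - 1.7j]])
print(predict(Z_test))                           # expected: [ 1.+1.j  -1.-1.j ]
```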
Pure Complex SVM
Figure: Pure Complex Support Vector Machines.
Complexification
An alternative path is the so-called complexification procedure.
We employ a real kernel κ_R and transform the input data from X to the complexified space ℍ = H + iH, i.e., x → Φ_R(x) + iΦ_R(x).
We can similarly deduce that the dual of the complexified SVM task is equivalent to two real SVM tasks employing the kernel 2κ_R.
We conclude that, in both cases, we end up with two real SVM tasks (although employing different types of kernels).
Binary Classification
Although both scenarios are developed naturally for quaternary classification, they can easily be adapted to the binary case as well.
This can be done by considering that the labels of the data are real numbers (i.e., d_n ∈ R) taking the values ±1. In this case we solve one problem instead of two.
MNIST
We use the popular MNIST database of handwritten digits.
Each digit is encoded as an image of 28 × 28 pixels.
MNIST contains 60000 handwritten digits (from 0 to 9) for training and 10000 handwritten digits for testing.
To exploit the structure of complex numbers, we apply a Fourier transform to each training image and keep only the 100 most significant coefficients.
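A plausible sketch of this preprocessing step (ours; the slides do not specify which 100 coefficients count as "most significant", so the low-frequency choice below is an assumption): take the 2-D FFT of each 28 × 28 image and keep a central 10 × 10 block of the shifted spectrum as a complex feature vector.

```python
import numpy as np

def fourier_features(image, block=10):
    """Return a complex feature vector of block*block low-frequency 2-D FFT
    coefficients (here 10 x 10 = 100). Which 100 coefficients are the 'most
    significant' is an assumption; the slides do not specify."""
    F = np.fft.fftshift(np.fft.fft2(image))          # centre the zero frequency
    c = image.shape[0] // 2
    half = block // 2
    patch = F[c - half:c + half, c - half:c + half]  # central (low-frequency) block
    return patch.ravel()                             # shape (100,), complex

# Example with a random stand-in for a 28 x 28 MNIST digit.
rng = np.random.default_rng(4)
image = rng.random((28, 28))
z = fourier_features(image)
print(z.shape, z.dtype)        # (100,) complex128
```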
First Experiment
We compare a standard one-versus-all SVM scenario that exploits the original (real) data (images of 28 × 28 = 784 pixels) with
a complex one-versus-all variant exploiting the complexified binary SVM, where we use only the 100 most significant (complex) Fourier coefficients of each picture.
In both scenarios we use the first 6000 digits of the MNIST training set to train the learning machines and test their performance using the 10000 digits of the testing set.
We used the Gaussian kernel with t = 1/64 and t = 1/140² respectively.
The SVM parameter C was set equal to 100.
First Experiment
The error rate of the standard real-valued scenario is 3.79%, while the error rate of the complexified (one-versus-all) SVM is 3.46%.
In both learning tasks we used the SMO algorithm to train the SVMs.
The total amount of time needed to train each learning machine is almost the same in both cases (the complexified task is slightly faster).
Second Experiment - Quaternary Classification
This is a quaternary classification problem.
Using the complex approach, such a problem can be solved using only 2 distinct SVM tasks, instead of the 4 SVM tasks needed by the standard 1-versus-all or 1-versus-1 strategies.
We compare a complex quaternary SVM task with the 1-versus-all scenario.
To this end we use the first 6000 of the 0, 1, 2 and 3 digits of the MNIST training set and compare the performance of the two algorithms using the respective digits of the MNIST testing set.
Second Experiment - Quaternary Classification
The error rate of the 1-versus-all SVM was 0.721%, while the error rate of the complex SVM was 0.866%.
In terms of speed, the 1-versus-all SVM task required about double the time for training, compared to the complex SVM. This is expected, as the latter solves half as many distinct SVM tasks as the former.
In both scenarios we used the Gaussian kernel with t = 1/49 and t = 1/160² respectively.
The SVM parameter C was set equal to 100 in this case as well.