
Suppose we are given a training set of 2D points that are not linearly separable. I train a binary SVM with an RBF kernel in order to classify them. What I want to do is to draw its decision boundary.

The kernel SVM I train leads to a decision function of the form: $$ f(\mathbf{x})=\sum_{i=1}^{N_s}\alpha_i y_i k(\mathbf{x}, \mathbf{x}_i)+b, $$ where $N_s$ is the number of support vectors, $\mathbf{x}_i$, $\alpha_i$, and $y_i$ are the $i$-th support vector, the corresponding (positive) Lagrange multiplier, and the associated truth label, respectively. I use the LIBSVM package to train the SVM, so all of the above are known from the resulting model file.

If we extend the above decision function, we see that it's a sum of Gaussians centred at the support vectors, i.e., $$ f(\mathbf{x})=\sum_{i=1}^{N_s}\alpha_i y_i \operatorname{exp}\left(-\gamma\lVert\mathbf{x}-\mathbf{x}_i\rVert^2\right)+b, $$ where $\gamma=\frac{1}{2\sigma^2}>0$ is a user-defined parameter.
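For reference, this is how I evaluate $f$ at a single query point (a minimal sketch; the variables SVs, alphas, ys, gamma, and b are placeholders for the quantities I read from the model file):

% Minimal sketch: evaluate f at one 2D query point x (1-by-2 row vector).
% SVs (N_s-by-2), alphas (N_s-by-1), ys (N_s-by-1), gamma and b are assumed
% to have been read from the trained model.
x   = [0.5, -0.3];          % example query point
f_x = b;
for i = 1:size(SVs, 1)
    f_x = f_x + alphas(i) * ys(i) * exp(-gamma * norm(x - SVs(i,:))^2);
end
% sign(f_x) gives the predicted class of x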

How could I plot the contour of the decision function in the same space as my input data (the 2D Euclidean plane)? Could you please provide some equations or ideas? I've tried a few things, but had no luck!


2 Answers


I figured out what needs to be done. Actually, it's quite simple, but it seems I had a MATLAB-related bug... Here is the code and the resulting figure for the "XOR" binary classification problem.

gamma     = getGamma();   % RBF kernel parameter gamma of the model
b         = getB();       % bias term (for a LIBSVM model, b = -rho)
points_x1 = linspace(xLimits(1), xLimits(2), 100);
points_x2 = linspace(yLimits(1), yLimits(2), 100);
[X1, X2]  = meshgrid(points_x1, points_x2);

% Initialize f with the bias term
f = ones(length(points_x1), length(points_x2))*b;

% Iterate over all support vectors
for i=1:N_sv
    alpha_i = getAlpha(i);   % Lagrange multiplier of the i-th SV
    y_i     = getLabel(i);   % label (+1/-1) of the i-th SV
    sv_i    = getSV(i);      % the i-th support vector itself
    for j=1:length(points_x1)
        for k=1:length(points_x2)
            x = [points_x1(j); points_x2(k)];
            f(j,k) = f(j,k) + alpha_i*y_i*kernel_func(gamma, x, sv_i);
        end
    end
end

% Note: f is indexed as (x1-index, x2-index), while the meshgrid matrices
% put x2 along the rows, so transpose f before plotting.
surf(X1, X2, f');
shading interp;
lighting phong;
alpha(.6)

contourf(X1, X2, f', 1);

where the function

function k = kernel_func(gamma, x, x_i)
    k = exp(-gamma*norm(x - x_i)^2);
end

just produces the kernel function (RBF kernel), $k(\mathbf{x},\mathbf{x}')=\operatorname{exp}\left(-\gamma\lVert\mathbf{x}-\mathbf{x}'\rVert^2\right)$.
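As a side note, the double loop over grid points can be replaced by a vectorised computation. A rough sketch, assuming SVs is an N_sv-by-2 matrix of the support vectors and coefs = alpha .* y is an N_sv-by-1 vector (fgrid already follows the meshgrid layout, so no transpose is needed):

% Vectorised sketch: evaluate the decision function on the whole grid at once.
% Assumes SVs (N_sv-by-2), coefs = alpha .* y (N_sv-by-1), gamma, b, X1, X2 as above.
grid_pts = [X1(:), X2(:)];                                  % all grid points, one per row
D2 = bsxfun(@plus, sum(grid_pts.^2, 2), sum(SVs.^2, 2)') ...
     - 2*(grid_pts*SVs');                                   % squared distances to every SV
fgrid = reshape(exp(-gamma*D2)*coefs + b, size(X1));        % decision values, meshgrid layout
contourf(X1, X2, fgrid, [0 0]);                             % zero-level decision boundary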

Here is the result for the XOR problem, using $\gamma=4$.

(figure: surface and contour plot of the decision function for the XOR data)

  • thanks for your great answer to your own question, helped me a lot! what is the y_i parameter in the f(j,k) ... statement, though?
    – codeling
    Commented Jun 25, 2015 at 14:58
  • ah it's the label, right?
    – codeling
    Commented Jun 25, 2015 at 15:05
  • Exactly, it's the label. Commented Jun 25, 2015 at 15:22
  • note: you will want to use [0 0] as the last parameter to contour, to draw the boundary at the position where the decision function is 0! Otherwise (at least in Octave) the contour level is chosen automatically, and not at 0.
    – codeling
    Commented Jun 29, 2015 at 15:23

Use a grid approach: divide your 2D space into a k-by-k grid, evaluate each grid point as a sample in your SVM (i.e., predict its label), and plot the predictions over all points.
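For example, with LIBSVM's MATLAB interface this could look roughly as follows (the grid size, variable names, and xLimits/yLimits are assumptions):

% Sketch: predict the label at every node of a k-by-k grid with LIBSVM's
% svmpredict, then plot the predicted regions.
k = 200;
[X1, X2] = meshgrid(linspace(xLimits(1), xLimits(2), k), ...
                    linspace(yLimits(1), yLimits(2), k));
grid_pts = [X1(:), X2(:)];
dummy    = zeros(size(grid_pts, 1), 1);          % true labels are not needed here
preds    = svmpredict(dummy, grid_pts, model);   % model returned by svmtrain
contourf(X1, X2, reshape(preds, size(X1)));      % predicted class regions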

