## Algorithm::LibSVM cpan:TITSUKI last updated on 2022-07-24

Algorithm-LibSVM-0.0.16/

# NAME

Algorithm::LibSVM - Raku bindings for libsvm

# SYNOPSIS

## EXAMPLE 1

```raku
use Algorithm::LibSVM;
use Algorithm::LibSVM::Parameter;
use Algorithm::LibSVM::Problem;
use Algorithm::LibSVM::Model;

my $libsvm = Algorithm::LibSVM.new;
my Algorithm::LibSVM::Parameter $parameter .= new(svm-type => C_SVC,
                                                  kernel-type => RBF);
# heart_scale is here: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/binary/heart_scale
my Algorithm::LibSVM::Problem $problem = Algorithm::LibSVM::Problem.from-file('heart_scale');
my @r = $libsvm.cross-validation($problem, $parameter, 10);
$libsvm.evaluate($problem.y, @r).say; # {acc => 81.1111111111111, mse => 0.755555555555556, scc => 1.01157627463546}
```

## EXAMPLE 2

```raku
use Algorithm::LibSVM;
use Algorithm::LibSVM::Parameter;
use Algorithm::LibSVM::Problem;
use Algorithm::LibSVM::Model;

sub gen-train {
    my $max-x = 1;
    my $min-x = -1;
    my $max-y = 1;
    my $min-y = -1;
    my @tmp-x;
    my @tmp-y;
    do for ^300 {
        my $x = $min-x + rand * ($max-x - $min-x);
        my $y = $min-y + rand * ($max-y - $min-y);

        my $label = do given $x, $y {
            when ($x - 0.5) ** 2 + ($y - 0.5) ** 2 <= 0.2 {
                1
            }
            when ($x - -0.5) ** 2 + ($y - -0.5) ** 2 <= 0.2 {
                -1
            }
            default { Nil }
        }
        if $label.defined {
            @tmp-y.push: $label;
            @tmp-x.push: [$x, $y];
        }
    }
    # Note that @x must be a shaped array.
    my @x[+@tmp-x;2] = @tmp-x.clone;
    my @y = @tmp-y.clone;
    (@x, @y)
}

my (@train-x, @train-y) := gen-train;
my @test-x = 1 => 0.5e0, 2 => 0.5e0;
my $libsvm = Algorithm::LibSVM.new;
my Algorithm::LibSVM::Parameter $parameter .= new(svm-type => C_SVC,
                                                  kernel-type => LINEAR);
my Algorithm::LibSVM::Problem $problem = Algorithm::LibSVM::Problem.from-matrix(@train-x, @train-y);
my $model = $libsvm.train($problem, $parameter);
say $model.predict(features => @test-x)<label>; # 1
```

# DESCRIPTION

Algorithm::LibSVM provides Raku bindings for libsvm, a library for support vector classification and regression.

## METHODS

### cross-validation

Defined as:

```raku
method cross-validation(Algorithm::LibSVM::Problem $problem, Algorithm::LibSVM::Parameter $param, Int $nr-fold --> List)
```

Conducts `$nr-fold`-fold cross validation and returns the predicted values.

### train

Defined as:

```raku
method train(Algorithm::LibSVM::Problem $problem, Algorithm::LibSVM::Parameter $param --> Algorithm::LibSVM::Model)
```

Trains an SVM model.

• `$problem` The instance of Algorithm::LibSVM::Problem.

• `$param` The instance of Algorithm::LibSVM::Parameter.

### load-problem

Defined as:

```raku
multi method load-problem(\lines --> Algorithm::LibSVM::Problem)
multi method load-problem(Str $filename --> Algorithm::LibSVM::Problem)
```

Loads a libsvm-format problem, either from already-read lines or from the file named `$filename`.

### load-model

Defined as:

```raku
method load-model(Str $filename --> Algorithm::LibSVM::Model)
```

Loads a pre-trained model from the file named `$filename`.

### evaluate

Defined as:

```raku
method evaluate(@true-values, @predicted-values --> Hash)
```

Evaluates the predicted values against the ground truth and returns a Hash with three metrics: accuracy (`acc`), mean squared error (`mse`), and squared correlation coefficient (`scc`).

• `@true-values` The array that contains ground-truth values.

• `@predicted-values` The array that contains predicted values.
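For reference, the three metrics can be sketched as follows. This is an illustrative re-implementation in Python (not part of this module), following the definitions libsvm itself uses in `svm-predict`; the function name `evaluate_metrics` is hypothetical.

```python
# Illustrative sketch of the metrics returned by evaluate():
# accuracy (as a percentage), mean squared error, and the squared
# correlation coefficient (Pearson's r, squared).

def evaluate_metrics(true_values, predicted_values):
    n = len(true_values)
    pairs = list(zip(true_values, predicted_values))
    # acc: percentage of exact matches between truth and prediction
    acc = 100.0 * sum(t == p for t, p in pairs) / n
    # mse: mean of squared residuals
    mse = sum((t - p) ** 2 for t, p in pairs) / n
    # scc: squared Pearson correlation between truth and prediction
    sx = sum(true_values)
    sy = sum(predicted_values)
    sxx = sum(t * t for t in true_values)
    syy = sum(p * p for p in predicted_values)
    sxy = sum(t * p for t, p in pairs)
    scc = ((n * sxy - sx * sy) ** 2) / ((n * sxx - sx * sx) * (n * syy - sy * sy))
    return {'acc': acc, 'mse': mse, 'scc': scc}
```

Note that for classification labels a squared correlation coefficient slightly above 1 is not meaningful on its own; `scc` is chiefly useful for regression (`EPSILON_SVR`/`NU_SVR`), while `acc` is the metric of interest for classification.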

### nr-feature

Defined as:

```raku
method nr-feature(--> Int:D)
```

Returns the maximum index of all the features.

## ROUTINES

### parse-libsvmformat

Defined as:

```raku
sub parse-libsvmformat(Str $text --> List) is export
```

A helper routine for parsing libsvm-format text.
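The libsvm format stores one example per line as `label index:value index:value …`, with sparse, 1-based feature indices. As a cross-language illustration of what such a parser does (the real routine returns Raku data structures, and the Python name below is hypothetical):

```python
# Hypothetical sketch of a libsvm-format parser.
# Each line becomes a (label, {index: value}) pair; indices absent
# from a line are implicitly zero (the format is sparse).

def parse_libsvmformat(text):
    result = []
    for line in text.strip().splitlines():
        label, *pairs = line.split()
        features = {int(i): float(v)
                    for i, v in (p.split(':') for p in pairs)}
        result.append((float(label), features))
    return result
```

For example, the two lines `+1 1:0.7 3:-0.2` and `-1 2:1.0` parse to labels `1.0` and `-1.0` with feature maps `{1: 0.7, 3: -0.2}` and `{2: 1.0}` respectively.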

# CAUTION

## DON'T USE `PRECOMPUTED` KERNEL

As a workaround for RT130187, I applied patches (e.g. src/3.22/svm.cpp.patch) that disable random access to the problematic array.

Sadly, those patches make the `PRECOMPUTED` kernel drastically more complex to use.