Constructs a learner class object for fitting support vector
machines with e1071::svm. As shown in the examples, the constructed learner
returns the predicted probability of class 2 in case of binary
classification. For multi-class classification, an \(n \times p\) matrix is
returned, with \(n\) being the number of observations and \(p\) the number
of classes.
learner_svm(
formula,
info = "e1071::svm",
cost = 1,
epsilon = 0.1,
kernel = "radial",
learner.args = NULL,
...
)
(formula) Formula specifying the response and the design matrix.
(character) Optional information to describe the instantiated learner object.
(numeric) Cost of constraints violation (default: 1); this is the ‘C’-constant of the regularization term in the Lagrange formulation.
(numeric) Epsilon in the insensitive-loss function (default: 0.1).
(character) The kernel used in training and predicting. You
might consider changing some of the following parameters, depending
on the kernel type (see the polynomial-kernel sketch in the examples below).
linear: \(u'v\)
polynomial: \((\gamma u'v + coef0)^{degree}\)
radial basis: \(e^{-\gamma |u-v|^2}\)
sigmoid: \(\tanh(\gamma u'v + coef0)\)
(list) Additional arguments to learner$new().
Additional arguments to e1071::svm.
A learner object.
n <- 5e2
x1 <- rnorm(n, sd = 2)
x2 <- rnorm(n)
lp <- x2 * x1 + cos(x1)
yb <- rbinom(n, 1, lava::expit(lp))
y <- lp + rnorm(n, sd = sqrt(0.5))
d <- data.frame(y, yb, x1, x2)
# regression
lr <- learner_svm(y ~ x1 + x2)
lr$estimate(d)
lr$predict(head(d))
#> 1 2 3 4 5 6
#> 1.995377303 0.645702513 0.005660922 -2.727438622 1.112411111 1.060965475
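# in-sample root mean squared error (predict returns a numeric vector)
sqrt(mean((lr$predict(d) - d$y)^2))
# non-default kernels (illustrative sketch): kernel parameters such as
# degree and coef0 are forwarded to e1071::svm through `...`
lr_poly <- learner_svm(y ~ x1 + x2, kernel = "polynomial", degree = 2, coef0 = 1)
lr_poly$estimate(d)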
# binary classification
lr <- learner_svm(as.factor(yb) ~ x1 + x2)
# alternative to transforming response variable to factor
# lr <- learner_svm(yb ~ x1 + x2, type = "C-classification")
lr$estimate(d)
lr$predict(head(d)) # predict class probabilities of class 2
#> 1 2 3 4 5 6
#> 0.1837071 0.2929537 0.5175953 0.9222461 0.1874361 0.2446923
lr$predict(head(d), probability = FALSE) # predict labels
#> 1 2 3 4 5 6
#> 1 1 1 0 1 1
#> Levels: 0 1
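# sketch of the alternative noted above: keep the numeric response and
# forward type = "C-classification" to e1071::svm through `...`
lr2 <- learner_svm(yb ~ x1 + x2, type = "C-classification")
lr2$estimate(d)
lr2$predict(head(d))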
# multi-class classification
lr <- learner_svm(Species ~ .)
lr$estimate(iris)
lr$predict(head(iris))
#> setosa versicolor virginica
#> 1 0.9811383 0.01047810 0.008383591
#> 2 0.9741682 0.01682844 0.009003326
#> 3 0.9798900 0.01105912 0.009050920
#> 4 0.9760581 0.01421695 0.009724906
#> 5 0.9803570 0.01077415 0.008868896
#> 6 0.9752374 0.01555128 0.009211281
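# (illustrative) most probable class per observation, derived from
# the returned probability matrix
pr <- lr$predict(head(iris))
colnames(pr)[apply(pr, 1, which.max)]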