Basic SVM issues with e1071: test error rate doesn't match up with tune's results
This seems like a very basic question, but I can't seem to find the answer
anywhere. I'm new to SVMs and ML in general and am trying a few simple
exercises, but the results don't match up. I'm using the e1071 package in R
and have been working through An Introduction to Statistical Learning by
James, Witten, Hastie, and Tibshirani.
My question: why do I see no classification errors at all when I use
predict, while the tune function reports a non-zero error rate? My code
(I'm looking at three classes):
set.seed(4)
dat <- data.frame(pop = rnorm(900, c(0, 3, 6), 1),
                  strat = factor(rep(c(0, 1, 2), times = 300)))
ind <- sample(1:900)
train <- dat[ind[1:600], ]
test <- dat[ind[601:900], ]
tune1 <- tune(svm, train.x = train[, 1], train.y = train[, 2],
              kernel = "radial",
              ranges = list(cost = 10^(-1:2), gamma = c(.5, 1, 2)))
# I just entered the optimal cost and gamma values returned by tune
svm.tuned <- svm(train[, 2] ~ ., data = train, kernel = "radial",
                 cost = 10, gamma = 1)
test.pred <- predict(svm.tuned,
                     newdata = data.frame(pop = test[, 1], strat = test[, 2]))
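To make the comparison concrete, here is how I'd compute the test error rate by hand. Stand-in prediction/label vectors are used so the snippet runs on its own; in my code above, pred would be test.pred and truth would be test$strat:

```r
# Sketch: computing a misclassification rate explicitly.
# Stand-in vectors (hypothetical values) in place of test.pred / test$strat.
pred  <- factor(c(0, 1, 2, 1, 0), levels = c(0, 1, 2))
truth <- factor(c(0, 1, 2, 2, 0), levels = c(0, 1, 2))
table(predicted = pred, actual = truth)  # confusion matrix
err <- mean(pred != truth)               # fraction misclassified
err                                      # 1 of 5 wrong here, i.e. 0.2
```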
So when I look at test.pred, I see that every value matches the true class
label. Yet when I tuned the model it reported an error rate of around 0.06,
and in any case a test error rate of 0 seems absurd for non-separable data
(unless I'm wrong and the data actually is separable?). Any clarification
would be tremendously helpful. Thanks a lot.
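One thing I tried while writing this up was inspecting the formula itself, in case giving the response by position rather than by name changes what the dot expands to. A base-R sketch (no e1071 needed, smaller data for brevity):

```r
# Sketch: what does `.` expand to when the response is dat[, 2] instead of
# a column name? Same data layout as above, shrunk to 9 rows.
set.seed(4)
dat <- data.frame(pop = rnorm(9, c(0, 3, 6), 1),
                  strat = factor(rep(c(0, 1, 2), times = 3)))
# `dat[, 2]` is not a column name, so the dot expands to every column of
# `data` -- including strat itself:
attr(terms(dat[, 2] ~ ., data = dat), "term.labels")
# Naming the response instead removes it from the dot expansion:
attr(terms(strat ~ ., data = dat), "term.labels")
```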