I guess this is a scaling problem: you can either normalize or scale the inputs. There is a difference between scaling and normalizing, it affects your results, and it is worth a separate question:
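To make the difference concrete, here is a minimal sketch on a toy vector (the vector and variable names are my own, for illustration only): min-max normalization squeezes values into [0, 1], while standardization centers them at zero with unit standard deviation.

```r
x = c(2, 4, 6, 8, 10)                    # toy data

# min-max normalization: maps onto [0, 1]
x.norm = (x - min(x)) / (max(x) - min(x))

# standardization ("scaling"): zero mean, unit standard deviation
x.scaled = (x - mean(x)) / sd(x)

range(x.norm)   # 0 1
mean(x.scaled)  # ~0
sd(x.scaled)    # 1
```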
Normalizing the inputs
norm.fun = function(x){
  # min-max normalization: rescale x to [0, 1]
  (x - min(x))/(max(x) - min(x))
}
require(ggplot2) # load mpg dataset
require(neuralnet)
data = mpg[, c('cty', 'displ', 'year', 'cyl', 'hwy')]
data.norm = apply(data, 2, norm.fun)
net = neuralnet(cty ~ displ + year + cyl + hwy, data.norm, hidden = 2)
Then you can denormalize the predictions:
# restore data
y.net = min(data$cty) + net$net.result[[1]] * diff(range(data$cty))
plot(data$cty, col = 'red')
points(y.net)
Scaling the inputs
data.scaled = scale(data)
net = neuralnet(cty ~ displ + year + cyl + hwy, data.scaled, hidden = 2)
# restore data
y.sd = sd(data$cty)
y.mean = mean(data$cty)
y.net = net$net.result[[1]] * y.sd + y.mean
plot(data$cty, col = 'red')
points(y.net)
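As a quick sanity check on this unscaling step: base R's scale() stores the centering and scaling factors it used as attributes on its result, so you can recover the original values exactly (the toy vector below is my own, standing in for a column such as cty):

```r
x = c(14, 18, 21, 26, 30)          # toy vector, e.g. a city-mpg column
x.scaled = scale(x)                # centers by mean(x), divides by sd(x)

# invert the transform using the attributes scale() attached
x.back = x.scaled * attr(x.scaled, 'scaled:scale') +
         attr(x.scaled, 'scaled:center')

all.equal(as.vector(x.back), x)    # TRUE
```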
You can also try the nnet package, which is very fast:
require(nnet)
data2 = mpg
data2$year = scale(data2$year)
fit = nnet(cty ~ displ + year + cyl + hwy, size = 10, data = data2, linout = TRUE)
plot(mpg$cty)
points(fit$fitted.values, col = 'red')