Multivariate polynomial regression with numpy

2023-12-30

I have many samples (y_i, (a_i, b_i, c_i)) where y is presumed to vary as a polynomial in a, b, and c up to a certain degree. For example, for a given data set and degree 2, I might produce the model

y = a^2 + 2ab - 3cb + c^2 + 0.5ac

This can be done with least squares and is a minor extension of numpy's polyfit routine. Is there a standard implementation somewhere in the Python ecosystem?
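To make the least-squares formulation concrete, here is a minimal numpy-only sketch of the idea: build every degree-2 monomial of (a, b, c) by hand and solve the normal problem with np.linalg.lstsq. The data below is synthetic and purely illustrative.

import numpy as np

rng = np.random.default_rng(0)
A = rng.random((50, 3))              # 50 samples of (a, b, c)
a, b, c = A[:, 0], A[:, 1], A[:, 2]

# Design matrix: constant, linear, and all degree-2 monomials
X = np.column_stack([
    np.ones_like(a),                 # 1
    a, b, c,                         # degree 1
    a**2, a*b, a*c, b**2, b*c, c**2  # degree 2
])

# Synthetic target following the polynomial above, plus a little noise
y = a**2 + 2*a*b - 3*c*b + c**2 + 0.5*a*c + 0.01 * rng.standard_normal(50)

# Ordinary least squares via numpy
coef, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(coef)   # coefficients in the same order as the columns of X

This works, but you have to enumerate the monomial columns yourself, which is exactly the bookkeeping a library helper can take over.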


sklearn provides a simple way to do this.

Building on the example posted here: https://stackoverflow.com/questions/20463226/how-to-do-gaussian-polynomial-regression-with-scikit-learn

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn import linear_model

#X is the independent variable (bivariate in this case)
X = np.array([[0.44, 0.68], [0.99, 0.23]])

#vector is the dependent data
vector = np.array([109.85, 155.72])

#predict is an independent variable for which we'd like to predict the value
predict = np.array([[0.49, 0.18]])

#generate a model of polynomial features
poly = PolynomialFeatures(degree=2)

#transform the x data for proper fitting (for a single variable it returns [1, x, x**2];
#for two variables it returns [1, x0, x1, x0**2, x0*x1, x1**2])
X_ = poly.fit_transform(X)

#transform the prediction point with the already-fitted transformer
predict_ = poly.transform(predict)

#here we can remove polynomial terms we don't want
#for instance, deleting column 1 drops the linear `x0` term
X_ = np.delete(X_,(1),axis=1)
predict_ = np.delete(predict_,(1),axis=1)

#generate the regression object
clf = linear_model.LinearRegression()
#perform the actual regression
clf.fit(X_, vector)

print("X_ = ",X_)
print("predict_ = ",predict_)
print("Prediction = ",clf.predict(predict_))

Here is the output:

X_ =  [[ 0.44    0.68    0.1936  0.2992  0.4624]
 [ 0.99    0.23    0.9801  0.2277  0.0529]]
predict_ =  [[ 0.49    0.18    0.2401  0.0882  0.0324]]
Prediction =  [ 126.84247142]
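The same sklearn recipe carries over directly to the three-variable, degree-2 case from the question. The sketch below uses synthetic data (the names a, b, c are just labels for the feature columns) and reads the fitted coefficient of each monomial back out with get_feature_names_out, which is available in recent scikit-learn versions.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.random((200, 3))                       # columns play the role of a, b, c
a, b, c = X[:, 0], X[:, 1], X[:, 2]
y = a**2 + 2*a*b - 3*c*b + c**2 + 0.5*a*c      # target from the question's model

poly = PolynomialFeatures(degree=2)
X_ = poly.fit_transform(X)

model = LinearRegression(fit_intercept=False)  # the bias column is already in X_
model.fit(X_, y)

# Pair each fitted coefficient with its monomial to read the model off
for name, coef in zip(poly.get_feature_names_out(["a", "b", "c"]), model.coef_):
    print(f"{name}: {coef:.3f}")

With noiseless data like this, the printed coefficients should recover the original polynomial (a^2: 1, a b: 2, b c: -3, c^2: 1, a c: 0.5, everything else near zero).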
