First of all, DecisionTreeClassifier has no decision_function attribute.
Judging from the structure of your code, you probably followed this example: http://scikit-learn.org/stable/auto_examples/model_selection/plot_roc.html
In that example the classifier is not a decision tree but a OneVsRestClassifier, which does support the decision_function method.
You can see the attributes available on DecisionTreeClassifier here: http://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html
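Note that DecisionTreeClassifier does expose predict_proba, and roc_curve / roc_auc_score accept those probability columns in place of decision_function scores. A minimal sketch on iris (the random_state values are just for reproducibility, not from your code):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
# predict_proba returns one probability column per class; these
# columns can serve as the per-class scores that roc_curve expects
proba = clf.predict_proba(X_test)
print(proba.shape)  # (75, 3)
```

Using probabilities instead of hard 0/1 predictions also gives a less step-like ROC curve when the tree produces non-degenerate leaf probabilities.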
One possible approach is to binarize the classes and then compute the AUC for each class:
Example:
from sklearn import datasets
from sklearn.metrics import roc_curve, auc
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import label_binarize
from sklearn.tree import DecisionTreeClassifier
iris = datasets.load_iris()
X = iris.data
y = iris.target
y = label_binarize(y, classes=[0, 1, 2])
n_classes = y.shape[1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=.5, random_state=0)
classifier = DecisionTreeClassifier()
y_score = classifier.fit(X_train, y_train).predict(X_test)
fpr = dict()
tpr = dict()
roc_auc = dict()
for i in range(n_classes):
    fpr[i], tpr[i], _ = roc_curve(y_test[:, i], y_score[:, i])
    roc_auc[i] = auc(fpr[i], tpr[i])
# Compute micro-average ROC curve and ROC area
fpr["micro"], tpr["micro"], _ = roc_curve(y_test.ravel(), y_score.ravel())
roc_auc["micro"] = auc(fpr["micro"], tpr["micro"])
# ROC AUC for a specific class, here class 2
roc_auc[2]
Result
0.94852941176470573
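As a cross-check, roc_auc_score can compute the same per-class and micro-averaged values directly from the binarized labels, without building the fpr/tpr dictionaries by hand. A sketch under the same setup (random_state on the tree is my addition, so the numbers are reproducible but need not match the result above):

```python
from sklearn.datasets import load_iris
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import label_binarize
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
y = label_binarize(y, classes=[0, 1, 2])
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
y_score = clf.predict(X_test)

# average=None gives one AUC per class; average='micro' corresponds
# to the ravel()-based micro-average computed above
per_class = roc_auc_score(y_test, y_score, average=None)
micro = roc_auc_score(y_test, y_score, average="micro")
```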