Decision Tree Classification Error
How to compute error rate from a decision tree?

Does anyone know how to calculate the error rate for a decision tree in R? I am using the rpart() function.

Tags: r, classification, decision-tree, rpart. Edited Jan 29 '13 at 9:09 by rcs. Asked
Mar 12 '12 at 11:29 by teo6389.

Accepted answer (38 upvotes):

Assuming you mean computing the error rate on the sample used to fit the model, you can use printcp(). For example, using the on-line example:

    > library(rpart)
    > fit <- rpart(Kyphosis ~ Age + Number + Start, data = kyphosis)
    > printcp(fit)

    Classification tree:
    rpart(formula = Kyphosis ~ Age + Number + Start, data = kyphosis)

    Variables actually used in tree construction:
    [1] Age   Start

    Root node error: 17/81 = 0.20988

    n= 81

            CP nsplit rel error  xerror    xstd
    1 0.176471      0   1.00000 1.00000 0.21559
    2 0.019608      1   0.82353 0.82353 0.20018
    3 0.010000      4   0.76471 0.82353 0.20018

The Root node error is used to compute two measures of predictive performance, in combination with the values displayed in the rel error and xerror columns, and depending on the complexity parameter (first column):

0.76471 x 0.20988 = 0.1604973 (16.0%) is the resubstitution error rate (i.e., the error rate computed on the training sample). This is roughly what you get with:

    class.pred <- table(predict(fit, type = "class"), kyphosis$Kyphosis)
    1 - sum(diag(class.pred)) / sum(class.pred)

0.82353 x 0.20988 = 0.1728538 (17.3%) is the corresponding cross-validated error rate, computed from the xerror column in the same way.
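The same resubstitution-error calculation can be sketched outside R as well. Below is a minimal Python illustration using scikit-learn, with a synthetic dataset standing in for the kyphosis data (the dataset, tree depth, and variable names here are illustrative assumptions, not part of the original answer); the final line mirrors the R idiom `1 - sum(diag(class.pred)) / sum(class.pred)`:

```python
# Sketch: resubstitution (training-sample) error rate of a decision tree,
# computed from the confusion matrix, as in the R answer above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the kyphosis data (81 rows, 3 predictors).
X, y = make_classification(n_samples=81, n_features=3, n_informative=2,
                           n_redundant=0, random_state=0)
fit = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Confusion matrix of predictions on the *training* sample, mirroring
# table(predict(fit, type="class"), kyphosis$Kyphosis) in R.
cm = confusion_matrix(y, fit.predict(X))
resub_error = 1 - np.trace(cm) / cm.sum()
print(round(resub_error, 4))
```

Note that this is the in-sample error only; for an honest estimate you would look at cross-validated error, as the xerror column does.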
MATLAB equivalent: loss (Statistics and Machine Learning Toolbox)

Class: CompactClassificationTree
Classification error

Syntax

    L = loss(tree,TBL,ResponseVarName)
    L = loss(tree,TBL,Y)
    L = loss(tree,X,Y)
    L = loss(___,Name,Value)
    [L,se,NLeaf,bestlevel] = loss(___)

Description

L
= loss(tree,TBL,ResponseVarName) returns a scalar representing how well tree classifies the data in TBL, when TBL.ResponseVarName contains the true classifications. When computing the loss, loss normalizes the class probabilities in TBL.ResponseVarName to the class probabilities used for training, which are stored in the Prior property of tree.

L
= loss(tree,TBL,Y) returns a scalar representing how well tree classifies the data in TBL, when Y contains the true classifications.
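The default classification loss behind this kind of function is the (weighted) misclassification rate. As a language-neutral sketch of that computation, here is a small Python function assuming 0/1 misclassification cost and optional per-observation weights; the function name and signature are illustrative, not the MATLAB API:

```python
import numpy as np

def misclassification_loss(y_true, y_pred, weights=None):
    """Weighted misclassification rate: the total weight of wrongly
    predicted observations, with weights normalized to sum to 1
    (uniform weights 1/n when none are given)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    if weights is None:
        weights = np.full(y_true.shape[0], 1.0 / y_true.shape[0])
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize so the loss is in [0, 1]
    return float(weights[y_true != y_pred].sum())

print(misclassification_loss([0, 1, 1, 0], [0, 1, 0, 0]))  # prints 0.25
```

With uniform weights this reduces to the plain error rate (one of four predictions wrong gives 0.25); non-uniform weights let rare or costly classes count for more, which is the role of the 'Weights' name-value pair above.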