09-Multiclass (Multiclass Classification)


One Versus All - Example
Classify test tuple X: (-, +, -, -)

Classification results through all the One vs. All classifiers:

     CA  CB  CC  CD   Votes
A            1   1     2
B    1   +   1   1     4
C    1           1     2
D    1       1         2

B has the most votes: assign X to class B
•Need to train 7 binary classifiers, one per codeword bit
•Generate 7 training sets. Given record <X, y2> (codeword 0 0 0 0 1 1 1), add:
 • <X, 0> to the training sets of classifiers 1-4
 • <X, 1> to the training sets of classifiers 5-7 (see the sketch below)
•Test instance result: (0, 1, 1, 1, 1, 1, 1)
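A minimal sketch of this per-bit relabeling, assuming the codeword table used in this example; the helper name bit_label is illustrative, not from the slides.

```python
# Sketch: derive per-bit binary labels from the ECOC codewords of this example.
codewords = {
    "y1": [1, 1, 1, 1, 1, 1, 1],
    "y2": [0, 0, 0, 0, 1, 1, 1],
    "y3": [0, 0, 1, 1, 0, 0, 1],
    "y4": [0, 1, 0, 1, 0, 1, 0],
}

def bit_label(label, bit):
    """Binary label used when adding a record of class `label` to classifier `bit`."""
    return codewords[label][bit]

# A record <X, y2> is added as <X, 0> to classifiers 1-4 and as <X, 1> to classifiers 5-7.
for bit in range(7):
    print(f"classifier {bit + 1}: <X, {bit_label('y2', bit)}>")
```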
Example
Compare the test output (0, 1, 1, 1, 1, 1, 1) with each codeword:

Class  Codeword         Hamming distance to test output
y1     1 1 1 1 1 1 1    1
y2     0 0 0 0 1 1 1    3
y3     0 0 1 1 0 0 1    3
y4     0 1 0 1 0 1 0    3

Classify as y1 (closest codeword)
Design issues
•Minimum codeword length to represent k classes: n = ⌈log2 k⌉
One Versus All - Example
Input Instances
X1 X2 X3 X4 X5 X6 X7 X8
A  B  A  C  C  D  B  A

Instances for CA
X1 X2 X3 X4 X5 X6 X7 X8
+  -  +  -  -  -  -  +

Instances for CB
X1 X2 X3 X4 X5 X6 X7 X8
-  +  -  -  -  -  +  -

Instances for CC
X1 X2 X3 X4 X5 X6 X7 X8
-  -  -  +  +  -  -  -

Instances for CD
X1 X2 X3 X4 X5 X6 X7 X8
-  -  -  -  -  +  -  -
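A minimal sketch that rebuilds these one-vs-rest (+/-) tables from the class assignments above; variable names are illustrative.

```python
# Sketch: one-vs-rest relabeling for the OVA example.
labels = {"X1": "A", "X2": "B", "X3": "A", "X4": "C",
          "X5": "C", "X6": "D", "X7": "B", "X8": "A"}

for target in "ABCD":
    # instances of the target class become "+", everything else "-"
    row = ["+" if labels[x] == target else "-" for x in sorted(labels)]
    print(f"Instances for C{target}:", " ".join(row))
```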
Classify test tuple X: (+, -, +, -)

     CA  CB  CC  CD   Votes
A    +   1       1     3
B                1     1
C        1   +   1     3
D        1             1

A and C tie with 3 votes each: randomly break the tie (see the vote-counting sketch below)
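A minimal sketch of the one-vs-all voting rule applied to the two test tuples of this example; the classifier order (CA, CB, CC, CD) follows the tables above, the function name is illustrative.

```python
# Sketch: count one-vs-all votes for a vector of binary classifier outputs.
classes = ["A", "B", "C", "D"]          # classifier order: CA, CB, CC, CD

def ova_votes(outputs):
    votes = {c: 0 for c in classes}
    for cls, out in zip(classes, outputs):
        if out == "+":                   # positive output: only that class gets a vote
            votes[cls] += 1
        else:                            # negative output: every other class gets a vote
            for other in classes:
                if other != cls:
                    votes[other] += 1
    return votes

print(ova_votes(["+", "-", "+", "-"]))   # {'A': 3, 'B': 1, 'C': 3, 'D': 1} -> A and C tie
print(ova_votes(["-", "+", "-", "-"]))   # {'A': 2, 'B': 4, 'C': 2, 'D': 2} -> B
```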
One Versus One
•Y = {y1, y2, …, yK} : the set of class labels
•Classifier building:
 • For each pair yi and yj, create a binary problem:
   • Keep instances belonging to yi and yj
   • Ignore other instances
•Tuple Classification:
 • Classify the tuple using each classifier Cij
 • If classifier Cij returns label i, yi gets one vote
 • If it returns label j, yj gets one vote
 • Assign the class with the most votes (sketched below)
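A minimal one-vs-one sketch, assuming NumPy arrays and scikit-learn style estimators with fit/predict; base_classifier is an assumed factory (for example, a decision tree), not something defined in these slides.

```python
from itertools import combinations
import numpy as np

def ovo_fit(X, y, base_classifier):
    """Train one binary classifier per pair of classes (one-vs-one)."""
    models = {}
    for ci, cj in combinations(np.unique(y), 2):
        mask = (y == ci) | (y == cj)               # keep only instances of the two classes
        models[(ci, cj)] = base_classifier().fit(X[mask], y[mask])
    return models

def ovo_predict(models, x, classes):
    """Each pairwise classifier votes for the class it predicts; majority wins."""
    votes = {c: 0 for c in classes}
    for clf in models.values():
        votes[clf.predict(x.reshape(1, -1))[0]] += 1
    return max(votes, key=votes.get)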
One Versus One - Example
Input Instances
X1 X2 X3 X4 X5 X6 X7 X8
A  B  A  C  C  D  B  A

Instances for CAB
X1 X2 X3 X7 X8
A  B  A  B  A

Instances for CAC
X1 X3 X4 X5 X8
A  A  C  C  A

Instances for CAD
X1 X3 X6 X8
A  A  D  A

Instances for CBC
X2 X4 X5 X7
B  C  C  B

Instances for CBD
X2 X6 X7
B  D  B

Instances for CCD
X4 X5 X6
C  C  D

One Versus One - Example
Classify test tuple X: (B, A, D, B, D, D)

     AB  AC  AD  BC  BD  CD   Votes
RX   B   A   D   B   D   D
A        1                     1
B    1           1             2
C                              0
D            1       1   1     3

D has the most votes: assign X to class D (see the vote-counting sketch below)
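A minimal sketch of the pairwise vote count above; the pair order and predicted labels are taken from the table, everything else is illustrative.

```python
from collections import Counter

# Pairwise predictions for test tuple X, in the order AB, AC, AD, BC, BD, CD
predictions = {"AB": "B", "AC": "A", "AD": "D", "BC": "B", "BD": "D", "CD": "D"}

votes = Counter({c: 0 for c in "ABCD"})
for winner in predictions.values():
    votes[winner] += 1                 # each pairwise classifier votes for the class it predicts

print(dict(votes))                     # {'A': 1, 'B': 2, 'C': 0, 'D': 3}
print(votes.most_common(1)[0][0])      # 'D'
```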
Multiclass Classification Approaches
•One versus All (OVA)
•One versus One (OVO)
•Error correcting codes
One Versus All
•Y = {y1, y2, …, yK} : the set of class labels
•Classifier building:
 • For each yi, create a binary problem such that:
   • Instances belonging to yi are positive
   • Instances not belonging to yi are negative
•Tuple Classification:
 • Classify the tuple using each classifier
 • If classifier i returns a positive label, yi gets one vote
 • If classifier i returns a negative label, all classes except yi get a vote
 • Assign the class with the most votes (sketched below)
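A minimal one-vs-all sketch under the same assumptions as the one-vs-one sketch above: NumPy arrays, scikit-learn style fit/predict, and an assumed base_classifier factory.

```python
import numpy as np

def ova_fit(X, y, base_classifier):
    """Train one binary classifier per class (one-vs-all)."""
    classes = np.unique(y)
    # each class yi defines a binary problem: yi positive, everything else negative
    return {c: base_classifier().fit(X, (y == c).astype(int)) for c in classes}

def ova_predict(models, x):
    """Apply the voting rule above; ties are broken arbitrarily by max()."""
    votes = {c: 0 for c in models}
    for c, clf in models.items():
        if clf.predict(x.reshape(1, -1))[0] == 1:
            votes[c] += 1                          # positive output: vote for c
        else:
            for other in models:                   # negative output: vote for all other classes
                if other != c:
                    votes[other] += 1
    return max(votes, key=votes.get)
```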
•Sensitive to binary classification errors
Error correcting codes
•Idea: Add redundancy to increase chances of detecting errors
•Training:
 • Represent each yi by a unique n-bit codeword
 • Build n binary classifiers, each to predict one bit
•Testing
 • Run each classifier on the test instance to predict its bit vector
 • Assign to the test instance the codeword with the closest Hamming distance to the output codeword
•Hamming distance: number of bits that differ
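A minimal sketch of this training and testing procedure, assuming NumPy arrays, scikit-learn style binary classifiers, and a dict `code` mapping each class to its codeword; the names are illustrative.

```python
import numpy as np

def ecoc_fit(X, y, code, base_classifier):
    """Train one binary classifier per codeword bit."""
    n_bits = len(next(iter(code.values())))
    bits = np.array([code[label] for label in y])      # per-instance codeword bits
    return [base_classifier().fit(X, bits[:, b]) for b in range(n_bits)]

def ecoc_predict(models, x, code):
    """Predict the bit vector, then pick the codeword with the smallest Hamming distance."""
    output = np.array([m.predict(x.reshape(1, -1))[0] for m in models])
    return min(code, key=lambda c: int(np.sum(output != np.array(code[c]))))
```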
Example
•Given: Y= {y1, y2, y3, y4}
•Encode each yi as:

Class  Codeword
y1     1 1 1 1 1 1 1
y2     0 0 0 0 1 1 1
y3     0 0 1 1 0 0 1
y4     0 1 0 1 0 1 0
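A short sketch that reproduces the Hamming-distance comparison of this example for the test output (0, 1, 1, 1, 1, 1, 1).

```python
# Sketch: distance from the test output to each codeword of this example.
codewords = {
    "y1": (1, 1, 1, 1, 1, 1, 1),
    "y2": (0, 0, 0, 0, 1, 1, 1),
    "y3": (0, 0, 1, 1, 0, 0, 1),
    "y4": (0, 1, 0, 1, 0, 1, 0),
}
test = (0, 1, 1, 1, 1, 1, 1)

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

for cls, cw in codewords.items():
    print(cls, hamming(test, cw))        # y1: 1, y2: 3, y3: 3, y4: 3
print(min(codewords, key=lambda c: hamming(test, codewords[c])))   # y1
```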
•Can correct up to (d-1)/2 errors
• d: minimum Hamming Distance between codewords
•Large row-wise separation: more tolerance for errors
•Large column-wise separation: binary classifiers are mutually independent
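A small sketch that checks these design quantities for the codeword table above: the minimum codeword length ⌈log2 k⌉ and the minimum pairwise Hamming distance d; names and layout are illustrative.

```python
from itertools import combinations
from math import ceil, log2

codewords = {
    "y1": (1, 1, 1, 1, 1, 1, 1),
    "y2": (0, 0, 0, 0, 1, 1, 1),
    "y3": (0, 0, 1, 1, 0, 0, 1),
    "y4": (0, 1, 0, 1, 0, 1, 0),
}

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

k = len(codewords)
print("minimum codeword length:", ceil(log2(k)))          # 2 bits for k = 4
d = min(hamming(codewords[a], codewords[b])
        for a, b in combinations(codewords, 2))
print("minimum Hamming distance d:", d,
      "-> corrects up to", (d - 1) // 2, "errors")         # d = 4 -> corrects 1 error
```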
Characteristics
•One vs All:
 • Builds k classifiers for a k-class problem
 • Uses the full training set for each classifier
•One vs One:
 • Builds k(k-1)/2 classifiers
 • Uses a subset of the training set for each classifier
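For a concrete sense of the difference, a tiny sketch of the classifier counts for a k-class problem.

```python
# Number of binary classifiers needed for a k-class problem under each scheme.
def num_classifiers(k):
    return {"one_vs_all": k, "one_vs_one": k * (k - 1) // 2}

print(num_classifiers(4))    # {'one_vs_all': 4, 'one_vs_one': 6}
```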
Classification – Multiclass classification
Multiclass Classification
•Character recognition
Multiclass Classification
•Image recognition
http://www.cis.temple.edu/~latecki/research.html