2011
Luckner, Marcin: Reducing Number of Classifiers in DAGSVM Based on Class Similarity. In: Image Analysis and Processing -- ICIAP 2011, Lecture Notes in Computer Science, pp. 514--523, Springer Berlin Heidelberg, 2011.

@inproceedings{Luckner2011a,
  title     = {Reducing Number of Classifiers in DAGSVM Based on Class Similarity},
  author    = {Marcin Luckner},
  url       = {http://link.springer.com/chapter/10.1007%2F978-3-642-24085-0_53},
  doi       = {10.1007/978-3-642-24085-0_53},
  year      = {2011},
  date      = {2011-01-01},
  booktitle = {Image Analysis and Processing -- ICIAP 2011, Lecture Notes in Computer Science},
  pages     = {514--523},
  publisher = {Springer Berlin Heidelberg},
  abstract  = {Support Vector Machines are excellent binary classifiers. For multi-class classification problems, individual classifiers can be collected into a directed acyclic graph structure, DAGSVM. Such a structure implements the One-Against-One strategy, in which a split is created for each pair of classes; because of the hierarchical structure, however, only a subset of them is used in any single classification. The number of classifiers can be reduced if their classification tasks are changed from separating individual classes to separating groups of classes. The proposed method is based on the similarity of classes. For similar classes, the structure of the DAG stays unchanged; for distant classes, more than one class is separated by a single classifier. This solution reduces the classification cost, while the recognition accuracy is not reduced significantly. Moreover, the number of support vectors, which influences the learning time, does not grow rapidly.},
  keywords  = {Classification, Directed Acyclic Graph, One-Against-One, Support Vector Machines},
  pubstate  = {published},
  tppubtype = {inproceedings}
}
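The abstract's key point is that a One-Against-One ensemble trains K(K-1)/2 pairwise classifiers, but the DAG decision list evaluates only K-1 of them per sample. The paper itself ships no code; the following is a minimal sketch of the standard DAGSVM evaluation loop, with the `pairwise` classifier table a hypothetical stand-in for trained binary SVMs (illustration only, not the paper's reduction method):

```python
def dagsvm_predict(x, classes, pairwise):
    """Classify x by walking a DAGSVM decision list.

    classes  -- list of candidate class labels
    pairwise -- pairwise[(a, b)] is a binary classifier returning a or b;
                with K classes there are K(K-1)/2 such classifiers, but
                only K-1 are evaluated on the path from root to leaf.
    """
    candidates = list(classes)
    while len(candidates) > 1:
        a, b = candidates[0], candidates[-1]
        winner = pairwise[(a, b)](x)
        # The DAG node eliminates the losing class from further consideration.
        if winner == a:
            candidates.pop()      # b is ruled out
        else:
            candidates.pop(0)     # a is ruled out
    return candidates[0]
```

With toy nearest-centroid "classifiers" standing in for SVMs, three classes need only two evaluations per prediction, which is the cost saving the DAG structure provides even before the paper's class-grouping reduction is applied.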