Show simple item record

dc.contributor.advisor    Trafalis, Theodore B.    en_US
dc.contributor.author    Oladunni, Olutayo O.    en_US
dc.date.accessioned    2013-08-16T12:20:02Z
dc.date.available    2013-08-16T12:20:02Z
dc.date.issued    2006    en_US
dc.identifier.uri    https://hdl.handle.net/11244/986
dc.description.abstract    In this study, the problem of discriminating between objects of two or more classes, with or without prior knowledge, is investigated. We show how a two-class discrimination model with or without prior knowledge can be extended to multi-categorical discrimination with or without prior knowledge. The prior knowledge of interest takes the form of multiple polyhedral sets belonging to one or more categories, classes, or labels, and is introduced as additional constraints in the classification model formulation. The solution of the knowledge-based support vector machine (KBSVM) model for two-class discrimination is characterized by a linear programming (LP) problem, owing to the specific norm (L1 or L∞) used to compute the distance between the two classes. We propose solutions to classification problems, with or without prior knowledge, expressed as a single unconstrained optimization problem with a regularized least-squares cost function; this yields a linear system of equations in input space and/or in the dual space induced by a kernel function, which can be solved using matrix methods or iterative methods. Advantages of this formulation include explicit expressions for the classification weights of the classifier(s); the ability to incorporate prior knowledge directly into the classifiers; and the ability to handle several classes in a single formulation, providing fast solutions for the optimal classification weights in multi-categorical separation.    en_US
dc.description.abstract    Comparisons with other learning techniques, such as the least-squares SVM and MSVM developed by Suykens and Vandewalle (1999b, 1999c) and the knowledge-based SVM developed by Fung et al. (2002), indicate that the regularized least-squares methods are more efficient in terms of misclassification testing error and computational time.    en_US
dc.format.extent    xv, 164 leaves    en_US
dc.subject    Linear programming.    en_US
dc.subject    Least squares--Computer programs.    en_US
dc.subject    Machine learning.    en_US
dc.subject    Engineering, Industrial.    en_US
dc.title    Least square multi-class kernel machines with prior knowledge and applications.    en_US
dc.type    Thesis    en_US
dc.thesis.degree    Ph.D.    en_US
dc.thesis.degreeDiscipline    School of Industrial and Systems Engineering    en_US
dc.note    Source: Dissertation Abstracts International, Volume: 67-01, Section: B, page: 0473.    en_US
dc.note    Adviser: Theodore B. Trafalis.    en_US
ou.identifier    (UMI)AAI3207058    en_US
ou.group    College of Engineering::School of Industrial and Systems Engineering
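The abstract describes a regularized least-squares formulation, in the style of the LS-SVM of Suykens and Vandewalle, in which training reduces to solving a single linear system in the kernel-induced dual space rather than a constrained optimization problem. A minimal sketch of that idea follows; this is not the dissertation's exact formulation, and the RBF kernel, the regularization constant C, and all function names are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample matrices X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=1.0):
    # LS-SVM-style training: the optimality conditions of the regularized
    # least-squares problem form one linear system,
    #   [ 0    1^T       ] [b]   [0]
    #   [ 1    K + I/C   ] [a] = [y]
    # solved directly with matrix methods, as the abstract indicates.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvm_predict(X_train, alpha, b, X_new, gamma=1.0):
    # Decision rule: sign of the kernel expansion plus bias.
    return np.sign(rbf_kernel(X_new, X_train, gamma) @ alpha + b)
```

The explicit closed-form solve illustrates the advantage claimed in the abstract: the classification weights come from one linear system, with no iterative constrained optimization required.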

