dc.contributor.advisor | Trafalis, Theodore B. | en_US
dc.contributor.author | Ince, Huseyin. | en_US
dc.date.accessioned | 2013-08-16T12:18:49Z
dc.date.available | 2013-08-16T12:18:49Z
dc.date.issued | 2002 | en_US
dc.identifier.uri | https://hdl.handle.net/11244/533
dc.description.abstract | The theory of the Support Vector Machine (SVM) algorithm is based on statistical learning theory and can be applied to pattern recognition and regression. Training of SVMs leads to either a quadratic programming (QP) problem or a linear programming (LP) problem, depending on the specific norm used when the distance between the two classes is computed. The l1 or l-infinity norm distance leads to a large-scale linear programming problem when the sample size is very large. We propose to apply the Benders decomposition technique to the resulting LP for the regression case. In addition, other decomposition techniques such as support clusters and bagging methods have been developed. A very efficient data preprocessing method for SVMs, called gamma-SVM, has also been developed; it reduces the size of the training set while preserving the important data points that are strong candidates to become support vectors defining the decision function. | en_US
dc.description.abstract | Comparisons with other decomposition methods such as SVMTorch and SVMFu reveal that the gamma-SVM, support clusters, subsampling, and bagging methods are more efficient in terms of CPU time. The gamma-SVM method outperforms SVMFu and SVMTorch in generalization error. For the support clusters, subsampling, and bagging methods, the generalization error is close to or higher than that of SVMFu and SVMTorch; some information is lost in exchange for speeding up the algorithm. SVM regression has also been applied to an option pricing model and to prediction of the S&P 500 daily return. Comparisons with multilayer perceptron and radial basis function networks are also presented. | en_US
dc.format.extent | xv, 120 leaves | en_US
dc.subject | Quadratic programming. | en_US
dc.subject | Computer Science. | en_US
dc.subject | Operations Research. | en_US
dc.subject | Linear programming. | en_US
dc.subject | Data mining. | en_US
dc.subject | Decomposition (Mathematics) | en_US
dc.subject | Engineering, Industrial. | en_US
dc.title | Decomposition techniques for support vector machines training and applications. | en_US
dc.type | Thesis | en_US
dc.thesis.degree | Ph.D. | en_US
dc.thesis.degreeDiscipline | School of Industrial and Systems Engineering | en_US
dc.note | Adviser: Theodore B. Trafalis. | en_US
dc.note | Source: Dissertation Abstracts International, Volume: 63-11, Section: B, page: 5442. | en_US
ou.identifier | (UMI)AAI3070638 | en_US
ou.group | College of Engineering::School of Industrial and Systems Engineering
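
Illustrative note: the abstract above states that using the l1 (or l-infinity) norm turns SVM training into a linear programming problem. The Python sketch below is a minimal, generic l1-norm support vector regression formulated as an LP and solved with SciPy's linprog; it is not the dissertation's Benders-decomposed, support-cluster, or gamma-SVM algorithm, and the epsilon-tube width, regularization constant C, and toy data are assumed values chosen only for illustration.

    # Generic l1-norm SVM regression (LP-SVR) sketch, solved as a single LP.
    # NOT the dissertation's Benders decomposition; parameters below are
    # illustrative assumptions.
    import numpy as np
    from scipy.optimize import linprog

    def lp_svr_fit(X, y, C=1.0, eps=0.1):
        """Minimize ||w||_1 + C*sum(xi + xi*) subject to the epsilon-insensitive
        constraints |y_i - (x_i.w + b)| <= eps + slack."""
        n, d = X.shape
        # Nonnegative variable layout: [w+ (d), w- (d), b+, b-, xi (n), xi* (n)]
        nvar = 2 * d + 2 + 2 * n
        c = np.concatenate([np.ones(2 * d), [0.0, 0.0], C * np.ones(2 * n)])

        # y_i - (x_i.w + b) <= eps + xi_i
        A1 = np.hstack([-X, X, -np.ones((n, 1)), np.ones((n, 1)),
                        -np.eye(n), np.zeros((n, n))])
        # (x_i.w + b) - y_i <= eps + xi*_i
        A2 = np.hstack([X, -X, np.ones((n, 1)), -np.ones((n, 1)),
                        np.zeros((n, n)), -np.eye(n)])
        A_ub = np.vstack([A1, A2])
        b_ub = np.concatenate([eps - y, eps + y])

        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * nvar, method="highs")
        w = res.x[:d] - res.x[d:2 * d]
        b = res.x[2 * d] - res.x[2 * d + 1]
        return w, b

    # Tiny synthetic example: y = 2x + 1 with a little noise.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(50, 1))
    y = 2.0 * X[:, 0] + 1.0 + 0.05 * rng.standard_normal(50)
    w, b = lp_svr_fit(X, y)
    print("w =", w, "b =", b)

The LP has 2d + 2 + 2n variables and 2n constraints, so solving it directly becomes impractical as the sample size grows; that scaling issue is what motivates the decomposition and data-reduction approaches described in the abstract.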

