Abstract

In this work, we provide an exposition of the support vector machine classifier (SVMC) algorithm. We show that SVMCs produce a linear discriminant, specifically the maximal margin hyperplane, and we detail how the method of Lagrange multipliers and the Kuhn-Tucker conditions can be used to find this hyperplane. We also present some of the most popular kernels: the Gaussian radial basis function kernel, the inhomogeneous polynomial kernel, and the hyperbolic tangent kernel, and discuss their advantages in the context of SVMCs. To illustrate SVMCs in use, we perform a series of experiments on the Titanic dataset, made available by Kaggle. Finally, we present SVMC Visualizer, a web-based graphical user interface developed in the R programming language that can be used to experiment with SVMCs under various feature maps and kernels on user-specified datasets. We developed SVMC Visualizer to help users better understand support vector machine classification.
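The three kernels named above can be written down directly. The following is a minimal Python sketch of their standard textbook forms; the parameter names and default values (gamma, degree, coef0, kappa, theta) are illustrative assumptions, not values taken from the thesis.

```python
import math

def dot(x, z):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(x, z))

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian radial basis function kernel: exp(-gamma * ||x - z||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

def poly_kernel(x, z, degree=3, coef0=1.0):
    """Inhomogeneous polynomial kernel: (<x, z> + coef0)^degree."""
    return (dot(x, z) + coef0) ** degree

def tanh_kernel(x, z, kappa=1.0, theta=0.0):
    """Hyperbolic tangent (sigmoid) kernel: tanh(kappa * <x, z> + theta)."""
    return math.tanh(kappa * dot(x, z) + theta)

x, z = [1.0, 0.0], [0.0, 1.0]
print(rbf_kernel(x, z))   # exp(-0.5 * 2) = exp(-1), since ||x - z||^2 = 2
print(poly_kernel(x, z))  # (0 + 1)^3 = 1.0
print(tanh_kernel(x, x))  # tanh(1 * 1 + 0) = tanh(1)
```

Each function computes a similarity between two points; in an SVMC, such a kernel replaces the inner product of explicitly mapped features, which is what makes nonlinear decision boundaries tractable.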

Advisor

Visa, Sofia

Department

Computer Science

Disciplines

Artificial Intelligence and Robotics | Operational Research

Keywords

SVM, Support Vector Machines, Support Vector Machine Classification, Support Vector Machine Classifiers, Kernels

Publication Date

2016

Degree Granted

Bachelor of Arts

Document Type

Senior Independent Study Thesis

ui.R (3 kB)
Part of SVMC Visualizer

server.R (7 kB)
Part of SVMC Visualizer


© Copyright 2016 Dagm Zegeye