Euromicro Symposium on Digital System Design, 2004. DSD 2004.

Abstract

In this paper we present an analysis of the minimal hardware precision required to implement Support Vector Machine (SVM) classification within a Logarithmic Number System (LNS) architecture. Support Vector Machines are fast emerging as a powerful machine-learning tool for pattern recognition, decision-making and classification. Logarithmic Number Systems exploit logarithmic compression for numerical operations: within the logarithmic domain, multiplication and division reduce to addition and subtraction, so hardware computation of these operations is significantly faster and less complex. Leveraging these inherent properties of LNS, we achieve significant savings over double-precision floating point in an implementation of an SVM classification algorithm.
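As a rough illustration of the LNS property the abstract relies on (this is a minimal sketch, not the paper's hardware design; the type names and the use of a floating-point logarithm are assumptions for illustration, whereas a real LNS datapath would store the logarithm in fixed point), the following C example stores each value as its base-2 logarithm plus a sign, so that multiplication and division collapse to addition and subtraction of the stored logarithms.

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical LNS-style representation: magnitude kept as log2(|x|),
 * sign kept separately. Multiplication/division of values becomes
 * addition/subtraction of the stored logarithms. */
typedef struct {
    double log2_mag;   /* log2(|x|); real hardware would use a fixed-point format */
    int    sign;       /* +1 or -1 */
} lns_t;

static lns_t lns_from_double(double x) {
    lns_t v = { log2(fabs(x)), (x < 0.0) ? -1 : 1 };
    return v;
}

static double lns_to_double(lns_t v) {
    return v.sign * exp2(v.log2_mag);
}

/* Multiplication: add logarithms, combine signs. */
static lns_t lns_mul(lns_t a, lns_t b) {
    lns_t r = { a.log2_mag + b.log2_mag, a.sign * b.sign };
    return r;
}

/* Division: subtract logarithms, combine signs. */
static lns_t lns_div(lns_t a, lns_t b) {
    lns_t r = { a.log2_mag - b.log2_mag, a.sign * b.sign };
    return r;
}

int main(void) {
    lns_t a = lns_from_double(6.0), b = lns_from_double(-1.5);
    printf("6.0 * -1.5 = %f\n", lns_to_double(lns_mul(a, b)));  /* -9.0 */
    printf("6.0 / -1.5 = %f\n", lns_to_double(lns_div(a, b)));  /* -4.0 */
    return 0;
}
```

Because the costly operations in SVM kernel evaluation are dominated by multiplications, mapping them to additions in this way is what yields the hardware savings the paper quantifies; addition and subtraction of LNS values, by contrast, require interpolation tables and are where the precision analysis matters.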