
Applicability of approximate multipliers in hardware neural networks
Lotrič, Uroš (Author), Bulić, Patricio (Author)

URL: http://eprints.fri.uni-lj.si/1751/ (presentation file)

Abstract
In recent years there has been growing interest in hardware neural networks, which offer many benefits over conventional software models, mainly in applications where speed, cost, reliability, or energy efficiency are of great importance. These hardware neural networks require many resource-, power-, and time-consuming multiplication operations, so special care must be taken during their design. Since neural network processing can be performed in parallel, designs usually call for as many concurrent multiplication circuits as possible. One way to achieve this goal is to replace the complex exact multiplying circuits with simpler, approximate ones. The present work demonstrates the application of approximate multiplying circuits in the design of a feed-forward neural network model with on-chip learning ability. Experiments performed on the heterogeneous Proben1 benchmark dataset show that the adaptive nature of the neural network model successfully compensates for the calculation errors of the approximate multiplying circuits. At the same time, the proposed designs also profit from more computing power and increased energy efficiency.
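The keywords name the iterative logarithmic multiplier as the approximate circuit in question. As a rough illustration of the idea behind such multipliers (a sketch only, not the paper's hardware design; the function name and iteration scheme here are illustrative), each operand is split into its leading power of two plus a residue, the cross terms are summed as the base approximation, and the dropped residue product can be re-approximated iteratively to add correction terms:

```python
def ilm(a, b, iterations=1):
    """Approximate product of two positive integers in the spirit of an
    iterative logarithmic multiplier: a base approximation plus a chosen
    number of correction passes (illustrative sketch, not the paper's RTL)."""
    def approx(x, y):
        # Split each operand into its leading power of two and a residue:
        # x = 2^k1 + xr, y = 2^k2 + yr.
        k1, k2 = x.bit_length() - 1, y.bit_length() - 1
        xr, yr = x - (1 << k1), y - (1 << k2)
        # Base approximation drops the residue product xr * yr:
        # x*y ~= 2^(k1+k2) + xr*2^k2 + yr*2^k1.
        p = (1 << (k1 + k2)) + (xr << k2) + (yr << k1)
        return p, xr, yr

    if a == 0 or b == 0:
        return 0
    total, xr, yr = approx(a, b)
    # Each correction pass re-approximates the dropped residue product,
    # shrinking the error; with enough passes the result becomes exact.
    for _ in range(iterations):
        if xr == 0 or yr == 0:
            break
        corr, xr, yr = approx(xr, yr)
        total += corr
    return total
```

For example, `ilm(9, 9, 0)` gives 80 instead of 81, and a single correction pass recovers the exact product; in hardware, each pass costs one extra (simple) pipeline stage, which is the area/accuracy trade-off the paper's adaptive network is shown to tolerate.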

Language: Unknown
Keywords: Hardware neural network, Iterative logarithmic multiplier, FPGA, Digital design, Computer arithmetic
Work type: Not categorized (r6)
Organization: FRI - Faculty of Computer and Information Science
Publisher: Elsevier
Pages: 1-11
ISSN: 0925-2312
Views: 353
Downloads: 174