Training Neural Networks With Fewer Quantization Bits

TBMG-32162

07/01/1998

Abstract

A method for reducing the number of bits of quantization of synaptic weights during training of an artificial neural network involves the use of the cascade back-propagation learning algorithm. The development of neural networks with adequate synaptic-weight resolution in very-large-scale integrated (VLSI) circuitry poses considerable problems of overall size, power consumption, complexity, and connection density. Reducing the required number of bits from the present typical value of 12 to as few as 5 could therefore facilitate and accelerate development.
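
The article does not include source code; the sketch below is only an illustrative Python example, not the cascade back-propagation algorithm itself. It trains a small two-layer network with ordinary back-propagation while rounding every synaptic weight to a 5-bit uniform grid after each update. The bit width, weight range, network size, and learning rate are all assumptions chosen for illustration.

```python
# Illustrative sketch (assumed names and hyperparameters; not the article's
# cascade back-propagation implementation): ordinary back-propagation on a
# tiny two-layer network, with every synaptic weight rounded to a 5-bit
# uniform grid after each update.
import numpy as np

def quantize(w, n_bits=5, w_max=4.0):
    """Round weights to a uniform grid of 2**n_bits levels on [-w_max, w_max]."""
    step = 2.0 * w_max / (2 ** n_bits - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

# Weights start on, and always return to, the quantized grid.
W1 = quantize(rng.uniform(-1, 1, (2, 4)))
W2 = quantize(rng.uniform(-1, 1, (4, 1)))
lr = 0.5

for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Back-propagated error signals for the squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient step, then re-quantization of every weight
    W2 = quantize(W2 - lr * h.T @ d_out)
    W1 = quantize(W1 - lr * X.T @ d_h)

print("final mean squared error with 5-bit weights:", np.mean((out - y) ** 2))
```

Whether the error still falls at such coarse weight resolution is exactly the difficulty the cascade back-propagation approach is intended to address; at 12 bits the same loop should behave much like unquantized training.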

Details
Citation
"Training Neural Networks With Fewer Quantization Bits," Mobility Engineering, July 1, 1998.
Additional Details
Publisher
Published
Jul 1, 1998
Product Code
TBMG-32162
Content Type
Magazine Article
Language
English