Faster Computation for Deep-Learning Applications

TBMG-47336

01/01/2023

Abstract

A new area of artificial intelligence called analog deep learning promises faster computation with a fraction of the energy usage. Programmable resistors are the key building blocks in analog deep learning, just like transistors are the core elements for digital processors. By repeating arrays of programmable resistors in complex layers, researchers can create a network of analog artificial “neurons” and “synapses” that execute computations just like a digital neural network. This network can then be trained to achieve complex AI tasks like image recognition and natural language processing.
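The core operation such a crossbar of programmable resistors performs is a matrix-vector multiplication: each resistor's conductance acts as a synaptic weight, input voltages drive the rows, and the current collected on each column is the weighted sum of the inputs. The short Python sketch below is only an illustration of that general principle; the array size, conductance values, and voltages are assumed for the example and are not taken from the article.

# Illustrative sketch (not from the article): a crossbar of programmable
# resistors computes a matrix-vector product in one analog step, the core
# operation of a neural-network layer. Conductances play the role of
# synaptic weights; by Ohm's law (I = G * V) and Kirchhoff's current law,
# each column current is the weighted sum of the row voltages.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3x4 crossbar: conductances in siemens act as weights
# (3 output lines, 4 input lines).
conductances = rng.uniform(1e-6, 1e-5, size=(3, 4))

# Input voltages applied to the rows, one per input "neuron".
voltages = np.array([0.2, -0.1, 0.3, 0.05])

# Column currents: each output line sums I = G * V over its resistors.
currents = conductances @ voltages

# A simple activation turns the analog sums into "neuron" outputs.
outputs = np.maximum(currents, 0.0)

print(outputs)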

Details

Citation: "Faster Computation for Deep-Learning Applications," Mobility Engineering, January 1, 2023.

Additional Details
Publisher:
Published: Jan 1, 2023
Product Code: TBMG-47336
Content Type: Magazine Article
Language: English