Analogue Imprecision in MLP Training, Progress in Neural Processing, Vol. 4

By Peter Edwards · Progress in Neural Processing, Book 4 · World Scientific
Ebook · 192 pages

About this ebook

Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance. The aim of the book is to show how including an imprecision model in the learning scheme, as a “fault tolerance hint”, can aid understanding of the accuracy and precision requirements of a particular implementation. In addition, the study shows how such a scheme can give rise to significant performance enhancement.
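The scheme described above can be illustrated briefly. The following Python sketch, which is not taken from the book, trains a small MLP on XOR while injecting multiplicative Gaussian noise into the synaptic weights on every forward pass, so gradient descent is steered toward solutions that tolerate analogue imprecision. The task, layer sizes, learning rate, and NOISE_STD value are all assumptions chosen for demonstration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy task: XOR, a classic MLP benchmark.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])

    # One-hidden-layer MLP; sizes and noise level are illustrative choices.
    W1 = rng.normal(0.0, 0.5, (2, 4))
    b1 = np.zeros(4)
    W2 = rng.normal(0.0, 0.5, (4, 1))
    b2 = np.zeros(1)

    NOISE_STD = 0.05   # assumed relative weight-noise level, not from the book
    LR = 0.5

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(20000):
        # Imprecision model: multiplicative Gaussian noise on every
        # synaptic weight, redrawn on each forward pass.
        nW1 = W1 * (1.0 + rng.normal(0.0, NOISE_STD, W1.shape))
        nW2 = W2 * (1.0 + rng.normal(0.0, NOISE_STD, W2.shape))

        # Forward pass through the noisy weights.
        h = sigmoid(X @ nW1 + b1)
        out = sigmoid(h @ nW2 + b2)

        # Backprop of squared error through the noisy weights; updates are
        # applied to the clean stored weights, so learning "sees" the
        # imprecision and is pushed toward noise-tolerant solutions.
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ nW2.T) * h * (1.0 - h)

        W2 -= LR * (h.T @ d_out)
        b2 -= LR * d_out.sum(axis=0)
        W1 -= LR * (X.T @ d_h)
        b1 -= LR * d_h.sum(axis=0)

    print(np.round(out, 2))   # should approach [[0], [1], [1], [0]]

Because training sees the perturbations, the resulting network degrades more gracefully as weight noise rises than one trained on clean weights alone; this noise-during-learning effect is the kind of accuracy and performance trade-off the book examines.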


