Parallel Architectures for Artificial Neural Networks: Paradigms and Implementations / Edition 1

Hardcover (Print)

Overview

This excellent reference for all those involved in neural network research and application presents, in a single text, the necessary aspects of parallel implementation for all major artificial neural network models. The book details implementations on various processor architectures (ring, torus, etc.) built on different hardware platforms, ranging from large general-purpose parallel computers to custom-built MIMD machines using transputers and DSPs.

The chapters are authored by the experts who performed the implementations, and each chapter presents their research results. These results are divided into three parts:

  • Theoretical analysis of parallel implementation schemes on MIMD message-passing machines.
  • Details of parallel implementation of backpropagation (BP) neural networks on a large general-purpose parallel computer.
  • Four chapters, each describing a special-purpose parallel neural computer configuration.

This book is aimed at graduate students and researchers working in artificial neural networks and parallel computing. Graduate-level educators can use it to illustrate the methods of parallel computing for ANN simulation, and its lucid mathematical analyses make it an ideal reference for practitioners in the field.

"...an excellent reference presenting the parallel implementation aspects of all major artificial neural network models with lucid mathematical analyses...ideal for graduate students, researchers, and educators."


Editorial Reviews

Booknews
Details implementations on various processor architectures built on different hardware platforms ranging from general purpose parallel computers to custom built MIMD machines. The 12 contributions are divided into sections on: the theoretical analysis of parallel implementation schemes on MIMD message passing machines; the details of parallel implementation of BP neural networks on large parallel computers; and four specific purpose parallel neural computer configurations. Specific topics include network parallelism for backpropagation neural networks on a heterogeneous architecture, parallel real-time recurrent algorithm for training large fully recurrent neural networks, and regularly structured neural networks on the DREAM machine. Intended for graduate students and researchers working with neural networks, parallel computers, and ANN simulation. Annotation c. by Book News, Inc., Portland, Or.

Product Details

  • ISBN-13: 9780818683992
  • Publisher: Wiley
  • Publication date: 12/14/1998
  • Series: Systems Series, #4
  • Edition number: 1
  • Pages: 412
  • Product dimensions: 7.20 (w) x 10.39 (h) x 1.09 (d)

Table of Contents

1. Introduction (N. Sundararajan, P. Saratchandran, Jim Torresen).

2. A Review of Parallel Implementations of Backpropagation Neural Networks (Jim Torresen, Olav Landsverk).

I: Analysis of Parallel Implementations.

3. Network Parallelism for Backpropagation Neural Networks on a Heterogeneous Architecture (R. Arularasan, P. Saratchandran, N. Sundararajan, Shou King Foo).

4. Training-Set Parallelism for Backpropagation Neural Networks on a Heterogeneous Architecture (Shou King Foo, P. Saratchandran, N. Sundararajan).

5. Parallel Real-Time Recurrent Algorithm for Training Large Fully Recurrent Neural Networks (Elias S. Manolakos, George Kechriotis).

6. Parallel Implementation of ART1 Neural Networks on Processor Ring Architectures (Elias S. Manolakos, Stylianos Markogiannakis).

II: Implementations on a Big General-Purpose Parallel Computer.

7. Implementation of Backpropagation Neural Networks on Large Parallel Computers (Jim Torresen, Shinji Tomita).

III: Special Parallel Architectures and Application Case Studies.

8. Massively Parallel Architectures for Large-Scale Neural Network Computations (Yoshiji Fujimoto).

9. Regularly Structured Neural Networks on the DREAM Machine (Soheil Shams, Jean-Luc Gaudiot).

10. High-Performance Parallel Backpropagation Simulation with On-Line Learning (Urs A. Müller, Patrick Spiess, Michael Kocheisen, Beat Flepp, Anton Gunzinger, Walter Guggenbühl).

11. Training Neural Networks with SPERT-II (Krste Asanović, James Beck, David Johnson, Brian Kingsbury, Nelson Morgan, John Wawrzynek).

12. Concluding Remarks (N. Sundararajan, P. Saratchandran).

