New Trends in Neural Computation: International Workshop on Artificial Neural Networks, IWANN'93, Sitges, Spain, June 9-11, 1993. Proceedings / Edition 1

Paperback (Print)


Neural computation arises from the capacity of nervous tissue to process information and accumulate knowledge in an intelligent manner. Conventional computational machines have encountered enormous difficulties in duplicating such functionality. This has given rise to the development of artificial neural networks, in which computation is distributed over a great number of local processing elements with a high degree of connectivity, and in which external programming is replaced by supervised and unsupervised learning.

The papers presented in this volume are carefully reviewed versions of the talks delivered at the International Workshop on Artificial Neural Networks (IWANN '93), organized by the Universities of Catalonia and the Spanish Open University at Madrid and held in Barcelona, Spain, in June 1993. The 111 papers are organized in seven sections: biological perspectives, mathematical models, learning, self-organizing networks, neural software, hardware implementation, and applications (the last in five subsections: signal processing and pattern recognition, communications, artificial vision, control and robotics, and other applications).


Product Details

  • ISBN-13: 9783540567981
  • Publisher: Springer Berlin Heidelberg
  • Publication date: 7/9/1993
  • Series: Lecture Notes in Computer Science, #686
  • Edition description: 1993
  • Edition number: 1
  • Pages: 754
  • Product dimensions: 9.21 (w) x 6.14 (h) x 1.51 (d)

Table of Contents

Biophysics of neural computation.- Integrated learning in Rana computatrix.- A model for centering visual stimuli through adaptive value learning.- A model for the development of neurons selective to visual stimulus size.- An invariant representation mechanism after presynaptic inhibition.- The pancreatic B-cell as a voltage-controlled oscillator.- Approximation of the solution of the dendritic cable equation by a small series of coupled differential equations.- A neural network model inspired in global appreciations about the thalamic reticular nucleus and cerebral cortex connectivity.- Towards more realistic self-contained models of neurons: High-order, recurrence and local learning.- McCulloch's neurons revisited.- Biologically motivated approach to face recognition.- Learning by reinforcement: A psychobiological model.- A neural state machine for iconic language representation.- Variable binding using serial order in recurrent neural networks.- Region of influence (ROI) networks. Model and implementation.- A node splitting algorithm that reduces the number of connections in a Hamming distance classifying network.- A high order neural model.- Higher-order networks for the optimization of block designs.- Neural Bayesian classifier.- Constructive methods for a new classifier based on a radial-basis-function neural network accelerated by a tree.- Practical realization of a radial basis function network for handwritten digit recognition.- Design of fully and partially connected random neural networks for pattern completion.- Representation and recognition of regular grammars by means of second-order recurrent neural networks.- Connectionist models for syllabic recognition in the time domain.- Sparsely interconnected artificial neural networks for associative memories.- Dynamic analysis of networks of neural oscillators.- Optimised attractor neural networks with external inputs.- Non-orthogonal bases and metric tensors: An application to artificial neural networks.-
Genetic synthesis of discrete-time recurrent neural network.- Optimization of a competitive learning neural network by genetic algorithms.- Adaptive models in neural networks.- Self-organizing grammar induction using a neural network model.- The role of forgetting in efficient learning strategies for self-organising discriminator-based systems.- Simulation of stochastic regular grammars through simple recurrent networks.- Local stochastic competition and vector quantization.- MHC — An evolutive connectionist model for hybrid training.- Fast-convergence learning algorithms for multi-level and binary neurons and solution of some image processing problems.- Invariant object recognition using Fahlman and Lebiere's learning algorithm.- Realization of subjective correspondence in artificial neural network trained by Fahlman and Lebiere's learning algorithm.- Bimodal distribution removal.- A simplified ARTMAP architecture for real-time learning.- B-Learning: A reinforcement learning algorithm, comparison with dynamic programming.- Increased complexity training.- Optimized learning for improving the evolution of piecewise linear separation incremental algorithms.- A method of pruning layered feed-forward neural networks.- Tests of different regularization terms in small networks.- On the distribution of feature space in Self-Organising mapping and convergence accelerating by a Kalman algorithm.- A learning algorithm to obtain self-organizing maps using fixed neighbourhood Kohonen networks.- Analysing a contingency table with Kohonen maps: A factorial correspondence analysis.- Dynamics of self-organized feature mapping.- Comparative study of self-organizing neural networks.- GANNet: A genetic algorithm for optimizing topology and weights in neural network design.- Vector quantization and projection neural network.- Constructive design of LVQ and DSM classifiers.- Linear vector classification: An improvement on LVQ algorithms to create classes of patterns.- Non-greedy adaptive vector quantizers.-
Hybrid programming environments.- Automatic generation of C++ code for neural network simulation.- Urano: An object-oriented artificial neural network simulation tool.- Realistic simulation tool for early visual processing including space, time and colour data.- Language supported storage and reuse of persistent neural network objects.- Flexible operating environment for matrix based neurocomputers.- A parallel implementation of Kohonen's self-organizing maps on the smart neurocomputer.- Simulation of neural networks in a distributed computing environment using Neurograph.- Full automatic ANN design: A genetic approach.- Hardware implementations of artificial neural networks.- A neural network chip using CPWM modulation.- Hardware implementation of a neural network for high energy physics application.- MapA: An array processor architecture for neural networks.- Limitation of connectionism in MLP.- High level synthesis of neural network chips.- Neural network simulations on massively parallel computers: Applications in chemical physics.- A model based approach to the performance analysis of multi-layer networks realised in linear systolic arrays.- The temporal noisy-leaky integrator neuron with additional inhibitory inputs.- Architectures for self-learning neural network modules.- The generic neuron architectural framework for the automatic generation of ASICs.- A RISC architecture to support neural net simulation.- Hardware design for self organizing feature maps with binary input vectors.- The Kolmogorov signal processor.- Projectivity invariant pattern recognition with high-order neural networks.- Rejection of incorrect answers from a neural net classifier.- Nonlinear time series modeling by competitive segmentation of state space.- Identification and prediction of non-linear models with recurrent neural network.- Use of unsupervised neural networks for classification of blood pressure time series.- Application of artificial neural networks to chest image classification.-
Combination of self-organizing maps and multilayer perceptrons for speaker independent isolated word recognition.- An industrial application of neural networks to natural textures classification.- Use of a layered neural net as a display method for n-dimensional distributions.- MLP modular versus yprel classifiers.- How many hidden neurons are needed to recognize a symmetrical pattern?.- Hopfield neural network for routing.- Neural network routing controller for communication parallel multistage interconnection networks.- Adaptive routing using cellular automata.- Optimal blind equalization of Gaussian channels.- Noise prediction in urban traffic by a neural approach.- A connectionist approach to the correspondence problem in computer vision.- Self-organizing feature maps for image segmentation.- Recognition of fractal images using a neural network.- Feed forward network for vehicle license character recognition.- Interpretation of optical flow through complex neural network.- CT image segmentation by self-organizing learning.- Texture image segmentation using a modified Hopfield network.- Image compression with self-organizing networks.- Neural networks as direct adaptive controllers.- A neural adaptive controller for a turbofan exhaust nozzle.- Feed-forward neural networks for bioreactor control.- Learning networks for process identification and associative action.- On-line performance enhancement of a behavioral neural network controller.- An architecture for implementing control and signal processing neural networks.- Planlite: Adaptive planning using weightless systems.- Stock prices and volume in an artificial adaptive stock market.- Application of the Fuzzy ARTMAP neural network architecture to bank failure predictions.- Combination of neural network and statistical methods for sensory evaluation of biological products: On-line beauty selection of flowers.- An adaptive information retrieval system based on Neural Networks.- Software pattern EEG recognition after a wavelet transform by a neural network.
