The effort to build machines that can learn and perform tasks such as data mining, image processing and pattern recognition has led to the development of artificial neural networks in which learning from examples can be described and understood. The contribution made to this subject over the past decade by researchers applying the techniques of statistical mechanics is the subject of this book. The authors provide a coherent account of various important concepts and techniques that are currently found only scattered in papers, supplement this with background material in mathematics and physics, and include many examples and exercises.
Publisher: Cambridge University Press
Edition description: New Edition
Product dimensions: 6.85(w) x 9.72(h) x 0.71(d)
Table of Contents
1. Getting started
2. Perceptron learning - basics
3. A choice of learning rules
4. Augmented statistical mechanics formulation
5. Noisy teachers
6. The storage problem
7. Discontinuous learning
8. Unsupervised learning
9. On-line learning
10. Making contact with statistics
11. A bird's eye view: multifractals
12. Multilayer networks
13. On-line learning in multilayer networks
14. What else?
Appendix A. Basic mathematics
Appendix B. The Gardner analysis
Appendix C. Convergence of the perceptron rule
Appendix D. Stability of the replica symmetric saddle point
Appendix E. 1-step replica symmetry breaking
Appendix F. The cavity approach
Appendix G. The VC-theorem