Statistical Field Theory for Neural Networks

ISBN-10: 3030464431
ISBN-13: 9783030464431
Pub. Date: 08/20/2020
Publisher: Springer International Publishing
$84.99

Overview

This book presents a self-contained introduction to techniques from field theory applied to stochastic and collective dynamics in neuronal networks. These powerful analytical techniques, well established in other fields of physics, underlie current developments and offer solutions to pressing open problems in theoretical neuroscience as well as machine learning. They enable a systematic and quantitative understanding of the dynamics of recurrent and stochastic neuronal networks.

This book is intended for physicists, mathematicians, and computer scientists. It is designed for self-study by researchers who want to enter the field, or as the main text for a one-semester course at the advanced undergraduate or graduate level. The theoretical concepts are developed systematically from the very beginning, requiring only basic knowledge of analysis and linear algebra.
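As a flavor of the kind of system these methods address, the following is an illustrative sketch (not taken from the book) of a stochastic recurrent rate network simulated with the Euler-Maruyama scheme; the network size, coupling gain, and noise amplitude are arbitrary demonstration choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, g, sigma = 100, 0.5, 0.1      # neurons, coupling gain, noise amplitude
dt, steps = 0.01, 5000
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # random coupling matrix

x = np.zeros(N)                  # neuronal activity variables
for _ in range(steps):
    # dx = (-x + J tanh(x)) dt + sigma dW, one Euler-Maruyama step
    x += (-x + J @ np.tanh(x)) * dt + sigma * np.sqrt(dt) * rng.standard_normal(N)

print(float(np.std(x)))          # noise-driven fluctuations at an O(sigma) scale
```

With the subcritical gain chosen here, the deterministic dynamics decay and the noise sustains small stationary fluctuations; field-theoretic methods such as dynamic mean-field theory make this behavior quantitative for large N.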


Product Details

ISBN-13: 9783030464431
Publisher: Springer International Publishing
Publication date: 08/20/2020
Series: Lecture Notes in Physics , #970
Edition description: 1st ed. 2020
Pages: 203
Product dimensions: 6.10(w) x 9.25(h)

About the Author

Moritz Helias is group leader at the Jülich Research Centre and assistant professor in the department of physics of the RWTH Aachen University, Germany. He obtained his diploma in theoretical solid state physics at the University of Hamburg and his PhD in computational neuroscience at the University of Freiburg, Germany. Post-doctoral positions in RIKEN Wako-Shi, Japan and Jülich Research Center followed. His main research interests are neuronal network dynamics and function, and their quantitative analysis with tools from statistical physics and field theory.


David Dahmen is a post-doctoral researcher in the Institute of Neuroscience and Medicine at the Jülich Research Centre, Germany. He obtained his Master's degree in physics from RWTH Aachen University, Germany, working on effective field theory approaches to particle physics. Afterwards he moved to the field of computational neuroscience, where he received his PhD in 2017. His research comprises modeling, analysis and simulation of recurrent neuronal networks with special focus on development and knowledge transfer of mathematical tools and simulation concepts. His main interests are field-theoretic methods for random neural networks, correlations in recurrent networks, and modeling of the local field potential.

Table of Contents

Introduction.- Probabilities, moments, cumulants.- Gaussian distribution and Wick’s theorem.- Perturbation expansion.- Linked cluster theorem.- Functional preliminaries.- Functional formulation of stochastic differential equations.- Ornstein-Uhlenbeck process: The free Gaussian theory.- Perturbation theory for stochastic differential equations.- Dynamic mean-field theory for random networks.- Vertex generating function.- Application: TAP approximation.- Expansion of cumulants into tree diagrams of vertex functions.- Loopwise expansion of the effective action - Tree level.- Loopwise expansion in the MSRDJ formalism.- Nomenclature.
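The Ornstein-Uhlenbeck process listed above serves as the "free Gaussian theory" of the stochastic-differential-equation chapters. As a minimal sketch (not from the book), a direct simulation of dX = -theta X dt + sigma dW can be checked against the known stationary variance sigma^2 / (2 theta); the parameter values are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma = 1.0, 0.5          # drift rate and noise amplitude
dt, steps, trials = 0.01, 2000, 2000

x = np.zeros(trials)             # an ensemble of independent realizations
for _ in range(steps):
    # one Euler-Maruyama step of dX = -theta X dt + sigma dW
    x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal(trials)

empirical = float(x.var())
analytic = sigma**2 / (2 * theta)   # stationary variance of the OU process
print(empirical, analytic)
```

After the ensemble equilibrates, the empirical variance agrees with the analytic value up to sampling error; perturbative expansions in the book are organized around exactly this Gaussian baseline.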