TY - JOUR
AU - Scellier, Benjamin
AU - Bengio, Yoshua
TI - Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation
JF - Frontiers in Computational Neuroscience
DO - 10.3389/fncom.2017.00024
DA - 2017-05-04
AB - We introduce Equilibrium Propagation, a learning framework for energy-based models. It involves only one kind of neural computation, performed in both the first phase (when the prediction is made) and the second phase of training (after the target or prediction error is revealed). Although this algorithm computes the gradient of an objective function just like Backpropagation, it does not need a special computation or circuit for the second phase, where errors are implicitly propagated. Equilibrium Propagation shares similarities with Contrastive Hebbian Learning and Contrastive Divergence while solving the theoretical issues of both algorithms: our algorithm computes the gradient of a well-defined objective function. Because the objective function is defined in terms of local perturbations, the second phase of Equilibrium Propagation corresponds to only nudging the prediction (fixed point or stationary distribution) toward a configuration that reduces prediction error. In the case of a recurrent multi-layer supervised network, the output units are slightly nudged toward their target.
UR - https://www.deepdyve.com/lp/unpaywall/equilibrium-propagation-bridging-the-gap-between-energy-based-models-FaLavKh0JP
DP - DeepDyve
ER -