
Fascinating rhythms by chaotic Hopfield Networks




Presentation Transcript


Fascinating rhythms by chaotic Hopfield Networks

Colin Molter – Hugues Bersini
Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle, IRIDIA, CP 194/6, Université Libre de Bruxelles, Av. F. Roosevelt 50, 1050 Bruxelles, Belgium.
Fax: +32-2-6502715, Email: cmolter@iridia.ulb.ac.be, http://iridia.ulb.ac.be/ChINN/
ECAL 2003

[Figure: two continuous trajectories x(t), both of period P_continuous = 4; after discretization one reads P_discrete = 4 (dbbb), the other P_discrete = 2 (cbcb).]

1) Introduction – why use Recurrent Networks

• Feedforward neural networks are successfully used in many kinds of applications.
• Fully connected recurrent neural networks have still not found convincing applications:
  – supervised learning methods (BPTT – Werbos, EKF – Feldkamp, ...) have drawbacks;
  – successful applications of Hebbian learning theory suffer from poor capacity;
  – analytical description and resolution works for just one neuron (Pasemann).
• The seminal paper of Skarda and Freeman (1987) is dedicated to chaos in real brains:
  – chaos can efficiently store and retrieve information in neural networks;
  – its cyclic regimes contain a huge amount of potential coding memories.
• Chaos has also been proposed as a main component for performing efficient non-deterministic symbolic computation during cognitive tasks (Siegelmann – Carrasco – Kentridge).
• In this work we boost the storing capacity of RNNs by storing the inputs in the resulting cyclic dynamics instead of staying with fixed-point dynamics.
• Increasing the amount of information to be stored entails the network spontaneously reinforcing the "chaoticity" of its autonomous dynamics.
• We describe this "chaoticity" and show a distinction between period-doubling chaos and "frustrated chaos". The latter happens when local connectivity patterns responsible for stable oscillatory behaviors are intertwined, leading to mutually competing attractors and unpredictable itinerancy among brief appearances of these attractors (Bersini).

2) Description of the model

• Weights are chosen randomly from a uniform distribution.
• An input feeds the network; after "a while" the state of the network is one of:
  – a fixed-point attractor;
  – a periodic attractor with p ≤ pc;
  – a periodic attractor with p > pc;
  – quasi-periodicity;
  – chaos.
• The return map of each unit xi is discretized into k intervals of the same size. For example, k = 4 gives the intervals (a, b, c, d).
• A given set of 5 inputs then yields, for example, the "symbolic outputs" {(a), (cab), (ad), (d), (adca)} (pc ≤ 4).
• An input is stored if it generates a unique symbolic output.

A minimal sketch of this model appears below.
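The transcript does not spell out the network's update rule, so the sketch below assumes a standard discrete-time form, x(t+1) = tanh(W·x(t) + input); the helper names (step, symbolic_output), the transient length, and the tolerances are all hypothetical choices, not values from the poster.

```python
import numpy as np

def step(W, x, inp):
    # One synchronous update. The poster does not give the update rule;
    # x <- tanh(W x + input) is a common discrete-time choice (assumption).
    return np.tanh(W @ x + inp)

def symbolic_output(W, inp, n_transient=500, p_max=4, k=4):
    """Run the network on a constant input, discard a transient, then try
    to read a periodic symbolic code off unit 0.

    Returns a tuple of symbols from 'abcd' (k = 4 equal bins over [-1, 1])
    when the trajectory is periodic with period <= p_max (the poster's pc),
    otherwise None (p > pc, quasi-periodicity, or chaos)."""
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=W.shape[0])
    for _ in range(n_transient):              # let the transient die out
        x = step(W, x, inp)
    traj = []
    for _ in range(4 * p_max):                # record a short settled run
        x = step(W, x, inp)
        traj.append(x)
    traj = np.array(traj)
    for p in range(1, p_max + 1):             # smallest period first
        if np.allclose(traj[p:], traj[:-p], atol=1e-6):
            # discretize unit 0 into k equal intervals (a, b, c, d)
            bins = np.clip(((traj[:p, 0] + 1.0) / 2.0 * k).astype(int), 0, k - 1)
            return tuple("abcd"[b] for b in bins)
    return None

# usage: a random 3-neuron network driven by one constant input
rng = np.random.default_rng(42)
W = rng.uniform(-2.0, 2.0, size=(3, 3))       # weight range is a guess
print(symbolic_output(W, rng.uniform(-1.0, 1.0, size=3)))  # e.g. ('c','a','b') or None
```

Reading symbols from a single unit keeps the sketch short; the poster discretizes the return map of each unit xi, which would amount to applying the same binning unit by unit.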
3) Experiments

• Two correlated studies were carried out:
  – the network's capacity, using statistical tools;
  – the different dynamics occurring.

4) The network's capacity – statistical studies

A/ Given an input set, we look for a matrix which can store it.
• For each number of inputs stored, 100 different input sets are tested; the matrix is randomly chosen with a constraint.
• The search is really fast for a small input set, but the time increases drastically with the set's size. The problem is NP-hard: is a heuristic needed?
• Up to 40 stored inputs in 3 neurons; up to 80 stored inputs in 9 neurons.

B/ Given a weight matrix, we compute how many inputs can be stored (a sketch of this test appears after the conclusion).
• [Figure: for 200 matrices, the evolution during 100 cycles of a small perturbation of 1e-7, for 200 random inputs; x-axis: Lyapunov exponent.]
• The preceding graph gives us the same information: a matrix with a high storing potential has more complex dynamics.

5) Studies of the dynamics

• Principal result: more evidence of frustrated chaos.
• [Figures: iso-periodic map and bifurcation diagram, showing the output dynamics as a function of the variation of the inputs (input space).]
• The bifurcation diagram shows how the cyclic attractors merge together when an intermediate input is given, resulting in some frustration.
• Return maps inside the same chaotic region show the same morphology.
• Structure appears in the FFT. [Figure: FFT for a chaotic point, Lyapunov exponent = 0.15.]
• The Lyapunov exponent is smaller than for a period-doubling chaos (a sketch of the estimate, which follows a 1e-7 perturbation over 100 cycles, also appears after the conclusion).

6) Networks of networks and musical outputs

7) Conclusion

• Small RNNs have a high storage potential when the dynamics of the outputs is used.
• The more you want to store in an RNN, the more complex its dynamics besides the stored inputs becomes: the "chaoticity" increases.
• Frustrated chaos prevails more and more as the number of cyclic attractors increases.
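Below is a sketch of the section 4B capacity test under the "unique output" rule, reusing step and symbolic_output from the model sketch above; the weight range, input range, and the pool of 60 candidate inputs are assumptions, not values from the poster.

```python
import numpy as np

def count_stored(W, inputs, p_max=4):
    """Section 4B capacity test: an input counts as stored when it settles
    on a short periodic attractor whose symbolic code is produced by no
    other input (the poster's 'unique output' rule)."""
    codes = {}
    for i, inp in enumerate(inputs):
        code = symbolic_output(W, inp, p_max=p_max)  # from the model sketch
        if code is not None:
            codes.setdefault(code, []).append(i)
    # only codes claimed by exactly one input count as stored
    return sum(1 for owners in codes.values() if len(owners) == 1)

# usage at the poster's scale: 200 random matrices on a 3-neuron network
rng = np.random.default_rng(1)
capacities = []
for _ in range(200):
    W = rng.uniform(-2.0, 2.0, size=(3, 3))        # weight range is a guess
    inputs = rng.uniform(-1.0, 1.0, size=(60, 3))  # 60 candidates: a guess
    capacities.append(count_stored(W, inputs))
print("best capacity over 200 matrices:", max(capacities))
```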

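Finally, a sketch of the largest-Lyapunov-exponent estimate in the spirit of the poster's measurement, which follows a perturbation of 1e-7 over 100 cycles; the step-by-step renormalization is the standard Benettin-style scheme and is an assumption about the exact procedure. It reuses step from the model sketch.

```python
import numpy as np

def lyapunov_estimate(W, inp, n_transient=500, n_cycles=100, eps=1e-7):
    """Estimate the largest Lyapunov exponent by following a 1e-7
    perturbation over 100 cycles, renormalizing the separation after each
    step so it stays infinitesimal. A positive value indicates chaos."""
    rng = np.random.default_rng(2)
    x = rng.uniform(-1.0, 1.0, size=W.shape[0])
    for _ in range(n_transient):                  # settle onto the attractor
        x = step(W, x, inp)                       # step() from the model sketch
    d = rng.normal(size=x.shape)
    y = x + eps * d / np.linalg.norm(d)           # perturbed twin trajectory
    log_growth = 0.0
    for _ in range(n_cycles):
        x = step(W, x, inp)
        y = step(W, y, inp)
        sep = max(np.linalg.norm(y - x), 1e-300)  # guard against log(0)
        log_growth += np.log(sep / eps)
        y = x + (y - x) * (eps / sep)             # rescale separation to eps
    return log_growth / n_cycles
```

On a period-doubling chaotic point this estimate should come out clearly positive; on frustrated-chaos points it should be positive but smaller, consistent with the value 0.15 reported in section 5.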