Feedback Networks and Associative Memories 虞台文 Intelligent Multimedia Lab, Institute of Computer Science and Engineering, Tatung University
Content • Introduction • Discrete Hopfield NNs • Continuous Hopfield NNs • Associative Memories • Hopfield Memory • Bidirectional Memory
Feedback Networks and Associative Memories Introduction
Feedforward/Feedback NNs • Feedforward NNs • The connections between units do not form cycles. • Usually produce a response to an input quickly. • Most feedforward NNs can be trained using a wide variety of efficient algorithms. • Feedback or recurrent NNs • There are cycles in the connections. • In some feedback NNs, each time an input is presented, the NN must iterate for a potentially long time before it produces a response. • Usually more difficult to train than feedforward NNs.
Supervised-Learning NNs • Feedforward NNs • Perceptron • Adaline, Madaline • Backpropagation (BP) • Artmap • Learning Vector Quantization (LVQ) • Probabilistic Neural Network (PNN) • General Regression Neural Network (GRNN) • Feedback or recurrent NNs • Brain-State-in-a-Box (BSB) • Fuzzy Cognitive Map (FCM) • Boltzmann Machine (BM) • Backpropagation through time (BPTT)
Unsupervised-Learning NNs • Feedforward NNs • Learning Matrix (LM) • Sparse Distributed Associative Memory (SDM) • Fuzzy Associative Memory (FAM) • Counterpropagation (CPN) • Feedback or Recurrent NNs • Binary Adaptive Resonance Theory (ART1) • Analog Adaptive Resonance Theory (ART2, ART2a) • Discrete Hopfield (DH) • Continuous Hopfield (CH) • Discrete Bidirectional Associative Memory (BAM) • Kohonen Self-organizing Map/Topology-preserving map (SOM/TPM)
The Hopfield NNs • In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research. • A fully connected, symmetrically weighted network where each node functions both as input and output node. • Used for • Associative memories • Combinatorial optimization
Associative Memories • An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. • Two types of associative memory: autoassociative and heteroassociative. • Auto-association • retrieves a previously stored pattern that most closely resembles the current pattern. • Hetero-association • the retrieved pattern is, in general, different from the input pattern not only in content but possibly also in type and format.
Associative Memories • Auto-association: A → memory → A • Hetero-association: Niagara → memory → Waterfall
Optimization Problems • Associate costs with the energy function of a Hopfield network • The cost must be expressible in quadratic form • The Hopfield network settles into a local, satisfactory solution; it does not search over an explicit set of candidate solutions • It finds local optima, not the global optimum
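As a minimal sketch (not from the slides; the coefficients here are arbitrary random values) of why the quadratic form matters: any cost of the form C(v) = -1/2 v^T A v - b^T v maps onto the Hopfield energy simply by taking W = A and I = b, so minimizing the network's energy minimizes the cost.

```python
import numpy as np

# Hypothetical quadratic cost over binary decisions v in {0,1}^n:
#   C(v) = -1/2 v^T A v - b^T v,  with A symmetric and zero-diagonal.
# Matching it to the Hopfield energy E(v) = -1/2 v^T W v - I^T v
# just means W = A and I = b; minimizing E then minimizes C.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2            # symmetric, like w_ij = w_ji
np.fill_diagonal(A, 0.0)     # zero diagonal, like w_ii = 0
b = rng.standard_normal(n)

W, I = A, b

def cost(v):
    return -0.5 * v @ A @ v - b @ v

def energy(v):
    return -0.5 * v @ W @ v - I @ v

v = rng.integers(0, 2, n).astype(float)
assert np.isclose(cost(v), energy(v))   # identical by construction
```

Costs that are not quadratic in the state variables have no such direct mapping, which is why the quadratic-form requirement appears on the slide.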
Feedback Networks and Associative Memories Discrete Hopfield NNs
The Discrete Hopfield NNs • n fully interconnected neurons 1, 2, …, n with states v1, v2, …, vn and external inputs I1, I2, …, In • A weight wij on the connection from neuron j to neuron i • Symmetric weights: wij = wji • No self-connections: wii = 0
State Update Rule • Asynchronous mode: at each time step, a single neuron (say the kth) is selected for update • Net input: Hk(t+1) = Σj wkj vj(t) + Ik • Update rule: vk(t+1) = 1 if Hk(t+1) > 0; vk(t+1) = 0 if Hk(t+1) < 0; vk(t+1) = vk(t) if Hk(t+1) = 0 • Stable?
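A short simulation sketch of asynchronous updating (illustrative code with random weights; it assumes the standard 0/1 update and the energy E = -1/2 v^T W v - I^T v): repeatedly updating one randomly chosen neuron never increases the energy.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small random network: symmetric weights, zero diagonal (wij = wji, wii = 0).
n = 6
W = rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
I = rng.standard_normal(n)

def energy(v):
    # Discrete Hopfield energy: E = -1/2 v^T W v - I^T v
    return -0.5 * v @ W @ v - I @ v

def async_step(v, k):
    # Update only neuron k: vk <- 1 if its net input Hk is positive,
    # 0 if negative, unchanged if exactly zero.
    h = W[k] @ v + I[k]
    v = v.copy()
    if h > 0:
        v[k] = 1.0
    elif h < 0:
        v[k] = 0.0
    return v

v = rng.integers(0, 2, n).astype(float)   # random initial 0/1 state
energies = [energy(v)]
for _ in range(50):
    v = async_step(v, int(rng.integers(n)))
    energies.append(energy(v))

# Asynchronous updates never increase the energy.
assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:]))
```

Because E is bounded below and can only decrease, the state must eventually stop changing, which is the stability argument the following slides make formally.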
Energy Function E = -(1/2) Σi Σj wij vi vj - Σi Ii vi Fact: E is lower bounded (upper bounded). If E is monotonically decreasing (increasing), the system is stable.
The Proof Suppose that at time t + 1, the kth neuron is selected for update. All other neurons keep their values at time t + 1, so the energy change is ΔE = E(t+1) - E(t) = -Δvk (Σj wkj vj(t) + Ik) = -Δvk Hk(t+1), where Δvk = vk(t+1) - vk(t). Case analysis: • vk(t) = 0, Hk(t+1) < 0: vk(t+1) = 0, Δvk = 0, ΔE = 0 (stable) • vk(t) = 0, Hk(t+1) > 0: vk(t+1) = 1, Δvk = 1, ΔE = -Hk(t+1) < 0 • vk(t) = 1, Hk(t+1) < 0: vk(t+1) = 0, Δvk = -1, ΔE = Hk(t+1) < 0 • vk(t) = 1, Hk(t+1) > 0: vk(t+1) = 1, Δvk = 0, ΔE = 0 (stable) In every case ΔE ≤ 0, and E is lower bounded, so the network converges to a stable state.
Feedback Networks and Associative Memories Continuous Hopfield NNs
The Neuron of Continuous Hopfield NNs • Each neuron i is an amplifier with input voltage ui and output vi = a(ui), where the activation a is monotonically increasing, so ui = a^-1(vi) • A capacitor Ci and a conductance gi connect the input node to ground • The input node collects the external current Ii and the weighted outputs wi1 v1, wi2 v2, …, win vn of the other neurons
The Dynamics Ci dui/dt = Σj wij vj - Gi ui + Ii, where Gi = gi + Σj wij is the total conductance at the input node of neuron i.
The Continuous Hopfield NNs • n such neurons, fully interconnected with symmetric weights wij = wji • Each neuron i has its own capacitance Ci, conductance gi, external current Ii, and output vi • Stable?
Equilibrium Points • Consider the autonomous system dy/dt = φ(y) • Equilibrium points y* satisfy φ(y*) = 0
Call E(y) an energy function. Lyapunov Theorem The system is asymptotically stable if the following holds: there exists a positive-definite function E(y) s.t. dE(y)/dt < 0 along every trajectory (with dE/dt = 0 only at the equilibrium point).
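A minimal scalar illustration (a textbook example, not from the slides): for dy/dt = -y, the positive-definite function E(y) = y^2 has dE/dt = 2y(dy/dt) = -2y^2 < 0 for y ≠ 0, so the origin is asymptotically stable. A quick numerical check:

```python
# Scalar Lyapunov example: dy/dt = -y with candidate E(y) = y^2.
# Forward-Euler integration; E should decrease monotonically toward 0.
y, dt = 1.0, 1e-3
E_vals = []
for _ in range(5000):
    E_vals.append(y * y)     # record E(y) = y^2
    y += dt * (-y)           # one Euler step of dy/dt = -y

# E is positive definite and strictly decreasing along the trajectory,
# so y = 0 is asymptotically stable.
assert all(e2 < e1 for e1, e2 in zip(E_vals, E_vals[1:]))
```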
Lyapunov Energy Function E = -(1/2) Σi Σj wij vi vj - Σi Ii vi + Σi Gi ∫0^vi a^-1(v) dv • Since v = a(u) is monotonically increasing, its inverse u = a^-1(v) exists and is also monotonically increasing
Dynamics Stability of Continuous Hopfield NNs ∂E/∂vi = -Σj wij vj - Ii + Gi a^-1(vi) = -Ci dui/dt Hence dE/dt = Σi (∂E/∂vi)(dvi/dt) = -Σi Ci (dui/dt)(dvi/dt) = -Σi Ci a'(ui) (dui/dt)^2 ≤ 0, since a is monotonically increasing, i.e., a'(u) > 0. Therefore E is monotonically decreasing along every trajectory, and the continuous Hopfield network is stable.
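An illustrative forward-Euler simulation (assumptions not fixed by the slides: a(u) = tanh(u), unit capacitances and total conductances, random symmetric weights) showing the Lyapunov energy decreasing along a trajectory of Ci dui/dt = Σj wij vj - Gi ui + Ii:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
W = rng.standard_normal((n, n))
W = (W + W.T) / 2            # symmetric weights
np.fill_diagonal(W, 0.0)
I = rng.standard_normal(n)
C = np.ones(n)               # capacitances (assumed unit)
G = np.ones(n)               # total conductances Gi (assumed unit)

a = np.tanh                  # activation; its inverse is arctanh

def energy(v):
    # E = -1/2 v^T W v - I^T v + sum_i Gi * integral_0^{vi} a^-1(s) ds,
    # where integral_0^v arctanh(s) ds = v*arctanh(v) + 1/2*ln(1 - v^2).
    integral = v * np.arctanh(v) + 0.5 * np.log1p(-v**2)
    return -0.5 * v @ W @ v - I @ v + G @ integral

u = 0.1 * rng.standard_normal(n)   # small random initial voltages
dt = 1e-3
E_prev = energy(a(u))
for _ in range(2000):
    v = a(u)
    dudt = (W @ v - G * u + I) / C   # Ci dui/dt = sum_j wij vj - Gi ui + Ii
    u = u + dt * dudt
E_final = energy(a(u))

# The Lyapunov energy decreases along the trajectory.
assert E_final <= E_prev + 1e-9
```

The integral term penalizes outputs near the saturation limits of a, which is what keeps E bounded below even though v is continuous.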
Local/Global Minima Energy Landscape: the energy surface has multiple minima; the network moves downhill to the nearest local minimum, which is not necessarily the global one.
Feedback Networks and Associative Memories Associative Memories
Associative Memories • Also named content-addressable memory. • Autoassociative Memory • Hopfield Memory • Heteroassociative Memory • Bidirectional Associative Memory (BAM)
Associative Memories Stored Patterns • Autoassociative: stores pairs (xk, xk); an input close to xk retrieves xk itself • Heteroassociative: stores pairs (xk, yk); an input close to xk retrieves the associated yk
Feedback Networks and Associative Memories Associative Memories Hopfield Memory Bidirectional Memory
Hopfield Memory • 12 × 10 neurons (120 units) • Fully connected: 120 × 120 = 14,400 weights
Memory Association • Example: presenting a noisy or incomplete version of a stored pattern retrieves the complete stored pattern • How to store patterns?
The Storage Algorithm Suppose the set of stored patterns {x1, x2, …, xp} consists of bipolar (±1) vectors of dimension n. The outer-product (Hebbian) rule sets W = Σk xk xk^T - p I, i.e., wij = Σk xki xkj for i ≠ j and wii = 0.
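A sketch of outer-product storage for bipolar patterns (the helper `store` and the two example patterns are illustrative, not from the slides):

```python
import numpy as np

def store(patterns):
    """Outer-product (Hebbian) storage for bipolar (+/-1) patterns:
    W = sum_k xk xk^T with the diagonal zeroed (wii = 0, wij = wji)."""
    X = np.asarray(patterns, dtype=float)
    W = X.T @ X                 # sum of outer products xk xk^T
    np.fill_diagonal(W, 0.0)    # for +/-1 patterns this subtracts p*I
    return W

patterns = [[1, -1, 1, -1, 1, -1, 1, -1],
            [1, 1, -1, -1, 1, 1, -1, -1]]
W = store(patterns)

# The result satisfies the discrete Hopfield constraints.
assert np.allclose(W, W.T)
assert np.all(np.diag(W) == 0)
```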
Analysis Suppose that x = xi is one of the stored patterns. Then W xi = (Σk xk xk^T - p I) xi = (n - p) xi + Σk≠i (xk^T xi) xk. The first term reproduces xi; the second is crosstalk from the other stored patterns, which vanishes when the stored patterns are mutually orthogonal.
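The signal-plus-crosstalk decomposition can be checked numerically (random bipolar patterns; the identity W x = (n - p) x + crosstalk is exact for the zero-diagonal outer-product weights):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 64, 3
X = rng.choice([-1.0, 1.0], size=(p, n))   # p random bipolar patterns

W = X.T @ X                  # Hebbian outer-product sum
np.fill_diagonal(W, 0.0)     # for +/-1 patterns this equals X^T X - p*I

x = X[0]                     # probe with a stored pattern
h = W @ x

# Decomposition: W x = (n - p) x + sum_{k != 0} (xk . x) xk
crosstalk = sum((X[k] @ x) * X[k] for k in range(1, p))
assert np.allclose(h, (n - p) * x + crosstalk)

# For p << n the crosstalk term is small relative to (n - p),
# so thresholding W x typically recovers the stored pattern x.
```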
Example • A four-neuron network (neurons 1, 2, 3, 4) with weights of magnitude 2 • Evaluating candidate states with the energy function: the stored pattern is a stable state with minimum energy (E = -4), while the other states shown have higher energy (E = 0 and E = 4) and evolve toward the stable state.