Theory and Applications of GF(2^p) Cellular Automata (LOGIC ON MEMORY) P. Pal Chaudhuri, Department of CST, Bengal Engineering College (DU), Shibpur, Howrah, India
Logic on Memory • Basic Concept • Classical Example • Content Addressable Memory • Content Addressable Processor [Figure: CAM cell with bit line, word line, storage cell, and comparator]
Logic on Memory • Sub-micron era • Search • Storage of large tables and efficient search • Memory + CA • Efficient storage and search of data with a CA-based classifier
Logic-on-Memory • Problem Definition • CA-Based Solution [Figure: memory elements augmented with CA XOR/XNOR logic: logic on memory implementing a specific function]
GF(2^p) CA as a Classifier • Classification: a universal problem • Given an input element, fast search for its attribute • Uses a special class of CA: non-group Multiple Attractor CA (MACA)
Classifier • Design of a CA-based classifier • The input is an element Cij; the classifier outputs Ai, i.e., Cij belongs to class Ai • Implicit memory • Fast search • LOGIC ON MEMORY = Memory (conventional & CA) + CA XOR logic
Special Class of CA: Non-Group Multiple Attractor CA (MACA) [Figure: state-transition diagrams of a MACA with attractors 0000, 0011, 0101, 0110 (PE bits 00, 01, 10, 11) and of a depth-1 MACA (D1 MACA)]
Problem Definition • Given sets {P1}, {P2}, …, {Pn}, where each set {Pi} = {Xi1, Xi2, Xi3, …, Xim} • Given a randomly selected value Xkj • To answer the question: which class does Xkj belong to?
Classifier • An n-bit CA with M attractors is a natural classifier • {0, 3, 5, 6} are the attractors • The inverted trees are the attractor basins [Figure: state-transition diagram with attractors 0, 3, 5, 6]
Classifier • Suppose we want to identify which class X = 7 lies in • The CA is loaded with X • The CA is run in autonomous mode for k (= 2) cycles, where k is the depth of the CA • The pseudo-exhaustive (PE) bits (10) of the reached attractor give the class of the pattern [Figure: attractor basins with PE bits 00, 01, 10, 11]
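The run procedure above (load X, step the CA, read the attractor) can be sketched as a GF(2) matrix-vector iteration. The T matrix below is a hypothetical 4-cell depth-1 MACA with attractors 0000 and 0011, NOT the depth-2 CA pictured on the slide; states are packed as integers.

```python
# Minimal sketch of MACA classification over GF(2):
# load X, multiply by T once per cycle, read off the attractor.
T = [0b0000, 0b0000, 0b0001, 0b0001]    # rows of T, MSB-first (hypothetical)

def step(T, x):
    """One autonomous CA cycle: y = T . x over GF(2), states as ints."""
    y = 0
    for i, row in enumerate(T):
        y |= (bin(row & x).count("1") % 2) << (len(T) - 1 - i)
    return y

def classify(T, x, depth):
    """Run the CA `depth` cycles; the reached attractor names the class."""
    for _ in range(depth):
        x = step(T, x)
    return x

# X = 7 (0111) settles on attractor 3 (0011).
print(classify(T, 7, depth=1))    # 3
```

For this depth-1 CA a single cycle suffices; the slide's depth-2 CA would need k = 2 iterations of the same loop.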
Two-Class D1 Classifier • We use a depth-1 CA (D1 CA) • We construct a CA satisfying the following: 1. R1: for all x ∈ P1 and y ∈ P2, T · (x ⊕ y) ≠ 0 2. R2: T² = T, i.e., T · (T ⊕ I) = 0 • Depth-1 CA (D1 MACA) [Figure: D1 MACA with attractors 0000 and 0011]
Algorithm • Any CA satisfying R1 & R2 is a classifier for P = {{P1}, {P2}} • P1 = {0, 2, 12, 14} and P2 = {3, 1, 13, 15} • Each basin of the CA contains patterns from either P1 or P2 • 2 attractors
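The conditions R1 and R2 can be verified exhaustively for a small example. The T matrix below is a hypothetical 4-cell depth-1 MACA (not taken from the slides) whose two basins separate P1 = {0, 2, 12, 14} from P2 = {1, 3, 13, 15}; a minimal sketch over GF(2):

```python
# Verifying R1 and R2 for a candidate D1 MACA T matrix.
# States are 4-bit integers (MSB = leftmost cell); T is hypothetical.
T = [0b0000, 0b0000, 0b0001, 0b0001]    # rows of T over GF(2)

def apply(T, x):
    """y = T . x over GF(2)."""
    y = 0
    for i, row in enumerate(T):
        y |= (bin(row & x).count("1") % 2) << (len(T) - 1 - i)
    return y

P1 = {0, 2, 12, 14}
P2 = {1, 3, 13, 15}

# R2: T^2 = T, checked pointwise (equivalent for a linear map)
r2 = all(apply(T, apply(T, x)) == apply(T, x) for x in range(16))
# R1: x in P1, y in P2 => T.(x XOR y) != 0, i.e. different basins
r1 = all(apply(T, x ^ y) != 0 for x in P1 for y in P2)
print(r1, r2)    # True True
```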
Algorithm • In general, there will be 2^(n−r) attractors (n = size of the CA, r = rank of the T matrix) • The attractors are distinguished by pseudo-exhaustive (PE) bits at certain (n−r) positions • The two classes can be identified by a single bit stored in a 2^(n−r) × 1-bit memory or by a simple logic circuit
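For small n the attractor set can be enumerated directly, since attractors of a depth-1 MACA are exactly the states the CA maps to themselves. A sketch using a hypothetical 4-cell T:

```python
# Enumerating the attractors of a small CA by exhaustive simulation.
T = [0b0000, 0b0000, 0b0001, 0b0001]    # hypothetical rows over GF(2)

def apply(T, x):
    """y = T . x over GF(2), states packed as integers."""
    y = 0
    for i, row in enumerate(T):
        y |= (bin(row & x).count("1") % 2) << (len(T) - 1 - i)
    return y

# An attractor of a depth-1 MACA is a fixed point: T . x = x.
attractors = [x for x in range(2 ** len(T)) if apply(T, x) == x]
print(attractors)    # [0, 3]
```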
Multiclass Classifier • But what about a multi-class classifier? • A general CA-based solution does not exist • However, we can use hierarchical two-class classifiers to build a solution
Multiclass Classifier • Hierarchical two-class classifier • Built by partitioning the pattern set P • P = {P1, P2, P3, …, Pn} is split as {{P1, P2, P3, …, Pk}, {Pk+1, …, Pn}}, and a two-class classifier is found for this split • This is repeated for each subset • The number of CAs required is log2 n, where n is the number of classes
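The recursive partitioning above can be sketched generically. Here `decide` is a stand-in for one two-class CA lookup (the real decision comes from running a D1 MACA); the names are illustrative, not from the slides:

```python
# Hierarchical multi-class classification from two-class decisions:
# a pattern's class is found after ceil(log2 n) two-class lookups.
import math

def build_tree(classes):
    """Recursively halve the class list; leaves hold single classes."""
    if len(classes) == 1:
        return classes[0]
    mid = len(classes) // 2
    return (build_tree(classes[:mid]), build_tree(classes[mid:]))

def classify(tree, decide):
    """`decide(left, right)` stands in for one two-class CA lookup."""
    depth = 0
    while isinstance(tree, tuple):
        tree = tree[decide(*tree)]
        depth += 1
    return tree, depth

classes = ["P1", "P2", "P3", "P4"]
tree = build_tree(classes)
# A dummy decision rule that always takes the right branch:
# the class is reached after log2(4) = 2 two-class decisions.
label, levels = classify(tree, lambda a, b: 1)
print(label, levels)    # P4 2
```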
Multiclass Classifier • The classes are: • P1 = {0, 2, 12, 14} • P2 = {3, 1, 13, 15} • P3 = {5, 7, 9, 11} • P4 = {6, 4, 8, 10}
Multiclass Classifier • Initially we build a two-class classifier to separate • Temp0 = {P1, P2} • Temp1 = {P3, P4} • Then two more classifiers identify {P1 and P2} and {P3 and P4}
General Multiclass Classifier [Figure: tree of two-class classifiers, Temp0/Temp1 at the root, Temp00/Temp11 below, down to leaf classes P2, Pk, Pn; requires log2 n CAs]
Multiclass Classifier in GF(2^p) • Handles class elements that are symbol strings rather than bit strings • In GF(2), a T matrix satisfying R1 and R2 is obtained efficiently using BDDs • In GF(2^p) we have introduced certain heuristics to obtain a solution T matrix reasonably fast
Application Areas • Fast encoding in vector quantization of images • Fault diagnosis
Image Compression • Target pictures: portraits and similar images • Image size: 352 × 240 (CCIR size) • Target compression ratio: 97.5% - 99% • Target PSNR: 25 - 30 dB • Target application: low-bit-rate coding for video telephony
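The quoted 97.5% figure is consistent with simple arithmetic: a 512-bit 8 × 8 block of 8-bit pixels is replaced by a 13-bit codebook index. A sketch of that arithmetic, plus the standard PSNR formula for 8-bit images (the MSE value passed in is hypothetical):

```python
# Sanity-checking the quoted compression ratio and PSNR definition.
import math

block_bits = 8 * 8 * 8            # one 8x8 block of 8-bit pixels
index_bits = 13                   # 8192 = 2**13 codebook entries
ratio = 1 - index_bits / block_bits
print(f"{ratio:.3%}")             # 97.461%

def psnr(mse):
    """PSNR in dB for 8-bit images, given a mean squared error."""
    return 10 * math.log10(255 ** 2 / mse)
```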
Algorithm: Training Images • We used a training set of 12 pictures of a similar nature • The images were partitioned into 8 × 8 blocks • These 8 × 8 blocks are clustered around 8192 pivot points using the standard LBG algorithm
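The LBG algorithm is essentially a generalized Lloyd iteration: assign each training vector to its nearest pivot, then move each pivot to the centroid of its cluster. A deliberately tiny sketch (2 pivots over toy 1-D data rather than 8192 pivots over 64-dimensional blocks):

```python
# Minimal Lloyd/LBG-style codebook training iteration (illustrative).
def lbg(data, pivots, iters=10):
    for _ in range(iters):
        # assign each vector to its nearest pivot
        buckets = {i: [] for i in range(len(pivots))}
        for x in data:
            i = min(range(len(pivots)), key=lambda i: abs(x - pivots[i]))
            buckets[i].append(x)
        # move each pivot to the centroid of its cluster
        pivots = [sum(b) / len(b) if b else pivots[i]
                  for i, b in buckets.items()]
    return pivots

print(lbg([1.0, 2.0, 9.0, 10.0], [0.0, 5.0]))    # [1.5, 9.5]
```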
Algorithm • The elements are GF(2^p) symbol strings of length 64, one per 8 × 8 pixel block • Therefore we have 8192 clusters • These can be addressed using 13 bits • A multi-class classifier is designed for these 8192 classes • The depth of this classifier is 13 [Figure: clusters C1, C2, …, C8192 around pivot points forming the codebook]
Algorithm • The target image to be coded is divided into 8 × 8 blocks • Each block is input to the multi-class classifier • The classifier outputs the class id of the block • This takes effectively 13 clock cycles plus some memory access time • Encoding time is thus drastically reduced
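The per-block encoding loop can be sketched as follows; `classify` is a hypothetical stub standing in for the multi-class CA classifier, and the frame size is the CCIR size quoted earlier:

```python
# Sketch of the encoding loop: split the frame into 8x8 blocks and
# replace each block with its class id.
def blocks(image, size=8):
    """Yield size x size sub-arrays of a 2-D list of pixels."""
    h, w = len(image), len(image[0])
    for r in range(0, h, size):
        for c in range(0, w, size):
            yield [row[c:c + size] for row in image[r:r + size]]

def encode(image, classify, size=8):
    return [classify(b) for b in blocks(image, size)]

# A 352 x 240 CCIR frame yields 44 x 30 = 1320 block ids.
image = [[0] * 352 for _ in range(240)]
ids = encode(image, classify=lambda b: 0)    # stub classifier
print(len(ids))    # 1320
```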
Algorithm [Figure: overall flow: training images are split into blocks, clustered into pivot points C1, C2, …, C8192 forming the codebook; each image block goes through the classifier to produce a class id]
Sample Images • PSNR 27.8 dB • Compression ratio 97.5%
Sample Images • PSNR 25.1 dB, compression ratio 97.5% • PSNR 28.5 dB, compression ratio 97.5%
Schematic of a CA-Based Vector Quantizer [Figure: memory holding CA configurations, the CA, PE bits, controller, shift register, and output]
Hardware Design for CA Based Vector Quantizer
Improvements Over the Basic Scheme • A hierarchical encoder has been implemented • The image is first encoded using 16 × 16 blocks • If a match cannot be obtained with any of the classes in the training set, a match with 8 × 8 blocks is tried • This pushes the compression ratio up to 99%
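The two-level encoder can be sketched as follows; `match16` and `match8` are hypothetical stand-ins for the two CA classifiers, returning a class id or None on a miss:

```python
# Sketch of the hierarchical encoder: try a 16x16 match first and
# fall back to four 8x8 blocks only when no 16x16 class matches.
def encode_block(block16, match16, match8):
    cid = match16(block16)
    if cid is not None:
        return [("16", cid)]                      # one id covers 256 pixels
    quads = [[row[c:c + 8] for row in block16[r:r + 8]]
             for r in (0, 8) for c in (0, 8)]     # four 8x8 sub-blocks
    return [("8", match8(q)) for q in quads]

block = [[0] * 16 for _ in range(16)]
print(encode_block(block, lambda b: None, lambda b: 7))
# [('8', 7), ('8', 7), ('8', 7), ('8', 7)]
```

A successful 16 × 16 match spends one index on four times as many pixels, which is what pushes the overall compression ratio toward 99%.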
Dynamic Classification • Static database: the solution assumes the target pattern is present in the cluster set • If a new pattern outside this range is input, the classifier indicates "no entry in the database" • So a linked queue of these new blocks is maintained • At periodic intervals, a new multi-class classifier is built after incorporating the updated data members into the appropriate classes
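The bookkeeping described above can be sketched as a queue of unmatched blocks that is drained into the training set at rebuild time; `classify` and `assign_class` are hypothetical stand-ins, and retraining the CA classifier itself is elided:

```python
# Sketch of dynamic-classification bookkeeping: queue misses, then
# fold them into the training set when the classifier is rebuilt.
from collections import deque

pending = deque()

def classify_or_queue(block, classify):
    """Return the class id, or queue the block on a miss."""
    cid = classify(block)
    if cid is None:                  # "no entry in the database"
        pending.append(block)
    return cid

def rebuild(training_set, assign_class):
    """Fold queued blocks into their classes (retraining elided)."""
    while pending:
        b = pending.popleft()
        training_set.setdefault(assign_class(b), []).append(b)
    return training_set

classify_or_queue("blk-A", classify=lambda b: None)    # miss: queued
classify_or_queue("blk-B", classify=lambda b: 5)       # hit
print(rebuild({}, assign_class=lambda b: 2))           # {2: ['blk-A']}
```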