
Adaptive Neuro-Fuzzy Inference Systems (ANFIS)







  1. ICS 581: Advanced Artificial Intelligence • Lecture 13: Adaptive Neuro-Fuzzy Inference Systems (ANFIS) • Dr. Emad A. A. El-Sebakhy • Term 061 • Meeting Time: 6:30-7:45 • Location: Building 22, Room 132

  2. Fuzzy Sets • Sets with fuzzy boundaries • A = set of tall people • [Figure: the crisp set A jumps from membership 0 to 1.0 at 170 cm, while the fuzzy set A's membership function rises gradually over heights (cm), passing .5 at 170 cm and .9 at 180 cm]

  3. Membership Functions (MFs) • About MFs • Subjective measures • Not probability functions • [Figure: three MFs for "tall" — "tall" in Taiwan, "tall" in the US, "tall" in the NBA — assign membership grades of about .8, .5, and .1, respectively, to a height of 180 cm]
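To make the "subjective measure" point concrete, here is a minimal Python sketch that evaluates three Gaussian notions of "tall" at the same height; the centers and widths are illustrative choices for this example, not values from the slide.

```python
import math

def gaussian_mf(x, center, sigma):
    """Gaussian membership function: exp(-(x - c)^2 / (2*sigma^2))."""
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

# Three subjective notions of "tall"; centers/widths are illustrative guesses.
def tall_taiwan(h): return gaussian_mf(h, 175, 8)   # "tall" in Taiwan
def tall_us(h):     return gaussian_mf(h, 182, 8)   # "tall" in the US
def tall_nba(h):    return gaussian_mf(h, 200, 8)   # "tall" in the NBA

h = 180  # the same height gets three different membership grades
grades = (tall_taiwan(h), tall_us(h), tall_nba(h))
```

The same 180 cm person is quite "tall" in Taiwan or the US but barely "tall" in the NBA; the grades are subjective degrees, not probabilities, so they need not sum to 1.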

  4. Fuzzy If-Then Rules • Mamdani style: If pressure is high then volume is small • Sugeno style: If speed is medium then resistance = 5*speed

  5. Fuzzy Inference System (FIS) • If speed is low then resistance = 2 • If speed is medium then resistance = 4*speed • If speed is high then resistance = 8*speed • [Figure: at speed = 2, the low, medium, and high MFs give membership grades .3, .8, and .1] • Rule 1: w1 = .3; r1 = 2 • Rule 2: w2 = .8; r2 = 4*2 = 8 • Rule 3: w3 = .1; r3 = 8*2 = 16 • Resistance = Σ(wi*ri) / Σwi = 8.6 / 1.2 ≈ 7.17
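The weighted-average defuzzification on this slide can be checked with a short Python sketch using the same three rules and firing strengths:

```python
# Sugeno-style inference for the three-rule resistance example at speed = 2.
rules = [
    (0.3, lambda s: 2.0),      # If speed is low    then resistance = 2
    (0.8, lambda s: 4.0 * s),  # If speed is medium then resistance = 4*speed
    (0.1, lambda s: 8.0 * s),  # If speed is high   then resistance = 8*speed
]
speed = 2.0
num = sum(w * r(speed) for w, r in rules)  # sum of wi*ri = 8.6
den = sum(w for w, _ in rules)             # sum of wi    = 1.2
resistance = num / den                     # 8.6 / 1.2 ≈ 7.17
```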

  6. ANFIS: Mamdani’s model • Layer 1: input layer • Layer 2: input membership (fuzzification) layer • Neurons represent the fuzzy sets used in the antecedents of fuzzy rules and determine the membership degree of the input. • Activation fn: the membership fn. • Layer 3: fuzzy rule layer • Each neuron corresponds to a single fuzzy rule. • Conjunction of the rule antecedents: product • Output: the firing strength of fuzzy rule Ri, where Ri = Aj ∧ Bk • The weights between layers 3 and 4: the normalized degrees of confidence (a.k.a. certainty factors) of the corresponding fuzzy rules. They are adjusted during training. • Layer 4: output membership layer • Disjunction of the outputs: Ci = Rj ∨ Rk, computed as the sum ∑Rj • i.e. the integrated firing strength of fuzzy rule neurons Rj and Rk. • Activation fn: the output membership fn. • Layer 5: defuzzification layer • Each neuron represents a single output. • E.g. the centroid method.

  7. ANFIS: Mamdani’s model • Learning • Various learning algorithms may be applied, e.g. backpropagation • Adjustment of weights and modification of input/output membership functions. • When sum-product composition and centroid defuzzification are adopted, a corresponding ANFIS can be constructed easily. • Max-min composition adds extra complexity with no better learning capability or approximation power. • More complicated than Sugeno ANFIS or Tsukamoto ANFIS

  8. ANFIS: Mamdani’s model

  9. First-Order Sugeno FIS • Rule base • If X is A1 and Y is B1 then Z = p1*x + q1*y + r1 • If X is A2 and Y is B2 then Z = p2*x + q2*y + r2 • Fuzzy reasoning: [Figure: at input x = 3, y = 2, rule 1 fires with strength w1 and output z1 = p1*x + q1*y + r1, rule 2 with strength w2 and output z2 = p2*x + q2*y + r2; the crisp output is z = (w1*z1 + w2*z2) / (w1 + w2)]
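A minimal sketch of this two-rule first-order Sugeno reasoning at x = 3, y = 2, using the min T-norm for the "and" and illustrative triangular MFs and consequent parameters (none of these numbers come from the slide):

```python
def trimf(x, a, b, c):
    """Triangular MF with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical antecedent MFs and consequent parameters (p, q, r).
def rule1(x, y):
    w = min(trimf(x, 0, 2, 6), trimf(y, 0, 1, 4))   # "x is A1 and y is B1"
    return w, 1.0 * x + 2.0 * y + 1.0               # z1 = p1*x + q1*y + r1

def rule2(x, y):
    w = min(trimf(x, 2, 6, 10), trimf(y, 1, 4, 8))  # "x is A2 and y is B2"
    return w, 2.0 * x + 1.0 * y + 0.5               # z2 = p2*x + q2*y + r2

x, y = 3.0, 2.0
(w1, z1), (w2, z2) = rule1(x, y), rule2(x, y)
z = (w1 * z1 + w2 * z2) / (w1 + w2)  # weighted average of rule outputs
```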

  10. Adaptive Networks • Architecture: • Feedforward networks with different node functions • Squares: nodes with parameters • Circles: nodes without parameters • Goal: • To achieve an I/O mapping specified by training data • Basic training method: • Backpropagation or steepest descent

  11. Derivative-Based Optimization • Based on first derivatives: • Steepest descent • Conjugate gradient method • Gauss-Newton method • Levenberg-Marquardt method • And many others • Based on second derivatives: • Newton method • And many others
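As a reminder of the simplest first-derivative method listed above, here is a steepest-descent sketch on a toy quadratic; the objective and step size are illustrative.

```python
# Steepest (gradient) descent on f(a, b) = (a - 3)^2 + (b + 1)^2,
# whose minimum is at a = 3, b = -1.
def grad(a, b):
    return 2.0 * (a - 3.0), 2.0 * (b + 1.0)

a, b, lr = 0.0, 0.0, 0.1  # start at the origin, fixed learning rate
for _ in range(200):
    ga, gb = grad(a, b)
    a -= lr * ga  # step against the gradient
    b -= lr * gb
# a converges to 3, b converges to -1
```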

  12. Fuzzy Modeling • Given desired I/O pairs (training data set) of the form (x1, ..., xn; y) produced by an unknown target system, construct a fuzzy inference system whose output y* matches the I/O pairs • Two steps in fuzzy modeling • Structure identification --- input selection, number of MFs • Parameter identification --- optimal parameters

  13. Neuro-Fuzzy Modeling • Basic approach of ANFIS • [Diagram: adaptive networks are a generalization of both neural networks and fuzzy inference systems; ANFIS is the specialization that combines the two]

  14. ANFIS (Adaptive Neuro-Fuzzy Inference System) • Fuzzy reasoning: rule 1 (A1, B1) yields z1 = p1*x + q1*y + r1 with firing strength w1; rule 2 (A2, B2) yields z2 = p2*x + q2*y + r2 with firing strength w2; output z = (w1*z1 + w2*z2) / (w1 + w2) • [Figure: the equivalent ANFIS network computes the products wi*zi, the sums Σwi*zi and Σwi, and their quotient z]

  15. Four-Rule ANFIS • Input space partitioning: with two MFs (A1, A2) on x and two MFs (B1, B2) on y, the input space is split into a 2×2 grid of fuzzy regions • [Figure: the corresponding ANFIS has four rule nodes with firing strengths w1..w4 and computes z = Σwi*zi / Σwi]

  16. Neuro-Fuzzy System • A neuro-fuzzy system is capable of identifying bad rules in prior/existing knowledge supplied by a domain expert. • E.g. a 5-rule neuro-fuzzy system for the XOR operation. • Use backpropagation to adjust the weights and to modify the input/output membership fns. • Continue training until the error (e.g. the sum of least mean squares) is less than some threshold, e.g. 0.001 • Rule 2 is found to be false and is removed.

  17. Neuro-Fuzzy System • A neuro-fuzzy system can automatically generate a complete set of fuzzy if-then rules, given input/output linguistic values. • Extract fuzzy rules directly from numerical data. • E.g. an 8-rule neuro-fuzzy system for the XOR operation: 2 × 2 × 2 = 8 rules. • Set the initial weights between layers 3 and 4 to 0.5. • After training, eliminate all rules whose certainty factors are less than some sufficiently small number, e.g. 0.1.

  18. ANFIS Architecture: Sugeno’s ANFIS • Assume the FIS has 2 inputs x, y and one output z. • Sugeno’s ANFIS: • Rule 1: If x is A1 and y is B1, then f1 = p1x + q1y + r1. • Rule 2: If x is A2 and y is B2, then f2 = p2x + q2y + r2.

  19. ANFIS Architecture: Sugeno’s ANFIS • Layer 1: fuzzification layer • Every node i in layer 1 is an adaptive node with a node function O1,i = μAi(x) for i = 1, 2 or O1,i = μBi−2(y) for i = 3, 4: the membership grade in a fuzzy set A1, A2, B1, or B2 • Parameters in this layer: premise (or antecedent) parameters. • Layer 2: rule layer • A fixed node labeled Π whose output is the product of all the incoming signals: O2,i = wi = μAi(x) · μBi(y) for i = 1, 2: the firing strength of a rule. • Layer 3: normalization layer • A fixed node labeled N. • The i-th node calculates the ratio of the i-th rule’s firing strength to the sum of all rules’ firing strengths: O3,i = w̄i = wi / (w1 + w2) for i = 1, 2 • Outputs of this layer are called normalized firing strengths. • Layer 4: defuzzification layer • An adaptive node with node fn O4,i = w̄i fi = w̄i (pi x + qi y + ri) for i = 1, 2, where w̄i is a normalized firing strength from layer 3 and {pi, qi, ri} is the parameter set of this node – consequent parameters. • Layer 5: summation neuron • A fixed node which computes the overall output as the summation of all incoming signals • Overall output = O5,1 = ∑ w̄i fi = ∑ wi fi / ∑ wi
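The five layers above can be sketched as a forward pass in Python. The generalized-bell MF is written in its common form 1 / (1 + |(x − c)/a|^2b), and all premise and consequent parameter values below are illustrative, not from the lecture:

```python
def gbellmf(x, a, b, c):
    """Generalized bell MF, common form: 1 / (1 + |(x - c)/a|^(2b))."""
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))

def anfis_forward(x, y, premise, consequent):
    # Layer 1: fuzzification -- membership grades of x in A1, A2 and y in B1, B2.
    muA = [gbellmf(x, *p) for p in premise["A"]]
    muB = [gbellmf(y, *p) for p in premise["B"]]
    # Layer 2: rule layer -- product of incoming grades gives firing strengths wi.
    w = [muA[0] * muB[0], muA[1] * muB[1]]
    # Layer 3: normalization -- each wi divided by the sum of all firing strengths.
    total = sum(w)
    wbar = [wi / total for wi in w]
    # Layer 4: defuzzification -- normalized strength times fi = pi*x + qi*y + ri.
    f = [p * x + q * y + r for (p, q, r) in consequent]
    # Layer 5: summation -- overall output is the sum of all incoming signals.
    return sum(wb * fi for wb, fi in zip(wbar, f))

# Illustrative (a, b, c) premise parameters and (p, q, r) consequent parameters.
premise = {"A": [(2, 2, 1), (2, 2, 5)], "B": [(2, 2, 1), (2, 2, 5)]}
consequent = [(1.0, 1.0, 0.0), (2.0, 0.5, 1.0)]
z = anfis_forward(3.0, 3.0, premise, consequent)
```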

  20. ANFIS Architecture: Sugeno’s ANFIS • How does an ANFIS learn? • A hybrid learning algorithm: the least-squares estimator + the gradient descent method • Forward pass: adjustment of the consequent parameters pi, qi, ri • Rule consequent parameters are identified by the least-squares estimator. • Find the least-squares estimate k* of k = [r1 p1 q1 r2 p2 q2 ... rn pn qn] that minimizes the squared error e = |Od − O|² • E = e² / 2 = (Od − O)² / 2 • The consequent parameters are adjusted while the antecedent parameters remain fixed. • Backward pass: adjustment of the antecedent parameters • The antecedent parameters are tuned while the consequent parameters are kept fixed. • E.g. bell activation fn: [1 + ((x − a)/c)^2b]^−1; a correction Δa is applied to parameter a, i.e. a ← a + Δa, where Δa is derived from the backpropagated error gradient.
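The key fact behind the forward pass is that, once the firing strengths are fixed, the ANFIS output is linear in the consequent parameters, so they can be found in one least-squares step. A NumPy sketch (the firing-strength model and parameter values are made up for the illustration):

```python
import numpy as np

# With antecedent MFs fixed, the output of a 2-rule first-order Sugeno ANFIS is
#   z = wbar1*(p1*x + q1*y + r1) + wbar2*(p2*x + q2*y + r2),
# which is linear in k = [p1 q1 r1 p2 q2 r2]. Stack one row
# [wbar1*x, wbar1*y, wbar1, wbar2*x, wbar2*y, wbar2] per training pair
# and solve for k by least squares.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 2))
# Hypothetical fixed normalized firing strengths (as if from layers 1-3).
wbar1 = 1.0 / (1.0 + np.exp(X[:, 0] - 5.0))
wbar2 = 1.0 - wbar1
A = np.column_stack([wbar1 * X[:, 0], wbar1 * X[:, 1], wbar1,
                     wbar2 * X[:, 0], wbar2 * X[:, 1], wbar2])
k_true = np.array([1.0, 2.0, 0.5, -1.0, 0.3, 2.0])
z = A @ k_true                       # noiseless targets for the sketch
k_hat, *_ = np.linalg.lstsq(A, z, rcond=None)  # recovers k_true exactly
```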

  21. ANFIS Architecture: Sugeno’s ANFIS • The structure of the network is not unique.

  22. ANFIS Architecture: Tsukamoto ANFIS • Tsukamoto ANFIS:

  23. ANFIS Architecture • Improvement: a 2-input first-order Sugeno fuzzy model with 9 rules • The 2-dimensional input space is partitioned into 9 overlapping fuzzy regions, each of which is governed by a fuzzy if-then rule. • I.e. the premise part of a rule defines a fuzzy region, while the consequent part specifies the output within that region.
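The rule count follows directly from the grid partition: 3 linguistic labels per input on 2 inputs give 3 × 3 = 9 regions, one rule per region. A tiny sketch with illustrative label names:

```python
from itertools import product

# Grid partition: 3 MFs per input on a 2-input model gives 3*3 = 9 rules,
# each governing one overlapping fuzzy region of the input space.
x_labels = ["small", "medium", "large"]
y_labels = ["small", "medium", "large"]
rules = [f"If x is {a} and y is {b} then f = p*x + q*y + r"
         for a, b in product(x_labels, y_labels)]
```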

  24. Questions?

  25. The End

  26. Automatics and Code Design: FUZZY LOGIC Systems GUI • Fuzzy Logic Systems Package facilitates the development of fuzzy-logic systems using: • graphical user interface (GUI) tools • command line functionality • The package can be used for building • Fuzzy Logic Expert Systems • Adaptive Neuro-Fuzzy Inference Systems (ANFIS)

  27. Graphical User Interface (GUI) There are five primary GUI tools for building, editing, and observing fuzzy inference systems in the Fuzzy Logic package: • Fuzzy Inference System (FIS) Editor • Membership Function Editor • Rule Editor • Rule Viewer • Surface Viewer

  28. Application: Employer Salary Raise Fuzzy Logic Model • Employer Salary Raise Model • Extension Principle: • one to one • many to one • n-D Cartesian product to y

  29. Fuzzy Logic Model for Employer Salary Raise • COMMON SENSE RULES 1. If teaching quality is bad, raise is low. 2. If teaching quality is good, raise is average. 3. If teaching quality is excellent, raise is generous. 4. If research level is bad, raise is low. 5. If research level is excellent, raise is generous. • COMBINED RULES 1. If teaching is poor or research is poor, raise is low. 2. If teaching is good, raise is average. 3. If teaching or research is excellent, raise is excellent.

  30. Generic Fuzzy Logic Code for Teacher Salary Raise Model

  %Establish constants
  Teach_Ratio = 0.8;
  Lo_Raise = 0.01; Avg_Raise = 0.05; Hi_Raise = 0.1;
  Raise_Range = Hi_Raise - Lo_Raise;
  Bad_Teach = 0; OK_Teach = 3; Good_Teach = 7; Great_Teach = 10;
  Teach_Range = Great_Teach - Bad_Teach;
  Bad_Res = 0; Great_Res = 10;
  Res_Range = Great_Res - Bad_Res;
  %If teaching is poor or research is poor, raise is low
  if teaching < OK_Teach
      raise = ((Avg_Raise - Lo_Raise)/(OK_Teach - Bad_Teach)*teaching + Lo_Raise)*Teach_Ratio ...
          + (1 - Teach_Ratio)*(Raise_Range/Res_Range*research + Lo_Raise);
  %If teaching is good, raise is average
  elseif teaching < Good_Teach
      raise = Avg_Raise*Teach_Ratio ...
          + (1 - Teach_Ratio)*(Raise_Range/Res_Range*research + Lo_Raise);
  %If teaching or research is excellent, raise is excellent
  else
      raise = ((Hi_Raise - Avg_Raise)/(Great_Teach - Good_Teach)*(teaching - Good_Teach) + Avg_Raise)*Teach_Ratio ...
          + (1 - Teach_Ratio)*(Raise_Range/Res_Range*research + Lo_Raise);
  end

  • Naïve model • Base salary raise + work performance • Base + development & research performance • Base + 80% development and 20% research

  31. Fuzzy Logic Model for Employer Salary Raise Fuzzy Inference System Editor Rule Editor

  32. Fuzzy Logic Model for Employer Salary Raise Membership function Editor

  33. Fuzzy Logic Model for Employer Salary Raise Rule Viewer Surface Viewer

  34. Fuzzy Logic Model for Employer Salary Raise • [Diagram: the inputs TEACHING QUALITY and RESEARCH QUALITY (interpreted as poor, good, excellent) feed the RULES; the output RAISE is assigned to be low, average, or generous] • Rules: 1. If teaching is poor or research is poor, raise is low. 2. If teaching is good, raise is average. 3. If teaching or research is excellent, raise is excellent. • IF-THEN RULES: if x is A then y is B, e.g. if teaching = good => raise = average • BINARY LOGIC: p --> q • FUZZY LOGIC: 0.5 p --> 0.5 q

  35.

  %=====================================
  % Initialization parameters for Fuzzy
  %=====================================
  % Number of training epochs
  % epoch_n = 10;
  epoch_n = 20;
  % Number of membership functions assigned to an input variable
  numMFs = 2;
  % Type of the membership function: 'gbellmf', 'gaussmf', 'trapmf', or 'trimf'
  MFTypes = 'gaussmf';
  %MFTypes = 'trimf';
  %MFTypes = varargin{3};
  %For a grid partition (genfis1): the default is 5 MFs of type 'gbellmf'
  %in_fis = genfis1([inputMatrix outputColumn],numMFs,mfType);
  %For subtractive clustering (genfis2):
  %in_fis = genfis2(x,perm,0.5);
  %in_fis = genfis2(X0Tr, YNr0(:,k), 0.5);
  in_fis = genfis1([OT_tr_XN, OT_tr_YN(:,k)], numMFs, MFTypes);
  out_fis = anfis([OT_tr_XN OT_tr_YN(:,k)], in_fis, epoch_n);

  36.

  out_fis = anfis([OT_tr_XN OT_tr_YN(:,k)], in_fis, epoch_n);
  figure('name', ['Initial plots for the Membership functions for Y', ...
      num2str(k)], 'NumberTitle', 'off');
  for i = 1:col_X
      [x, mf] = plotmf(in_fis, 'input', i);
      subplot(col_X, 1, i), plot(x, mf);
      xlabel(['input ', num2str(i)]);
  end
  yHat  = evalfis(OT_tr_XN, out_fis)';
  yHats = evalfis(OT_ts_XN, out_fis)';

  37. ANFIS for Classifying Salt data • [Figures: initial and final membership functions of the input variables]

  38. ANFIS for Classifying Salt data • [Figure: final membership functions of the input variables]

  39. ANFIS for Classifying Salt data

  40. ANFIS for Classifying Salt data • Training: CCR = 0.94, Conf_Mat_training = [41.00 6.00; 2.00 77.00] • Test: CCR = 0.91, Conf_Mat_test = [24.00 5.00; 0 24.00] • Computational time: 2.59375

  41. ANFIS for Classifying Salt data

  42. ANFIS for Classifying Salt data

  43. References and WWW Resources • References: • J.-S. R. Jang, C.-T. Sun, and E. Mizutani, “Neuro-Fuzzy and Soft Computing”, Prentice Hall, 1996. • J.-S. R. Jang and C.-T. Sun, “Neuro-Fuzzy Modeling and Control”, Proceedings of the IEEE, March 1995. • J.-S. R. Jang, “ANFIS: Adaptive-Network-based Fuzzy Inference Systems”, IEEE Trans. on Systems, Man, and Cybernetics, May 1993. • Internet resources:
