
Neural Computing


Presentation Transcript


  1. Neural Computing
  Yeni Herdiyeni
  Computer Science Department FMIPA IPB 2003

  2. Neural Computing: The Basics
  • Artificial Neural Networks (ANN)
  • Mimic how our brain works
  • Machine learning
  Neural Computing = Artificial Neural Networks (ANNs)

  3. Machine Learning: Overview
  • ANNs automate complex decision making
  • Neural networks learn from past experience and improve their performance levels
  • Machine learning: methods that teach machines to solve problems, or to support problem solving, by applying historical cases

  4. Neural Networks and Expert Systems
  Different technologies that complement each other:
  • Expert systems: logical, symbolic approach
  • Neural networks: model-based, numeric and associative processing

  5. Expert Systems
  • Good for closed-system applications (literal and precise inputs, logical outputs)
  • Reason with established facts and pre-established rules

  6. Major Limitations of Expert Systems
  • Experts do not always think in terms of rules
  • Experts may not be able to explain their line of reasoning
  • Experts may explain it incorrectly
  • Sometimes it is difficult or impossible to build the knowledge base

  7. Neural Computing Use: Neural Networks in Knowledge Acquisition
  • Fast identification of implicit knowledge by automatically analyzing cases of historical data
  • An ANN identifies patterns and relationships that may lead to rules for expert systems
  • A trained neural network can rapidly process information to produce associated facts and consequences

  8. Benefits of Neural Networks
  • Pattern recognition, learning, classification, generalization and abstraction, and interpretation of incomplete and noisy inputs
  • Character, speech and visual recognition
  • Can provide some human problem-solving characteristics
  • Can tackle new kinds of problems
  • Robust
  • Fast
  • Flexible and easy to maintain
  • Powerful hybrid systems

  9. Biological Analogy: The Biological Neural Network
  • Neurons: brain cells
  • Nucleus (at the center)
  • Dendrites provide inputs
  • Axons send outputs
  • Synapses increase or decrease connection strength and cause excitation or inhibition of subsequent neurons

  10. Biological Analogy: The Biological Neural Network (figure)

  11. What Is a Neural Network?
  • A neural network is a network of many simple processors, each possibly having a small amount of local memory.
  • The processors are connected by communication channels (synapses).

  12. Neural Network (Haykin)
  • A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use.
  Simon Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall Inc., New Jersey, 1999.

  13. Neural Net = Brain?
  • Knowledge is acquired by the network through a learning process.
  • Inter-neuron connection strengths, known as synaptic weights, are used to store the knowledge.

  14. Neural Network Fundamentals
  • Components and structure
    • Processing elements
    • Network
    • Structure of the network
  • Processing information by the network
    • Inputs
    • Outputs
    • Weights
    • Summation function

  15. Processing Information in an Artificial Neuron
  [Diagram: inputs x1, x2, ..., xi with weights w1j, w2j, ..., wij feed neuron j; the summation Σ wij·xi passes through a transfer function to produce the output Yj]
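This computation can be sketched in a few lines of Python; the logistic sigmoid is an illustrative choice of transfer function, and the input and weight values are ours, not from the slides:

```python
import math

def neuron_output(inputs, weights):
    """Output of neuron j: transfer function applied to the sum of w_ij * x_i."""
    s = sum(w * x for w, x in zip(weights, inputs))   # summation function
    return 1.0 / (1.0 + math.exp(-s))                 # logistic transfer function

# Two inputs x1, x2 with weights w1j, w2j (illustrative values):
y = neuron_output([1.0, 0.5], [0.4, -0.2])
```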

  16. Learning: Three Tasks
  1. Compute outputs
  2. Compare outputs with desired targets
  3. Adjust weights and repeat the process

  17. Training the Network
  • Present the training data set to the network
  • Adjust weights to produce the desired output for each of the inputs
  • Several iterations over the complete training set are needed to get a consistent set of weights that works for all the training data

  18. Testing
  • Test the network after training
  • Examine network performance: measure the network's classification ability
  • Black-box testing
    • Do the inputs produce the appropriate outputs?
  • Not necessarily 100% accurate, but may be better than human decision makers
  • The test plan should include:
    • Routine cases
    • Potentially problematic situations
  • May have to retrain

  19. ANN Application Development Process
  1. Collect data
  2. Separate into training and test sets
  3. Define a network structure
  4. Select a learning algorithm
  5. Set parameters and values, initialize weights
  6. Transform data to network inputs
  7. Start training; determine and revise weights
  8. Stop and test
  9. Implementation: use the network with new cases

  20. Data Collection and Preparation
  • Collect data and separate into a training set and a test set
  • Use training cases to adjust the weights
  • Use test cases for network validation
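A minimal sketch of such a split in Python; the 80/20 fraction, the shuffling, and the fixed seed are assumptions for illustration, not from the slide:

```python
import random

def split_data(cases, train_fraction=0.8, seed=42):
    """Shuffle the collected cases and separate them into training and test sets."""
    cases = list(cases)
    random.Random(seed).shuffle(cases)   # fixed seed keeps the split reproducible
    cut = int(len(cases) * train_fraction)
    return cases[:cut], cases[cut:]      # (training set, test set)

train_set, test_set = split_data(range(100))
```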

  21. Single Layer Perceptron (figure)

  22. Each pass through all of the training input and target vectors is called an epoch.
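The three learning tasks from slide 16, repeated epoch by epoch, can be sketched with the classic perceptron learning rule; training it on the AND function is our illustrative choice (AND is linearly separable, so the rule converges):

```python
def train_perceptron(samples, epochs=20, lr=1.0):
    """Perceptron rule: w <- w + lr * (target - output) * x; bias likewise."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):                  # one pass over all samples = one epoch
        for x, target in samples:
            output = 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0   # 1. compute output
            err = target - output                                # 2. compare with target
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]     # 3. adjust weights
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
```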

  23. Example: [worked example shown in figures on slides 23-25]

  26. Disadvantages of the Perceptron
  • Perceptron networks can only solve linearly separable problems; see Marvin Minsky and Seymour Papert's book Perceptrons [10] and the XOR problem.
  [10] M.L. Minsky, S.A. Papert, Perceptrons: An Introduction to Computational Geometry, MIT Press, 1969.
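The XOR limitation can be checked directly: a brute-force search over candidate weights (the grid is an arbitrary illustration of ours) finds a separating single-layer perceptron for AND but none for XOR, because no single line separates XOR's classes:

```python
def perceptron(w1, w2, b, x):
    return 1 if w1 * x[0] + w2 * x[1] + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def separable(samples, grid):
    """True if some (w1, w2, b) on the grid classifies every sample correctly."""
    return any(all(perceptron(w1, w2, b, x) == t for x, t in samples)
               for w1 in grid for w2 in grid for b in grid)

grid = [i * 0.5 for i in range(-8, 9)]   # coarse grid over weights and bias
# separable(AND, grid) is True; separable(XOR, grid) is False
```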

  27. Multilayer Perceptrons (MLP) (figure)

  28. MLP
  • An MLP has the ability to learn complex decision boundaries
  • MLPs are used in many practical computer vision applications involving classification (or supervised segmentation)

  29. Backpropagation [figures on slides 29-30]

  31. MATLAB example (Neural Network Toolbox):
  X = -1 : 0.1 : 1;
  Y = [-0.960 -0.577 -0.073 0.377 0.641 0.660 0.461 ...
       0.134 -0.201 -0.434 -0.500 -0.393 -0.165 0.099 ...
       0.307 0.396 0.345 0.182 -0.031 -0.219 -0.320];
  % Normalization (input range) and layer sizes:
  pr = [-1 1];
  m1 = 5; m2 = 1;
  net_ff = newff(pr, [m1 m2], {'logsig' 'purelin'});
  net_ff = init(net_ff);   % default Nguyen-Widrow initialization
  % Training:
  net_ff.trainParam.goal = 0.02;
  net_ff.trainParam.epochs = 350;
  net_ff = train(net_ff, X, Y);
  % Simulation:
  X_sim = -1 : 0.01 : 1;
  Y_nn = sim(net_ff, X_sim);

  32. Backpropagation
  • Backpropagation (back-error propagation)
  • The most widely used learning algorithm
  • Relatively easy to implement
  • Requires training data for conditioning the network before using it to process other data
  • The network includes one or more hidden layers
  • The network is considered a feedforward approach

  33. Externally provided correct patterns are compared with the neural network output during training (supervised training)
  • Feedback adjusts the weights until all training patterns are correctly categorized

  34. Error is backpropagated through the network layers
  • Some error is attributed to each layer
  • Weights are adjusted
  • A large network can take a very long time to train
  • May not converge
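These steps can be sketched for a tiny 2-2-1 network trained on XOR; the layer sizes, learning rate, epoch count, and seed are illustrative choices, and (as the slide notes) convergence is not guaranteed, so we only expect the training error to fall:

```python
import math
import random

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def train_xor(epochs=2000, lr=0.5, seed=0):
    """Online backpropagation for one hidden layer of 2 units on XOR."""
    rnd = random.Random(seed)
    W1 = [[rnd.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden weights (+ bias)
    W2 = [rnd.uniform(-1, 1) for _ in range(3)]                      # output weights (+ bias)
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    losses = []
    for _ in range(epochs):
        total = 0.0
        for (x1, x2), t in data:
            x = [x1, x2, 1.0]                  # inputs plus a constant bias input
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
            hb = h + [1.0]
            y = sigmoid(sum(w * hi for w, hi in zip(W2, hb)))
            total += (t - y) ** 2
            # Backpropagate the error: output delta first, then one delta per hidden unit
            dy = (y - t) * y * (1 - y)
            dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
            W2 = [w - lr * dy * hi for w, hi in zip(W2, hb)]          # adjust output weights
            W1 = [[w - lr * dh[j] * xi for w, xi in zip(W1[j], x)]    # adjust hidden weights
                  for j in range(2)]
        losses.append(total)
    return losses

losses = train_xor()
```

Tracking the per-epoch squared error makes the "may not converge" caveat concrete: the error should shrink over training, but need not reach zero.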

  35. Next time: ANFIS Neural Network, by Ir. Agus Buono, M.Si, M.Komp
