
Neural Networks




  1. Neural Networks

  2. References • Neural Network Design, M. T. Hagan, H. B. Demuth & M. Beale, PWS Publishing Company, 1996. • Neural Networks: A Comprehensive Foundation, Simon Haykin, Prentice Hall, 1999. • Application of Neural Networks to Adaptive Control of Nonlinear Systems, G. W. Ng, John Wiley & Sons Ltd, 1997.

  3. Other References • Intelligent Control Based on Flexible Neural Networks, M. Teshnehlab & K. Watanabe, Kluwer Academic Publishers, 1999. • Neural Networks for Pattern Recognition, Christopher Bishop, Clarendon Press, Oxford, 1995. • Nonlinear System Identification: From Classical Approaches to Neural Networks and Fuzzy Models, Oliver Nelles, Springer, 2001. • Computational Intelligence, Andries P. Engelbrecht, John Wiley & Sons Ltd, 2002.

  4. Syllabus • Introduction: Objectives, History, Applications, Biological inspiration, Math inspiration! • Neural Network (NN) & Neuron Model: Feedforward NN, Recurrent NN, Hamming NN, Hopfield NN, Single-Layer Perceptron, and Multilayer Perceptron. • Learning Algorithms • Supervised Learning: early learning algorithms, first-order gradient methods, second-order gradient methods, LRLS and IGLS algorithms. • Unsupervised Learning: Hebbian learning, Boltzmann machine, Kohonen self-organizing learning. • Population-based optimization: genetic algorithms, ant colony optimization, particle swarm optimization (PSO).

  5. Syllabus (cont.) • Applicable NN • Recurrent NN: sub-connection NN, Elman NN, Jordan NN, Elman-Jordan NN, and flexible recurrent NN. • Neuro-fuzzy networks: RBF, NRBF, GMDH (Adaline), ANFIS, LLM. • Dynamic and memorized NN: dynamic neuron models, Gamma, CMAC. • Where are these NN used? • NN control strategies • Identification • Recognition

  6. Exercises • Seven problem sets will be assigned over the term, in step with the different parts of the course. Some of these exercises will take the form of small computer projects (mini-projects) that can be done in MATLAB. The goal is greater familiarity with the learning algorithms, the different neural network structures, and the various applications of the networks introduced.

  7. Final Project • The most important part of the course, and its final deliverable, is a research project on an application of neural networks. Projects are defined and carried out individually or in groups of at most two. There is no restriction on the choice of topic, except that the methods and topics covered in the course must be used; try to pick something new and novel. Be sure to coordinate your project definition with me, and submit it as a one-page A4 project proposal in three parts: 1) project title, 2) a brief description and the innovation you have in mind, 3) previous work in the area (cite at least your 4 main references). Choose your project topic by 30 Azar at the latest and e-mail it to me. The project is due no later than 15 Esfand, and this deadline will not be extended under any circumstances.

  8. Midterm and Final Exams • The midterm exam will be held in late Azar and mainly covers the theoretical side of neural networks. The final exam is take-home and is equivalent to 2 sets of mini-projects.

  9. E-mail Submission • Send your assignments and projects to network.neural@gmail.com, and be sure to state the project number in the subject line of your e-mail. Also zip all files and written code, and name the zip file after yourself.

  10. Grading • Exercises and final exam: 40% • Midterm exam: 25% • Final project: 40%

  11. "Keep it simple, as simple as possible, but no simpler" E. Einstein

  12. Introduction

  13. Historical Sketch • Pre-1940: von Helmholtz, Mach, Pavlov, etc. • General theories of learning, vision, conditioning • No specific mathematical models of neuron operation • 1940s: Hebb, McCulloch and Pitts • Mechanism for learning in biological neurons • Neural-like networks can compute any arithmetic function • 1950s: Rosenblatt, Widrow and Hoff • First practical networks and learning rules • 1960s: Minsky and Papert • Demonstrated limitations of existing neural networks; new learning algorithms were not forthcoming, and some research was suspended • 1970s: Amari, Anderson, Fukushima, Grossberg, Kohonen • Progress continues, although at a slower pace • 1980s: Grossberg, Hopfield, Kohonen, Rumelhart, etc. • Important new developments cause a resurgence in the field

  14. Applications • Aerospace • High performance aircraft autopilots, flight path simulations, aircraft control systems, autopilot enhancements, aircraft component simulations, aircraft component fault detectors • Automotive • Automobile automatic guidance systems, warranty activity analyzers • Banking • Check and other document readers, credit application evaluators • Defense • Weapon steering, target tracking, object discrimination, facial recognition, new kinds of sensors, sonar, radar and image signal processing including data compression, feature extraction and noise suppression, signal/image identification • Electronics • Code sequence prediction, integrated circuit chip layout, process control, chip failure analysis, machine vision, voice synthesis, nonlinear modeling

  15. Applications • Financial • Real estate appraisal, loan advisor, mortgage screening, corporate bond rating, credit line use analysis, portfolio trading program, corporate financial analysis, currency price prediction • Manufacturing • Manufacturing process control, product design and analysis, process and machine diagnosis, real-time particle identification, visual quality inspection systems, beer testing, welding quality analysis, paper quality prediction, computer chip quality analysis, analysis of grinding operations, chemical product design analysis, machine maintenance analysis, project bidding, planning and management, dynamic modeling of chemical process systems • Medical • Breast cancer cell analysis, EEG and ECG analysis, prosthesis design, optimization of transplant times, hospital expense reduction, hospital quality improvement, emergency room test advisement

  16. Applications • Robotics • Trajectory control, forklift robot, manipulator controllers, vision systems • Speech • Speech recognition, speech compression, vowel classification, text to speech synthesis • Securities • Market analysis, automatic bond rating, stock trading advisory systems • Telecommunications • Image and data compression, automated information services, real-time translation of spoken language, customer payment processing systems • Transportation • Truck brake diagnosis systems, vehicle scheduling, routing systems

  17. Doing work without thinking!

  18. Biological Inspiration • [Figure: a biological neuron, showing dendrites, cell body, axon, and synapse] • Neurons respond slowly – 10^-3 s, compared to 10^-9 s for electrical circuits • The brain uses massively parallel computation – ≈10^11 neurons in the brain – ≈10^4 connections per neuron • The human nervous system is built of cells called neurons. Each neuron can receive, process, and transmit electrochemical signals. Dendrites extend from the cell body to other neurons, and the connection point is called a synapse. Signals received by the dendrites are transmitted to and summed in the cell body. If the cumulative excitation exceeds a threshold, the cell fires, which sends a signal down the axon to other neurons (see the sketch below).
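This summation-and-threshold behaviour is what the McCulloch-Pitts model from the history slide formalizes. A minimal sketch in Python, with made-up weights, inputs, and threshold chosen purely for illustration:

```python
import numpy as np

def mcculloch_pitts_neuron(inputs, weights, threshold):
    """Sum the weighted inputs ('dendrites') in the 'cell body';
    fire (output 1) only if the excitation reaches the threshold."""
    activation = np.dot(weights, inputs)  # cumulative excitation
    return 1 if activation >= threshold else 0

# Hypothetical example: three synapses, two excitatory, one inhibitory
inputs = np.array([1, 0, 1])
weights = np.array([0.5, 0.8, -0.3])
print(mcculloch_pitts_neuron(inputs, weights, threshold=0.1))  # -> 1
```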

  19. Math Inspiration • The radial basis function (RBF) model is introduced as a general function approximator: y_k(x) = Σ_{j=1..M} w_kj φ_j(x) + w_k0 • If desired, the biases w_k0 can be absorbed into the summation by including an extra basis function φ_0 whose activation is set to 1. For the case of Gaussian basis functions we have φ_j(x) = exp(−‖x − μ_j‖² / (2σ_j²)) • Hartman et al. (1990) give a formal proof of this universal approximation property for networks with Gaussian basis functions in which the widths of the Gaussians are treated as adjustable parameters. A more general result was obtained by Park and Sandberg (1991), who show that, with only mild restrictions on the form of the kernel functions, the universal approximation property still holds. Further generalizations of this result are given in (Park and Sandberg, 1993).
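A minimal numerical sketch of this RBF approximator, added here for illustration (the centres, widths, weights, and bias are all made-up values, not from the slides):

```python
import numpy as np

def gaussian_basis(x, centers, widths):
    """phi_j(x) = exp(-||x - mu_j||^2 / (2 sigma_j^2)), one value per center."""
    dists = np.linalg.norm(x - centers, axis=1)
    return np.exp(-dists**2 / (2 * widths**2))

def rbf_output(x, centers, widths, weights, bias):
    """y(x) = sum_j w_j phi_j(x) + w_0; the bias plays the role of the
    extra basis function phi_0 whose activation is fixed at 1."""
    phi = gaussian_basis(x, centers, widths)
    return weights @ phi + bias

# Hypothetical 1-D example with two Gaussian units
centers = np.array([[0.0], [1.0]])   # mu_j
widths  = np.array([0.5, 0.5])       # sigma_j
weights = np.array([1.0, -0.5])      # w_j
print(rbf_output(np.array([0.3]), centers, widths, weights, bias=0.2))
```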

  20. Transfer Functions

  21. Transfer Functions

  22. Transfer Functions
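The three transfer-function slides above are figures that did not survive the transcript. As a reference sketch, assuming the standard functions tabulated in the Hagan-Demuth text, here are three common choices in Python:

```python
import numpy as np

def hardlim(n):
    """Hard limit: a = 0 if n < 0, else 1 (perceptron-style threshold)."""
    return np.where(n < 0, 0, 1)

def purelin(n):
    """Linear: a = n (used e.g. in ADALINE and linear output layers)."""
    return n

def logsig(n):
    """Log-sigmoid: a = 1 / (1 + e^-n), squashing n into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

n = np.array([-2.0, 0.0, 2.0])
print(hardlim(n), purelin(n), logsig(n))
```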

  23. Multiple-Input Neuron
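The figure for this slide is likewise missing. In the notation of the course's primary reference (Hagan, Demuth & Beale), a multiple-input neuron with R inputs forms the net input n = w_{1,1} p_1 + w_{1,2} p_2 + … + w_{1,R} p_R + b = Wp + b and produces the output a = f(Wp + b), where p is the input vector, W the (row) weight matrix, b the bias, and f the transfer function.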

  24. Layer of Neurons

  25. Abbreviated Notation
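Slides 24 and 25 generalize the single neuron to a layer of S neurons, compactly written a = f(Wp + b) with an S×R weight matrix W; the abbreviated notation draws this matrix expression instead of the individual neurons. A minimal sketch under that assumption, with made-up dimensions and values:

```python
import numpy as np

def logsig(n):
    """Log-sigmoid transfer function, applied element-wise."""
    return 1.0 / (1.0 + np.exp(-n))

def layer(p, W, b, f=logsig):
    """One layer of S neurons with R inputs: a = f(W p + b),
    where W is S x R, p has R elements, and b and a have S elements."""
    return f(W @ p + b)

# Hypothetical layer: S = 2 neurons, R = 3 inputs
W = np.array([[0.2, -0.1, 0.4],
              [0.5,  0.3, -0.2]])
b = np.array([0.1, -0.3])
p = np.array([1.0, 0.5, -1.0])
print(layer(p, W, b))  # 2-element output vector a
```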
