
Artificial Neural Networks



Presentation Transcript


  1. Artificial Neural Networks
R. Michael Winters
Music Information Retrieval, Dr. Ichiro Fujinaga
February 28, 2012

  2. Outline
• A brief history of ANN
• Quick facts about biological neural networks
• Simple explanation of mathematical implementation
• Detailed mathematical implementation
  • Weights
  • Activation functions
  • Layers
• Learning in ANN
• What they can be used for
• Evaluation

  3. A brief history
• Rashevsky, 1938. Mathematical Biophysics.
• McCulloch and Pitts, 1943. A logical calculus of the ideas immanent in nervous activity.
• Hebb, 1949. The Organization of Behavior.
• Rosenblatt, 1958. The perceptron: A probabilistic model for information storage and organization in the brain.
• Minsky and Papert, 1969. Perceptrons: An Introduction to Computational Geometry.
• Hopfield, 1982. Neural networks and physical systems with emergent collective computational abilities.
• Rumelhart and McClelland, 1986. Parallel Distributed Processing: Explorations in the Microstructure of Cognition.

  4. Quick facts about neurons
• Biological neural networks
  • Neurons fire on the order of once every 10^-3 seconds
  • Number of neurons in the nervous system: about 10^10
  • Number of connections: about 10^13
  • Energetic efficiency: about 10^-16 joules per operation per second
• Machines
  • Silicon gates switch on the order of 10^-9 seconds
  • Energetic efficiency: about 10^-6 joules per operation per second (Haykin, 1999)

  5. A simple description
• Learning AND, as sketched in code below
• Training pairs: (x1=1, x2=1, d=1), (x1=0, x2=0, d=0), (x1=1, x2=0, d=0), (x1=0, x2=1, d=0)
• Begin with weights w1 = 0.7 and w2 = 0.2
• The learning algorithm adjusts each weight by +0.1 or -0.1 when the output is wrong
• The weights gradually change until the desired result is achieved (Mehrotra, 1997)
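The slide's procedure can be made concrete with a minimal sketch. The slide gives only the starting weights and the ±0.1 step, so the firing threshold of 1.0 and the stopping rule here are illustrative assumptions:

```python
# Minimal perceptron-style learning of AND. The threshold of 1.0 is an
# assumption; the slide specifies only the starting weights and the 0.1 step.
data = [((1, 1), 1), ((0, 0), 0), ((1, 0), 0), ((0, 1), 0)]
w = [0.7, 0.2]   # starting weights from the slide
theta = 1.0      # assumed firing threshold
step = 0.1       # weight adjustment from the slide

for epoch in range(20):
    errors = 0
    for (x1, x2), d in data:
        y = 1 if w[0] * x1 + w[1] * x2 >= theta else 0
        if y != d:
            errors += 1
            # raise or lower the weights of the active inputs by 0.1
            w[0] += step * (d - y) * x1
            w[1] += step * (d - y) * x2
    if errors == 0:
        break

print(w)  # [0.8, 0.3]
```

With these values the loop stops at w1 = 0.8, w2 = 0.3: both inputs together cross the threshold, but neither does alone, which is exactly AND.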

  6. More complicated explanation
• Many input connections, each with its own weight
• The weighted inputs are summed at the junction together with a bias
• An activation function then determines the output of the neuron (Haykin, 1999)
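In symbols, this is the standard neuron model the slide describes (notation follows Haykin): if x_1, ..., x_n are the inputs, w_{ki} the weights into neuron k, b_k its bias, and \varphi the activation function, the output is

y_k = \varphi\left( \sum_{i=1}^{n} w_{ki} x_i + b_k \right)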

  7. Types of activation function
• The activation function gives the neuron's output as a function of the local field induced by its input connections
• Simple examples, sketched below, are the step, ramp, and sigmoid functions
• The sigmoid is most commonly used
  • It is a strictly increasing function
  • A slope parameter a can be changed to alter its steepness (Haykin, 1999)
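These three shapes are easy to state directly. A minimal sketch in Python, assuming one common form of each (Haykin's piecewise-linear "ramp" uses slightly different conventions; the version clipped to [0, 1] here is illustrative):

```python
import math

def step(v):
    """Heaviside step: fires fully once the local field crosses zero."""
    return 1.0 if v >= 0 else 0.0

def ramp(v):
    """Piecewise-linear ramp: the local field clipped to [0, 1]."""
    return min(1.0, max(0.0, v))

def sigmoid(v, a=1.0):
    """Logistic sigmoid; a larger slope parameter a steepens it at v = 0."""
    return 1.0 / (1.0 + math.exp(-a * v))

for v in (-2.0, 0.0, 2.0):
    print(v, step(v), ramp(v), sigmoid(v, a=2.0))
```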

  8. Architectures
• Three types of network architecture
• Single-layer: very simple, but too simple for complex computations
• Multi-layer: very commonly used; hidden layers add computational power while the feedforward structure stays simple
• Feedback (or recurrent): usually very complex and non-linear
• A minimal forward-pass sketch follows the architecture figures below

  9. Most general multilayer network (Haykin, 1999)

  10. Acyclic Network (Haykin, 1999)

  11. Simple Feedforward Network (Haykin, 1999)
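To make the feedforward case concrete, here is a minimal forward pass through one hidden layer, reusing the sigmoid from above. The 2-3-1 layer sizes, weights, and biases are illustrative assumptions, not values from the slides:

```python
import math

def sigmoid(v, a=1.0):
    return 1.0 / (1.0 + math.exp(-a * v))

def layer(inputs, weights, biases):
    """One fully connected layer: weighted sum plus bias, then sigmoid."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Illustrative 2-3-1 network; all values chosen arbitrarily for the sketch.
x = [0.5, -1.0]
hidden = layer(x, weights=[[0.1, 0.4], [-0.3, 0.2], [0.5, -0.1]],
               biases=[0.0, 0.1, -0.2])
output = layer(hidden, weights=[[0.7, -0.6, 0.3]], biases=[0.05])
print(output)
```

Signals flow strictly from the input layer through the hidden layer to the output, with no feedback connections, which is what makes this a simple feedforward network.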

  12. Learning
• High level: supervised, unsupervised, reinforcement
• Low level: correlation, competitive, cooperative, feedback
• Hybrid systems: use prior knowledge and invariances to maximize the efficiency of the network
• A sketch of one supervised update rule follows
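As one concrete instance of supervised learning with error feedback, here is a sketch of the delta (LMS) rule for a single linear neuron. This is a standard rule rather than anything specified on the slide, and the learning rate and training pairs are illustrative:

```python
# Delta (LMS) rule for one linear neuron: gradient descent on squared error.
# Training pairs happen to fit d = 0.2*x1 + 0.5*x2, so the rule can converge.
data = [((0.0, 1.0), 0.5), ((1.0, 0.0), 0.2), ((1.0, 1.0), 0.7)]
w = [0.0, 0.0]
b = 0.0
eta = 0.1  # learning rate (assumed)

for epoch in range(100):
    for x, d in data:
        y = sum(wi * xi for wi, xi in zip(w, x)) + b
        e = d - y  # error signal fed back to adjust the weights
        w = [wi + eta * e * xi for wi, xi in zip(w, x)]
        b += eta * e

print(w, b)  # approaches w = [0.2, 0.5], b = 0.0
```

The supervised/unsupervised/reinforcement split on the slide is about where this error signal comes from: here a teacher supplies the desired output d for every input.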

  13. Uses and evaluation
• Uses: classification, clustering, vector quantization, pattern association, function approximation, forecasting, robotic control, optimization, searching
• Evaluation criteria: quality of results, generalizability, computation time

  14. References
• Haykin, S. 1999. Neural Networks: A Comprehensive Foundation. Upper Saddle River, New Jersey: Prentice Hall.
• Hebb, D. O. 1949. The Organization of Behavior: A Neuropsychological Theory. New York: Wiley.
• Hopfield, J. J. 1982. Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the USA 79 (8): 2554–2558.
• McCulloch, W. S., and W. Pitts. 1943. A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics 5 (4): 115–133.
• Mehrotra, K., C. K. Mohan, and S. Ranka. 1997. Elements of Artificial Neural Networks. Cambridge, MA: The MIT Press.
• Minsky, M., and S. Papert. 1969. Perceptrons: An Introduction to Computational Geometry. Cambridge, MA: The MIT Press.
• Rashevsky, N. 1938. Mathematical Biophysics. Chicago: The University of Chicago Press.
• Rosenblatt, F. 1958. The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review 65 (6): 386–408.
• Rumelhart, D. E., and J. L. McClelland. 1986. Parallel Distributed Processing: Explorations in the Microstructure of Cognition. Cambridge, MA: The MIT Press.
