
Data Mining and Semantic Web


Presentation Transcript


  1. University of Belgrade School of Electrical Engineering Chair of Computer Engineering and Information Theory Data Mining and Semantic Web Neural Networks: Backpropagation algorithm Miroslav Tišma tisma.etf@gmail.com

  2. What is this? You see this: [photograph]. But the camera sees this: [a grid of raw pixel intensity values].

  3. Computer Vision: Car detection. Training set: images labeled "Cars" and images labeled "Not a car". Testing: What is this? [test image]

  4. Raw image: 50 x 50 pixel images → 2500 pixels (7500 if RGB). Feature vector fed to the learning algorithm: pixel 1 intensity, pixel 2 intensity, …, pixel 2500 intensity. Plotting pixel 1 intensity against pixel 2 intensity, the learning algorithm has to separate "Cars" from "Non"-Cars. Quadratic features (xi × xj): ≈ 3 million features.
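To make the feature counts on this slide concrete, here is a quick Python sketch (an illustration added here, not part of the original lecture) that recomputes them:

```python
# Sanity check of the feature counts quoted on the slide.
n_pixels = 50 * 50                            # 2500 features for a grayscale image
n_pixels_rgb = 3 * n_pixels                   # 7500 features if RGB
n_quadratic = n_pixels * (n_pixels + 1) // 2  # all products x_i * x_j with i <= j

print(n_pixels, n_pixels_rgb, n_quadratic)    # 2500 7500 3126250 (about 3 million)
```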

  5. Neural Networks • Origins: algorithms that try to mimic the brain. • Very widely used in the 80s and early 90s; popularity diminished in the late 90s. • Recent resurgence: state-of-the-art technique for many applications.

  6. Neurons in the brain: dendrites ("input wires"), axon ("output wire").

  7. Neuron model: Logistic unit. Inputs x arrive on the "input wires", together with a "bias unit" x_0 = 1; the "weights" θ are the parameters; the "output" is hθ(x) = 1 / (1 + e^(−θᵀx)), using the sigmoid (logistic) activation function g(z) = 1 / (1 + e^(−z)).
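As a rough illustration of the logistic unit above, the following Python sketch (the name `logistic_unit` is my own, not from the lecture) adds the bias unit and applies the sigmoid to θᵀx:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid (logistic) activation function g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_unit(x, theta):
    """Single neuron: prepend the bias unit x_0 = 1, then output g(theta^T x)."""
    x = np.concatenate(([1.0], x))
    return sigmoid(theta @ x)

# Example with three inputs and four weights (bias weight first).
h = logistic_unit(np.array([0.5, -1.2, 3.0]), np.array([-1.0, 2.0, 0.5, 0.1]))
print(h)
```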

  8. Neural Network. Layer 1 is the "input layer", Layer 2 the "hidden layer", Layer 3 the "output layer"; the input and hidden layers also carry a "bias unit".

  9. Neural Network. a_i^(j) = "activation" of unit i in layer j; Θ^(j) = matrix of weights controlling the function mapping from layer j to layer j + 1. If the network has s_j units in layer j and s_{j+1} units in layer j + 1, then Θ^(j) will be of dimension s_{j+1} × (s_j + 1).
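The dimension rule above can be checked with a small Python sketch; the layer sizes here are made-up example values:

```python
import numpy as np

# Layer sizes s_1, s_2, s_3 (bias units not counted): 3 inputs, 5 hidden units, 1 output.
layer_sizes = [3, 5, 1]

# Theta^(j) maps layer j to layer j+1 and has shape s_{j+1} x (s_j + 1);
# the extra column multiplies the bias unit.
thetas = [np.random.randn(layer_sizes[j + 1], layer_sizes[j] + 1)
          for j in range(len(layer_sizes) - 1)]

for j, theta in enumerate(thetas, start=1):
    print(f"Theta^({j}) has shape {theta.shape}")   # (5, 4) then (1, 6)
```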

  10. Simple example: AND. With weights −30, +20, +20 a single unit computes hθ(x) = g(−30 + 20·x1 + 20·x2), which is ≈ 0 unless both x1 = 1 and x2 = 1, i.e. it computes x1 AND x2.

  11. Example: OR function. With weights −10, +20, +20 the unit computes hθ(x) = g(−10 + 20·x1 + 20·x2), which is ≈ 1 whenever x1 = 1 or x2 = 1, i.e. it computes x1 OR x2.
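A short Python sketch, using the weights from the two slides above, confirms that a single logistic unit with those weights behaves like AND and OR respectively:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x1, x2, theta):
    """One logistic unit with bias: g(theta0 + theta1*x1 + theta2*x2)."""
    return sigmoid(theta[0] + theta[1] * x1 + theta[2] * x2)

AND_weights = np.array([-30.0, 20.0, 20.0])
OR_weights  = np.array([-10.0, 20.0, 20.0])

# Prints: x1 x2 AND OR  ->  0 0 0 0 / 0 1 0 1 / 1 0 0 1 / 1 1 1 1
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2,
          int(neuron(x1, x2, AND_weights) > 0.5),
          int(neuron(x1, x2, OR_weights) > 0.5))
```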

  12. Multiple output units: One-vs-all. Classes: pedestrian, car, motorcycle, truck. Want hΘ(x) ≈ [1, 0, 0, 0]ᵀ when the image is a pedestrian, ≈ [0, 1, 0, 0]ᵀ when it is a car, ≈ [0, 0, 1, 0]ᵀ when it is a motorcycle, etc.
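For illustration, one common way to build such one-vs-all target vectors is a one-hot encoding; this Python sketch (the helper `one_hot` is my own naming) shows the idea:

```python
import numpy as np

# One-vs-all targets for K = 4 classes: each label becomes a K-dimensional
# 0/1 vector with a single 1 in the position of its class.
classes = ["pedestrian", "car", "motorcycle", "truck"]

def one_hot(label, classes):
    y = np.zeros(len(classes))
    y[classes.index(label)] = 1.0
    return y

print(one_hot("car", classes))   # [0. 1. 0. 0.] -> want h(x) close to this for a car
```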

  13. Neural Network (Classification). L = total no. of layers in the network; s_l = no. of units (not counting the bias unit) in layer l. The pictured network has Layer 1 through Layer 4. • Binary classification: 1 output unit. • Multi-class classification (K classes): K output units, e.g. y ∈ ℝ^4 with [1,0,0,0] = pedestrian, [0,1,0,0] = car, [0,0,1,0] = motorcycle, [0,0,0,1] = truck.

  14. Cost function. Logistic regression: J(θ) = −(1/m) Σ_{i=1..m} [ y^(i) log hθ(x^(i)) + (1 − y^(i)) log(1 − hθ(x^(i))) ] + (λ/2m) Σ_{j=1..n} θ_j². Neural network: J(Θ) = −(1/m) Σ_{i=1..m} Σ_{k=1..K} [ y_k^(i) log (hΘ(x^(i)))_k + (1 − y_k^(i)) log(1 − (hΘ(x^(i)))_k) ] + (λ/2m) Σ_{l=1..L−1} Σ_i Σ_j (Θ_{ji}^(l))².
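A minimal NumPy sketch of the neural-network cost above; it assumes the outputs and labels are already arranged as matrices, and the function name and argument layout are my own, not the lecture's:

```python
import numpy as np

def nn_cost(h, Y, thetas, lam):
    """Regularized neural-network cost.
       h      - (m, K) matrix of network outputs h_Theta(x^(i)),
       Y      - (m, K) matrix of one-hot labels y^(i),
       thetas - list of weight matrices Theta^(l) (bias column first),
       lam    - regularization parameter lambda."""
    m = Y.shape[0]
    # Cross-entropy term, summed over all examples i and output units k.
    cost = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m
    # Regularization term: squares of all weights except the bias columns.
    reg = (lam / (2 * m)) * sum(np.sum(theta[:, 1:] ** 2) for theta in thetas)
    return cost + reg
```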

  15. Gradient computation. Our goal is to minimize the cost function J(Θ). • Need code to compute: J(Θ) and the partial derivatives ∂J(Θ)/∂Θ_{ij}^(l).

  16. Backpropagation algorithm. Given one training example (x, y), forward propagation through a 4-layer network (Layer 1, Layer 2, Layer 3, Layer 4): a^(1) = x; z^(2) = Θ^(1) a^(1), a^(2) = g(z^(2)) (add a_0^(2)); z^(3) = Θ^(2) a^(2), a^(3) = g(z^(3)) (add a_0^(3)); z^(4) = Θ^(3) a^(3), a^(4) = hΘ(x) = g(z^(4)).
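The forward-propagation steps above, written for an arbitrary number of layers, might look like this in NumPy (a sketch, not the lecture's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, thetas):
    """Forward propagation for one example x through a network given by the
       list of weight matrices thetas = [Theta^(1), ..., Theta^(L-1)].
       Returns the activations a^(1), ..., a^(L), with the bias unit prepended
       on every layer except the output layer."""
    activations = []
    a = x
    for theta in thetas:
        a = np.concatenate(([1.0], a))    # add bias unit a_0 = 1
        activations.append(a)
        z = theta @ a                     # z^(l+1) = Theta^(l) a^(l)
        a = sigmoid(z)                    # a^(l+1) = g(z^(l+1))
    activations.append(a)                 # a^(L) = h_Theta(x)
    return activations
```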

  17. Backpropagation algorithm. Intuition: δ_j^(l) = "error" of node j in layer l. For each output unit (layer L = 4): δ^(4) = a^(4) − y; then δ^(3) = (Θ^(3))ᵀ δ^(4) .* g′(z^(3)) and δ^(2) = (Θ^(2))ᵀ δ^(3) .* g′(z^(2)), where .* is the element-wise multiplication operator and the derivative of the activation function can be written as g′(z^(l)) = a^(l) .* (1 − a^(l)).
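The identity g′(z) = a .* (1 − a) used above can be verified numerically with a short sketch:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Compare a central-difference estimate of g'(z) with a * (1 - a).
z = np.linspace(-5, 5, 11)
a = sigmoid(z)
numeric = (sigmoid(z + 1e-6) - sigmoid(z - 1e-6)) / 2e-6
print(np.allclose(numeric, a * (1 - a)))   # True
```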

  18. Backpropagation algorithm. Training set {(x^(1), y^(1)), …, (x^(m), y^(m))}. Set Δ_{ij}^(l) = 0 (for all l, i, j); the Δ's are used to compute ∂J(Θ)/∂Θ_{ij}^(l). • For i = 1 to m: • Set a^(1) = x^(i) • Perform forward propagation to compute a^(l) for l = 2, 3, …, L • Using y^(i), compute δ^(L) = a^(L) − y^(i) • Compute δ^(L−1), δ^(L−2), …, δ^(2) and accumulate Δ_{ij}^(l) := Δ_{ij}^(l) + a_j^(l) δ_i^(l+1).
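Putting the loop above together, here is a minimal NumPy sketch of one backpropagation pass. It is my own illustration, not the lecture's code, and it also includes the final averaging and regularization step implied by the slide-14 cost function:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(X, Y, thetas, lam):
    """One pass of backpropagation over the whole training set.
       X: (m, n) inputs, Y: (m, K) one-hot labels,
       thetas: list of weight matrices Theta^(l), lam: regularization lambda.
       Returns the gradients D^(l) = dJ/dTheta^(l)."""
    m = X.shape[0]
    Delta = [np.zeros_like(t) for t in thetas]

    for i in range(m):
        # Forward propagation: compute a^(l) for l = 1, ..., L.
        a_list = []
        a = X[i]
        for theta in thetas:
            a = np.concatenate(([1.0], a))            # add bias unit
            a_list.append(a)
            a = sigmoid(theta @ a)
        a_list.append(a)                              # a^(L) = h_Theta(x^(i))

        # Backward pass: delta^(L) = a^(L) - y^(i), then propagate the error.
        delta = a_list[-1] - Y[i]
        for l in range(len(thetas) - 1, -1, -1):
            Delta[l] += np.outer(delta, a_list[l])    # Delta^(l) += delta^(l+1) a^(l)T
            if l > 0:
                g_prime = a_list[l] * (1 - a_list[l])          # g'(z^(l))
                delta = ((thetas[l].T @ delta) * g_prime)[1:]  # drop the bias error

    # Average the accumulated Delta and add regularization (not for the bias column).
    D = [d / m for d in Delta]
    for l, theta in enumerate(thetas):
        D[l][:, 1:] += (lam / m) * theta[:, 1:]
    return D
```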

  19. Advantages: • Relatively simple implementation • Standard method that generally works well • Many practical applications: handwriting recognition, autonomous driving. Disadvantages: • Slow and inefficient • Can get stuck in local minima, resulting in sub-optimal solutions.

  20. Literature: • http://en.wikipedia.org/wiki/Backpropagation • http://www.ml-class.org • http://home.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html

  21. Thank you for your attention!
